With VR now an increasingly accessible consumer technology, its potential uses for healthcare are vast. Researchers can acquire current high-end VR headsets at an increasingly reduced cost. Apps that claim therapeutic benefits can be purchased from online stores, as can social apps that may offer patient-therapist interaction. However, many of these apps may not meet a patient's needs, and it is often more appropriate and clinically useful to design an experience from the ground up with clear and targeted clinical outcomes in mind. VR hardware requires little technical training to set up and run; the software used to create and implement VR experiences, however, requires considerably more practice and know-how to use effectively. Research into the effects of VR on health outcomes is often carried out by social researchers and scientists, and reflexivity is an important part of a social researcher's work, that is, the capacity to understand their own influence on the person or phenomena being studied [1].
The aim of this paper is to consider our personal influence, and the input of patients, on the design and development of a VR tool to improve mental health. We present a VR experience created by a social researcher with support from a wider research team. We discuss and justify the choices we made in the process of creating the VR experience and consider their impact on the overall intervention.
Clinical Outcomes and Intervention Design
We aimed to use VR to improve mental wellbeing outcomes. A critical review was carried out to identify gaps in the literature and to guide the study design [4]. The majority of the identified literature targeted anxiety, and there was little research on the use of VR to address depression, stress and general wellbeing. Stress reduction has been demonstrated in previous VR studies [5], and its connection to general wellbeing [6] [7] provides further justification for including wellbeing as an outcome. Finally, we considered it useful to measure the level of 'presence' felt by patients in the VR environment. In the context of a virtual environment, presence is the degree to which the user feels situated in that environment rather than in the real world [8]. For example, a user in a virtual environment whose hand movements are visible and accurately tracked should feel a greater level of presence than someone whose hands are not tracked in the environment. Whilst it is now widely accepted that there is an association between presence and fear in VR, its wider implications for the efficacy of clinical outcomes are still contested [9] [10]. We therefore also aimed to assess whether presence has an effect on the efficacy of VR interventions designed for health and wellbeing.
An appropriate context for the use of VR to reduce anxiety and improve mental wellbeing is secondary care. Attending hospital is known to be stressful, and sitting in a waiting room can increase anxiety in patients awaiting examination or a clinical procedure [11] [12]. Moreover, the time a patient spends waiting for an appointment offers researchers a chance to implement an intervention at an opportune moment without requiring the patient to make a separate visit to the health centre/research location. For a pilot study, we looked to recruit patients with a chronic condition associated with poor mental wellbeing, such as diabetes. Three in five people with diabetes report low mood due to living with their condition [13], with anxiety and depression prevalent and emotional wellbeing affected [14].
Mindfulness-based interventions have shown promising results for emotional wellbeing amongst people with diabetes, as well as aiding glycaemic control [15] [16]. With this in mind, we aimed to add a mindfulness component to our VR experience. Initially we planned to aim the intervention at young people, as feedback from diabetes specialist nurses (DSNs) indicated that this age group often has the least experience of dealing with a chronic condition and is frequently most in need of support. However, due to the limited numbers in this demographic in our rural research area (Scottish Highlands and Islands) and positive feedback on VR exposure from local elderly citizens, we decided to widen the inclusion criteria to all age groups. The diabetes centre operated by the National Health Service (NHS) allows DSNs to assist patients who are struggling to control their condition. DSNs were identified as the gatekeepers to advertise our intervention to patients who may be at risk of poor mental wellbeing due to their long-term condition.
Hardware Choice
In this study the VR environment is viewed, heard and interacted with via the Oculus Rift head-mounted display (HMD). The Rift comes with two Oculus Touch controllers and tracks users' movements with six degrees of freedom (6DOF) via USB-connected sensors that feed positional data to a PC. The HMD itself has a display resolution of 1080 × 1200 per eye, a 90 Hz refresh rate and a field of view of 110°. At the time of choosing which HMD to use, the Oculus Rift was the most used HMD, used by 46.39% of VR users on Steam (the largest digital distributor for PC gaming) [17]. We chose computer-based VR over mobile VR despite the extra costs associated with this type of VR, reasoning that in a fast-developing field, what is currently the norm for PC-based VR will in the future be implemented in mobile VR.
In our study, the Oculus Rift was connected to a desktop computer via HDMI and USB. The specifications of the computer used were: a Windows PC with an Intel i7-7700K processor, an Nvidia GeForce GTX 1080 graphics processing unit with 8 GB of video memory, 16 GB of DDR4 RAM and a 500 GB solid-state drive.
Software Choice
At the initial research design stage the research team intended to seek external software development support owing to time constraints. We received demos from a couple of small VR companies that specialised in building VR environments for simulation and other educational purposes. Numerous virtual health-related studies have been carried out on Second Life, a desktop-based (non-VR) virtual social space [18]. We trialled the follow-up to Second Life, a VR-compatible program called Sansar [19]. The downsides to all of these options were a general lack of asset options for building a customised environment (3D models (static meshes), texture packs, animations, foliage, sound effects, special effects, heads-up displays); the fees the smaller companies would charge; and our sense that these programs neither utilised the full graphical power our computer was capable of nor supported what we wanted to achieve with the concept of presence. Based on feedback from focus groups, we wanted enough asset choice to iteratively create the audio and visual components required for a relaxing and comfortable environment. We aimed to make patients temporarily forget about the hospital setting they were in by creating such an environment with a high level of virtual presence.
Pre-made apps available on both the Oculus and Steam stores (NatureTreksVR [20] and Guided Meditation VR [21]) were also tested by the research team and in focus groups with public health staff, psychologists and elderly citizens, which were undertaken to test some of our assumptions around VR. These assumptions concerned age as an indicator of the acceptability of VR, and people's anxiety about trying VR for the first time. One of our key findings from the focus groups in which we ran VR apps was that user preferences were highly personal. We learned that using a pre-made app with existing pre-determined user options would not give us the flexibility and element of personalisation we wished to offer potential participants.
We then turned our attention to larger, well-known game development software, namely Unity [22] and Unreal Engine (UE) [23]. Both are commonly used, and both have communities that offer starter assets, forums to help new developers and a wide range of paid and free tutorials. A sufficient period of time was spent on familiarisation with both pieces of software. Unity uses the coding language C# and offers a visual scripting service for those unfamiliar with coding. UE, by contrast, uses C++, which some argue is a more difficult language to learn [24] [25]; however, games can also be made using UE's Blueprint visual scripting system. For a social scientist with very limited coding experience, Blueprints were quicker to understand and offered a more visual learning process. Moreover, it became apparent that it was easier to make a visually pleasing environment with UE. A generous number of free assets and pre-made VR blueprints were available, ready to be placed into any newly created environment, supported by a large online community. We therefore decided that UE gave us the best chance of creating, in an iterative manner, what the current literature suggested was most suitable for improving mental wellbeing outcomes and what we set out to achieve.
Learning the Software
As is often the case in software development, there can be multiple routes to the same goal: a section of code can be written in different forms but ultimately serve the same purpose. The same applies to the Blueprint method within UE. In social research there is an emphasis on the reliability of the methodology, put simply, whether the results of a study can be reproduced under similar conditions [26]. Whilst we are not assessing the reliability of a research instrument, we are presenting a method of design which should be replicable for others with the same level of experience. The learning of the software coincided with the wider research process, which is illustrated in diagram 1. A working beta ready for pilot testing was developed over a six-month timeframe, and multiple environments were made during this period.

Initial training came from an online tutorial by Online Media Tutor [27]. At 4 hours and 26 minutes long, the tutorial guided viewers through the tools and interface: starting a project, importing static meshes, creating landscapes, importing and applying textures, creating materials for static meshes, using foliage tools to apply grass and trees, and creating lights and post-processing effects. The uploader supplied the assets and textures needed to complete the tutorial. This tutorial acted as the foundation and structure for learning the engine and was worked through multiple times by the researcher: first to complete the tutorial exactly as instructed, and thereafter piloting new approaches with every following attempt, including introducing new static meshes, new foliage and new textures and materials (a list of the resources is provided in Appendix 1). The repetition of the tutorial was essential for being able to navigate the user interface with confidence and without instruction. By the end of the process we had made our own virtual environment consisting of a woodland area and a beach area.

To further improve the visual style of the environment we revisited the lighting and visual effects. Some aspects of the original tutorial video were helpful, particularly the post-processing effects, but as lighting is such an important aspect of both everyday life and video games, we looked for further guidance. An educational online video from the UE website, 'Volumetric Fog and Lighting in Unreal Engine 4 | GDC 2018' [28], provided insight and examples in the configuration and understanding of volumetric fog, volume materials, particle systems, and advanced tricks with volume materials.
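All of these lighting and fog settings were configured through the UE editor rather than in code, but the same adjustments can also be expressed in UE4 C++. The sketch below is illustrative only: the helper function and the chosen values are placeholders of our own and not part of our build. It shows, under those assumptions, how volumetric fog might be enabled and tuned on an exponential height fog actor.

```cpp
// Illustrative sketch only: enabling volumetric fog on the level's exponential
// height fog actor from code. In our project these settings were made in the
// UE editor; the function name and values here are placeholders.
#include "CoreMinimal.h"
#include "EngineUtils.h"
#include "Engine/World.h"
#include "Engine/ExponentialHeightFog.h"
#include "Components/ExponentialHeightFogComponent.h"

static void EnableVolumetricFog(UWorld* World)
{
    // Find the first exponential height fog actor placed in the level.
    for (TActorIterator<AExponentialHeightFog> It(World); It; ++It)
    {
        UExponentialHeightFogComponent* Fog =
            It->FindComponentByClass<UExponentialHeightFogComponent>();
        if (Fog)
        {
            Fog->SetVolumetricFog(true);  // switch on the volumetric fog pass
            Fog->SetFogDensity(0.03f);    // a light haze suited to a calm outdoor scene
        }
        break; // only one fog actor is needed
    }
}
```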
Navigation was implemented via the teleportation technique that comes as standard with the UE VR setup [29]. Users hold down the thumbstick on the Oculus Touch controller to bring up a teleporter, which they then direct at the area they wish to move to; upon releasing the thumbstick they are moved to that area. At the end of the teleporter is a pointer: users angle the thumbstick while holding it down to dictate which direction the pointer faces, and this in turn dictates which direction the user faces when the teleportation is complete. The teleporter in use can be seen in the image below.
We found a tutorial [30] that changed the navigation to a simpler method in which users only push the thumbstick in the direction they wish to move, but we chose to stick with teleportation as we believed it would be less likely to induce motion sickness.
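For readers unfamiliar with the UE VR template, the teleportation behaviour described above can be summarised in UE4 C++ roughly as follows. This is a deliberately simplified sketch, not our implementation: the template itself uses a Blueprint graph and a projectile arc for the pointer, whereas the illustrative pawn class below (AVRPawn, with placeholder members such as TeleportHand) uses a straight line trace.

```cpp
// Simplified sketch of thumbstick teleportation in the spirit of the UE4 VR
// template. AVRPawn, TeleportHand and the input bindings are illustrative.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"
#include "VRPawn.generated.h"

UCLASS()
class AVRPawn : public APawn
{
    GENERATED_BODY()

public:
    // Motion controller the teleport pointer is cast from (set up in the constructor/editor).
    UPROPERTY(VisibleAnywhere)
    UMotionControllerComponent* TeleportHand = nullptr;

    void BeginTeleportAim() { bAiming = true; }   // bound to "thumbstick pressed"

    void UpdateTeleportAim()                      // called each Tick while aiming
    {
        if (!bAiming || !TeleportHand) { return; }

        // Trace from the controller to find the floor position the user points at.
        const FVector Start = TeleportHand->GetComponentLocation();
        const FVector End   = Start + TeleportHand->GetForwardVector() * 1000.f;
        FHitResult Hit;
        bHasTarget = GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility);
        if (bHasTarget) { Target = Hit.Location; }
    }

    void EndTeleportAim()                         // bound to "thumbstick released"
    {
        bAiming = false;
        if (bHasTarget)
        {
            // The final facing direction would normally come from the thumbstick angle.
            TeleportTo(Target, GetActorRotation());
            bHasTarget = false;
        }
    }

private:
    bool    bAiming    = false;
    bool    bHasTarget = false;
    FVector Target     = FVector::ZeroVector;
};
```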
Part of our intervention involved the use of audio files: a 'music corner' where users could play their favourite music, and a mindfulness beach which played guided meditation audio clips. Permission to use the meditation clips, which are based around meditation on the breath, was given by psychologists at a local psychiatric hospital, who use them as part of mindfulness courses. We wanted to keep the controller inputs as simple as possible. Trigger boxes had already been used to trigger events, following a tutorial on the implementation of web browsers within UE [31], and we wanted to repeat this approach for audio in a similar way. After some brainstorming with Blueprints we found a way to play music without using a controller input button, but we were unable to stop the music. We found the answer on the 'AnswerHub' section of the UE website, in a question posted by another user [32].
The FlipFlop Blueprint node acted as the stop-start switch. In the example above, the user toggles the audio by pressing the 'F' key. In our version, which does not use controller input, the user walks into the trigger or puts their virtual hand into the trigger area with a waving motion to play the song or guided meditation; the process is repeated to stop the audio.
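Our version was built entirely with Blueprint nodes (a box trigger feeding a FlipFlop into the play and stop calls of an audio component). For readers more comfortable with code, the equivalent logic could be sketched in UE4 C++ as below; the class name AAudioTriggerZone and its defaults are illustrative only.

```cpp
// Sketch of the Blueprint logic described above: a trigger volume that toggles an
// audio clip each time the user's pawn or virtual hand overlaps it, mirroring the
// FlipFlop node. AAudioTriggerZone and the defaults are illustrative.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "Components/AudioComponent.h"
#include "AudioTriggerZone.generated.h"

UCLASS()
class AAudioTriggerZone : public AActor
{
    GENERATED_BODY()

public:
    AAudioTriggerZone()
    {
        // Invisible box that detects the user's pawn or tracked hand.
        Trigger = CreateDefaultSubobject<UBoxComponent>(TEXT("Trigger"));
        RootComponent = Trigger;

        // The guided meditation or music clip, assigned in the editor.
        Audio = CreateDefaultSubobject<UAudioComponent>(TEXT("Audio"));
        Audio->SetupAttachment(RootComponent);
        Audio->bAutoActivate = false; // wait for the trigger rather than playing on load

        Trigger->OnComponentBeginOverlap.AddDynamic(this, &AAudioTriggerZone::OnOverlap);
    }

protected:
    UFUNCTION()
    void OnOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                   UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                   bool bFromSweep, const FHitResult& SweepResult)
    {
        // FlipFlop behaviour: the first overlap plays the clip, the next stops it.
        if (Audio->IsPlaying())
        {
            Audio->Stop();
        }
        else
        {
            Audio->Play();
        }
    }

    UPROPERTY(VisibleAnywhere) UBoxComponent* Trigger;
    UPROPERTY(VisibleAnywhere) UAudioComponent* Audio;
};
```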
The AnswerHub and other forums, including the UE subreddit [33], provided useful information that supported our development and helped resolve problems. The importance of community resources and support cannot be overstated.
Finally, early feedback from stakeholders drew attention to the unrealistic, 'robotic' appearance of the default VR hands provided in the UE VR blueprint. To address this, we purchased static meshes for human hands from the UE Marketplace and then followed an online tutorial [34] that set the meshes up with the same functionality as the original UE VR blueprint provided.
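This change was also made through the Blueprint editor, but the underlying idea, attaching the purchased hand meshes to the tracked motion controllers so that the visible hands follow the user's real hands, can be sketched in UE4 C++ as follows. The class name, component names and asset path below are placeholders rather than details of our project.

```cpp
// Sketch of replacing the default robotic hands: a hand mesh is attached to each
// motion controller so it follows the tracked controller. Names and the asset
// path are placeholders; our change was made via the Blueprint editor.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"
#include "Components/SceneComponent.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/StaticMesh.h"
#include "UObject/ConstructorHelpers.h"
#include "VRHandsPawn.generated.h"

UCLASS()
class AVRHandsPawn : public APawn
{
    GENERATED_BODY()

public:
    AVRHandsPawn()
    {
        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

        // Tracked left controller, with the purchased hand mesh attached to it.
        LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
        LeftController->SetupAttachment(RootComponent);
        LeftController->MotionSource = FName("Left"); // track the left Touch controller

        LeftHand = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("LeftHand"));
        LeftHand->SetupAttachment(LeftController);

        // Placeholder asset path for the Marketplace hand mesh.
        static ConstructorHelpers::FObjectFinder<UStaticMesh> HandMesh(
            TEXT("/Game/Hands/SM_LeftHand.SM_LeftHand"));
        if (HandMesh.Succeeded())
        {
            LeftHand->SetStaticMesh(HandMesh.Object);
        }
        // The right hand is set up in the same way.
    }

private:
    UPROPERTY(VisibleAnywhere) UMotionControllerComponent* LeftController;
    UPROPERTY(VisibleAnywhere) UStaticMeshComponent* LeftHand;
};
```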