Designing and Implementing a VR Mental Wellbeing Experience using Digital Creative Practice.

Abstract

Background
An increasing body of evidence points to virtual reality's (VR) potential as a clinical and therapeutic tool for mental wellbeing treatment. The design and creation of VR software requires a skillset vastly different from that required to evaluate its effect on mental wellbeing outcomes. In this article we present a VR experience designed to improve the mental wellbeing of patients in secondary care.

Results
The resulting piece of software is a VR intervention that combines nature and mindfulness, allowing patients with diabetes to spend time in a calming environment before regular clinic appointments with diabetes specialist nurses. Some initial insights from patient feedback are provided.

Conclusion
This research demonstrates that engagement with creative practice allows researchers to identify research problems and solve them creatively. It illustrates how someone with no prior experience of software design is able to create and implement a VR experience, used in a real-world setting, targeted at improving patient wellbeing. By documenting this process, we hope to encourage more social researchers to create digital tools aimed at tackling health issues and to build a base of knowledge that bridges the gap between the computer and social sciences.

Background
With VR now an increasingly accessible consumer technology, its potential uses for healthcare are vast. Researchers can acquire current high-end VR headsets for research purposes at an ever-decreasing cost. Apps that claim therapeutic benefits can be purchased from online stores, as can social apps that may offer patient-therapist interaction. However, many of these apps may not meet a patient's needs, and it is often more appropriate and clinically useful to design an experience from the ground up with clear and targeted clinical outcomes in mind. VR hardware requires little technical training to set up and run; the software to create and implement VR experiences, however, requires considerably more practice and know-how to use effectively. Research into the effects of VR on health outcomes is often carried out by social researchers/scientists; reflexivity is an important part of a social researcher's work, that is, the capacity to understand their own influence on the person or phenomena being studied [1].
The aim of this paper is to consider our personal influence, and the input of patients, on the design and development of a VR tool to improve mental health. We present a VR experience that was created by a social researcher with support from a wider research team. We discuss and justify the choices we made in the process of creating the VR experience and consider their impact on the overall intervention.

Creative Practice
This piece of research uses creative practice to further the contribution of knowledge in the fields of digital health and mental wellbeing. To understand what we mean by creative practice we must first acknowledge practice-led research, which encompasses identifying problems and research questions, followed by a focus on creative practice driven by method, context and output [2].
The creative arts have used practice-led research as a form of research that generates data, insights and knowledge through the creation of art [2]. In practice-led research, the creative artefact acts as the basis for the contribution of knowledge [3]. As we are designing the intervention, there is a process in which we move from our research question/problem to the output of a VR programme. In this piece of research the VR intervention is the creative artefact. By being reflective and documenting the journey we can generate critical discussion that informs existing knowledge. These insights can then be documented, generalised and used to develop new theories or inform established ones.

Clinical outcomes and intervention design
We aimed to use VR to improve mental wellbeing outcomes. A critical review was carried out in order to understand potential gaps in the literature and to guide the study design [4]. The majority of identified literature targeted anxiety, and there was little research on the use of VR to address depression, stress and general wellbeing. Stress reduction has been demonstrated in previous VR studies [5], and its connection to general wellbeing [6] [7] provides further justification for the inclusion of wellbeing as an outcome. Finally, we considered it useful to measure the level of 'presence' felt by patients in the VR environment. In the context of a virtual environment, presence is the degree to which the user feels situated in that environment rather than in the real world [8]. For example, a user in a virtual environment whose hand movements are visible and accurately tracked should feel a greater level of presence than someone whose hands are not tracked in the environment. Whilst it is now widely accepted that there is an association between presence and fear in VR, its wider implications for the efficacy of clinical outcomes are still contested [9] [10]. We therefore also aimed to assess whether presence has an effect on the efficacy of VR interventions designed for health and wellbeing.
An appropriate context for the use of VR to reduce anxiety and improve mental wellbeing is secondary care. Attending hospital is known to be stressful, and sitting in a waiting room can increase anxiety in a patient awaiting an examination or clinical procedure [11] [12]. Moreover, the time a patient spends waiting for an appointment offers researchers a chance to implement an intervention at an optimal moment without requiring the patient to make a separate visit to the health centre/research location. For a pilot study, we looked to recruit patients with a chronic condition associated with poor mental wellbeing, such as diabetes. Three in five people with diabetes report low mood due to living with their condition [13], with anxiety and depression prevalent and emotional wellbeing affected [14].
Mindfulness-based interventions have shown promising results for emotional wellbeing amongst people with diabetes, as well as aiding glycaemic control [15] [16]. With this in mind we aimed to add a mindfulness component to our VR experience. Initially we planned to aim the intervention at young people, as feedback from diabetes specialist nurses (DSNs) indicated that this age group often has the least experience in dealing with a chronic condition and is often in need of the greatest support. However, due to the limited demographic numbers in our rural research area (Scottish Highlands and Islands) and positive feedback on VR exposure from local elderly citizens, we decided to widen the inclusion criteria to all age groups. The diabetes centre operated by the National Health Service (NHS) allows DSNs to assist patients who are struggling to control their condition. DSNs were identified as the gatekeepers to advertise our intervention to patients who may be at risk of poor mental wellbeing due to their long-term condition.

Hardware Choice
In this study the VR environment is viewed, heard and interacted with via the Oculus Rift head-mounted display (HMD). The Rift comes with two touch-sensitive controllers and tracks users' movements with six degrees of freedom (6DOF) via USB-connected sensors feeding into a PC. The HMD itself has a display resolution of 1080 × 1200 per eye, a 90 Hz refresh rate and a field of view of 110°. At the time of choosing which HMD to use, the Oculus Rift was the most used HMD (used by 46.39% of VR users on Steam, the largest digital distributor for PC gaming) [17]. We chose computer-based VR over mobile VR despite the extra costs associated with this type of VR, the reason being that, in a fast-developing field, what is the norm for PC-based VR today will in the future be implemented in mobile VR.
In our study, the Oculus Rift was connected to a desktop computer via HDMI and USB. The specifications of the computer used were: a Windows-operated Intel Core i7-7700K processor, an 8 GB Nvidia GeForce GTX 1080 graphics processing unit, 16 GB of DDR4 RAM and a 500 GB solid state drive.

Software Choice
At the initial research design stage the research team intended to seek external software development support owing to time constraints. We received demos from a couple of small VR companies that specialised in building VR environments for simulation and other educational purposes. Numerous virtual health-related studies have been carried out in Second Life, a screen-based virtual social space [18]. We trialled its VR-compatible successor, Sansar [19]. The downsides to using any of these were the general lack of asset options for building a customised environment (3D models (static meshes), texture packs, animations, foliage, sound effects, special effects, heads-up displays); the fees the smaller companies would charge; and our sense that these programs neither utilised the full graphical power our computer was capable of nor supported what we wanted to achieve with the concept of presence. Based on feedback from focus groups, we wanted enough asset choice to iteratively create the audio and visual components required for a relaxing and comfortable environment. We aimed to make patients temporarily forget about the hospital setting they were in by creating such an environment with a high level of virtual presence.

Pre-made apps (NatureTreksVR [20] and Guided Meditation VR [21]) available on both the Oculus and Steam stores were also tested by the research team and by focus groups with public health staff, psychologists and elderly citizens, which were undertaken to test some of our assumptions around VR. These assumptions concerned age as an indicator of the acceptability of VR, and people's anxiety about trying VR for the first time. One of our key findings from the focus groups we ran with VR apps was that users' preferences were very personal. We learned that using a pre-made app with pre-determined user options would not give us the flexibility and personalisation we wished to offer potential participants.
We then turned our attention to larger, well-known game development software programs, namely Unity [22] and Unreal Engine (UE) [23]. Both are commonly used, and both have communities that offer starter assets, forums to help new developers and a wide range of paid and free tutorials. A sufficient period of time was spent on familiarisation with both pieces of software. Unity uses the coding language C# and offers a visual scripting service for those unfamiliar with coding. Alternatively, UE uses C++, which some argue is a slightly harder-to-learn version of the C language [24] [25]; however, games can also be made using UE's blueprint system. For a social scientist with very limited coding experience, the blueprints were quicker to understand and offered a more visual learning process. Moreover, it became apparent that it was easier to make a more visually pleasing environment with UE. A generous number of free assets and pre-made VR blueprints were available, ready to be placed into any newly created environment, supported by a large online community. We decided that UE gave us the best chance of iteratively creating what the current literature suggested was most suitable for improved mental wellbeing outcomes and what we set out to achieve. The aspects of learning the UE software, including environment building, lighting, post-process effects, and inserting audio meditation and participant-chosen music, are described below.
Learning the Software
As is often the case with software development, there can be multiple routes to the same goal: a section of code can be written in different forms but ultimately serve the same purpose. The same applies to the blueprint method within UE. In social research there is an emphasis on the reliability of the methodology; put simply, whether the results of the study can be reproduced under similar conditions [26]. Whilst we are not assessing the reliability of a research instrument, we are presenting a method of design, which should be replicable by others with the same level of experience. The learning of the software coincided with the wider research process, which is illustrated in diagram 1. A working beta ready for pilot testing was developed over a six-month timeframe, and multiple environments were made during this period. Initial training came from an online tutorial by Online Media Tutor [27]. At 4 hours and 26 minutes long, the tutorial guided viewers through the tools and interface, starting up a project, importing static meshes, creating landscapes, importing and applying textures, creating materials for static meshes, using foliage tools to apply grass and trees, creating and using lights, and post-processing effects. The uploader supplied assets and textures to accompany the video. This tutorial acted as the foundation and structure for learning the engine and was viewed multiple times by the researcher: first to complete the tutorial exactly as instructed, and thereafter piloting new approaches with every following attempt, including introducing new static meshes, new foliage, and new textures and materials. (A list of the resources is provided in appendix 1.) The repetition of the tutorial was essential for being able to navigate the user interface with confidence and without instruction. By the end of the process we had made our own virtual environment that consisted of a woodland area and a beach area.
To further improve the visual style of the environment we revisited the lighting and visual effects. Some aspects of the original tutorial video were helpful, particularly the post-processing effects, but as lighting is such an important aspect of everyday life and of video games, we looked for further guidance. An educational online video from the UE page, 'Volumetric Fog and Lighting in Unreal Engine 4 | GDC 2018' [28], provided insight and examples in the configuration and understanding of volumetric fog, volume materials, particle systems, and advanced tricks with volume materials.
Navigation was implemented via a teleportation technique that came as standard with the UE VR setup [29]. Users hold down the thumbstick on the Oculus Rift Touch controller to bring up a teleporter, which they then direct at the area they wish to move to. Upon releasing the thumbstick they move to that area. At the end of the teleporter is a pointer; users angle the thumbstick while holding it down to dictate which direction the pointer faces, which in turn dictates the direction the user faces when the teleportation is complete. The teleporter being used can be seen in the image below.
We found a tutorial [30] that changed the navigation to a simpler method in which users were only required to push the thumbstick in the direction they wished to move. We chose to stick with teleportation, as we believed it would be less likely to invoke motion sickness.
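As a rough illustration of how the teleporter's pointer direction can be derived from the thumbstick, the mapping amounts to converting the stick's 2D deflection into a yaw angle. The function name and axis convention below are our own illustrative choices, not part of UE's VR blueprint:

```cpp
#include <cmath>

// Illustrative sketch: map a 2D thumbstick deflection to the yaw,
// in degrees, that the user will face after teleporting.
// Pushing the stick straight up (0, 1) keeps the current facing
// (0 degrees); pushing it right (1, 0) turns the user 90 degrees.
double thumbstickToYawDegrees(double x, double y) {
    const double kPi = 3.14159265358979323846;
    return std::atan2(x, y) * 180.0 / kPi;
}
```

UE's blueprint performs the equivalent calculation with its own vector nodes; the point is simply that the stick's angle, not its magnitude, sets the post-teleport facing.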
Part of our intervention involved the use of audio files: a 'music corner' where users can play their favourite music, and a mindfulness beach which played guided meditation audio clips. We were given permission to use these meditation clips by psychologists at a local psychiatric hospital, who used them as part of mindfulness courses; they were based around meditation of the breath. We wanted to keep the controller inputs as simple as possible. We had used trigger boxes to trigger events, following a tutorial on the implementation of web browsers within UE [31], and we wanted to do the same for audio. After some brainstorming with blueprints we found a way to play music without using a controller input button, but we were unable to stop the music. We found the answer on the 'AnswerHub' section of UE's website, in a question from another user [32].
The flipflop blueprint acted as the stop-start switch. In the example above, the user does this by pressing the 'F' key. In our version, which does not use controller input, the user walks into the trigger, or puts their virtual hand into the trigger area via a waving motion, to play the song or guided meditation. The same action is repeated to stop the audio.
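The flipflop logic can be sketched in plain C++ as follows; this is a minimal model of the behaviour, not UE code, and the class and method names are our own:

```cpp
// Minimal model of the FlipFlop blueprint node's behaviour: each
// trigger alternates between two outputs (here, starting and
// stopping audio playback, tracked as a flag).
class AudioToggle {
public:
    // Fired when the user's hand or body enters the trigger box.
    // Returns true if audio is now playing, false if stopped.
    bool onTriggerEnter() {
        playing = !playing;  // FlipFlop: output A, then B, then A...
        return playing;
    }
    bool isPlaying() const { return playing; }
private:
    bool playing = false;  // audio starts stopped
};
```

The first overlap starts playback, the next stops it, and so on, which is exactly the wave-to-toggle interaction described above.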
The AnswerHub and other forums, including the UE subreddit, provided useful information [33] that aided our development and helped resolve problems. The importance of community resources and support cannot be overstated.
Finally, the default VR hands provided in the UE VR blueprint appeared robotic; early feedback from stakeholders drew attention to this. To address it, we purchased static meshes for human hands from the UE marketplace, then followed an online tutorial [34] that set up the meshes with the same functionality the original UE VR blueprint provided.

Implementation, Ethics and Training
After giving written and verbal consent to take part in the study, all participants were given a content form to complete and return before their first session. This allowed participants to request specific music and provide us with personal pictures to personalise their experience. In the guidance document provided, we outlined the type of songs and pictures we recommended patients choose. For example, when considering what pictures to choose, we asked that participants think about something or someone inspirational to them, a picture of a proud achievement, a hobby or a fond place, as well as pictures of loved ones, family or friends. This was designed in line with positive psychology to avoid distress for participants and to increase the relaxing effect of the VR environment.
We also offered a tutorial session and a handbook to provide participants with help should they need it; a detailed copy of the handbook was kept next to the VR setup, and a shorter version was available for participants to take home with them. The handbook provided the following information: an aerial-view map that highlighted the key areas (music, pictures and meditation), the starting point when entering the virtual environment, and landmarks to help with navigation; a controller guide that explained how to navigate the environment and the role of the Oculus home button, which allowed participants to return to the intervention; and, finally, a resources page with links to the meditation clips used in the intervention as well as sources of mental health advice.

Stakeholder and Patient input
Input into the design of the VR intervention was split between stakeholders with specific professional expertise and volunteering patients. Focus groups with stakeholders identified secondary care (diabetes) and nature as the centre of the intervention. Design changes and user improvements came from pilot testing and from the initial implementation of the experience in a diabetes clinic; input received from these latter two phases will now be discussed further.

Pilot Testing
The aim of pilot testing was to improve the overall usability and acceptability of the intervention, using the two-approach model set out by Nigel Bevan [35]: the 'bottom-up' approach, which simply identifies ease of use (time to learn the software and technical glitches), and the 'top-down' approach, in which usability is interpreted as the ability to use the software for its intended purpose. This phase was conducted with healthcare and research staff from different backgrounds (psychology, neurology, diabetes and immunology) and PhD students from the University of the Highlands and Islands. No diabetes patients were involved at this stage, to avoid weakening our recruitment numbers in the next phase, initial implementation in the clinic.
The first major usability problem noted was the difficulty, and the time taken, in learning how to use the software. Up to this point we had used a handbook containing instructions on how to use the Oculus Touch controllers, via a diagram, and written instructions on how the teleportation worked. Without seeing the teleportation in action it was hard for users to visualise it before they tried it; verbal and written instruction alone was not sufficient. To overcome this, a video was created which displayed, simultaneously, someone using the touch controllers and the corresponding result in the virtual environment. It was decided that the two videos, which focused on basic navigation and triggering audio, would be incorporated into the tutorial session for patients.
Some users found it difficult to find the mindfulness area of the beach; as this was a vital part of the intervention, it was important to make it easier to locate. For the pilot testing the audio interaction circles were placed at ground level, on the sand. To make them easier to find they were raised on top of some brickwork with a stair entrance, as shown below.
To further improve navigation and direction, clearer signposting was used to mark key areas of the map. This was done via signs styled after UK road traffic signs, a form of signage familiar to potential users.
Two technical issues became apparent. The first was distant trees flickering: trees had their level of detail reduced when far away to optimise computer performance, and the flickering caused by the changing level of detail proved distracting, as reported by multiple participants. As a result, all trees were set to keep the same level of detail regardless of their distance from the user. Secondly, some users were teleporting inside rock meshes and getting stuck inside them; this also happened with a cabin in the environment that had been placed purely for decoration. Changes to navigation bounds prevented users from getting stuck inside the cabin, and some problematic rocks were removed.
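The level-of-detail (LOD) behaviour behind the tree flicker, and the fix we applied, can be sketched as a simple distance-based selection; the thresholds and names below are illustrative, not UE defaults:

```cpp
// Sketch of distance-based LOD selection and our fix: pinning
// trees to full detail (LOD 0) so the mesh never swaps at a
// distance threshold, which is what caused the visible flicker.
// Lower LOD index = more detailed mesh.
int treeLodForDistance(double metres, bool pinFullDetail) {
    if (pinFullDetail) return 0;   // our fix: always full detail
    if (metres < 30.0) return 0;   // near: full detail
    if (metres < 80.0) return 1;   // mid: reduced detail
    return 2;                      // far: lowest detail
}
```

Pinning every tree to LOD 0 trades some rendering performance for visual stability, a trade-off our desktop hardware could absorb.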
Responses to the meditation were positive, with users saying they felt the clips were being used appropriately in this context. Conflicting views arose about the content of the meditation clips: one user felt that a meditation focused on the breath, as used, was less appropriate for a relaxing VR environment, and that a meditation utilising the views and scenery of the environment might be more suitable. However, the diabetes staff noted that the mindfulness used was a good method of pulling patients back to reality and out of the headspace many often live in when thinking about their condition. No changes to the meditation were made following the pilot testing, although some users stated they preferred to listen to the sound of the beach instead. To offer a further alternative between the sounds of the environment and the meditation, a relaxing instrumental audio clip was added to the meditation area of the beach.
Following a period of pilot testing, the study progressed to the initial implementation of the intervention within a diabetes clinic with real patients. Patients were recruited through diabetes specialist nurses.
The next section describes feedback from the first eight patients to have completed the intervention.
Each patient was asked to complete a tutorial session followed by a minimum of three further sessions before the patient's next regular diabetes consultation. A session consisted of 20 minutes of VR exposure and 10 minutes for questionnaires. As a result, each patient would have had a minimum of one hour and twenty minutes of VR exposure. The feedback was collected through the researcher's observations as the patient used the equipment and through discussions with the patient after each session.

Initial feedback from Diabetes Patients
In its current form the intervention is what would be described in video game jargon as 'open world', even if on a small scale, as patients are able to explore any part of the environment at any time. Patients can interact with any of the audio clips as they choose, or not at all. For some users this was ideal: they made use of this freedom and quickly identified their preferences. For others, the freedom was at times confusing. Some patients asked for instructions on what to do next after listening to a meditation clip; questions such as "do you want me to move around now?" and "what do I do next?" were asked in some of the earlier sessions by certain users. Some patients may have benefited more from a linear experience with a clearer progression path, or even a more passive experience. However, in the later sessions these patients tended to become more accustomed to the open-world setting.
The videos used as part of the tutorial were beneficial for patients; less verbal explanation was now needed, and the lead researcher could spend more time assessing how well the patient was learning the controls. Some slight difficulties continued with using the pointer of the teleporter (by angling the thumbstick) to dictate which direction the patient faced after teleporting. All patients used a swivel chair with 360-degree rotation. This proved a good addition to the setup: a number of patients used the chair's rotation to quickly change direction in the virtual environment, which lessened the pressure of perfecting the pointer on the teleporter, as they could change direction without using it. There was an even split between users who remained completely still on the chair and those who used the chair to navigate throughout the sessions. All patients chose to navigate with one controller, using only their stronger hand; the second Oculus Touch controller was not used by any of the patients. The teleportation method of navigating may have been more difficult to learn for patients compared with simply pushing the thumbstick in the direction the patient wanted to move.
However, we chose teleportation as we believed it to be the most comfortable form of navigation in terms of motion sickness; this decision was justified, as none of the users in either the pilot study or the exploratory study in the diabetes clinic reported any symptoms of motion sickness.
Some patients initially struggled to tell whether audio was playing, and would subsequently make multiple attempts at triggering it; in some cases the audio was being stopped by the patient because they were not aware it had been triggered successfully. A lighting system (red for off, green for on) was implemented so that patients could easily identify whether they had triggered the audio, as shown below. As we did not know how to trigger two events at once (audio and colour change), we put up a thread on the Unreal Engine subreddit asking for assistance. Within an hour we had been given instructions on how to trigger two events at once using blueprints, and we were then able to work out the rest of the blueprint arrangement based on our existing knowledge.
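The idea of wiring two events to one trigger can be sketched as follows; this is a plain C++ model of the behaviour, with illustrative names, not the actual blueprint:

```cpp
#include <string>

// Sketch of firing two events from one overlap trigger: the same
// flip-flop-style toggle drives both the audio state and the
// indicator light colour (red = off, green = on).
struct MeditationTrigger {
    bool playing = false;
    std::string lightColour = "red";

    // Fired once per overlap with the trigger volume.
    void onOverlap() {
        playing = !playing;                       // event 1: toggle audio
        lightColour = playing ? "green" : "red";  // event 2: update light
    }
};
```

Keeping both events downstream of the same toggle guarantees the light can never disagree with the audio state, which was the usability problem being solved.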
This was the only technical change made to the intervention after recruitment of patients had started.
Further patient feedback and observations indicated a clear preference for the beach area over the woodland area. All patients chose to stand on the beach and look out towards the water when listening to the meditation or relaxing music. Nearly all patients remained static while listening to the audio, putting their full attention towards it rather than navigating the environment. There was no clear preference between the meditation, the relaxing music and the environment sounds: some users reported deep relaxation with the meditation, while others commented that they preferred to listen to the sound of the waves; the freedom to choose was beneficial for accommodating individual preferences. The content form provided to patients, allowing them to personalise the environment with their own music or pictures, was not used by any of the patients. The freedom that the open-world setting provided seemed to offer enough personal choice without the need for further options.

Technical limitations of our intervention
A VR environment can be refined indefinitely. However, with a finished working version of the experience we can reflect on its current technical limitations. Firstly, the functionality of the Touch controllers has not been fully utilised. Each Oculus Touch controller has two triggers, one for the index finger and one for the middle finger, and the face of the controller has capacitive sensors for detecting the user's thumb. This allows the controller to track five types of hand movement: pointing at something via the index trigger; giving a 'thumbs up' gesture via the surface sensor; a grabbing motion via the secondary trigger; a clenched hand if all of these are in use; and an open hand if none are triggered. However, these are not all programmed into our inputs; we have only one grabbing motion, via the index trigger. This could be seen as a missed opportunity in our pursuit of presence. However, the hand tracking of the Oculus Rift is implemented well enough for this not to be a serious issue; moreover, patients choosing to use only one controller/hand meant creating the illusion of both hands was no longer as important as we had anticipated.
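The five hand poses described above can be sketched as a mapping from the controller's three inputs; the enum and function names are our own illustrative choices, and in our build only the grab pose was actually wired up:

```cpp
// Illustrative mapping from the Touch controller's inputs to the
// five hand poses: index trigger -> point, thumb sensor -> thumbs
// up, grip trigger -> grab, all three -> clenched fist, none ->
// open hand. Single-input cases are checked after the all-input case.
enum class HandPose { Open, Point, ThumbsUp, Grab, Clench };

HandPose poseFor(bool indexTrigger, bool thumbOnSensor, bool gripTrigger) {
    if (indexTrigger && thumbOnSensor && gripTrigger)
        return HandPose::Clench;               // everything held: fist
    if (indexTrigger)  return HandPose::Point;
    if (gripTrigger)   return HandPose::Grab;
    if (thumbOnSensor) return HandPose::ThumbsUp;
    return HandPose::Open;                     // nothing held: open hand
}
```

A fuller implementation would blend these poses rather than switching between them, but even this discrete mapping conveys what the unused inputs could have added.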
To ensure a consistent frame rate, avoiding movement judder and screen tearing, we chose to turn off the casting of shadows by some of the foliage. It is possible, when building the lighting (which results in the correct placement of shadows), to include every piece of grass in that process, as visible in the example below.
A number of smaller flowers and plants were also left without shadows; however, the added effect of tree shadows makes this less noticeable. Finally, we added two deer to the woodland area of the environment. Animals appearing and acting as they would in the real world was thought to add to the overall level of presence we were aiming to achieve. Their movements are minimal, limited to switching between static and eating animations; in future versions of the environment these and more animals would roam, within boundaries, parts of the environment. Users are unable to get close to the deer, which mitigates some of the limitations of the deer animations. As the study progresses we will be able to look further into patient preferences in a virtual environment, and we hope to publish the full results of this study soon. Future work may incorporate similar versions of the software on mobile HMDs such as the Oculus Quest. We used a diabetes clinic as the setting for this research study, but the software has been designed so that it is applicable to various secondary care settings and is not limited to one specific condition. In the future we hope to carry out a randomised controlled study using this software, as well as expanding into further secondary care settings.

Conclusions
We have demonstrated that it is possible, without previous experience, to learn to develop a virtual environment aimed at improving mental wellbeing outcomes using game engine software and a VR HMD. By engaging in creative practice we were able to identify a research problem via stakeholder discussion and patient input, design and create a digital solution from the bottom up, and simultaneously liaise with stakeholders and end users to iteratively produce a VR intervention that tackles the problem identified at the start of the process. During this process the creative journey is recorded and linked to the wider research problem. Within the scope of our project, carrying out this process from a social researcher's background was possible. However, more complex interventions, particularly those that wish to integrate sophisticated AI into the environment, may not be achievable without expert assistance. Our choice of software came down to ease of use, cost and speed of development. The time saved making the environment has been put towards learning C++, and future work would look to make use of both code and the blueprint system.
Creating a bespoke VR environment has allowed us to develop an intervention suited to our specific context and enabled more robust control over the research conditions, potentially contributing to more clinically valid outcomes. By sharing the learning process and resources we used to fulfil our research needs, we hope that other researchers who have not come from a games design background will be motivated to try it themselves, and will help add to the growing evidence of the potential of VR interventions for improved mental wellbeing.

Ethical approval and consent
Ethical approval was obtained from the NHS Research Ethics Committee via the Integrated Research Application System (IRAS), for research to be conducted on an NHS site, and from the University of the Highlands and Islands internal ethics board. All participants gave written and verbal consent to be part of the study.

Consent for publication
All named authors provide consent for the publication of this research.

Availability of data and materials
Project name: Using virtual reality to address mental well-being. Supplementary file: TutorialsandAssets.docx