How Do You Feel About Your Scars

An Immersive Virtual Reality Meditation Experience Visualizing Scar Information by Using EEG Data

Designed and Developed by Wanyue (Luna) Wang

Tools: Unreal Engine, Oculus Rift, Python, Emotiv BCI, Emotiv 14-channel EEG Headset

Scars are reported to play a role in the development of individuals’ sense of self; in particular, individuals describe how their scars represent an aspect of their identity and their understanding of who they have become. I designed an immersive virtual reality meditation experience that guides participants to recall their scar memories and transforms scar information into artwork using mental states derived from EEG signals. I hope this experience can help individuals rethink their scars and find the beauty in them.

Before moving the project into digital space, I ran a series of experiments in physical space; check the details and reasoning here.

DEMO VIDEO
PROJECT FRAMEWORK

The project includes two phases: guided meditation and generative art. During the meditation phase, the participant hears and sees instructions that guide them to think of their scars, including related memories, the shape of the scar, and their feelings towards it. The participant does not need to say anything. EEG data (#1) is recorded in this phase and used to generate a virtual environment for the second phase. In the second phase, the participant explores the generated environment in virtual reality with soothing background music, moving their body or using the VR controllers. EEG data (#2) is also recorded in the second phase and compared with the data from the first phase. All EEG data is recorded and saved anonymously to a local sheet.


I am curious about the impact of the immersive VR experience on participants’ perception of their scars and whether the experience can positively influence their consciousness. I plan to compare EEG data #1 and EEG data #2 to investigate changes in brain activity and mental states. Because the study involves human subjects recruited to participate in the experience, I applied to the Institutional Review Board (IRB) for the data collection and comparison part.
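A minimal sketch of the planned comparison, assuming each phase is saved as a CSV of the mental-state metrics (the file and column names here are placeholders, not the actual recording format):

```python
# Compare mean mental-state metrics between phase 1 (meditation) and
# phase 2 (exploration). File and column names are hypothetical.
import pandas as pd

METRICS = ["stress", "engagement", "interest", "relaxation"]

def summarize(path: str) -> pd.Series:
    """Average each metric over one recording session."""
    return pd.read_csv(path)[METRICS].mean()

phase1 = summarize("eeg_phase1.csv")
phase2 = summarize("eeg_phase2.csv")

# Positive values mean the metric increased from phase 1 to phase 2.
delta = (phase2 - phase1).rename("phase2_minus_phase1")
print(pd.concat([phase1.rename("phase1"), phase2.rename("phase2"), delta], axis=1))
```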

DEVELOPMENT FRAMEWORK

The hardware I used in the project is an EMOTIV EPOC+ 14-channel EEG headset and an Oculus Rift VR headset. On the software side, I used Unreal Engine 4.23 for VR development, EMOTIV BCI for EEG headset calibration, and Python for automatic EEG data collection and processing. The development framework of the project is shown below.
 

DATA MAPPING

EMOTIV provides six types of mental-state data: stress, engagement, interest, focus, excitement, and relaxation. Among those dimensions, the stress, engagement, interest, and relaxation data, which are more closely related to the scar experience, are used to change material and texture variables and thus generate a unique environment in the second phase. The way I mapped the mental-state data is shown below.

The engagement level controls the size of the objects: the higher the engagement, the larger the objects and the more immersed the user feels. The interest level decides the color of the objects -- the higher the interest, the warmer the color; the lower the interest, the colder the color. The stress level determines the cloud density -- a higher stress level results in denser clouds, which, like dense clouds in the real world, are often associated with negative feelings. The offset of the objects is determined by the degree of relaxation, meaning that the more relaxed the user is, the more dynamic the objects are.
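In the project this mapping lives in Unreal Engine material and Blueprint parameters; the sketch below is only an illustrative Python version of the same idea, with the ranges and parameter names chosen as assumptions for readability (inputs are EMOTIV metrics scaled to 0–1):

```python
# Illustrative mapping from mental-state metrics (0-1) to environment
# parameters. Ranges and parameter names are assumptions for this sketch.
def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def map_metrics(engagement: float, interest: float,
                stress: float, relaxation: float) -> dict:
    return {
        # Higher engagement -> larger objects -> stronger sense of immersion.
        "object_scale": lerp(0.5, 2.0, engagement),
        # Higher interest -> warmer hue (240 = cold blue, 20 = warm orange).
        "hue_degrees": lerp(240.0, 20.0, interest),
        # Higher stress -> denser clouds.
        "cloud_density": lerp(0.1, 1.0, stress),
        # Higher relaxation -> larger offset -> more dynamic objects.
        "offset_amplitude": lerp(0.0, 1.0, relaxation),
    }

print(map_metrics(engagement=0.8, interest=0.3, stress=0.6, relaxation=0.7))
```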

The first step is using EMOTIV BCI to calibrate the EEG headset to ensure good connection quality. During the whole experience, the EEG headset stays connected to EMOTIV BCI. When the experience starts, the VR development tool, Unreal Engine, triggers Python scripts that automatically collect EEG data from the headset through WebSockets and the EMOTIV API and store it as local files on the investigator's (my) computer. Once the meditation finishes, Unreal Engine automatically imports the EEG data from the local file and generates a virtual environment based on it.
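Below is a condensed sketch of such a collection script, assuming the EMOTIV Cortex service is running locally. The JSON-RPC method and stream names follow EMOTIV's Cortex API documentation, but the client credentials, output path, and recording length are placeholders, and the error handling of the real scripts is omitted:

```python
# Subscribe to the EMOTIV performance-metrics ("met") stream over the local
# Cortex WebSocket and append samples to a CSV. Credentials and file name
# are placeholders; check the Cortex API docs for the exact call sequence.
import csv, json, ssl
from websocket import create_connection  # pip install websocket-client

CLIENT_ID, CLIENT_SECRET = "your-client-id", "your-client-secret"

ws = create_connection("wss://localhost:6868", sslopt={"cert_reqs": ssl.CERT_NONE})

def call(method, params, rid):
    ws.send(json.dumps({"jsonrpc": "2.0", "id": rid, "method": method, "params": params}))
    return json.loads(ws.recv())["result"]

token = call("authorize", {"clientId": CLIENT_ID, "clientSecret": CLIENT_SECRET}, 1)["cortexToken"]
headset = call("queryHeadsets", {}, 2)[0]["id"]
session = call("createSession", {"cortexToken": token, "headset": headset, "status": "active"}, 3)["id"]
call("subscribe", {"cortexToken": token, "session": session, "streams": ["met"]}, 4)

# Append each metrics sample to a local CSV; no participant identifiers are stored.
with open("eeg_phase1.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for _ in range(600):  # roughly the length of the meditation phase
        sample = json.loads(ws.recv())
        if "met" in sample:
            writer.writerow([sample["time"]] + sample["met"])
```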

GALLERY
PEOPLE IN THE EXPERIENCE

This is my Master's thesis project. The pictures above are all from user testing and exhibitions. If you are interested in more details behind the project, please feel free to contact me.

Check my scar-related experiments in physical space here.

MOVING FORWARD

YOUtopia - MIT Reality Hack 2020

I extended the framework of the scar VR experience to another scenario.

Copyright © Wanyue (Luna) Wang
