This past May, I presented our work, Lucid Loop, as a demo at the Conference on Human Factors in Computing Systems (CHI) in Glasgow, Scotland, which drew over 3,600 attendees.
Lucid Loop is an immersive virtual reality experience designed to support self-regulation of focused awareness, using deep dream imagery and spatialized audio that change in real time with your brainwaves. The work is inspired by lucid dreaming: the experience of knowing you are dreaming while dreaming, whose practice is centred around focused awareness of the present moment. We use the Oculus Go, an immersive virtual reality headset, in conjunction with the Muse 2 EEG headband, which measures the brain's electrical signals at the forehead.
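To give a sense of how a neurofeedback loop like this can work, here is a minimal sketch in Python. This is not Lucid Loop's actual implementation; the focus score, smoothing constant, and parameter names are all illustrative assumptions. The idea is that an EEG-derived "focus" estimate is smoothed over time and then mapped to how coherent the visuals and audio appear.

```python
# Hypothetical neurofeedback mapping (illustrative only, not the
# actual Lucid Loop implementation).

def smooth(prev, new, alpha=0.3):
    """Exponential moving average, so feedback doesn't jitter frame to frame."""
    return (1 - alpha) * prev + alpha * new

def map_focus(focus):
    """Map a focus score in [0, 1] to example render parameters."""
    focus = max(0.0, min(1.0, focus))  # clamp noisy EEG estimates
    return {
        "visual_clarity": focus,          # 1.0 = sharp scene, 0.0 = heavily "dreamed"
        "noise_amount": 1.0 - focus,      # deep-dream distortion strength
        "audio_gain": 0.5 + 0.5 * focus,  # calmer audio fades in as focus rises
    }

# Simulated loop over a stream of raw per-frame focus readings.
state = 0.5
for raw in [0.2, 0.9, 0.8, 0.85]:
    state = smooth(state, raw)
params = map_focus(state)
```

Smoothing matters here: raw EEG-derived scores fluctuate rapidly, and driving visuals directly from them would make the feedback feel random rather than controllable, which echoes the mixed perceived-accuracy feedback described below.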
I demoed Lucid Loop over two days of the conference, during coffee and lunch breaks. Many people were interested in the topic and eager to share their own dream experiences with me. A few were skeptical, and a handful tried out the full experience. Here are the main findings from this short demo:
- Lucid Loop’s visuals were whimsical, dream-like, and interesting to look at. Some people wanted to keep watching beyond the 5-minute demo, even though the experience was only a 14-second loop.
- Lucid Loop’s audio was reminiscent of autonomous sensory meridian response (ASMR). Often triggered by stimuli like whispering voices, ASMR is a feeling of well-being combined with a tingling sensation in the scalp and down the back of the neck.
- Most people readily understood the mapping between their brain waves and the immersive visuals/audio.
- The perceived accuracy of Lucid Loop’s focused awareness component was mixed. Roughly half of the people who tried Lucid Loop felt they could control the visuals and audio by focusing and defocusing their awareness; the other half felt the visuals changed at random or were unsure whether they were in control.
- Overall, people enjoyed the experience and were eager to see the results of an actual study.
These insights suggest we are heading in the right direction. We are now working on the next iteration based on the above feedback. In the fall, we will recruit lucid dreamers to try our research prototype and take part in an interview about the experience. If you’re interested, please let me know!
For more info on our project and lucid dreaming, check out our Medium article.
A printable version of the poster is available here for those interested.
Kitson, A., DiPaola, S., & Riecke, B. E. (2019, May). Lucid Loop: A Virtual Deep Learning Biofeedback System for Lucid Dreaming Practice. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK. ACM: 1–6. doi: 10.1145/3290607.3312952