Ongoing research project to decode dream content and render it in virtual reality.
Concept, experiment design, EEG data collection and analysis, text-to-mesh generation
Taisija Demchenko, Ninon Lizé Masclef
VR, brain-computer interface, machine learning, neuroscience, immersive art, research
Studies show that up to 90% of dreams are forgotten, owing to an active dream-forgetting mechanism. Yet dreams are valuable in psychotherapy, cognitive science, and creativity. Can we record dreams during sleep so that we can review them later?
Recent advances in generative models allow visuals to be synthesized from a text prompt, opening the door to new artistic expressions and populating our imaginaries. Following the ancient tradition of oneirocriticism, which holds that dream content is structured as rebuses and wordplay, this project explores dreams through their language representation in the brain. Leveraging recent text-to-mesh generative models (ClipMatrix, DreamFusion), ReaDream generates VR scenes from semantic features extracted from the EEG signals of dreaming individuals. Blending virtual reality, artificial intelligence, and neuroscience, ReaDream explores the hybridization of imaginaries between biological and artificial brains.
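The pipeline above (EEG signals → semantic features → text prompt → text-to-mesh model) can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the band-power feature extraction, the toy keyword vocabulary, and the `eeg_to_prompt` decoder are hypothetical stand-ins, not the project's actual decoding method; the resulting prompt would then be passed to a text-to-mesh model such as ClipMatrix or DreamFusion.

```python
# Hypothetical sketch of the EEG-to-prompt stage; the real ReaDream
# decoder, features, and vocabulary are not described in the source.
import math

def band_power(signal, fs, lo, hi):
    """Naive spectral power in the [lo, hi] Hz band via a discrete Fourier transform."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def eeg_to_prompt(signal, fs=128):
    """Map coarse EEG band powers to dream-scene keywords (toy decoder)."""
    theta = band_power(signal, fs, 4, 8)    # theta band, associated with drowsiness and dreaming
    alpha = band_power(signal, fs, 8, 13)   # alpha band, associated with relaxed wakefulness
    words = ["a dream landscape"]
    words.append("drifting through fog" if theta > alpha else "under a clear sky")
    return ", ".join(words)

# Synthetic 1-second "EEG" trace dominated by a 6 Hz (theta) oscillation.
fs = 128
signal = [math.sin(2 * math.pi * 6 * t / fs) for t in range(fs)]
prompt = eeg_to_prompt(signal, fs)
print(prompt)  # → "a dream landscape, drifting through fog"
```

In a real system, the text prompt produced here would condition the text-to-mesh model, whose output meshes are then composed into the VR scene.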