Dream paintings generated from semantic features extracted from EEG signals of sleeping participants.
Concept, EEG data collection and analysis, text-to-image generation
Esther Senneker, Valentina Gambo, Andreas Knoben, Beyza Nur Çay, Gwendolyn van der Bie, Ninon Lizé Masclef
brain-computer interface, neuroscience, AI art, hackathon
During the 24-hour BR41N.IO hackathon organized by g.tec and IEEE Brain, my team "Brain't" won first place with our dream paintings project. We extracted semantic data from dreams and used it to generate images from text with AI. More precisely, we collected EEG data from participants during REM sleep, extracted valence and active brain regions, and fed this data into VQGAN-CLIP to generate dream paintings. For this project, I received funding from Dassault Systèmes to purchase the 8-channel Unicorn Hybrid Black EEG headset from g.tec. The video above is a condensed loop of 3.5 hours of dream paintings gathered from one of our participants.
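To illustrate the kind of pipeline involved, here is a minimal sketch of one common way to estimate valence from EEG and turn it into a text prompt: frontal alpha asymmetry (right-minus-left alpha log-power) is a widely used valence proxy. This is not the team's exact method; the function names, the prompt wording, and the two-channel setup are hypothetical, and the 250 Hz rate is the Unicorn Hybrid Black's sampling frequency.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

FS = 250  # Unicorn Hybrid Black samples at 250 Hz

def bandpower(signal, fs, band):
    """Average power of `signal` within a frequency band, via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return trapezoid(psd[mask], freqs[mask])

def valence_score(left_frontal, right_frontal, fs=FS):
    """Frontal alpha asymmetry: greater right-minus-left alpha log-power
    is commonly interpreted as more positive valence (hypothetical proxy)."""
    alpha = (8.0, 13.0)
    return np.log(bandpower(right_frontal, fs, alpha)) - \
           np.log(bandpower(left_frontal, fs, alpha))

def dream_prompt(valence):
    """Map a valence score to a text prompt for a text-to-image model."""
    mood = "serene, luminous" if valence > 0 else "dark, unsettling"
    return f"a dream landscape, {mood}, oil painting"

# Synthetic 30 s two-channel epoch standing in for real REM-sleep EEG.
rng = np.random.default_rng(0)
epoch = rng.standard_normal((2, FS * 30))
v = valence_score(epoch[0], epoch[1])
print(dream_prompt(v))
```

A prompt like this would then be passed to VQGAN-CLIP, which iteratively optimizes an image to match the text under CLIP's similarity score.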