Multi-Sensory Interactions with EEG Data
As part of my ongoing MFA thesis research into the future of human-computer interaction, I created a project for the 25th ACM Multimedia Conference in Mountain View, CA.
Using an EEG headset, I designed a system that let users interact with their own brain data in three forms: visual, auditory, and tactile. The following artifacts were created:
A 3D CAD model. Each participant received a print-ready file at the end of the demo, which they could print at whatever scale they wanted, enabling them to touch data from their own brains!
A data visualization showing each of the frequency bands monitored by the headset. This was the most easily decipherable of all the forms.
Music reflecting their state of mind. The more relaxed they became, the more relaxed the music sounded, growing harsher as they became more attentive. This was done by mapping the different frequency bands of the data onto corresponding musical instruments.
A visualization of the generated music in the form of blinking dots and lines on a grid. The grid itself was also derived from the collected data.
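The band-to-instrument mapping behind the music could be sketched roughly as below. This is a minimal illustrative sketch, not the actual implementation: the band names, instrument assignments, and the "harshness" heuristic are all assumptions made for the example.

```python
# Hypothetical sketch of mapping EEG band powers to instrument levels.
# Band names, instrument choices, and the harshness heuristic are
# illustrative assumptions, not the values used in the installation.

EEG_BANDS = ["delta", "theta", "alpha", "beta", "gamma"]

# Each band drives one instrument's loudness (hypothetical assignment).
BAND_TO_INSTRUMENT = {
    "delta": "pad",
    "theta": "strings",
    "alpha": "piano",
    "beta": "percussion",
    "gamma": "synth_lead",
}

def normalize(powers):
    """Convert raw band powers into relative proportions summing to 1."""
    total = sum(powers.values()) or 1.0
    return {band: p / total for band, p in powers.items()}

def mix_levels(band_powers):
    """Map relative band power to a per-instrument volume level."""
    rel = normalize(band_powers)
    return {BAND_TO_INSTRUMENT[b]: round(rel[b], 3) for b in EEG_BANDS}

def harshness(band_powers):
    """Attentive states (beta/gamma dominant) sound harsher;
    relaxed states (alpha/theta dominant) sound calmer."""
    rel = normalize(band_powers)
    return rel["beta"] + rel["gamma"] - rel["alpha"] - rel["theta"]

# Example readings for a relaxed and a focused state (made-up numbers).
relaxed = {"delta": 0.2, "theta": 0.3, "alpha": 0.35, "beta": 0.1, "gamma": 0.05}
focused = {"delta": 0.1, "theta": 0.1, "alpha": 0.15, "beta": 0.4, "gamma": 0.25}

print(mix_levels(relaxed))
print(harshness(relaxed) < harshness(focused))  # relaxed state is less harsh
```

In a live setting, the normalized levels would be recomputed on each new window of headset data and fed to a synthesizer, so the mix shifts continuously with the listener's state of mind.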