IMMERSIVE INSTALLATION


Immersive Installation Controlled by Brainwaves

DREAM 2.2 (2018)

National Taiwan Museum of Fine Arts, TAIWAN, February-June 2018

DREAM 2.2 is an immersive installation by Betty Sargeant and Justin Dwyer (PluginHUMAN). Audiences use their brainwaves to control the installation’s audio-visuals.

DREAM 2.2 features an original brain-computer interface built with TouchDesigner, Ableton Live and EEG (electroencephalogram) headsets. The headsets track the electrical activity of the brain, giving audiences an immersive installation experience and a personalised connection to their neural data.

DREAM 2.2: an immersive exhibition controlled by brainwaves

THE INSTALLATION SPACE

DREAM 2.2 features an immersive room design shaped by a ‘brain forest’: a 13m x 6m maze made from 100 hand-painted, 4-metre-high PVC panels that hang from the ceiling. As people move through the maze they are surrounded by abstract projection-mapped visuals, and the outer walls are covered in reflective mirrors. Together, the mirrored walls and the projection-mapped maze create a mesmerising, uncanny immersive art experience shaped by neural data, light projections, reflections and quadraphonic sound.

This exhibition also features a poem that is projection-mapped onto the front of the installation’s resting platform. DREAM 2.2 performer Coco Disco wrote the poem, drawing inspiration from their own dreams. The poem operates as a linguistic tool that helps audiences immerse themselves in the installation’s otherworldly, dreamlike environment.

The DREAM 2.2 immersive room

THE INTERACTIVE ART EXPERIENCE

During the installation people can sit or lie down on the DREAM 2.2 resting platform, where each person can wear an EEG headset. A tablet displays five graphs of their neural data, tracking changes in Alpha, Beta, Delta, Theta and Gamma signals. When no one is wearing the headset, the graphs’ lines are flat; as soon as the headset is fitted on someone’s head, the lines move, tracking changes in neural activity. This gives audiences a traditional scientific display of their neural data, and it’s a clear indication that the headset is working.

While wearing the headset, a person’s neural activity controls the audio and visuals in the installation space in real time, so people see immediate changes in their surrounding environment. Their neural activity generates the abstract visuals projected onto the installation’s maze and triggers audio effects. Different neural activity creates different audio-visual effects, so everyone’s experience is unique.
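The five graphs described above correspond to the standard EEG frequency bands. As an illustration only, the kind of per-band power reading behind such a display can be sketched as below; the band edges, sample rate and function names are illustrative assumptions, not PluginHUMAN's actual implementation.

```python
import numpy as np

# Conventional EEG band edges in Hz (illustrative values).
BANDS = {
    "Delta": (0.5, 4.0),
    "Theta": (4.0, 8.0),
    "Alpha": (8.0, 13.0),
    "Beta": (13.0, 30.0),
    "Gamma": (30.0, 45.0),
}

def band_powers(samples, sample_rate=256):
    """Return mean spectral power per EEG band for one channel."""
    samples = np.asarray(samples, dtype=float)
    # Periodogram: squared magnitude of the FFT, normalised by length.
    spectrum = np.abs(np.fft.rfft(samples)) ** 2 / len(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(spectrum[mask].mean()) if mask.any() else 0.0
    return powers

# With no wearer the signal is flat, so every band graph reads zero:
flat = band_powers(np.zeros(512))

# A 10 Hz sine (a classic Alpha rhythm) registers mainly in the Alpha band:
t = np.arange(512) / 256.0
alpha = band_powers(np.sin(2 * np.pi * 10.0 * t))
```

This mirrors the behaviour the text describes: a flat line when the headset is off someone's head, and band-specific movement the moment real neural activity arrives.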

The DREAM 2.2 immersive design.

THE PERFORMANCES

DREAM 2.2 performances feature two sleeping performers and PluginHUMAN (Betty Sargeant and Justin Dwyer). PluginHUMAN assign visual and audio effects to the performers’ neural data in real time. This data is projection-mapped onto the exhibition’s maze as abstract visualisations. The audio consists of a 30-minute electronic soundtrack; when the performers generate specific neural signals, new sounds are automatically triggered and layered over the soundtrack.
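The layering logic above, where a specific neural signal cues a new sound over the base soundtrack, can be sketched as a simple threshold trigger with hysteresis. This is a hypothetical illustration, not PluginHUMAN's code: the class name, thresholds and event strings are all assumptions, and in the installation the event would drive Ableton Live rather than return a string.

```python
class LayerTrigger:
    """Fires once when a band's power rises above `on`; re-arms below `off`.

    Hysteresis (on > off) prevents a hovering signal from re-triggering
    the same audio layer on every reading.
    """

    def __init__(self, band, on=0.6, off=0.4):
        self.band, self.on, self.off = band, on, off
        self.armed = True

    def update(self, powers):
        level = powers.get(self.band, 0.0)
        if self.armed and level >= self.on:
            self.armed = False
            return f"layer:{self.band}"   # e.g. cue a clip in Ableton Live
        if not self.armed and level <= self.off:
            self.armed = True             # re-arm for the next event
        return None

# A rising Alpha level fires once, then must fall below `off` to fire again:
trigger = LayerTrigger("Alpha")
events = [trigger.update({"Alpha": v}) for v in (0.1, 0.7, 0.8, 0.3, 0.9)]
# → [None, "layer:Alpha", None, None, "layer:Alpha"]
```

One trigger per performer and per band would let different neural signals layer different sounds over the 30-minute soundtrack, as the performance description suggests.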


Investigating Novel BCI Displays that Support Personalised Engagement and Interpersonal Connections

CREDITS

Betty Sargeant ~ Artist (maze design), producer

Justin Dwyer ~ Artist (projection mapping), programmer

Jet Disco ~ Performer, writer

Levi Dwyer ~ Performer

Andrew Ogburn ~ Composer

PRESS

Customised Reality: The Lure and Enchantment of Digital Art
