Eye Conductor is a musical interface that allows people with physical disabilities to play music through eye movements and facial gestures. Using a $99 eye tracker and a regular webcam, Eye Conductor detects gaze direction and selected facial movements, thereby enabling people to play any instrument, build beats, sequence melodies or trigger musical effects. The system is open, designed for inclusion and can be customised to fit the physical abilities of whoever is using it.
Eye Conductor was built in 8 weeks as Andreas Refsgaard's final project at Copenhagen Institute of Interaction Design.
Imagine playing a musical instrument without using your hands. It can be hard to conceive and even harder to actually do. For many people with physical disabilities, a lack of fine motor skills excludes them from producing music on traditional instruments.
With Eye Conductor I wanted to push the boundaries of interaction design by exploring how eye and face tracking technologies could be used for creative purposes. I believe that the ability to express oneself artistically should be available to all, regardless of physical disabilities or challenges. I therefore wanted to create a solution that operated in the same domain as traditional instruments: something that gives people a lot of freedom, but also requires them to practice, just like a regular instrument.
The project relied heavily on user research, and I visited several schools and housing communities for people with physical disabilities, as well as families with children in wheelchairs in their private homes.
The people I met were extremely diverse in terms of physical abilities, but music seemed to be a unifying interest for them all.
At Jonstrupvang, a home for people with various disabilities, "Music Thursdays" is the activity that gathers the most people every week, even though almost half of the residents are physically unable to produce any sound themselves.
About half of the people I tested early-stage prototypes with were unable to speak, but as soon as I showed them my interactive prototypes, we immediately connected.
From all my user insights, the following three stood out:
1) Regardless of physical abilities, music is a major interest that can connect people, abled or disabled, through a shared activity.
2) The ability to create music functions as an important identity marker and a channel for expressing deep emotions, but it often requires help from, or execution by, others.
3) The technology and hardware exist, but tools that let people who cannot use their arms and fingers create music in real time are missing.
Eye Conductor translates eye gaze into musical notes or beats in a drum sequencer. Raising your eyebrows can transpose all played notes up a full octave, while opening your mouth can add a delay, reverb or filter effect to the instrument being played. Thresholds for facial gestures can be adjusted and saved to fit the unique abilities of different users.
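Eye Conductor's actual source code isn't reproduced here, but a minimal Processing sketch can illustrate the idea of a per-user threshold gating an octave transpose. The theMidiBus library, the threshold value and the playNote helper below are illustrative assumptions, not the project's own code:

```processing
import themidibus.*;

MidiBus midi;
float eyebrowRaise = 0;       // would be updated continuously by the face tracker
float eyebrowThreshold = 7.5; // hypothetical per-user value, adjustable and saveable

void setup() {
  // No MIDI input, output to the JVM's built-in synth; any MIDI device works
  midi = new MidiBus(this, -1, "Java Sound Synthesizer");
}

// Send a note, shifted up 12 semitones (one octave) whenever the
// user's eyebrow raise exceeds their personal threshold
void playNote(int pitch) {
  int transposed = (eyebrowRaise > eyebrowThreshold) ? pitch + 12 : pitch;
  midi.sendNoteOn(0, transposed, 100); // channel 0, velocity 100
}

void keyPressed() {
  playNote(60); // middle C, for testing without a tracker attached
}
```

The same pattern generalises to the mouth gestures: compare the tracked value against a stored per-user threshold and toggle an effect on or off accordingly.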
Eye Conductor is programmed in Processing, and early prototypes used FaceOSC by Kyle McDonald for face tracking.
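FaceOSC streams its tracking data as OSC messages (on port 8338 by default), which a Processing sketch can receive with the oscP5 library. The gesture addresses below are part of FaceOSC's standard output; the rest is a minimal sketch of a receiver, not Eye Conductor's implementation:

```processing
import oscP5.*;

OscP5 osc;
float mouthHeight = 0;
float eyebrowLeft = 0;

void setup() {
  size(400, 200);
  osc = new OscP5(this, 8338); // listen on FaceOSC's default port
}

// oscP5 calls this for every incoming message; FaceOSC sends one
// message per tracked parameter on each frame a face is found
void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/gesture/mouth/height")) {
    mouthHeight = m.get(0).floatValue();
  } else if (m.checkAddrPattern("/gesture/eyebrow/left")) {
    eyebrowLeft = m.get(0).floatValue();
  }
}

void draw() {
  background(0);
  fill(255);
  text("mouth height: " + mouthHeight, 20, 80);
  text("left eyebrow: " + eyebrowLeft, 20, 120);
}
```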
This is a tool that could deeply and genuinely transform lives, and we loved how ambitious, technical, and delightful it was. It showed a high level of execution and impact.