My design challenge

I am creating a musical instrument that people can interact with using only eye movements and facial gestures. This allows disabled people who cannot use their arms and hands to play music.


Constraints

I want to focus on a group of users who cannot use their arms or fingers to interact with instruments.

Therefore my solution will be a musical instrument/interface that is controlled only by interactions that can be performed with the face. Interactions are thus limited to things like the following (a small sketch after the list shows how these channels might be represented in software):

  • Eye gaze positions
  • Facial gestures (position of mouth, eyebrows, jaw, eyes open/closed)
  • Head position
  • Blowing/singing into a microphone
  • Voice commands
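
Taken together, these channels suggest what one frame of input to the software could look like. The following is a minimal sketch in Processing; all field names and value ranges are my own placeholders, not taken from any library:

    // One frame of face/eye input. Field names and ranges are
    // placeholders; real values would come from the eyetracker,
    // the webcam facetracker and the microphone.
    class FaceInput {
      float gazeX, gazeY;        // eye gaze position on screen (pixels)
      float mouthOpenness;       // 0.0 (closed) .. 1.0 (wide open)
      float eyebrowRaise;        // 0.0 (neutral) .. 1.0 (fully raised)
      boolean leftEyeClosed, rightEyeClosed;
      float headYaw, headPitch;  // head rotation (degrees)
      float micLevel;            // blow/sing amplitude, 0.0 .. 1.0
    }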


What value do you want to deliver with your design and to whom?

I believe that the ability to express oneself artistically should be available to all, regardless of physical disabilities or challenges.

My user group is deliberately narrow, but it exists. My instrument may also prove to be of value to others later on.


Are you designing for the present, for the near future, or for a long-term one?

I am designing for the present. The instruments/software I end up designing will work 100%, and should be easy to implement in schools or in the homes of disabled persons. I don’t want to fake any interactions or make a video demo of something that doesn’t actually work.


What technology are you relying on?

Sensors:

1) Eyetracking via a dedicated sensor. I am currently working with a $99 eyetracker from The Eye Tribe. Although it is not the most precise tracker on the market, it is affordable and works across platforms (Mac, PC, Linux, Android). The accessibility is important to me, since most other trackers are quite expensive and only work on the Windows platform.

2) Facetracking via webcam. Most computers have a webcam, and if not, it is easy and affordable to hook up an external one. I want to use facial gestures as a supplement to eyetracking interactions.
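
As a sanity check of the webcam channel, a face can be detected in Processing with the OpenCV for Processing library and the built-in video library. The sketch below only finds the face rectangle; extracting mouth height or eyebrow position from it is the real work and is not shown here:

    import gab.opencv.*;
    import processing.video.*;
    import java.awt.Rectangle;

    Capture video;
    OpenCV opencv;

    void setup() {
      size(640, 480);
      video = new Capture(this, 640, 480);
      opencv = new OpenCV(this, 640, 480);
      // Standard frontal-face Haar cascade shipped with the library
      opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
      video.start();
    }

    void draw() {
      if (video.available()) video.read();
      opencv.loadImage(video);
      image(video, 0, 0);
      // Each rectangle is one detected face in the current frame
      Rectangle[] faces = opencv.detect();
      noFill();
      stroke(0, 255, 0);
      for (Rectangle face : faces) {
        rect(face.x, face.y, face.width, face.height);
      }
    }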


Software:

Processing will transform the raw sensor inputs (x- and y-coordinates from the eyetracker + webcam input) into meaningful interactions for real-time musical performance. Designing a user interface that can be controlled exclusively via eye gaze and face gestures will be a big challenge, but should be possible. I am going to rely on existing libraries for Processing (oscP5, netP5, OpenCV, etc.), but will put a lot of effort into writing software that enables users to interact musically with a DAW (like Ableton) using only eye and face interactions.
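
To give an idea of that layer, here is a minimal sketch that quantizes a horizontal gaze position into one of eight steps and sends it to Max4Live as an OSC message via oscP5/netP5. The /gaze/step address and port 9000 are arbitrary choices for illustration, and mouseX stands in for the eyetracker’s x-coordinate so the mapping can be tested without the sensor attached:

    import oscP5.*;
    import netP5.*;

    OscP5 oscP5;
    NetAddress max4live;   // destination: the Max4Live patch
    int lastStep = -1;

    void setup() {
      size(800, 200);
      oscP5 = new OscP5(this, 12000);                // local listening port
      max4live = new NetAddress("127.0.0.1", 9000);  // port is an assumption
    }

    void draw() {
      // Placeholder: mouseX stands in for the gaze x-coordinate
      float gazeX = mouseX;

      // Quantize the horizontal position into 8 discrete steps
      int step = constrain(int(map(gazeX, 0, width, 0, 8)), 0, 7);

      if (step != lastStep) {   // only send when the step changes
        OscMessage msg = new OscMessage("/gaze/step");
        msg.add(step);
        oscP5.send(msg, max4live);
        lastStep = step;
      }

      background(30);
      fill(200);
      rect(step * width / 8.0, 0, width / 8.0, height);  // highlight active step
    }

On the Max4Live side, a [udpreceive 9000] object would pick these messages up for routing to notes, sequencer steps or device parameters.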

– I am going to use Max4Live to transform OSC messages from Processing into musical commands (trigger notes, insert notes in a sequencer, adjust effects, change scales, build more experimental/generative musical systems, etc.)

– I will use Ableton Live to trigger sounds and add effects


Which are the key stakeholders in the domain of your project?

Institutions, NGOs, schools and music teachers working with people with physical disabilities, as well as individuals with physical disabilities and an interest in playing music.

The project might also be of interest to the more artistic maker/hacker community, who might build on top of my design in unexpected ways if I provide them with an open system.


In which direction do you want to push the boundaries of the interaction design practice?

I want to explore how an emerging technology (eyetracking), mainly used for market research or within everyday assistive technology, can be used for creative purposes. In addition to the primary purpose of enabling the user group to express themselves artistically, a project like this (with its constrained focus on face interactions) may awaken interest in alternative use cases for the technology.


What has already been done in this domain that could inspire/limit your concept?

– See this post: Related Projects + Inspiration

– Some people with severe physical disabilities already use eyetrackers to operate their computers.

– Open Up Music (a UK music initiative for young disabled musicians) makes very simple instruments using eyetracking and face position.


Progress, prototypes and user tests so far…


1) Kasper, who has muscular dystrophy, explains his custom-made drum kit, the software he uses, and how he deals with challenges as a musician

2) Kasper tries out a lo-fi prototype that allows him to play notes with his eyes

3) Simple prototype where a microphone placed near the mouth determines whether a note is played or not

4) A prototype using facetracking where the height of the mouth determines the velocity of the notes

5) A prototype where Kasper controls the master volume of Ableton by moving the white line up or down with his eyes

6) Me exploring whether your eyebrows could function as a button/toggle switch (see the sketch after this list)

7) Me demoing a simple 8×4 sequencer that is entirely controlled by eye movements
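
As a sketch of the idea behind prototype 6: a raised eyebrow can work as a toggle switch if the threshold has hysteresis, so one raise flips the state exactly once instead of flickering. The eyebrowRaise value (0.0–1.0) is a placeholder for whatever the facetracker actually reports:

    // Eyebrow as toggle switch, with hysteresis so a single raise
    // flips the state exactly once.
    float RAISE_THRESHOLD = 0.7;   // raise above this to trigger
    float RESET_THRESHOLD = 0.4;   // relax below this to re-arm
    boolean armed = true;          // ready to trigger again?
    boolean toggled = false;       // the exposed "switch" state

    void updateEyebrowToggle(float eyebrowRaise) {
      if (armed && eyebrowRaise > RAISE_THRESHOLD) {
        toggled = !toggled;        // flip once per raise
        armed = false;             // ignore samples until eyebrow relaxes
      } else if (!armed && eyebrowRaise < RESET_THRESHOLD) {
        armed = true;              // relaxed: ready for the next raise
      }
    }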