In my explorations so far I have considered combining different types of facial interactions. If I'm going to work with eye tracking, maybe eyebrow gestures would be worth exploring too?

Eyebrow tracking seems more stable than mouth tracking with faceOSC, and I am considering whether a combination of eye gaze and eyebrow gestures would work in a musical interaction.

To help me decide which facial gestures to explore further, I made a simple program to visualise the incoming values from faceOSC, so I could better assess their precision and reliability.
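The receiving end is nothing fancy. Something along these lines, shown here as a minimal Python sketch with the python-osc package, assuming faceOSC is sending to its default port 8338 and using its standard /gesture addresses (my visualiser just plots what these handlers print):

```python
# Minimal sketch: print incoming faceOSC gesture values so their range
# and jitter can be inspected. Assumes faceOSC's default port 8338.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def print_value(address, value):
    # One float per gesture message, e.g. eyebrow height or mouth height
    print(f"{address}: {value:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/gesture/eyebrow/left", print_value)
dispatcher.map("/gesture/eyebrow/right", print_value)
dispatcher.map("/gesture/mouth/height", print_value)

server = BlockingOSCUDPServer(("127.0.0.1", 8338), dispatcher)
server.serve_forever()
```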

In the video the values are plotted as running averages of the last five readings. The graph suggests that eyebrow height could hold three states [low, neutral and high position] without any unwanted glitch values.
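The smoothing is nothing more than a running average over the last five readings, with the averaged value bucketed into three states. Roughly like this (the 0.35 / 0.65 thresholds are only placeholders, the real cut-offs depend on the calibrated range of the eyebrow value):

```python
# Sketch of the smoothing described above: average the last five readings,
# then map the smoothed value onto three eyebrow states.
from collections import deque

class RunningAverage:
    def __init__(self, size=5):
        self.values = deque(maxlen=size)

    def add(self, value):
        self.values.append(value)
        return sum(self.values) / len(self.values)

def eyebrow_state(avg, low=0.35, high=0.65):
    # Placeholder thresholds on a normalised 0..1 eyebrow height
    if avg < low:
        return "low"
    if avg > high:
        return "high"
    return "neutral"

smoother = RunningAverage()
for reading in [0.70, 0.72, 0.69, 0.71, 0.73]:
    state = eyebrow_state(smoother.add(reading))
print(state)  # "high" for this run of readings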

 

 

I still have some issues, since some values (for instance the eyebrows) are not relative to the eyes but absolute… This causes problems when I move my head on the y-axis (up and down).

Right now I'm working on compensating for this by doing the calculations myself. As you can see in this video, it (more or less) fixes the y-axis issue, but not the z-axis (how close my head is to the webcam).
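The basic idea is to measure the brow height against the eye's own position instead of against the frame. A minimal sketch of that idea, assuming faceOSC's raw points output is enabled (the /raw message with x,y pairs for the tracked points) and the usual layout where points 22-26 are the left eyebrow and 42-47 the left eye; the indices and helper names here are my own illustration, not something faceOSC provides:

```python
# Sketch: make the eyebrow height relative to the eye instead of the frame,
# so nodding (y-axis head movement) no longer shifts the value.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

LEFT_BROW = range(22, 27)
LEFT_EYE = range(42, 48)

def mean_y(coords, indices):
    # /raw is a flat list of x,y pairs, so point i's y value sits at 2*i + 1
    return sum(coords[2 * i + 1] for i in indices) / len(indices)

def on_raw(address, *coords):
    brow_y = mean_y(coords, LEFT_BROW)
    eye_y = mean_y(coords, LEFT_EYE)
    # Image y grows downwards, so a raised brow makes this difference larger.
    # It is still measured in pixels, though, so it grows and shrinks with
    # the distance to the webcam, which is the remaining z-axis issue.
    relative_brow = eye_y - brow_y
    print(f"relative eyebrow height: {relative_brow:.1f} px")

dispatcher = Dispatcher()
dispatcher.map("/raw", on_raw)
BlockingOSCUDPServer(("127.0.0.1", 8338), dispatcher).serve_forever()
```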

These issues can be solved at a later stage, so I think I'll leave the technical details for now…