Doodle Tunes

Nabi AI Hackathon 2016 (Seoul, South Korea) project with Gene Kogan


This project lets you turn doodles (drawings) of musical instruments into actual music. A camera looks at your drawing, detects the instruments you have drawn, and begins playing electronic music with those instruments.

How does it work?

It’s a software application, built with openFrameworks, that uses computer vision (OpenCV) and convolutional neural networks (ofxCcv) to analyze a picture of a piece of paper on which instruments have been hand-drawn, including bass guitars, saxophones, keyboards, and drums.
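The core logic is a mapping from classifier output to clips to launch. Here is a minimal sketch of that step in Python; the label names, confidence threshold, and clip numbers are illustrative assumptions, not the project's actual values, and the classifier itself (ofxCcv in the real application) is represented only by its output.

```python
# Sketch of the detect-then-trigger logic, assuming a classifier that
# returns (label, confidence) pairs, one per doodle found on the page.
# All labels, clip numbers, and the threshold below are hypothetical.
INSTRUMENT_CLIPS = {"bass guitar": 0, "saxophone": 1, "keyboard": 2, "drums": 3}
CONFIDENCE_THRESHOLD = 0.5

def clips_to_launch(predictions):
    """Map classifier output to the set of clips to launch.

    Duplicate doodles of the same instrument collapse to a single clip,
    and low-confidence detections are ignored.
    """
    clips = set()
    for label, confidence in predictions:
        if confidence >= CONFIDENCE_THRESHOLD and label in INSTRUMENT_CLIPS:
            clips.add(INSTRUMENT_CLIPS[label])
    return sorted(clips)

# e.g. two doodles classified as drums plus one low-confidence saxophone
print(clips_to_launch([("drums", 0.9), ("drums", 0.8), ("saxophone", 0.3)]))
```

Deduplicating by label means drawing the same instrument twice does not double-trigger its clip.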

The classifications are made by a convolutional neural network trained on ImageNet; the application then sends OSC messages to Ableton Live, launching the clips that play the music for each detected instrument.
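For reference, an OSC message like the ones sent to Ableton is a small binary packet: a NUL-terminated address string padded to a 4-byte boundary, a type-tag string padded the same way, then big-endian arguments (per the OSC 1.0 specification). The sketch below encodes one by hand; the `/live/play/clipslot` address and the track/clip numbers are assumptions in the style of LiveOSC, not necessarily what this project used.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """NUL-terminate and pad bytes to a 4-byte boundary, per OSC 1.0."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: int) -> bytes:
    """Encode an OSC message whose arguments are all int32.

    Layout: padded address string, padded type-tag string (',' plus one
    'i' per argument), then each argument as a big-endian int32.
    """
    packet = osc_pad(address.encode("ascii"))
    packet += osc_pad(("," + "i" * len(args)).encode("ascii"))
    for a in args:
        packet += struct.pack(">i", a)
    return packet

# Hypothetical example: launch clip 0 on track 2 via a LiveOSC-style address.
msg = osc_message("/live/play/clipslot", 2, 0)
```

In practice openFrameworks applications use ofxOsc for this rather than packing bytes by hand, but the wire format is what Ableton-side OSC bridges actually receive.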

The software will be made available at

Made at Nabi AI Hackathon 2016 in Seoul in collaboration with Gene Kogan

My role

My role in this project involved ideation, prototyping, coding, and sound design: I came up with the initial concept for Doodle Tunes, prototyped a lo-fi interactive version, coded the communication between openFrameworks and Ableton, and did the sound design.