For my Master's project, I wrote an app that lets you change equalizer levels via Microsoft Kinect's skeleton tracker.  Unlike the binary on/off controls of conventional foot pedals, the app provides a novel way to exert continuous control over guitar effects using gestures, all while playing a live guitar.

Kinect/EQ Interface hardware setup

After gathering feedback from early versions of the app through interviews and rapid iterative prototyping, I conducted a study of the final version to measure the app's perceived usefulness.


The final application, which I dubbed the "Kinect/EQ Interface", is a Windows app that lets you change equalizer levels in real time, with acceptable accuracy, while playing an acoustic guitar.  I wrote the app in C# using the Kinect SDK, Max for Live patches, and Open Sound Control (OSC).
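Under the hood, each tracked gesture ultimately becomes an Open Sound Control message carrying a float level (the project uses Ventuz.OSC on the C# side).  As a sketch of what travels over the wire, here is the OSC 1.0 message encoding in Python; the `/eq/low` address and the port are illustrative assumptions, not the project's actual routing:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message with one float32 argument.

    Per the OSC 1.0 spec, strings are null-terminated and padded to a
    4-byte boundary, and floats are big-endian IEEE 754.
    """
    def pad(s: bytes) -> bytes:
        # Always at least one null terminator, padded to a multiple of 4.
        return s + b"\x00" * (4 - len(s) % 4)

    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# To send it to a Max patch listening locally (address/port assumed):
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/eq/low", 0.5), ("127.0.0.1", 7400))
```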

Head tracking movement and corresponding direction of real-time knob level adjustment
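The mapping the caption describes, continuous head movement driving a knob level, amounts to a clamped linear interpolation.  A minimal sketch, where the tracked coordinate range and the 0-127 MIDI-style output scale are assumptions rather than the project's actual calibration:

```python
def head_to_knob(head_x: float, x_min: float = -0.4, x_max: float = 0.4) -> int:
    """Map a skeleton joint coordinate (meters, Kinect camera space)
    linearly onto a 0-127 knob value, clamping outside the tracked range."""
    t = (head_x - x_min) / (x_max - x_min)
    t = max(0.0, min(1.0, t))  # clamp so the knob pins at its extremes
    return round(t * 127)
```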

The hardware implementation of the Kinect/EQ Interface consists of the following parts and signal flow:

  • Sony Vaio VPCF115FM laptop using 2 USB 2.0 ports
  • On-board internal soundcard w/ headphone output and microphone input jacks
  • Kinect for Xbox 360 w/ USB cable and power supply
  • M-Audio Oxygen8 v2 w/ USB cable
  • Piano pedal
  • Acoustic guitar with quarter-inch output jack
  • Guitar cable
  • Quarter-inch to 3.5mm adapter
  • External speakers

Hardware signal flow

The software implementation of the Kinect/EQ Interface consists of the following parts and architecture:

  • SkeletalTracking.exe (C# using Kinect SDK v1.8 and Ventuz.OSC 1.4.0)
  • Masters.Prototype.Pedal.maxpat (Max 6.1)
  • Masters.Prototype.View.maxpat (Max 6.1)
  • Masters.Prototype.als (Ableton Live)
    • EQ Three (Ableton Live Audio Effect)
    • Min Max Init Rack (Ableton Live Audio Effect)
  • Max Kinect Dial 1, 2, and 3 (custom Max for Live patches)
  • Open Sound Control
  • ASIO4ALL 2.1.2
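These components talk to each other over OSC: SkeletalTracking.exe sends tracked joint data to the Max patches, which drive the three custom Max Kinect Dial devices controlling EQ Three in Ableton Live.  A minimal sketch of that routing step in Python; the `/dial/...` addresses and band names are illustrative assumptions inferred from the component names, not the project's actual message layout:

```python
# Hypothetical OSC-address -> parameter routing, mirroring the three
# "Max Kinect Dial" patches that drive Ableton's EQ Three bands.
DIALS = {"/dial/1": "GainLo", "/dial/2": "GainMid", "/dial/3": "GainHi"}

def route(address: str, value: float, state: dict) -> dict:
    """Update the EQ state for whichever dial the message targets;
    unrecognized addresses are ignored."""
    if address in DIALS:
        state[DIALS[address]] = value
    return state
```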

Software architecture


To measure the perceived usefulness of the Kinect/EQ Interface, I conducted an IRB-approved study at the University of Florida School of Music.

Flyer for the study

29 participants went through the same procedure


The results of the study were as follows:

  • A one-sample t-test was conducted on the perceived-usefulness total scores
  • n = 29, Mean = 51, Std Dev = 5.76, Range = 22, Minimum = 41, Maximum = 63
  • df = 28, t = 0.93, p-value = 0.35, 95% Confidence Interval = [48.81, 53.19]
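The reported t statistic and confidence interval can be checked against the summary figures.  A quick reproduction in Python, assuming a null mean of 50 (the scale's neutral midpoint, as discussed below) and hardcoding the two-sided 95% critical value for 28 degrees of freedom rather than computing it:

```python
import math

n, mean, sd, mu0 = 29, 51.0, 5.76, 50.0   # reported summary statistics
se = sd / math.sqrt(n)                    # standard error of the mean
t = (mean - mu0) / se                     # one-sample t statistic
t_crit = 2.0484                           # t_{0.975, df=28}, two-sided 95%
ci = (mean - t_crit * se, mean + t_crit * se)

print(round(t, 2))                        # 0.93, matching the report
print(round(ci[0], 2), round(ci[1], 2))   # 48.81 53.19
```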

Histogram of perceived usefulness total scores

Although the sample mean (51) was greater than 50, the result was not significant (p = 0.35).  Hence, it cannot be concluded that users in general will perceive the Kinect/EQ Interface to be useful during live performance.


I started this Master's project with the observation that continuous controls are inaccessible during performance because of their existing form factors, and I developed the Kinect/EQ Interface app as a novel way to address that.  I then ran a study to measure the app's perceived usefulness during live guitar performance.

This project took me several semesters to complete, so there are many details I left out.  For more about the project's motivation, research, interviews, prototypes, and implementation specifics, as well as the study procedure, discussion, and future work, you can download the final paper (mirror).