Using the Kinect for public User-Interfaces on Ubuntu Linux

The Microsoft Kinect is an affordable (around 100 EUR) device for natural interaction. I present a way to use the Kinect under Ubuntu Linux 10.10 (x64) as an input device for a browser remote control, using OpenNI, SensorKinect and the NITE toolkit.
Kudos go to Mathias Frey (Vienna University of Economics) for being cool enough to commission us with this cutting-edge evaluation!


Just install the following standard Ubuntu packages, which are needed to build the third-party libraries:

apt-get install g++ python monodevelop libusb-1.0-0-dev \
freeglut3-dev xdotool

You can install OpenNI, SensorKinect and NITE by downloading them:


In every extracted directory, just run

sudo ./

When installing NITE, you are asked for a license key (you can use 0KOIk2JeIBYClPWVnMoRKn5cdY4= for evaluation purposes, as noted on the SensorKinect GitHub page).

You can check your installation afterwards by starting NiViewer (located in OpenNI/Samples/Bin/Release). This will show you a depth and a color image captured from the Kinect:

One interesting demo application is the hand detection from the NITE toolkit. Without calibrating your skeleton, your hand can be tracked just by waving it a few times:

With the NITE API, such simple tracking requires very little code. Just make sure to initialize the session manager with the focus gestures “Wave” and “RaiseHand”:

XnStatus rc = XN_STATUS_OK;
// Create and initialize the point tracker
g_pSessionManager = new XnVSessionManager;
rc = g_pSessionManager->Initialize(&g_Context, "Wave", "RaiseHand");
if (rc != XN_STATUS_OK)
{
    printf("Couldn't init the Session Manager: %s\n",
           xnGetStatusString(rc));
    delete g_pSessionManager;
    return rc;
}
To use your hand as an input device, just instantiate an XnVSelectableSlider2D (you could also use an XnVSelectableSlider1D, but then you are limited to horizontal or vertical movement).

g_pMainSliderXY = new XnVSelectableSlider2D(1, 1);
g_pMainSliderXY->RegisterValueChange(NULL, &MainSliderXY_OnValueChange);

Inside the MainSliderXY_OnValueChange callback you can add your gesture detection, as it receives the x/y position of the tracked hand.

void XN_CALLBACK_TYPE MainSliderXY_OnValueChange(
    XnFloat fXValue, XnFloat fYValue, void* cxt);

Once you have detected a gesture (for example, xValue staying at 0 for a certain amount of time), you can send commands to any X11 window you like.
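Such a "dwell" gesture can be detected without any Kinect-specific code, purely from the slider values the callback delivers. The following is a minimal, hardware-free sketch; the class and constant names are illustrative and not part of the NITE API:

```python
import time

DWELL_SECONDS = 1.0   # how long the hand must stay at the edge
EDGE = 0.05           # x values below this count as "left edge"

class DwellDetector(object):
    """Fires once when the x value stays near 0 for DWELL_SECONDS."""

    def __init__(self, now=time.time):
        self.now = now        # injectable clock, handy for testing
        self.entered = None   # when the hand entered the edge zone
        self.fired = False

    def update(self, x_value):
        """Feed one x value per callback; returns True when the gesture fires."""
        if x_value < EDGE:
            if self.entered is None:
                self.entered = self.now()
            elif not self.fired and self.now() - self.entered >= DWELL_SECONDS:
                self.fired = True
                return True
        else:
            # hand left the edge zone: reset the timer
            self.entered = None
            self.fired = False
        return False
```

You would call `update(fXValue)` from inside the value-change callback; the detector fires once per dwell and re-arms when the hand leaves the edge zone.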
The following small Python wrapper script sends a keystroke to a web browser. Just make sure the window is focused before you send the keystroke, as xdotool generates raw X key events.

import subprocess

BROWSER = "firefox"

def focus():
    # find the browser window id and activate it
    p = subprocess.Popen(["xdotool", "search", "--name", BROWSER],
                         stdout=subprocess.PIPE)
    out = p.communicate()[0]
    idwin = out.split("\n")[0]
    subprocess.call(["xdotool", "windowactivate", "%s" % idwin])

def send_command(command):
    print "sending command %s" % command
    subprocess.call(["xdotool", "key", "%s" % command])
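To connect gesture detection with the wrapper script, you need some glue that maps a detected gesture to a key name. A hypothetical sketch (the gesture names and key mapping are assumptions for illustration, not part of any library):

```python
import subprocess

# hypothetical mapping from detected gestures to xdotool key names
GESTURES = {
    "swipe_left":  "Right",      # e.g. next page in the browser
    "swipe_right": "Left",       # previous page
    "dwell":       "Page_Down",  # scroll down
}

def handle_gesture(name, runner=subprocess.call):
    """Translate a detected gesture into an xdotool key press.

    `runner` is injectable so the mapping can be tested without X11.
    Returns the key that was sent, or None for unknown gestures.
    """
    key = GESTURES.get(name)
    if key is None:
        return None
    runner(["xdotool", "key", key])
    return key
```

Keeping the mapping in a dictionary makes it easy to reconfigure the installation (e.g. for a slideshow kiosk) without touching the detection code.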

The Kinect is a great tool for developing natural interaction applications. I hope this low-level article clarifies the basics of using the SensorKinect library.

If you want to have a full working example, don’t hesitate to ask by email!

Hope this helps,


This entry was posted in Python and User-Interface.

12 Responses to Using the Kinect for public User-Interfaces on Ubuntu Linux

  1. Pingback: Mathias' Blog » Interactive Digital Signage with MS Kinect

  2. Hi…

    I’m trying to follow your tutorial, but in the part about installing avin’s sensor, there isn’t any file.

    Any idea?


  3. Sorry for the delay, I was on holiday the last week.
    I think I missed the following steps in my tutorial for building avin’s sensor package (copied and pasted from the readme file):

    Building Sensor:
    1) Go into the directory: "Platform/Linux-x86/CreateRedist".
    Run the script: "./RedistMaker".
    This will compile everything and create a redist package in the "Platform/Linux-x86/Redist" directory.
    It will also create a distribution in the "Platform/Linux-x86/CreateRedist/Final" directory.
    2) Go into the directory: "Platform/Linux-x86/Redist".
    Run the script: "sudo ./" (needs to run as root)

  4. luis miguel says:

    Hi Gregor,

    Could I see a complete working example?

    • Hello luis miguel!

      Thanks for your interest, I will package up the needed code and send it to you via e-mail.

      • roberto says:

        Hi Gregor,
        really nice article, I was interested too.
        Could you send me a complete working example?
        I often use Code::Blocks to compile and run my tests.
        I would like to use a bash script for the xdotool part.
        Any suggestions?

        • Hello Roberto,
          I will package up the code and send it to you next week. Using xdotool from inside a bash script should be no problem – where do you have difficulties?

          Best regards,

  5. It looks like they have removed the Linux NITE middleware binaries. Is there an alternate source for these? I just see the Windows and Mac binaries on the NITE page you link to – no Ubuntu binary.

  6. Pingback: Python & Kinect. Part 2 « Project Network 4

  7. Pingback: URL

  8. Dinesh Lokhande says:

    Can you send me a complete mouse implementation in Ubuntu using Kinect?