Wednesday, July 27, 2011

HOWTO: Using Microsoft Kinect on Tegra Ventana (Android 3.0)

  In this tutorial, we will show you how to write a native Android application (NDK) that uses the Microsoft Kinect on the Tegra Ventana (Android 3.0) development kit. Although we have not verified our setup on other platforms, the process described below should port easily to other Linux-based devices. To achieve real-time performance, we use OpenGL ES2 to render the depth data as a 2D texture (see demo video), which avoids the overhead of transferring and re-rendering frames through the Java layer. Since the data capture and rendering engines are fully multi-threaded, our approach can take advantage of the multiple cores on the Tegra 2 platform.

Environment Setup (Ubuntu/Debian):
To get started, first install the Nvidia Tegra Android Development Kit on your machine. In our setup, we used the Ubuntu Linux (64-bit) version.

The setup should be straightforward; make the installer executable first:
chmod +x
My default installation path is ~/NVPACK. Please make sure you flash the Ventana board with Android 3.0 at the end of the installation process (if you haven't done so before). Also, please back up your data before flashing.

Once the setup is complete, you should have the Android NDK, Eclipse, the TDK sample code, etc. installed.

raymondlo84@ealab_corei7:~/NVPACK$ ls -C
android-ndk-r5c    android-sdk-linux_x86  eclipse             oprofile    TDK_Samples
Android_OS_Images  apache-ant-1.8.2       nvsample_workspace  readme.txt  uninstall

To verify that everything is set up correctly, we will run Eclipse (the one in the NVPACK directory), compile the source code (use Build Projects or Ctrl+B), and run the multitouch sample on the device (right-click on the project name, then Run, and Run As Application). You can now touch the screen and see your fingers (up to 10) being tracked in real-time. Pretty amazing.

Important: Please make sure you uninstall the multitouch application after testing. Otherwise, it will conflict with our application.

Hardware setup:
At this point, we should be able to run the sample code on the Ventana, confirming that the development environment is set up properly. If not, please check that the Ventana is connected to the PC properly.

The easiest way to check is to run
$adb shell
This should drop you into a shell on the Ventana.

Next, we plug the Microsoft Kinect into the Ventana's USB port. To check whether the camera is detected properly, we can inspect dmesg with the following commands ($ denotes the bash shell on the host, # the shell on the Ventana).

$adb shell
#dmesg
Then, we should see the device being detected and mounted. Notice that the Kinect is actually recognized as multiple devices: Xbox NUI Camera and Xbox NUI Audio. It looks like the Microsoft Kinect has a USB hub internally.

<6>[70990.661082] usb 2-1.1: new high speed USB device using tegra-ehci and address 9
<6>[70990.696832] usb 2-1.1: New USB device found, idVendor=045e, idProduct=02ad
<6>[70990.703809] usb 2-1.1: New USB device strings: Mfr=1, Product=2, SerialNumber=4
<6>[70990.711490] usb 2-1.1: Product: Xbox NUI Audio
<6>[70990.716013] usb 2-1.1: Manufacturer: Microsoft
<6>[70990.720483] usb 2-1.1: SerialNumber: A44887C10800045A
<6>[70992.181268] usb 2-1.3: new high speed USB device using tegra-ehci and address 10
<6>[70992.224649] usb 2-1.3: New USB device found, idVendor=045e, idProduct=02ae
<6>[70992.240886] usb 2-1.3: New USB device strings: Mfr=2, Product=1, SerialNumber=3
<6>[70992.256521] usb 2-1.3: Product: Xbox NUI Camera
<6>[70992.274979] usb 2-1.3: Manufacturer: Microsoft
<6>[70992.280051] usb 2-1.3: SerialNumber: A00367A07065045A
#ls /dev/bus/usb/002/*
To work around the permission problem temporarily, we can run the following commands. (IMPORTANT: We have to re-run this command every time we restart the machine, unplug the Kinect, or let the device go to sleep! Oh well!)

$adb shell
#chmod -R 777 /dev/bus/usb/002/
Once we have confirmed that the Kinect is detected successfully, we then replace the multitouch source code in the ~/NVPACK/TDK_Samples/Android_NVIDIA_samples_2_20110315/apps directory with the code we provide below.

Now we go back to Eclipse and refresh the project (click on the project folder in Eclipse and press F5). Run #chmod -R 777 /dev/bus/usb/002/ again if you haven't done so, or you will see a blank screen. Rebuild and run!

If everything goes well, the application should run as in the following video. To change the tilt angle of the Kinect, we can simply use the touchscreen, as shown below.

Code Structure and Optimization:
In this section, we explain the structure of the source code and the optimization and customization steps we made so that the code runs as efficiently as possible.

Figure 1. The source code structure of our demo application. 

Instead of recompiling the libfreenect and libusb libraries from external sources (which could of course be done with a statically linked library approach), we provide a complete source tree for this tutorial, which includes the libfreenect and libusb libraries. (Note: Feel free to contact us if we should not include these in our package.)

As we can see from Figure 1, the structure of the source code is fairly simple.
  • multi.cpp - the main code, which handles OpenGL rendering, key/touchscreen events, and other logic (adapted from the TDK sample code).
  • kinect.cpp - a wrapper for the Kinect driver; it converts the depth map to RGB and handles callback functions from libfreenect (adapted from the libfreenect sample code).
  • libusb/* - the libusb source code for the USB interface.
  • libfreenect/* - the libfreenect source code, which interfaces with the Microsoft Kinect.


Each iteration of the rendering loop takes ~16 ms, which translates to ~60 fps. The key bottleneck is the texture loading step, which takes about 14 ms. We time the loop with gettimeofday and average over 100 frames:
    struct timeval start, end;
    static double elapsed_sec = 0;
    static int count = 0;

    gettimeofday(&start, NULL);
    /* ... rendering work being timed ... */
    gettimeofday(&end, NULL);

    elapsed_sec += (end.tv_sec - start.tv_sec)
                 + (end.tv_usec - start.tv_usec) / 1000000.0;
    if (++count == 100) {
        char buf[512];
        snprintf(buf, sizeof(buf), "Display loop %f (s)\n", elapsed_sec / 100.0);
        __android_log_write(ANDROID_LOG_INFO, "Render Loop:", buf);
        elapsed_sec = 0;
        count = 0;
    }
Total run time (averaged over 100 trials):

I/Render Loop:( 4026): Display loop 0.016168 (s)

Tegra Android Development Pack

Tested Platform:
Tegra Ventana Development Kit (from Nvidia)
Ubuntu 10.04.2 (64 bits)

Source Code:


svn co multitouch

for the latest source.

Other Demo video:

Blind navigation with a wearable range camera and vibrotactile helmet:

This work has been accepted and will be published in the proceedings of ACM Multimedia 2011 (ACMM2011).

See our website for a list of our publications.

Known Issues:
1. The application crashes when the orientation of the device is changed.
2. The application does not resume properly after being sent to the background.

Special Thanks:
James Fung, Nvidia Technology Development for supplying the Ventana Development Kit.

... to be continued.


  1. Would it be possible to port this to Android 2.3? Since many Android phones support USB host mode, it could work nicely on them too.

  2. Haven't tested it on Android 2.3, but I have a feeling it will work just fine as long as the device supports OpenGL ES2. Go ahead and download the source, compile it, and run it. Let me know how it goes!

  3. Post your dmesg here after you connect the Kinect to the device. Then run ls /dev/bus/usb/002/* and see if the Kinect is mounted.

  4. Hey, could you report this to xda-developers? I think many people there would look forward to more development on this.

  5. xda-developers? Feel free to distribute this link to them. Also, we are getting a GPU-based 3D reconstruction going on the Tegra 2. It is a work in progress, but should be done within a week or two.

  6. Would you mind sharing the APK, please?

    1. I have the source available for download... the APK alone may not be useful?

  7. So what exactly goes wrong if you don't set the 777 permission?

  8. Without the 777, we cannot access the Kinect camera (i.e., no write access)... and that crashes the libfreenect driver...

  9. I was considering trying to do this for my embedded systems class project, but ran into a problem on one of the initial phases.

    I am running Android 4.0.3 on a rooted Galaxy Tab 10.1. I've followed your instructions exactly, but the original, unedited source from the NVidia samples crashes with a ClassNotFoundException for:

    com.nvidia.devtech.multi.Multi

    I have looked through the NVidia forums and can find nothing related to this, and I don't know where to go for more information because Tegra 2 development does not seem to be a very mainstream programming topic in a lot of places.

    I don't know if you can help since I'm developing on a different platform, but I thought maybe you could offer some advice, or maybe contact me through e-mail to discuss this topic, because this is something I have been interested in doing for some time.

    -Matthew Johnson

  10. That's interesting. Did you try to run the multitouch demo (i.e., the original one that comes with the NDK)? I'm sorry that my solution requires you to overwrite an existing project to make it work. Please let me know how I can help. Also, are you using the latest copy from the SVN?

  11. Yeah, the multitouch demo is what wouldn't work, but I figured it out. I just removed the workspace, reloaded it, made sure to clean all the projects before building them, and it worked fine.

    I completely didn't think about the Kinect's input not being standard USB, though (I got mine in the Xbox bundle), so I'm at another roadblock until I get the adapter to actually plug it into my tablet. I will probably stay in touch with you, though, if you don't mind, in case I run into problems or have questions. I don't want to take up your time with my project, but I would like to get an understanding of this stuff because it is a very big interest of mine. Thanks for your quick reply!

  12. You are welcome. Feel free to email me or leave comments here if you have any new findings. It would be great if you could show me a video if the setup works on your tablet as well.

    I've hit several limitations on this Tegra 2 platform. I have written some code that lets you send the raw data to a remote server over Wi-Fi with some compression. Take a look at the source code and you will find the magic! I will cover the server side in my next tutorial :)

  13. Hello all... It's been many days now. Is Kinect support available for Android yet? Can we use it as a plug-in device for Android and start recognizing gestures with the Kinect?
    Please let me know some details. Thank you, guys.

  14. Hi Ray,
    Your project is great. We plan to develop the same kind of application for an Android device that doesn't have Tegra. Would it be possible to get the source from you? It would be great if we could get your help.
    Thanks in advance

  15. You should look into our new eyeglasses! I'm the CTO of the company :)