The Upright Spass – A Javascript Instrument in Thin Air

Well, well, well….

All my talk of OpenNI, C++, Node.js, etc. in recent months was pretty boring until I put it into practice and made something cool.

I did just that….well, I think it’s cool.  And just plain weird, really.  Here’s a motion-controlled instrument I made that’s Javascript through and through.  It’s Node.js at the heart, with an HTML/Javascript display.  And yah – I snuck in some C++ to wrap the ever-awesome OpenNI SDK.

I present to you….the “Upright Spass”:

Several months ago I played around with the Kinect SDK, playing a keyboard in thin air.  What I was playing with then was Windows only, Kinect only, and needed Adobe AIR to route things to websockets for the browser.

So, using my newfound powers over the past few months with:

  • OpenNI/NiTE
  • My Asus Xtion Pro Live depth camera
  • C++ Addons in Node.js

….I now have a nice little hand-tracking utility that runs in Node.js, using OpenNI and NiTE to power my skeleton tracking.
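
To give a feel for how the pieces talk to each other, here’s a minimal sketch of the Node.js side.  The module name just mirrors the repo name – the event name and payload shape here are illustrative assumptions, not the addon’s documented API:

```javascript
// Illustrative only: check the node-sweatintotheweb README for the real API.
var sweat = require('node-sweatintotheweb');  // C++ addon wrapping OpenNI/NiTE

sweat.on('handmove', function (hand) {
  // Real-world coordinates in millimeters, relative to the depth camera
  console.log(hand.side, hand.x, hand.y, hand.z);
});
```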

I didn’t care for the horizontal layout of my old virtual piano – so I inverted the axis, and made the instrument control upright.  Hence – “Upright Spass”….the anti-bass, the bass that is not a bass, just empty space.

What was also crazy hard was producing decent sound with Javascript.  I don’t care what language you do this in – creating sounds from scratch is hard.  You could spend years studying and tweaking new sounds to match what already exists in the world.

So to solve this?  MIDI.  Hell yes, MIDI!  I found a nice, robust Node.js MIDI addon.  So instead of making my own sound banks, I send the notes out over my E-MU MIDI USB controller to my Korg X3 keyboard.
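
To give an idea of what goes over the wire, here’s a minimal sketch using the node-midi package – one such addon, and just an assumption that it’s interchangeable with what I used.  The port index is an assumption too:

```javascript
var midi = require('midi');          // the node-midi package: npm install midi
var output = new midi.Output();

// List the available MIDI out ports so you can spot the E-MU device
for (var i = 0; i < output.getPortCount(); i++) {
  console.log(i, output.getPortName(i));
}
output.openPort(0);                  // assumption: port 0 is the E-MU

output.sendMessage([0x90, 58, 100]); // note-on, A#3/Bb3, velocity 100
setTimeout(function () {
  output.sendMessage([0x80, 58, 0]); // matching note-off half a second later
  output.closePort();
}, 500);
```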

[Image: the Korg X3 keyboard]

And wow….the site I grabbed this image from is calling this keyboard (made in 1993) “vintage”.  I feel old, damn.

Anyway – I’m running Ubuntu for this whole operation, so to route the MIDI from Node.js to my keyboard, I used Jack.  Jack gives you a nice little audio server.  You can patch your MIDI Through output to the E-MU MIDI USB device’s input.  Voila – make the link and start the Jack server.

So, I got this motion-controlled MIDI thing all rigged up, and it’s REALLY hard to play.  There were a few problems:

  1. Playing straight notes with 2 hands in an unfamiliar environment can lead to disharmony.  Seriously, on top of being hard to play, it’s way too easy to play the wrong notes.  So, I restricted the instrument space to only be able to play notes in a certain key signature.  I randomly chose A# Minor.  (There’s a sketch of this just after the list.)
  2. The coordinates of your 3D world vary based on where you stand and where the camera is positioned.  So, on top of sending the hand coordinates from my Node.js addon, I also sent the torso position.  That way, all the hand positions can be calculated outward from the center of your body – and your vertical instrument is always at your center.  Muscle memory is a major factor in learning to play an instrument, and you can’t learn to play if your instrument keeps shifting around on you.  Ideally, I should also use the user’s height to calculate where the instrument notes sit, but I haven’t done so yet.  (The sketch after the list covers this too.)
  3. No feedback in thin air.  Yah….that’s a problem.  Usually an instrument gives you tactile feedback on how you’re playing it – lacking that, I went with visual feedback.  I rigged up an HTML/Javascript page that listens for hand position events over websockets from Node.js.  It shows the user where their hands are in relation to the instrument – in the center of the screen and the center of your body.  (A browser-side sketch appears a bit further down.)
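
Here’s roughly how fixes 1 and 2 combine.  This is a minimal sketch – the function and field names are made up for illustration, and the note-region size is an arbitrary tunable:

```javascript
// A# (Bb) natural minor expressed as semitone offsets from the root
var SCALE = [0, 2, 3, 5, 7, 8, 10];
var ROOT = 58;             // MIDI note number for A#3/Bb3
var NOTE_HEIGHT_MM = 60;   // vertical size of each note region (tunable)

// Work in torso-relative coordinates so the instrument stays centered on
// the player no matter where they stand, then snap to the key signature.
function toNote(hand, torso) {
  var relY = hand.y - torso.y;                   // mm above/below torso center
  var step = Math.round(relY / NOTE_HEIGHT_MM);  // which note region the hand is in
  var octave = Math.floor(step / SCALE.length);
  var degree = ((step % SCALE.length) + SCALE.length) % SCALE.length;
  return ROOT + octave * 12 + SCALE[degree];     // always lands in A# Minor
}
```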

Even after solving a few of these problems, the Upright Spass is really hard to play.  My performance was pretty much a disaster – but maybe I can tweak and practice and get passable at it.
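
For the visual feedback in problem 3, the browser side is a small page along these lines.  Again, a bare-bones sketch – the ws:// URL, the message shape, and the pixels-per-millimeter scale are all assumptions, not the repo’s exact protocol:

```javascript
var canvas = document.getElementById('stage');     // <canvas id="stage"> assumed
var ctx = canvas.getContext('2d');
var socket = new WebSocket('ws://localhost:8080'); // URL/port are assumptions

socket.onmessage = function (event) {
  // Assumed payload shape: { left: {x, y}, right: {x, y} }, torso-relative mm
  var hands = JSON.parse(event.data);
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  drawHand(hands.left, 'red');
  drawHand(hands.right, 'blue');
};

// Map torso-relative millimeters to pixels around the canvas center,
// mirroring how the instrument is centered on the player's body.
function drawHand(hand, color) {
  if (!hand) return;
  var px = canvas.width / 2 + hand.x * 0.25;   // 0.25 px per mm, arbitrary
  var py = canvas.height / 2 - hand.y * 0.25;  // screen y grows downward
  ctx.fillStyle = color;
  ctx.beginPath();
  ctx.arc(px, py, 10, 0, Math.PI * 2);
  ctx.fill();
}
```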

My code for this is up on GitHub.  I mentioned the link for my Node.js addon previously – that’s here:

https://github.com/bengfarrell/node-sweatintotheweb

and this particular project, the Upright Spass, is here:

https://github.com/bengfarrell/uprightSpass

 

5 thoughts on “The Upright Spass – A Javascript Instrument in Thin Air”

  1. This is AWESOME. The NodeJS combo with OpenNI is particularly exciting.

    I’m interested in sending that kind of output to a visualizer system via OSC. I’m working on bringing this kind of interaction to live, improvised visual performance. Getting the technology to cooperate can be a hurdle, though. How granular can you make these motions? Could your upright Spass track different finger positions while they’re in motion? How difficult are you finding it to program gestures in Javascript?

    I’m working on a project this summer to tackle the design challenges in this kind of interface. Just came across your blog posts and got giddy. I’ll definitely be trying out your code!

    Thanks!

    1. Mucho thanks for the compliments! So, what OpenNI/NiTE offers you out of the box does not give you fingers in the overall skeleton. I’ve seen some folks use OpenNI in combo with other middleware to track fingers, but it looks like they create this whole rig with 2 motion cameras mounted above your hands – it doesn’t look ideal. That’s probably where the Leap Motion will shine when it gets to market. I think one of the reasons for that crazy rig is tracking fingers when one finger is hidden behind another – so they set up 2 cameras to get different angles.

      As far as gestures go, OpenNI/NiTE kinda suck for offering gestures, so like you said, you need to create your own. And it is pretty difficult. I’m actually coding some in C++ so you can just listen for the events via Node – I figure it’ll probably calculate faster in C++ than it will in JS. I haven’t hit any performance issues, I’m just kind of expecting them.

      And the logic of the gestures themselves…well, basically it’s a lot of testing and seeing what works. I’m going to blog on a couple swipe gestures I wrote (or rather copied from a nice person’s example in the G+ groups). Left and right swipes are done by checking that your hand is above the elbow, and tracking how long it takes to travel from one side of your body to the other. Problem is that the user doesn’t know your logic, so if they do something slightly different, it doesn’t take! I also tried to do swipe up/down gestures – and one of the problems there is that if you position your hand up top to make a swipe down, you’re accidentally swiping up to get there! A rough sketch of the left/right logic is below.
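
      Something along these lines – the thresholds and field names are illustrative, not the actual code from my repos:

      ```javascript
      // Rough sketch of the left/right swipe logic described above.
      // Coordinates are OpenNI real-world millimeters, where y grows upward.
      var SWIPE_MAX_MS = 400;   // hand must cross the body within this window
      var swipeStart = null;

      function onHandFrame(hand, elbow, torso) {
        if (hand.y <= elbow.y) {  // gesture only "arms" while the hand is raised
          swipeStart = null;
          return;
        }
        var side = hand.x < torso.x ? 'left' : 'right';
        if (!swipeStart) {
          swipeStart = { side: side, time: Date.now() };
        } else if (side !== swipeStart.side) {
          if (Date.now() - swipeStart.time < SWIPE_MAX_MS) {
            console.log('swipe ' + side);  // crossed the body fast enough
          }
          swipeStart = null;
        }
      }
      ```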

      So it’s complicated, but honestly, it’s a lotta fun too. I’m taking what I learned from these disorganized experiments and funneling it into a new Git repo called nuimotion which should be pretty all purpose and I hope to release in the node package manager soon.

      Love to hear what you get going! I’m going to do a lot more experimentation, and would love to hear from others.

  2. This is a very inspiring project, thanks for sharing. Just got my Xtion Pro Live and installed the included OpenNI 1.5, the Windows SDK 7, and then Microsoft Visual C++ 2010 Express.

    I’m intending to start with a much less ambitious project that involves triggering samples (mp3’s maybe) by touching areas in space with your hand, and having an image on the screen that shows guidance on where these areas are… Any help would be greatly appreciated as I’m a bit lost at the moment…
    Cheers

    1. Glad you like it! I’m actually less familiar with OpenNI 1.5, and more familiar with 2.0. OpenNI 1.5 is a confusing mix of what NiTE accomplishes in 2.0 and some of the depth camera features. I’d say you might be better off with OpenNI 2.0 and NiTE 2.0 because it’s more straightforward, though it does lose some nice gesture support.

      Anyway – either way, it’s pretty easy to grab the X, Y, and Z coordinates of hands. Check out the examples that ship with it and break them down. I’m fairly certain each of those ships with a skeleton tracker sample – both a text-based output one and a nice OpenGL view where it draws the skeleton.

      In terms of Windows dev with C++, I am a little lost on how to do it, so I’m no help to you graphically – that’s exactly why I ported it over to Node.js, so I could run it over websockets to the browser. If I had to do C++, I’d probably run over to OpenFrameworks or Cinder, as I understand they do some very nice things for creative people wanting to do projects in C++.

      Best of luck! Love to see it when you’re done!
