Gestural input is coupled with the visual interface to musical effect (processed and generated in Max/MSP).
The patch uses gestural input from a Leap Motion controller to trigger chord progressions and vocoder parameters via Max/MSP, creating an immersive 3D VMI environment. The Leap Motion controller's viewing range is limited to roughly 2 feet (60 cm) above the device; the range is bounded by LED light propagation through space, since tracking becomes much harder beyond that. To do background subtraction, I pipe the video signal into a jit.slide object and also into a "jit.op @op absdiff", and pipe the slid matrix into the right inlet of the absdiff; that way you get a black frame containing only the pixels that have changed. Then, if you want to get fancy, pipe the differenced matrix into a cv.jit.erode, then a cv.jit.dilate, and then into your cv.jit.hsflow object. The two channels out of that go to two cv.jit.mass objects, each followed by a "split 0 90000" (or whatever your highest value might be); what comes out of the two split objects should be all four directions.
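The jit.slide / jit.op absdiff chain above amounts to comparing each frame against a temporally smoothed background. Here is a minimal sketch in plain Python of that idea, with frames as 2D lists of grayscale values; the function names and the slide factor are my own illustrations, not part of the patch:

```python
def slide(background, frame, slide_factor=8.0):
    """Temporal low-pass in the spirit of jit.slide: move each background
    pixel a fraction of the way toward the incoming frame's value."""
    return [[bg + (px - bg) / slide_factor
             for bg, px in zip(bg_row, row)]
            for bg_row, row in zip(background, frame)]

def absdiff(a, b):
    """Per-pixel absolute difference, like jit.op @op absdiff: unchanged
    pixels go to 0 (black), changed pixels stay bright."""
    return [[abs(x - y) for x, y in zip(ra, rb)]
            for ra, rb in zip(a, b)]

# A static background, with one "moving" pixel in the new frame.
background = [[10.0, 10.0], [10.0, 10.0]]
frame      = [[10.0, 10.0], [10.0, 200.0]]

background = slide(background, frame)  # smoothed background drifts slowly
mask = absdiff(frame, background)      # only the changed pixel is non-zero
```

Because the background only drifts a fraction per frame, a moving object keeps showing up in the difference mask while slow lighting changes get absorbed.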
For the motion analysis itself, use a cv.jit.hsflow object, pipe it into a "jit.unpack 2" object, and you've got the left/right and up/down flow planes coming out of those outlets. In order to work with those values, I used a cv.jit.mass to add all the bright pixels together, though it can get a little erratic if you're not in ideal lighting conditions.
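Downstream of cv.jit.hsflow, the unpack / cv.jit.mass / split chain boils down to: sum each flow plane into one number, then route that number by range into one of four directions. A rough Python sketch under those assumptions (the direction sign conventions and the 0–90000 range are illustrative, taken from the "split 0 90000" in the text):

```python
def mass(plane):
    """Sum every value in a matrix plane, like cv.jit.mass."""
    return sum(sum(row) for row in plane)

def split(value, lo=0, hi=90000):
    """Like Max's split object: in-range values exit the left outlet,
    out-of-range values exit the right. Returns (left, right)."""
    if lo <= value <= hi:
        return value, None
    return None, value

# Two flow planes as jit.unpack 2 might deliver them
# (x = left/right, y = up/down; sign convention is an assumption).
flow_x = [[0.5, 1.0], [0.5, 1.0]]       # net motion to the right
flow_y = [[-0.5, -1.0], [-0.5, -1.0]]   # net motion upward

right, left = split(mass(flow_x))  # positive x mass exits the left outlet
down, up = split(mass(flow_y))     # negative y mass exits the right outlet
```

Each of the four resulting values can then drive its own mapping, e.g. one musical parameter per direction of motion.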