For some time, Ryan and I at Spiral Technica have been contemplating ways of using the HTC Vive VR system in tandem with Ableton Live. Whilst visiting Ryan and his partner in Dunedin, we decided to try to get the two talking.
Here is the end result, as demonstrated by Ryan:
The resulting ‘instrument’ is probably best described as a two-voice theremin. Vertical motion controls pitch, left/right motion controls each voice’s panning, and front/back motion increases or decreases the filter resonance on the synth patch to add a little emphasis. Note On messages are triggered by the Vive controllers’ trigger buttons, and Note Off messages by the grip pads.
Each controller controls a single voice, and the controllers’ positional data is sent via OSC from Unity to Ableton Live, where it is received by Ethno Tekh’s ‘Tekh Map’ set of Max for Live devices.
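To give a sense of what actually travels down the wire, here’s a rough sketch of how an OSC message like the ones sent from Unity might be encoded by hand. The address `/vive/left/position` is purely illustrative, not necessarily what Vive/Live uses; in practice an OSC library would handle this encoding for you.

```python
import struct

def _pad(s: bytes) -> bytes:
    # OSC strings are NUL-terminated and padded to a 4-byte boundary
    # (always at least one NUL, so a length-4 string gets four).
    return s + b"\x00" * (4 - len(s) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 (big-endian)."""
    type_tags = "," + "f" * len(floats)
    payload = _pad(address.encode()) + _pad(type_tags.encode())
    for value in floats:
        payload += struct.pack(">f", value)
    return payload

# e.g. one controller's position as a single message:
packet = osc_message("/vive/left/position", 0.5, 1.2, -0.3)
```

The resulting packet is what you’d hand to a UDP socket pointed at the receiving Max for Live device’s port.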
More technical detail may be found on the GitHub repository that Ryan and I have set up for the project, here.
Simple, but it’s a proof-of-concept that opens the door to more complex technical and creative iterations.
One clear need we’ve established is some form of relative range control. Ryan is looking into letting the performer set a reference position, so that all parameter values are calculated relative to that point.
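As a rough illustration of the idea (the function names and ranges here are hypothetical, not the project’s actual code), a relative mapper could capture a reference coordinate and normalise incoming values around it:

```python
def make_relative_mapper(reference: float, span: float = 1.0):
    """Return a function mapping an absolute coordinate to a 0..1
    parameter value centred on the performer's reference position.
    `span` is the total usable range in metres (an assumed unit)."""
    def to_param(value: float) -> float:
        normalised = (value - reference) / span + 0.5
        return min(1.0, max(0.0, normalised))  # clamp to the 0..1 parameter range
    return to_param

# Performer stands at height 1.5 m and wants a 2 m working range:
pitch_map = make_relative_mapper(reference=1.5, span=2.0)
pitch_map(1.5)  # centre of the range -> 0.5
```

Clamping at the edges means a performer who steps outside the calibrated range simply pins the parameter rather than producing garbage values.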
Another priority is to develop Max for Live devices specific to the Vive controllers’ parameter set: positional (X/Y/Z) and rotational (pitch, yaw, and roll) data, button on/off messages, trackpad data, and so on.
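By way of illustration only, that parameter set could be organised under an OSC address scheme like the following (these addresses are hypothetical, not what Vive/Live actually uses):

```python
# Hypothetical OSC address scheme for one controller's parameter set.
CONTROLLER_PARAMS = {
    "/vive/{hand}/position": ("x", "y", "z"),
    "/vive/{hand}/rotation": ("pitch", "yaw", "roll"),
    "/vive/{hand}/trigger":  ("pressed",),      # 0.0 or 1.0
    "/vive/{hand}/grip":     ("pressed",),      # 0.0 or 1.0
    "/vive/{hand}/trackpad": ("x", "y", "touched"),
}

def addresses_for(hand: str) -> list:
    """Expand the scheme for one controller ('left' or 'right')."""
    return [pattern.format(hand=hand) for pattern in CONTROLLER_PARAMS]
```

Keeping one address per data group (rather than one giant message) would let each Max for Live device subscribe only to the parameters it maps.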
Progress in these two areas should be enough to let us build a range of expressive electronic instruments.
Having performed DJ/VJ sets together before, Ryan and I wanted to explore ways of increasing the interactivity between the performance applications we use, to create a more dynamic experience for both ourselves and our audience. The next step was to try to bridge the gap between our technical interests and another creative realm, the circus and flow arts. This is more or less where the project stems from.
In doing this, we’ve effectively prototyped a means of interfacing circus (or dance, etc) performers with our computer systems. Vive/Live is our means of interfacing the Vive with Ableton Live.
There are numerous ways in which we envision Vive/Live being used. The primary application we’ve envisioned for ourselves is as a gestural control system in a live setting; however, it has plenty of uses in a studio/production context too. For example, I foresee it being very useful for recording expressive parameter automation, and for acting as a modulation source.
First port of call is to dive into Max and start learning how to work with OSC in that environment. Once that’s done, developing the first purpose-designed Max for Live device shouldn’t be too tricky.
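For the curious, the decoding that Max’s OSC objects perform on incoming packets isn’t complicated. A minimal hand-rolled parser for simple float-only OSC messages might look like this sketch (not production code, and ignoring bundles, other argument types, and error handling):

```python
import struct

def parse_osc(packet: bytes):
    """Decode a simple OSC message whose arguments are all float32."""
    def read_string(buf: bytes, offset: int):
        end = buf.index(b"\x00", offset)
        s = buf[offset:end].decode()
        # skip the NUL padding: OSC strings align to 4-byte boundaries
        return s, end + (4 - end % 4)

    address, offset = read_string(packet, 0)
    type_tags, offset = read_string(packet, offset)  # e.g. ",fff"
    args = []
    for tag in type_tags[1:]:
        if tag == "f":
            args.append(struct.unpack_from(">f", packet, offset)[0])
            offset += 4
    return address, args
```

In Max itself, a `udpreceive` object plus OSC routing objects would do this work, but seeing the byte layout makes it much easier to debug when messages go astray.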
After that? Well, start experimenting. I have a growing list of concepts, thoughts, and ideas that I wish to test out. Hopefully I’ll be able to either revisit the South or otherwise find another Vive system locally to work with. Then, it’s play time!