Not just a ‘theremin’; that totally downplays the power of MIDI. Someone else mentioned the Mi.Mu gloves, and I love the idea of a vision-based controller: almost everyone has a phone, tablet or laptop with a camera, especially if they're making music.
I also love that this could blur the line between playing music and dancing.
Great job OP, thanks for sharing.
That looks great!
Do you have any suggestions on how to hook this up to Logic, for anyone who hasn't used MIDI before?
I wonder how this would perform under live stage lighting conditions, i.e. strong coloured lights and high contrast.
Open the app and open Logic Pro. Create a MIDI track in Logic, then try waving at the app; the track should automatically receive MIDI messages from all channels and all MIDI devices.
Then, if you want the track to listen only to a specific MIDI channel from a specific device, for example AirBending channel 2, pick it from the dropdown in the MIDI inspector section of that same MIDI track.
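For anyone curious what Logic is actually seeing there: an app like this can publish itself as a virtual CoreMIDI source and push packets into it. A minimal sketch of that idea (placeholder names, not the app's actual code):

    import CoreMIDI

    // Minimal sketch: a virtual MIDI source that a DAW like Logic lists as a device.
    // "AirBendingSketch" is just a placeholder name for illustration.
    var client = MIDIClientRef()
    MIDIClientCreate("AirBendingSketch" as CFString, nil, nil, &client)

    var source = MIDIEndpointRef()
    MIDISourceCreate(client, "AirBendingSketch" as CFString, &source)

    // Send one Control Change on MIDI channel 2 (status byte 0xB1).
    func sendCC(_ controller: UInt8, value: UInt8) {
        var packet = MIDIPacket()
        packet.timeStamp = 0
        packet.length = 3
        packet.data.0 = 0xB1        // CC message, channel 2
        packet.data.1 = controller  // e.g. CC 1 (mod wheel)
        packet.data.2 = value       // 0-127
        var list = MIDIPacketList(numPackets: 1, packet: packet)
        MIDIReceived(source, &list) // pushes the data out of the virtual source
    }

    sendCC(1, value: 64)

Once a source like that exists, the channel/device filtering in Logic's MIDI inspector works exactly as described above.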
From the article: "When using AirBending for pitch control, you can lock your gestures to specific musical scales and keys. This ensures every note you play is perfectly in tune with your composition"
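That scale lock is presumably just pitch quantization: snap the hand's normalized position to the nearest allowed note. A rough sketch of the idea (the scale, root and range here are made up, not the app's actual values):

    // Snap a normalized hand position (0.0-1.0) to a note of a chosen scale.
    let cMajor = [0, 2, 4, 5, 7, 9, 11]   // semitone offsets from the root
    let root = 60                          // middle C
    let octaves = 2

    func lockedNote(for position: Double) -> Int {
        let allowed = (0..<octaves).flatMap { octave in
            cMajor.map { root + 12 * octave + $0 }
        }
        let index = Int(position * Double(allowed.count))
        return allowed[min(max(index, 0), allowed.count - 1)]
    }

    print(lockedNote(for: 0.5))   // always lands on a scale tone, never in between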
Reminds me of the Moog Theremini - that was a fun bit of kit.
https://en.wikipedia.org/wiki/Theremini
Imogen Heap demonstrating her Mi.Mu gloves: https://www.youtube.com/watch?v=ci-yB6EgVW4
Using the gloves during an NPR Tiny Desk concert: https://www.youtube.com/watch?v=3QtklTXbKUQ&t=555s
Theremin.
Here's someone playing a theremin who's reasonably good.[1]
[1] https://www.youtube.com/watch?v=K6KbEnGnymk
This instrument's timbre and tone are literally dream shit to me: so wavy and, I don't know, unearthly/otherworldly.
Seems like the same software could be used to soundtrack Tai Chi exercises. Would be pretty neat.
It seems possible, since Apple’s Vision framework can read body pose too. Maybe I can try it for the next update.
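For reference, the body-pose side of Vision looks roughly like this; just a generic sketch, nothing from the app itself:

    import Vision
    import CoreGraphics

    // Sketch: pull one body-pose joint out of a frame with Vision.
    // Returns the right wrist in normalized (0-1) image coordinates.
    func rightWristPosition(in image: CGImage) throws -> CGPoint? {
        let request = VNDetectHumanBodyPoseRequest()
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try handler.perform([request])

        guard let body = request.results?.first else { return nil }
        let points = try body.recognizedPoints(.all)
        guard let wrist = points[.rightWrist], wrist.confidence > 0.3 else { return nil }
        return wrist.location   // ready to map to a note, CC value, or tempo
    }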
Great work, but wouldn't an iPhone with the LiDAR depth sensor be a better device?
It’s in the plan to expand this app to iPhone, but I haven’t tried LiDAR, so I decided to release for macOS first.
Also, with iPhone I have to figure out how to transmit the MIDI data to the DAW on a laptop. Most likely via USB or over the network.
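For the network route, CoreMIDI on iOS already ships RTP-MIDI sessions; enabling one is roughly this much (sketch only, the Mac side then connects via Audio MIDI Setup > Network):

    import CoreMIDI

    // Sketch: turn on CoreMIDI's built-in network (RTP-MIDI) session on iOS.
    let session = MIDINetworkSession.default()
    session.isEnabled = true
    session.connectionPolicy = .anyone   // or .hostsInContactList / .noOne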
Apple devices can do MIDI over Bluetooth. I've used this in the past to send VisionPro hand tracking data as MIDI.
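On iOS/iPadOS the stock Bluetooth MIDI pairing screen lives in CoreAudioKit, and presenting it is about all there is to it (a sketch; the flow may differ on other Apple platforms):

    import UIKit
    import CoreAudioKit

    // Sketch: show Apple's built-in Bluetooth MIDI pairing UI.
    // Paired devices then appear as normal CoreMIDI endpoints.
    final class MIDISettingsViewController: UIViewController {
        func showBluetoothMIDIPairing() {
            let central = CABTMIDICentralViewController()
            present(UINavigationController(rootViewController: central), animated: true)
        }
    }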
I do this with the "Apple Lightning to USB Camera Adapter". The iPad basically talks to the MIDI sequencer over USB.
I wonder if a Linux version will be available.
Since I developed it using Apple's Vision framework, the current focus is still Apple devices. So there are no plans to develop for Linux or Windows in the near future.
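(For context, the hand-tracking part of Vision is along these lines; a generic sketch rather than the actual app code, and it has no drop-in equivalent on Linux or Windows.)

    import Vision
    import CoreGraphics

    // Generic sketch of Vision hand tracking, the Apple-only piece.
    // Returns the index fingertip in normalized (0-1) image coordinates.
    func indexFingertip(in image: CGImage) throws -> CGPoint? {
        let request = VNDetectHumanHandPoseRequest()
        request.maximumHandCount = 2
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try handler.perform([request])

        guard let hand = request.results?.first else { return nil }
        let tip = try hand.recognizedPoint(.indexTip)
        guard tip.confidence > 0.3 else { return nil }
        return tip.location   // origin at bottom-left
    }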
This is really cool, thanks for sharing.