Audio + Movement to MIDI
Convert audio and movement to MIDI so you can control a digital instrument with any sound source, including physical instruments or your voice.
I created an application using C++/JUCE that converts incoming audio, either from a musical instrument or microphone, into MIDI Notes. It then sends those notes to your Digital Audio Workstation (DAW) or soft-synth. The application also receives Accelerometer and Gyroscope values wirelessly from an Arduino, and converts those to MIDI CC values that control parameters in your DAW or soft-synth.
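To illustrate the sensor-to-CC side of this, here is a minimal standalone C++ sketch — not the app's actual JUCE code — showing how a sensor reading from an assumed range (e.g. an accelerometer axis in ±1 g) could be mapped onto the 7-bit MIDI CC range of 0–127. The range limits and clamping behaviour are illustrative assumptions:

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical helper: map a sensor reading in [minVal, maxVal]
// onto the 7-bit MIDI CC value range [0, 127].
// Readings outside the range are clamped rather than wrapped.
int sensorToMidiCC(float value, float minVal, float maxVal)
{
    float normalized = (value - minVal) / (maxVal - minVal);
    normalized = std::clamp(normalized, 0.0f, 1.0f);   // requires C++17
    return static_cast<int>(std::round(normalized * 127.0f));
}
```

For example, with an accelerometer axis calibrated to ±1 g, a reading of 0.0 g lands at the centre of the CC range (64), and the extremes map to 0 and 127.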
You can sing or play a musical instrument, and use it as a MIDI Controller. Because the application is a standalone app, not a plug-in, the DAW sees the app as a MIDI Controller. So you can assign the different sensor values to control whatever parameters you want within the DAW using MIDI Mapping. You can also send the MIDI data out of your computer to control MIDI Hardware.
There is more to the music we hear than the notes coming out of a musical instrument. The physicality of an instrument shapes the way musicians play it, affecting the music beyond the instrument's sound. By capturing both the audio and the physicality of an instrument, and converting them to data — in this case MIDI — any instrument can be used to create any sound, in any instrument style.
Since the application communicates wirelessly with the Arduino, the sensors can be separated from the associated instrument and musician. They could be mounted on other performers, or perhaps dancers, so that there is interactivity between different performers and the timbral qualities of the digital instrument.
Side story: This whole thing came about because I wanted to be able to play my bass guitar and have crazy synth sounds come out, for Drum & Bass with live instruments. The rabbit hole kept getting deeper.
Frequently Asked Questions
What inspired you to do this?
I always wanted to play my bass guitar but have completely unreal sounds come out. This essentially lets me play my bass and produce any sound a digital instrument can synthesize or sample. From there, I needed a way to control different parameters as if I were using a traditional MIDI Controller. Since both my hands would be playing the bass, capturing gesture was an obvious alternative to knobs and buttons.
How long did it take to make it?
The first working version took essentially one semester to make; it was the final project for my master's program. Prior to that semester, I attempted similar applications using MaxMSP but did not like the results for a variety of reasons. Since then, I have been tweaking, improving, and modifying it into the version you see now.
How long have you been doing things like this?
Audio programming with C++? About a year. Making, hacking, modifying, and breaking musical instruments? Since I first de-fretted and upgraded the pickups on a bass guitar when I was a teen.
How much did this cost to do?
Since it's mostly software, the only things I needed to buy were the Arduino Nano 33 IoT, a good USB cable for it, and a battery pack. Roughly £25.
Have you done other things like this?
I have made a synth plugin. I am currently working on a compressor plugin. And I am working on another project that will convert a bass guitar into a digital bass guitar, using another C++/JUCE app on a Raspberry Pi, with an Arduino sending data from multiple sensors to control effects directly onboard.
What did you wish you knew before you started this?
Not much. Part of the fun is figuring out and learning all the things.
Are there plans available to make this? Do you sell this?
If I can get it to reliably perform on a level that satisfies me, I would love to sell it. Until then, the source code is available here: https://github.com/RFullum/MastersProject1
What’s next?
Like I mentioned above, I'm currently working on two other projects: a compressor plugin and a Digital Bass Guitar. I have some ideas for other synthesizers I want to program. I have also experimented with audio-to-color analysis in the past and would like to revisit it in a more robust form, using real-time FFT analysis and blending colors proportionally based on the harmonic frequencies' amplitudes.
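As a rough sketch of that proportional colour-blending idea: each spectral band gets a colour, and the output is the amplitude-weighted average of the band colours. The band colours, the `RGB` struct, and the weighting scheme here are illustrative assumptions, not code from the original experiment:

```cpp
#include <vector>

// Hypothetical RGB colour with channels in [0, 1].
struct RGB { float r, g, b; };

// Blend band colours in proportion to their spectral amplitudes
// (e.g. FFT bin magnitudes grouped into bands). Silence maps to black.
RGB blendByAmplitude(const std::vector<RGB>& bandColours,
                     const std::vector<float>& amplitudes)
{
    RGB out { 0.0f, 0.0f, 0.0f };
    std::size_t n = bandColours.size() < amplitudes.size()
                        ? bandColours.size() : amplitudes.size();

    float total = 0.0f;
    for (std::size_t i = 0; i < n; ++i)
        total += amplitudes[i];

    if (total <= 0.0f)
        return out;                       // no energy: return black

    for (std::size_t i = 0; i < n; ++i)
    {
        float w = amplitudes[i] / total;  // proportional weight
        out.r += bandColours[i].r * w;
        out.g += bandColours[i].g * w;
        out.b += bandColours[i].b * w;
    }
    return out;
}
```

With a red band and a blue band at equal amplitude, this yields an even purple mix; as one band's amplitude dominates, the colour shifts toward that band.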
Resources
amandaghassaei. (n.d.). Arduino Frequency Detection. Instructables. Retrieved June 26, 2020, from https://www.instructables.com/id/Arduino-Frequency-Detection/
Andersen, K., & Gibson, D. (2017). The instrument as the source of new in new music. Design Issues, 33(3), 37–55. https://doi.org/10.1162/DESI_a_00450
Gibson, J. J. (1977). The theory of affordances. In R. Shaw & J. Bransford (Eds.), Perceiving, acting, and knowing: Toward an ecological psychology (pp. 67–82). Hillsdale, NJ: Lawrence Erlbaum Associates. https://hal.archives-ouvertes.fr/hal-00692033
Gillian, N., & Nicolls, S. (2012, September). A Gesturally Controlled Improvisation System for Piano.
Hamilton, M. (2018). JUCE on Raspberry Pi. mhamilt.github.io. https://mhamilt.github.io/blog/2018/10/23/JUCE-on-Pi.html
McLeod, P., & Wyvill, G. (2005). A smarter way to find pitch.
Vitos, B. (2014). Along the Lines of the Roland TB-303: Three Perversions of Acid Techno. Dancecult, 6(1). https://doi.org/10.12801/1947-5403.2014.06.01.14
Wessel, D., & Wright, M. (2002). Problems and Prospects for Intimate Musical Control of Computers. Computer Music Journal, 26(3), 11–22.
As well as JUCE's documentation and forum: https://juce.com/
and The Audio Programmer's YouTube channel: https://www.youtube.com/channel/UCpKb02FsH4WH4X_2xhIoJ1A
Rob: Music Producer, Composer, Performer, Audio Programmer, Sound Designer, Maker, etc.
I make music and sounds, and programs that make music and sounds.
Connect with Rob
How I can help you:
I produce original electronic dance music, from sound design through composition, mix-down, and mastering. I've also produced and remixed artists from Indie Rock, to Pop/R&B, to Hip Hop. I also do Sound Design. And I can do different music tech: C++/JUCE audio plugin development, Unity/Wwise audio programming in C#, sketching in Processing 3 for audiovisuals, Arduino making/coding, Raspberry Pi & Linux, simple Python scripting.
How you can help me:
More experienced C++ devs would probably know the most efficient ways to implement things; I often feel like I'm reinventing the wheel. Also, GUI devs with design backgrounds could probably make better GUIs, faster, than I could by myself.