The Ætheric: making music from EMI
May 2023

TLDR: I made a pair of gloves that pick up the close-range electromagnetic interference around an object and modify its pitch and panning for an aethereal auditory experience. I used an analog inductive-coupling frontend, bend-sensitive material, and two ESP32s with a Python receiver to communicate with Ableton Live.
Perceiving the Imperceptible
I’ve been interested in sonification for a while: conveying non-auditory (or non-speech) data in an auditory way. One prime candidate right up my alley is the sheer amount of electromagnetic waves we’re drenched in on a daily basis from our power outlets, Wifi routers, Bluetooth earbuds, etc.
Even though these devices operate in different frequency bands way above our hearing range (except power lines at 60Hz), they still interfere with each other, producing waves that end up living in our audible range (we’re talking at most -50dBm or 8 orders of magnitude smaller than your phone charger’s power). Note though that these waves contain electromagnetic energy, not sound energy–they need a mechanism to turn them into actual air molecules vibrating.
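For a rough sense of scale (my own back-of-the-envelope arithmetic, not a measurement from the project): -50 dBm is 1 mW × 10^(-50/10) = 10 nW, while a roughly 1 W phone charger sits around +30 dBm, so the interference is on the order of 80 dB, or eight orders of magnitude, weaker.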
As I was finishing up 21M.370 Digital Instrument Design, I wanted to make an instrument that could sonify these interferences, a sort of lute for the “aether,” the invisible medium scientists once thought light traveled through. I also wanted it to have a certain aesthetic:
- Whatever antenna or pickup I used for these waves should ideally be mobile so I can move around and sample different objects in real time
- I should be able to manipulate the sound I’m getting from the pickup in an easy interface
- All the electronics should be visible
The Antenna
Ian Hattwick, who teaches the class, pointed me to a few existing products with similar ideas. For example, SOMA’s Ether V2 is a wideband receiver that AM-demodulates everything it receives. The Elektrosluch 3+ uses inductors to couple with nearby EMI and jacks it up to audible range with a beefy op-amp. I chose to focus on the latter because, with my limited circuit knowledge at the time1, it being open source was incredibly helpful.
Elektrosluch’s EMI sensor circuit simplified
Even though I didn’t know a lot about circuit design, I tried to understand what each part did and spec parts according to what I could get. I wanted:
- A large inductance so I could get a larger induced voltage. 10 mH actually turned out to be the biggest I could get on Amazon at a good size and price
- A small resistance to minimize attenuation. I had a few 100 Ω resistors lying around that could do the trick.
- A large coupling capacitance to AC-couple and minimize harmonic distortion (I want just the EMI, not op-amp noise!). The lab had beefy 100 µF capacitors, so I went with those.
Finally, the gain-setting resistors were chosen for a high gain (>50 dB) up to 20 kHz. I wasn’t exactly sure how much gain I’d need, but I guessed this number was high enough to start. Also, Elektrosluch’s schematic uses an OPA1612, which doesn’t come in an 8-pin DIP through-hole package, so I opted for the LM4562N, a slightly cheaper through-hole option with support for the ±2.5 V rails from my 5 V USB supply.
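As a sanity check on that gain target (my own arithmetic, not a figure from the Elektrosluch docs): 50 dB of voltage gain is 10^(50/20) ≈ 316×. If the stage were a standard non-inverting amplifier with gain 1 + Rf/Rg, a hypothetical Rf = 330 kΩ and Rg = 1 kΩ would land at about 331×, or roughly 50.4 dB.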
Circuit tested with analog audio out going directly to an audio interface into Ableton, then quickly put onto a piece of packing tape for a test fit
Samples from a switching regulator in the power brick, an iPad that starts using Wifi, and earbuds that start Bluetooth advertising. You can even hear Lorde’s Royals from the radio
Making Sensors
Besides just picking up raw audio, I wanted a way to change how it sounded via hand gestures, with an ESP32 at my disposal. This played into the “EMI wizard harnessing the latent sound of objects” vibe. I played around with the idea of a solid piece, similar to an ergonomic mouse, that would house the antenna circuitry with buttons for my fingers to play different notes, but I opted for something I thought was easier to fabricate: a pair of gloves with bend sensors on each finger to modify pitch continuously.
Making cheap bend sensors with Velostat, copper tape on both sides, and a Cricut sealer
Since we didn’t have any actual bend sensors lying around, I opted to make some with Velostat, a material whose resistance changes with pressure. By putting copper tape on each side (stuck onto the plastic, not Velostat), then soldering on jumpers, I hoped I could use the ESP’s ADC to track bendiness.
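In the simplest arrangement, the Velostat forms one leg of a voltage divider against a fixed resistor (take this as the generic approach rather than my exact wiring), so the ADC sees V_adc = Vcc × R_fixed / (R_fixed + R_velostat); as a finger bends and presses the copper into the Velostat, its resistance drops and the reading rises.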
Bend sensors attached to each finger with some 3D-printed joint guides I glued to some black gloves from Amazon
After fiddling with some smoothing filters, I managed to track an open palm and a curled fist pretty well. The main problem was that the sensors were not reliable across sessions, so they didn’t really have a fixed “range.” So I wrote a calibration routine that expects an open-and-close gesture at the start of each session to record the min and max ADC values. These extrema would then continuously tighten, and relax again whenever the current values fell outside them.
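Here’s a minimal Python sketch of that calibration logic (the variable names and the smoothing/relaxation constants are illustrative, not the exact values I used):

```python
class BendCalibrator:
    """Tracks ADC extrema for one finger and maps raw readings to [0, 1]."""

    def __init__(self, relax_rate=0.001, smooth=0.2):
        self.lo = None                # smallest ADC value seen (open palm)
        self.hi = None                # largest ADC value seen (curled fist)
        self.relax_rate = relax_rate  # how quickly the extrema drift back together
        self.smooth = smooth          # exponential smoothing factor for the output
        self.value = 0.0

    def update(self, raw):
        raw = float(raw)
        # The first reading (from the opening open/close gesture) seeds both extrema.
        if self.lo is None:
            self.lo = self.hi = raw
        # Let the extrema slowly tighten toward each other...
        span = self.hi - self.lo
        self.lo += self.relax_rate * span
        self.hi -= self.relax_rate * span
        # ...and relax outward whenever the current reading escapes them.
        self.lo = min(self.lo, raw)
        self.hi = max(self.hi, raw)
        # Normalize into [0, 1] and smooth to tame ADC jitter.
        norm = (raw - self.lo) / max(self.hi - self.lo, 1e-6)
        self.value += self.smooth * (norm - self.value)
        return self.value
```

Each finger gets its own instance, and the normalized 0-1 output is what ends up driving the pitch shift.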
For further gestural flexibility, I also added an IMU so that the roll of one hand could change the audio panning. The audio setup in the classroom had two large subwoofers, so I wanted to take advantage of that locality of sound to further emphasize waves traveling in the aether. This also required some averaging filters.
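The filter for the roll was nothing fancy: a moving average over the last few IMU samples, mapped to a pan position. A sketch of the gist, with an illustrative window size and roll range:

```python
from collections import deque

class RollToPan:
    """Smooths IMU roll readings and maps them to a pan position in [-1, 1]."""

    def __init__(self, window=10, max_roll_deg=60.0):
        self.history = deque(maxlen=window)  # last few roll readings
        self.max_roll_deg = max_roll_deg     # roll that corresponds to hard left/right

    def update(self, roll_deg):
        self.history.append(roll_deg)
        avg = sum(self.history) / len(self.history)
        # -1 is hard left, +1 is hard right; clamp so extreme tilts just saturate.
        return max(-1.0, min(1.0, avg / self.max_roll_deg))
```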
Putting It All Together
The dream was to go completely wireless, giving myself the freedom to move around and really explore sounds. In reality, audio transmission was the main bottleneck. For ease of implementation, I settled on the following:
Full system to minimize number of wires: only two super long 1/8” stereo cables from each glove
I was able to connect the ESPs to MIT’s WPA2-Enterprise (PEAP) Wifi after some fanfare, and UDP packets were streaming in. The next thing to do was control Ableton from Python, which AbletonOSC made possible. By setting up a local OSC client to talk to Ableton and putting my pitch-shift and panning effects on separate tracks, I could use the system end to end!
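The receiver ended up being only a handful of lines. Here’s a rough sketch of the idea: the packet format and port for the gloves are illustrative, the client uses the python-osc library, and the OSC addresses and the 11000 port are AbletonOSC’s documented defaults as I remember them, so double-check them against its README:

```python
import socket
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

# AbletonOSC listens for OSC messages on localhost:11000 by default.
ableton = SimpleUDPClient("127.0.0.1", 11000)

# Plain UDP socket for the gloves' sensor packets.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5005))  # whatever port the ESP32s are told to send to

PITCH_TRACK, PAN_TRACK = 0, 1  # track indices holding the pitch-shift and pan effects

while True:
    data, _ = sock.recvfrom(64)
    # Illustrative packet format: "glove_id,bend,pan" as comma-separated floats.
    glove_id, bend, pan = (float(x) for x in data.decode().split(","))
    if glove_id == 0:
        # Set a device parameter on the pitch-shift track (device 0, parameter 1 here).
        ableton.send_message("/live/device/set/parameter/value",
                             [PITCH_TRACK, 0, 1, bend])
    else:
        ableton.send_message("/live/track/set/panning", [PAN_TRACK, pan])
```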
Conclusion
I unfortunately can’t find videos of the live performance I gave for 21M.370, but I dressed in all black with sunglasses and brought in mice, keyboards, power cords, routers, and headphones as objects to “sample.” I started with more ambient sources (such as power bricks and 60 Hz outlets) to play around with pitch and panning, then moved on to triggering larger squeals on devices by using Wifi and Bluetooth. A few portable batteries came unplugged, but the show must go on!
I certainly feel the project is in dire need of better physical construction, so I see a new iteration in the future. Besides that, I enjoyed making something that wasn’t just on the computer for the first time; there were a lot of nuances both in aesthetic choice and actual technical execution. In the meantime, I’m interested in how I can turn larger things, like my saxophone, into an antenna. Stay tuned!
-
MIT just got rid of its pure EE major. It’s pretty ridiculous I could’ve graduated without learning about a BJT ↩