Your Body on Music

Ongoing

Funded by the City University of New York

I have held an interest in the rhythms of the body since I was in college. Likely because of my training as a drummer, I have always felt that the sounds produced by my controlled body were in some way informed by the sounds produced by my uncontrolled body. In this project, my intention is to see whether that works the other way around as well.

As a starting point, it’s rather simple. I suspect that the body reacts strongly to external influences, especially in the form of sound. This is, in my mind, primarily because sound is the fastest of the senses to be registered by the brain. It takes just a few milliseconds for us to ‘hear’ something in the world. Yes, light travels the most quickly through space, but our brain takes a comparative eternity (around 160 milliseconds) to register it.

The evolutionary reasoning behind this is likely related to predators, and danger more generally, but, nonetheless, it persists inextricably as part of our sensory experience as humans. While much of our world shifts toward the visual—emphasizing stimuli that capture our eyes in order to speak to the rest of us—this relationship to sound, however artful or practical it may be, is still fundamental.

Funding for the project—namely, the sensors used—is provided by the City University of New York.

Part I:

The ultimate goal is to create a standalone interface that satisfies the following requirements: it should be portable, have the capacity to measure heart rate, blood oxygen, and skin temperature, be modular (so it can be built up or reduced as needed), and, lastly, provide historical data.

These are all possible, with no exceptions, using my 2019 MacBook and some sensors I acquired from a few different places. But, rather importantly, the setup is unreliable. Establishing (and reëstablishing, ad nauseam) a wired or BLE connection through software last updated in 2018 is a labor of love. The sensors I’m using are highly delicate in their readings: in the example I share below, approximately one quarter of the heartbeat readings I receive are pinned at the sensor’s own maximum or minimum values, and I have to filter them out. And TouchDesigner, as wonderful a piece of software as it is, cannot export a program to be run outside its own environment.
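The rail-value filter amounts to discarding any sample pinned at the sensor’s floor or ceiling before smoothing what remains. A minimal sketch of that idea, in Python; the limits (`MIN_RAW`, `MAX_RAW`), the moving-average window, and the sample values are all hypothetical, not the actual numbers from my setup:

```python
# Hypothetical sensor rails; real values depend on the specific sensor.
MIN_RAW = 0
MAX_RAW = 1023

def filter_readings(readings, min_raw=MIN_RAW, max_raw=MAX_RAW):
    """Drop samples pinned at the sensor's minimum or maximum rail."""
    return [r for r in readings if min_raw < r < max_raw]

def smooth(readings, window=3):
    """Simple moving average over the surviving samples."""
    if len(readings) < window:
        return readings
    return [sum(readings[i:i + window]) / window
            for i in range(len(readings) - window + 1)]

# Made-up raw stream: the 0s and 1023s stand in for rail hits.
samples = [0, 72, 74, 1023, 76, 73, 0, 75]
clean = filter_readings(samples)  # rail hits removed
trend = smooth(clean)             # smoothed heart-rate trend
```

The same logic could live in a TouchDesigner CHOP or script operator; the point is only that the rail hits are dropped, not clamped, so they never drag the average toward the sensor’s limits.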

In essence, the example below is a first draft, if not a prototype, of the accuracy that can be achieved with low-level equipment and a GPU that isn’t quite up to the task I’m demanding of it.

The subject here is my fiancée, Danielle, a classical pianist. I asked her to choose a song that, no more and no less, meant something to her, and so she sat and listened to Rachmaninoff for this experiment. I isolated this one-minute excerpt because you can see her heart follow the song, both in anticipation and in the actual listening experience. To me, it feels like a glimpse into an expert’s mind without using a fully fledged EEG system, something which I, of course, would love to use.

More soon.
