
The Studio of Eric Valosin

Saturday, October 15, 2016

OOOA Part TwOOOA

The performance date for our "Object Oriented Ontological Actions" crept nearer as I continued to refine my computer vision speech synthesizer project from my last post, The Word That Speaks Itself (Objectless Oriented Program).

I adjusted the variables to locate the hand positions relative to the person's body rather than to the screen itself, which helped account for a variety of user positions near or far from the camera. I also added in the complete array of spoken code, and tweaked the timing of the voices for a closer unison.
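The idea behind that body-relative mapping can be sketched in plain Java (a hypothetical illustration, not my actual Processing code; the joint coordinates and the shoulder-width normalization are stand-ins):

```java
// Hypothetical sketch of body-relative hand tracking: instead of raw
// screen coordinates, express the hand position relative to the torso
// and normalize by shoulder width, so the mapping holds whether the
// user stands near the camera or far from it.
public class BodyRelative {
    // Returns the hand's horizontal offset from the torso, measured in
    // units of shoulder width (joint x-coordinates assumed to come from
    // the Kinect's skeleton tracking).
    static float relativeX(float handX, float torsoX,
                           float leftShoulderX, float rightShoulderX) {
        float shoulderWidth = Math.abs(rightShoulderX - leftShoulderX);
        if (shoulderWidth == 0) return 0; // avoid divide-by-zero before calibration
        return (handX - torsoX) / shoulderWidth;
    }
}
```

A hand one shoulder-width to the right of the torso reads as 1.0 whether the user is two feet or ten feet from the sensor, which is the point of the normalization.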

Unfortunately I had to input all 1,478 "words" manually, but I did at least devise a way to automate fitting all those entries into the necessary code blocks. By listing the items in separate columns of a Numbers spreadsheet, I could generate each element as "speech[__] = '__';", where the first blank was numbered algorithmically and the second blank held the respective word in the array. A formula then merged all the rows into a single column, which I pasted into Pages, where Find/Replace let me systematically remove the extra spaces and convert the curly quote marks to straight quotes. From there the whole thing pasted straight into Processing.
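In hindsight, a few lines of code could have replaced the whole spreadsheet-and-Pages dance. A hypothetical Java sketch of the same transformation (the class and method names are made up for illustration; the real list had 1,478 entries):

```java
import java.util.ArrayList;
import java.util.List;

// Generates the "speech[i] = \"word\";" assignment lines directly,
// instead of assembling them by spreadsheet copy-and-paste.
public class SpeechArrayGen {
    static List<String> generate(List<String> words) {
        List<String> lines = new ArrayList<>();
        for (int i = 0; i < words.size(); i++) {
            // Straight double quotes, so the output pastes cleanly into
            // Processing (which, being Java, needs "" for strings).
            lines.add("speech[" + i + "] = \"" + words.get(i) + "\";");
        }
        return lines;
    }

    public static void main(String[] args) {
        for (String line : generate(List.of("void", "setup", "draw"))) {
            System.out.println(line);
        }
    }
}
```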

Our first performance...


was slated for Gallery Aferro in mid May, and would be followed by a second performance in Brooklyn in June. I met with five of the other participating artists to discuss the framework of game rules, inspired by John Zorn's COBRA, that would dictate the performance.

Everyone's sound objects would be stationed around the gallery, and artists and viewers alike would be able to interact with them. We also contracted a jazz trio as a "house band," who (unbeknownst to them) would play standards and improvise in conjunction with the chaotic cacophony of all the sound objects joining the composition.

A big, occultish, wheel-of-fortune-style wheel would then dictate musical rules, including time signature changes, volume or tempo changes, switching instruments, offering a vocalized "prayer," playing blindly by wearing an oversized mask, using a toy sword to cut an instrument out of the music for a time, simulating your instrument's sound a cappella, or manipulating a wooden paddle in whatever less-than-sadistic way the user saw fit.

These were, of course, implausibly precise rules that most of the atonal instruments were incapable of following in the first place, which only added to the anarchic anti-structure of the whole ontological experiment.

Setting up the wheel and the tables
My setup: a Mac mini, keyboard, and Kinect sensor packed into the cabinet of an amp, with a monitor on top. The white tape denotes optimal range for interaction.

Some of the other projects included a delay-looped feedback pendulum, contact mics submerged in water with various crystals and rocks, a telegraph-turned-alarm bell, salvaged broken electronics out of which sound is coaxed (including a theremin!), and more.




One of the other artists interacting with my project

Molto Allegro!

Fortissimo!

Heidi Lorenz-Wettach Hussa's repurposed theremin and other broken electronics



Jamming

There were still several challenges to troubleshoot. The biggest difficulty was programming it to account for all the unexpected variations in environment and interaction. 




I was thankful my program apparently had the flexibility to recognize many unorthodox body shapes, but it often had a hard time detecting users, especially with multiple people in the frame. If a user's hands went behind their back, the tempo would sometimes get stuck in super slow motion, essentially freezing the program. It would also get stuck after losing a tracked user, not realizing it was no longer tracking them. Oftentimes viewers didn't know whether they were being tracked or not. And of course I wasn't thrilled by the shrunken visuals, but I'd need to adjust many of my equations before I could size up the screen.

All of this thankfully went hand in hand with the provisional, experimental nature of the show. 

Many kinks to work out before...

Off to Brooklyn!

In June we were invited to reprise our performance as a part of the "Summit: Nature, Technology, Self" conference led by the High Focus Institute in Brooklyn's Kilroy Metal Ceiling.

The venue was a gloriously defunct old warehouse, literally falling apart at the seams. It housed the projects of several other collectives and individuals, including prototypes of Terreform One's sustainable food shelters, which I got to see in person just days before news articles about them started popping up online!

Many groups held performances of one sort or another, including a DJ, a playwright, and an immersive projection/performance installation that involved an almost alien-mystic enlightenment "interview." It was tongue-in-cheek and utterly implausible, and yet it was maybe the most impactful and stirring performance I've ever engaged with.





Sitting in the queue to "interview" with the High Focus Institute.

After he helped me craft my "resume" (by infusing the objects on the table with my own personal history and ideologies), I was taken behind the curtain to speak with his female counterpart. The psychedelic projection landscape enveloping me and the strangely empathic line of questioning slowly turned into a trancelike, swaying dance that may have changed my life. I'm not even being hyperbolic.

This time, we replaced our jazz trio with a progressive noise band out of Philly. The effect was extraordinarily different!




I reworked my project a bit, building in some textual user feedback on the tracking process to help confused viewers hang in there while it calibrated, and eliminated the perpetual slow-motion glitch. Most importantly, I got it to automatically abort and restart the tracking process when a user left the screen while it still thought they were there. I also built in a fail-safe nuclear option: one stroke of the keyboard to reboot the whole program.
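Stripped of the Kinect-specific calls, that abort-and-restart logic amounts to a watchdog timer. A hypothetical Java sketch of the idea (the class name and timeout are illustrative, not my actual Processing code):

```java
// Hypothetical sketch of the lost-user watchdog: if a tracked user
// produces no new skeleton data for a grace period, abort and restart
// tracking rather than freezing on a ghost.
public class TrackingWatchdog {
    private final long timeoutMillis;
    private long lastSeenMillis;
    private boolean tracking = false;

    public TrackingWatchdog(long timeoutMillis) {
        this.timeoutMillis = timeoutMillis;
    }

    // Call whenever the sensor reports fresh joint data for the user.
    public void userSeen(long nowMillis) {
        tracking = true;
        lastSeenMillis = nowMillis;
    }

    // Call once per frame; returns true if tracking should be aborted
    // and restarted because the user silently disappeared.
    public boolean shouldRestart(long nowMillis) {
        if (tracking && nowMillis - lastSeenMillis > timeoutMillis) {
            tracking = false; // reset so the next user starts clean
            return true;
        }
        return false;
    }

    public boolean isTracking() { return tracking; }
}
```

The one-keystroke reboot is then just a separate, blunter layer on top: if even the watchdog wedges, a key press tears the whole sketch down and starts over.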



And it turned out I'd need it. Though the open space behind my piece's station caused some confusion in user tracking, it worked brilliantly... until our noise band lived up to its name and cranked it up to 12 or 13 (11 is so '80s).

I still don't know exactly what went wrong, but the leading theory is that the sound vibrations were so intense they disrupted the computer's hard drive and crashed the whole thing. Not just the program, but the whole computer. Nothing would respond to anything. Either that, or the sheer number of amps and instruments produced wonky electromagnetic fields that messed with the hard drive. In any case, I spent a solid portion of the performance fighting with the technology and becoming a bystander. At least the piece had its moment in the sun beforehand, and the failure added to the anarchic feel of the whole endeavor.

I guess you could say it was a very Object Oriented moment for me, in the truest and most Heideggerian sense, as the tool's malfunctions betrayed the tool to me in all its glorious Ontological dysfunction. Mission accomplished?

Lots more to figure out before there's ever a third iteration of this project, but until then, if you're interested, here's a link to my full Processing code. Feel free to excerpt and adapt, just give credit where it's due.

This was a really fun challenge, and it afforded some experiences I likely wouldn't otherwise have gotten: working with the Oculus Collaborative and infiltrating Brooklyn's hipster-grunge experimental music scene. All in the name of Ontology.


