To close out a very busy 2016, I was invited to have a solo show at Warren County Community College for the month of December. Ontotechnologic: Where the ontological and the technological meet. Here's how the work came to be:
After visiting the gallery space, which existed in a hallway, what struck me was the long stretch of windows on one wall that overlooked the library down below, as well as the inherent traverse of the space. I decided to make this the central reference point for the work's site specificity.
As this solo show gave me free rein to self-curate art that represented my entire body of work, I wanted to make sure I included some projection, some drawings, some computer vision, and some site-specific installation. So I decided to make these windows a focal point of the exhibition with a new, large piece on those windows, as well as a corresponding projection negation project on the wall opposing it, along with several existing pieces that dealt with the theme of windows and site-specificity.
Google SketchUp rendering of the WCCC gallery space
The first major new project to tackle was the focal point, the windows. On the back of my recent show at Andover Newton Theological School's Sarly Gallery, which also had a massive wall of windows, I had a plan up my sleeve. Using theater lighting gels, I could create a stained glass-like mosaic that could be temporarily adhered using suction, sort of like DIY window decals. I began using the gallery floorplan as the inspiration for the layout, converting it into abstract geometry, with doorways assuming the typical T-shape that's conventional to Yamantaka mandalas.
The name owes to the site specificity of the work. Just as a file path references the location of a file, each of these works references the venue in which it is installed. The piece in turn becomes an extension of the windows themselves, in a sense using its "pixels" of color to digitize the space in an analog way. Hence, .windows/WCCC.
3D rendering of the windows
My initial design concept, with swatches of gels represented below the image for purposes of mapping them to the design.
Fitting the colored gels into the shapes (and hoping for maximum material efficiency and little waste). However, a fun surprise came when I visited the site and found the bottom row of window panes to be whited out. Some adjustments would be needed. Either way, the design sat a bit higher on the windows than I would have liked for viewing purposes, and many of its edges didn't line up with the window leading, which would mean peeling, unsightly seams, and an all-around pain in the butt to cut and piece together.
My final design accommodating the altered window dimensions. This design allowed for a much more elegant geometry, better window placement, and, to be honest, less cutting.
Then it was a matter of cutting out the gels to the proper sizes, and then squeegee-ing them onto the windows.
The view from the library down below, on the other side of the windows
Obligatory self-portrait looking through my reflection in .windows/WCCC, to my reflection in the mirrors across the expanse, back through my installation. Also a glimpse of W(ith)indow 4 in the reflection. More on that to come below...
Windows glimpsed through windows
Ontotechnologic
The next major new project to tackle would be to use the window piece as inspiration for a new, corresponding projection piece. As the ambient light would be bright, not dark, I decided to go in the direction of my projection negation series rather than my Hyalo series. This would also capitalize on the hallway's role as a thoroughfare, allowing students to inadvertently interact with the pieces and be startled by nice moments of discovery on their way to class.
A digitized version of .windows/WCCC. This would be projected on the wall, then taped off, painted, and then the colors of the projection would be calibrated to optically blend with the painted colors.
The calibrated colors that would be projected onto the painting. With the proper blend of complementary colors, every projected color can be made to negate its painted counterpart, resulting in tones of gray. In some similar projects I chose to have every color blend to the same neutral gray, including the background. In others I left the background uncolored, but had all the geometric shapes blend to the same gray. In this one I decided to let there be some tonal variation in the gray blocks as well.
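For the curious, the arithmetic behind that calibration is roughly sketched below, assuming a naive additive mixing model and a placeholder gray target of 128; the actual colors were tuned by eye on site, and the swatch values here are made up.

```processing
// Rough illustration of the complementary-color idea (not the on-site calibration,
// which was done by eye): given a painted color, estimate a projected color so
// that the two optically mix toward a chosen neutral gray.
color paintedToProjected(color painted, int targetGray) {
  // Assume a naive additive mix: (painted + projected) / 2 ≈ target gray.
  float r = constrain(2 * targetGray - red(painted),   0, 255);
  float g = constrain(2 * targetGray - green(painted), 0, 255);
  float b = constrain(2 * targetGray - blue(painted),  0, 255);
  return color(r, g, b);
}

void setup() {
  size(300, 100);
  noStroke();
  color painted = color(200, 60, 40);               // a hypothetical painted swatch
  color projected = paintedToProjected(painted, 128);
  fill(painted);   rect(0, 0, 100, 100);            // painted color
  fill(projected); rect(100, 0, 100, 100);          // calibrated projection color
  fill((red(painted) + red(projected)) / 2,         // their approximate optical mix
       (green(painted) + green(projected)) / 2,
       (blue(painted) + blue(projected)) / 2);
  rect(200, 0, 100, 100);
}
```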
If you're not familiar with this series of work, the premise is an attempt to create an experience of apophasis - that is, mystical negation - in which the viewers' perceptual expectations are undermined as they paradoxically reveal the painting in their shadow as they obstruct the projection. The result is a somewhat sublime interactive experience in which, true to the Greek notion of aletheia, the work is simultaneously revealed and concealed.
Artist Christine Romanell interacting with Ontotechnologic.
W(ith)indows
The last piece of the puzzle was the inclusion of two pieces from my w(ith)indow series, which also had its origins in the above-mentioned exhibition at Andover Newton. In that exhibition, I had created 8 site-specific frosted glass window projections that fit the architecture of the venue, entitled without_within. I had recently begun excerpting those windows one at a time as standalone pieces.
For this exhibition I decided to include the one that I had already built, plus a new, second one.
The two "windows," built to appear to function like monitors, but actually function as projection screens
The code was already written for without_within, but now I had to rescale and modify everything to work with two projectors and two screens, both run by a single computer, and remap the interactive zones to fit the venue (essentially a matter of cropping and isolating the vision of the depth camera in 3D space). This was made especially tricky by the fact that my Kinect sensor was occupied in another show at the time, so I was essentially coding blind, with no way to test the output. By some miracle, when I got the Kinect back with just days to spare before install, everything actually worked perfectly with only some minor tweaks (which almost never happens).
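The zone remapping boils down to something like the sketch below - a minimal illustration assuming the depth camera reports user positions in real-world millimeters; the box coordinates and zone assignments are hypothetical placeholders, not the venue's actual measurements.

```processing
// Minimal illustration of cropping the depth camera's view into interactive zones.
// Positions are assumed to be real-world millimeters; the numbers are placeholders.
boolean inZone(PVector p, float xMin, float xMax, float zMin, float zMax) {
  return p.x > xMin && p.x < xMax && p.z > zMin && p.z < zMax;
}

int zoneForUser(PVector p) {
  if (inZone(p, -1500, -200, 800, 2500)) return 1;   // zone for one screen
  if (inZone(p,   200, 1500, 800, 2500)) return 2;   // zone for the other
  return 0;                                          // ignore everyone else
}

void setup() {
  // Quick check with a made-up user position (x, y, z in mm).
  PVector user = new PVector(600, 0, 1200);
  println("user maps to zone " + zoneForUser(user));
}
```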
On to building mounts for the projectors and Mac mini, and then to begin installing.
Tidying the wires was more of a challenge than anticipated, as you can see the cable tracking up across the ceiling, then across the room, down along a door frame, to the floor where the Kinect sensor was housed and plugged in. I also had to run an extension cord down and along the base of the windows to where my other projector was located.
This is why Epson projectors are amazing - the ability to adjust each corner individually when keystone correcting is a life saver for projection mapping at odd angles.
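For what it's worth, when a projector can't do this in hardware, the same corner-pin trick can be faked in software: render everything to an offscreen buffer and texture it onto a quad whose corners get nudged to fit the wall. A rough Processing sketch of that fallback (the corner values are arbitrary placeholders, not anything from this install):

```processing
// Software corner-pinning: draw the scene offscreen, then map it onto a quad
// whose four corners are adjusted to fit the surface (hard-coded here).
PGraphics scene;
PVector[] corners = {
  new PVector(40, 30), new PVector(600, 10),
  new PVector(620, 460), new PVector(20, 440)
};

void setup() {
  size(640, 480, P3D);
  scene = createGraphics(640, 480, P3D);
}

void draw() {
  scene.beginDraw();
  scene.background(0);
  scene.fill(255, 120, 0);
  scene.ellipse(scene.width / 2, scene.height / 2, 200, 200);  // stand-in content
  scene.endDraw();

  background(0);
  noStroke();
  beginShape(QUADS);
  texture(scene);
  vertex(corners[0].x, corners[0].y, 0, 0);
  vertex(corners[1].x, corners[1].y, scene.width, 0);
  vertex(corners[2].x, corners[2].y, scene.width, scene.height);
  vertex(corners[3].x, corners[3].y, 0, scene.height);
  endShape();
}
```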
And with that, I was able to test out the interactive hot zones and make sure everything was working. All that was left was to hang the 2-D works, and then set out the promotional materials.
This show was a really great experience getting the artistic freedom to curate my own work in a way that seemed appropriate for the space. I very much enjoyed the opportunity to share my work with students and faculty, including one of the art classes during a gallery tour, and I hope those who passed through when I wasn't there to witness it got something meaningful out of their experience with the work.
I adjusted the variables to locate the hand positions relative to the person's body rather than to the screen itself, which helped account for a variety of user positions near or far from the camera. I also added in the complete array of spoken code, and tweaked the timing of the voices for a closer unison.
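The gist of that adjustment, in a simplified stand-in for the actual skeleton code (the joint values below are made up): express the hand relative to the torso and scale by shoulder width, so the same gesture reads identically whether someone stands near or far.

```processing
// Simplified stand-in for the skeleton math; joint values here are made up.
PVector relativeHand(PVector hand, PVector torso, PVector lShoulder, PVector rShoulder) {
  float bodyScale = PVector.dist(lShoulder, rShoulder);   // rough body-size reference
  PVector rel = PVector.sub(hand, torso);                 // hand relative to the body's center
  rel.div(max(bodyScale, 1));                             // normalize, roughly -1..1 per axis
  return rel;
}

void setup() {
  // Quick check with placeholder joints (millimeters in camera space).
  PVector hand  = new PVector(250, 400, 1800);
  PVector torso = new PVector(0, 0, 2000);
  PVector lSh   = new PVector(-200, 300, 2000);
  PVector rSh   = new PVector(200, 300, 2000);
  println(relativeHand(hand, torso, lSh, rSh));
}
```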
Unfortunately I had to manually input all 1478 "words," but I did at least devise a way to automate the process of fitting all those entries into the necessary code blocks. By listing all the items in separate columns of a Numbers spreadsheet, I could copy and paste each element: "speech[__] = '__';" where the first blank could be algorithmically numbered and the second blank was the respective word in the array. Then I used a formula to merge all the rows into a single column, so that I could copy and paste the whole thing into Pages, where I could systematically use Find/Replace to remove the extra spaces and swap the curly quote marks for the straight quotes Processing requires. Then I could copy and paste into Processing.
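In hindsight, the copy-and-paste gymnastics could probably be sidestepped entirely by keeping the 1478 "words" in a plain text file and letting Processing build the array at runtime. A minimal sketch of that alternative, assuming a file called speech.txt (one entry per line) in the sketch's data folder - the filename is illustrative:

```processing
// Alternative to hand-pasting 1478 assignments: load the entries from a text
// file, one per line. "speech.txt" is an illustrative filename.
String[] speech;

void setup() {
  speech = loadStrings("speech.txt");   // reads data/speech.txt into an array
  println("loaded " + speech.length + " entries");
  println("speech[0] = \"" + speech[0] + "\"");
}
```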
Our first performance...
was slated for Gallery Aferro in mid-May, and would be followed by a second performance in Brooklyn in June. I met with five of the other participating artists to discuss the framework around the John Zorn COBRA-inspired game rules that would dictate the performance.
Everyone's sound objects would be stationed around the gallery, and artists and viewers alike would be able to interact with them. We also contracted a jazz trio as a "house band," which (unbeknownst to them) would play standards and improvise in conjunction with the chaotic cacophony of all the sound objects joining the composition.
A big, occultish wheel-of-fortune style wheel would then dictate musical rules, including time signature changes, volume or tempo changes, switching instruments, offering a vocalized "prayer," playing blindly by wearing an oversized mask, using a toy sword to cut an instrument out of the music for a time, simulating your instrument's sound a cappella, or manipulating a wooden paddle in whatever less-than-sadistic way the user saw fit.
These were, of course, implausibly precise rules that most of the atonal instruments were incapable of following in the first place, which only added to the anarchic anti-structure of the whole ontological experiment.
setting up the wheel and the tables
My setup: a Mac mini, keyboard, and Kinect sensor packed into the cabinet of an amp with a monitor on top. The white tape denotes optimal range for interaction.
Some of the other projects included a delay-looped feedback pendulum, contact mics submerged in water with various crystals and rocks, a telegraph-turned-alarm bell, salvaged broken electronics out of which sound is coaxed (including a theremin!), and more.
One of the other artists interacting with my project
Molto-Allegro!
Fortissimo!
Heidi Lorenz-Wettach Hussa's repurposed theremin and other broken electronics
Jamming
There were still several challenges to troubleshoot. The biggest difficulty was programming it to account for all the unexpected variations in environment and interaction.
I was thankful my program apparently had the flexibility to recognize many unorthodox body shapes, but it often had a hard time detecting users, especially with multiple people in the frame. If their hands went behind their back, the tempo would sometimes get stuck in super slow motion, essentially freezing the program. It would also get stuck after losing a tracked user, not realizing it was no longer tracking them. Oftentimes viewers didn't know if they were being tracked or not. And of course I wasn't thrilled by the shrunken visuals, but I'd need to adjust many of my equations before being able to size up the screen.
All of this thankfully went hand in hand with the provisional, experimental nature of the show.
Many kinks to work out before...
Off to Brooklyn!
In June we were invited to reprise our performance as a part of the "Summit: Nature, Technology, Self" conference led by the High Focus Institute at Brooklyn's Kilroy Metal Ceiling.
The venue was a gloriously defunct old warehouse, literally falling apart at the seams. It housed the projects of several other collectives and individuals, including prototypes of Terreform One's sustainable food shelters, which I got to see in person just days before news articles about them started popping up online!
Many groups held performances of one sort or another, including a DJ, a playwright, and an immersive projection/performance installation that involved an almost alien-mystic enlightenment "interview." It was tongue-in-cheek and utterly implausible, and yet it was maybe the most impactful and stirring performance I've ever engaged with.
Sitting in the queue to "interview" with the High Performance Institute.
After he helps you craft your "resume" (by infusing the objects on the table with your own personal history and ideologies), you're taken back behind the curtain to speak with his female counterpart. The psychedelic projection landscape enveloping you and the strangely empathic line of questioning slowly turn into a trancelike, swaying dance that may have changed my life. Not even being hyperbolic.
This time, we swapped our jazz trio for a progressive noise band out of Philly. The effect was extraordinarily different!
I reworked my project a bit, building in some textual user feedback on the tracking process to help confused viewers hang in there while it calibrated, and eliminating the perpetual slow-motion glitch. Most importantly, I got it to automatically abort and restart the tracking process when a user left the screen while it thought they were still there. I also built in a fail-safe nuclear option, in which one stroke of the keyboard could reboot the whole program.
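The restart logic amounts to a watchdog, sketched below with illustrative names, timings, and messages rather than my exact code: if no fresh skeleton data arrives for a few seconds, tracking is assumed lost and reset, and one keystroke forces the same reset by hand.

```processing
// Watchdog sketch (names, timings, and messages are illustrative).
final int LOST_TIMEOUT = 3000;   // ms without data before giving up on a user
int lastUserUpdate = 0;          // millis() of the last good skeleton frame
boolean tracking = false;

void setup() {
  size(400, 200);
}

void draw() {
  background(0);
  if (tracking && millis() - lastUserUpdate > LOST_TIMEOUT) {
    resetTracking();             // auto-abort: the user probably walked away
  }
  fill(255);
  text(tracking ? "tracking you..." : "step into the taped zone", 20, 40);
}

void onUserData() {              // would be called from the sensor library's user callback
  tracking = true;
  lastUserUpdate = millis();
}

void resetTracking() {
  tracking = false;              // plus whatever the sensor library needs to re-calibrate
}

void keyPressed() {
  if (key == 'r') resetTracking();   // stand-in for the one-key "nuclear option"
}
```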
And it turned out I'd need it. Though the open space behind my piece's station caused some confusion in user tracking, it worked brilliantly. ...Until our noise band lived up to its name and cranked it up to 12 or 13 (11 is so 80's).
I still don't know exactly what went wrong, but the leading theory is that the sound vibrations were so intense they disrupted the computer's hard drive and crashed the whole thing. Not just the program, the whole computer. Nothing would do anything. Either that, or there were so many amps and instruments producing wonky electromagnetic fields that they too messed with the hard drive. In any case, I spent a solid portion of the performance fighting with the technology and becoming a bystander. At least it had its moment in the sun beforehand, and it added to the anarchic feel of the whole endeavor.
I guess you could say it was a very Object Oriented moment for me, in the truest and most Heideggerian sense, as the tool's malfunctions betrayed the tool to me in all its glorious ontological dysfunction. Mission accomplished?
Lots more to figure out before there's ever a third iteration of this project, but until then, if you're interested, here's a link to my full Processing code. Feel free to excerpt and adapt, just give credit where it's due.
This was a really fun challenge, and afforded some experiences I likely wouldn't otherwise get, working with the Oculus Collaborative and infiltrating Brooklyn's hipster-grunge experimental music scene. All in the name of Ontology.