October 5–14, at Yerba Buena Center for the Arts, I share the stage with George Frideric Handel and my old friends, the Kronos Quartet, for what promises to be an evening of powerful, inspirational choreography by Alonzo King. Alonzo asked me to compose musical interludes to connect excerpts from Handel’s music. Using granular synthesis to prolong orchestral strings and throwing in some of my own odd samples, I composed my contribution in two days.
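The granular technique amounts to reading short, windowed grains from the source more slowly than they are written back out, so a string chord can be prolonged indefinitely without changing pitch. Here is a minimal sketch of that idea (my actual patch runs in Max, and all parameter values here are illustrative):

```python
import numpy as np

def granular_stretch(signal, stretch=4.0, grain=2048, hop=512):
    """Time-stretch a mono signal by overlap-adding windowed grains.

    Grains are written to the output every `hop` samples but read from
    the source every `hop / stretch` samples, prolonging the sound
    without transposing it. Parameter values are illustrative.
    """
    window = np.hanning(grain)
    out_len = int(len(signal) * stretch) + grain
    out = np.zeros(out_len)
    write = 0
    while write + grain < out_len:
        read = int(write / stretch)          # read head advances slower
        if read + grain > len(signal):
            break
        out[write:write + grain] += signal[read:read + grain] * window
        write += hop
    return out
```

With `stretch=4.0`, a one-second string sample becomes roughly four seconds of sustained texture; randomizing the read position slightly per grain would add the characteristic granular shimmer.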
Last week I visited Montréal’s Société des Arts Technologiques, where I heard an amazing TouchDesigner dome performance (above) by Vincent Houzé, learned all about immersive sound from Jean-Marc Jot, saw a remarkable 360°-video-plus-AR demo by Zack Settel, heard an impressive 3D speaker demonstration by Peter Otto, learned how to build AR iOS apps using ARKit and Unity with Isaac Cohen, experienced 3D photogrammetric point clouds in a beautiful VR piece by Priam Givord, and had lots of fun conversations about the future of XR.
Former collaborator Advanced Systems Group invited me to a VR/AI/HDR/IP conference in San Francisco. I met Blackmagic Design’s Tim Cuthbertson, who introduced me to Fusion 9 Studio, a beautiful “Shake-like” compositor with strong support for VR (spherical tracking, stabilization, and rotoscoping!). Colorist Shane Rugieri described Dolby Vision’s workflow for mastering in Rec. 2020 and delivering to smart video monitors. The highlight of the event was a presentation by Colleen Kessler, who shared practical experiences and tips about shooting sports using the Jaunt 360° video camera.
Curator Alex Corbett invited me to create a video projection for her “Queer Eye Rococo” exhibition, running September 9 through October 6 at the Naming Gallery, 335 15th St, Oakland. The image suggests a rigid carved shape, but it undulates as if underwater. Rococo ornamentation was originally meant to evoke “natural” motifs, but to our 21st-century eyes it appears as much an aesthetic construct as cake decoration. In the same way, “natural” sexual expression is often revealed to be a social construct. The work’s title is both descriptive and imperative: “Flourish.”
My new work “ask” explores the sonic/somatic realm of ASMR, with vocals by Megan Jones and ASMR consultation by Erin Dougherty. In addition, Rob Ramirez of Cycling ’74 helped with implementing inverse kinematics using rigged models made in Maya and animated with jit.phys objects.
It’s a continuation of my research using video to interact with dance directly. A ROLI Lightpad Block connected to Max’s blocks.multitouch object in Ableton Live works well as a compact multichannel 3D controller. I am able to improvise arms and legs in the projection and “jam” with dancer Lindsey Renee Derry.
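The inverse kinematics behind such a rig solves for joint angles from a desired end-effector position. For a two-bone limb this has a closed-form solution via the law of cosines. A minimal 2D sketch of the idea (purely illustrative; the piece itself uses rigged Maya models driven by jit.phys objects in Max):

```python
import math

def two_bone_ik(target_x, target_y, len1, len2):
    """Analytic two-bone inverse kinematics in 2D.

    Given a target point and two bone lengths (e.g. upper and lower
    arm), return (shoulder, elbow) angles in radians that place the
    end effector on the target.
    """
    dist = math.hypot(target_x, target_y)
    # Clamp to the reachable range so out-of-reach targets just extend the limb.
    dist = max(abs(len1 - len2), min(len1 + len2, dist))
    # Law of cosines gives the elbow's interior angle; the bend is its supplement.
    cos_elbow = (len1**2 + len2**2 - dist**2) / (2 * len1 * len2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: direction to the target, offset by the bent elbow's contribution.
    cos_inner = (len1**2 + dist**2 - len2**2) / (2 * len1 * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder, elbow
```

Feeding the controller’s three axes into a solver like this is what lets a single fingertip gesture pose an entire projected limb.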
I attended a lecture on using blockchains to authenticate digital art at Eyebeam’s new home in Industry City. But the real highlight was catching up with an old friend, Eyebeam Director Roddy Schrock.
He’s transformed Eyebeam from a showcase for video art into an incubator where artists help push and redefine technology–instead of the other way around. The evening event was presented during the final show for the artist residency program. Afterwards I spoke with Kelly Rae Aldridge, curator for an exciting new museum project, current.mu.
I attended the three-day 2016 Gray Area Festival curated by Barry Threw. The festival is remarkable for its diversity of subject matter: media art, politics of art, gaming, movies, installations, software, and media theory.
High points included: an overview of node-based programming and an introduction to vvvv by joreg; a drawing robot that complements human drawing using cv.jit by Sougwen Chung; an overview of context in the Internet and VR by Pablo Garcia; exploring non-traditional video games and installations with Heather Kelley; music sets by Pharmakon and Container.
Ashley Bellouin invited me to speak to her class “Programming for Sound, Performance, and Installation Using Max/MSP/Jitter” at the San Francisco Art Institute. I described techniques for building reliable installations, featuring Max’s computer vision library, my collaboration with Jeffrey Shaw and Jonathan Bachrach, and my work at the Choreographic Coding Lab.
This week I visited Montréal’s Société des Arts Technologiques, where I spoke with my old friend Zack Settel. He gave a sneak-peek demo of his impressive recent work for the Métalab–where he has worked since its inception. The Métalab operates a large immersive projection hemisphere with a 32-channel sound system (using 157 Meyer MM-4 speakers) that is used for performances and research. Zack’s project used Unity’s physics engine not only to control the video projections, but also to drive sound synthesis in Pd and SuperCollider, so that the virtual physics interactions created a very fluid and appealing form of counterpoint. It was enchanting.
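A bridge like that between a game engine and Pd or SuperCollider is typically built on OSC over UDP: the engine encodes physics events as OSC messages and the synthesis environment maps them to parameters. A minimal sketch of the encoding side (the address, port, and mapping are made up for illustration; Zack’s actual pipeline may differ):

```python
import socket
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message carrying float32 arguments."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\0" * (4 - len(b) % 4)
    msg = pad(address.encode()) + pad(("," + "f" * len(floats)).encode())
    for value in floats:
        msg += struct.pack(">f", value)  # big-endian float32 per the OSC spec
    return msg

# Hypothetical mapping: a collision's impact velocity drives synthesis amplitude.
def send_collision(sock, velocity, host="127.0.0.1", port=3000):
    sock.sendto(osc_message("/collision/velocity", velocity), (host, port))
```

On the Pd side a `[netreceive]`/OSC-parsing chain (or SuperCollider’s built-in OSC responders) would pick these messages up and turn each impact into a musical event.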
Zack and I met in Paris when we both worked at IRCAM. Zack contributed the Forbidden Planet patch to MSP’s first release, and I updated the patch for the pfft~ object a few years later. We shared a digital studio in Paris, where he developed control software for Yamaha digital mixers and I proposed and tested the first version of MSP’s vst~ object.
I was sad to learn that Pierre Boulez passed away yesterday. I was his musical assistant for the world premiere of “…explosante-fixe…” on September 13, 1993.
As a composer, he inherited and furthered the legacies of Webern and Stravinsky. As a conductor, he made the thicket of Schoenberg crystal clear. He publicly opposed Cage’s aleatoricism, but concealed chance operations deeply in his own works.
His apartment was full of artworks: Bacon, Miró, Klee. His opinions were always vocal and often vehement.
And he could tell a very dirty joke with polite refinement.
During the Mills College winter break, I was up to my ears in analog.
I modified Mills’s (freshly out-of-warranty) API. I also chipped in and helped Garry Creiman install Tiny Telephone’s magnificent old Neve in the new Oakland studio.
Invited to the fifth Choreographic Coding Lab at UCLA, I had the pleasure of spending five days playing with movement data from dancers and developing real-time environments for eliciting movement. Organized by Florian Jenett and Scott deLahunta, CCL5 was an ideal opportunity to experiment with digital media and dance. In the photo, dancer Hannah Simmons beta tests my “choreographic object”, a Kinect-Max environment that gives musical and video feedback in response to various dance actions–in this case, parallelism.
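Detecting parallelism from skeleton data reduces to comparing the directions of two limb segments: take the vector along each segment and check how close the absolute cosine of the angle between them is to 1. A minimal sketch, assuming Kinect-style 3D joint positions (the actual feedback environment runs in Max, and the threshold here is illustrative):

```python
import numpy as np

def parallelism(joint_a1, joint_a2, joint_b1, joint_b2, threshold=0.95):
    """Score how parallel two limb segments are.

    Each argument is a 3D joint position (e.g. elbow and wrist of each
    arm). Returns (score, is_parallel), where score is the absolute
    cosine of the angle between the segments: 1.0 means parallel or
    antiparallel, 0.0 means perpendicular.
    """
    a = np.asarray(joint_a2, float) - np.asarray(joint_a1, float)
    b = np.asarray(joint_b2, float) - np.asarray(joint_b1, float)
    score = abs(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return score, score >= threshold
```

Running this per frame over arm or leg segments yields a continuous control signal, which can then drive musical and video feedback as the dancer moves in and out of parallel positions.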
Meyer Sound recently posted an article about the sound system at Mills College, installed earlier in 2015. In it, I provide some detail about how the system was designed with the audience and the students in mind:
Mills College's new system had to meet the needs of CCM's experimental composers. "Our students use sound in all kinds of unconventional ways," Stuck explains. "Instead of surrounds, we have independent points where we can pan sound. For example, we have an overhead HMS-12 directly above the mixing station to get a convincing effect of sound coming from above. With this upgrade, composers can pan, spread sounds apart, and experiment with unusual effects."
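Panning between independent speaker points like this conventionally uses an equal-power law, so that perceived loudness stays constant as a sound travels between two speakers. A minimal sketch of the pairwise law (the actual system’s panner is part of the installed rig, not this code):

```python
import math

def equal_power_pan(sample, position):
    """Pan a mono sample between two speakers with the equal-power law.

    position runs from 0.0 (fully in the first speaker) to 1.0 (fully
    in the second); cos/sin gains keep the summed power constant as
    the sound moves. In a multichannel rig the same law is applied
    pairwise between adjacent speakers.
    """
    angle = position * math.pi / 2
    return sample * math.cos(angle), sample * math.sin(angle)
```

At the midpoint both speakers receive a gain of about 0.707 (−3 dB), which is what prevents the loudness dip a naive linear crossfade would produce.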
Today the Dresden Opera premieres its production of William Forsythe’s Impressing the Czar, which features my music. This full-evening work also includes my collaboration with Thom Willems: In the middle, somewhat elevated.
I'm attending the Frankfurt Ballet reunion, which coincides with the final performances of Kammer/Kammer and the end of William Forsythe’s residence in Frankfurt. It's been an incredibly fun celebration, reconnecting with everyone, watching a Tanztheater masterpiece performed by the Forsythe Company, and learning about Motion Bank.