Choral Vision is an interactive 360° projection created for Cisco Live 2017. Participants control a melodic swarm of star particles, which explode and flow as they move their devices. Touch pressure controls the particle emission rate and the amplitude of the sound, while tilt and compass heading control the position of the particles and the frequency and quad-panning of the audio.
Audio / Web Development: Ray McClure
Particle Programming: Chelley Sherman
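The sensor-to-audio mapping described above can be sketched as equal-power quad-panning driven by a compass heading. This is a minimal, hypothetical Python sketch of the idea, not the project's code; the function and parameter names are illustrative:

```python
import math

def quad_pan(heading_deg):
    """Distribute a mono signal across four speakers (FL, FR, RL, RR)
    from a compass heading, using equal-power panning on each axis.
    Illustrative sketch only, not Choral Vision's implementation."""
    rad = math.radians(heading_deg)
    x = math.sin(rad)   # left (-1) to right (+1)
    y = math.cos(rad)   # rear (-1) to front (+1)
    # Equal-power gains per axis keep total energy constant.
    left  = math.cos((x + 1) * math.pi / 4)
    right = math.sin((x + 1) * math.pi / 4)
    front = math.sin((y + 1) * math.pi / 4)
    rear  = math.cos((y + 1) * math.pi / 4)
    return {
        "FL": front * left, "FR": front * right,
        "RL": rear * left,  "RR": rear * right,
    }
```

Because the gains are equal-power, the sum of squared speaker gains stays at 1 for any heading, so the sound's loudness does not dip as it sweeps around the room.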
| Chelley Sherman and Lavender
| Soundwave SonicLabs 8 - Networked Utopias
| Yud Gallery, Contemporary Jewish Museum
| Nov 2nd, 2017
Live A/V performance, manipulated in real time. Based on ritual and hypnagogia, this performance is a glimpse into a virtual realm in development, one that uses an unconscious process to blur the boundary between self and others while building synchronicity through rhythmic and photic stimuli.
Gray Area Festival 2017 w/ BLEIE, Nonotak & Rival Consoles
Visuals for BLEIE
Audio-reactive piece created in Unity 3D (credit to Keijiro Takahashi for the Reaktion library)
Live audio-visual performances using Unity, controlled through MIDI and OSC.
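OSC control of this kind rides on a simple, well-specified wire format. As a sketch, here is how a single float parameter could be encoded into an OSC packet using only the Python standard library; the address is illustrative, and a real setup would more likely use a library such as python-osc:

```python
import struct

def osc_message(address, value):
    """Encode a minimal OSC message carrying one float32 argument.
    Sketch of the wire format only; address and usage are illustrative."""
    def pad(b):
        # OSC strings are NUL-terminated and padded to 4-byte boundaries.
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    packet = pad(address.encode("ascii"))   # address pattern
    packet += pad(b",f")                    # type tag: one float argument
    packet += struct.pack(">f", value)      # big-endian float32
    return packet
```

The resulting bytes can be sent over UDP to whatever is listening (Unity, Max, VDMX), which is what makes OSC convenient for wiring controllers to visuals.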
Das Is is a transition through mortality, crossing multiple planes and memories. Through this interactive journey, a disembodied self reflects on monuments of stone, navigating unforgotten impressions of life while transitioning through manifolds of space. Each portal triggers a distinct emotional response as one converges with a greater expansiveness.
It was first created for the Art of Dying event, hosted by OpenIDEO's Re:Imagine series, and featured on The Creators Project. This VR experience uses the HTC Vive so the audience can interact with the scene and traverse the portals within it.
Mentored a group of six Bay Area female creative-code apprentices in developing an interactive artwork for Dolby Laboratories' renowned Digital Ribbon Screen on Market Street.
Using Processing, Ableton, and VDMX, we created a generative installation and programmed sound design using Dolby's Atmos system.
We later led a four-hour workshop mentoring young creative coders in using Processing to turn their code into artworks.
Apsis is an interactive installation using Kinect IR tracking. As they approach the undulating object, audience members become the apsides of the system, affecting the angular orbit of the object's center of mass.
Interaction coded in Processing. Visual Design in Cinema4D and MadMapper.
Exhibited at Ghosting.TV "How Many of Us" in Los Angeles 2016.
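One plausible reading of the apsides interaction in Apsis is inverse-distance weighting of tracked audience positions to perturb the orbit's focal point. The sketch below is a hypothetical Python reconstruction of that idea, not the Processing code used in the piece:

```python
def orbit_offset(tracked, strength=0.1):
    """Shift an orbit's focal point toward the audience, treating each
    tracked person as an attracting apsis (inverse-distance weighting).
    Hypothetical reconstruction; `tracked` is a list of (x, y) Kinect
    positions in meters, relative to the object's center."""
    fx = fy = 0.0
    for x, y in tracked:
        d2 = x * x + y * y
        if d2 < 1e-6:          # ignore degenerate points at the center
            continue
        fx += x / d2           # closer viewers pull harder
        fy += y / d2
    return (strength * fx, strength * fy)
```

With no one tracked, the offset is zero and the orbit stays centered; as viewers approach, the pull grows, which matches the described behavior of the audience becoming the system's apsides.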
Named after the unused television channel reserved for radio astronomy, Channel 37 is an interactive, audio-reactive installation using Kinect IR tracking. The audience's interaction with the piece alters the landscape, creating a particle system that responds to their movements.
Animations and design created in Cinema4D, AfterEffects, and MadMapper.
Interaction coded in Processing.
Exhibited at Gray Area Foundation in San Francisco 2015.
Ionos is a Leap Motion-driven theremin installation using procedural audio in Unity3D, which drives animations and shader interactions. The piece was output to Looking Glass Factory's Volume, a volumetric cube display.
Exhibited at the Fridman Gallery in New York City 2016.
Sound Design: Hugo Paris (Lavender)
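A theremin-style mapping like the one Ionos describes typically pairs an exponential pitch sweep with a height-driven amplitude. The following Python sketch is illustrative only; the frequency range and parameter names are assumptions, not the installation's values:

```python
def theremin_map(hand_x, hand_y, f_lo=110.0, f_hi=1760.0):
    """Map a normalized Leap Motion hand position to theremin-style
    controls: horizontal position to pitch (exponential, so equal hand
    movement gives equal musical intervals) and height to amplitude.
    Illustrative sketch; hand_x and hand_y are clamped to [0, 1]."""
    x = min(max(hand_x, 0.0), 1.0)
    y = min(max(hand_y, 0.0), 1.0)
    freq = f_lo * (f_hi / f_lo) ** x   # exponential sweep over 4 octaves
    amp = y * y                        # squared for a smoother fade-in
    return freq, amp
```

The exponential curve is the key design choice: a linear frequency sweep would cram most of the audible pitch change into one end of the hand's travel.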
Project using Scatter's DepthKit to record 3D video for use in VR. Audio-reactive scenes scripted in Unity3D.
Design: Jeff Chang
It would be unfair to gloss over some of my most recognized work. As a pet project to see whether I could use the power of social media to break my ketogenic diet, I 3D-scanned myself into different donut dream worlds. The project was an absolute success, resulting in eight dozen free donuts and about ten pounds on my scale.
Animated in Cinema4D and AfterEffects. Scanned with Skanect3D and DepthKit.
A FaceOSC-controlled Arduino NeoPixel grid that detects facial gestures and mood.
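As a rough sketch of how facial gestures could drive such a grid, the function below maps two FaceOSC-style values to a frame of LED brightness levels. The gesture names echo FaceOSC's outputs, but the mapping itself is hypothetical:

```python
def mood_to_grid(mouth_height, eyebrow_raise, size=8):
    """Turn two FaceOSC-style gesture values (normalized 0..1) into a
    brightness frame for a size x size NeoPixel grid: mouth openness
    sets how many rows light up, eyebrow raise sets their brightness.
    Illustrative mapping, not the project's code."""
    rows_lit = round(min(max(mouth_height, 0.0), 1.0) * size)
    level = int(min(max(eyebrow_raise, 0.0), 1.0) * 255)
    return [[level if r < rows_lit else 0 for _ in range(size)]
            for r in range(size)]
```

Each frame would then be pushed to the strip over serial, with the Arduino writing one pixel per grid cell.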