CCRMA, Stanford University

Virtual Reality

A landing page for all my VR projects, also described individually in the list below.

12 Sentiments for VR (2019)

12 Sentiments for VR is an extended aesthetic exploration of the emotional life cycle of a plant. Each of its 12 movements uses musical interaction to explore a moment in a plant’s life. A different emotional aesthetic (such as "excitement, longing") is used to drive the design decisions made in the crafting of each movement. Users embody vines, seedlings, wind, and earth as they grow, cross the world, become lost, and find their way home again. The piece creates space for self-expression and exploration through making music. It also creates space for being, a deep stillness resulting from calm and intentional, inward reflection.

Resilience (2019)

Resilience is a piece for laptop orchestra and one VR performer. A prequel to the longer, individual VR experience 12 Sentiments for VR (an aesthetic exploration of the emotional life cycle of a plant), it follows a group of seedlings as they search for a new home. The piece is an exploration of resilience through traumatic life events, finding peace and joy in small moments, and reconnecting with the ability to grow.

Strange Environments for VR (2018)

A series of strange, weird, and wonderful musical environments for VR. Each explores interactions that would not be possible in physical reality, with special attention to making worlds full of creatures that feel whimsical and "alive".

Includes "Crab Surfer" (swim around and pop bubbles, using your hands or by throwing crabs that roll around together), "Wheebox" (launch spheres that enjoy being thrown, but become anxious afterward), and "Oh No! Bots" (try to interact with little robots that are too polite to tell you they'd rather be left alone...).

Strange Instruments for VR (2018)

A series of musical instruments for VR that would be impossible to create or play in the real world.

Includes "Canyon Drum" (a drum that is very large and very far away), "Twist Flute" (a flute that changes its timbre when you twist your hands), "Hair Instruments" (flexible chimes that you can wear on your head), and "Shake Marimba" (a marimba whose mallets are imbued with the note you've most recently played).

Junk Junction (2019)

This piece explores the values of modern American society. Sounds mimic other sounds. Particular sounds with a specific meaning are transmuted into a chaotic, impenetrable aether. Reflect. Rejoice. Regret. Lament. Mourn. Microwave.

Featuring poetry by Campbell McGrath. I wrote the concatenative synthesis chugin myself.
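
For the curious, here is a rough Python sketch of the concatenative synthesis idea (illustrative only, not the chugin itself): a corpus recording is cut into short grains, each grain is summarized by its MFCCs, and a target sound is rebuilt by stitching together the closest-matching corpus grains. The file names below are placeholders.

```python
# A minimal sketch of concatenative synthesis (not the chugin's actual code).
import numpy as np
import librosa

GRAIN = 2048  # grain length in samples (illustrative)

def grains(y):
    """Cut a signal into non-overlapping grains of length GRAIN."""
    n = len(y) // GRAIN
    return y[: n * GRAIN].reshape(n, GRAIN)

def features(g, sr):
    """One averaged MFCC vector per grain, used as a rough timbre descriptor."""
    return np.array([librosa.feature.mfcc(y=grain, sr=sr, n_mfcc=13).mean(axis=1)
                     for grain in g])

# Placeholder file names for the grain corpus and the sound to imitate.
corpus, sr = librosa.load("corpus.wav", sr=None, mono=True)
target, _ = librosa.load("target.wav", sr=sr, mono=True)

corpus_grains, target_grains = grains(corpus), grains(target)
corpus_feats, target_feats = features(corpus_grains, sr), features(target_grains, sr)

# For each target grain, choose the corpus grain with the nearest timbre.
out = []
for t in target_feats:
    nearest = np.argmin(np.linalg.norm(corpus_feats - t, axis=1))
    out.append(corpus_grains[nearest])
result = np.concatenate(out)
```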

Mapping by Example (2018)

I created a tool for using interactive machine learning to create mappings to control audio synthesis algorithms. Specifically, the tool allows users to place examples of specific sounds in 3D locations in a virtual reality environment; the tool then generalizes a mapping for the entire space based on the positions and/or velocities of the example sounds.
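As a rough illustration of the underlying idea (not the tool's actual implementation), the sketch below generalizes a handful of user-placed examples to the whole space with distance-weighted k-nearest-neighbor regression; the positions and synthesis parameters are placeholders.

```python
# A minimal sketch, assuming distance-weighted kNN regression as the mapping model.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical training data: each example pairs a 3D position in the VR scene
# with the synthesis parameters (e.g. frequency, grain size) heard there.
example_positions = np.array([
    [0.0, 1.2, -0.5],
    [1.0, 1.0,  0.3],
    [-0.8, 0.4, 0.9],
])
example_params = np.array([
    [440.0, 0.10],   # [frequency_hz, grain_size_s] -- illustrative parameters
    [660.0, 0.25],
    [220.0, 0.05],
])

# Distance-weighted kNN generalizes the examples to the whole space:
# nearby examples dominate, and the mapping interpolates smoothly between them.
mapping = KNeighborsRegressor(n_neighbors=3, weights="distance")
mapping.fit(example_positions, example_params)

# At runtime, query the current hand position to get parameters for the synth.
hand_position = np.array([[0.2, 0.9, 0.1]])
frequency_hz, grain_size_s = mapping.predict(hand_position)[0]
print(frequency_hz, grain_size_s)
```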

Circle of Life (2018)

Circle of Life is an audiovisual experience. Users control flocks of plankton, fish, and garbage in order to play music through surviving. All good things must come to an end, but when life is cyclical, the end is a new beginning.

VRAPL (2017)

VRAPL is a block-based, sculptural programming language used from within VR to manipulate the virtual world around you. Its programs can generate streams of audio, both for playback out loud and for controlling the physics of virtual world objects. Its motivating goal was to enable the user to fully program the virtual world with sound.
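
The sketch below gives a rough, text-based flavor of the block idea (VRAPL itself is sculptural and three-dimensional, so this is only an analogy): blocks compose into a small graph whose output stream can be heard and can also drive a property of a virtual object.

```python
# A minimal sketch of composable blocks producing a sample stream.
import math

class SineBlock:
    """An oscillator block producing one sample per tick."""
    def __init__(self, freq, sample_rate=44100):
        self.freq, self.sr, self.phase = freq, sample_rate, 0.0

    def tick(self):
        self.phase += self.freq / self.sr
        return math.sin(2 * math.pi * self.phase)

class ScaleBlock:
    """A block that rescales another block's output."""
    def __init__(self, source, amount):
        self.source, self.amount = source, amount

    def tick(self):
        return self.source.tick() * self.amount

# The same stream could feed an audio buffer or a physics property of an object,
# such as its height above the ground.
osc = ScaleBlock(SineBlock(freq=2.0), amount=0.5)   # a slow 2 Hz wobble
object_height = [1.0 + osc.tick() for _ in range(5)]
print(object_height)
```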

Chunity (2016-Present)

Chunity is the wrapper for using the programming language ChucK inside the game development environment Unity. The community is always growing -- check it out!!

waVR (2016)

waVR is a musical audiovisual narrative where users make music using broad arm movements, creating waves in the ocean and swimming with whales. It was built for early developmental VR hardware, and used a Gametrak for hand controls since VR hand controllers were not available for developers at the time.

Glowsualizer (2016)

Glowsualizer is a general music visualizer built for experimental VR hardware that displays frequency and loudness information with vibrating, glowing, rainbow strings. The strings surround the user in a series of rings and wiggle, pulse, and ascend in time to the music.
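A rough sketch of the kind of analysis behind the visuals (assumed, not the actual implementation): each audio frame is split into frequency bands, and the per-band energy, scaled by overall loudness, drives the vibration amplitude of each glowing string.

```python
# A minimal sketch: FFT band energies as per-string vibration amplitudes.
import numpy as np

SAMPLE_RATE = 44100
FRAME_SIZE = 2048
NUM_STRINGS = 24  # one glowing string per frequency band (illustrative)

def string_amplitudes(frame):
    """Map one audio frame to a vibration amplitude per string."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    # Log-spaced band edges: low strings cover narrow bands, high strings wide ones.
    edges = np.logspace(np.log10(20), np.log10(SAMPLE_RATE / 2), NUM_STRINGS + 1)
    bins = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    amps = np.array([spectrum[(bins >= lo) & (bins < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])
    # Overall loudness scales every string; per-band energy picks which ones glow.
    loudness = np.sqrt(np.mean(frame ** 2))
    return loudness * amps / (amps.max() + 1e-9)

# Example: a 440 Hz test tone should excite one cluster of strings.
t = np.arange(FRAME_SIZE) / SAMPLE_RATE
print(string_amplitudes(np.sin(2 * np.pi * 440 * t)).round(3))
```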

Leap The Dips (2016)

This rolling ball sculpture invites participants to test their skill at "leaping the dips" on a copper model of the world's oldest operating roller coaster. The project's aesthetic draws from a practice certainly much older than the roller coaster: teenage rebellion, and the ensuing adult panic over the activities of "kids these days." Marbles roll over tracks and supports fashioned out of soldered copper wire. The tracks feature dips that cause the marbles to lift off the track and crash back down, as was possible in early roller coasters without up-stop wheels on the underside of the track. Take care when placing your marble not to send it flying off the track entirely! The dips are fitted with sensors that drive a Max/MSP algorithm, giving users aural feedback and a cultural experience.

Inter-String Time Delay Zither (2015)

The Inter-String Time Delay Zither is a plucked string instrument that changes its sound based on how fast you pluck it. Its strings are arranged into note pairs (groups of two strings that are tuned to the same note), and using a system of pickups under two bridges, it detects the difference in pluck time for the left and right strings of each note pair. The strings’ vibrations are picked up with piezos and routed through some audio processing in Max/MSP, where the inter-string time delay is used to drive audio effects like beating and distortion. The interaction of manipulating sound via the speed at which you pluck each note thus affords an additional level of control beyond those present in a traditional zither.
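A rough sketch of the mapping idea (the real processing lives in Max/MSP, and the curve below is purely illustrative): the measured delay between the two plucks of a note pair becomes the control value for effects such as distortion and beating.

```python
# A minimal sketch of mapping inter-string pluck delay to effect amounts.
import numpy as np

def inter_string_delay(onset_left_s, onset_right_s):
    """Time between the two plucks of one note pair, in seconds."""
    return abs(onset_right_s - onset_left_s)

def delay_to_effects(delay_s, max_delay_s=0.5):
    """Map a short delay to heavy distortion and a long delay to slow beating.
    The specific curve here is illustrative, not the instrument's actual mapping."""
    x = np.clip(delay_s / max_delay_s, 0.0, 1.0)
    distortion_drive = 1.0 - x       # nearly simultaneous plucks distort more
    beating_rate_hz = 0.5 + 7.5 * x  # widely spaced plucks beat more noticeably
    return distortion_drive, beating_rate_hz

print(delay_to_effects(inter_string_delay(0.00, 0.03)))  # fast double-pluck
print(delay_to_effects(inter_string_delay(0.00, 0.40)))  # slow double-pluck
```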

Chorest (2015)

Chorest (chord forest) is a world populated by glowing strings, which grow and create sound based on your voice. The world remembers the chords you grow and even shows you ghosts of your past actions, which can be heard and viewed from top-down, third-person, or first-person perspectives. This work was a precursor to my later work on virtual reality.

The Precipice of the Uncanny Valley (2015)

Timbre mimicking is the development of algorithms for imitating the timbre of existing sounds. It's often used to recreate known sounds, like those of acoustic instruments, using new synthesis techniques. I used genetic algorithms to mimic the timbre of a target sound using granular synthesis with a source sound. The algorithm breaks the source sound into short grains, rearranges them randomly to generate an initial population of candidates, and then mutates and recombines those candidates over successive generations until they closely mimic the target timbre.
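A rough Python sketch of the approach (the operators and fitness measure here are simplified stand-ins for the real ones): a genetic algorithm searches over orderings of source grains, scoring each ordering by how closely its spectrum matches the target's.

```python
# A minimal sketch: genetic search over grain orderings to match a target spectrum.
import numpy as np

rng = np.random.default_rng(0)
GRAIN = 1024  # grain length in samples (illustrative)

def grains(y):
    """Cut a signal into non-overlapping grains of length GRAIN."""
    n = len(y) // GRAIN
    return y[: n * GRAIN].reshape(n, GRAIN)

def render(order, source_grains):
    """Concatenate source grains in the order given by one individual."""
    return np.concatenate([source_grains[i] for i in order])

def fitness(order, source_grains, target_spectrum):
    """Negative spectral distance between a rendered candidate and the target."""
    candidate = np.abs(np.fft.rfft(render(order, source_grains)))
    return -np.linalg.norm(candidate - target_spectrum)

def evolve(source, target, population=30, generations=50):
    sg = grains(source)
    n = len(target) // GRAIN  # assumes the target is at least two grains long
    target_spectrum = np.abs(np.fft.rfft(target[: n * GRAIN]))
    # Initial population: random orderings of source grains.
    pop = [rng.integers(0, len(sg), size=n) for _ in range(population)]
    for _ in range(generations):
        pop.sort(key=lambda o: fitness(o, sg, target_spectrum), reverse=True)
        survivors = pop[: population // 2]
        children = []
        for _ in range(population - len(survivors)):
            a, b = rng.choice(len(survivors), size=2, replace=False)
            cut = rng.integers(1, n)                              # one-point crossover
            child = np.concatenate([survivors[a][:cut], survivors[b][cut:]])
            child[rng.integers(0, n)] = rng.integers(0, len(sg))  # point mutation
            children.append(child)
        pop = survivors + children
    return render(max(pop, key=lambda o: fitness(o, sg, target_spectrum)), sg)

# Stand-ins for real recordings; any mono float arrays would do.
source = rng.normal(size=GRAIN * 40)
target = rng.normal(size=GRAIN * 8)
approximation = evolve(source, target, generations=20)
```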


UC Berkeley

Adventures in Patchblocks (2014)

I created a series of patches for musical exploration on small interlocking synth hardware blocks, and documented my learning process and the new control gestures I created on a Tumblr blog. Programming the blocks involved a combination of a high-level, Max/MSP-like visual programming language and C code for creating new modes of operation. Two of the gestures I created are using a knob to ‘flick’ a parameter with inertia, and holding down a button to cause a change over time. I also ran a series of experiments in which I exposed new users of the hardware blocks to gestures of varying complexity, documented their exploration process, and reported my findings on the blog.
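
A rough sketch of the 'flick' gesture (illustrative; the original was implemented on the blocks themselves in C): a quick knob turn gives the parameter a velocity that decays over time, so the value keeps coasting after the knob stops moving.

```python
# A minimal sketch of a knob-driven parameter with inertia.
class FlickedParameter:
    def __init__(self, value=0.0, friction=0.95):
        self.value = value
        self.velocity = 0.0
        self.friction = friction  # how quickly the coasting dies out

    def on_knob_delta(self, delta):
        """Each knob movement adds to the parameter's velocity."""
        self.velocity += delta

    def tick(self):
        """Called at a fixed control rate; advances and damps the value."""
        self.value = min(1.0, max(0.0, self.value + self.velocity))
        self.velocity *= self.friction
        return self.value

param = FlickedParameter()
param.on_knob_delta(0.05)                          # one quick flick of the knob
print([round(param.tick(), 3) for _ in range(5)])  # value keeps drifting upward
```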

NoteScript (2014)

NoteScript is a domain-specific language for quickly writing down melodies and hearing them back. My team and I created constructs for notes, chords, rests, rhythm, articulations, keys, transposition, and functions with arguments. We desugared some constructs to a core language of chords, rests, and rhythms, then compiled this core language to ChucK to synthesize the sound. The language can play any number of parts at once, each using a different oscillator and performing at a different tempo.
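
A rough sketch of the desugaring step (the constructs and names here are illustrative, not NoteScript's actual syntax): surface-level notes and articulations are lowered into the core language of chords, rests, and rhythms before compilation to ChucK.

```python
# A minimal sketch of desugaring surface notes into a core chord/rest/rhythm language.
def desugar_note(pitch, duration, staccato=False):
    """A single note is a one-element chord; staccato becomes a shorter chord
    followed by a rest that fills out the written duration."""
    if staccato:
        return [("chord", [pitch], duration / 2), ("rest", None, duration / 2)]
    return [("chord", [pitch], duration)]

def desugar_melody(melody):
    """Flatten a surface melody into the core chord/rest/rhythm sequence."""
    core = []
    for event in melody:
        core.extend(desugar_note(*event))
    return core

# A short surface melody: (pitch, duration_in_beats, staccato)
melody = [("C4", 1.0, False), ("E4", 0.5, True), ("G4", 1.5, False)]
print(desugar_melody(melody))
```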

Remix Novelty Ranker (2014)

The Remix Novelty Ranker is a suite of unsupervised machine learning algorithms that rank a particular music label’s remix releases by how novel they are compared to their respective original mixes. The algorithms used include k-means, Gaussian mixture models, kernel density estimation, and principal component analysis. Feature vectors for songs are formed using music information retrieval techniques to obtain rhythm and pitch histograms, along with mel-frequency cepstral coefficients as a measure of timbre.
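
A rough sketch of one such novelty measure (simplified; the project combined several methods and richer features): summarize each track with MFCC statistics, reduce the dimensionality with PCA, and rank remixes by their distance from the original mix in the reduced space. The file names below are placeholders.

```python
# A minimal sketch: MFCC summaries + PCA, with novelty as distance from the original.
import numpy as np
import librosa
from sklearn.decomposition import PCA

def track_features(path):
    """Mean and standard deviation of MFCCs as a rough timbre summary."""
    y, sr = librosa.load(path, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Placeholder file names for one original mix and its remixes.
original = track_features("original_mix.wav")
remixes = {name: track_features(name) for name in
           ["remix_a.wav", "remix_b.wav", "remix_c.wav"]}

# PCA over all tracks removes correlated dimensions before measuring distance.
X = np.vstack([original] + list(remixes.values()))
X_low = PCA(n_components=3).fit_transform(X)

# Novelty = distance from the original mix in the reduced feature space.
novelty = {name: np.linalg.norm(X_low[i + 1] - X_low[0])
           for i, name in enumerate(remixes)}
for name, score in sorted(novelty.items(), key=lambda kv: -kv[1]):
    print(f"{name}: novelty {score:.2f}")
```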

The Melody Stochaster (2012)

The Melody Stochaster is meant for playing stochastic, or random, melodies based around a melody you feed it. These melodies are played back with a timbre that randomly moves through a space of square, saw, and triangle waves. You can also play the instrument like a traditional synthesizer, with the same random timbre movement.
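
A rough sketch of the two ideas in Python (illustrative; the instrument itself runs in real time): a melody that wanders stochastically around a seed melody, and a timbre that drifts through a square/saw/triangle space by cross-fading the three waveforms.

```python
# A minimal sketch of a stochastic melody with a drifting square/saw/triangle timbre.
import numpy as np

rng = np.random.default_rng()
SR = 44100

def stochastic_melody(seed_midi, length=16, spread=2):
    """Pick each note near a randomly chosen note of the seed melody."""
    return [int(rng.choice(seed_midi) + rng.integers(-spread, spread + 1))
            for _ in range(length)]

def blended_wave(freq, dur, weights):
    """Mix square, saw, and triangle waves according to (w_sq, w_saw, w_tri)."""
    t = np.arange(int(SR * dur)) / SR
    phase = (freq * t) % 1.0
    square = np.sign(np.sin(2 * np.pi * freq * t))
    saw = 2 * phase - 1
    tri = 2 * np.abs(2 * phase - 1) - 1
    w = np.abs(weights) / (np.abs(weights).sum() + 1e-9)
    return w[0] * square + w[1] * saw + w[2] * tri

seed = [60, 62, 64, 67]                # a seed melody in MIDI note numbers
weights = np.array([1.0, 0.0, 0.0])    # start as a pure square wave
audio = []
for note in stochastic_melody(seed):
    weights += rng.normal(0, 0.2, size=3)      # timbre drifts a little each note
    freq = 440.0 * 2 ** ((note - 69) / 12)     # MIDI note number to frequency
    audio.append(blended_wave(freq, 0.25, weights))
audio = np.concatenate(audio)
```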