Month: December 2019

Application UT @ Austin

Audiovisual Examples

Deep Map #1

This piece illustrates my approach to audiovisual composition and, in particular, a technique I am developing: “deep mapping”. I define deep mapping as a way of, first, identifying the salient features of a composition and, second, making those features available for multimedia rendering in an efficient and intuitive form. In other words, deep mapping allows the composer to store musical data and render it into visuals by “catching” the data at its source, at the compositional stage. The advantages of this approach are accuracy and discreteness in the representation of musical features; computational efficiency; and, more abstractly, a practice of audiovisual composition that encourages composers to envision their multimedia output from the early stages of their work. The drawbacks are that prerecorded sounds cannot be deep-mapped, and that deep mapping presupposes an algorithmic approach to composition.
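The idea can be sketched in a few lines of code. This is only an illustrative, hypothetical example (the event fields, the generative routine, and the visual parameters are all assumptions, not the actual implementation): musical events generated algorithmically carry their salient features with them, so the visual layer reads compositional data directly instead of analyzing rendered audio.

```python
# Hypothetical sketch of "deep mapping": events created at the
# compositional stage carry their salient features, which are later
# mapped directly to visual parameters -- no audio analysis needed.
from dataclasses import dataclass


@dataclass
class NoteEvent:
    onset: float   # seconds
    pitch: int     # MIDI note number
    velocity: int  # 0-127


def compose() -> list[NoteEvent]:
    # Stand-in for a generative algorithm: an ascending arpeggio.
    return [NoteEvent(onset=i * 0.25, pitch=60 + 3 * i, velocity=64 + 6 * i)
            for i in range(8)]


def deep_map(event: NoteEvent) -> dict:
    # Map compositional data straight to visual parameters.
    return {
        "hue": (event.pitch % 12) / 12.0,  # pitch class -> color
        "size": event.velocity / 127.0,    # dynamics -> scale
        "x": event.onset,                  # onset time -> position
    }


visuals = [deep_map(e) for e in compose()]
```

Because the mapping is computed from discrete compositional data rather than from an audio signal, the representation stays exact and cheap to render, which is the point of the technique as described above.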

Cauchemar sur Cire

This is an excerpt from an audiovisual installation that implements multiple point lights and musical “glitches”.


Locked In

For marimba and live electronics, to be performed by I-Jen Fang at UVA’s Technosonics Festival (02/13/20).

Score Locked In

Variazioni Su AlDoClEmenti

For chamber orchestra, written using generative algorithms designed in OpenMusic.

Score Variazioni

First Movement: Invenzione


Saxophone quartet, performed by the Radnofski Quartet.

Boron (Movement IV)

Score He-Li-Be-B