COMPEL Omeka Dev

Browse Items (868 total)

  • From "The Unrecordables 4" Electronic Music Showcase at Wayne State University: Composition, one day at a time 2022-2023
  • terracotta is an audiovisual work specifically composed to explore the sounds of arbitrarily shaped drums
  • Chaos Bells is a very large (2 metres wide and tall) instrument in which bell sounds can drone and become chaotic
  • Bambuchla Shadows is a short exploration of colorful, responsive sounds and live sound processing, for homemade bamboo flute and SuperCollider. It's my first fully realized piece for SuperCollider and live performer. The computer-generated sounds create a sparkling, shifting cloud in response to the very rustic and woody flute melodies, sometimes punctuated by live granulation and triggered samples from an ancient Buchla. I am an intermediate SC user, and I worked out the coding for this piece over several months; I also made the (admittedly rather crude) bamboo flute myself. It took several attempts to figure out how to cut and bore the finger holes and mouth hole correctly, and how to cure the bamboo in the oven without setting the flute on fire. (That did happen on a previous attempt.) I like the contrast in this piece between the high-tech computer sounds and the low-tech flute I painstakingly created, as both the digital and the physical musical elements required experimentation and attention to detail. The piece is largely improvised, with scored cues for improvisation popping up as images triggered by the performance code.
  • In Five Songs, analyses of the flute performance drive the electroacoustic music, modifying various parameters that affect its realization in a way that is closely related to the flutist’s sound and gesture. Each part complements the musical capabilities of the other: sometimes they fuse into a compound voice that is simultaneously narrative and abstract, and at other times they oppose one another in stark relief. Inspired by the poetry of Stephen Crane, the five brief sections of the work manifest contrasting moods, interaction strategies, and approaches to material.
  • This instrument uses deep learning to generate its control signals from muscle and motion data of the performer's actions. The generated control signals automate the live sound processing, which is based on layered time-based effects modules. (A minimal sketch of such a mapping follows this list.)
  • A Mollenhauer alto recorder is controlled in real time via computer: blowing intensity, fingering, vibrato, flutter-tongue effect, helium pitch bend

    All sound is acoustic, including the percussive artifact sounds from the fingers. The fingers can be positioned by linear actuators for experimental fingerings. No solenoids, analog MIDI controllers. The first public performance, in 2021, was based on fixed media (a MIDI file) in a duet with a human flautist. (A sketch of this kind of MIDI control also follows this list.)
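The deep-learning-driven instrument above maps performer biosignals to effect-automation values. The following is only a minimal sketch of that idea, not the project's actual code: it assumes a hypothetical feature vector of muscle (EMG) and motion (IMU) readings per frame, and uses a small untrained feed-forward network in NumPy whose outputs stand in for normalized parameters of layered time-based effects.

```python
import numpy as np

# Hypothetical input: 8 EMG channels + 6 IMU values (accelerometer + gyro) per frame.
N_FEATURES = 14
# Hypothetical outputs: normalized parameters for layered time-based effects.
PARAM_NAMES = ["delay_mix", "delay_feedback", "delay_time", "reverb_size"]

rng = np.random.default_rng(0)

# A tiny untrained MLP standing in for the trained model described in the item above.
W1 = rng.normal(scale=0.3, size=(N_FEATURES, 32))
b1 = np.zeros(32)
W2 = rng.normal(scale=0.3, size=(32, len(PARAM_NAMES)))
b2 = np.zeros(len(PARAM_NAMES))

def control_signals(features: np.ndarray) -> dict:
    """Map one frame of muscle/motion features to effect-control values in [0, 1]."""
    h = np.tanh(features @ W1 + b1)              # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid keeps outputs in [0, 1]
    return dict(zip(PARAM_NAMES, out.round(3)))

# One simulated sensor frame in place of a live capture stream.
frame = rng.normal(size=N_FEATURES)
print(control_signals(frame))
```

In a real system the weights would be trained on recorded performer data and the resulting values streamed to the effects modules every few milliseconds; here the numbers are only illustrative.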
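The computer-controlled recorder item describes real-time control of blowing intensity, fingering, vibrato, and pitch effects. The snippet below is a hedged illustration of how such parameters might be sent as MIDI control-change messages using the mido library; the port name and controller numbers are invented for the example and are not the project's actual mapping.

```python
import mido

# Hypothetical controller-number mapping; the instrument's real MIDI spec is not given in the item.
CC_BLOW_INTENSITY = 1
CC_VIBRATO_DEPTH = 2

def send_breath_gesture(port, intensity: int, vibrato: int) -> None:
    """Send one frame of control values (0-127) to the instrument."""
    port.send(mido.Message('control_change', control=CC_BLOW_INTENSITY, value=intensity))
    port.send(mido.Message('control_change', control=CC_VIBRATO_DEPTH, value=vibrato))

# 'Recorder Robot' is a placeholder output-port name.
with mido.open_output('Recorder Robot') as port:
    send_breath_gesture(port, intensity=90, vibrato=20)
    port.send(mido.Message('note_on', note=72, velocity=100))  # request a fingering, e.g. C5
```

A fixed-media performance like the 2021 duet could be reproduced by playing back a MIDI file through the same port instead of sending live messages.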