projects


my ethos is to create meaningful experiences with technology that engage the senses and enchant the imagination - this means playing with code, cameras, musical hardware, open source software, old toys, recyclables, gaming controllers, sharpies, and copper tape to make new fun.

ImageSTEAM AI and Computer Vision Curriculum (2020-present)
https://www.imagesteam.org/
The ImageSTEAM AI and Computer Vision Curriculum project is a National Science Foundation (NSF DRL-1949384) funded program designed to introduce AI and computer vision concepts to middle school students through innovative learning activities in visual and computational media. Our focus is encouraging students to pursue interests in STEM and empowering teachers to train the next generation in the AI technologies that will be prominent in the 21st century. We are working with both teachers and students to create and distribute our curriculum, including 60+ lessons and 20+ foundational concept videos for use in the classroom (see our Videos page).

ZenDen at the I.D.E.A Museum in Mesa, Arizona (Nov 2019 - Feb 2024)
An interactive space that responds with shifting meditative sounds to the movement (or rather, the stillness) of museum-goers inside a geodesic dome. Each bench adds a different layer of sound, and each layer drifts slowly along its own stochastic path through the 10-speaker surround system, giving the soundscape a sense of movement and life while encouraging stillness and reflection.
Max MSP, surround sound, 180 degree camera tracking, I.D.E.A Museum
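The stochastic spatialization above can be sketched in code. This is not the actual Max MSP patch, just a hypothetical JavaScript re-creation of the idea: each sound layer holds a continuous position on a ring of 10 speakers, takes a small random step each tick, and its signal is panned between the nearest speakers so the layer seems to wander.

```javascript
// Sketch of one sound layer drifting stochastically around a ring of
// 10 speakers (hypothetical; the installation itself runs in Max MSP).
const NUM_SPEAKERS = 10;

// Take a small random step around the ring. Position is a continuous
// value in "speaker units" (0..10), wrapping at the ends.
function stepPosition(position, maxStep = 0.05) {
  const step = (Math.random() * 2 - 1) * maxStep;
  return (position + step + NUM_SPEAKERS) % NUM_SPEAKERS;
}

// Convert the layer's position into a gain per speaker: only the
// nearest speakers carry the signal, so the layer seems to travel.
function speakerGains(position) {
  const gains = [];
  for (let s = 0; s < NUM_SPEAKERS; s++) {
    // circular distance between the layer's position and speaker s
    const d = Math.min(
      Math.abs(position - s),
      NUM_SPEAKERS - Math.abs(position - s)
    );
    gains.push(Math.max(0, 1 - d)); // linear pan between neighbours
  }
  const total = gains.reduce((a, b) => a + b, 0);
  return gains.map(g => g / total); // keep overall level constant
}
```

Because the step size is small, a layer takes minutes to cross the dome, which matches the slow, meditative pace of the room.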

Soundscape at the I.D.E.A Museum in Mesa, Arizona (2014-2018)
Interactive installation
An interactive environment that creates sound in response to the movement and position of kids within a geodesic dome. The Kinect API tracks the head, torso, feet, and hands of up to 6 people simultaneously and sends that data to Max MSP, which parses it into 4 spatialized zones, each with its own loops and interactions: jump to splash in the pond, wave your arms to knock birds out of trees, dance to invite more and more drummers into the circle, and draw large circles with your arms to play chimes.
Kinect, Max, MIDI, found sounds, recorded sounds, synths, surround sound speakers
thanks to Xiao Wang for the Kinect API access
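The zone-parsing step can be sketched in a few lines. This is a hypothetical JavaScript illustration (the real logic lives in the Max patch), and the zone names and quadrant layout are assumptions for the example: each tracked torso position is mapped to one of the four floor zones so the right loops and interactions fire for whoever is standing there.

```javascript
// Hypothetical sketch of the zone parsing: the Kinect reports a torso
// position per tracked child, and the dome floor is divided into four
// quadrant zones, each with its own loop and interaction.
const ZONES = ["pond", "trees", "drums", "chimes"]; // names assumed

// Map a floor position (x, z in metres, dome centre at the origin)
// to one of the four quadrant zones.
function zoneFor(x, z) {
  if (x >= 0 && z >= 0) return ZONES[0];
  if (x < 0 && z >= 0) return ZONES[1];
  if (x < 0 && z < 0) return ZONES[2];
  return ZONES[3];
}

// Up to six skeletons arrive as {id, torso: {x, z}} records; group
// their ids by zone so each zone knows who is inside it.
function groupByZone(skeletons) {
  const groups = { pond: [], trees: [], drums: [], chimes: [] };
  for (const s of skeletons) {
    groups[zoneFor(s.torso.x, s.torso.z)].push(s.id);
  }
  return groups;
}
```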


AI Valentines (2024)
pedagogical mini project
I created these valentines and presented them as an in-class project in my freshman-level course: make your own valentine using generative AI and image editing tools.
Generative AI, image editing, print design, prompt engineering

Generative AI resources for teachers (2024)
website resource: https://www.imagesteam.org/resources-for-genai-tools
I created this website as part of a week-long professional development workshop I ran in the summer of 2024.
Generative AI, teaching, prompt engineering

Winter Song (2020)
music video
Made at home in 2020, this is a duet, a cover of an Ingrid Michaelson song, with a custom video generated through interactive visual programming in p5.js.
Vocals, snow, p5.js, processing, creative code

Makey Makey Sampler (2014)
mini app
This simple application was custom built in one day for the specific needs of a music therapist. It lets you trigger sounds with keystrokes on a computer keyboard: drag and drop your own sound files and assign a key to each sample. It is meant to accompany the MaKey MaKey, “an invention kit for the 21st century” as described by its creators, two MIT grads, Jay Silver and Eric Rosenbaum (makeymakey.com, $50).
Max, Makey Makey, conductive materials, speakers
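The sampler's core logic is just a key-to-sound mapping. Here is a minimal JavaScript sketch of that idea (the real app is a Max patch, and the function names here are hypothetical): assign a dropped sound file to a key, then look it up on every keystroke. Since the Makey Makey simply emulates keyboard events, closing a circuit with conductive material triggers a sample exactly the same way.

```javascript
// Minimal sketch of the sampler's core: a map from keyboard keys to
// sound files, with the two operations the description mentions.
const keyMap = new Map(); // key -> sound file path

// Called when a user drags a sound file onto a key slot.
function assignKey(key, soundFile) {
  keyMap.set(key.toLowerCase(), soundFile);
}

// Called on every keystroke (real or Makey Makey-emulated);
// returns the file to play, or null if the key is unassigned.
function triggerKey(key) {
  return keyMap.get(key.toLowerCase()) ?? null;
}
```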

DDR Dance pad Instrument (2014)
mini app
Like the Makey Makey Sampler, this came directly from a curricular need I encountered while teaching in a school for high-functioning ASD youth. The application reads data from a DDR pad over USB and lets you assign a sound to each button on the pad, turning it into a large-format sampler for unique, whole-body sound performances.
Max, DDR pad, DDR ps2-to-usb converter, speakers

Sonic virtual reality game: How does your body sound? (2010)
Paper/Poster at NIME 2010 (PDF Available Here)
authors: K Headlee (now Swisher), T Koziupa, D Siwiak
An interactive system that uses the body and biosensors as a generative tool for creating music. It uses a multi-player game paradigm in which players work together to add layers to a soundscape of three distinct environments. The project's goals are to explore innovative ways to make music, create self-awareness, and provide opportunities for unique, interactive social experiences.
Max, nerf squish balls, stretch and bend sensors to Arduino
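The sensor-to-sound step can be illustrated with a small sketch. This is a hypothetical JavaScript example, not the system's actual mapping: a stretch/bend sensor on an Arduino reports a 10-bit value (0..1023), which is quantized onto a musical scale (a pentatonic scale is assumed here) so that squeezing and stretching plays pitches rather than raw, unmusical values.

```javascript
// Hypothetical sketch: scale a raw 10-bit Arduino sensor reading into
// a MIDI note on a pentatonic scale, so body movement stays musical.
const PENTATONIC = [0, 2, 4, 7, 9]; // scale degrees (assumption)

// Map a raw reading (0..1023) to a MIDI note number, spanning
// `octaves` octaves above `baseNote`.
function sensorToNote(raw, baseNote = 48, octaves = 2) {
  const clamped = Math.min(1023, Math.max(0, raw));
  const steps = PENTATONIC.length * octaves;
  const i = Math.min(steps - 1, Math.floor((clamped / 1024) * steps));
  const octave = Math.floor(i / PENTATONIC.length);
  return baseNote + 12 * octave + PENTATONIC[i % PENTATONIC.length];
}
```

Quantizing to a scale like this is a common trick in body-driven instruments: noisy, continuous sensor data still lands on notes that sound good together across players.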