Caroline Pegram's ANU Residency | 2024
WombTunes is a platform that transforms foetal ultrasounds into music, blending data sonification with artistic expression. As part of Caroline Pegram's Cybernetic Imagination Residency, I developed the core technology to convert ultrasound images into 'sonic fingerprints' and integrated user inputs to create personalised musical compositions.
Kopi Su Studios
Caroline Pegram
Creative Technologist
AI Researcher
Full Stack Developer
Next.js
MusicGen
The core of WombTunes lies in its ability to transform ultrasound images into unique sonic representations. I developed a process to analyse the brightness of the ultrasound image from left to right, normalise the data, and convert it into a melodic tone. This 'sonic fingerprint' serves as the foundation for the music generation.
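The left-to-right brightness scan can be sketched roughly as follows. This is a minimal illustration of the idea, not the WombTunes codebase: the function name, the MIDI pitch range, and the column-mean reduction are all assumptions made for the example.

```python
import numpy as np

def sonic_fingerprint(image: np.ndarray, low_midi: int = 48, high_midi: int = 84) -> list[int]:
    """Map column-wise brightness of a 2D greyscale image to MIDI pitches."""
    # Mean brightness of each column, scanned left to right.
    brightness = image.mean(axis=0)
    # Normalise to the 0..1 range (guarding against a uniformly flat image).
    span = brightness.max() - brightness.min()
    norm = (brightness - brightness.min()) / span if span else np.zeros_like(brightness)
    # Quantise the normalised curve onto a pitch range to form the melodic contour.
    return [int(round(low_midi + n * (high_midi - low_midi))) for n in norm]

# Tiny synthetic "image": a left-to-right brightness ramp, 4 rows x 8 columns.
img = np.tile(np.linspace(0, 255, 8), (4, 1))
print(sonic_fingerprint(img))  # rising pitch sequence from 48 up to 84
```

A real pipeline would first convert the ultrasound to greyscale and likely crop to the scan region, but the core reduction (image columns → normalised brightness → pitch sequence) is what the animation above visualises.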
Click Play above to listen to the fingerprint created from an ultrasound.
The animation and interactive player above show the process of transforming ultrasound data into a unique sonic fingerprint, which forms the basis for the generated music.
To make the experience interactive and personalised, I refined the generative process to blend the ultrasound with user-selected preferences for emotion, genre, and tempo. This involved creating a matrix of prompts, adjusting the scale of the fingerprint to ensure its unique characteristics were preserved, and applying audio fades to create smoother, more polished endings.
Electronic | BPM: 160 | Excited
Ambient | BPM: 90 | Calm
Classical | BPM: 120 | Joyful
The animation shows the dynamic loading screen, accompanied by three sample sounds generated with different styles and tempos.
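The prompt matrix and the fade-out described above can be sketched like this. The option lists mirror the three samples shown; the prompt wording, function names, and fade length are illustrative assumptions, and in the real pipeline the assembled prompt would condition the MusicGen model.

```python
import numpy as np

# User-selectable options, matching the three samples above (illustrative).
GENRES = ["electronic", "ambient", "classical"]
EMOTIONS = ["excited", "calm", "joyful"]

def build_prompt(genre: str, emotion: str, bpm: int) -> str:
    """Combine user selections into one text prompt for the music model."""
    return f"{emotion} {genre} music at {bpm} BPM"

def apply_fade_out(audio: np.ndarray, sample_rate: int, seconds: float = 1.5) -> np.ndarray:
    """Linearly fade the final `seconds` of a mono signal to silence."""
    n = min(len(audio), int(seconds * sample_rate))
    faded = audio.astype(float).copy()
    faded[-n:] *= np.linspace(1.0, 0.0, n)  # ramp gain from 1 down to 0
    return faded

print(build_prompt("ambient", "calm", 90))  # → "calm ambient music at 90 BPM"
```

Crossing every genre with every emotion and a tempo range yields the full matrix of prompts; the linear fade is one simple way to give each generated clip the smoother ending mentioned above.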
WombTunes was featured on a panel at SXSW, where I discussed the technical and creative aspects of the project. The panel explored how AI and generative tools can be used to create accessible and meaningful artistic experiences. This opportunity highlighted the potential of WombTunes to inspire new ways of understanding and interacting with data through sound.