Closed Beta | 2024
KOUP Music is a community platform designed to transform music creation, currently in closed beta with a growing user community and seed funding. As the Creative Technologist, I collaborated with artists to explore, train, and deploy AI music models using Python and Docker. I also built the backend with Next.js and Supabase and created an adaptive user interface with Tailwind CSS.
Kopi Su Studios
Creative Technologist
AI Researcher
Full Stack Developer
Python
Next.js
Supabase
Tailwind CSS
KOUP evolved from Sonic Mutations, a bespoke tool I helped create for artists performing at the Sydney Opera House. The platform had evolved significantly during the year I was away, so my first task was to research the latest advances in AI music models – assessing their output quality and exploring how they could be fine-tuned. I examined and fine-tuned several models, including MusicGen, StableAudio, and Riffusion. After weighing the pros and cons of each approach, I chose MusicGen as the base model for its superior audio quality, and because we no longer needed the real-time generation that Riffusion had provided.
Lofi
Model: Base
Length: 10s
Atmospheric
Model: Dark
Length: 8s
Relaxing space
Model: Ambient
Length: 8s
Drums
Model: Vocal Drums
Length: 5s
Click around above to hear the different kinds of outputs from KOUP. Each audio clip is generated from a different artist's fine-tuned model.
Using the KOUP studio interface to quickly remix audio on mobile.
KOUP's presence at SXSW demonstrated the platform's potential: musicians quickly grasped the different ways it could fit into their creative workflows. The closed beta has continued to grow, attracting seed funding and building a community of hundreds of users who are helping define responsible AI-assisted music creation.
Visitors testing KOUP at SXSW, where they could record and manipulate audio with the platform's interface on a touch screen.