By Alan Lecheng Chao

A hardware interface for controlling AI-generated melodies

A semi-automatically controlled MIDI controller: a human performer moves a four-dimensional arm while a real-time algorithm analyzes the movement, generates a melodic flow, and routes it to sound generation.
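As a rough illustration of the capture step, the sketch below assumes the arm streams its four joint angles as one comma-separated line per frame over USB serial; the port name, baud rate, and frame format are assumptions, not the project's documented protocol.

```python
# Minimal sketch (assumption: the arm's microcontroller streams one line per
# frame, e.g. "30.0,95.0,140.0,60.0", over USB serial).
import serial

PORT = '/dev/ttyUSB0'   # hypothetical serial port for the arm
BAUD = 115200           # hypothetical baud rate

def read_arm_angles(conn):
    """Parse one serial line into four joint angles (degrees), or None."""
    line = conn.readline().decode('ascii', errors='ignore').strip()
    parts = line.split(',')
    if len(parts) != 4:
        return None  # skip malformed or partial frames
    try:
        return [float(p) for p in parts]
    except ValueError:
        return None

with serial.Serial(PORT, BAUD, timeout=1.0) as conn:
    angles = read_arm_angles(conn)
    if angles is not None:
        print('joint angles (deg):', angles)
```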

Project Video: https://youtube.com/shorts/6JLwb_b0RfM?si=TjfzVxhpmUfQxNw8

Abstract

New technologies have always shaped the direction in which music evolves, and generative AI is now showing its capacity to produce a wide range of content. Rather than building a static UI, this project builds an interactive interface that lets a human steer the generative process, treating it more like an instrument that can be played in real time. A hardware device captures the angle movements made by the performer; this control data is processed by a real-time algorithm, combined with Google Magenta's MusicVAE model, and finally routed to a sound output.
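As a loose sketch of the generation step (my own illustration, not the project's actual code), the example below maps four captured angles onto a MusicVAE latent vector and decodes it into a short melody; the checkpoint path, config choice, and block-scaling mapping are assumptions made for the example.

```python
# Loose sketch: scale blocks of a random MusicVAE latent vector by the four
# normalized joint angles, then decode the vector into a 2-bar melody.
import numpy as np
import note_seq
from magenta.models.music_vae import configs
from magenta.models.music_vae.trained_model import TrainedModel

ANGLE_COUNT = 4      # one value per arm joint (assumption)
LATENT_SIZE = 512    # z_size of the 'cat-mel_2bar_big' config

model = TrainedModel(
    configs.CONFIG_MAP['cat-mel_2bar_big'],
    batch_size=4,
    checkpoint_dir_or_path='cat-mel_2bar_big.ckpt')  # hypothetical path

def angles_to_latent(angles_deg, rng):
    """Map four angles (0-180 deg) onto a latent vector: each joint scales
    one block of the vector, so each joint biases a different region of the
    melody space (an illustrative mapping, not the project's algorithm)."""
    scaled = np.asarray(angles_deg) / 180.0 * 2.0 - 1.0   # -> [-1, 1]
    z = rng.standard_normal(LATENT_SIZE)
    block = LATENT_SIZE // ANGLE_COUNT
    for i, s in enumerate(scaled):
        z[i * block:(i + 1) * block] *= s
    return z[np.newaxis, :].astype(np.float32)

rng = np.random.default_rng(seed=0)
angles = [30.0, 95.0, 140.0, 60.0]           # e.g. read from the arm over serial
z = angles_to_latent(angles, rng)
melody = model.decode(z, length=32, temperature=0.8)[0]  # a NoteSequence
note_seq.sequence_proto_to_midi_file(melody, 'arm_melody.mid')
```

The decoded NoteSequence could then be written out as MIDI, as above, or routed on to a synthesizer for the live sound output.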

Photos


Project Logbook

GitHub: https://github.com/AlanLechengChao/Capstone

Keywords: NIME, Interactive Music, Human-AI Co-Creation