
When York University Associate Professor Doug Van Nort steps onto a stage, he isn't just surrounded by musicians – he's surrounded by collaborators, both human and non-human.
For more than a decade, he has been developing AI-driven machine partners, or machine agents, that improvise alongside performers. Much of that work has been done through the DisPerSion Lab, which he founded in 2015 to explore new modes of creative expression through technology.
There, rather than imagining a future where AI replaces human creativity, Van Nort says he is intent on challenging artists to think, feel and listen differently.
"I always foreground that AI should be for creativity support, not creativity replacement," says Van Nort, who teaches in the Department of Computational Arts (AMPD) at York University.

In an era filled with alarm bells about automation and displacement, Van Nort's vision is notably human-centred. "I remain steadfast in my interest because I've seen how these systems can inspire new directions for trained musicians and for people with no musical background at all. The goal is more creative engagement, not less."
Van Nort's research in this area began when he was a graduate student, with a deceptively simple question: what does it mean to play a digital instrument?
As his doctoral work unfolded, the question evolved into: what if the computer isn't just an instrument – what if it is a partner?
Improvisation, especially the open, exploratory form Van Nort practices, is already a complex social negotiation, he says. Players listen, respond, assert themselves and negotiate one another's musical identities. Introducing a machine agent into this space doesn't merely add a new sound; rather, it reshapes the entire ecosystem.
Audience reaction to this type of music can be strong. "Just calling something AI or an agent sets people's expectations. They start listening for its identity: What does it contribute? What are its edges? How does it change the group dynamic?"
He explains that it's these shifting relationships that fascinate him most.
Although AI music research dates back decades, the last 10 years have brought what Van Nort calls "radical advancements."
Yet, in contrast to large-scale models trained on vast, untraceable internet data, his approach is intentionally intimate. One of his projects involves training machine agents on years of recordings from his own ensembles, including his professional group.
This curated dataset ensures ethical transparency: he knows every musician whose sound is being learned. It also fosters what he describes as "a deep relational quality" between the agents and the performers.
He is also experimenting with a camera that tracks the gestures he uses to guide live improvisers. The same visual cues instruct the machine agents, creating a shared field of communication.
"The humans see me. The machine sees me. They're reacting to the same thing."
Additionally, he choreographs interactions between humans and machines: humans respond to AI gestures and vice versa, generating an ever-shifting conversational fabric.
These explorations don't stay confined to the DisPerSion Lab; they also actively shape Van Nort's teaching at York University.
He leads a large ensemble of students that blends laptops, digital instruments, electronic processing and acoustic players. The group typically involves 25 to 35 students. Some are trained musicians, while others come from digital media or have never formally studied music at all.
This diversity is intentional. "Democratizing music-making is near and dear to my heart," Van Nort says.
Attentive listening, not years of training, is the entry point. Students learn how to contribute meaningfully in a collaborative environment where sound, gesture and technology intertwine.
Within this ensemble, Van Nort occasionally introduces machine agents to see how the group reacts, and how the AI learns to behave in a larger creative ensemble.
He also incorporates "soundpainting," a gestural vocabulary for composing live music. With a sweep of his hands, Van Nort can reshape an unfolding piece, cue performers or shift musical textures. When the AI agents respond to the same gestures, the boundary between composition, improvisation and programming dissolves.
"The ensemble becomes a living organism," he says. "Machines are part of that ecosystem."
While Van Nort is often the public face of this work, he emphasizes that his research is collaborative. Graduate students working in the lab – such as current PhD student Rory Hoy and former master's student Kieran Maraj – have made important contributions to the systems he now uses in research and teaching, including code development and interface design.
For Van Nort, AI isn't about efficiency or optimization. It's about creating the conditions for deeper expression.
"How can you enrich your own creative voice through your own data, your own way of working?" he asks. It's a question that applies not just to musicians, but to writers, artists and anyone experimenting with new tools of expression.
With files from Karen Martin-Robbins
