This is the project website & blog of MIRLCAuto: A Virtual Agent for Music Information Retrieval in Live Coding, a project funded by the EPSRC HDI Network Plus Grant - Art, Music, and Culture theme.

Sonic Haikus Premiere & Participation at ARTIFICIA Festival – May 6, 2021

07 May 2021 • Anna Xambó

Screenshot from the event "Session 3: Musical performance & education with AI" at ARTIFICIA Festival. From left to right and top to bottom: Sergio Giraldo, Rafael Ramírez, George Waddell, Enric Guaus, Alia Morsi, Anna Xambó and Luisa Pereira.

On 6 May 2021, I had the opportunity to participate in the event Session 3: Musical performance & education with AI as part of the first edition of the ARTIFICIA Festival. The event was presented and moderated by Sergio Giraldo (MTG, UPF) and I shared the bill with Rafael Ramírez (MTG, UPF), George Waddell (Royal College of Music, London), Luisa Pereira (ITP, NYU), Enric Guaus (ESMUC), and Alia Ahmed Morsi (MTG, UPF).

This session was part of a 3-event series about Music AI & Creativity. The previous two sessions were about "Musical Improvisation with AI" and "Musical Composition with AI", which are worth checking out. The third session, titled "Musical Performance & Education with AI", explored questions around what a musical performance is and how it relates to creativity, what the challenges are for fostering creativity in musical education, whether AI can help students improve interpretation and creativity, how AI technology can promote inclusion in musical studies, and whether we need to do more to incorporate women in this artistic discipline.

The event started with presentations of three projects: TELMI, by Rafael Ramírez and George Waddell; MIRLCAuto, by Anna Xambó, including the premiere of the piece "Sonic Haikus"; and "The Code of Music", by Luisa Pereira. "Sonic Haikus" explores the sonification of five haikus using MIRLCa. Haikus have a highly constrained structure, which parallels the high-level and similarly constrained MIRLC language. Haikus use natural imagery to make Zen-like observations about reality, which can be enhanced with tagged sounds from Freesound. In this performance, the live coder explores the online database using the system's retrieval methods, in this case retrieval by tag, keeping only the sounds that the virtual agent predicts as "good" sounds.
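To picture the tag-based retrieval with the "good sound" filter, here is a minimal illustrative sketch in Python against the public Freesound text-search endpoint. It is not MIRLCa's own implementation: the is_good_sound function is a hypothetical stand-in for the trained agent's prediction, and the API key is assumed.

```python
import requests

FREESOUND_TOKEN = "YOUR_API_KEY"  # assumed: a personal Freesound API key
SEARCH_URL = "https://freesound.org/apiv2/search/text/"


def is_good_sound(sound):
    """Hypothetical stand-in for the trained agent that predicts whether a
    retrieved sound is 'good'; the real agent is a machine-learning model."""
    # Placeholder heuristic only: prefer shorter sounds.
    return sound.get("duration", 0.0) < 30.0


def sounds_by_tag(tag, limit=5):
    """Retrieve candidate sounds tagged `tag` and keep only the 'good' ones."""
    params = {
        "query": "",
        "filter": f"tag:{tag}",
        "fields": "id,name,tags,duration,previews",
        "token": FREESOUND_TOKEN,
    }
    response = requests.get(SEARCH_URL, params=params)
    response.raise_for_status()
    candidates = response.json().get("results", [])
    return [s for s in candidates if is_good_sound(s)][:limit]


# For example, a haiku line evoking rain might query sounds tagged "rain":
for sound in sounds_by_tag("rain"):
    print(sound["id"], sound["name"])
```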

We then moved to a debate, opened by Enric Guaus and Alia Ahmed Morsi with provocative reflections, and continued with a discussion in which we also answered questions from the audience on the YouTube channel. The debate focused on AI technologies that support the democratisation and enhancement of music learning, with a special interest in aspects such as gender balance and inclusion in music education. We discussed topics ranging from how to involve parents in the learning experience, whether we can create new music with AI that was not possible before, whether including AI enhances musical creativity, how to keep the balance between coding and making music, what tools we are using, and what types of interaction they afford. The event concluded with an audiovisual piece by Luisa Pereira. You can watch the video of the full event here.

It was great to catch up with the fascinating work of colleagues, some of whom I already knew, as well as to meet new ones. Music performance and education with AI is a rich and novel area of exploration that can be approached from many angles, from supporting musical instrument learning and STEAM education to promoting new approaches to music-making.

Acknowledgements

Thank you to the ARTIFICIA Festival for the opportunity to be part of this panel and for the excellent organisation, with special thanks to Xavier Satorra, Lissette Lemus and Xavier García. Thank you also to Pol Thomas Ferrero for the great technical support during the streaming. Many thanks to Gerard Roma and Petros Galanakis for their help with the conceptualisation and execution of the piece.
