Creative Applications of Music and Audio Research

10:00-11:20 on Monday, 2nd September

P350, Parkside

The definition of Artificial Intelligence (AI) varies wildly depending on the context. In industry, the term is often overused and misunderstood. This tutorial attempts to lift the veil on companies that claim to use AI, and on those building effective solutions in other ways. We will showcase industry projects (such as apps, platforms, games, performances, and pieces of music or audio) whose inner workings are not necessarily transparent, and provide a “forensic” analysis of the research behind them, based on our own experience in this field. We will touch on the trend of creativity in the machine learning community and the challenges it poses to the scientific process. Additionally, we will draw on our extensive experience as MIR practitioners and consultants in the music and audio creative space to detail a systematic process for creating machine learning products when there is little or no existing research to rely on.

Amélie Anglade

Dr. Amélie Anglade is a Music Information Retrieval and Data Science consultant. She completed her PhD at Queen Mary University of London with highest distinction for her work on "Logic-based Modelling of Musical Harmony for Automatic Characterisation and Classification". Amélie has always had a strong focus on industry applications of her research, first taking on research student positions in R&D labs such as Sony CSL, Philips Research and CNRS, and then working as an MIR expert for music tech startups such as SoundCloud and frestyl. For the past five years she has further developed her expertise in music identification and discovery, assisting startups and larger companies in the AI and music or multimedia space as an independent consultant, researching, prototyping, and scaling up machine learning solutions for them. Additionally, Amélie is an independent technical expert for the EU Commission, in charge of reviewing proposals and ongoing EU research projects. She was co-Chair of the Workshop on Music Recommendation and Discovery (ACM Recommender Systems Conference, 2010 and 2011), co-Program Chair of the Workshop on the Future of Music Information Retrieval (ISMIR, 2009 and 2010), and has been a co-organiser of the Berlin MIR Meetup since 2017. Amélie was an elected Board member of the International Society for Music Information Retrieval (ISMIR) from 2015 to 2017. She was also a co-founder of the Women in Music Information Retrieval (WiMIR) initiative, and is a teacher and mentor for women in the field of data science through multiple organizations.

Website: http://amelieanglade.net/

Ryan Groves

Ryan Groves is an award-winning music researcher and veteran developer of intelligent music systems. He has been a long-standing member of the ISMIR community, both as a reviewer and as a committee member for the Women in Music Information Retrieval (WiMIR) initiative. He received his B.S. in Computer Science from UCLA and went on to complete a Master's in Music Technology at McGill University. In 2016, his work on “A Supervised Probabilistic Context-Free Grammar for Melodic Reduction” was awarded Best Paper at ISMIR. He also has extensive experience in industry, building musical products that leverage machine learning. As the former Director of R&D for Zya, he developed Ditty, a musical messenger app that automatically sings your texts; Ditty won Best Music App of 2015 at the Appy Awards. More recently, he co-founded Melodrive, where he and his team built the first artificially intelligent composer able to compose music in real time and react to interactive scenarios such as games and VR experiences. He now works as a consultant and startup advisor in Berlin, with a focus on expanding the use cases of music and audio through the application of AI.

Website: http://arconamusic.com

LinkedIn: https://www.linkedin.com/in/ryan-groves/