RNNs for Rhythm Analysis & MIR in Music Creation and Collaboration


For the third Berlin MIR Meetup we're happy to present the following talks:
Sebastian Böck (JKU Linz): Automatic Beat and Downbeat Tracking
Automatic analysis of the rhythmic structure of music has been an active research field since the 1970s. It is of prime importance for musicology and for tasks such as music transcription, automatic accompaniment, expressive performance analysis, music similarity estimation, and music segmentation. In this talk I will give an overview of state-of-the-art computational rhythm description systems, with a special focus on beat and downbeat tracking with recurrent neural networks (RNNs). These systems are usually built, tuned, and evaluated on music of low rhythmic complexity. Using difficult examples, I will analyse the capabilities of these systems, investigate their shortcomings, and discuss challenges and ideas for future research on music of higher rhythmic complexity.
Bio: Sebastian Böck holds a diploma degree in electrical engineering and a doctoral degree in computer science. His research in the field of music information retrieval (MIR) focusses on automatic rhythm and time-event series analysis, including beat and downbeat tracking, meter analysis, and tempo estimation. He has a strong interest in artificial neural networks and Bayesian methods. He is the main developer of the open-source Python audio and music signal processing library madmom (https://github.com/CPJKU/madmom).
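
For readers who want to experiment with this two-stage approach before the talk, the sketch below shows how beat and downbeat tracking can be run with madmom's processor classes (RNN activation computation followed by dynamic-Bayesian-network decoding). The input file name is a placeholder, and the class names reflect my reading of madmom's documentation rather than the exact pipeline presented in the talk:

    from madmom.features.beats import RNNBeatProcessor, DBNBeatTrackingProcessor
    from madmom.features.downbeats import RNNDownBeatProcessor, DBNDownBeatTrackingProcessor

    # Stage 1: an ensemble of RNNs turns the audio into a frame-wise beat activation function.
    beat_act = RNNBeatProcessor()('song.wav')
    # Stage 2: a dynamic Bayesian network decodes beat times (in seconds) from the activations.
    beats = DBNBeatTrackingProcessor(fps=100)(beat_act)

    # The downbeat variant works analogously, but its activations encode joint
    # beat/downbeat information; beats_per_bar lists the candidate meters.
    db_act = RNNDownBeatProcessor()('song.wav')
    positions = DBNDownBeatTrackingProcessor(beats_per_bar=[3, 4], fps=100)(db_act)
    # Each returned row is (time, position in bar); position 1 marks a downbeat.
    downbeats = positions[positions[:, 1] == 1][:, 0]

The split into two stages mirrors the systems discussed in the abstract: a neural network estimates how likely a beat is at each frame, and a probabilistic model then enforces a globally consistent tempo and meter over those local estimates.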
Peter Knees (TU Wien): Music Retrieval and Recommendation: Applications in Music Creation and Collaboration
The field of music information retrieval deals, among other things, with the extraction of musical information from audio signals as well as from metadata in the widest sense. This requires a variety of intermediate steps, from onset detection and beat tracking to key estimation, melody extraction, and instrument recognition. The extracted features have proven useful for tasks such as music recommendation and cover song retrieval. They also enable applications for intelligent music production, such as those found in today's digital audio workstations. In this talk I will give an introduction to the field of MIR and the prevalent techniques used in the analysis of audio and user-generated data. I will connect these to some of the outcomes of the recently finished GiantSteps project, as well as to new challenges we are facing in a recently started project on recommendation in an online jam community.
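
To make these intermediate steps concrete, here is a minimal sketch of such a feature-extraction chain. It uses the open-source librosa library purely as an illustration; librosa is not mentioned in the talk, and the file name is a placeholder:

    import librosa

    # Load the audio file at librosa's default sample rate.
    y, sr = librosa.load('track.wav')

    # Onset detection: frame indices at which note onsets occur.
    onset_frames = librosa.onset.onset_detect(y=y, sr=sr)

    # Beat tracking: a global tempo estimate plus the frames of the beats.
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)

    # Chroma features: a common intermediate representation for key estimation.
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)

    # Melody extraction: fundamental-frequency tracking via probabilistic YIN.
    f0, voiced_flag, voiced_prob = librosa.pyin(
        y, fmin=librosa.note_to_hz('C2'), fmax=librosa.note_to_hz('C7'))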
Bio: Peter Knees holds a doctoral degree in computer science and is an assistant professor at TU Wien, Austria. For over a decade, he has been an active member of the music information retrieval (MIR) research community, branching out into the related areas of multimedia, text IR, recommender systems, and digital media arts. In the recently finished EU-funded research project GiantSteps, he investigated the role of MIR in music production and performance and supported the development of new tools for practitioners. Currently, he is leading the project SmarterJam with the Vienna-based startup sofasession, which aims to facilitate collaboration and enhance the experience in online jam sessions.
