A Minimal Template for Interactive Web-based Demonstrations of Musical Machine Learning

Vibert Thio, Hao-Min Liu, Yin-Cheng Yeh, Yi-Hsuan Yang

New machine learning algorithms are being developed to solve problems in different areas, including music. Intuitive, accessible, and understandable demonstrations of newly built models can help attract the attention of people from different disciplines and spark discussion. However, we notice that it has not been common practice for researchers working on musical machine learning to demonstrate their models interactively. To address this issue, we present in this paper a template specifically designed to demonstrate symbolic musical machine learning models on the web. The template comes with a small codebase, is open source, and is meant to be easy for practitioners to use when implementing their own demonstrations. Moreover, its modular design facilitates the reuse of musical components and accelerates implementation. We use the template to build interactive demonstrations of four exemplary music generation models. We show that the built-in interactivity and real-time audio rendering of the browser make the demonstrations easier to understand and play with. They also help researchers gain insight into different models and A/B test them.
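As an illustration of the symbolic-to-audio plumbing a browser demo of this kind needs, the sketch below converts symbolic output (MIDI note numbers) into playback frequencies. This is a hypothetical helper of our own, not code from the paper's template; it assumes standard equal temperament with A4 (MIDI 69) tuned to 440 Hz.

```javascript
// Hypothetical helper (not from the paper's codebase): map a MIDI note
// number to its frequency in Hz under 12-tone equal temperament,
// with A4 (MIDI note 69) tuned to 440 Hz.
function midiToFreq(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// A symbolic melody (MIDI pitches) such as a generative model might emit:
// C4 D4 E4 F4 G4.
const melody = [60, 62, 64, 65, 67];

// Frequencies that could be fed to, e.g., Web Audio oscillators for
// real-time rendering in the browser.
const freqs = melody.map(midiToFreq);
console.log(freqs.map(f => f.toFixed(2)));
```

In an actual browser demo, each frequency would typically be assigned to an `OscillatorNode` (or a sampler) scheduled on an `AudioContext` timeline, which is what enables the real-time playback the abstract refers to.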
