Automatic Music Production Using Generative Adversarial Networks

When it comes to computer-based music generation, research follows two main threads: the construction of $\textit{autonomous music-making systems}$, and the design of $\textit{computer-based environments to assist musicians}$. Despite consistent demand from music producers and artists, however, little work has been done on automatic music arrangement in the audio domain. In this work, we propose a novel framework for $\textit{automatic music arrangement from raw audio in the frequency domain}$. Using several songs converted into Mel-spectrograms -- two-dimensional time-frequency representations of audio signals -- we were able to automatically generate original arrangements for both bass and voice lines. Treating music pieces as images (Mel-spectrograms) allowed us to reformulate our problem as an $\textit{unpaired image-to-image translation}$ problem and to tackle it with CycleGAN, a well-established framework. Moreover, the choice to work with raw audio and Mel-spectrograms enabled us to model long-range dependencies more effectively, to better reflect how humans perceive music, and to potentially draw sounds for new arrangements from the vast collection of music recordings accumulated over the last century. Our approach was tested on two downstream tasks: given a bass line, generating credible and on-time drums; and given an a cappella song, arranging it into a full song. In the absence of an objective way to evaluate the output of music generative systems, we also defined a possible metric for the proposed task, partially based on human (and expert) judgment. To the best of our knowledge, we are the first to address the music arrangement task in the audio domain, to treat music pieces as images, and to propose a quantitative approach to evaluating the model's results.
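
As a rough illustration of the representation described above (not the authors' code), the sketch below shows how a track could be converted into a log-scaled Mel-spectrogram, treated as a two-dimensional image, and approximately inverted back to audio via Griffin-Lim. It assumes the librosa library and illustrative parameter choices (sampling rate, number of Mel bands, hop length); the paper's actual settings may differ.

```python
# Minimal sketch of the Mel-spectrogram "image" representation,
# assuming librosa and illustrative (not the paper's) parameters.
import librosa
import numpy as np

def song_to_mel(path, sr=22050, n_mels=128, n_fft=2048, hop_length=512):
    """Load a song and convert it to a log-scaled Mel-spectrogram."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    mel = librosa.feature.melspectrogram(
        y=y, sr=sr, n_fft=n_fft, hop_length=hop_length, n_mels=n_mels
    )
    # Log amplitude roughly matches human loudness perception.
    mel_db = librosa.power_to_db(mel)
    return mel_db  # shape: (n_mels, time_frames), usable as a 2-D image

def mel_to_song(mel_db, sr=22050, n_fft=2048, hop_length=512):
    """Approximately invert a log Mel-spectrogram back to audio (Griffin-Lim)."""
    mel = librosa.db_to_power(mel_db)
    return librosa.feature.inverse.mel_to_audio(
        mel, sr=sr, n_fft=n_fft, hop_length=hop_length
    )
```

In an unpaired image-to-image setting such as CycleGAN, spectrogram "images" from one domain (e.g., bass lines) and another (e.g., drum tracks) would be fed to the two generators without aligned pairs; the inverse transform above is only an approximate reconstruction, since Mel filtering and phase discarding are lossy.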
