Music composition and interpretation using transformer networks
Date
2020-12-04
Author
Naranjo de las Heras, Rubén
Abstract
This work presents the development of a deep learning model that automatically generates and completes musical compositions using generative machine learning techniques, approaching the task as a language modeling problem.
Throughout the document, different neural network architectures are studied and compared, from vanilla recurrent neural networks to transformers. The representation of the musical data is also discussed, together with the design decisions involved in building a model capable of composing and interpreting music.
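The abstract does not specify the event encoding or the framework used; as a rough illustration of the language-modeling framing only, the sketch below (in PyTorch, an assumption) treats a piece as a sequence of integer event tokens and uses a small causal transformer to predict the next event, which allows both generating music from scratch and completing a given fragment. The vocabulary size, dimensions, and the complete helper are illustrative and not taken from the thesis.

import torch
import torch.nn as nn

VOCAB_SIZE = 388   # assumed size of the musical event vocabulary
D_MODEL = 256
MAX_LEN = 512

class MusicTransformerLM(nn.Module):
    def __init__(self):
        super().__init__()
        self.token_emb = nn.Embedding(VOCAB_SIZE, D_MODEL)
        self.pos_emb = nn.Embedding(MAX_LEN, D_MODEL)
        layer = nn.TransformerEncoderLayer(
            d_model=D_MODEL, nhead=8, dim_feedforward=1024, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=4)
        self.to_logits = nn.Linear(D_MODEL, VOCAB_SIZE)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer ids of musical events
        seq_len = tokens.size(1)
        positions = torch.arange(seq_len, device=tokens.device)
        x = self.token_emb(tokens) + self.pos_emb(positions)
        # Causal mask: each event may only attend to earlier events.
        mask = torch.triu(
            torch.full((seq_len, seq_len), float("-inf"), device=tokens.device),
            diagonal=1,
        )
        h = self.encoder(x, mask=mask)
        return self.to_logits(h)   # next-event logits at every position

@torch.no_grad()
def complete(model, prompt, n_new=64):
    """Autoregressively extend a musical fragment by sampling the next event."""
    seq = prompt.clone()
    for _ in range(n_new):
        logits = model(seq[:, -MAX_LEN:])[:, -1]
        next_token = torch.multinomial(torch.softmax(logits, dim=-1), 1)
        seq = torch.cat([seq, next_token], dim=1)
    return seq

Completing an existing fragment and composing from scratch differ only in the prompt: an empty or single-token prompt yields a new piece, while a tokenized excerpt yields its continuation.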
The model is trained and tested three times: once on each of two different datasets and once on both combined. The three resulting models are then compared, and one of them is evaluated with human subjects to validate the generated musical compositions.
The document also presents the design and implementation of a web interface aimed at non-technical users, to assist them in the creative process of music composition.