School of Electronic Engineering and Computer Science

Mr Pedro Sarmento


Room Number: Engineering, Eng 104


Project Title:

Guitar-Oriented Neural Music Generation in Symbolic Format


Despite recent advances in symbolic music generation leveraging deep learning architectures, most of the literature is based on the MIDI format. Its syntax follows a descriptive type of notation, in which symbols represent pitches and velocity values, as opposed to a prescriptive type of notation, in which symbols correspond to actions on a given instrument. This distinction is particularly relevant for fretted instruments such as the guitar, where a player has multiple ways of producing a given note. Prescriptive notation is used in tablatures, and dedicated software and formats exist for editing them (e.g. Guitar Pro, TuxGuitar, Power Tab).
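The one-to-many relation between a pitch and its realisations on the fretboard can be made concrete with a short sketch. The tuning and fret count below are illustrative assumptions (standard six-string tuning, 20 frets), not values taken from the project:

```python
# Enumerate the string/fret pairs that sound a given pitch on a
# standard-tuned six-string guitar. Tuning and fret range are
# illustrative assumptions; real instruments and tunings vary.
STANDARD_TUNING = [64, 59, 55, 50, 45, 40]  # MIDI pitches, string 1 (high E) to 6 (low E)
MAX_FRET = 20

def positions_for_pitch(midi_pitch):
    """Return all (string, fret) pairs that produce the given MIDI pitch."""
    positions = []
    for string_number, open_pitch in enumerate(STANDARD_TUNING, start=1):
        fret = midi_pitch - open_pitch
        if 0 <= fret <= MAX_FRET:
            positions.append((string_number, fret))
    return positions

# E4 (MIDI 64) can be played in five places on a 20-fret guitar:
print(positions_for_pitch(64))  # [(1, 0), (2, 5), (3, 9), (4, 14), (5, 19)]
```

A descriptive format such as MIDI stores only the pitch (64), whereas a prescriptive format such as tablature must commit to one of these five positions.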

In its final stage, this project is expected to generate new knowledge on how deep learning models, specifically sequence-to-sequence architectures such as the Transformer, can be used to create novel music in symbolic format that explores the tablature style of notation. By emphasizing the prescriptive approach, the generated output carries instructions on how to play the content on the instrument. To this end, we explore dadaGP, a dataset of more than 25,000 songs in both Guitar Pro and token formats. The token format is an English-like representation built from special words that denote particular actions and notational figures; it captures information similar to MIDI, but also annotations specific to guitars and other fretted instruments (e.g. techniques such as palm muting, bending, hammer-ons and pull-offs). For the music generation task itself, we intend to explore three case studies: a Conditional Music Generation task, in which we promote style/artist transfer by conditioning the output on a blend of artists or genres; an Assistant Composer case study, in which we test the system in a real-world scenario by generating new pieces (or segments of pieces) for a band to play; and a Sonification of Smart City Data use case, in which we map note density in guitar solos to data from smart city sensors, namely air pollution.
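The sonification case study can be sketched as a simple mapping from a sensor reading to a note-density value that could condition generation. The pollutant range, density range, and the linear mapping below are assumptions for illustration, not the project's actual design:

```python
# Minimal sketch: linearly map an air-pollution sensor reading to a
# note-density value (notes per bar) for conditioning a guitar solo.
# All ranges below are illustrative assumptions, not the project's values.

def reading_to_density(reading, lo=0.0, hi=100.0, min_notes=1, max_notes=16):
    """Linearly map a sensor reading in [lo, hi] to an integer note density."""
    clamped = max(lo, min(hi, reading))       # guard against out-of-range readings
    frac = (clamped - lo) / (hi - lo)         # normalise to [0, 1]
    return round(min_notes + frac * (max_notes - min_notes))

print(reading_to_density(0))    # 1  (clean air -> sparse solo)
print(reading_to_density(20))   # 4
print(reading_to_density(100))  # 16 (heavy pollution -> dense solo)
```

In practice such a density value would be fed to the generation model as a conditioning signal alongside the token sequence, but the interface for doing so is model-specific.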

C4DM theme affiliation: 

Music Informatics


Research Interests:

Sound and Music Computing
