Dinh-Viet-Toan Le, Yi-Hsuan Yang
Abstract. Western music is often characterized by a homophonic texture, in which the musical content can be organized into a melody and an accompaniment. In orchestral music, in particular, the composer can select specific characteristics for each instrument's part within the accompaniment, while also needing to adapt the melody to suit the capabilities of the instruments performing it. In this work, we propose METEOR, a model for Melody-aware Texture-controllable Orchestral music generation. This model performs symbolic multi-track music style transfer with a focus on melodic fidelity. We allow bar- and track-level controllability of the accompaniment with various textural attributes while keeping a homophonic texture. We show that the model can achieve controllability performance similar to that of strong baselines while greatly improving melodic fidelity.
Audio samples are synthesized from the generated MIDI files using the default MuseScore soundfont.
The indicated polyphonicity and rhythmicity values are expressed as differences from the values analyzed in the reference piece.
Reference file: 3814911.mid (SymphonyNet dataset), bar 34
Reference file: 268756.mid (SymphonyNet dataset), bar 8
In the following samples, bar-level and track-level controls are left unmodified. The instrumentation is chosen automatically, and the melodic instrument is enforced. The melodic tracks are shown in black on the piano rolls.