Escaping from the abyss of manual annotation: New methodology of building polyphonic datasets for automatic music transcription
Journal
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Journal Volume
9617 LNCS
ISBN
9783319462813
Date Issued
2016-01-01
Author(s)
Su, Li
Abstract
While recent years have witnessed significant progress in algorithms for automatic music transcription (AMT), the development of general and sizable datasets for AMT evaluation has been relatively stagnant, predominantly because manually annotating and verifying such datasets is labor-intensive and time-consuming. In this paper we propose a novel note-level annotation method for building AMT datasets that exploits humans' ability to follow music in real time. To assess the quality of the annotation, we further propose an efficient method for qualifying an AMT dataset based on the concepts of onset error difference and a tolerance computed from the evaluation result. Based on experiments on five piano solos and four woodwind quintets, we claim that the proposed annotation method is reliable for the evaluation of AMT algorithms.
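The onset-error-based qualification idea can be illustrated with a minimal sketch. The function names, the matching window, and the percentile-based tolerance below are illustrative assumptions, not the paper's exact formulation: annotated onsets are matched to reference onsets, per-note onset errors are collected, and the error distributions of two annotation passes are compared.

```python
import numpy as np

def onset_errors(annotated, reference, max_gap=0.5):
    """Match each reference onset (seconds) to the nearest annotated onset
    within `max_gap` seconds and return the signed onset errors."""
    annotated = np.sort(np.asarray(annotated, dtype=float))
    reference = np.sort(np.asarray(reference, dtype=float))
    errors = []
    for r in reference:
        idx = np.argmin(np.abs(annotated - r))
        err = annotated[idx] - r
        if abs(err) <= max_gap:
            errors.append(err)
    return np.array(errors)

def onset_error_difference(errs_a, errs_b):
    """Difference of mean absolute onset error between two annotation
    sources measured against the same reference (illustrative definition)."""
    return np.mean(np.abs(errs_a)) - np.mean(np.abs(errs_b))

# Toy usage: a reference onset list versus two human annotation passes.
reference = [0.00, 0.50, 1.00, 1.52, 2.01]
annot_1   = [0.02, 0.48, 1.03, 1.50, 2.05]
annot_2   = [0.05, 0.55, 0.95, 1.60, 1.98]

e1 = onset_errors(annot_1, reference)
e2 = onset_errors(annot_2, reference)
print("mean |error| pass 1: %.3f s" % np.mean(np.abs(e1)))
print("mean |error| pass 2: %.3f s" % np.mean(np.abs(e2)))
print("onset error difference: %.3f s" % onset_error_difference(e1, e2))

# A data-driven onset tolerance could be taken from the error spread,
# e.g. the 95th percentile of |error| across annotations (an assumption,
# not the paper's exact formula).
tolerance = np.percentile(np.abs(np.concatenate([e1, e2])), 95)
print("derived onset tolerance: %.3f s" % tolerance)
```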
Subjects
Automatic music transcription | Multipitch estimation | Note tracking | Onset error difference
Type
conference paper