Abstract



Robust 'any-time' music tracking
Andreas Arzt; Gerhard Widmer

In our talk we will present a new method that permits a computer to listen to, and follow, live music in real time by analysing the incoming audio stream and aligning it to a symbolic representation (e.g., a score) of the piece(s) being played. In particular, we will present a multi-level music matching and tracking algorithm that, by continually updating and evaluating multiple high-level hypotheses, effectively deals with almost arbitrary deviations of the live performer from the score: omissions, forward and backward jumps, unexpected repetitions, or (re-)starts in the middle of the piece. We will also show that additional (automatically computed) knowledge about the structure of the piece can be used to further improve the robustness of the tracking process. We will discuss the resulting system in the context of an automatic page-turning device for musicians, but the method will be of use in a much wider class of scenarios that require reactive and adaptive musical companions.
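The core of such a tracker is an online alignment between the incoming audio feature stream and the score. The authors' actual multi-level, multi-hypothesis algorithm is not spelled out in this abstract, so the following is only a minimal illustrative sketch of the underlying idea, using a simplified online dynamic-time-warping step (all names and the feature representation are assumptions for illustration):

```python
import numpy as np

def make_tracker(score_feats):
    """Return a step function that, given one incoming audio feature frame,
    extends one row of a DTW-style cumulative cost array over the score and
    reports the currently best-matching score position.

    Illustrative only: a real tracker bounds the search region, works on
    proper audio features, and (as in the abstract) maintains multiple
    high-level hypotheses to survive jumps, omissions, and repeats.
    """
    n = len(score_feats)
    prev = None  # cumulative cost row for the previous audio frame

    def step(frame):
        nonlocal prev
        cur = np.empty(n)
        for j in range(n):
            cost = float(np.linalg.norm(frame - score_feats[j]))
            if prev is None:
                # first audio frame: can only move forward through the score
                cur[j] = cost if j == 0 else cost + cur[j - 1]
            else:
                best = prev[j]                       # performer holds position
                if j > 0:
                    best = min(best, prev[j - 1],    # advance together
                               cur[j - 1])           # score advances alone
                cur[j] = cost + best
        prev = cur
        return int(np.argmin(cur))  # current score-position estimate

    return step
```

A toy run with one-dimensional "features", where the performance simply follows the score, shows the estimated position advancing frame by frame:

```python
score = [np.array([float(k)]) for k in range(10)]
step = make_tracker(score)
path = [step(np.array([float(k)])) for k in range(10)]
# path tracks the diagonal: position 0 at the first frame, 9 at the last
```

A single cumulative-cost row like this assumes roughly monotonic progress through the score; handling backward jumps or restarts is precisely what the multiple concurrent hypotheses described above are for.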