BTrack is a causal beat tracking algorithm, intended for use in live performances. It was developed as part of my PhD research at Queen Mary University of London, in collaboration with Matthew Davies and Mark Plumbley.
Beat tracking is the act of tapping ‘in time’ with music, much as you might tap your foot along to a song. Automatic beat tracking algorithms attempt to replicate this process computationally.
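As a toy illustration of the causal, frame-by-frame idea (this is *not* the BTrack algorithm itself, and every name here is hypothetical), the sketch below processes a synthetic click track one frame at a time, detects onsets from jumps in frame energy, and estimates a tempo from the inter-onset intervals:

```python
# Toy causal beat tracking sketch on a synthetic click track.
# Not the BTrack algorithm; just the frame-by-frame processing idea.

SAMPLE_RATE = 1000   # Hz (kept small so the demo runs quickly)
HOP_SIZE = 25        # samples per frame
BEAT_PERIOD = 0.5    # seconds between synthetic clicks (i.e. 120 BPM)

# Build a synthetic signal: silence with a short burst every BEAT_PERIOD.
n_samples = SAMPLE_RATE * 4
signal = [0.0] * n_samples
for i in range(n_samples):
    if i % int(BEAT_PERIOD * SAMPLE_RATE) < 5:
        signal[i] = 1.0

# Causal processing: each frame is handled as it 'arrives',
# using only past information.
onset_times = []
prev_energy = 0.0
for start in range(0, n_samples, HOP_SIZE):
    frame = signal[start:start + HOP_SIZE]
    energy = sum(x * x for x in frame)
    # A jump in energy relative to the previous frame marks an onset.
    if energy - prev_energy > 0.5:
        onset_times.append(start / SAMPLE_RATE)
    prev_energy = energy

# Tempo estimate from the median inter-onset interval.
intervals = sorted(b - a for a, b in zip(onset_times, onset_times[1:]))
period = intervals[len(intervals) // 2]
bpm = 60.0 / period
print(round(bpm))  # 120
```

A real algorithm works on a continuous onset detection function and handles tempo changes, expressive timing, and noisy onsets, but the causal structure, where each audio frame updates the beat estimate with no access to future audio, is the same constraint BTrack operates under.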
The code is open source and I still regularly maintain and update the library. Please check out the GitHub repository.
Having used a computer on stage with my band, and having seen many others do so too, it is clear to me that computers are inflexible bandmates: musicians must yield to the computer, playing to a fixed-tempo ‘click track’.
Loads of great music has been made to a click track, but why does it have to be that way when computers get involved? It should be a creative choice, not an inevitability. In my view, the increased use of click tracks has implications for music, both in composition and performance. So, my motivation when looking into beat tracking was to allow computers to take part in performances without putting restrictions on human performers.
Here are some use-cases I had in mind when developing the algorithm:
- Having a computer ‘backing track’ automatically stay in time with a live band, without the need for the drummer to play to a click
- Many bands and performers use live visuals to accompany their performances – perhaps a beat tracking algorithm could adjust the visuals to keep them in sync with tempo changes
- “Machine musicianship”: perhaps ‘intelligent’ computer musicians could play parts in bands or ensembles, and the ability to follow tempo changes is one of the first things such a machine musician would need
Please note that the above are all huge research areas in themselves; I didn’t come up with any of these ideas, and they were dreamt up and explored long before I started working in this field. But they were what I had in mind when I took an interest in beat tracking.
2011 – Musicians and Machines: Bridging the Semantic Gap in Live Performance – Chapter 3
A. M. Stark, PhD Thesis, Queen Mary University of London, 2011.
2009 – Real-Time Beat-Synchronous Analysis of Musical Audio
A. M. Stark, M. E. P. Davies and M. D. Plumbley. In Proceedings of the 12th International Conference on Digital Audio Effects (DAFx-09), Como, Italy, September 1–4, 2009.