
Friday, October 2, 2015

"Data in a Major Key": MIT Spectrum, FSU


MIT Spectrum has an article by Kathryn M. O'Neill on my work, music21, and computational musicology:
“IF I WANT TO KNOW how the guitar and saxophone became the important instruments throughout classical repertory or how chord progressions have changed, those are questions musicology has been unable to approach,” says Associate Professor of Music Michael Cuthbert. Spotting trends and patterns in a large corpus of music is nearly impossible using traditional methods of study, because it requires the slow process of examining pieces one by one. What his field needed, Cuthbert determined, was a way to “listen faster.”
Read more at http://spectrum.mit.edu/articles/data-in-a-major-key/.

In other news, Clifton Callender at Florida State University is currently teaching a doctoral seminar on music theory techniques using music21.  His course description is at http://cliftoncallender.com/teaching/.



Friday, May 23, 2014

Speed, Speed, Speed, ... and news.

The newest commits on GitHub contain a huge change to the under-the-hood processing of .getContextByClass(), which is used in about a million places in music21.  It is the function that lets any note know what its current TimeSignature (and thus beatStrength, etc.) is, that lets us figure out whether the sharp on a given note should be displayed given the current KeySignature, and so on.  Even though we had tried to optimize the hell out of it, it has been a major bottleneck in music21 when working with very large scores.  The last commit sped up parsing a lot (at least the second time through); this was the time to speed up context searching.  We now use a form of AVL tree, implemented in a new Stream.timespans module.  It isn't well documented yet, so we're only exposing it directly in one place, stream.asTimespans(recurse=True|False).  You don't need to know about this unless you're a developer, but I wanted to let you know that the results are extraordinary.
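For the curious, here's a minimal sketch of what calling the new interface might look like. The recurse flag comes straight from the description above, but the rest (the choice of a small Bach chorale, the variable name) is just my own illustration of an undocumented, still-moving API:

>>> from music21 import corpus
>>> s = corpus.parse('bach/bwv66.6')
>>> # build the AVL-tree-backed view of the stream; context lookups against it
>>> # become tree searches rather than linear scans over the flattened stream
>>> tsTree = s.asTimespans(recurse=True)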

Here’s a code snippet that loads a score with three parts, 126 measures, and many TimeSignatures, looks up the TimeSignature active for every note, clef, etc., and then prints the time it takes to run:

>>> from music21 import corpus
>>> from time import time as t
>>> c = corpus.parse('luca/gloria')
>>> def allContext(c):
...     # find the TimeSignature in force for every element in the score
...     for n in c.recurse():
...         k = n.getContextByClass('TimeSignature')
... 
>>> x = t(); allContext(c); print(t() - x)

with the 1.8 release of Music21:
42.9 seconds

with the newest version in GitHub:
0.70 seconds

There’s a lot of caching that happens along the way, so the second call is much faster:

second call with 1.8 release:
44.6 seconds (the same, within the margin of error)
with the newest version in GitHub if the score hasn’t changed:
0.18 seconds

You’ll see the speedup most dramatically in places where every combination of notes, etc. needs to be found.  For instance, finding all the parallel fifths in a large eight-part score could have taken hours before; now you’ll likely get results in a few seconds.
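To give a flavor of the kind of query that benefits, here's a rough sketch of scanning two parts for parallel fifths with voiceLeading.VoiceLeadingQuartet.  The naive note-against-note pairing and the variable names are my own shorthand, not a recipe from the documentation, so treat it as illustrative only:

>>> from music21 import corpus, voiceLeading
>>> s = corpus.parse('luca/gloria')
>>> top = s.parts[0].flat.notes
>>> bottom = s.parts[1].flat.notes
>>> # naively pair the two parts note-against-note and test each consecutive
>>> # pair of simultaneities for a parallel fifth
>>> for i in range(min(len(top), len(bottom)) - 1):
...     vlq = voiceLeading.VoiceLeadingQuartet(top[i], top[i + 1],
...                                            bottom[i], bottom[i + 1])
...     if vlq.parallelFifth():
...         print(top[i + 1].measureNumber)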

I have not heard of any issues arising from the change in sorting described in the April 26 posting, so people who were afraid of updating can breathe a bit more easily and update to the music21 version from yesterday or later.  The newer version, like all GitHub commits, should be used with caution until we make a release.

Thanks to the NEH and the Digging into Data Challenge for supporting the creation of tools for working with much bigger scores than before.

In other news: 

Music21j — a JavaScript implementation of music21’s core features — is moving rapidly toward a public release.  See http://prolatio.blogspot.com/2014/05/web-pages-with-musical-examples.html for an example of usage.  We’ll be integrating it with the Python version over the summer.

Ian Quinn’s review of Music21 appeared in the Journal of the American Musicological Society yesterday.  Prior to this issue, the journal had never reviewed anything that wasn’t a book.  It’s a great feeling to have people not on this list know about the software as well.

Oh, and MIT was foolhardy enough to give me tenure, largely on the basis of music21!  If you’re an academic working on a large digital project, I still advise proceeding with caution, but know that it can be done.  Thanks, everyone, for your support.

Sunday, July 29, 2012

Music21 in LinuxMagazin.de (auf Deutsch)

A short introduction to music21 has appeared in a blog post at Linux-Magazin:
http://www.linux-magazin.de/NEWS/Music-21-Python-Toolkit-fuer-Musikwissenschaftler

My favorite sentence:
"In den vergangenen Jahren habe sich der Einsatz von Informationstechnologie in Geistes- und Kunstwissenschaften von einem randständigen Hobby interessierter Geeks zu einem anerkannten Werkzeug für alle Forscher entwickelt, schreibt der Projektleiter Michael Scott Cuthbert"
(Roughly: "In recent years, the use of information technology in the humanities and the arts has developed from a fringe hobby of interested geeks into an accepted tool for all researchers, writes project leader Michael Scott Cuthbert.")

If only I could actually write that well auf Deutsch!

The music21 team had a great time in Germany visiting with our colleagues at LMU München and at the DH2012 conference in Hamburg, in addition to sampling Currywurst in Berlin, Bach arcana in Leipzig, and fine beer and warm people everywhere.  Thank you to our German friends and the German government for supporting our work.

Tuesday, July 10, 2012

Music21 in the Boston Globe; Lilypond

Sunday's Boston Globe has an excellent article by Leon Neyfakh titled "When Computers Listen to Music, What do they Hear?", which includes a great discussion of the latest techniques in computational musicology, including a number of references to music21 (and a graphic that appears only in the print edition).  Here's one of my quotes in the article:

“You get a bird’s eye view of something where the details are so fascinating—where the individual pieces are so engrossing—that it’s very hard for us to see, or in this case hear, the big picture...of context, of history, of what else is going on,” said Cuthbert. “Computers are dispassionate. They can let us hear things across pieces in a way that we can’t by even the closest study of an individual piece.”
For anyone who already knows about music21: I'd appreciate it if any adventurous Lilypond hackers/users would be willing to upgrade to the newest SVN and test out the Lilypond support there.  We've completely rewritten our Lilypond exporter as an object-oriented system, with the aim of getting it caught up to MusicXML in the near future.  It's a ground-up reconception, so there may be some bugs.  It still doesn't support a lot of things, but it's now a flexible system that we can expand in the future.
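The quickest smoke test I can suggest is simply writing a parsed score out through the exporter.  This sketch assumes LilyPond is installed and on your PATH, and that 'lily.pdf' is the format string the converter registers; if your copy complains, check the converter documentation for the exact format names:

>>> from music21 import corpus
>>> s = corpus.parse('bach/bwv66.6')
>>> # send the score through the rewritten exporter and render a PDF;
>>> # requires a working LilyPond installation on the system PATH
>>> s.write('lily.pdf')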