In nearly all introductory music theory courses, students are taught the rules of common-practice contrapuntal composition. In 1725, Johann Joseph Fux published what is often considered the first “textbook” on composition, Gradus ad Parnassum, in which he laid out the rules of counterpoint in the style of Palestrina. Surprisingly, the approach to teaching music theory has not changed much since its publication. Students learn the rules by reading a textbook, listening to musical excerpts, and studying with their teacher. They are asked to complete written compositional assignments that adhere to these strict rules, and the teacher must then go through each assignment, checking it against every rule. The entire process is laborious and tedious, which can be discouraging for both student and teacher.
The music21 Theory Analyzer uses the Python-based music21 toolkit to transform the way students and teachers approach common-practice music theory education. The package pre-grades student assignments by analyzing them for common-practice errors, checking the accuracy of textual responses, and returning the results to the student’s professor.
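At its core, the analysis works on ordinary music21 score objects. The short sketch below shows how a submitted exercise might be loaded for inspection; the file name is hypothetical, and in practice the package receives submissions through the MuseScore plugin rather than from a local file.

from music21 import converter

# Parse the student's submission (MusicXML, as exported by MuseScore)
score = converter.parse('student_exercise.xml')  # hypothetical file name

# List each part and how many notes the student wrote
for part in score.parts:
    print(part.partName, len(part.flatten().notes), 'notes')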
The project began at the Boston Music HackDay in November 2011, where a small proof-of-concept music theory checker site was developed. Since then, the project has expanded in functionality and features. The package’s curriculum is specifically tailored to one of the most commonly used music theory textbooks, The Musician’s Guide to Theory and Analysis, published by W.W. Norton & Company, Inc.
The package is currently implemented as a plugin for the open-source music notation editor MuseScore. Through the plugin, students navigate to the exercise they wish to complete, and the exercise is loaded from the music21 server.
The student reads the instructions for the exercise and completes the assignment, which often involves part-writing above or below a cantus firmus as well as several textual components, such as labeling harmonic intervals. The student may then submit the assignment to their professor via email from within the plugin.
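Harmonic-interval labels of this kind correspond directly to intervals music21 can compute from the submitted notes. The following sketch assumes a note-against-note, two-part exercise and the same hypothetical file name as above; the Theory Analyzer’s own routines may differ in detail.

from music21 import converter, interval

score = converter.parse('student_exercise.xml')  # hypothetical submission
upper = list(score.parts[0].flatten().notes)
lower = list(score.parts[1].flatten().notes)

for nLower, nUpper in zip(lower, upper):
    harm = interval.Interval(noteStart=nLower, noteEnd=nUpper)
    # harm.name yields labels such as 'P5' or 'M3', which can be compared
    # against the label the student typed for this note pair
    print(nLower.nameWithOctave, nUpper.nameWithOctave, harm.name)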
The professor then receives an email with the results of the music21 Theory Analyzer, containing the list of comments the analysis generated about the student’s assignment.
The package’s modular design allows different assignments to be easily analyzed against different subsets of music theory rules. For example, a typical novice-level part-writing assignment might be checked for basic counterpoint errors, such as parallel motion by fifth or octave and improper resolutions of dissonant harmonic intervals. The assignment would be checked only against the counterpoint rules learned by that point in the course, disregarding more complex rules taught later. The package can also analyze textual input submitted by the student, dynamically determining its accuracy by comparing the responses to the notes the student actually wrote.
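One way such a counterpoint check can be expressed with music21 is through its voiceLeading module, which compares consecutive pairs of notes across two voices. The sketch below only illustrates the idea under a note-against-note assumption; the Theory Analyzer’s actual rule routines and reporting format may differ.

from music21 import converter, voiceLeading

score = converter.parse('student_exercise.xml')  # hypothetical submission
upper = list(score.parts[0].flatten().notes)
lower = list(score.parts[1].flatten().notes)

comments = []
for i in range(min(len(upper), len(lower)) - 1):
    # Two consecutive notes from each voice form a voice-leading quartet
    vlq = voiceLeading.VoiceLeadingQuartet(upper[i], upper[i + 1],
                                           lower[i], lower[i + 1])
    if vlq.parallelFifth():
        comments.append(f'Parallel fifths between notes {i + 1} and {i + 2}')
    if vlq.parallelOctave():
        comments.append(f'Parallel octaves between notes {i + 1} and {i + 2}')

print('\n'.join(comments) if comments else 'No parallel fifths or octaves found')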
Additionally, the results email includes an attachment with an annotated version of the student’s exercise. The score is colored according to the errors identified, allowing the professor to locate the student’s mistakes more easily.
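Coloring of this sort is built into music21’s notation styling. As a rough sketch, flagged notes could be marked and the score re-exported as shown below; the file names and flagged indices are hypothetical, standing in for whatever the analysis routines actually report.

from music21 import converter

score = converter.parse('student_exercise.xml')  # hypothetical submission
notes = list(score.parts[0].flatten().notes)

# Indices of notes implicated in errors; in practice these would come from
# the analysis results rather than being hard-coded
flagged = [2, 5]
for i in flagged:
    notes[i].style.color = 'red'  # colored noteheads appear in the exported score

# Write an annotated copy that could be attached to the results email
score.write('musicxml', fp='student_exercise_annotated.xml')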
The music21 Theory Analyzer is designed as a pre-grading and instructional tool, and it can easily be adapted for use by both students and professors, serving as a valuable educational resource.
The package is currently under development, and we welcome comments and suggestions. Future plans include expanding the analysis routines to cover a larger suite of music theory concepts, and we are also investigating additional interface options beyond MuseScore. The package is being developed as a UROP project by MIT undergraduates Beth Hadley and Lars Johnson, with support from the lab’s principal investigator Michael Scott Cuthbert, lead programmer Chris Ariza, and fellow UROP student Jose Cabal-Ugaz.