
Watch A Mind-Blowing Visualization Of 'The Rite Of Spring'

Composer, pianist and software engineer Stephen Malinowski has created one brilliant solution to an age-old problem: how to communicate and understand what's going on in a piece of music, particularly if you don't know standard musical notation. Over the course of some forty years, he's honed what he calls his "Music Animation Machine" from a 20-foot printed scroll to software and iPad apps — but the results are art.

His animations posted on YouTube have garnered more than 100 million views, from Debussy's "Clair de lune" to the Allegretto movement of Beethoven's Seventh Symphony. And Björk enjoyed his work so much that she commissioned him to create the animations for her "Biophilia" project.

Most recently, Malinowski has created animations for Stravinsky's ballet The Rite of Spring, just in time for the ballet's 100th anniversary on May 29. (He even created a one-minute version for our own #ritenpr project.) Through this visualization, you can start to follow and understand the composer's dazzlingly dense interplay of melody and instrumentation, and the relationships between the instruments.

I reached the California-based Malinowski by phone a few days ago to discuss his work on The Rite of Spring, and to talk a bit more about how his Music Animation Machine came to be — and how it's evolved in the four decades since its original incarnation.

I think what you create is just incredible and brilliant — especially as a way for non-musicians to get a better grasp of what they're hearing. From your vantage point, what's the benefit of visualizing scores?

People usually respond to sound in a unitary way. It's the reason why you can't follow more than one conversation at a time at a party, for example. But with vision, your brain is trained to comprehend multiple things at once: you can take in many more elements simultaneously. In music, there's often much more going on than you can grasp in that moment of hearing. When you have a visualization, your eyes lead your ears through the music. You take advantage of your brain's ability to process multiple pieces of visual information simultaneously.

When I was young and studying piano and really getting into music, I started reading along with scores as I listened to music — that's the way you're traditionally supposed to learn what's going on in a piece. But I started to get really frustrated by very complicated scores. And the symbols in conventional musical notation only go so far. For example, an eighth note takes up the same amount of space on a page as a whole note, though a whole note sounds for much longer. So you have to learn all those symbols in order to perceive what's going on, which can be really frustrating. And the score can never clue you in to the differences in instrumental timbre — how different a note of the same pitch can sound if played by a trumpet versus a violin, for example. And I came to realize that scores are really for musicians, not for listeners. When that information is presented as graphs, it's very easy to understand.
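As a rough illustration of that graph idea, here is a toy sketch in Python (not Malinowski's actual software) in which each note is drawn as a horizontal bar whose length is proportional to its duration, so an eighth note visibly occupies half the space of a quarter note; the note list is invented for the example.

```python
# Toy piano-roll sketch (illustrative only): bar length is proportional
# to duration, unlike the fixed symbol widths of conventional notation.
import matplotlib.pyplot as plt

# (start_beat, duration_in_beats, MIDI pitch) -- an invented four-note figure
notes = [
    (0.0, 0.5, 72),  # eighth note
    (0.5, 0.5, 74),  # eighth note
    (1.0, 1.0, 76),  # quarter note
    (0.0, 2.0, 60),  # half note sustained underneath
]

fig, ax = plt.subplots()
for start, duration, pitch in notes:
    # each note becomes a horizontal bar: position = time, length = duration
    ax.barh(pitch, duration, left=start, height=0.8)

ax.set_xlabel("time (beats)")
ax.set_ylabel("pitch (MIDI note number)")
ax.set_title("Note duration shown as length")
plt.show()
```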

Do you have a standard way of notating different pieces of music?

At this point, I have a toolkit of what I call different renderers. I have about 20 now, and each renders a different visual effect. When I start working with a piece of music, I have to figure out which renderers will work within that particular composer's work. And if it turns out that nothing I already have in that toolkit accurately represents what the composer is doing, I write a new renderer.
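A hypothetical sketch of that toolkit idea, again in Python and not drawn from Malinowski's code: each renderer turns the same note data into a different visual effect, and you pick, or add, the one that suits a given piece.

```python
# Hypothetical "toolkit of renderers": each renderer draws the same note data
# differently, and new renderers can be registered when nothing existing fits.
from typing import Callable, Dict, List, Tuple

Note = Tuple[float, float, int]          # (start, duration, pitch) -- assumed shape
Renderer = Callable[[List[Note]], None]

RENDERERS: Dict[str, Renderer] = {}

def register(name: str):
    """Register a renderer in the toolkit under a short name."""
    def wrap(fn: Renderer) -> Renderer:
        RENDERERS[name] = fn
        return fn
    return wrap

@register("bars")
def bar_renderer(notes: List[Note]) -> None:
    # e.g. horizontal bars whose length follows duration
    for start, duration, pitch in notes:
        print(f"bar: pitch={pitch} from beat {start} for {duration} beats")

@register("dots")
def dot_renderer(notes: List[Note]) -> None:
    # e.g. circles whose size follows duration
    for start, duration, pitch in notes:
        print(f"dot: pitch={pitch} at beat {start}, radius={duration / 2}")

# Choosing a renderer for a piece; a new piece might call for a brand-new entry.
RENDERERS["bars"]([(0.0, 0.5, 72), (0.5, 1.0, 76)])
```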

But I feel like I'm only at the very beginning of what's possible. I'm working with the very basics, technologically speaking — it's like I'm at the caveman level of evolution, just as they figured out that if you took an animal bone and blew into it, it would make a sound, and that you could make other nice sounds if you banged some holes into the bone and then wiggled your fingers over the holes.

There are certain elements that the Music Animation Machine can do quite well, like tonality and schemes of harmonic pitch. But it doesn't touch rhythm at all, and of course that's such a strong part of music.

You've mentioned to me that you didn't really know The Rite of Spring particularly well before you took on this visualization project in time for the piece's 100th anniversary. So what did you learn about the architecture of the piece during the process of translating it into visuals?

I was not aware of the kind of harmonic things Stravinsky has going on. There's a lot of bitonality — he'll have multiple tonal areas going one after the other, and then they'll coexist for a while. And I have rendered each of those in different color schemes, so you can see them as they exist independently and then come together, weaving in and out. There are also a lot of places in the score with very subtle shifts in instrumentation and texture, and the software can represent those differences in timbre. It's incredible — Stravinsky continually torques you, startles you and frustrates your anticipations.
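One way to picture that color-scheme idea (purely an assumed sketch, not the actual rendering code): assign each tonal area its own palette, so that coexisting keys stay visually distinct as they separate and overlap.

```python
# Assumed sketch of per-tonal-area coloring: notes belonging to different
# simultaneous keys are tinted from different palettes.
TONAL_AREA_COLORS = {
    "area A": "#1f77b4",  # one key gets the blue family
    "area B": "#d62728",  # a coexisting key gets the red family
}

def color_for(tonal_area: str) -> str:
    """Return the color used for a note in the given tonal area."""
    return TONAL_AREA_COLORS.get(tonal_area, "#7f7f7f")  # grey if unassigned

print(color_for("area A"))  # notes in the first key stay blue
print(color_for("area B"))  # notes in the second key stay red
```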

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Anastasia Tsioulcas is a reporter on NPR's Arts desk. She is intensely interested in the arts at the intersection of culture, politics, economics and identity, and primarily reports on music. Recently, she has extensively covered gender issues and #MeToo in the music industry, including backstage tumult and alleged secret deals in the wake of sexual misconduct allegations against megastar singer Plácido Domingo; gender inequity issues at the Grammy Awards and the myriad accusations of sexual misconduct against singer R. Kelly.