
Baltimore’s Patrick McMinn: Pawn Shop Guitars, Robot Pianos, and the “Ultimate Democratization of Music”


Patrick McMinn moved to Baltimore from Austin in 2010 to study music composition at the Peabody Institute. But an early performance at a Baltimore house venue may have been just as edifying. “Just a week or two after I moved to Baltimore, I was invited to play a solo show at America, a space on the west side of town. I ended up playing there a few times before I started school, and it very much changed my perception of academia,” Patrick recalls. “I played there with kids who had cobbled-together, home-built synths and were making the most amazing, exciting music. In sound and timbre, it wasn’t too far removed from what the electronic composers at Peabody were making, but there is such a sharp divide between the academic circle and the experimental musicians outside of it. There’s not much dialogue at all. I’d love to try to change that.”

That ambivalence (in the positive, active sense of the word) is a basic element of Patrick’s musical identity — just as you’d expect from someone who, despite his newly minted graduate degree in Computer Music Composition, has been an autodidact from the beginning. An early interest in Sonic Youth inspired Patrick to coax outside harmonies and timbres from “really cheap, awful pawn shop guitars” — he was particularly enamored with the musical effect he got from “de-tuning the strings and grinding the headstock into the floor.”

Patrick’s recent musical investigations have focused on strange and sometimes sublime interactions between live musicians and computers. While at Peabody, Patrick composed a couple of exciting pieces for the Disklavier, an instrument he describes as “a fully acoustic robot piano”: a real grand piano that can be used both as a highly sensitive, precise player piano and as a sort of player piano in reverse, tracking and responding to a live pianist in real time. He also put together a suite for live string quartet and electronics. (Listen below.)

For the past few months, Patrick has been putting his odd collection of skills to use coding some of the features of Dan Deacon’s new interactive smartphone app, which will be used by the audience during his live shows. He worked alongside primary app developer Keith Lea.

Patrick took some time out to answer my questions regarding his own music and his role in developing the app.

You’re both an adventurous, technically minded composer and a skilled programmer. Do you see those two worlds as fundamentally linked?

For me, coding and composing are the same thing. Most of the music software I’ve built involves altering and processing the sound of a live performer, and I’ve always viewed these systems as meta-instruments of a sort — an instrument to play an instrument. Most of my music is written by improvising with an instrument through a software system, and seeing where the limits are, seeing where the system cracks or breaks. This, for me, is where the musical output is most interesting. Then I’m able to go back and allow for these fractures in the system, which in turn changes what I might play through it.

Through improvising in this fashion, I feel like I’m chipping away at the sound and the code until the music emerges. It’s often instinctual, and what comes out is usually really surprising to me, which can be both disconcerting and fantastic.

The conservatory system, I think for its own comfort and safety, really sets up a hierarchy of musical value, with the Western standard repertoire set on an absurd pedestal and the rest of musical culture (whether non-Western, popular, new, or experimental) mostly patronized, disdained, or ignored. Certainly this isn’t true everywhere, and I’m sure it’s changing, but I’ve encountered this mentality over and over throughout my education. Working with computers for performance really allowed me to start breaking through whatever subconscious limitations I had burdened myself with.


Tell us about your string quartet.

In the string quartet, each player has a clip-on microphone and can be individually processed by the computer in real time. I was working with such amazing players that I really wanted to give them as much control as possible, not only over the musical expression of their instruments, but also over the software system they were playing through.

To do this, I mapped certain natural parameters of their playing to effects on the computer. For example, the volume of the violin could control how much of a certain effect is applied to its own sound. Or the pitch of the cello could be mapped onto the filter cutoff of an effect on one of the other instruments. In this way, each player’s actions shape not only their own sound, but the processed sound of the entire ensemble.
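
To make those two mappings concrete, here is a minimal sketch in Python with NumPy. It is not Patrick’s software, which runs in real time on live microphone signals; the synthetic “players,” the RMS envelope follower, the tanh distortion, and the zero-crossing pitch tracker are all stand-in assumptions for illustration.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def rms_envelope(signal, window=1024):
    """Follow a signal's loudness with a sliding RMS window."""
    padded = np.concatenate([np.zeros(window - 1), signal ** 2])
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(padded, kernel, mode="valid"))

def estimate_pitch(signal):
    """Crude pitch estimate from zero crossings; fine for a sketch."""
    signs = np.signbit(signal).astype(np.int8)
    crossings = np.where(np.diff(signs) != 0)[0]
    if len(crossings) < 2:
        return 0.0
    period = 2 * np.mean(np.diff(crossings))  # samples per cycle
    return SR / period

def one_pole_lowpass(signal, cutoff_hz):
    """Very simple one-pole lowpass filter."""
    a = np.exp(-2 * np.pi * cutoff_hz / SR)
    out = np.zeros_like(signal)
    prev = 0.0
    for i, x in enumerate(signal):
        prev = (1 - a) * x + a * prev
        out[i] = prev
    return out

# Two synthetic "players": a violin-ish A4 playing a crescendo,
# and a cello-ish C3 holding a steady tone.
t = np.arange(SR) / SR
violin = np.sin(2 * np.pi * 440.0 * t) * np.linspace(0.1, 1.0, SR)
cello = np.sin(2 * np.pi * 130.8 * t)

# Mapping 1: the violinist's volume controls how much distortion
# is blended into their own sound (the louder they play, the more).
distorted = np.tanh(5 * violin)
mix = np.clip(2 * rms_envelope(violin), 0, 1)
violin_out = (1 - mix) * violin + mix * distorted

# Mapping 2: the cellist's pitch sets the filter cutoff applied to
# the violin's processed sound (a higher note opens the filter).
cutoff = np.clip(8 * estimate_pitch(cello), 200, 8000)
ensemble_out = one_pole_lowpass(violin_out, cutoff)
```

The key point of the design, as Patrick explains next, is that every control signal here comes from expressive choices the players are already making on their instruments.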

Performers complain all the time about electronic music taking away their natural expression, by making them play to a click or backing tracks, or by constraining their actions in some way. I wanted to make sure that each player in the ensemble was responsible for and actively controlling how their instrument was being modified, while utilizing facets of their playing that come naturally in the first place.

What did you create for the Dan Deacon app?

My role was to handle the sound-generating parts, both in the Instrument section and for Dan’s live performances. In performance, Dan is able to improvise with a set of oscillators [which create musical tones] and have each person’s phone respond in real time by generating a random sequence that will [harmonize with] whatever note he lands on.

In addition, each person in the audience can control the speed and waveshape of the sequence by tilting the phone around at different angles. The effect is of a shimmering, constantly shifting chord that changes depending on where one is located in the audience.
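
The app itself was built with Keith Lea and its internals aren’t spelled out in this interview, so the following is only a rough Python sketch of the behavior Patrick describes; the pentatonic scale, the tilt ranges, and the sine-to-square morph are assumptions made for illustration.

```python
import numpy as np

SR = 44100  # sample rate in Hz

def harmonizing_sequence(root_hz, length=8, rng=None):
    """Build a random note sequence from a pentatonic scale on the
    performer's note, so every phone harmonizes with it by construction."""
    if rng is None:
        rng = np.random.default_rng()
    pentatonic = np.array([0, 2, 4, 7, 9])      # scale degrees, in semitones
    degrees = rng.choice(pentatonic, size=length)
    octaves = rng.choice([0, 12], size=length)  # spread over two octaves
    return root_hz * 2.0 ** ((degrees + octaves) / 12.0)

def render(sequence_hz, tilt_deg):
    """Tilt controls the sequence: flatter is slower and more sine-like,
    more upright is faster and more square-like."""
    rate = np.interp(tilt_deg, [0, 90], [2.0, 12.0])   # notes per second
    shape = np.interp(tilt_deg, [0, 90], [0.0, 1.0])   # 0 = sine, 1 = square
    samples_per_note = int(SR / rate)
    t = np.arange(samples_per_note) / SR
    notes = []
    for f in sequence_hz:
        sine = np.sin(2 * np.pi * f * t)
        square = np.sign(sine)
        notes.append((1 - shape) * sine + shape * square)
    return np.concatenate(notes)

# If Dan lands on A3 (220 Hz) and a listener holds the phone at 45 degrees:
sequence = harmonizing_sequence(220.0)
audio = render(sequence, tilt_deg=45.0)
```

Because each phone seeds its own random sequence, no two devices play exactly the same line, which is one plausible reading of the “shimmering, constantly shifting chord” Patrick describes.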

The “instrument” function is based on this improvisation section, and allows users to choose the notes in the sequence, as well as how long the sequence is. Also, I can’t get through talking about the app without mentioning Keith Lea’s work on it, which is brilliant, and is the only thing that made my work on the internal instrument possible.

Do you see the ever-greater proliferation of smartphones and tablets as a potential game changer for creative music?

Working on Dan’s app, I really began to be excited by the potential of smartphones for music and art making. Each phone is just a pocket-sized computer carried around by a constantly growing portion of the population. Dan’s app allows each person to have a unique aesthetic experience and still feel connected to the audience and performers around them, which is a revolutionary idea to me.

Many dead horses have been beaten over the “death” of the traditional album and the fall of album sales in an era where the entire history of music is at one’s fingertips. I’m not sure people want a single, unchanging artifact of music anymore. An app on a phone can create a singular experience for the person using it in a way that a static song or album never could.

I’m excited by the idea of writing pieces that would change and react based on the actions of the user, or the user’s location, or by how many other smartphone users are in the immediate vicinity, or the weather, or information from a remote server, or, really, the almost infinite amount of data online that a user could interact with. Musical apps could be the ultimate democratization of music; each person as composer and performer at the same time.

Patrick McMinn’s four-piece band Impatience Machine opens Hampdenfest at 11 a.m. on Saturday, September 8.

