On September 5-6, 2014, Ben Bacon and I participated in the first official Music Hack Day of Berlin!
We partnered up with our awesome friends at ROLI to hack their magnificent new instrument: The Seaboard Grand. Needless to say, it was an amazing experience experimenting with different mappings and sound synthesis methods. You can learn a little more about the instrument by watching the short video below:
At MHD Berlin we came up with a few different input/output mappings. These were all largely inspired by the Seaboard's ability to give the player continuous control over the instrument's sound. Because the Seaboard's silicone/rubber playing surface is highly malleable, we employed gestural metaphors (squeezing, pushing, and even bowing) to guide the intuitive process of our mapping strategies.
The first mapping approach was the most experimental, and repurposed the keyboard to work more like a bowed string. Working in the visual programming IDE Max/MSP, we hooked up the MIDI input (note value, velocity and note on/off), the aftertouch signal (finger pressure on the keys) and 14-bit pitch bend (awesome!) to the physical modeling engine PerColate. Originally designed as the Synthesis Toolkit (STK) by Perry Cook and Gary Scavone, PerColate was ported to Max/MSP by Dan Trueman, and has since become one of the most popular physical-modeling engines around. PerColate gives the user the ability to directly manipulate the model's physical parameters.
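To give a sense of the plumbing involved, here is a minimal Python sketch of how those MIDI streams might be normalized before feeding a physical model. The function names and scaling ranges are our own illustration, not what our Max patch actually contains:

```python
# Hypothetical sketch: scaling the Seaboard's raw MIDI data into the
# normalized 0..1 / -1..1 ranges a physical-model parameter expects.

def pitch_bend_14bit(lsb: int, msb: int) -> float:
    """Combine the two 7-bit pitch-bend bytes into a value in [-1.0, 1.0]."""
    raw = (msb << 7) | lsb           # 0..16383, centred at 8192
    return (raw - 8192) / 8192.0

def aftertouch_to_pressure(at: int) -> float:
    """Map channel aftertouch (0..127) to a normalized bow-pressure value."""
    return at / 127.0

# A centred wheel (msb=64, lsb=0) produces no bend at all:
assert pitch_bend_14bit(0, 64) == 0.0
# Full aftertouch means maximum bow pressure:
assert aftertouch_to_pressure(127) == 1.0
```

The 14-bit resolution is the point here: 16,384 bend steps instead of 128 is what lets continuous finger motion on the Seaboard sound smooth rather than stepped.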
Given that we had a virtual universe of realistic and, uh, imaginative (blotar~ !) engines to choose from, we decided to go with the model of a bowed string. In this case, we wanted to make use of the expansive control surface of the ROLI Seaboard. While the contour of the performance surface contains the general form of a piano keyboard, the areas above and below the keys can be used to manipulate the sound of the instrument as well. We wanted to take advantage of this unique feature by developing a mapping that allowed the user to define a "string length" (i.e. pitch) by choosing a distance and pressing down on two points on the area below the keyboard.
The amount of pressure applied to the keyboard was mapped to the bow pressure parameter, while the average position on the Seaboard was mapped to the bow position parameter on the string. A note can be played by swiping one of the hands (bowing) in either direction on the flat surface after choosing a string length. The change in position influences the pitch slightly, along with the vibrato frequency. In addition, the exponential moving deviation of the position change is calculated (with help from the Digital Orchestra Toolbox), which serves as an excellent variable for mapping to sound intensity. The faster you swipe, the louder the virtual string is bowed: this gives you the feeling of actually bowing the Seaboard.
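The idea behind an exponential moving deviation can be sketched in a few lines of Python. This is our own toy version of the statistic, not the internals of IDMIL's dot.emd object, and the smoothing coefficient is an arbitrary choice:

```python
# Toy exponential moving deviation: an exponentially smoothed average of
# how far the signal strays from its own smoothed mean. Fast swipes on
# the Seaboard produce large deviations, which we map to bow loudness.

class ExpMovingDeviation:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha   # smoothing coefficient, 0 < alpha <= 1
        self.mean = 0.0      # running estimate of the position signal
        self.dev = 0.0       # running estimate of |position - mean|

    def update(self, x: float) -> float:
        self.mean += self.alpha * (x - self.mean)
        self.dev += self.alpha * (abs(x - self.mean) - self.dev)
        return self.dev

slow = ExpMovingDeviation()
fast = ExpMovingDeviation()
for s, f in zip([0.0, 0.1, 0.2, 0.3], [0.0, 1.0, 2.0, 3.0]):
    slow_dev, fast_dev = slow.update(s), fast.update(f)

# The faster swipe (bigger jumps per frame) ends with the larger deviation,
# and therefore the louder bowed tone.
assert fast_dev > slow_dev
```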
For the second approach we developed a more conventional mapping for a keyboard interface. By employing Max/MSP once more, we implemented the scansynth~ external developed by Jean-Michel Couturier. This amazing synthesizer creates sound by using the continuous readings of an ever changing wavetable. This approach is called Scanned Synthesis. A virtual mass-spring damper system can then be manipulated by forces that are controlled by the input parameters of the Seaboard. Overall, the system is represented as a warbling virtual string.
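As a rough illustration of the scanned-synthesis idea, here is a tiny mass-spring string in Python whose displacements serve as an ever-changing wavetable. This is our own toy model under assumed constants, not the internals of Couturier's scansynth~:

```python
# Toy scanned synthesis: a damped mass-spring string evolves slowly over
# time, and its displacement array `pos` is read out as a wavetable.
# Finger pressure on a key is modeled as a force on one of the masses.

N = 16                   # number of masses along the virtual string
pos = [0.0] * N          # displacements: this array IS the wavetable
vel = [0.0] * N
K, DAMP = 0.5, 0.98      # spring stiffness and damping (assumed values)

def step(force: float, at: int) -> None:
    """Advance the string one time step, applying `force` at mass `at`."""
    for i in range(N):
        left = pos[i - 1] if i > 0 else 0.0      # fixed ends
        right = pos[i + 1] if i < N - 1 else 0.0
        accel = K * (left + right - 2 * pos[i])  # spring pull of neighbours
        if i == at:
            accel += force                        # the finger's pressure
        vel[i] = (vel[i] + accel) * DAMP
    for i in range(N):
        pos[i] += vel[i]

step(1.0, N // 2)          # press in the middle of the string
assert pos[N // 2] != 0.0  # the wavetable has started to move
```

An oscillator scanning through `pos` at audio rate would then produce the "warbling" timbre described above, because the table's shape keeps drifting under the physics.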
Finger pressure on the key was mapped to act as a force on the string. By dragging down on the key, the pitch is bent. The phase of the pitch bend acts as a shifting force, which when increased is perceived as a sharpening of the timbre. Furthermore, we implemented polyphony, and the synth is fully suitable for live performance. We christened it ScannedSea.
The final mapping developed for MHD Berlin using the ROLI was an interactive microtonal patch inspired by the gelatinous form of the ROLI keyboard. Envisioning the Seaboard as just a MIDI keyboard is surely an understatement. Yet, MIDI is somewhat of a rigid protocol. Developed long, long ago (sorry, people born before the 80's!) in a galaxy far away, MIDI at its core adheres to the 12-tone western scale, despite the presence of the pitch bending wheel.
The ROLI demands a dynamic performance environment. Therefore, adhering strictly to the 12-tone scale does not suffice! To allow notes, scales, and tonalities of all frequencies to be considered equally, a patch was developed using the Native Instruments Massive synthesizer. In this mapping scheme, the user has the ability to "detune" the playing surface. By pressing the "R" atop the Seaboard, and then swiping up the length of the entire keyboard with varying pressure, each key is detuned either up or down. The resultant scale provides the performer with a customized microtonal environment in which to play.
The challenges of DMI Mapping
Input/output instrument mapping lies at the crux of what makes an instrument successful. What do people hear when a specific action is performed? Are rhythm and timbre integrated with or separated from gestural movement? Is there a disconnect between what we see and what we hear? These questions have often proven quite difficult when a new interface is presented. The ROLI is a great example of, as Perry Cook would say, "leveraging expert technique." By designing around the familiar shape of a piano keyboard, millions of performers are instantly aware of the Seaboard's performance capabilities. The sensitive force sensors under the rubber/silicone padding alter the interface just enough to retain the familiarity of the piano, while instantly tapping into continuous control through the flexible playing surface.
Both of us can say with certainty that this instrument was a blast to work with, and hope to encounter one in our playing and programming careers soon!
Patches of our work from MHD Berlin can be found below, as well as powerful mapping tools from the Input Devices and Music Interaction Laboratory at McGill University.
Scanned Synthesis Original Paper (Bill Verplank & Max Mathews)
scansynth~ by Jean-Michel Couturier
dot.emd (exponential moving deviation) Max object by Digital Orchestra Toolbox (IDMIL)
libmapper (OSC network mapping)
This text was written collaboratively with Benjamin Bacon.