DinnerTable is a fun and crazy musical instrument, or, to be consistent with the literature, a new interface for musical expression. I built it together with Pablo Alonso and Gerard Erruz as the final project for a course on Real-Time Interaction at the Music Technology Group at UPF.
Its goal is to provide a familiar, tangible and organic way for musicians to interact with software instruments. It is a multiplayer instrument: some players can pluck the strings while others tap on the wood. Granular synthesis generates artificial sounds from the interaction with the strings. Because the synthesized sound retains some properties of the string, such as its pitch and dynamics, the result is both complex and easy to control. Machine learning techniques classify the different gestures applied to the wood, allowing the percussion players to trigger different sounds when knocking, tapping or thumping.
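The core idea behind the string side, granular synthesis, can be sketched in a few lines: short windowed "grains" are cut from a source sound and scattered across an output buffer, so the texture becomes new while traces of the source (pitch, envelope) survive. This is a minimal Python/numpy sketch, not the actual PureData patch; the function name `granulate`, the grain parameters, and the decaying-sine "pluck" standing in for a real string recording are all illustrative assumptions.

```python
import numpy as np

def granulate(signal, sr, grain_ms=50, density=200, out_s=1.0, seed=0):
    """Naive granular synthesis: overlap-add Hann-windowed grains,
    each copied from a random position in `signal` to a random
    position in the output buffer."""
    rng = np.random.default_rng(seed)
    grain_len = int(sr * grain_ms / 1000)
    window = np.hanning(grain_len)          # smooth grain edges to avoid clicks
    out = np.zeros(int(sr * out_s))
    for _ in range(int(density * out_s)):
        src = rng.integers(0, len(signal) - grain_len)
        dst = rng.integers(0, len(out) - grain_len)
        out[dst:dst + grain_len] += signal[src:src + grain_len] * window
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out  # normalize to [-1, 1]

# A plucked-string-like source: a decaying 220 Hz sine (placeholder for
# a real string signal captured from the table).
sr = 22050
t = np.arange(sr) / sr
pluck = np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)
cloud = granulate(pluck, sr)
```

Because every grain is literally a slice of the source, the grain cloud keeps the string's 220 Hz character while its rhythm and density become synthetic, which is the feedback property described above.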
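The percussion side can be sketched the same way: extract a couple of cheap features from each detected hit and classify them into gestures. The sketch below uses a toy nearest-centroid classifier on RMS energy and spectral centroid; the helper names (`synth_hit`, `features`), the feature choice, the classifier, and the synthetic stand-in hits are all assumptions for illustration, not the actual model we trained.

```python
import numpy as np

SR = 22050

def synth_hit(freq, amp, decay, dur=0.1, sr=SR):
    """Toy stand-in for a recorded percussion hit: a decaying sine burst."""
    t = np.arange(int(dur * sr)) / sr
    return amp * np.sin(2 * np.pi * freq * t) * np.exp(-decay * t)

def features(hit, sr=SR):
    """Two cheap onset features: RMS energy and normalized spectral centroid."""
    rms = np.sqrt(np.mean(hit ** 2))
    spec = np.abs(np.fft.rfft(hit))
    freqs = np.fft.rfftfreq(len(hit), 1 / sr)
    centroid = np.sum(freqs * spec) / np.sum(spec)
    return np.array([rms, centroid / (sr / 2)])

class NearestCentroid:
    """Minimal classifier: label a hit by the closest per-class mean feature."""
    def fit(self, X, y):
        y = np.asarray(y)
        self.centroids = {c: X[y == c].mean(axis=0) for c in set(y)}
        return self

    def predict(self, x):
        return min(self.centroids,
                   key=lambda c: np.linalg.norm(x - self.centroids[c]))

# Synthetic training hits for the three gestures (parameters are made up:
# knocks loud and low, taps quiet and bright, thumps loud and slow to decay).
train = [
    ("knock", synth_hit(150, 1.0, 30)),  ("knock", synth_hit(140, 1.05, 28)),
    ("tap",   synth_hit(2000, 0.4, 60)), ("tap",   synth_hit(2200, 0.35, 55)),
    ("thump", synth_hit(80, 1.2, 15)),   ("thump", synth_hit(75, 1.25, 14)),
]
X = np.array([features(h) for _, h in train])
y = [label for label, _ in train]
clf = NearestCentroid().fit(X, y)
```

In a real setup the training examples would be hits recorded from the table itself, and each predicted label would be mapped to a different percussion sound.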
The assignment was very open: apply the knowledge you’ve been given on synthesis (from additive to granular) and interfaces (from strings to touchscreens), and make whatever musical instrument you can think of. Our starting point was interacting with a physical instrument to create synthetic sounds. To deliver the right feedback to the player, we soon added the idea of mapping the synthetic sounds back onto the properties of the physical elements that created them. After some DIY lutherie and playing around with different ideas in PureData, the DinnerTable was completed. We plan to improve it by using better woods and electronics, adding more strings and further exploring the synthesis engines.
More about DinnerTable here: