Valse Automatique is a design performance created to illustrate the symbiosis between humans and technology by translating music into form through the use of a KUKA industrial robot.
Invited to the project by Hermann Weizenegger – an industrial designer and design professor running his studio HAW in Berlin – I was responsible for designing the overall interface between the music by composer and violinist MIKI and the robot's production process. Together with the incredibly talented Chris Jeffs, who provided the audio analysis and musical advice, as well as Wolf Deiss and Roman Kühnert of Artis GmbH, we realized what was for me one of the most complex, challenging and inspiring productions I have been part of to date.
Hermann’s vision for the overall performance consisted of five variations of a musical piece by MIKI, represented in five objects. To realize this vision, wax was chosen as the base material, since it allowed rapid manufacturing through milling and the application of heat, as well as possible further use for casting. The performance was therefore conceptualized as a two-stage production process showing MIKI and a pianist alongside the robot manufacturing the objects. In a pre-performance stage, the wax base shapes were milled to reflect the musical atmosphere; in the second stage, the robot finalized the objects with a Bunsen burner in reaction to MIKI’s playing.
Given my initial lack of experience with product design and the related manufacturing processes, I teamed up with Steffen Fiedler to create a formal concept that would work within the timeframe we had. Even though this initial design process happened in Java/Processing with the help of Chris’ analysis tools built in SuperCollider, we quickly realized we would need an entirely different toolset to generate data for such a high-performance manufacturing process. The formal concept was therefore implemented in Rhino/Grasshopper to meet the requirements of the short milling timeframe. The concept consisted of a fluid terrain being distorted according to the music; the terrain was chosen because it gave the greatest flexibility in how the robot could manufacture it. This translation process was also shown in a minimal dynamic visualization I created in Processing, mainly to help the audience understand the process, and it was integrated into the lighting control done by Chris as well. The large-scale renderings for the exhibition were made in Sunflow – a fabulous open-source renderer written in Java.
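To give a rough idea of the terrain concept, the sketch below shows one plausible way a height field might be displaced by per-band audio energy. This is an illustrative reconstruction in plain Java, not the original Grasshopper definition; the band energies, the `distort` helper and the sine falloff are all assumptions made for the example.

```java
// Hypothetical sketch of a music-to-terrain mapping: each row of a height
// grid is raised by the energy of one frequency band. The smooth sine
// falloff across each row avoids steep walls, keeping the surface millable.
public class TerrainDistortion {

    static double[][] distort(double[] bandEnergy, int cols) {
        int rows = bandEnergy.length;
        double[][] height = new double[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                // Symmetric falloff: 0 near the edges, 1 near the centre.
                double falloff = Math.sin(Math.PI * (c + 0.5) / cols);
                height[r][c] = bandEnergy[r] * falloff;
            }
        }
        return height;
    }

    public static void main(String[] args) {
        // Example band energies, e.g. one FFT analysis frame of the music.
        double[] energies = {0.2, 0.8, 0.5, 0.1};
        double[][] terrain = distort(energies, 6);
        for (double[] row : terrain) {
            StringBuilder sb = new StringBuilder();
            for (double h : row) sb.append(String.format("%.2f ", h));
            System.out.println(sb.toString().trim());
        }
    }
}
```

In the actual production, such displacement data would then have been turned into toolpaths for the milling process rather than printed to a console.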