
Kinetic Pavilion

The Kinetic Pavilion is the result of months of research by Elise Elsacker and Yannick Bontinckx, carried out for the school assignment “Parametric Design and Digital Fabrication” at the mmlab of Saint-Lucas Ghent, University College of Science and Art.

The Kinetic Pavilion project explores a new kind of pavilion capable of reacting to changing weather conditions, human movement, and human moods. Its shape is driven by ecological parameters extracted from the pavilion’s surroundings. Like any other organism, this prototype changes itself when those parameters take on new values.
The scale model (90 × 120 cm) was built on a rational grid, which allowed us to test the parameters listed below.
1. Based on weather data.
The pavilion follows different reaction patterns in cold and warm climates.
– In cold environments: The pavilion tracks the sun’s alignment and tries to capture as much solar irradiation (heat gain) as possible. Spots with higher irradiation levels raise the roof structure locally, so the pavilion expands and attracts more heat.
– In warm environments: The pavilion takes on an aerodynamic shape so that the available wind cools it down. Spots with high irradiation levels reshape the roof structure to create shaded areas.
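To make the two reaction patterns concrete, here is a minimal Python sketch of how normalized irradiation readings could be mapped to roof heights per grid point. The function name, base height, and gain are illustrative assumptions, not values from the project.

```python
def roof_height(irradiation, mode, base=0.5, gain=0.4):
    """Map a normalized irradiation level (0..1) to a roof height in metres.

    In 'cold' mode, higher irradiation raises the roof so the pavilion
    expands and captures more heat; in 'warm' mode it lowers the roof
    to create shaded spots. All constants are illustrative.
    """
    if mode == "cold":
        return base + gain * irradiation   # expand toward the sun
    elif mode == "warm":
        return base - gain * irradiation   # duck down to cast shadow
    raise ValueError("mode must be 'cold' or 'warm'")

# One height per grid point of the roof structure
cold_heights = [roof_height(i / 4, "cold") for i in range(5)]
warm_heights = [roof_height(i / 4, "warm") for i in range(5)]
```

In the scale model, each of these heights would ultimately drive one actuator in the roof grid.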
2. Human movement
– Movement in space: The architecture also reacts to movement. The roof structure responds to the dynamic movements of the people using the pavilion, creating a dialogue between the users, the architecture, and their perception of the space they occupy.
To illustrate this idea, we use an iPad to recreate human movements by touching and sliding over the screen. The finger’s coordinates are sent via TouchOSC to Grasshopper, which controls the pavilion’s height data.
– Body movements: Webcams can process movement data into preset shape patterns of the pavilion. Dancing people, for example, can trigger the pavilion to react to their dynamic movements.
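A minimal sketch of how a single touch position could be turned into per-cell height offsets for the roof grid: cells near the finger rise and the effect falls off with distance. The grid size, peak, and falloff constants are illustrative assumptions; the 7 × 4 grid is chosen only so the cell count matches the 28 servos mentioned below.

```python
import math

def touch_to_heights(x, y, grid_w=7, grid_h=4, peak=0.3, falloff=1.5):
    """Turn a normalized touch position (x, y in 0..1) into height
    offsets for a grid_w x grid_h roof grid, using a Gaussian-style
    falloff around the finger. Names and constants are illustrative."""
    heights = []
    for row in range(grid_h):
        for col in range(grid_w):
            # Cell centre in the same normalized 0..1 space as the touch
            cx, cy = (col + 0.5) / grid_w, (row + 0.5) / grid_h
            d = math.hypot(cx - x, cy - y)
            heights.append(peak * math.exp(-falloff * d * d))
    return heights
```

In the real setup these offsets would be streamed onward to the actuators each time a new touch coordinate arrives.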
3. Human behaviour and interactions.
The pavilion can filter Twitter feeds and pick up trends. For example, when a certain number of pavilion users tweet ‘party’ or a synonym, the pavilion starts moving actively; when users tweet messages containing words like ‘tired’, ‘lazy’, or ‘sleepy’, it starts swaying gently.
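The keyword-to-mode idea can be sketched in a few lines of Python. The word lists, threshold, and mode names are illustrative assumptions, not the project’s actual Twitter filter.

```python
ACTIVE_WORDS = {"party", "dance", "celebrate"}   # illustrative synonym set
CALM_WORDS = {"tired", "lazy", "sleepy"}

def movement_mode(tweets, threshold=3):
    """Count keyword hits in a batch of recent tweets and pick a
    movement mode for the pavilion. A sketch of the idea only."""
    active = sum(any(w in t.lower() for w in ACTIVE_WORDS) for t in tweets)
    calm = sum(any(w in t.lower() for w in CALM_WORDS) for t in tweets)
    if active >= threshold and active >= calm:
        return "active"      # lively, energetic motion
    if calm >= threshold:
        return "swaying"     # slow, swaying motion
    return "idle"
```

The chosen mode would then select one of the pavilion’s preset motion patterns.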

How it works:
1. Input: iPad, Ecotect, Processing (sine function, webcam, Twitter feeds, …)
2. Process:
a) Input parameters are sent to Grasshopper via TouchOSC, gHowl, Geco (GH2Ecotect) and UDP.
b) Different kinds of data are translated into height coordinates.
c) These processed coordinates are sent through Firefly to an Arduino board.
3. Output:
a) The Arduino controls 28 servos.
b) Spur gears translate the servo rotations into vertical movement, driving the roof structure.
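The last step, converting a target height into a servo angle through a spur gear, can be sketched as follows. The pitch radius and the 0–180° servo range are illustrative assumptions about the scale model’s hardware.

```python
import math

def height_to_servo_angle(height_mm, pitch_radius_mm=10.0):
    """Convert a vertical displacement (mm) into the servo angle (degrees)
    needed to produce it through a spur gear of the given pitch radius,
    using arc length = radius * angle. Constants are illustrative."""
    angle = math.degrees(height_mm / pitch_radius_mm)
    return max(0.0, min(180.0, angle))  # clamp to a typical servo range
```

With 28 such conversions per update, each servo receives the angle that lifts its roof node to the processed height coordinate.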

For more information you can contact Elise or Yannick at their project website.

  • Very cool. I love seeing work that closes the loop between physical sensor data and parametric processing by manipulating a physical model in real-time.

  • Love it. How many possibilities this opens. Bravo!