Wearable Computing and Dance

For this performance art piece we design an environment where the stage performers are not bound to static expressions of sound and visuals. We work with behavior-based approaches that react to interactive software settings for sound and visuals. These approaches and technologies shape the environment: they adjust how shadows appear, how they are recognized as mirror effects and projections, how they are accepted into the complex of the personality, and how the ego repositions itself toward these elements. We want the audience to experience an emotional and physical sensation in which the sound and visuals interact with the dancer embodying the space we create. This is achieved with original visual and audio content, and with software and hardware interaction. This dynamic interactivity draws both the stage performers and the audience into a deeper level of interest and identification.
In another possible interpretation, Shadows of aiKia is a research into and examination of the relationship between the body, its biological existence, and a system of elements, light and sound, both vibrations in space existing along a timeline. What is the relationship between human consciousness, the space in which it is placed, and its perception of time? We metaphorically recreate the relativity of these dimensions, following the theory of special relativity, testing the limits of speed and trying to overcome the equation relating energy and mass.
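The two relations from special relativity alluded to above, the dilation of time with speed and the equivalence of energy and mass, can be written as:

```latex
t' = \frac{t}{\sqrt{1 - v^2/c^2}}, \qquad E = mc^2
```

Here $t'$ is the time interval observed for a body moving at speed $v$, $c$ is the speed of light, and $E = mc^2$ is the equation relating energy and mass that the piece "tries to win over."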

The visual and audio elements will be connected to the speed and rotation of the dancer's movements, using an Arduino board, one accelerometer, and one gyroscope, so that the reactions of mind and body are carried into the audiovisual environment.

The dance and movement will focus on duality and the juxtaposition of one's ego with its light and/or dark shadow, in a space defined by and in relationship to the visuals and sound.

Space and time, light and sound will become other shadows of the person in the labyrinth of their existence.

We used Max/Jitter and Ableton Live, and built a prototype with an Arduino, one accelerometer, and one gyroscope; both sound and visuals react to the movement of the dancer.