Synbody

Introduction

Synbody studies the synesthetic aspects of body movement and their relation to music. Traditionally, dancers dance following the music. Synbody inverts this relation and asks how the music would be if it emerged from the dancer's movements.

A complete explanation of the project, its theoretical issues, and its evolution can be found on the alphabody website.

To carry out this study, we have implemented a platform that detects the dancer's movements and interprets them inside a sound generation system. The sound is thus generated in real time from the movements of the dancer. A musician is also involved in defining the sound landscape.

DART has developed several wearable sensors that include accelerometers, telemeters, and special switches. The gathered information is sent via Bluetooth, XBee, or WiFi to an external computer. This computer implements the control system, where the data is interpreted and translated into MIDI messages. These messages are used to generate the sound with an Ableton LIVE application or a Pure Data patch.

classic%20dance%20system.png

In the classic model, the dancers move their body following some aspect of the ongoing music. The model proposed in this project closes a loop between the movements of the dancer and the sound generation. Note that the generated sound will also influence the ongoing dance, so the classic loop is included in the Synbody model.

SynBody.jpg synbody%20dance%20system.png

Wearable technology is used to embed the sensors and a wireless control system in the dancer's clothing.

Technical Description

The first prototype of the wearable sensor set is depicted in the following figures:

traje%201%20esquema.png
traje.jpg

These are the sensors, wired to an Arduino controller. We use a SHARP IR sensor as a telemeter and an ADXL3xx as an accelerometer.

arduino.jpg

Details of the control system, based on Arduino. The sensors send their information to an external computer over a Bluetooth connection.

synbodyperf.jpg

The complete dress worn by the artist Javi Aparicio (alphabody). The first presentation of the whole system took place at the International Metabody Forum, Brunel University, London, 8/04/2016, within the program Performance Architectures, Wearables and Gestures of Participation.

Control System

The Control System is in charge of translating the body movements into MIDI messages. This is done by two applications: the Sensor Control System and the MIDI Control System.
The Sensor Control System runs inside a wearable Arduino. It reads all the information from the body sensors and calculates the derivative of the signals. Both the instant sensor values and their derivatives are sent by Bluetooth or another wireless protocol to an external computer.
The MIDI Control System runs on that external computer and is implemented in Processing. This system takes the signals sent by the Sensor Control System and decides which MIDI messages to send. This decision logic is designed and implemented manually by the programmer.

Sensor Control System

There are two types of sensor information: analog and digital. The following sensors are used in the first wearable prototype:

Sensor | Sensor Type | Signal | Range
09269-02.jpg | Accelerometer ADXL335 | Analog | 0 - 1023
sharp-distance-sensor-infrared-light-z.jpg | Infrared SHARP telemeter | Analog | 0 - 1023
fdc.jpg | Switch | Digital | 0 - 1

For more information about the sensors and how to handle them, please visit the Sensors chapter of the dartecne wiki.

Although the possible measurement range of the accelerometer and the telemeter is 0-1023 when read from the Arduino analog inputs, in practice these sensors give values within a band of only a few tens of units. The real range is therefore usually around 200-300.

MIDI Control System

This application, implemented in Processing, receives the instant sensor values and their derivatives calculated by the Sensor Control System. The application has a GUI that allows the user to operate the system: for instance, connecting to the wearable device or viewing the instant sensor values.

GUI%20traje%20v1.PNG

The previous figure shows the values from two 2-axis accelerometers, the values from the IR sensor, and four switches in the squares at the bottom. Each analog value is associated with five columns:

  • The first one represents the raw sensor value.
  • The second one represents the filtered value, which removes the noise variation from the raw value.
  • The following two define the min and max values for calibrating the filtered value.
  • The last column is the MIDI value that is sent, which lies in the range 0-127. This value is calculated by mapping the filtered value into the range configured by the min and max values.

The MIDI Control System sends MIDI messages in two ways: through a direct relation between sensor changes and MIDI messages, and through a more complex algorithm. Direct values work as follows:

  • Control Changes - CC - for each analog sensor variation. For instance, the IR telemeter sends CC-5.
  • NoteOn - triggered by the switches.

The algorithm also detects certain body movement events. For instance, if there is an abrupt change in the accelerometer signal, it sends a specific MIDI noteOn. It also keeps a table of notes and chooses a random note from the table when a switch sensor is pressed. If an obstacle enters a defined region of the IR sensor, a specific MIDI noteOn is sent.

In general, there are many possible ways to link sensor readings to MIDI messages, because the two applications run uncoupled.

Sound Generation

The sound landscape can be generated by any application that accepts MIDI as input. In our first prototypes we have used Ableton LIVE as the software platform for sound generation. In LIVE, MIDI messages are mapped onto the parameters of a project, and the MIDI Control System simply modifies those parameters.

whole%20system.png

Shows

Different versions of the Wearable Wireless MIDI Device have been presented around the world. The following pictures were taken in different places, including London (UK), Seoul (Korea), and Madrid (Spain).

International Metabody Forum 2016 - London


ArtMadrid 2017

synbodymakersmc.jpg

The dress, worn by the dancer Maricruz Planchuelo, was presented at the ArtMadrid fair on February 15th, 2017. (Photo by Miguel Angel Garcia)


SunWei, Taiwan 2018

korea.png

For more videos and pictures, please visit the project website: synbody

Github

You can access all the developed code in the following GitHub project. It includes both the Arduino and the Processing code.

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License