DANCE Platform version 1

On this webpage you can find the instructions necessary to download, install, and run the DANCE software platform. The platform is based on EyesWeb XMI (http://www.infomus.org/eyesweb_eng.php), allowing users to perform synchronized recording, playback, and analysis of multimodal streams of data. Details about the platform architecture and data stream formats are provided in Deliverable 4.1.

The platform supports several research activities of the DANCE Project:

creation of a multimodal repository of recordings of movement qualities;

fine-grain synchronization of multimodal data;

segmentation of the recordings into fragments, according to the chosen qualities;

playback and testing of the repository;

extraction of movement features and qualities;

real-time interactive sonification;

design and development of the DANCE scientific experiments;

design and development of application prototypes;

design and development of artistic projects exploiting the results of the DANCE project (e.g., artistic performances).

1. Download and install EyesWeb XMI

EyesWeb XMI is a modular system that allows both expert (e.g., researchers in computer engineering) and non-expert users (e.g., artists) to create multimodal installations in a visual way. EyesWeb provides software modules, called blocks, that can be assembled intuitively (i.e., operating only with the mouse) to create tools and programs, called patches, that exploit system resources such as multimodal files, webcams, sound cards, multiple displays, and so on.

Once you have finished installing EyesWeb, proceed to step 2 in order to be able to run the DANCE platform patches.

2. Download the DANCE example tools and patches

The DANCE example tools and patches are programs, written to be executed by EyesWeb, that allow the user to record, play back, and analyze multimodal data (video, audio, motion capture, sensors). To run the tools you will need to download the corresponding installers, launch them, and execute the tools as normal Windows applications. To run the patches you will need to download them and load them into the EyesWeb application (see step 1 on how to download and install EyesWeb). The current version of the DANCE example tools and patches includes applications allowing you to perform different tasks:

a) Tools and patches for recording and playing back multimodal data

The tool shows the current framerate (50 frames per second in the example shown), the name used for this trial (trial_000; progressive numbers are automatically assigned to each trial), and the value of the reference clock (HHHH:MM:SS.mmm; 0000:00:07.440 in the picture above).
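To make the reference-clock format concrete, here is a minimal sketch of how a HHHH:MM:SS.mmm clock string can be converted to milliseconds and to a frame index at a given framerate. The function names are illustrative, not part of the DANCE tools.

```python
# Hedged sketch: parse the HHHH:MM:SS.mmm reference clock used by the DANCE
# recording tools. Names here are illustrative, not from the actual tools.

def clock_to_ms(clock: str) -> int:
    """Convert 'HHHH:MM:SS.mmm' to total milliseconds."""
    hours, minutes, rest = clock.split(":")
    seconds, millis = rest.split(".")
    return ((int(hours) * 60 + int(minutes)) * 60 + int(seconds)) * 1000 + int(millis)

def clock_to_frame(clock: str, fps: float = 50.0) -> int:
    """Frame index corresponding to a clock value at the given framerate."""
    return int(clock_to_ms(clock) * fps / 1000)

print(clock_to_ms("0000:00:07.440"))    # 7440
print(clock_to_frame("0000:00:07.440"))  # 372 (at 50 fps)
```

At 50 fps, the clock value shown in the picture (0000:00:07.440) corresponds to frame 372.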

The recording tool records avi files. The video is encoded in MPEG-4 format, the resolution is 1280x720, and the framerate is 50 fps. Audio is encoded in AAC format at 48000 Hz. Two channels are recorded: the left channel contains audio from the system’s audio input device (e.g., a microphone), whereas the right channel is the reference clock encoded in SMPTE audio format.

Multiple instances of the video recorder tool can be started and can work standalone, or synchronized with the other recorders. The options panel allows you to configure the working mode of the recorder.

The recorder creates a stereo file in WAVE format, and the SMPTE is added as the right channel. Audio is sampled at 48000 Hz. The user interface is very similar to the video recorder tool. The main difference is the visualization part. In this tool the audio waveform is shown instead of the video stream. The options panel allows you to configure the working mode of the recorder.
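Since the recorder stores the microphone audio on the left channel and the SMPTE clock track on the right channel, the two can be separated with the standard-library wave module. This is a hedged sketch assuming 16-bit samples; the function name is illustrative.

```python
# Hedged sketch: split the stereo WAVE file produced by the audio recorder
# into its left channel (microphone audio) and right channel (SMPTE clock
# track). Assumes 16-bit little-endian PCM samples.

import struct
import wave

def split_channels(path: str):
    with wave.open(path, "rb") as wav:
        assert wav.getnchannels() == 2, "expected a stereo file"
        n = wav.getnframes()
        raw = wav.readframes(n)
    # Interleaved samples: L0 R0 L1 R1 ...
    samples = struct.unpack("<%dh" % (2 * n), raw)
    left = samples[0::2]   # microphone audio
    right = samples[1::2]  # SMPTE-encoded reference clock
    return left, right
```

Decoding the SMPTE timecode itself from the right channel is beyond this sketch; the split above only recovers the raw sample streams.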

The recorder expects input values (sent as OSC packets, see Deliverable D4.1 for details about this protocol) from 4 Inertial Measurement Units (IMUs). The graph in the figure shows the values selected by the user (one among accelerometer, gyroscope, or compass) for each of the 4 IMUs. In the lower left part of the recorder interface you can read the current streaming framerate for each sensor (49.95 fps in this example). Below the graph you can read both the trial name and the reference clock.
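To illustrate what such an OSC packet looks like on the wire, here is a minimal sketch of a decoder for a single binary OSC message, assuming the OSC 1.0 layout with float32/int32 arguments. The address /imu/1 used in the test is hypothetical; in practice a dedicated OSC library would be used rather than hand-written parsing.

```python
# Hedged sketch: decode one binary OSC 1.0 message (null-padded address
# string, ','-prefixed type tags, big-endian arguments). Only 'f' (float32)
# and 'i' (int32) tags are handled here.

import struct

def _read_string(data: bytes, offset: int):
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # OSC strings are NUL-terminated and padded to a multiple of 4 bytes
    return s, (end + 4) & ~3

def parse_osc(data: bytes):
    """Return (address, [arguments]) for a single OSC message."""
    address, offset = _read_string(data, 0)
    tags, offset = _read_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
    return address, args
```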

The recording tool saves the data in CSV format. CSV is commonly used because it provides portability toward external software (e.g., Matlab) and, of course, can also be read by EyesWeb itself for playback or analysis purposes. The options panel allows you to configure the working mode of the recorder.
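As a sketch of the CSV round trip, the snippet below writes and reads timestamped IMU samples with the standard-library csv module. The column layout (clock_ms plus three accelerometer axes) is illustrative; the actual layout of the DANCE files is specified in Deliverable 4.1.

```python
# Hedged sketch: write/read timestamped IMU samples as CSV.
# The header names are illustrative placeholders, not the DANCE layout.

import csv

def write_imu_csv(f, samples):
    """samples: iterable of (clock_ms, acc_x, acc_y, acc_z) tuples."""
    writer = csv.writer(f)
    writer.writerow(["clock_ms", "acc_x", "acc_y", "acc_z"])
    writer.writerows(samples)

def read_imu_csv(f):
    """Inverse of write_imu_csv: parse rows back into typed tuples."""
    reader = csv.reader(f)
    next(reader)  # skip header
    return [(int(r[0]), float(r[1]), float(r[2]), float(r[3])) for r in reader]
```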

Recorder tools options panel

All the recording tools share the same options panel depicted below:

The control type section controls the synchronization mode. If set to master/standalone mode, the tool works with its own clock, without synchronizing it to other devices. In slave mode the tool receives the clock time from an external device (the master). The clock time can be received either via the network (OSC protocol) or via audio (SMPTE sync).
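The master/slave distinction can be sketched as a small clock object: in master/standalone mode it advances on its own, while in slave mode it only reflects values pushed from outside (via OSC or SMPTE sync). This is an illustrative model, not the tools' actual implementation.

```python
# Hedged sketch of the control-type logic: master mode uses a local clock,
# slave mode echoes externally received clock values.

import time

class ReferenceClock:
    def __init__(self, mode: str = "master"):
        self.mode = mode            # "master" (own clock) or "slave" (external)
        self._start = time.monotonic()
        self._external_ms = 0

    def on_external_time(self, ms: int):
        """Called when a clock value arrives via OSC or SMPTE sync."""
        if self.mode == "slave":
            self._external_ms = ms

    def now_ms(self) -> int:
        if self.mode == "slave":
            return self._external_ms
        return int((time.monotonic() - self._start) * 1000)
```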

Playback patch

Once you have recorded some audio, video, and IMU data, you can play it back using the playback patch.

For example, if the recorded/downloaded data is located in: C:\Users\my_username\recordings then the patch path will be: C:\Users\my_username\recordings\DANCE_platform_reader.zip

Start EyesWeb and load the reader patch. You will see the following screen:

1) Run the patch by pressing the play button on the EyesWeb toolbar

2) Use the slider “Trial Selector” to select a specific trial to be played back

3) To PLAY, press the green "Play" button (or press the "s" key)

4) To STOP the playback, press the red "Stop" button (or press the "t" key)

5) To PAUSE the playback, press the "Pause" button (or press the "p" key)

IMPORTANT: When switching from one recording to another, you first have to stop the currently playing segment; only then can you start the new one.
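The stop-before-switch rule above can be pictured as a small state machine: a new trial may only start when nothing is currently playing. This is an illustrative model of the patch's behavior, not its actual code.

```python
# Hedged sketch of the playback rule: stop the current segment before
# starting a different trial.

class Player:
    def __init__(self):
        self.trial = None
        self.playing = False

    def play(self, trial: str):
        if self.playing and trial != self.trial:
            raise RuntimeError("stop the current segment before starting a new one")
        self.trial = trial
        self.playing = True

    def stop(self):
        self.playing = False
```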

During the playback of a file, video, 3D mocap data and IMU signals will be displayed. An example:

Patches for analyzing multimodal data

Now that you have recorded or downloaded some multimodal data and can successfully play it back, you can proceed to perform some analysis on it.

In the DANCE project we aim to innovate the state of the art in the automated analysis of expressive movement. We consider movement as a communication channel allowing humans to express and perceive implicit high-level messages, such as emotional states, social bonds, and so on. That is, we are not interested in physical space occupation or movement direction per se, or in “functional” physical movements: our interest is in the implications at the expressive level. For example, the direction of a hand movement to the left or to the right may be irrelevant, whereas the level of fluidity or impulsiveness of that movement might be relevant. Example: let us consider the movement “knocking at a door”. We do not want to analyze the functional action of “knocking at a door”, but the intention that lies behind it (e.g., the lover who knocks at the door of his beloved). To study it, we focus on the sets of non-verbal expressive features that are described in detail in Deliverable 5.1.

The following expressive features can be extracted from multimodal data using the patches you can download below:

Slowness. This feature indicates whether the movement is performed slowly or not.

Smoothness. This feature is based on Energy and Slowness. If the movement exhibits high (respectively, low) slowness and no (respectively, many) energy peaks are detected, then smoothness is high (respectively, low).
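The Smoothness rule above can be sketched as a simple decision on the two inputs. The threshold (0.5) and the intermediate "medium" label are illustrative placeholders, not the values used by the DANCE patches.

```python
# Hedged sketch of the Smoothness rule: high slowness + no energy peaks
# -> high smoothness; low slowness + many energy peaks -> low smoothness.
# The 0.5 threshold and the "medium" fallback are illustrative assumptions.

def smoothness(slowness: float, energy_peaks: int) -> str:
    if slowness > 0.5 and energy_peaks == 0:
        return "high"
    if slowness <= 0.5 and energy_peaks > 0:
        return "low"
    return "medium"
```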

Weight. This feature is related to Laban’s Weight quality (for details, see: Rudolf Laban and Frederick C. Lawrence, Effort, Macdonald & Evans, 1947). It is computed by extracting the vertical component of Energy, normalized to the overall amount of Energy in the movement.
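The normalization behind Weight can be sketched as the vertical share of kinetic energy over a set of joint velocities. Unit joint masses and the y axis as the vertical direction are assumptions of this sketch, not stated in the deliverables.

```python
# Hedged sketch of the Weight computation: vertical kinetic energy divided
# by total kinetic energy. Assumes unit masses and y as the vertical axis.

def weight(velocities) -> float:
    """velocities: iterable of (vx, vy, vz) joint velocities. Returns [0, 1]."""
    velocities = list(velocities)
    vertical = sum(vy * vy for _, vy, _ in velocities)
    total = sum(vx * vx + vy * vy + vz * vz for vx, vy, vz in velocities)
    return vertical / total if total > 0 else 0.0
```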

Suddenness. It is computed using alpha-stable distributions. An alpha-stable fit is performed on the peaks of acceleration. A movement is sudden when the product between alpha and gamma is high (see Deliverable 2.1 for details). The algorithm takes as input the 3D joint accelerations over the time window on which suddenness has to be computed, and fits an alpha-stable distribution to them. The output gets close to 1 (i.e., very sudden movements) when there are abrupt increases of the joints' velocity in the input signal, and vice versa.
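The first step of that pipeline, extracting the acceleration peaks to be fitted, can be sketched as follows. The alpha-stable fit itself is omitted here (it could, for instance, be done with scipy.stats.levy_stable.fit); the threshold parameter is an illustrative assumption.

```python
# Hedged sketch of the peak-extraction step that precedes the alpha-stable
# fit in the Suddenness computation. The fit itself is not shown.

def acceleration_peaks(acc, threshold: float = 0.0):
    """acc: sequence of (ax, ay, az) joint accelerations.
    Returns the magnitudes of local maxima above `threshold`."""
    mags = [(ax * ax + ay * ay + az * az) ** 0.5 for ax, ay, az in acc]
    return [m for prev, m, nxt in zip(mags, mags[1:], mags[2:])
            if m > prev and m > nxt and m > threshold]
```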

Impulsivity. An impulsive movement can be performed by a part of the body or by the whole body and is characterized by the following properties: (P1) it is sudden, that is, it presents a high variation of speed (either from low to high or from high to low); (P2) it is executed with no preparation.

Fluidity. A Fluid movement can be performed by a part of the body or by the whole body and is characterized by the following properties: (P1) the movement of each involved joint of the (part of the) body is smooth, following the standard definitions in the biomechanics literature; (P2) the energy of movement (energy of muscles) is free to propagate along the kinematic chains of (parts of) the body (e.g., from head to trunk, from shoulders to arms) according to a coordinated wave-like propagation. That is, there is an efficient propagation of movement along the kinematic chains, with a minimization of dissipation of energy. Fluidity is computed as the distance between the evolution in time of a Humanoid Mass-Spring model (i.e., a model of the human body conceived as a set of masses connected by springs, see Deliverable 2.1 for details) and the actual movement of a user.
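The comparison against the mass-spring model can be sketched in one dimension: step a damped spring toward the observed targets, then measure the distance between the model trajectory and the actual one. The spring constants and the mean-absolute-distance measure are illustrative placeholders for the Deliverable 2.1 formulation.

```python
# Hedged sketch of the Fluidity idea: evolve a damped mass-spring model and
# compare its trajectory with the actual movement. All constants are
# illustrative, not the Deliverable 2.1 values.

def spring_step(pos, vel, target, k=10.0, damping=1.0, mass=1.0, dt=0.02):
    """One explicit-Euler step of a damped mass-spring pulled toward target."""
    acc = (k * (target - pos) - damping * vel) / mass
    vel += acc * dt
    return pos + vel * dt, vel

def fluidity_distance(model_traj, actual_traj) -> float:
    """Mean absolute distance between model and actual trajectories;
    a lower distance is read here as a more fluid movement."""
    return sum(abs(m - a) for m, a in zip(model_traj, actual_traj)) / len(actual_traj)
```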

Besides the above expressive features, we are interested in extracting analysis primitives: they are unary, binary, or n-ary operators that summarize with one or more values the temporal development of low-level features in an analysis time unit (a movement unit or a time window). The simplest unary analysis primitives are statistical moments (e.g., average, standard deviation, skewness, and kurtosis). Further, more complex examples of unary operators include shape descriptors (e.g., slope, peaks, valleys), entropy, recurrence, and various time-frequency transforms. Models for prediction (e.g., HMM) can also be applied. Binary and n-ary operators can be applied, e.g., for measuring relationships between low-level features computed on the movement of different body parts. For example, synchronization can be used to assess coordination between hands. Causality can provide information on whether the movement of a joint leads or follows the movement of another joint.
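The simplest unary primitives mentioned above, the four statistical moments, can be sketched directly; the standardized (population) definitions of skewness and kurtosis are used here as an assumption.

```python
# Hedged sketch of the simplest unary analysis primitives: the first four
# statistical moments of a low-level feature series over a time window.
# Population (biased) definitions are assumed.

def moments(xs):
    """Return (mean, standard deviation, skewness, kurtosis) of xs."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    std = var ** 0.5
    if std == 0:
        return mean, 0.0, 0.0, 0.0  # constant signal: higher moments undefined
    skew = sum(((x - mean) / std) ** 3 for x in xs) / n
    kurt = sum(((x - mean) / std) ** 4 for x in xs) / n
    return mean, std, skew, kurt
```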

The following analysis primitives can be extracted from multimodal data using the patches you can download below:

3. Download the IMU and motion capture sample data

As reported in the above paragraphs, you have to download and extract some sample data in order to run the DANCE example patches. Without sample data the example patches will not start, or will start but will not provide any output. The sample data is contained in a zip file and is a collection of 2 trials consisting of data recorded by a motion capture system, a video camera, and 4 IMU sensors placed on the dancer's limbs (wrists and ankles). The zip archive contains the following folders and files: