The goal of the CADDY project was to develop an autonomous underwater robot that was able to assist a diver, interact with the diver, and monitor the diver's state. The technical challenges for our research were the development of DiverNet, a wearable array of motion-capture sensors and devices for recording the diver's heart rate and breathing. Another challenge was the development of an underwater tablet that the divers could use to communicate and to fill out questionnaires.

We developed multilayer perceptrons for diver behaviour interpretation, using emotion, motion, and physiological data collected in dry-land and diving experiments. As a proof of concept, we showed that it is possible to determine experienced emotional states from body motions captured with the DiverNet motion sensors.
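As an illustration of the kind of network described above, a one-hidden-layer perceptron can be sketched as a plain forward pass. Everything here is hypothetical: the feature vector, the layer sizes, and the random weights are placeholders, not the networks or features actually trained in the project.

```python
import math
import random

def mlp_forward(features, w_hidden, w_out):
    """One-hidden-layer perceptron: tanh hidden units, softmax output."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)))
              for row in w_hidden]
    logits = [sum(w * h for w, h in zip(row, hidden)) for row in w_out]
    # Numerically stable softmax over the output logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]  # probabilities over emotion classes

random.seed(0)
# Hypothetical sizes: 6 motion statistics in, 4 hidden units, 3 emotion classes.
n_features, n_hidden, n_classes = 6, 4, 3
w_hidden = [[random.uniform(-1, 1) for _ in range(n_features)]
            for _ in range(n_hidden)]
w_out = [[random.uniform(-1, 1) for _ in range(n_hidden)]
         for _ in range(n_classes)]

# Hypothetical per-window features (e.g. means and variances of limb motion).
window = [0.2, -0.1, 0.5, 0.05, 0.3, -0.4]
probs = mlp_forward(window, w_hidden, w_out)
print(probs)
```

In practice such a network would be trained on labelled windows of DiverNet data; the sketch only shows how a trained forward pass maps a motion-feature window to class probabilities.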
Although this information is also present in breathing motions, body motions are the primary source and give much better results. Prediction from body motions is possible not only for walking behaviour but also for a diver swimming underwater. Body motion is thus a basic predictor of experienced emotional states in a pleasure-arousal-dominance circumplex model. To the best of our knowledge, there has been no scientific report on this relation so far. Another highlight is the classification of behaviour categories from body motions: a neural network can recognize what the diver is doing at any given moment. In addition, we developed a real-time analysis program that plays back dive recordings and analyses them in real time. Such systems are of great interest for mission planning and mission surveillance, for instance in SWAT teams or in space, when visibility is low or zero.
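The playback-and-analyse loop of such a program could look roughly like the following sketch. The recording format (timestamped sensor frames), the replay speed factor, and the toy analysis callback are all invented for illustration and are not the project's actual file format or classifier.

```python
import time

def replay(frames, analyse, speed=1.0):
    """Replay time-stamped sensor frames at their recorded rate
    (scaled by `speed`) and run an analysis callback on each frame,
    mimicking a live dive."""
    start_wall = time.monotonic()
    start_rec = frames[0][0]
    results = []
    for t, frame in frames:
        # Wait until this frame's recorded time has elapsed in wall-clock time.
        due = start_wall + (t - start_rec) / speed
        delay = due - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        results.append(analyse(frame))
    return results

# Hypothetical recording: (timestamp in seconds, sensor sample) pairs.
recording = [(0.00, [0.10, 0.20]),
             (0.05, [0.15, 0.22]),
             (0.10, [0.20, 0.25])]

# Toy stand-in for the behaviour classifier.
labels = replay(recording,
                analyse=lambda f: "swimming" if f[0] > 0.12 else "idle",
                speed=10.0)
print(labels)  # → ['idle', 'swimming', 'swimming']
```

Driving the analysis from the recorded timestamps rather than looping as fast as possible is what makes the playback "real time": the same code path can then be fed live sensor data during a dive.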