Dear Colleagues,
For those of you interested, we would like to announce the start of the INTERSPEECH 2014 Computational Paralinguistics ChallengE (ComParE).
We hope you will find the new tasks of interest. With best wishes,
Björn Schuller, Stefan Steidl, Anton Batliner & Julien Epps
Computational Paralinguistics Challenge (ComParE), Interspeech 2014
Cognitive & Physical Load
Call for Participation<http://emotion-research.net/CFP-ComParE-IS14.pdf> as PDF.
*Get started:*
License Agreement (CLSE)<http://emotion-research.net/IS14-Challenge-Agreements-CLSE.pdf> for the dataset download (Cognitive Load Signals Sub-Challenge)
License Agreement (MBC)<http://emotion-research.net/IS14-Challenge-Agreements-MBC.pdf> for the dataset download (Physical Load Sub-Challenge)
The Challenge
The Interspeech 2014 Computational Paralinguistics ChallengE (ComParE) is an open Challenge dealing with states of speakers as manifested in the acoustic properties of their speech signal. There have so far been five consecutive Challenges at INTERSPEECH since 2009 (cf. the repository<http://compare.openaudio.eu/>), but there still exists a multiplicity of highly relevant paralinguistic phenomena not yet covered. Accordingly, we introduce two new tasks: the Cognitive Load Sub-Challenge and the Physical Load Sub-Challenge.
For these Challenge tasks, the COGNITIVE-LOAD WITH SPEECH AND EGG database (CLSE) and the MUNICH BIOVOICE CORPUS (MBC), offering a high diversity of speakers and covering different languages (Australian English and German), are provided by the organisers. CLSE features Australian speakers recorded under different levels of cognitive load. MBC contains speech recorded during physical exercise, with heart rate and skin conductance measured by corresponding sensors.
In these respects, the INTERSPEECH 2014 Computational Paralinguistics Challenge (ComParE) shall help bridge the gap between excellent research on paralinguistic information in spoken language and the low compatibility of results.
Two Sub-Challenges are addressed:
§ In the Cognitive Load Sub-Challenge, the level of cognitive load (three classes) has to be classified based on acoustics.
§ In the Physical Load Sub-Challenge, the exercising state (running / resting) and, by extension, the heart-rate state (high pulse / low pulse) have to be classified automatically.
The measure of competition will be Unweighted Accuracy. Transcriptions of the training and development sets will be provided. Both Sub-Challenges allow contributors to find their own features and use their own machine learning algorithms. However, a standard feature set will be provided per corpus that may be used. Participants will have to adhere to the definition of training, development, and test sets. They may report on results obtained on the development set, but have only five trials to upload their results on the test sets, whose labels are unknown to them. Each participation must be accompanied by a paper presenting the results; the paper undergoes peer review and has to be accepted for the conference in order for the entry to count in the Challenge. The organisers reserve the right to re-evaluate the findings, but will not themselves participate in the Challenge.
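For illustration: "Unweighted Accuracy" is conventionally computed as the mean of the per-class recalls (also known as unweighted average recall), so that minority classes count as much as majority classes. A minimal sketch, assuming this reading of the metric (the function name and labels below are illustrative, not part of the Challenge protocol):

```python
from collections import defaultdict

def unweighted_accuracy(y_true, y_pred):
    """Mean of per-class recalls: each class contributes equally,
    regardless of how many test instances it has."""
    correct = defaultdict(int)  # correctly classified instances per class
    total = defaultdict(int)    # total instances per class
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    return sum(correct[c] / total[c] for c in total) / len(total)

# Imbalanced example: always predicting the majority class gives
# 75% weighted accuracy but only 50% unweighted accuracy.
y_true = ["rest", "rest", "rest", "run"]
y_pred = ["rest", "rest", "rest", "rest"]
print(unweighted_accuracy(y_true, y_pred))  # 0.5
```

This is why a trivial majority-class classifier does not score well under this measure on imbalanced test sets.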
Overall, contributions using the provided or equivalent data are sought for (but not limited to):
§ Participation in a Sub-Challenge
§ Contributions focussing on Computational Paralinguistics centred around the Challenge topics
The results of the Challenge will be presented at Interspeech<http://www.interspeech2014.org/> 2014 in Singapore.
Prizes will be awarded to the Sub-Challenge winners.
If you are interested and planning to participate in the Computational Paralinguistics Challenge, or if you want to be kept informed about the Challenge, please send the organisers an e-mail<mailto:[log in to unmask]> to indicate your interest.
Organisers:
Björn Schuller<mailto:[log in to unmask]> (Imperial College London, UK)
Stefan Steidl<mailto:[log in to unmask]> (FAU Erlangen-Nuremberg, Germany)
Anton Batliner<mailto:[log in to unmask]> (TUM, Germany)
Julien Epps<mailto:[log in to unmask]> (University of New South Wales / NICTA, Australia)
Sponsored by:
Association for the Advancement of Affective Computing (AAAC)<http://emotion-research.net/>
iHEARu<http://ihearu.eu/>
ASC-Inclusion<http://www.asc-inclusion.eu/>
Deadlines:
24 March 2014 Paper submission to INTERSPEECH 2014<http://www.interspeech2014.org/>
16 June 2014 Final result upload
20 June 2014 Camera-ready paper
___________________________________________
PD Dr.-Ing. habil.
Björn W. Schuller
Senior Lecturer in Machine Learning
Imperial College London
Department of Computing
Machine Learning Group
Huxley Bldg., Room 421
180 Queen's Gate, SW7 2AZ London, UK
[log in to unmask]<mailto:[log in to unmask]>
tel: +44-207-594-48357
http://www.schuller.it
___________________________________________