********************************************************************
                         CALL FOR PAPERS
                 IEEE Transactions on Multimedia
        Special Issue on Multimodal Affective Interaction
********************************************************************

Schedule
-----------------
Submission deadline: **December 1, 2009**
Notification of acceptance: June 8, 2010
Final manuscript due: June 22, 2010
Tentative publication date: October 2010

Guest Editors
-----------------
- Nicu Sebe, University of Trento, Italy
- Hamid Aghajan, Stanford University, USA
- Thomas Huang, University of Illinois at Urbana-Champaign, USA
- Nadia Magnenat-Thalmann, University of Geneva, Switzerland
- Caifeng Shan, Philips Research, The Netherlands

Affect sensing and recognition from multiple modalities and cues has attracted much attention recently, with important applications in human-computer interaction. However, this research area is still in its infancy and remains under-explored. Moving to multimodal affective interaction raises many challenges: how to acquire and annotate affect data from multiple sensors or modalities, especially spontaneous data in natural settings; how to effectively extract and select representative features from different modalities for affect recognition; how to synchronize data or features across modalities; and how to choose a fusion strategy for multimodal affect data in a given application. It is also necessary to investigate which modalities and cues are most suitable for which application context. To address these challenges, we must adapt existing approaches or develop new techniques suited to multimodal affective interaction. This special issue seeks to present and highlight the latest developments in multimodal affective interaction. Submissions that address real-world applications are especially encouraged.
Topics of interest include, but are not limited to:
• Multimodal affective data fusion (including user feedback)
• Feature extraction from affective data
• Recognition & synthesis of affective body language
  - facial expressions
  - bodily postures/gestures
• Affective speech recognition & synthesis
• Personal emotion profile creation
• Spontaneous affect analysis
• Affective sound and music processing
• Design of affective interaction systems
• Affective loop in social robots/agents
• Affective user behavior/lifestyle modeling
• Evaluation of affective interaction systems
• Real-time applications of multimodal affective interaction (smart environments, entertainment, virtual reality, affective wearables, education, recommendation systems, etc.)

Submission Procedure
--------------------
Authors should prepare manuscripts according to the Information for Authors as published at http://www.ieee.org/organizations/society/tmm/infotmm.html. Note that mandatory overlength page charges and color charges will apply. Manuscripts should be submitted electronically through the online IEEE manuscript submission system at http://tmmieee.manuscriptcentral.com/. When selecting a manuscript type, authors must choose "Special Issue on Multimodal Affective Interaction" and follow the instructions for the IEEE Transactions on Multimedia. A completed copyright form must be signed and faxed to 1-732-562-8905 at the time of submission. Please indicate the manuscript number at the top of the page.