26th ACM International Conference on Multimodal Interaction Call for Papers

26th ACM International Conference on Multimodal Interaction
(4-8 Nov 2024)
Dear colleague,
The deadline for paper submissions for ICMI 2024 is around the corner.

https://icmi.acm.org/2024/call-for-papers/

https://new.precisionconference.com/submissions/icmi24a


The 26th International Conference on Multimodal Interaction (ICMI
2024) will be held in San José, Costa Rica. ICMI is the premier
international forum that brings together research on multimodal
artificial intelligence (AI) and social interaction. Multimodal AI
encompasses technical challenges in machine learning and
computational modeling, such as representations, fusion, data, and
systems. The study of social interactions encompasses both
human-human and human-computer interactions. A unique aspect of ICMI
is its multidisciplinary nature, which values both scientific
discoveries and technical modeling achievements, with an eye towards
impactful applications for the good of people and society.

Novelty will be assessed along two dimensions: scientific novelty and
technical novelty. Papers accepted at ICMI 2024 will need to be novel
along at least one of these two dimensions:

    Scientific Novelty: Papers should bring new scientific knowledge
    about human social interactions, including human-computer
    interactions. Examples include discovering new behavioral markers
    that are predictive of mental health, or showing how new
    behavioral patterns relate to children’s interactions during
    learning. It is the responsibility of the authors to perform a
    proper literature review and clearly discuss the novelty of the
    scientific discoveries made in their paper.

    Technical Novelty: Papers should propose novelty in their
    computational approach for recognizing, generating, or modeling
    multimodal data. Examples include novelty in the learning and
    prediction algorithms, in the neural architecture, or in the data
    representation. Novelty can also lie in new uses of an existing
    approach.

Commitment to ethical conduct is required, and submissions must
adhere to ethical standards, in particular when human-derived data
are employed. Authors are encouraged to read the ACM Code of Ethics
and Professional Conduct (https://ethics.acm.org/).

ICMI 2024 conference theme:

The theme of this year's ICMI conference is "Equitability and
environmental sustainability in multimodal interaction technologies."
The focus is on exploring how multimodal systems and multimodal
interactive applications can serve as tools to bridge the digital
divide, particularly in underserved communities and countries, with a
specific emphasis on those in Latin America and the Caribbean. The
conference aims to examine the design principles that can make
multimodal systems more equitable and sustainable in applications
such as health and education, thereby catalyzing positive
transformations in development for historically marginalized groups,
including racial/ethnic minorities and indigenous peoples.

The conference also explores the intersection between multimodal
interaction technologies and environmental sustainability: how these
technologies can be crafted to comprehend, disseminate, and mitigate
the adverse impacts of climate change, especially in the Latin
America and Caribbean region. In particular, it examines the
potential of multimodal systems in fostering community resilience,
raising awareness, and facilitating education related to climate
change, thereby contributing to a holistic approach that encompasses
both social and environmental dimensions.

Additional topics of interest include but are not limited to:

    Affective computing and interaction
    Cognitive modeling and multimodal interaction
    Gesture, touch and haptics
    Healthcare, assistive technologies
    Human communication dynamics
    Human-robot/agent multimodal interaction
    Human-centered AI and ethics
    Interaction with smart environments
    Machine learning for multimodal interaction
    Mobile multimodal systems
    Multimodal behavior generation
    Multimodal datasets and validation
    Multimodal dialogue modeling
    Multimodal fusion and representation
    Multimodal interactive applications
    Novel multimodal datasets
    Speech behaviors in social interaction
    System components and multimodal platforms
    Visual behaviors in social interaction
    Virtual/augmented reality and multimodal interaction


Important Dates
Abstract deadline               April 26th, 2024
Paper submission                May 3rd, 2024
Rebuttal period                 June 16th-23rd, 2024
Paper notification              July 18th, 2024
Camera-ready paper              August 16th, 2024
Presenting at main conference   November 5th-7th, 2024