Call for Papers: Special Session on Multimodal Data Analysis for
Understanding of Human Behaviour, Emotions and their Reasons

21st International Conference on Content-Based Multimedia Indexing (CBMI 2024)

September 18-20, 2024, Reykjavik, Iceland

This special session addresses the processing of all types of data
related to understanding human behaviour, emotions, and their reasons,
such as the current or past context. Understanding human behaviour and
context can benefit many services, both online and in physical spaces.
For example, detecting a lack of skills, confusion or other negative
states may help to adapt online learning programmes, to detect a
bottleneck in a production line, to recognise poor workplace culture,
or to detect a dangerous spot on a road before any accident happens
there. Detecting unusual behaviour may help to improve the security of
travellers and the safety of dementia sufferers and visually or
hearing impaired individuals, for example by helping them to stay away
from potentially dangerous strangers, e.g., drunk people or football
fans forming a big crowd.

In the context of multimedia retrieval, understanding human behaviour
and emotions could help not only with multimedia indexing, but also
with deriving implicit (i.e., other than intentionally reported) human
feedback regarding multimedia news, videos, advertisements,
navigators, hotels, shopping items etc., and thereby improve
multimedia retrieval.

Humans are good at understanding other humans, their emotions and
their reasons. For example, when looking at people engaged in
different activities (sport, driving, working on a computer, working
on a construction site, using public transport etc.), a human observer
can understand whether a person is engaged in the task or distracted,
and whether they stopped a recommended video because it was not
interesting or because they quickly found what they needed at the
beginning of the video. After observing another human for some time,
humans can also learn their tastes, skills and personality traits.

Hence, the interest of this session is how to improve AI understanding
of these same aspects. Topics include (but are not limited to) the
following:

* Use of various sensors for monitoring and understanding human
behaviour, emotion/mental state/cognition, and context: video, audio,
infrared, wearables, virtual sensors (e.g., mobile device usage,
computer usage) etc.

* Methods for information fusion, including information from various
heterogeneous sources

* Methods to learn human traits and preferences from long-term
observations

* Methods to detect human implicit feedback from past and current
observations

* Methods to assess task performance: skills, emotions, confusion,
engagement in the task, context

* Methods to detect potential security and safety threats and risks

* Methods to adapt behavioural and emotional models to different end
users and contexts without collecting a lot of labels from each user
and/or for each context: transfer learning, semi-supervised learning,
anomaly detection, one-shot learning etc.

* How to collect data for training AI methods from various sources,
e.g., the internet, open data, field pilots etc.

* Use of behavioural or emotional data to model humans and adapt
services either online or in physical spaces

* Ethics and privacy issues in modelling human emotions, behaviour,
context and reasons

The proceedings are expected to be published by ACM; details will soon
appear on the conference web page. The best papers will be invited for
extended versions in the Multimedia Tools and Applications journal.
 
Conference Website: http://cbmi2024.org/

Authors can submit full papers (6 pages + references), short papers (4
pages + references), special session papers (6 pages + references) and
demonstration proposals (4 pages + 1-page demonstration description +
references). Authors of selected best papers of the conference will be
invited to submit extended versions of their contributions to a
special journal issue in MTAP (approval pending). Submissions to CBMI
are peer reviewed in a single-blind process. All types of papers must
use the IEEE templates at
https://www.ieee.org/conferences/publishing/templates.html. The
language of the conference is English.
 

Submission Deadlines

* Full and short research papers are due March 22, 2024

* Special session papers are due March 22, 2024

* Demonstration submissions are due April 22, 2024