Unconsciously, humans evaluate situations based on environmental and social parameters when recognizing emotions in social interactions. Without context, even humans may misinterpret observed facial, vocal, or body behavior. Contextual information, such as the ongoing task (e.g., human-computer vs. human-robot interaction), the identity (male vs. female) and natural expressiveness of the individual (e.g., introvert vs. extrovert), as well as the intra- and interpersonal contexts, helps us better interpret and respond to the environment around us. These considerations suggest that attention to contextual information can deepen our understanding of affect communication (e.g., discrete emotions, affective dimensions such as valence and arousal, different types of moods and sentiment, etc.) and support the development of reliable real-world affect-sensitive applications.
This 6th CBAR workshop aims to investigate how to efficiently exploit and model context using cutting-edge computer vision and machine learning approaches in order to advance automatic affect recognition.
Topics of interest include, but are not limited to:
- Context-sensitive affect recognition from still images or videos.
- Audio and/or physiological data modeling for context-sensitive affect recognition.
- Context based corpora recording and annotation.
- Domain adaptation for context-aware affect recognition.
- Multi-modal context-aware fusion for affect recognition (a toy sketch follows this list) to successfully handle:
  - Asynchrony and discordances of different modalities such as voice, face, and head/body.
  - Innate priority among modalities.
  - Temporal variations in the relative importance of the modalities according to the context.
- Theoretical and empirical analysis of the influence of context on affect recognition.
- Context-aware applications:
  - Depression severity assessment, pain intensity measurement, and autism screening (e.g., the influence of age, gender, intimate vs. stranger interaction, physician-patient relationship, home vs. hospital).
  - Affect-based human-robot and human-embodied conversational agent interactions (e.g., autism therapy and story-telling, caregiving for the elderly).
  - Other applications such as context-sensitive and affect-aware intelligent tutors (e.g., learning profile, personality assessment, student performance, content).
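As an illustration of the fusion challenges listed above, the following minimal sketch shows one way per-frame modality weights could be made context-dependent, so that the relative importance of voice, face, and body shifts over time with the context. This is a toy example under our own assumptions (random features, a generic softmax gate); all names, dimensions, and data are illustrative and do not correspond to any specific system or submission.

```python
# Toy sketch of context-gated multimodal fusion: modality weights are
# re-computed at every frame from a time-varying context vector, so the
# relative importance of voice, face, and body can shift with context.
# All names, dimensions, and random data here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

T, D, C = 5, 8, 4                       # frames, feature dim, context dim
feats = {m: rng.normal(size=(T, D)) for m in ("voice", "face", "body")}
context = rng.normal(size=(T, C))       # e.g., task, identity, setting cues

# One gating vector per modality maps the context to a scalar score.
gates = {m: rng.normal(size=C) for m in feats}

fused = np.zeros((T, D))
for t in range(T):
    scores = np.array([gates[m] @ context[t] for m in feats])
    w = softmax(scores)                 # per-frame modality weights
    fused[t] = sum(wi * feats[m][t] for wi, m in zip(w, feats))

print(fused.shape)  # (5, 8): one context-weighted feature vector per frame
```

In a trained system the gating vectors would be learned jointly with the recognizer rather than drawn at random, but the structure is the same: the context, not a fixed rule, decides how much each modality contributes at each moment.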
Submission Policy:
We call for the submission of high-quality papers. Submitted manuscripts must not be concurrently submitted to another conference or workshop. Each paper will receive at least two reviews. Acceptance will be based on relevance to the workshop, novelty, and technical quality. At least one author of each paper must register for and attend the workshop to present the paper. Workshop proceedings will be submitted for inclusion in IEEE Xplore.
The reviewing process for the workshop will be "double-blind". All submissions should therefore be appropriately anonymized so as not to reveal the authors' names or institutions.
Submissions must be in PDF format, in accordance with the IEEE FG conference paper style (4-8 pages).
Keynote Speakers:
Nadia Berthouze
Title: Body movement as a rich modality to capture and regulate contextual pain experiences
Abstract: With the emergence of full-body sensing technology come new opportunities to support people's affective experiences and needs. In my talk, I will present our work on technology for chronic pain management and discuss how such technology can lead to more effective physical rehabilitation by integrating it into everyday activities and supporting people at both the physical and affective levels. I will also discuss our findings on how the perception of pain behaviour is biased by context in human-human interaction and how affective context modulates pain and pain-coping capabilities. I will conclude by discussing some of the implications for affect- and pain-aware technology design.
Mohamed Chetouani
Title: Interpersonal Human-Human and Human-Machine Interaction Approaches for Individual and Social Traits Detection
Abstract: One of the most significant challenges in robotics is to achieve closer interactions between humans and robots. Mutual behaviors occurring during interpersonal interaction provide unique insights into the complexities of the processes underlying human-robot coordination. In particular, interpersonal interaction, the process by which two or more people exchange information through verbal (what is said) and non-verbal (how it is said) messages, could be exploited both to establish interaction and to inform about the quality of interaction. In this talk, we will report our recent work on social learning for (i) detecting individual traits such as pathology, identity, and personality during human-robot interaction, and (ii) engagement detection. We will also describe how these frameworks could be employed to investigate coordination mechanisms, in particular for pathologies such as autism spectrum disorders.
Guoying Zhao
Title: Facial expression analysis: From macro to micro
Abstract: Emotions are a central part of human communication, play an important role in everyday social life, and should have a key role in human-computer interactions. Emotions are complicated. Sometimes people intentionally express their emotions, e.g., as macro-expressions, to help deliver their messages, and sometimes people suppress and hide their emotions, which may surface as micro-expressions, for different reasons. This talk introduces our work from macro-expression analysis to micro-expression detection and recognition, and discusses the open problems in this area.
Tentative Schedule: Tuesday, 14th May 2019
Chair: Zakia Hammal
14:00 - 14:45: Keynote 1
Guoying Zhao
Facial expression analysis: From macro to micro.
14:45 - 15:30: Keynote 2
Nadia Berthouze
Body movement as a rich modality to capture and regulate contextual pain experiences.
15:30 - 16:00: Paper presentation
Umut Avci (Yasar University) and Oya Aran (De La Salle University)
Analyzing group performance in small group interaction: Linking personality traits and group performance through the verbal content.
16:00 - 16:30: Coffee break
16:30 - 17:15: Keynote 3
Mohamed Chetouani
Interpersonal Human-Human and Human-Machine Interaction Approaches for Individual and Social Traits Detection.
Organizers:
Merlin Teodosia Suarez, Center for Empathic Human-Computer Interactions, De La Salle University
Important Dates:
Submission Deadline: 28 January 2019
Notification of Acceptance: 17 February 2019
Camera Ready: 24 February 2019
Program Committee (to be completed)
Anna Esposito, Università degli Studi della Campania "Luigi Vanvitelli", Italy
Mohammad H. Mahoor, University of Denver, USA
Yan Tong, University of South Carolina, USA
Ursula Hess, Humboldt University of Berlin, Germany
Laurence Devillers, Paris-Sorbonne IV, France
Hongying Meng, Brunel University London, UK
Oya Aran, De La Salle University, Philippines
Khiet Truong, University of Twente, Netherlands