[Rstlist] Call for papers: HCI 2021 Session "Semantic, artificial and computational interaction studies: Towards a behavioromics of multimodal communication"
Andy Lücking
luecking at em.uni-frankfurt.de
Mon Nov 2 11:22:31 UTC 2020
*Apologies for cross-postings*
---
Call for papers
HCI 2021 Session: "Semantic, artificial and computational interaction
studies: Towards a behavioromics of multimodal communication"
In natural language interactions, the whole human body acts as a
"semiotic display" (Goodwin 2003). The field of HCI was among the
first to pay this social fact its due attention. Notwithstanding a
few exceptions, theoretical linguistics has only recently begun to
investigate the communicative bandwidth of speech and gesture. We use
"gesture" in a broad sense here, covering manual gestures, facial
expressions, head movements, shrugs, laughter, and body orientation.
Simultaneously, due to the digital turn, the traditional
methodological triad of "armchair, laboratory and field" (Clark &
Bangerter 2004) is being expanded by data analytics, that is, in the
context of multimodal interaction, statistical means of describing
the forms of communication. This session aims at bringing these
branches together. Potential aims include delineating experimental
studies, computational methods, and resource building and
exploitation that connect armchair, laboratory, field and data -- a
joint methodological endeavour that might be called "behavioromics".
Possible questions in this regard include: Can HCI, health or
ergonomic applications be used as experimental gesture study settings?
Which computational methods enable big gesture data analysis? Are
there already resources that can be used in this regard? What
phenomena are studied in semantics and dialogue theory in the first
place? Finally: How do gesture data analyses, digital human modelling
applications and semantic/pragmatic gesture research combine
methodologically?
The session covers, but is not limited to, topics such as the following:
- phenomena under discussion: past, present, future
- big gesture data
- creation and exploitation of multimodal corpora
- avatars as experimental setting
- cross-modal tracking
- data-based multimodal analysis
- detecting gestalts
- automatic annotation
- multimodal networks
We want to emphasize that conceptual contributions are highly welcome!
By this we mean contributions that reflect on, or propose steps
towards, the individual thematic priorities with respect to
"behavioromics".
Last but not least, the conference session provides a platform for
bringing together semanticists, computer scientists and researchers
from related fields who deal with multimodal interaction. Since we
all work on virtually the same topic but from different angles, there
should be opportunities to get in touch: to see what others are doing
and to approach some of the above-outlined methodological challenges.
*NEXT STEP: December 01, 2020: authors submit a 1-2 page abstract
through the CMS at https://cms.hci.international/2021*. Make sure to
select the correct session!
Important dates:
- December 01, 2020: upload abstract
- February 14, 2021: full paper
- July 24--29, 2021: virtual conference
Session organizers:
Andy Lücking (www.texttechnologylab.org/team/andy-luecking/)
Alexander Mehler (www.texttechnologylab.org/team/alexander-mehler/)
Cornelia Ebert (https://user.uni-frankfurt.de/~coebert/)
References for quotations:
Clark, Herbert H. and Adrian Bangerter (2004). “Changing Ideas about
Reference.” In: Experimental Pragmatics. Ed. by Ira A. Noveck and Dan
Sperber. Palgrave Studies in Pragmatics, Language and Cognition.
Basingstoke and New York: Palgrave Macmillan. Chap. 2, pp. 25–49.
Goodwin, Charles (2003). “Pointing as Situated Practice.” In:
Pointing: Where Language, Culture, and Cognition Meet. Ed. by Sotaro
Kita. Mahwah, New Jersey: Lawrence Erlbaum Associates, Inc. Chap. 2,
pp. 217–241.