28.426, Calls: Comp Ling, Disc Analysis, Gen Ling, Pragmatics, Text/Corpus Ling/Germany

The LINGUIST List linguist at listserv.linguistlist.org
Fri Jan 20 14:30:27 UTC 2017


LINGUIST List: Vol-28-426. Fri Jan 20 2017. ISSN: 1069-4875.

Subject: 28.426, Calls: Comp Ling, Disc Analysis, Gen Ling, Pragmatics, Text/Corpus Ling/Germany

Moderators: linguist at linguistlist.org (Damir Cavar, Malgorzata E. Cavar)
Reviews: reviews at linguistlist.org (Helen Aristar-Dry, Robert Coté,
                                   Michael Czerniakowski)
Homepage: http://linguistlist.org

*****************    LINGUIST List Support    *****************
                       Fund Drive 2016
                   25 years of LINGUIST List!
Please support the LL editors and operation with a donation at:
           http://funddrive.linguistlist.org/donate/

Editor for this issue: Kenneth Steimel <ken at linguistlist.org>
================================================================


Date: Fri, 20 Jan 2017 09:30:19
From: Birgit Hänel [khaenel at uos.de]
Subject: International Conference on Multimodal Communication

 
Full Title: International Conference on Multimodal Communication 

Date: 09-Jun-2017 - 11-Jun-2017
Location: Osnabrück, Germany 
Contact Person: Mark Turner
Meeting Email: icmc2017 at gmail.com
Web Site: https://sites.google.com/a/case.edu/icmc2017/ 

Linguistic Field(s): Computational Linguistics; Discourse Analysis; General Linguistics; Pragmatics; Text/Corpus Linguistics 

Call Deadline: 15-Feb-2017 

Meeting Description:

International Conference on Multimodal Communication: Developing New Theories
and Methods
Osnabrück University, 9-11 June 2017

Conference Organizers: 

Alexander Bergs
Mark Turner

Confirmed plenary speakers:

- Harald Baayen. Alexander von Humboldt Professor, University of Tübingen.
- Thomas Hoffmann. Professor of Linguistics, Katholische Universität
Eichstätt-Ingolstadt.
- Jungseock Joo. Assistant Professor of Communication Studies at UCLA and
Research Scientist at Facebook. ''Deep learning and computer vision techniques
for human gesture and facial expression analysis.''
- Irene Mittelberg. Professor of Linguistics and Cognitive Semiotics at the
Human Technology Centre (HumTec) at RWTH Aachen University. Mittelberg directs
the Natural Media Lab and the Center for Sign Language and Gesture (SignGes).
''Crossmodal clusters: Mapping linguistic constructions and gestural patterns
via motion-capture data.''
- Francis Steen. Professor of Communication Studies, UCLA. Co-director of the
Distributed Little Red Hen Lab.
- Eve Sweetser. Professor of Linguistics, UC-Berkeley. Coordinator,
UC-Berkeley Gesture and Multimodality Group. Coordinator, UC-Berkeley Matrix
Metaphor research group. Co-PI, MetaNet IARPA research project. ''Viewpoint,
creativity and convention in multimodal constructions.''

In addition to plenary talks, parallel sessions, a poster session, and a
conference dinner, the conference will feature several plenary workshops on
methods on Saturday and Sunday mornings, led by leading methodological
experts. Each will present a specific workflow, showing how particular methods
can be applied to transform a research question into a finished, publishable
research product. These workshops will cover, among other topics, new
multimodal tools developed in the Red Hen Lab (http://redhenlab.org),
Cinepoetics tools for analyzing film and television, tools for gesture
analysis, tools for automatic computational visual analysis of facial and
gestural communication, and computational tools for finding and analyzing
co-speech gesture in a corpus.

Subsequent messages from the conference organizers will provide the usual
logistical information, including a link to a conference website, and explain
how to submit abstracts for parallel sessions and for a poster session. The
conference will run from about 1:30pm 9 June to about 6:15pm 11 June 2017,
Central European Summer Time.

We thank, for their generous support of the conference, Osnabrück University,
Case Western Reserve University, and the Anneliese-Maier Research Award
program of the Alexander von Humboldt Foundation.


Final Call for Papers:

Dear researchers in various fields involving multimodal communication:

We have received many high-quality abstracts for ICMC2017, but also several
alerts that, because the conference was announced mostly through linguistics
communities (cogling-l, funknet, cxg), some groups of multimodal researchers
missed the call altogether or learned of it too late.

Since the mission of ICMC2017 is to bring together researchers from currently
separate fields and encourage them to collaborate, we are extending the
submission deadline to 15 February 2017.

We will now begin reviewing the abstracts we have already received and
communicating decisions, and will conduct an additional round of review for
abstracts received by 15 February 2017.

Thanks to you all for your work and advice! 

International Conference on Multimodal Communication
https://sites.google.com/a/case.edu/icmc2017/

We encourage presentations on any aspect of multimodal communication,
including topics that deal with language and multimodal constructions,
paralinguistics, facial expressions, gestures, cinema, theater, and
role-playing games. The research domains can be drawn from personal
interaction, social media, mass media, group communication, and other areas.
We invite conceptual papers, observational studies, experiments, and
computational, technical, and statistical approaches.

All submissions will be reviewed for acceptance.

Instructions: 

Submit an abstract by emailing it to icmc2017 at gmail.com 

Requirements: 

1. The subject line of the email message must be one of two phrases, either: 

''abstract for parallel session'' 
or 
''abstract for poster session'' 

2. At the top of the body of the message, provide: 

- Name 
- Affiliation 
- Email address 
- Abstract Title 

3. Content: maximum of 500 words + maximum of 15 references 

4. Avoid explicit personal identifiers in the body of the email message,
except for the four elements listed under (2). This will make it easier for
the organizers to administer double-blind reviewing.




------------------------------------------------------------------------------
