Call for Papers

The Fourth Nordic Symposium on Multimodal Communication

University of Gothenburg, November 15-16, 2012

The 4th Nordic Symposium on Multimodal Communication aims to provide a multidisciplinary forum for researchers from different disciplines who study multimodality in human communication as well as in human-computer interaction.

The symposium is organized by the SCCIIL Interdisciplinary Research Centre and the Division for Communication and Cognition at the Department of Applied IT, University of Gothenburg, together with the NOMCO (Multimodal Corpora for the Nordic Countries) NORDCORP project (http://sskkii.gu.se/nomco/). The invited lecturers have been chosen to represent different aspects of multimodal communication relevant to researchers in the Nordic countries.

Human communication is naturally multimodal, involving the interaction of visual, auditory and other sensory modalities in the use of speech, facial expressions, head movements, hand gestures and body postures. Multimodal communication is a rapidly growing area of research. One reason for this is the increased interest in embodied and situated communication: how humans interact with each other using different modalities, how different artefacts in the communicative activity affect the interaction, and how humans and animals interact, for example in riding therapy. Workplaces as well as school environments, health care and other services increasingly involve complex multimodal communication. Developments in mobile media, robotics and related technologies make new multimodal technical solutions to communicative challenges possible, while at the same time creating new challenges for communication research.

The past two decades have witnessed numerous initiatives and research efforts to improve the state of the art in how multimodal communication can be described in its own right, as well as modelled and exploited in computer systems. Such efforts include, among others, the collection and annotation of multimodal data, the development of methods for the capture and interpretation of multimodal behaviour, and the modelling and generation of multimodal data.
The symposium continues a tradition established by the Swedish Symposia on Multimodal Communication held in 1997, 1998, 1999 and 2000, and continued by the two Nordic Symposia on Multimodal Communication held in 2003 and 2005, the workshop held in Odense in 2009, and the Third Nordic Symposium on Multimodal Communication held in Helsinki in 2011.

The symposium will be of interest to anyone studying multimodal communication. Although it aims to provide a picture of the current state of the art of multimodal research in the Nordic countries, researchers from outside the Nordic region are most welcome.

Invited speakers

Karl Grammer
Dirk Heylen
Michael Kipp
Maja Pantic
Daniel Västfjäll

Topics

Topics addressed at the symposium cover all aspects of multimodal communication, such as:

* Speech and gestures in human communication
* Intercultural aspects of multimodal behaviour
* Multimodal aspects of language acquisition
* Multimodal human-computer interaction and conversational agents
* Multimodal human-animal communication
* Multimodal health communication
* Multimodal communication, communication disorders and communication support (AAC)
* Multimodal dialogue systems
* Multimodal corpora
* Annotation schemes and tools for multimodal corpora
* Automatic recognition and interpretation of different modalities
* Modality fusion
* Motion capture
* Eye-gaze recognition
* Machine-learning techniques applied to multimodal data
* Evaluation methods for multimodal systems

Documentation and dissemination

The conference proceedings, containing peer-reviewed abstracts of 1-2 pages, will be published online and distributed at the conference. A selection of papers for a special issue of a relevant journal will be discussed at the symposium.
Organizing committee

Jens Allwood
Elisabeth Ahlsén
Patrizia Paggio
Costanza Navarretta
Kristiina Jokinen

Program committee

Nick Campbell
Jens Edlund
Joakim Gustafsson
Pentti Haddington
Bart Jongejan
Arne Jönsson
Matthias Rehm
Alexandra Weilenmann
Nataliya Berbyuk Lindström
Mikael Jensen