<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-15">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<b>2 PhD Studentships available at Saarland University,
Saarbruecken, Germany</b><b><br>
</b><br>
The Embodied Spoken Interaction (ESI) group at Saarland
University, headed by Dr. Maria Staudte, invites applications for
two PhD studentships, available from September 1st, 2013 (75% TV-L E13):<br>
<br>
<br>
1. "Joint Attention in Human-Agent-Interaction" in
Psycholinguistics/Human-Agent-Interaction
<br>
<br>
<meta http-equiv="content-type" content="text/html;
charset=ISO-8859-15">
People interact using speech as well as other non-verbal cues. Gaze,
for instance, ubiquitously accompanies utterances in face-to-face
interaction and may provide additional referential information from
the speaker or may reveal whether the listener has understood.
Following a partner's gaze and interpreting their posture and
gestures can not only enrich communication but may even be essential
to its success and efficiency.
Understanding and modeling the dynamic interplay of speech and
non-verbal cues such as gaze, and how they combine to encode a
particular message, is a complex enterprise. The use of virtual
agents as interaction partners provides one way to approach this
problem: the artificial partner introduces a precise, controllable,
and yet dynamic component to the interaction with humans. Using
such agents as test beds enables us to observe, model, and test
complex multi-modal behaviors. <br>
<br>
The proposed PhD project will develop interactive behaviors of a
virtual character using state-of-the-art virtual agent software and
modern eye- and motion-tracking systems. Besides this development
component, the research is also empirical and will involve the
design of user studies and data analysis in order to tackle
questions such as how joint attention, gestures, and facial
expressions are employed and how they affect spoken content. <br>
<br>
Applicants should hold a Master's degree in computational linguistics,
computer science, cognitive science, psychology or psycholinguistics
(or equivalent)
and should have an interest in modeling and understanding the
dynamics of human interaction. Basic programming skills are
necessary. Experience with experiment design and statistics is an
advantage but not required. Most importantly, the successful
applicant should be enthusiastic about the general research
questions and be prepared to learn new methods. <br>
<br>
***<br>
<br>
2. "Interpreting Listener Behavior to Inform Navigational Guidance" in
NLP/Psycholinguistics<br>
<br>
<meta http-equiv="content-type" content="text/html;
charset=ISO-8859-15">
People look at what is being talked about. By following a referring
expression (RE) to its referent, a person can ground that
expression in the environment and fully understand and validate the
utterance that contains it. In turn, fixating the intended referent
signals to the speaker that the listener has understood. Thus,
listener eye-movements fulfill several roles: while (privately)
seeking visual information linked to the utterance content, they
also (publicly) signal (un-)successful reference resolution or,
more generally, the listener's belief states to the speaker.
In a dynamic and complex environment, such as the GIVE challenge, in
which other tasks are involved as well, it becomes increasingly
difficult to infer <i>what</i> precisely listeners understand and intend
to do next, and <i>how</i>, e.g., their eye-movements help to infer this.
Understanding listener behavior, and how an artificial speaker (e.g.
a dialog system) can exploit this information most efficiently, will
be a major concern of this research project.<br>
<br>
One possible PhD project would thus consist of developing strategies
and algorithms for a system that is informed about the user's visual
attention at any given time by modern remote eye-tracking
technology. Besides this development component, this line of research
is also empirical and will involve the design of user studies and
data analysis to develop and test these strategies. Alternative
settings, such as in-car tracking along with navigational
instructions, are also conceivable in order to explore effective
ways of using listener gaze to give efficient and safe
instructions. <br>
<br>
Applicants should hold a Master's degree in computational linguistics,
computer science, cognitive science or equivalent, and should have
an interest in modeling and understanding the dynamics of spoken
interaction. Good programming skills are necessary. Experience with
experiment design and statistics is an advantage but not required.
Most importantly, the successful applicant should be enthusiastic
about the general research questions and be prepared to learn new
methods. <br>
<br>
***<br>
<br>
The Embodied Spoken Interaction (ESI) group is part of the <a
href="http://www.mmci.uni-saarland.de/">"Multi-Modal Computing and
Interaction" Cluster of Excellence</a> at Saarland University
which provides a very fruitful and constructive research environment
with excellent opportunities for exchange and cooperation. The group
has access to numerous state-of-the-art eye-tracking laboratories, a
64 channel EEG/ERP lab, and modern computing infrastructure, and
conducts research at the level of international excellence. <br>
<br>
Successful candidates will be expected to contribute to the high standards
of the group and to be actively involved in the preparation and
publication of new results. Further information about the group can
be found at: <a
href="http://www.mmci.uni-saarland.de/en/independent_research_groups/esi">
http://www.mmci.uni-saarland.de/en/independent_research_groups/esi
</a> <br>
<br>
Applicants should submit a research statement, a CV, copies of
their school and university degree certificates, a representative
reprint (thesis or paper, if applicable), and the names and contact
information of two references. The positions remain open until
filled, but preference will be given to applications received by
<b>1 August</b>.
All documents should be e-mailed as a single PDF to: masta AT coli
DOT uni-saarland DOT de<br>
<br>
<br>
Thanks and Best Regards,<br>
Maria Staudte<br>
<br>
</body>
</html>