

LINGUIST List: Vol-32-986. Wed Mar 17 2021. ISSN: 1069 - 4875.

Subject: 32.986, Support: Computational Linguistics: PhD, Labex Empirical Foundations of Linguistics EFL

Moderator: Malgorzata E. Cavar (linguist at linguistlist.org)
Student Moderator: Jeremy Coburn, Lauren Perkins
Managing Editor: Becca Morris
Team: Helen Aristar-Dry, Everett Green, Sarah Robinson, Nils Hjortnaes, Joshua Sims, Billy Dickson
Jobs: jobs at linguistlist.org | Conferences: callconf at linguistlist.org | Pubs: pubs at linguistlist.org

Homepage: http://linguistlist.org

Please support the LL editors and operation with a donation at:
           https://funddrive.linguistlist.org/donate/

Editor for this issue: Becca Morris <becca at linguistlist.org>
================================================================


Date: Wed, 17 Mar 2021 14:19:56
From: Christel Préterre [christel.preterre at sorbonne-nouvelle.fr]
Subject: Computational Linguistics: PhD, Labex Empirical Foundations of Linguistics EFL, France

 Institution/Organization: Labex Empirical Foundations of Linguistics EFL 
Department:  
Web Address: http://www.labex-efl.fr 

Level: PhD 

Duties: Research
 
Specialty Areas: Computational Linguistics 
 

Description:

Call for applications
Fully-funded PhD position in computational linguistics at EFL LabEx in Paris
“Articulation of a compositional model of negation with distributional
semantics”

The EFL LabEx (Laboratory of Excellence "Empirical Foundations of Linguistics",
http://www.labex-efl.fr) is offering a 3-year, full-time PhD grant in
computational linguistics (approximately 1700 euros net per month), starting in
fall 2021 at Université Sorbonne Nouvelle (Paris).

Topic of the PhD:
The PhD topic falls within strand 5 of the LabEx, “Computational Semantic
Analysis”. We propose to work on the representation of negation within
pretrained transformer-based language models (Devlin et al., 2019). The aim
will be to study what form of compositionality emerges in these models, in
particular for the treatment of negation (in the same vein as Ettinger, 2019),
and what improvements can be made, in particular through the addition of
self-supervised tasks requiring more sophisticated linguistic information than
simple linear order (such as the addition of syntactic information, as in Xu
et al., 2020).
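
To illustrate the kind of diagnostic study envisaged (in the spirit of
Ettinger, 2019), the following minimal sketch probes how a pretrained masked
language model completes affirmative versus negated cloze contexts. It is not
part of the call; it assumes the HuggingFace transformers library and the
bert-base-uncased checkpoint, neither of which is prescribed by the posting.

    # Illustrative sketch only: cloze-style probe of negation handling in a
    # pretrained masked language model (assumes HuggingFace transformers).
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model="bert-base-uncased")

    # A model that composes negation correctly should not rank the same
    # completion highest in both the affirmative and the negated context.
    for sentence in ("A robin is a [MASK].", "A robin is not a [MASK]."):
        predictions = unmasker(sentence, top_k=3)
        completions = [p["token_str"] for p in predictions]
        print(sentence, "->", completions)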

Profile:
- The candidate must hold a Master's degree by November 2021, with a
specialization in computational linguistics or NLP.
- The candidate is expected to have some prior knowledge of deep learning
algorithms for NLP (including pretrained transformer-based language models)
and of semantics.
- The candidate will be affiliated with the Lattice laboratory
(https://www.lattice.cnrs.fr/) and with the graduate school ED 622
(http://www.univ-paris3.fr/ed-622-sciences-du-langage-3413.kjsp).
- Attendance at laboratory and doctoral seminars is expected; it may also be
possible to teach classes in linguistics or computational linguistics.

Application:
The application file should be sent by May 9 (midnight MET) to
pascal.amsili at ens.fr and
marie.candito at u-paris.fr

It should comprise:
- A CV (max. 5 pages), with Master's transcripts, diplomas, and internships
- A cover letter
- The names and contact details of two referees for reference letters

Candidates selected for interviews will be asked to send their Master's thesis
or other written work supporting their qualification for the project. They
will be interviewed (remotely) between the end of May and mid-June 2021.

References:
Devlin, J., Chang, M.-W., Lee, K. and Toutanova, K. (2019). “BERT: Pre-training
of Deep Bidirectional Transformers for Language Understanding”. In NAACL-HLT
2019.
Ettinger, A. (2019). “What BERT Is Not: Lessons from a New Suite of
Psycholinguistic Diagnostics for Language Models”. Transactions of the
Association for Computational Linguistics, 8:34–48.
Xu, Z., Guo, D., Tang, D., Su, Q., Shou, L., Gong, M., Zhong, W., Quan, X.,
Duan, N. and Jiang, D. (2020). “Syntax-Enhanced Pre-trained Model”.
arXiv:2012.14116.
 

Application Deadline: 09-May-2021 

Web Address for Applications: http://en.labex-efl.fr/post/fully-funded-phd-position-in-computational-linguistics 

Contact Information: 
	Mr Pascal Amsili 
	pascal.amsili at ens.fr  


------------------------------------------------------------------------------

***************************    LINGUIST List Support    ***************************
 The 2020 Fund Drive is under way! Please visit https://funddrive.linguistlist.org
  to find out how to donate and check how your university, country or discipline
     ranks in the fund drive challenges. Or go directly to the donation site:
                   https://crowdfunding.iu.edu/the-linguist-list

                        Let's make this a short fund drive!
                Please feel free to share the link to our campaign:
                    https://funddrive.linguistlist.org/donate/
 


----------------------------------------------------------
LINGUIST List: Vol-32-986	
----------------------------------------------------------





