20.2001, Calls: Computational Linguistics/Natural Language Engineering (Jrnl)



LINGUIST List: Vol-20-2001. Wed May 27 2009. ISSN: 1068 - 4875.

Subject: 20.2001, Calls: Computational Linguistics/Natural Language Engineering (Jrnl)

Moderators: Anthony Aristar, Eastern Michigan U <aristar at linguistlist.org>
            Helen Aristar-Dry, Eastern Michigan U <hdry at linguistlist.org>
 
Reviews: Randall Eggert, U of Utah  
       <reviews at linguistlist.org> 

Homepage: http://linguistlist.org/

The LINGUIST List is funded by Eastern Michigan University, 
and donations from subscribers and publishers.

Editor for this issue: Fatemeh Abdollahi <fatemeh at linguistlist.org>
================================================================  


===========================Directory==============================  

1)
Date: 25-May-2009
From: Marco Pennacchiotti < pennac at yahoo-inc.com >
Subject: Natural Language Engineering
 

	
-------------------------Message 1 ---------------------------------- 
Date: Wed, 27 May 2009 22:38:21
From: Marco Pennacchiotti [pennac at yahoo-inc.com]
Subject: Natural Language Engineering


Full Title: Natural Language Engineering 


Linguistic Field(s): Computational Linguistics 

Call Deadline: 30-Jun-2009 

In recent decades, vector space models (VSMs) have received growing attention
in different fields of Artificial Intelligence, ranging from natural language
processing (NLP) and cognitive science to vision analysis and applications in
the humanities. The basic idea of a VSM is to represent entities as vectors in
a geometric space, so that their similarity can be measured with distance
metrics in that space. VSMs have been shown to successfully model and solve a
variety of problems, such as metaphor detection and analysis, priming,
discourse analysis, and information retrieval.
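
As an illustration of this idea (not part of the call itself), here is a
minimal Python/NumPy sketch in which hypothetical words are represented by
invented context-count vectors and compared with cosine similarity:

    import numpy as np

    # Hypothetical 4-dimensional context counts for three words.
    # Rows are entities (words); columns are arbitrary context features.
    vectors = {
        "car":    np.array([12.0, 3.0, 0.0, 1.0]),
        "truck":  np.array([10.0, 4.0, 1.0, 0.0]),
        "banana": np.array([0.0,  1.0, 9.0, 7.0]),
    }

    def cosine(u, v):
        """Cosine similarity: near 1.0 for near-parallel vectors, near 0.0 for orthogonal ones."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(vectors["car"], vectors["truck"]))   # high: similar contexts
    print(cosine(vectors["car"], vectors["banana"]))  # low: dissimilar contexts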

In computational linguistics, the Distributional Hypothesis leverages the
notion of a VSM to model the semantics of words and other linguistic entities.
The hypothesis was developed independently in a number of works and has since
been applied in a variety of settings. At its core, it states that 'a word is
defined by the company it keeps', i.e. by the set of linguistic contexts in
which it appears.
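
As a concrete toy illustration of the hypothesis (hypothetical throughout:
the three-sentence corpus, the window size, and the raw-count weighting are
invented for this sketch), one can build a word-by-context co-occurrence
matrix and compare words by their context distributions:

    import numpy as np
    from collections import defaultdict

    # Toy corpus; a realistic setting would use a large corpus and a
    # weighting scheme such as PMI instead of raw counts.
    corpus = [
        "the cat drinks milk",
        "the dog drinks water",
        "the cat chases the dog",
    ]

    window = 1  # symmetric context window (an illustrative choice)
    counts = defaultdict(lambda: defaultdict(int))
    for sentence in corpus:
        tokens = sentence.split()
        for i, w in enumerate(tokens):
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    counts[w][tokens[j]] += 1

    vocab = sorted({w for s in corpus for w in s.split()})
    M = np.array([[counts[w][c] for c in vocab] for w in vocab], dtype=float)

    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

    # Words sharing contexts ("the", "drinks") end up with similar rows.
    print(cosine(M[vocab.index("cat")], M[vocab.index("dog")]))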

Despite the growing popularity of distributional approaches, the existing
literature raises issues on many important aspects that have yet to be
addressed. Examples include: the need for in-depth comparative analyses of the
semantic properties captured by different types of distributional models; the
application of new geometric approaches, such as quantum logic operators or
tensor decomposition; the study of the interaction between distributional
approaches and supervised machine learning, such as the adoption of kernel
methods based on distributional information; and the application of
distributional techniques in real-world applications and in other fields.
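
To illustrate the last interaction mentioned above, here is a hypothetical
sketch of a kernel built from distributional information: a Gram matrix of
cosine similarities over invented distributional vectors, which could then be
plugged into any kernel machine (the vectors, labels, and the commented
scikit-learn call are illustrative assumptions, not a method prescribed by
this call):

    import numpy as np

    # Hypothetical distributional vectors for labelled instances (e.g. word
    # pairs in a relation-extraction task); in practice these would be
    # derived from a corpus.
    X = np.array([
        [0.9, 0.1, 0.0],
        [0.8, 0.2, 0.1],
        [0.1, 0.0, 0.9],
        [0.0, 0.2, 0.8],
    ])
    y = np.array([1, 1, 0, 0])

    def distributional_kernel(A, B):
        """Gram matrix of cosine similarities between distributional vectors."""
        A_n = A / np.linalg.norm(A, axis=1, keepdims=True)
        B_n = B / np.linalg.norm(B, axis=1, keepdims=True)
        return A_n @ B_n.T

    # A precomputed Gram matrix can be passed to a kernel machine, e.g.:
    # from sklearn.svm import SVC
    # clf = SVC(kernel="precomputed").fit(distributional_kernel(X, X), y)
    print(distributional_kernel(X, X).round(2))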

Topics

The goal of the special issue is to offer a common journal venue in which to
gather and summarize the state of the art on distributional techniques applied
to lexical semantics, a cornerstone of computational linguistics research. As
a side effect, the aim is also to propose a systematic and harmonized view of
the work carried out independently by different researchers in recent years,
which has sometimes resulted in diverging and somewhat inconsistent uses of
terminology and axiomatizations. A further goal is to increase awareness in
the computational linguistics community of cutting-edge studies on geometric
models, machine learning applications, and experiences in different scientific
fields.

In particular, the special issue focuses on the following areas of interest,
building on the topics proposed for the GEMS workshop (EACL 2009, Athens,
http://art.uniroma2.it/gems):

- Comparative analysis of different distributional spaces (document-based,
word-based, syntax-based, and others) and their parameters (dimensionality,
corpus size, etc.)
- Eigenvector methods (e.g. Singular Value and Tucker Decomposition; see the
sketch after this list)
- Higher order tensors and Quantum Logic extensions
- Feature engineering in machine learning models
- Computational complexity and evaluation issues
- Graph-based models over semantic spaces
- Logic and inference in semantic spaces
- Cognitive theories of semantic space models
- Applications in the humanities and social sciences
- Application of distributional approaches in:
       - Word sense disambiguation and discrimination
       - Selectional preference induction
       - Acquisition of lexicons and linguistic patterns
       - Conceptual clustering
       - Kernel methods for NLP (e.g. relation extraction and textual
entailment)
       - Quantitative extensions of Formal Concept Analysis
       - Modeling of linguistic and ontological knowledge
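
To make one of the topics above concrete, the following is a minimal,
hypothetical sketch of an eigenvector method: truncated Singular Value
Decomposition over an invented word-by-context matrix, in the spirit of
Latent Semantic Analysis (the matrix values and the choice of k are
illustrative assumptions):

    import numpy as np

    # Hypothetical word-by-context co-occurrence matrix
    # (rows: words, columns: contexts).
    M = np.array([
        [3., 2., 0., 1.],
        [2., 3., 1., 0.],
        [0., 1., 4., 3.],
        [1., 0., 3., 4.],
    ])

    # Truncated SVD: keep the k strongest latent dimensions and
    # re-embed the words there.
    k = 2
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    word_embeddings = U[:, :k] * S[:k]   # each row: a k-dimensional word vector

    print(word_embeddings.round(2))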

For more information please see: http://art.uniroma2.it/jnle

Call Deadline: 30-Jun-2009




-----------------------------------------------------------
LINGUIST List: Vol-20-2001	

	


