[Corpora-List] Call for Participation: The Joint Student Response Analysis and 8th Recognizing Textual Entailment Challenge

Myroslava Dzikovska mdzikovs at inf.ed.ac.uk
Tue Jan 22 16:44:18 UTC 2013


Please distribute widely - Apologies for cross-posting

*******************************************

SECOND CALL FOR PARTICIPATION

THE JOINT STUDENT RESPONSE ANALYSIS AND 8TH RECOGNIZING TEXTUAL
ENTAILMENT CHALLENGE
(SemEval-2013 Task 7)


HIGHLIGHTS:

- TASK REGISTRATION DEADLINE: February 15, 2013, at http://bit.ly/XUcQ0v
- CHALLENGE DURATION: from February 25, 2013 to MARCH 15, 2013
- JOINT CHALLENGE EMAIL DISCUSSION GROUP: http://bit.ly/XwcwCR
- DETAILED TASK DESCRIPTION: http://bit.ly/10nQTGJ
- UPDATED EVALUATION CODE AND BASELINES: http://bit.ly/140DHMH
- SemEval WORKSHOP: June 13-14, 2013, co-located with NAACL
  (http://bit.ly/140uIuN) & *Sem (http://bit.ly/WknhYY),
  Atlanta, Georgia, USA
				

INTRODUCTION

We are pleased to invite participants to the Joint Student Response
Analysis and 8th Recognizing Textual Entailment Challenge at SemEval
2013, a joint effort of the educational technology and textual
inference communities to present a unified scientific challenge to
researchers in both fields. The challenge will be run as part of the
SemEval-2013 Semantic Evaluation Exercise, co-located with the *SEM and
NAACL-2013 conferences.

MAIN TASK

The goal of the task is to produce an assessment of student answers to
explanation and definition questions typically asked in practice
exercises, tests, or tutorial dialogue.  It is linked to the tasks of
semantic analysis and recognizing textual entailment in the textual
inference community, and to essay grading and short answer assessment
in the educational NLP community.

Specifically, given a question, a known correct "reference answer" and a
1- or 2-sentence "student answer", the Main task consists of assessing
the correctness of the student answer at one of several levels of
granularity: 5-way (correct, partially correct, contradictory,
irrelevant, not in the domain), 3-way (correct, contradictory,
incorrect), or 2-way (correct, incorrect). Participants can opt to
carry out the task at any of these levels of granularity.
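
For illustration only, the following minimal Python sketch shows the shape
of a trivial 2-way system. It is not the official evaluation code or
baseline; the word-overlap heuristic, the threshold, and the example
instance are all invented for this sketch.

# Illustrative sketch only -- not the official baseline or evaluation code.
# Assumes a simple instance format (reference answer, student answer) and a
# naive content-word-overlap heuristic for the 2-way task.

def tokenize(text):
    """Lowercase, split on whitespace, and strip basic punctuation."""
    return [w.strip(".,;:!?\"'()") for w in text.lower().split()]

def two_way_label(reference_answer, student_answer, threshold=0.5):
    """Label a student answer 'correct' or 'incorrect' by word overlap.

    The threshold and overlap measure are arbitrary choices for
    illustration; real systems use textual entailment techniques or
    trained classifiers instead.
    """
    ref = set(tokenize(reference_answer))
    stu = set(tokenize(student_answer))
    if not ref:
        return "incorrect"
    overlap = len(ref & stu) / len(ref)
    return "correct" if overlap >= threshold else "incorrect"

# Invented example instance:
reference = "The bulb stops glowing because the circuit is no longer closed."
student = "The circuit is broken so it is not closed any more."
print(two_way_label(reference, student))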

This task is similar to essay grading and short answer assessment.
However, meaningful labels are used instead of numerical scores, so that
the information needed to give effective tutorial feedback on the
specific problems found in student answers can be extracted. Developers
of educational NLP and essay grading systems are encouraged to adapt
their scoring techniques to this categorical labeling problem.

The Student Response Analysis task is also closely related to the
notion of textual entailment. In most correct answers, at least a
substantial portion of the reference answer will be entailed by the
student answer, even if not all of it. Developers of textual entailment
engines are therefore encouraged to apply textual entailment techniques
and test the potential contribution of their systems to the student
response assessment problem.

PILOT TASK ON PARTIAL ENTAILMENT

An additional pilot task on partial entailment is offered as part of the
challenge, where systems may recognize that specific parts of the
hypothesis are entailed by the text, even though entailment might be not
recognized for the hypothesis as a whole. Such recognition of partial
entailment may have various uses in the educational setting based on
identifying the missing parts in the student answer, and may similarly
have value in other applications such as summarization or question
answering.
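
The Python sketch below (again, not part of the challenge materials)
checks coverage facet by facet, assuming an invented word-pair
representation of hypothesis facets and a deliberately naive coverage
test.

# Minimal illustration of partial entailment: decide, facet by facet,
# which pieces of the hypothesis (e.g. a reference answer) are covered by
# the text (e.g. a student answer). The word-pair facet representation and
# the coverage test are assumptions made purely for illustration.

def facet_entailed(facet, text_tokens):
    """A facet (pair of content words) counts as covered if both words
    occur in the text. Real systems would use alignment, lexical
    resources, and syntactic/semantic matching instead."""
    return all(word in text_tokens for word in facet)

hypothesis_facets = [("circuit", "closed"), ("bulb", "glowing")]  # invented
text = "the circuit is closed so current flows"                   # invented
tokens = set(text.split())

for facet in hypothesis_facets:
    status = "entailed" if facet_entailed(facet, tokens) else "not entailed"
    print(facet, "->", status)

In this invented example the first facet is covered while the second is
not, even though the hypothesis as a whole is not entailed.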

SCHEDULE

- February 15, 2013 	REGISTRATION DEADLINE
***
CHALLENGE
- MAIN TASK test set release: Feb. 25
- MAIN TASK submissions: March 4
- PILOT TASK test set release: March 5
- PILOT TASK  submissions: March 12
- Results to participants: March 15
***
- April 9, 2013 	Paper submission deadline [TBC]
- April 23, 2013 	Reviews Due [TBC]
- May 4, 2013 		Camera ready Due [TBC]
***
SemEval WORKSHOP
- June 13-14, 2013, co-located with NAACL & *Sem, Atlanta, Georgia, USA


PARTICIPATING IN THE CHALLENGE

More details on the Main and Pilot tasks are available at the SEMEVAL
2013 website: http://bit.ly/10nQTGJ.

Future updates will be distributed through the Joint Challenge Google
Discussion Group at http://bit.ly/XwcwCR; please join to ensure that you
receive announcements and status updates.

To help us with forward planning, please register now at the SemEval
2013 website (http://bit.ly/XUcQ0v) if you intend to enter the
challenge. All teams are required to register by February 15, 2013.


ORGANIZERS

Myroslava Dzikovska (m.dzikovska at ed.ac.uk), University of Edinburgh, UK
[Primary contact]
Rodney Nielsen (Rodney.Nielsen at UNT.edu), University of North Texas, USA
Chris Brew (christopher.brew at gmail.com), Educational Testing Service
(ETS), USA
Claudia Leacock (claudia_leacock at mcgraw-hill.com), CTB McGraw-Hill, USA

Luisa Bentivogli (bentivo at fbk.eu), CELCT and FBK, Italy
Peter Clark (peterc at vulcan.com), Vulcan Inc., USA
Ido Dagan (dagan at cs.biu.ac.il), Bar-Ilan University, Israel
Hoa Trang Dang (hoa.dang at nist.gov), NIST, USA
Danilo Giampiccolo (giampiccolo at celct.it), CELCT, Italy


