[HPSG-L] 4REAL Workshop, 1st Cfp : LREC Workshop on Reproducibility

António Branco antonio.branco at di.fc.ul.pt
Sat Dec 19 08:33:54 UTC 2015


[apologies for duplicates; please help disseminate]





================ FIRST CALL FOR PAPERS ================


4REAL Workshop
Workshop on Research Results Reproducibility and
Resources Citation in Science and Technology of Language

28 May 2016
Portorož, Slovenia

Co-located with LREC 2016
10th Language Resources and Evaluation Conference
http://lrec2016.lrec-conf.org




Important dates:

15 January 2016: Last call for papers
15 February 2016: Deadline for submissions
15 March 2016: Notification of authors
15 April 2016: Deadline for camera-ready
28 May 2016: Workshop






Call for papers:

The discussion on research integrity has grown in importance as
the resources allocated to scientific activities, and the societal
impact of those activities, have been expanding (e.g. Stodden, 2013;
Aarts et al., 2015), to the point that it has recently crossed the
borders of the research world, made its appearance in major mass
media and been brought to the attention of the general public
(e.g. Naik, 2011; Zimmer, 2012; Begley, 2012; The Economist, 2013).

The immediate motivation for this increased interest lies in
a number of factors, including the realization that for some
published results replication cannot be obtained
(e.g. Prinz et al., 2011; Begley and Ellis, 2012); that there may be
problems with the commonly accepted reviewing procedures, where
deliberately falsified submissions, with fabricated errors and
fake authors, get accepted even in respectable journals
(e.g. Bohannon, 2013); and that the prevalence of misconduct among
researchers, as revealed by surveys of scientists on questionable
practices, is higher than one might expect or be ready
to accept (e.g. Fanelli, 2009); among several others.

Doing justice to and building on the inherent ethos of scientific
inquiry, this issue has been subjected to thorough examination,
leading to scrutiny of its possible immediate causes and underlying
factors, and to initiatives to respond to the challenge, namely
the setting up of dedicated conferences (e.g. WCRI – World Conference
on Research Integrity), dedicated journals (e.g. RIPR – Research
Integrity and Peer Review), support platforms (e.g. COS – Center for
Open Science), revised and more stringent procedures (e.g. Nature, 2013),
batch replication studies (e.g. Aarts et al., 2015), investigations
of misconduct (e.g. Hvistendahl, 2013), etc.

This workshop seeks to foster discussion of, and progress on, a topic
that has so far been given insufficient attention in the research area
of language processing tools and resources (Branco, 2013; Fokkens et al.,
2013) but has emerged as an important topic in other scientific areas:
the reproducibility of research results and the citation of resources,
and their impact on research integrity.

We thus invite submissions of articles that present pioneering
cases, with either positive or negative results, of actual replication
exercises of previously published results in our area. We are also
interested in articles discussing the challenges, risk factors,
procedures, etc. that are specific to our area or that should be
adopted, or adapted from neighboring areas, possibly including,
of course, the new risks raised by the replication articles themselves
and their own integrity, in view of preserving the reputation
of the colleagues and works whose results are reported as having been
replicated, etc.

By the same token, this workshop is also interested in articles
addressing methodologies for monitoring, maintaining or improving
the citation of language resources and tools, and for assessing
the importance of data citation for research integrity and for
the advancement of natural language science and technology.




Organization committee:

António Branco (University of Lisbon)
Nicoletta Calzolari (ILC)
Khalid Choukri (ELRA)




Program committee:

António Branco (University of Lisbon)
Iryna Gurevych (Universität Darmstadt)
Isabel Trancoso (INESC)
Joseph Mariani (CNRS/LIMSI)
Justus Roux (NWVU)
Khalid Choukri (ELRA)
Maria Gavrilidou (ILSP)
Marko Grobelnik (Jozef Stefan Institute)
Marko Tadic (University of Zagreb)
Mike Rosner (University of Malta)
Nicoletta Calzolari (ILC)
Nick Campbell (Trinity College Dublin)
Senja Pollak (Jozef Stefan Institute)
Stelios Piperidis (ILSP)
Steven Krauwer (University of Utrecht)
Thierry Declerck (DFKI)
Torsten Zesch (University of Duisburg-Essen)
Yohei Murakami (Language Grid Japan)




"Share your LRs!" initiative:

When submitting a paper from the START page, authors will be asked
to provide essential information about resources (in a broad sense,
i.e. also technologies, standards, evaluation kits, etc.) that have
been used for the work described in the paper or are a new result
of their research. Moreover, ELRA encourages all LREC authors to share
the described LRs (data, tools, services, etc.) to enable their reuse
and the replicability of experiments (including evaluation ones).




References:

Aarts et al., 2015, "Estimating the reproducibility of psychological
science", Science.
Begley, 2012, "In cancer science, many 'discoveries' don't hold up",
Reuters, March 28, online.
Begley and Ellis, 2012, "Drug development: Raise standards for
preclinical cancer research", Nature.
Bohannon, John, 2013, "Who's Afraid of Peer Review?", Science.
Branco, António, 2013, "Reliability and Meta-reliability of Language
Resources: Ready to Initiate the Integrity Debate?", TLT12.
COS, Center for Open Science.
The Economist, 2013, "Unreliable Research: Trouble at the Lab",
The Economist, October 19, 2013, online.
Fanelli, 2009, "How Many Scientists Fabricate and Falsify Research?
A Systematic Review and Meta-Analysis of Survey Data", PLoS ONE.
Fokkens et al., 2013, "Offspring from Reproduction Problems: What
Replication Failure Teaches Us", ACL.
Hvistendahl, 2013, "China's Publication Bazaar", Science.
Naik, Gautam, 2011, "Scientists' Elusive Goal: Reproducing Study
Results", The Wall Street Journal.
Nature, 2013, "Announcement: Reducing our irreproducibility", Nature,
Editorial.
Prinz et al., 2011, "Believe it or not: how much can we rely on
published data on potential drug targets?", Nature Reviews Drug
Discovery 10, 712.
RIPR, Research Integrity and Peer Review.
Stodden, 2013, "Resolving Irreproducibility in Empirical and
Computational Research", IMS Bulletin Online.
WCRI, World Conference on Research Integrity.
Zimmer, Carl, 2012, "A Sharp Rise in Retractions Prompts Calls for
Reform", The New York Times.













