34.1358, Calls: Learning with Small Data

LINGUIST List: Vol-34-1358. Fri Apr 28 2023. ISSN: 1069 - 4875.

Subject: 34.1358, Calls: Learning with Small Data

Moderator: Malgorzata E. Cavar, Francis Tyers (linguist at linguistlist.org)
Managing Editor: Lauren Perkins
Team: Helen Aristar-Dry, Steven Franks, Everett Green, Joshua Sims, Daniel Swanson, Matthew Fort, Maria Lucero Guillen Puon, Zackary Leech, Lynzie Coburn
Jobs: jobs at linguistlist.org | Conferences: callconf at linguistlist.org | Pubs: pubs at linguistlist.org

Homepage: http://linguistlist.org

Please support the LL editors and operation with a donation at:
           https://funddrive.linguistlist.org/donate/

Editor for this issue: Everett Green <everett at linguistlist.org>
================================================================


Date: 28-Apr-2023
From: Ellen Breitholtz [ellen.breitholtz at ling.gu.se]
Subject: Learning with Small Data


Full Title: Learning with Small Data
Short Title: LSD

Date: 11-Sep-2023 - 12-Sep-2023
Location: Gothenburg, Sweden
Contact Person: Ellen Breitholtz
Meeting Email: ellen.breitholtz at ling.gu.se
Web Site:
https://sites.google.com/view/learning-with-small-data/home?authuser=0

Linguistic Field(s): Computational Linguistics; General Linguistics

Call Deadline: 12-May-2023

Meeting Description:

There is now an acute need for intensive research on the possibility
of effective learning with small data. Our 2023 conference, LSD, is
devoted to work on this problem, with application to computational
linguistics.

Why is there this need? Current deep learning systems require large
amounts of data in order to yield optimal results. Neural language
models now have many billions of parameters and are trained on data
sets that are terabytes in size. With these resources, they have
achieved remarkable success across a wide range of tasks in Natural
Language Processing, and in AI generally. But these systems have a
number of limitations which require closer attention:

First, the models take a long time to pretrain, and they are difficult
to modify. As a result, much research in NLP is shaped by what one can
achieve with large transformers. This has marginalised important
computational learning questions for which they are not well suited.

Second, because of the heavy resources required to develop them, they
have become the preserve of tech companies. Researchers at most
universities and smaller centres are now positioned as consumers of
these systems, limited to fine tuning them for experimental work on
downstream tasks.

Third, the complexity, size, and mode of computation of transformers
have rendered the way in which they acquire the generalisations
extracted from data largely opaque. This has made it difficult to
understand precisely why they succeed, or fail, where they do.

Finally, comparison with human learning and representation has become
increasingly difficult, given the large disparity in accessible data
and learning time between transformers and humans. As a result, the
cognitive interest of deep learning has receded.

These reasons alone are sufficient to motivate us at CLASP to bring
fellow researchers together for an organized discussion. We welcome
original contributions in all areas of NLP and related domains of AI
that address aspects of this issue.

Call for Papers:

Learning with Small Data (LSD) will bring together researchers from
various areas to discuss the sustainability of current
state-of-the-art methods in computational linguistics which rely on
very large models, such as GPT-2, GPT-3, BERT, and XLNet. The
conference encourages contributions from machine learning,
computational linguistics, theoretical linguistics, philosophy,
cognitive science, and psycholinguistics, as well as from artificial
intelligence ethics and social policy. We hope to see innovative
technical proposals, and we will cultivate a wide spectrum of views
within a lively dialog on the issues that the conference addresses.

For the full call for papers visit our website:
https://sites.google.com/view/learning-with-small-data/call-for-papers



------------------------------------------------------------------------------


LINGUIST List is supported by the following publishers:

American Dialect Society/Duke University Press http://dukeupress.edu

Bloomsbury Publishing (formerly The Continuum International Publishing Group) http://www.bloomsbury.com/uk/

Brill http://www.brill.com

Cambridge Scholars Publishing http://www.cambridgescholars.com/

Cambridge University Press http://www.cambridge.org/linguistics

Cascadilla Press http://www.cascadilla.com/

De Gruyter Mouton https://cloud.newsletter.degruyter.com/mouton

Dictionary Society of North America http://dictionarysociety.com/

Edinburgh University Press www.edinburghuniversitypress.com

Equinox Publishing Ltd http://www.equinoxpub.com/

European Language Resources Association (ELRA) http://www.elra.info

Georgetown University Press http://www.press.georgetown.edu

John Benjamins http://www.benjamins.com/

Lincom GmbH https://lincom-shop.eu/

Linguistic Association of Finland http://www.ling.helsinki.fi/sky/

Multilingual Matters http://www.multilingual-matters.com/

Narr Francke Attempto Verlag GmbH + Co. KG http://www.narr.de/

Netherlands Graduate School of Linguistics / Landelijke (LOT) http://www.lotpublications.nl/

Oxford University Press http://www.oup.com/us

Wiley http://www.wiley.com


----------------------------------------------------------
LINGUIST List: Vol-34-1358
----------------------------------------------------------


