[Dgkl] CfP: Computational approaches to language dynamics (Theme session at DGKL 2026)

Stefan Hartmann hartmast at hhu.de
Tue Feb 10 14:38:16 UTC 2026


*Computational approaches to language dynamics: Cognitive and 
constructional perspectives*

- Theme session at the 11th International Conference of the German 
Cognitive Linguistics Association (DGKL) 
<https://www.uni-bielefeld.de/fakultaeten/linguistik-literaturwissenschaft/forschung/arbeitsgruppen/germanistische-grammatikf/dgkl2026/index.xml>, 
August 31st to September 2nd, 2026, Bielefeld -


Convenors:

Bastian Bunzeck, University of Bielefeld, bastian.bunzeck at uni-bielefeld.de

Stefan Hartmann, HHU Düsseldorf, hartmast at hhu.de


Cognitive linguistics conceives of language as a complex adaptive system 
(Beckner et al. 2009). Modelling the intricacies of this system entails 
numerous challenges for any approach that aims at going beyond the mere 
description of linguistic facts and instead tries to offer an 
explanatory account of linguistic phenomena. Multifactorial statistical 
methods for the analysis of corpus or experimental data have proven 
highly insightful in recent years, but even they cannot answer all 
research questions that emerge at the interface of cognition, culture, 
and language use. For instance, when it comes to social effects that 
shape the contemporary make-up of the grammar of a language, or when it 
comes to pathways of language learning as influenced by the input that 
an individual receives from many different sources, there are many 
variables that we cannot control for. Computational approaches have 
therefore become increasingly important across various domains. To 
mention only two examples: first, computational modelling, which has 
for decades been among the most important approaches for testing 
hypotheses about the evolutionary development of language(s) (Smith et 
al. 2003, Kirby 2013, Ruland et al. 2023), is increasingly being used 
to account for phenomena of language variation and change (Sevenants & 
Speelman 2021, Pijpops 2022). Second, machine-learning approaches in 
the form of deep neural language models (LMs) have started to play an 
increasingly important role in modelling language processing and 
language dynamics. (L)LMs are not only used as “copilots” in 
traditional linguistic research (Torrent et al. 2023), but also open up 
novel ways of modelling first language acquisition (e.g. Bunzeck et al. 
2025, Padovani et al. 2025) and of analyzing structural properties of 
language that may inform theories of cognitively plausible linguistic 
representations (cf. Warstadt & Bowman 2022, Futrell & Mahowald 2025). 
Some researchers even argue that LMs provide a proof of concept for 
usage-based theories of language (Ambridge & Blything 2024, Goldberg 
2024), a claim that remains contested (cf. Piantadosi 2024 and its 
numerous replies), not least because of the rule-based nature of common 
linguistic benchmarks (Weissweiler et al. 2025).

This theme session aims to bring together researchers using 
computational methods to address research questions from cognitive 
linguistics and Construction Grammar, including, but not limited to, 
the following:

  * How do social, cultural, and interactional factors shape grammar(s)
    and language(s), both from an ontogenetic and a historical
    perspective?

  * How can computational approaches help us model the make-up of
    constructional networks, both at the level of individuals and at
    the level of populations of language users?

  * To what extent can computational models, including (Large) Language
    Models, give insights into emergent properties of language(s)?

  * How can we probe black-box models like (L)LMs, and how do prompts,
    benchmarks, and other tests relate to theories of language
    structure and use?

The theme session aims at building bridges between different 
computational approaches that are used to investigate language dynamics 
at various timescales.


If you are interested in taking part in the theme session, please send 
an abstract (up to 500 words excl. references) to the theme session 
organizers (bastian.bunzeck at uni-bielefeld.de, hartmast at hhu.de) by 
March 15, 2026. Decisions will be sent out in early April.



References


Ambridge, Ben & Liam Blything. 2024. Large language models are better 
than theoretical linguists at theoretical linguistics. Theoretical 
Linguistics 50(1–2). 33–48. https://doi.org/10.1515/tl-2024-2002.

Beckner, Clay, Richard Blythe, Joan Bybee, Morten H. Christiansen, 
William Croft, Nick C. Ellis, John Holland, Jinyun Ke, Diane 
Larsen-Freeman & Tom Schoenemann. 2009. Language is a Complex Adaptive 
System: Position Paper. Language Learning 59 (Suppl. 1). 1–26. 
https://doi.org/10.1111/j.1467-9922.2009.00533.x.

Bunzeck, Bastian, Daniel Duran & Sina Zarrieß. 2025. Do construction 
distributions shape formal language learning in German BabyLMs? In 
Proceedings of the 29th conference on computational natural language 
learning, 169–186. Vienna, Austria: Association for Computational 
Linguistics.

Futrell, Richard & Kyle Mahowald. 2025. How Linguistics Learned to Stop 
Worrying and Love the Language Models. Behavioral and Brain Sciences. 
1–98. https://doi.org/10.1017/S0140525X2510112X.

Goldberg, Adele E. 2024. Usage-based constructionist approaches and 
large language models. Constructions and Frames 16(2). 220–254. 
https://doi.org/10.1075/cf.23017.gol.

Kirby, Simon. 2013. Language, culture, and computation: An adaptive 
systems approach to biolinguistics. In Cedric Boeckx & Kleanthes K. 
Grohmann (eds.), The Cambridge handbook of biolinguistics, 460–477. 
Cambridge: Cambridge University Press.

Padovani, Francesca, Jaap Jumelet, Yevgen Matusevych & Arianna Bisazza. 
2025. Child-Directed Language Does Not Consistently Boost Syntax 
Learning in Language Models. In Proceedings of the 2025 Conference on 
Empirical Methods in Natural Language Processing, 19746–19767. Suzhou, 
China: Association for Computational Linguistics. 
https://doi.org/10.18653/v1/2025.emnlp-main.999.

Piantadosi, Steven T. 2024. Modern language models refute Chomsky’s 
approach to language. Language Science Press. 
https://doi.org/10.5281/ZENODO.12665933.

Pijpops, Dirk. 2022. Lectal contamination: Evidence from corpora and 
from agent-based simulation. International Journal of Corpus 
Linguistics 27(3). 259–290. https://doi.org/10.1075/ijcl.20040.pij.

Ruland, Marcel, Alejandro Andirkó, Iza Romanowska & Cedric Boeckx. 2023. 
Modelling of factors underlying the evolution of human language. 
Adaptive Behavior. 
https://doi.org/10.1177/10597123221147336.

Sevenants, Anthe & Dirk Speelman. 2021. Keeping up with the Neighbours: 
An Agent-Based Simulation of the Divergence of the Standard Dutch 
Pronunciations in the Netherlands and Belgium. Computational Linguistics 
in the Netherlands Journal 11. 5–26.

Smith, Kenny, Simon Kirby & Henry Brighton. 2003. Iterated Learning: A 
Framework for the Emergence of Language. Artificial Life 9. 371–386.

Torrent, Tiago Timponi, Thomas Hoffmann, Arthur Lorenzi Almeida & Mark 
Turner. 2023. Copilots for Linguists: AI, Constructions, and Frames. 
Cambridge: Cambridge University Press. https://doi.org/10.1017/9781009439190.

Warstadt, Alex & Samuel R. Bowman. 2022. What Artificial Neural Networks 
Can Tell Us about Human Language Acquisition. In Algebraic Structures in 
Natural Language, 17–60. Boca Raton: CRC Press. 
https://doi.org/10.1201/9781003205388-2.

Weissweiler, Leonie, Kyle Mahowald & Adele E. Goldberg. 2025. Linguistic 
Generalizations are not Rules: Impacts on Evaluation of LMs. In Claire 
Bonial, Melissa Torgbi, Leonie Weissweiler, Austin Blodgett, Katrien 
Beuls, Paul Van Eecke & Harish Tayyar Madabushi (eds.), Proceedings of 
the Second International Workshop on Construction Grammars and NLP, 
61–74. Düsseldorf, Germany: Association for Computational Linguistics.

-- 
Prof. Dr. Stefan Hartmann (he/him)
Heinrich-Heine-Universität Düsseldorf
Philosophische Fakultät
Institut für Germanistik, Abt. Germanistische Sprachwissenschaft
Universitätsstraße 1
40225 Düsseldorf
Building: 24.53
Floor/Room: U1.94
Tel.: +49 211 81-13684
Website: https://stefanhartmann.eu/
Personal webex room: https://hhu.webex.com/meet/shartmann

Sekretariat / Secretary's office:
Claudia Franken-Stemmler
Bldg. 24.52, Room U1.23
Tel. +49 211 81-11393
claudia.franken-stemmler at hhu.de

