33.566, Books: An Entropy and Noisy-Channel Model for Rule Induction: Radulescu
The LINGUIST List
linguist at listserv.linguistlist.org
Mon Feb 14 14:53:45 UTC 2022
LINGUIST List: Vol-33-566. Mon Feb 14 2022. ISSN: 1069 - 4875.
Subject: 33.566, Books: An Entropy and Noisy-Channel Model for Rule Induction: Radulescu
Moderator: Malgorzata E. Cavar (linguist at linguistlist.org)
Student Moderator: Billy Dickson
Managing Editor: Lauren Perkins
Team: Helen Aristar-Dry, Everett Green, Sarah Goldfinch, Nils Hjortnaes,
Joshua Sims, Billy Dickson, Amalia Robinson, Matthew Fort
Jobs: jobs at linguistlist.org | Conferences: callconf at linguistlist.org | Pubs: pubs at linguistlist.org
Homepage: http://linguistlist.org
Please support the LL editors and operation with a donation at:
https://funddrive.linguistlist.org/donate/
Editor for this issue: Billy Dickson <billyd at linguistlist.org>
================================================================
Date: Mon, 14 Feb 2022 09:52:59
From: Janacy van Duijn Genet [lot-fgw at uva.nl]
Subject: An Entropy and Noisy-Channel Model for Rule Induction: Radulescu
Title: An Entropy and Noisy-Channel Model for Rule Induction
Series Title: LOT Dissertation Series
Publication Year: 2021
Publisher: Netherlands Graduate School of Linguistics / Landelijke Onderzoekschool Taalwetenschap (LOT)
http://www.lotpublications.nl/
Book URL: https://www.lotpublications.nl/an-entropy-and-noisy-channel-model-for-rule-induction
Author: Silvia Radulescu
Paperback: ISBN: 9789460933929 Pages: 291 Price: Europe EURO 35
Abstract:
Language learners not only memorize specific items and combinations of items;
they also infer statistical patterns between those specific items (item-bound
generalization) and form categories and generalized rules that apply to
categories of items (category-based generalization). The mechanisms and
factors that trigger and modulate rule learning are still largely
underspecified.
The main goal of this dissertation is to propose and test an innovative
entropy model for rule induction based on Shannon’s noisy-channel coding
theory. The main hypothesis of the entropy model is that rule induction is an
encoding mechanism gradually driven by the dynamics between an external factor
– input entropy – and an internal factor – channel capacity. Entropy measures
input variability, while channel capacity is the amount of entropy processed
per second.
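As a rough illustration of these two quantities (a sketch of the standard Shannon definitions, not the dissertation's own implementation or materials; the item strings and rates below are invented for the example):

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (in bits) of the empirical item distribution."""
    counts = Counter(items)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform input over four distinct syllables carries 2 bits per item.
stream = ["ba", "di", "ku", "po"] * 25
h = shannon_entropy(stream)  # 2.0 bits

# If items arrive at 3 per second, the input demands 3 * h bits per second;
# on the model's hypothesis, rule induction is modulated by how this rate
# relates to the learner's channel capacity.
rate = 3 * h  # 6.0 bits/second
```

Here entropy rises either with more distinct items or with a more even frequency distribution over them, which is the sense in which it "measures input variability."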
The findings showed that when input entropy increases, the tendency to move
from item-bound generalization to category-based generalization increases
gradually. Results also showed that low input entropy facilitates item-bound
generalization, not only rote memorization. In the case of non-adjacent
dependencies, results showed that it is input entropy that drives rule
learning, not the set size of items, as was previously claimed. Regarding
channel capacity, the findings showed that a sped-up rate of information
transmission leads to a stronger tendency toward category-based
generalization. These findings provide evidence in favor of the entropy model.
The dissertation also sketches the first joint information-theoretic and
thermodynamic model of rule induction, proposing that the second law of
thermodynamics and the constructal law can answer why and how rule induction
happens.
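The entropy-versus-set-size distinction can be illustrated with the same standard entropy measure (a hypothetical sketch, not data from the dissertation): two inputs with the same number of distinct items can differ sharply in entropy when their frequency distributions differ.

```python
import math
from collections import Counter

def shannon_entropy(items):
    """Shannon entropy (in bits) of the empirical item distribution."""
    counts = Counter(items)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Same set size (4 distinct items), very different entropy:
uniform = ["a", "b", "c", "d"] * 25     # each item equally frequent
skewed = ["a"] * 97 + ["b", "c", "d"]   # one item dominates

shannon_entropy(uniform)  # 2.0 bits
shannon_entropy(skewed)   # ~0.24 bits
```

On the model's account it is the entropy, not the count of distinct items, that predicts the shift toward category-based generalization.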
Linguistic Field(s): Cognitive Science
Psycholinguistics
Written In: English (eng)
See this book announcement on our website:
http://linguistlist.org/pubs/books/get-book.cfm?BookID=158173
------------------------------------------------------------------------------