
Friday, August 7, 2015

Language Universal: Found in All Languages

MIT Claims to Have Found a "Language Universal" That Ties All Languages Together

A language universal would bring evidence to Chomsky's controversial theories

Cathleen O'Grady | August 6, 2015


Language takes an astonishing variety of forms across the world—to such a huge extent that a long-standing debate rages around the question of whether all languages have even a single property in common. Well, there’s a new candidate for the elusive title of “language universal” according to a paper in this week’s issue of PNAS. All languages, the authors say, self-organise in such a way that related concepts stay as close together as possible within a sentence, making it easier to piece together the overall meaning.
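To make the claim concrete, dependency length can be measured as the distance, in word positions, between each word and the word it syntactically depends on. The short Python sketch below uses a made-up parse of a toy sentence (not data or code from the paper) just to show the arithmetic.

# Toy illustration of dependency length: the sum of distances between each
# word and its syntactic head. The parse below is a hypothetical example.
sentence = ["John", "threw", "out", "the", "trash"]
heads = [2, 0, 2, 5, 2]  # 1-based position of each word's head; 0 marks the root

def total_dependency_length(heads):
    # Sum the distance between each word (at position i + 1) and its head,
    # skipping the root word, which has no head.
    return sum(abs((i + 1) - h) for i, h in enumerate(heads) if h != 0)

for word, h in zip(sentence, heads):
    print(word, "->", sentence[h - 1] if h else "ROOT")

print(total_dependency_length(heads))  # 1 + 1 + 1 + 3 = 6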

<more at http://arstechnica.com/science/2015/08/mit-claims-to-have-found-a-language-universal-that-ties-all-languages-together/; related link: http://web.mit.edu/futrell/www/papers/futrell2015largescale.pdf (Large-scale evidence of dependency length minimization in 37 languages. Richard Futrell, Kyle Mahowald, and Edward Gibson. Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA. [Abstract: Explaining the variation between human languages and the constraints on that variation is a core goal of linguistics. In the last 20 y, it has been claimed that many striking universals of cross-linguistic variation follow from a hypothetical principle that dependency length—the distance between syntactically related words in a sentence—is minimized. Various models of human sentence production and comprehension predict that long dependencies are difficult or inefficient to process; minimizing dependency length thus enables effective communication without incurring processing difficulty. However, despite widespread application of this idea in theoretical, empirical, and practical work, there is not yet large-scale evidence that dependency length is actually minimized in real utterances across many languages; previous work has focused either on a small number of languages or on limited kinds of data about each language. Here, using parsed corpora of 37 diverse languages, we show that overall dependency lengths for all languages are shorter than conservative random baselines. The results strongly suggest that dependency length minimization is a universal quantitative property of human languages and support explanations of linguistic variation in terms of general properties of human information processing.]>
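The abstract's core test is that observed dependency lengths come out shorter than random baselines. A minimal sketch of that kind of comparison, under simplified assumptions (a hypothetical sentence and its head-dependent arcs, and plain random word-order permutations rather than the paper's more conservative baselines), might look like this:

import random

def dependency_length(order, arcs):
    # Sum of |position(head) - position(dependent)| under a given word order.
    pos = {word: i for i, word in enumerate(order)}
    return sum(abs(pos[h] - pos[d]) for h, d in arcs)

# Hypothetical parsed sentence: words plus (head, dependent) arcs.
words = ["John", "threw", "out", "the", "old", "trash"]
arcs = [("threw", "John"), ("threw", "out"), ("threw", "trash"),
        ("trash", "the"), ("trash", "old")]

observed = dependency_length(words, arcs)

# Crude baseline: average dependency length over random reorderings of the
# same words. (The paper's baselines are more careful than this.)
random.seed(0)
trials = 1000
baseline = sum(
    dependency_length(random.sample(words, len(words)), arcs)
    for _ in range(trials)
) / trials

print(f"observed: {observed}, random-order baseline: {baseline:.1f}")

In this toy case the attested order gives a total length of 9, while shuffled orders average roughly 11 to 12, which is the direction of the effect the paper reports across the 37 corpora.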
