As I mentioned in my previous post, the 2012 Poznań Linguistic Meeting (PLM) features a thematic section on “Theory and evidence in language evolution research.” This section's invited speaker was Jim Hurford, who is Emeritus Professor at Edinburgh University. Hurford is a very eminent figure in language evolution research and has published two very influential and substantive volumes on “Language in the Light of Evolution”: The Origins of Meaning (2007) and The Origins of Grammar (2011).
In his talk, Hurford asked “What is wrong, and what is right, about current theories of language, in the light of evolution?” (you can find the abstract here).
Hurford presented two extreme positions on the evolution of language (which nevertheless are advocated by quite a number of evolutionary linguists) and then discussed what kinds of evidence and lines of reasoning support or seem to go against these positions.
Extreme position A, which basically is the Chomskyan position of Generative Grammar, holds that:
(1) There was a single biological mutation which (2) created a new unique cognitive domain, which then (3) immediately enabled the unlimited command of complex structures via the computational operation of merge. Further, according to this extreme position, (4) this domain is used primarily for advanced private thought and only derivatively for public communication, and lastly (5) it was not promoted by natural selection.
On the other end of the spectrum there is extreme position B, which holds that:
(1) there were many cumulative mutations, which (2) allowed the expanding interactions of pre-existing cognitive domains, creating a new domain which, however, is not characterized by principles unique to language. This then (3) gradually enabled the command of successively more complex structures. Also, on this view, (4) this capacity was used primarily for public communication, and only derivatively for advanced private thought, and it was (5) promoted by natural selection.
Hurford then went on to discuss which of these individual points were more likely to capture what actually happened in the evolution of language.
He first looked at the debate over the role of natural selection in the evolution of language. In Generative Grammar there is a biological, neurological mechanism or computational apparatus, called Universal Grammar (UG) by Chomsky, which determines what languages human infants could possibly acquire. In earlier Generative paradigms, like the Government & Binding approach of the 1980s, UG was thought to be extremely complex. What is more, some of these factors and structures seemed extremely arbitrary. Thus, from this perspective, it seemed inconceivable that they could have been selected for by natural selection. This is illustrated quite nicely in a famous quote by David Lightfoot:
“Subjacency has many virtues, but I am not sure that it could have increased the chances of having fruitful sex” (Lightfoot 1991: 69)
However, with the rise of the minimalist programme (MP) in the 90s these arbitrary properties disappeared from mainstream Generative/Minimalist theorizing. This means that in MP, language is not as complicated, abstract and arbitrary as it once was. Instead, there is only one central property that is seen as characterizing the core linguistic system: the ability to combine elements via the operation of merge. If we follow this development, Lightfoot’s problem goes away. Hurford argued that, in addition, the ability to merge concepts in your head to form more complex conceptual representations is obviously adaptive in private thought, so that from his perspective, natural selection now does play a role in this kind of account.
The process of externalization, that is, the ability to merge meaning-form pairs into signs in order to express complex meanings in public communication, is also obviously adaptive. The ability to use merge in thinking and communication is thus highly adaptive. So even within Generative theorizing, it seems as if natural selection should play a role in accounts of how language evolved. However, Chomsky himself, and many Generativists following him, still insist that natural selection plays no part in the evolution of language. Hurford criticizes this view by pointing out that a completely non-adaptationist view would predict that people without merge were not evolutionarily disadvantaged, which clearly isn’t the case.
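To make the notion of merge a bit more concrete, here is a minimal sketch of my own (not anything presented in Hurford’s talk): merge is treated simply as the binary combination of two objects, and repeated application of this single operation yields hierarchical, not merely linear, structure. The function name and the toy lexicon are made up purely for illustration.

```python
# Illustrative toy model only: binary merge as recursive pair formation.
# The examples and names below are hypothetical, not Hurford's formalism.

def merge(alpha, beta):
    """Combine two syntactic objects into a new, unlabeled binary constituent."""
    return (alpha, beta)

# A tiny toy lexicon of atomic elements.
lexicon = ["the", "dog", "chased", "a", "cat"]

# Repeated application of the single operation builds nested structure:
dp1 = merge("the", "dog")        # ('the', 'dog')
dp2 = merge("a", "cat")          # ('a', 'cat')
vp = merge("chased", dp2)        # ('chased', ('a', 'cat'))
clause = merge(dp1, vp)          # (('the', 'dog'), ('chased', ('a', 'cat')))

print(clause)
```

The point of the sketch is simply that a single combinatorial operation, applied to its own outputs, already gives unbounded hierarchical embedding, which is why merge alone can be presented as the core of the linguistic system.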
Hurford then addressed the competence–performance distinction as used in Generative Grammar. In the Generative tradition, competence refers to tacit knowledge of the unlimited combinatorial possibilities of the linguistic system.
Quite obviously, accidental and temporary performance factors, e.g. distraction, drunkenness, or sudden death, do not factor into the underlying knowledge or linguistic competence. Such practical bounds are not relevant for a theory of competence.
However, Hurford argues that performance is in fact relevant to determining what exactly competence is. Namely, there are also permanent limiting factors, like processing capacity, storage capacity or short-term memory in conditions of alertness. These are not accidental and are just as ‘innate’ as any form of Language Acquisition Device posited by nativists.
One example of this is complex birdsong, which is a very regular behaviour based on an innate template and environmental exemplars.
That is, its underlying biological foundations seem to function in a very similar way to how language is thought to be organized by some.
However, the template already has limiting factors built into it, like the regularity and numerical upper and lower bounds on the number of repetitions of phrases, as well as the overall durations. The fact that these are already built in means that in birds, it doesn’t make sense to speak of competence and performance independently of each other. The same, Hurford argues, goes for language.
Hurford thus coined the term Competence+™, representing a package of competence plus statistical ingredients and the kinds of permanent limiting factors, like storage or processing capacity, mentioned above.
This then led Hurford to coin the term UG+™ (Universal Grammar Plus™), which refers to the fact that on this view the formal properties attributed to Universal Grammar (i.e. the ‘language acquisition toolkit’ a child is born with), on the one hand, and memory and processing power, on the other, would not have evolved independently. Instead, memory and processing power are inherently numerically bounded. UG+™, then, represents the coevolved package of formal and numerical information. That is, it refers to the coevolution of form and behavioural dispositions. On this view, a child is born with UG+™ and then acquires or develops Competence+™.
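As a rough illustration of the Competence+™ idea (again my own toy model, not Hurford’s formalism), one can imagine the combinatorial rule and its numerical bounds being specified together as a single package, rather than as a separate “competence” component that later collides with external “performance” limits. The class name and the particular bounds (max_depth, max_items) are hypothetical placeholders.

```python
# Illustrative sketch: the recursive combinatorial rule and its built-in
# numerical limits form one package. All names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class CompetencePlus:
    max_depth: int = 3   # built-in bound on embedding depth
    max_items: int = 7   # built-in bound on items held in memory

    def build(self, items, depth=0):
        """Recursively merge a list of items into nested pairs,
        respecting the package's built-in numerical limits."""
        items = items[: self.max_items]          # storage limit applies first
        if depth >= self.max_depth or len(items) < 2:
            return items[0] if len(items) == 1 else tuple(items)
        mid = len(items) // 2
        return (self.build(items[:mid], depth + 1),
                self.build(items[mid:], depth + 1))

grammar = CompetencePlus()
print(grammar.build(["the", "dog", "chased", "a", "cat"]))
```

The design point is that the numerical bounds are parameters of the grammar object itself, which mirrors the claim that formal structure and limiting factors coevolved as one package.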
In terms of the timing of externalization and the evolution of merge there are two possibilities:
Externalization preceding merge: On this view, even the simplest conceptual units were externalized from an early stage in hominid evolution onwards (cf. Bickerton’s protolanguage). This means that merge would have been public from the start.
The other possibility is that the capacity to merge conceptual units for advanced thought preceded externalization and public communication. It is important to note that in normal humans, complex thought and complex language go together. There are of course pathologies where they are dissociated, but overall there is a lot of evidence that there is a correlation between verbal and nonverbal IQ, that learning simple public labels modifies thought, especially in children, that bilinguals perform better in certain tasks, and that words can function as an aid to thought.
This suggests a less simple possibility: a coevolutionary spiral of successively more complex language and more complex thought. This also means there can’t have been only a single mutation: the grammatical system and the systems supporting it (storage capacities, working memory, vocal and manual skills, pragmatics, etc.) are highly interdependent. They all had to evolve in partnership. Hurford closed his talk by discussing the question of whether evolution has produced a new unique domain. Hurford stresses that our evolved capacities for language have built on pre-existing capacities for, e.g.
- Hierarchical organization of behaviour
- Semantic memory for the storage of facts
- Fast routinization of useful procedures
"So the conclusion is mixed: The wondrous recursive creativity in language is not as special as it is often claimed to be. Nevertheless language is a special system because of what is does and the particular structural materials it uses to do it" (Jackendoff 2007: 143, see also Hurford 2011: 510 ).Beckner et al. (2009: 17), in their position paper on language as a complex adaptive system, go even further:
“[I]n a complex systems framework, language is viewed as an extension of numerous domain-general cognitive capacities such as shared attention, imitation, sequential learning, chunking, and categorization (Bybee, 1998b; Ellis, 1996). Language is emergent from ongoing human social interactions, and its structure is fundamentally molded by the preexisting cognitive abilities, processing idiosyncrasies and limitations, and general and specific conceptual circuitry of the human brain.”

According to them:
“As soon as humans were able to string two words together, the potential for the development of grammar exists, with no further mechanisms other than sequential processing, categorization, conventionalization, and inference-making (Bybee, 1998b; Heine & Kuteva, 2007).”

As a Cognitive Linguist I think that regarding this last point I agree more with Jackendoff and Beckner et al. than with Hurford, although this might also only be a terminological issue of how to define “new” or “unique.” I think it’s perfectly fine to say that the system supporting language becomes a specialized and unique domain due to the kinds of symbolic input it operates on, but many developmentalists would also stress that it is in fact not evolution which created this new unique domain, but ontogeny in a richly socio-interactive cultural setting or a “symbolic niche.” Saying that in normal modern language users there is a specialized system for language is quite compatible with Cognitive Approaches and the view of language as a complex adaptive system. However, as people like Elizabeth Bates or Annette Karmiloff-Smith have pointed out, a linguistic cognitive domain can emerge through multiple domain-general cognitive and processing factors in combination with the dynamics of social interaction and actual language use during a child’s individual development. So in this view it would not be evolution creating a new unique domain, but development, which is quite compatible with what we know about neural re-use:
“According to [theories of neural re-use], it is quite common for neural circuits established for one purpose to be exapted (exploited, recycled, redeployed) during evolution or normal development (my emphasis), and be put to different uses, often without losing their original functions. Neural reuse theories thus differ from the usual understanding of the role of neural plasticity (which is, after all, a kind of reuse) in brain organization along the following lines: According to neural reuse, circuits can continue to acquire new uses after an initial or original function is established; the acquisition of new uses need not involve unusual circumstances such as injury or loss of established function; and the acquisition of a new use need not involve (much) local change to circuit structure (e.g., it might involve only the establishment of functional connections to new neural partners)” (Anderson 2010)

In this perspective, language was shaped by and adapted to the organizational principles of the brain, and it is normal development and the brain’s connectivity patterns that give rise to a specialized language system. Language, then, becomes more similar to other highly specialized neural systems like chess or driving (Karmiloff-Smith 1992). Thus, general cognitive capacities and constraints (e.g. constraints from the conceptual system, pragmatics, learning and processing mechanisms, perceptuo-motor factors and others, cf. Christiansen & Chater, 2008) and their interconnectivity might play a more crucial role than Hurford gives them credit for. As Michael Tomasello (2003: 284) argues:
“Everyone agrees that human beings can acquire a natural language only because they are biologically prepared to do so and only because they are exposed to other people in the culture speaking a language. The difficult part is in specifying the exact nature of this biological preparation, including the exact nature of the cognitive and learning skills that children use during ontogeny to acquire competence with the language into which they are born.”

On this alternative view children have specialized capacities for intention-reading and pattern-finding, including general cognitive processes of cultural learning, a drive to communicate, shared intentionality, joint attentional capacities, schematization and analogy, symbolic processing, distributional analysis, entrenchment, and other cognitive mechanisms for constraining generalizations and language learning that apply not only to linguistic input. This view is advocated, for example, by usage-based (e.g. Michael Tomasello, Elena Lieven), cognitive-functional (e.g. Liz Bates), socio-cognitive (e.g. Eve Clark, Jerome Bruner) and emergentist (e.g. Brian MacWhinney) views of language acquisition, use, and processing.
The extent to which this architecture might also process other kinds of input is also a matter of debate. The CAS view, for example, predicts that:
“Specifically, language will depend heavily on brain areas fundamentally linked to various types of conceptual understanding, the processing of social interactions, and pattern recognition and memory. It also predicts that so-called “language areas” should have more general, prelinguistic processing functions even in modern humans [...].” (Beckner et al. 2009: 18)

So the jury is still out on whether the interconnection and co-optation of domain-general processes alone can explain language acquisition and use, and this is a highly active and exciting area of research (see e.g. this paper published a couple of days ago in a special issue on “Pattern perception and computational complexity”) that plays a fundamental role in answering the question of what is wrong, and what is right, in language evolution research.
[cross-posted in a slightly modified version at Replicated Typo]