Language acquisition

Language acquisition is the process by which humans acquire the capacity to perceive, produce, and use words to understand and communicate. This capacity involves the acquisition of diverse skills, including syntax, phonetics, and an extensive vocabulary. The language acquired may be vocal, as with speech, or manual, as in sign language. Language acquisition usually refers to first language acquisition, which studies infants' acquisition of their native language, rather than second language acquisition, which deals with the acquisition (by both children and adults) of additional languages.

The capacity to acquire and use language is a key aspect that distinguishes humans from other organisms. While many forms of animal communication exist, they have a limited range of non-syntactically structured vocabulary tokens that lack cross-cultural variation between groups.

A major concern in understanding language acquisition is how these capacities are picked up by infants from what appears to be very little input. A range of theories of language acquisition has been proposed to explain this apparent problem, including innatism, in which a child is born prepared in some manner with these capacities, as opposed to theories in which language is simply learned.

History
Plato felt that the word-meaning mapping in some form was innate. Sanskrit grammarians debated for over twelve centuries whether meaning was god-given (possibly innate) or was learned from older convention—e.g. a child learning the word for cow by listening to trusted speakers talking about cows.

In modern times, empiricists like Hobbes and Locke argued that knowledge (and, for Locke, language) emerges ultimately from abstracted sense impressions. This led to Carnap's Aufbau, an attempt to derive all knowledge from sense data, using the notion of "remembered as similar" to bind sense data into clusters, which would eventually map onto language.

Under behaviorism, it was argued that language may be learned through a form of operant conditioning. In Verbal Behavior (1957), B. F. Skinner suggested that the successful use of a sign, such as a word or lexical unit, given a certain stimulus, reinforces its "momentary" or contextual probability. Empiricist theories of language acquisition include statistical learning theories, relational frame theory, functionalist linguistics, social interactionist theory, and usage-based language acquisition.

This behaviorist idea was strongly attacked by Noam Chomsky in a 1959 review article, which called it "largely mythology" and a "serious delusion". Instead, Chomsky argued for a more theoretical approach, based on a study of syntax.

Social interactionism
Social interactionist theory comprises a number of hypotheses about language acquisition. These hypotheses concern written, spoken, and visual social tools, which consist of complex systems of symbols and rules, and their role in language acquisition and development. The "interactionist" approach is a compromise between "nature" and "nurture". For years, psychologists and researchers have been asking the same question: which language behaviors does nature provide innately, and which are realized through environmental exposure, that is, nurture?

Relational frame theory
Relational frame theory (Hayes, Barnes-Holmes, & Roche, 2001) provides a wholly selectionist/learning account of the origin and development of language competence and complexity. Based upon the principles of Skinnerian behaviorism, RFT posits that children acquire language purely through interacting with the environment. RFT theorists introduced the concept of functional contextualism in language learning, which emphasizes the importance of predicting and influencing psychological events, such as thoughts, feelings, and behaviors, by focusing on manipulable variables in their context. RFT distinguishes itself from Skinner's work by identifying and defining a particular type of operant conditioning known as derived relational responding, a learning process that to date appears to occur only in humans possessing a capacity for language. Empirical studies supporting the predictions of RFT suggest that children learn language via a system of inherent reinforcements, challenging the view that language acquisition is based upon innate, language-specific cognitive capacities.

Emergentism
Emergentist theories, such as MacWhinney's competition model, posit that language acquisition is a cognitive process that emerges from the interaction of biological pressures and the environment. According to these theories, neither nature nor nurture alone is sufficient to trigger language learning; both of these influences must work together in order to allow children to acquire a language. The proponents of these theories argue that general cognitive processes subserve language acquisition and that the end result of these processes is language-specific phenomena, such as word learning and grammar acquisition. The findings of many empirical studies support the predictions of these theories, suggesting that language acquisition is a more complex process than many believe.

Generativism
Generative grammar, associated especially with the work of Noam Chomsky, is currently one of the principal approaches to children's acquisition of syntax. The leading idea is that human biology imposes narrow constraints on the child's "hypothesis space" during language acquisition. In the Principles and Parameters Framework, which has dominated generative syntax since Chomsky's (1980) Lectures on Government and Binding, the acquisition of syntax resembles ordering from a menu: The human brain comes equipped with a limited set of choices, and the child selects the correct options using her parents' speech, in combination with the context.

An important argument in favor of the generative approach is the poverty of the stimulus argument. The child's input (a finite number of sentences encountered by the child, together with information about the context in which they were uttered) is in principle compatible with an infinite number of conceivable grammars. Moreover, few if any children can rely on corrective feedback from adults when they make a grammatical error. Yet, barring situations of medical abnormality or extreme privation, all the children in a given speech community converge on very much the same grammar by the age of about five years. An especially dramatic example is provided by children who for medical reasons are unable to produce speech and therefore can literally never be corrected for a grammatical error, yet who nonetheless converge on the same grammar as their typically developing peers, according to comprehension-based tests of grammar.

Considerations such as these have led Chomsky, Jerry Fodor, Eric Lenneberg and others to argue that the types of grammar that the child needs to consider must be narrowly constrained by human biology (the nativist position). These innate constraints are sometimes referred to as universal grammar, the human "language faculty," or the "language instinct."

Empiricism
Since Chomsky's work in the 1950s, many criticisms of the basic assumptions of generative theory have been put forth. Critics argue that the concept of a Language Acquisition Device (LAD) is unsupported by evolutionary anthropology, which tends to show a gradual adaptation of the human brain and vocal cords to the use of language, rather than a sudden appearance of a complete set of binary parameters delineating the whole spectrum of possible grammars ever to have existed and ever to exist. (Binary parameters are common to digital computers but not, as it turns out, to neurological systems such as the human brain.)

Further, while generative theory has several hypothetical constructs (such as movement, empty categories, complex underlying structures, and strict binary branching) that cannot possibly be acquired from any amount of linguistic input, it is unclear that human language is actually anything like the generative conception of it. Since language, as imagined by nativists, is unlearnably complex, subscribers to this theory argue that it must therefore be innate. A different theory of language, however, may yield different conclusions. While all theories of language acquisition posit some degree of innateness, a less convoluted theory might involve less innate structure and more learning. Under such a theory of grammar, the input, combined with both general and language-specific learning capacities, might be sufficient for acquisition.

Since 1980, linguists studying children, such as Melissa Bowerman, and psychologists following Jean Piaget, like Elizabeth Bates and Jean Mandler, have come to suspect that there may indeed be many learning processes involved in acquisition, and that ignoring the role of learning may have been a mistake.

In recent years, opposition to the nativist position has multiplied. The debate has centered on whether the inborn capabilities are language-specific or domain-general, such as those that enable the infant to visually make sense of the world in terms of objects and actions. The anti-nativist view has many strands, but a frequent theme is that language emerges from usage in social contexts, using learning mechanisms that are a part of a general cognitive learning apparatus (which is what is innate). This position has been championed by Elizabeth Bates, Catherine Snow, Brian MacWhinney, Michael Tomasello, Michael Ramscar, William O'Grady, and others. Philosophers, such as Fiona Cowie and Barbara Scholz with Geoffrey Pullum have also argued against certain nativist claims in support of empiricism.

Statistical learning
Some language acquisition researchers, such as Elissa Newport, Richard Aslin, and Jenny Saffran, believe that language acquisition is based primarily on general learning mechanisms, namely statistical learning. The development of connectionist models that are able to successfully learn words and syntactical conventions supports the predictions of statistical learning theories of language acquisition, as do empirical studies of children's learning of words and syntax.
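The kind of statistical learning studied in Saffran-style experiments can be illustrated with a minimal sketch. The syllable stream and the "words" below are invented for illustration; the point is that syllable pairs inside a word have a higher transitional probability than pairs spanning a word boundary, which gives a purely statistical cue to word segmentation.

```python
from collections import Counter

# Invented stream of trisyllabic "words"; in Saffran-style experiments
# the stream is presented without pauses, so only the statistics of
# syllable-to-syllable transitions mark the word boundaries.
stream = "bidaku golabu bidaku padoti golabu bidaku padoti golabu".split()
syllables = [w[i:i + 2] for w in stream for i in range(0, len(w), 2)]

pair_counts = Counter(zip(syllables, syllables[1:]))
first_counts = Counter(syllables[:-1])

def transitional_probability(a, b):
    """P(b | a): the proportion of occurrences of syllable a followed by b."""
    return pair_counts[(a, b)] / first_counts[a]

# Within-word pair: "bi" is always followed by "da".
print(transitional_probability("bi", "da"))  # 1.0
# Cross-boundary pair: "ku" is followed by "go" only some of the time.
print(transitional_probability("ku", "go"))
```

Dips in transitional probability then serve as candidate word boundaries, with no grammar or lexicon assumed in advance.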

Chunking
Chunking theories of language acquisition constitute a group of theories related to statistical learning theories in that they assume that the input from the environment plays an essential role; however, they postulate different learning mechanisms. The central idea of these theories is that language development occurs through the incremental acquisition of meaningful chunks of elementary constituents, which can be words, phonemes, or syllables. Recently, this approach has been highly successful in simulating several phenomena in the acquisition of syntactic categories and the acquisition of phonological knowledge. The approach has several features that make it unique: the models are implemented as computer programs, which enables clear-cut and quantitative predictions to be made; they learn from naturalistic input, made of actual child-directed utterances; they produce actual utterances, which can be compared with children’s utterances; and they have simulated phenomena in several languages, including English, Spanish, and German.

Researchers at the Max Planck Institute for Evolutionary Anthropology have developed a computer model analyzing early toddler conversations to predict the structure of later conversations. They showed that toddlers develop their own individual rules for speaking with slots into which they could put certain kinds of words. A significant outcome of the research was that rules inferred from toddler speech were better predictors of subsequent speech than traditional grammars.
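The slot-and-frame idea can be sketched in a toy form (this is an illustration, not the Max Planck model itself, and the utterances are invented): utterances that differ in exactly one word position are grouped into a frame with a slot, and frames whose slot is filled by several different words are treated as candidate productive rules.

```python
from collections import defaultdict

# Hypothetical toddler utterances.
utterances = [
    "more juice", "more milk", "more cookie",
    "where ball", "where doggy",
]

# For each utterance, generate every frame obtained by replacing one
# word position with a slot ("_"), and record which words filled it.
frames = defaultdict(set)
for u in utterances:
    words = u.split()
    for i in range(len(words)):
        frame = tuple(words[:i] + ["_"] + words[i + 1:])
        frames[frame].add(words[i])

# Frames whose slot was filled by more than one word are candidate
# productive patterns, such as "more _" and "where _".
productive = {f: fillers for f, fillers in frames.items() if len(fillers) > 1}
for frame, fillers in sorted(productive.items()):
    print(" ".join(frame), "->", sorted(fillers))
```

Rules inferred this way are tied to particular lexical frames rather than abstract grammatical categories, which is in the spirit of the finding that such item-specific rules predicted toddlers' later speech well.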

Vocabulary acquisition
The capacity to acquire the pronunciation of new words depends upon the capacity to engage in speech repetition. Children with reduced abilities to repeat nonwords (a marker of speech repetition abilities) show a slower rate of vocabulary expansion than children for whom this is easy. It has been proposed that the elementary units of speech have been selected to enhance the ease with which sound and visual input can be mapped onto motor vocalization. Several computational models of vocabulary acquisition have been proposed so far.

Meaning
Children learn, on average, 10 to 15 new word meanings each day, but only about one of these can be accounted for by direct instruction. The remaining nine to 14 word meanings must be picked up in some other way. It has been proposed that children acquire these meanings through processes modeled by latent semantic analysis: when they meet an unfamiliar word, children can use information in its context to correctly guess its rough area of meaning.
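A simplified distributional sketch shows how context alone can place an unfamiliar word near semantically similar words. This omits the dimensionality-reduction (SVD) step that characterizes latent semantic analysis proper, and the corpus and the nonce word "zib" are invented; only the general principle, that words occurring in similar contexts get similar vectors, is illustrated.

```python
import math
from collections import Counter

corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the zib chased the mouse",   # "zib" is the unfamiliar word
    "she read the book quietly",
    "he read the letter quietly",
]

def context_vector(word, sentences, window=2):
    """Count the words appearing within `window` positions of `word`."""
    vec = Counter()
    for s in sentences:
        tokens = s.split()
        for i, t in enumerate(tokens):
            if t == word:
                lo, hi = max(0, i - window), i + window + 1
                vec.update(w for w in tokens[lo:hi] if w != word)
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

zib = context_vector("zib", corpus)
# "zib" occurs in cat-like contexts, so it should resemble "cat"
# more than it resembles "book".
print(cosine(zib, context_vector("cat", corpus)))
print(cosine(zib, context_vector("book", corpus)))
```

Even this crude version ranks "cat" above "book" as a neighbor of the unfamiliar word, which is the sense in which contextual statistics can yield a rough area of meaning.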

Neurocognitive research
According to several linguists, neurocognitive research has confirmed many standards of language learning, such as: "learning engages the entire person (cognitive, affective, and psychomotor domains), the human brain seeks patterns in its searching for meaning, emotions affect all aspects of learning, retention and recall, past experience always affects new learning, the brain's working memory has a limited capacity, lecture usually results in the lowest degree of retention, rehearsal is essential for retention, practice [alone] does not make perfect, and each brain is unique" (Sousa, 2006, p. 274). In terms of genetics, the gene ROBO1 has been associated with phonological buffer integrity or length.