Universal grammar
Universal grammar (UG), in modern linguistics, is the theory of the innate biological component of the language faculty, usually credited to Noam Chomsky. The basic postulate of UG is that there are innate constraints on what the grammar of a possible human language could be. When linguistic stimuli are received in the course of language acquisition, children then adopt specific syntactic rules that conform to UG.[1] The advocates of this theory emphasize and partially rely on the poverty of the stimulus (POS) argument and the existence of some universal properties of natural human languages. However, the latter has not been firmly established.
Other linguists have opposed that notion, arguing that languages are so diverse that the postulated universality is rare.[2] The theory of universal grammar remains a subject of debate among linguists.[3]
Overview
The term "universal grammar" is a placeholder for whichever domain-specific features of linguistic competence turn out to be innate. Within generative grammar, it is generally accepted that there must be some such features, and one of the goals of generative research is to formulate and test hypotheses about which aspects those are.[4][5] In day-to-day generative research, the notion that universal grammar exists motivates analyses in terms of general principles. As much as possible, facts about particular languages are derived from these general principles rather than from language-specific stipulations.[4]
Evidence
The idea that at least some aspects are innate is motivated by poverty of the stimulus arguments.[6][7] For example, one famous poverty of the stimulus argument concerns the acquisition of yes–no questions in English. This argument starts from the observation that children only make mistakes compatible with rules targeting hierarchical structure even though the examples which they encounter could have been generated by a simpler rule that targets linear order. In other words, children seem to ignore the possibility that the question rule is as simple as "switch the order of the first two words" and immediately jump to alternatives that rearrange constituents in tree structures. This is taken as evidence that children are born knowing that grammatical rules involve hierarchical structure, even though they have to figure out what those rules are.[6][7][8]
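The contrast can be illustrated with a minimal sketch, assuming a toy word-list representation of sentences; the example, the particular linear rule shown (fronting the first auxiliary), and the function names are illustrative and are not drawn from the cited studies.

```python
# Illustrative sketch only: a linear-order rule versus a
# structure-dependent rule for English yes-no question formation.
# The toy representation and function names are hypothetical.

def linear_rule(words, aux="is"):
    """Linear rule: front the first occurrence of the auxiliary,
    ignoring constituent structure."""
    i = words.index(aux)
    return [aux] + words[:i] + words[i + 1:]

def hierarchical_rule(subject, aux, predicate):
    """Structure-dependent rule: front the auxiliary of the main
    clause, treating the whole subject as one constituent."""
    return [aux] + subject + predicate

subject = ["the", "man", "who", "is", "tall"]
declarative = subject + ["is", "happy"]   # "The man who is tall is happy."

print(" ".join(linear_rule(declarative)))
# -> "is the man who tall is happy"   (ungrammatical)

print(" ".join(hierarchical_rule(subject, "is", ["happy"])))
# -> "is the man who is tall happy"   (the correct question)
```

The point of the argument is that children's errors are compatible with the second kind of rule but not the first, even though simple declarative–question pairs in the input are consistent with both.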
Theories of universal grammar
Within generative grammar, there are a variety of theories about what universal grammar consists of. One notable hypothesis proposed by Hagit Borer holds that the fundamental syntactic operations are universal and that all variation arises from different feature-specifications in the lexicon.[5][9] On the other hand, a strong hypothesis adopted in some variants of Optimality Theory holds that humans are born with a universal set of constraints, and that all variation arises from differences in how these constraints are ranked.[5][10] In a 2002 paper, Noam Chomsky, Marc Hauser and W. Tecumseh Fitch proposed that universal grammar consists solely of the capacity for hierarchical phrase structure.[11]
The main hypotheses
In an article entitled "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?"[12] Hauser, Chomsky, and Fitch present the three leading hypotheses for how language evolved and brought humans to the point where they have a universal grammar.
The first hypothesis states that the faculty of language in the broad sense (FLb) is strictly homologous to animal communication. This means that homologous aspects of the faculty of language exist in non-human animals.
The second hypothesis states that the FLb is a derived and uniquely human adaptation for language. This hypothesis holds that individual traits were subject to natural selection and came to be specialized for humans.
The third hypothesis states that only the faculty of language in the narrow sense (FLn) is unique to humans. It holds that while mechanisms of the FLb are present in both human and non-human animals, the computational mechanism of recursion has evolved recently, and solely in humans.[13]
Presence of creole languages
The presence of creole languages is sometimes cited as further support for this theory, especially in Bickerton's language bioprogram theory. Creole languages develop when disparate societies with no common language come together and are forced to devise a new system of communication. The system used by the original speakers is typically an inconsistent mix of vocabulary items, known as a pidgin. As these speakers' children begin to acquire their first language, they use the pidgin input to effectively create their own original language, known as a creole language. Unlike pidgins, creole languages have native speakers (those with language acquisition from early childhood) and make use of a full, systematic grammar.
Bickerton claims that the fact that certain features are shared by virtually all creole languages supports the notion of a universal grammar. For example, their default point of reference in time (expressed by bare verb stems) is not the present moment but the past. They uniformly express tense, aspect, and mood using pre-verbal auxiliaries. Negative concord occurs, but it affects the verbal subject (as opposed to the object, as it does in languages like Spanish). Another similarity among creole languages is that questions are created simply by changing the intonation of a declarative sentence, not its word order or content.
Opposing this notion, work by Carla Hudson-Kam and Elissa Newport suggests that creole languages may not support a universal grammar at all. In a series of experiments, Hudson-Kam and Newport looked at how children and adults learn artificial grammars. They found that children tend to ignore minor variations in the input when those variations are infrequent, and reproduce only the most frequent forms. In doing so, they tend to standardize the language they hear around them. Hudson-Kam and Newport hypothesize that in a pidgin-development situation (and in the real-life situation of a deaf child whose parents are or were disfluent signers), children systematize the language they hear based on the probability and frequency of forms, rather than in the way suggested on the basis of a universal grammar.[14][15] Further, they argue, it seems to follow that creole languages would share features with the languages from which they are derived, and thus look similar in terms of grammar.
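A toy sketch of the contrast Hudson-Kam and Newport draw is given below; the marker rate, sample sizes, and function names are hypothetical and are not their experimental materials or model. A probability-matching learner reproduces an inconsistently used marker at roughly its input rate, whereas a regularizing learner adopts the majority form outright.

```python
# Illustrative sketch only: probability matching versus regularization
# when a grammatical marker is used inconsistently in the input.
# The input rate and function names are hypothetical.
import random

random.seed(0)
INPUT_RATE = 0.6  # hypothetical: the marker appears in 60% of input sentences

def probability_matcher(n_utterances):
    """Adult-like learner: produces the marker at roughly the input rate."""
    produced = sum(random.random() < INPUT_RATE for _ in range(n_utterances))
    return produced / n_utterances

def regularizer(n_utterances):
    """Child-like learner (on the regularization hypothesis): adopts the
    most frequent option and uses it consistently."""
    return 1.0 if INPUT_RATE > 0.5 else 0.0

print(f"input rate:          {INPUT_RATE:.2f}")
print(f"probability matcher: {probability_matcher(1000):.2f}")  # about 0.60
print(f"regularizer:         {regularizer(1000):.2f}")          # 1.00
```

On this view, the systematic grammar of a creole reflects frequency-driven regularization of inconsistent input rather than an innate grammatical blueprint.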
Many researchers of universal grammar argue against the concept of relexification, i.e. that a language replaces its lexicon almost entirely with that of another. This, they argue, goes against the universalist notions of a universal grammar, which has an innate grammar.[citation needed]
Views and assessments
Recent research has used recurrent neural network (RNN) architectures. McCoy, Frank, and Linzen (2018) focused on a strong version of the poverty-of-the-stimulus argument, which claims that language learners require a hierarchical constraint; they report that a milder version, which only asserts that a hierarchical bias is necessary, is difficult to assess using RNNs, because RNNs must possess some biases and the nature of these biases remains "currently poorly understood." They go on to note that, while all the architectures they tested had a bias toward linear order, the GRU-with-attention architecture was the only one that overcame this bias sufficiently to generalize hierarchically, and they acknowledge that "Humans certainly could have such an innate constraint."[16]
The empirical basis of poverty-of-the-stimulus arguments has been challenged by Geoffrey Pullum and others, leading to a persistent back-and-forth debate in the language acquisition literature.[17][18]
Language acquisition researcher Michael Ramscar has suggested that when children erroneously expect an ungrammatical form that then never occurs, the repeated failure of that expectation serves as a form of implicit negative feedback that allows them to correct their errors over time; in this way, for example, children correct overgeneralizations such as goed to went.[19][20]
In addition, it has been suggested that people learn probabilistic patterns of word distribution in their language, rather than hard and fast rules (see Distributional hypothesis).[21] For example, in English, children overgeneralize the past tense marker "-ed" and conjugate irregular verbs as if they were regular, producing forms like goed and eated, and then correct these errors over time.[19] It has also been hypothesized that the poverty of the stimulus problem can be largely avoided if it is assumed that children employ similarity-based generalization strategies in language learning, i.e. generalizing about the usage of new words from similar words they already know how to use.[22]
Neurogeneticists Simon Fisher and Sonja Vernes observe that, with human language skills evidently unmatched elsewhere in the world's fauna, there have been several theories positing a single mutation event at some time in the past in our nonspeaking ancestors, as argued by e.g. Chomsky (2011): some "lone spark that was sufficient to trigger the sudden appearance of language and culture." They characterize that notion as "romantic" and "inconsistent with the messy mappings between genetics and cognitive processes." According to Fisher and Vernes, the link between genes and grammar has not been consistently mapped by scientists; what has been established by research, they claim, relates primarily to speech pathologies. The resulting lack of certainty, they conclude, has provided an audience for "unconstrained speculations" that have fed the "myth" of "so-called grammar genes".[23]
Professor of Natural Language Computing Geoffrey Sampson maintains that universal grammar theories are not falsifiable and are therefore pseudoscientific. He argues that the grammatical "rules" linguists posit are simply post-hoc observations about existing languages, rather than predictions about what is possible in a language.[24][25][26] Sampson claims that every one of the "poor" arguments used to justify the language-instinct claim is wrong: "either the logic is fallacious, or the factual data are incorrect (or, sometimes, both)," and the "evidence points the other way." Children are good at learning languages, he argues, because people are good at learning anything that life throws at them, not because they have fixed structures of knowledge built in.[24] Similarly, professor of cognitive science Jeffrey Elman argues that the unlearnability of languages assumed by universal grammar is based on a too-strict, "worst-case" model of grammar, which is not in keeping with any actual grammar.
Linguist James Hurford, in his article "Nativist and Functional Explanations in Language Acquisition,"[27] outlines the major differences between the glossogenetic and the phylogenetic mechanisms. He states that "Deep aspects of the form of language are not likely to be readily identifiable with obvious specific uses, and one cannot suppose that it will be possible to attribute them directly to the recurring short-term needs of successive generations in a community. Here, nativist explanations for aspects of the form of language, appealing to an innate LAD, seem appropriate. But use or function can also be appealed to on the evolutionary timescale, to attempt to explain the structure of the LAD itself." For Hurford, biological mutations plus functional considerations constitute the explanans, while the LAD itself constitutes the explanandum. The LAD is part of the species' heredity, the result of mutations over a long period, he states. But while he agrees with Chomsky that the mechanism of grammaticisation is located in "the Chomskyan LAD" and that Chomsky is "entirely right in emphasising that a language (E-language) is an artifact resulting from the interplay of many factors," he states that this artifact deserves great interest and systematic study, and can affect grammatical competence, i.e. "I-language."[27]
Morten H. Christiansen, professor of Psychology, and Nick Chater, professor of Psychology and Language Sciences, have argued that "a biologically determined UG is not evolutionarily viable." Because the processes of language change are much more rapid than processes of genetic change, they state, language constitutes a "moving target" both over time and across different human populations, and hence cannot provide a stable environment to which language genes could have adapted. Following Darwin, they view language as a complex and interdependent "organism" which evolves under selectional pressures from human learning and processing mechanisms, so that "apparently arbitrary aspects of linguistic structure may result from general learning and processing biases deriving from the structure of thought processes, perceptuo-motor factors, cognitive limitations, and pragmatics".[28] Professor of linguistics Norbert Hornstein countered polemically that Christiansen and Chater appear "to have no idea what generative grammar [theory] is," "especially, but not uniquely, about the Chomsky program." Hornstein points out that all "grammatically informed psycho-linguistic works done today or before" understand that generative/universal grammar capacities are but one factor among others needed to explain real-time acquisition. Christiansen and Chater's observation that "language use involves multiple interacting variables" [italics in the original] is, essentially, a truism; it is nothing new, he argues, to state that "much more than a competence theory will be required" to figure out how language is deployed, acquired, produced, or parsed. The position, he concludes, that universal grammar properties are just "probabilistic generalizations over available linguistic inputs" belongs to the "traditional" and "debunked" view held by associationists and structuralists many decades in the past.[29]
In the same vein, professor of linguistics Nicholas Evans and professor of psycholinguistics Stephen C. Levinson observe[30] that Chomsky's notion of a Universal Grammar has been mistaken for a set of substantial research findings about what all languages have in common, while it is, "in fact," the programmatic label for "whatever it turns out to be that all children bring to learning a language." For substantial findings about universals across languages, they argue, one must turn to the field of linguistic typology, which reveals a "bewildering range of diverse languages" and in which "generalizations are really quite hard to extract." Chomsky's actual views, combining, as they claim, philosophical and mathematical approaches to structure with claims about the innate endowment for language, have been "hugely influential in the cognitive sciences."[30]: 430
Wolfram Hinzen, in his work The philosophical significance of Universal Grammar,[31] seeks to re-establish the epistemological significance of grammar and addresses the three main current objections to Cartesian universal grammar: that it has no coherent formulation, that it cannot have evolved by standard, accepted neo-Darwinian evolutionary principles, and that it goes against the variation extant at all levels of linguistic organization, which lies at the heart of the human faculty of language.
In the domain of field research, Daniel Everett has claimed that the Pirahã language is a counterexample to the basic tenets of universal grammar because it lacks clausal embedding. According to Everett, this trait results from Pirahã culture's emphasis on present-moment concrete matters.[32] Nevins et al. (2007) have responded that Pirahã does, in fact, have clausal embedding, and that, even if it did not, this would be irrelevant to current theories of universal grammar. They addressed each of Everett's claims and, using Everett's own "rich material" data, claim to have found no evidence of a causal relation between culture and grammatical structure. Pirahã grammar, they conclude, presents no unusual challenge, much less the "severe" one claimed by Everett, to the notion of a universal grammar.[33]
Developments
The modern conception of universal grammar is generally attributed to Noam Chomsky, yet similar ideas are found in older work. A related idea is found in Roger Bacon's c. 1245 Overview of Grammar and c. 1268 Greek Grammar, where he postulates that all languages are built upon a common grammar, even though it may undergo incidental variations. In the 13th century, the speculative grammarians postulated universal rules underlying all grammars.[citation needed]
The concept of a universal grammar or language was at the core of the 17th-century projects for philosophical languages. An influential work of that time was the Grammaire générale by Claude Lancelot and Antoine Arnauld, who describe a general grammar for languages and come to the conclusion that grammar has to be universal.[34] There was also a Scottish school of universal grammarians in the 18th century, as distinguished from the philosophical language project, which included authors such as James Beattie, Hugh Blair, James Burnett, James Harris, and Adam Smith.
The article on grammar in the first edition of the Encyclopædia Britannica (1771) contains an extensive section titled "Of Universal Grammar", under the lemma "Grammar".[35]
In the late 19th and early 20th centuries, Wilhelm Wundt and Otto Jespersen claimed that these earlier arguments were overly influenced by Latin and ignored the breadth of worldwide language data. They did not entirely discard the notion of a "universal grammar", but reduced it to universal syntactic categories or super-categories, such as number, tenses, etc.[36]
With the rise of behaviorism, its proponents advanced the idea that language acquisition, like any other kind of learning, could be explained by a succession of trials, errors, and rewards for success.[37] In other words, children learn their mother tongue by simple imitation, through listening and repeating what adults say. For example, when a child says "milk" and the mother smiles and gives milk to her child as a result, the child finds this outcome rewarding, thus enhancing the child's language development.[38]
In 2017, Chomsky and Berwick co-wrote Why Only Us, in which they define the minimalist program and the strong minimalist thesis and its implications, updating their approach to UG theory. According to Berwick and Chomsky, "the optimal situation would be that UG reduces to the simplest computational principles which operate in accord with conditions of computational efficiency. This conjecture is ... called the Strong Minimalist Thesis (SMT)."[39]: 94
The significance of SMT is to shift the previous emphasis on a universal grammar to the concept that Chomsky and Berwick now call "merge". "Merge" is defined there as follows:
Every computational system has embedded within it somewhere an operation that applies to two objects X and Y already formed, and constructs from them a new object Z. Call this operation Merge.
SMT dictates that "Merge will be as simple as possible: it will not modify X or Y or impose any arrangement on them; in particular, it will leave them unordered; an important fact. Merge is therefore just set formation: Merge of X and Y yields the set {X, Y}."[39]: 98
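Read literally, this definition amounts to bare set formation, and hierarchical structure arises from applying Merge to its own outputs. A minimal sketch follows; it is illustrative only, the lexical items and function name are hypothetical placeholders, and real syntactic objects are of course not strings.

```python
# Illustrative sketch only: Merge modeled as bare set formation,
# per the quoted definition; it neither modifies X and Y nor orders them.
# The lexical items below are hypothetical placeholders.

def merge(x, y):
    """Combine two already-formed syntactic objects X and Y into
    the unordered set {X, Y}."""
    return frozenset([x, y])

vp = merge("read", "books")   # {read, books}
tp = merge("will", vp)        # {will, {read, books}}: hierarchy by nesting

print(tp)
# e.g. frozenset({'will', frozenset({'books', 'read'})})
```

Because each application of Merge can take a previous output as one of its inputs, repeated Merge yields nested, hierarchical structure without ever imposing linear order.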
See also
Notes
- ^ Chomsky, Noam. "Tool Module: Chomsky's Universal Grammar". Retrieved 2010-10-07.
- ^ Evans, Nicholas; Levinson, Stephen C. (26 October 2009). "The myth of language universals: Language diversity and its importance for cognitive science". Behavioral and Brain Sciences. 32 (5): 429–48. doi:10.1017/S0140525X0999094X. hdl:11858/00-001M-0000-0012-C29E-4. PMID 19857320. S2CID 2675474. Archived (PDF) from the original on 27 July 2018.
- ^ Christensen, Christian Hejlesen (March 2019). "Arguments for and against the Idea of Universal Grammar". Leviathan (4): 12–28. doi:10.7146/lev.v0i4.112677. S2CID 172055557. Retrieved 1 May 2025.
- ^ a b Wasow, Thomas (2003). "Generative Grammar" (PDF). In Aronoff, Mark; Ress-Miller, Janie (eds.). The Handbook of Linguistics. Blackwell. p. 299. doi:10.1002/9780470756409.ch12. ISBN 978-0-631-20497-8.
- ^ a b c Pesetsky, David (1999). "Linguistic universals and universal grammar". In Wilson, Robert; Keil, Frank (eds.). The MIT encyclopedia of the cognitive sciences. MIT Press. pp. 476–478. doi:10.7551/mitpress/4660.001.0001. ISBN 978-0-262-33816-5.
- ^ a b Adger, David (2003). Core syntax: A minimalist approach. Oxford University Press. pp. 8–11. ISBN 978-0199243709.
- ^ a b Lasnik, Howard; Lidz, Jeffrey (2017). "The Argument from the Poverty of the Stimulus" (PDF). In Roberts, Ian (ed.). The Oxford Handbook of Universal Grammar. Oxford University Press.
- ^ Crain, Stephen; Nakayama, Mineharu (1987). "Structure dependence in grammar formation". Language. 63 (3): 522–543. doi:10.2307/415004. JSTOR 415004.
- ^ Gallego, Ángel (2012). "Parameters". In Boeckx, Cedric (ed.). The Oxford Handbook of Linguistic Minimalism. Oxford University Press. doi:10.1093/oxfordhb/9780199549368.013.0023.
- ^ McCarthy, John (1992). Doing optimality theory. Wiley. pp. 1–3. ISBN 978-1-4051-5136-8.
- ^ Hauser, Marc; Chomsky, Noam; Fitch, W. Tecumseh (2002). "The faculty of language: what is it, who has it, and how did it evolve". Science. 298 (5598): 1569–1579. doi:10.1126/science.298.5598.1569. PMID 12446899.
- ^ Hauser, Marc; Chomsky, Noam; Fitch, William Tecumseh (22 November 2002), "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" (PDF), Science, 298 (5598): 1569–1579, doi:10.1126/science.298.5598.1569, PMID 12446899, archived from the original (PDF) on 28 December 2013, retrieved 28 December 2013
- ^ Hauser, Marc; Chomsky, Noam; Fitch, William Tecumseh (22 November 2002), "The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?" (PDF), Science, 298 (5598): 1569–1579, doi:10.1126/science.298.5598.1569, PMID 12446899, archived from the original (PDF) on 28 December 2013, retrieved 11 April 2024,
We hypothesize that FLN only includes recursion and is the only uniquely human component of the faculty of language. [...] the core recursive aspect of FLN currently appears to lack any analog in animal communication and possibly other domains as well.
- ^ Hudson Kam, C. L.; Newport, E. L. (2009). "Getting it right by getting it wrong: When learners change languages". Cognitive Psychology. 59 (1): 30–66. doi:10.1016/j.cogpsych.2009.01.001. PMC 2703698. PMID 19324332.
- ^ Dye, Melody (February 9, 2010). "The Advantages of Being Helpless". Scientific American. Retrieved June 10, 2014.
- ^ McCoy, R. Thomas; Frank, Robert; Linzen, Tal (2018). "Revisiting the poverty of the stimulus: hierarchical generalization without a hierarchical bias in recurrent neural networks" (PDF). Proceedings of the 40th Annual Conference of the Cognitive Science Society: 2093–2098. arXiv:1802.09091.
- ^ Pullum, Geoff; Scholz, Barbara (2002). "Empirical assessment of stimulus poverty arguments". The Linguistic Review. 19 (1–2): 9–50. doi:10.1515/tlir.19.1-2.9.
- ^ Legate, Julie Anne; Yang, Charles (2002). "Empirical re-assessment of stimulus poverty arguments" (PDF). The Linguistic Review. 19 (1–2): 151–162.
- ^ a b Fernández, Eva M.; Helen Smith Cairns (2011). Fundamentals of Psycholinguistics. Chichester, West Sussex, England: Wiley-Blackwell. ISBN 978-1-4051-9147-0.
- ^ Ramscar, Michael; Yarlett, Daniel (2007). "Linguistic self-correction in the absence of feedback: A new approach to the logical problem of language acquisition". Cognitive Science. 31 (6): 927–960. CiteSeerX 10.1.1.501.4207. doi:10.1080/03640210701703576. PMID 21635323. S2CID 2277787.
- ^ McDonald, Scott; Ramscar, Michael (2001). "Testing the distributional hypothesis: The influence of context on judgements of semantic similarity". Proceedings of the 23rd Annual Conference of the Cognitive Science Society: 611–616. CiteSeerX 10.1.1.104.7535.
- ^ Yarlett, Daniel G.; Ramscar, Michael J. A. (2008). "Language Learning Through Similarity-Based Generalization" (PDF). draft. Stanford University. CiteSeerX 10.1.1.393.7298.
- ^ Fisher, Simon E.; Vernes, Sonja C. (January 2015). "Genetics and the Language Sciences". Annual Review of Linguistics. 1: 289–310. doi:10.1146/annurev-linguist-030514-125024. hdl:11858/00-001M-0000-0019-DA19-1. Retrieved 1 May 2025.
- ^ a b Sampson, Geoffrey (2005). The 'Language Instinct' Debate: Revised Edition. Bloomsbury Academic. ISBN 978-0-8264-7385-1.
- ^ Sampson, Geoffrey (30 August 2022). "The 'Language Instinct' Debate". GRSampson.net. Retrieved April 27, 2025.
[My book] ends by posing the question 'How could such poor arguments have passed muster for so long?'
- ^ Cipriani, Enrico (2015). "The generative grammar between philosophy and science". European Journal of Literature and Linguistics. 4: 12–16.
- ^ a b Hurford, James R. (1995). "Nativist and Functional Explanations in Language Acquisition" (PDF). In I. M. Roca (ed.). Logical Issues in Language Acquisition. Dordrecht, Holland; Providence, Rhode Island: Foris Publications. p. 88. Archived (PDF) from the original on 2022-10-09. Retrieved June 10, 2014.
- ^ Christiansen, Morten H.; Chater, Nick (2008). "Language as Shaped by the Brain". Behavioral and Brain Sciences. 31 (5): 489–508. doi:10.1017/S0140525X08004998. Retrieved 3 May 2025.
- ^ Hornstein, Norbert (21 August 2017). "Language vs linguistics, again; the case of Christiansen and Chater". Faculty of Language. University of Maryland. Retrieved 3 May 2025.
- ^ a b Evans, Nicholas; Levinson, Stephen C. (2009). "The Myth of Language Universals: Language diversity and its importance for cognitive science" (PDF). Behavioral and Brain Sciences. 32 (5): 429–492. doi:10.1017/S0140525X0999094X. hdl:11858/00-001M-0000-0012-C29E-4. PMID 19857320. Retrieved 3 May 2025.
- ^ Hinzen, Wolfram (September 2012). "The philosophical significance of Universal Grammar". Language Sciences. 34 (5): 635–649. doi:10.1016/j.langsci.2012.03.005.
- ^ Everett, Daniel L. (August–October 2005). "Cultural Constraints on Grammar and Cognition in Pirahã: Another Look at the Design Features of Human Language" (PDF). Current Anthropology. 46 (4): 621–646. doi:10.1086/431525. hdl:2066/41103. S2CID 2223235. Archived (PDF) from the original on 2022-10-09.
- ^ Nevins, Andrew; Pesetsky, David; Rodrigues, Cilene (March 8, 2007). "Pirahã Exceptionality: a Reassessment" (PDF). International Cognition & Culture Institute. Retrieved May 1, 2025.
- ^ Lancelot, Claude (1967) [1660]. Grammaire générale et raisonnée. Scolar Press. OCLC 367432981.
- ^ "Of Universal Grammar". Encyclopædia Britannica. Vol. 2 (1st ed.). National Library of Scotland. 1771. pp. 728–9. Retrieved 1 May 2025.
- ^ Jespersen 1965, p. 53.
- ^ Chomsky, Noam. "Tool Module: Chomsky's Universal Grammar". Retrieved 2010-10-07.
- ^ Ambridge & Lieven, 2011.
- ^ a b Chomsky, Noam; Berwick, Robert C. (12 May 2017). Why Only Us?. MIT Press. ISBN 9780262533492.
References
- Ambridge, Ben; Lieven, Elena V. M. (2011-03-17). Child Language Acquisition. Cambridge University Press. ISBN 978-0-521-76804-7.
- Baker, Mark C. The Atoms of Language: The Mind's Hidden Rules of Grammar. Oxford University Press, 2003. ISBN 0-19-860632-X.
- Beattie, James. "Of Universal Grammar". Section II, The Theory of Language (1788). Rpt in Dissertations Moral and Critical (1783, 1986.)
- Blair, Hugh. Lecture 6, 7, and 8, Lectures on Rhetoric and Belles Lettres, (1783). Rpt New York: Garland, 1970.
- Burnett, James. Of the Origin and Progress of Language. Edinburgh, 1774–1792.
- Chomsky, Noam (2007), "Approaching UG from Below", Interfaces + Recursion = Language?, DE GRUYTER, pp. 1–30, doi:10.1515/9783110207552-001, ISBN 9783110207552
- Chomsky, N. Aspects of the Theory of Syntax. MIT Press, 1965. ISBN 0-262-53007-4.
- Chomsky, Noam (2017), "The Galilean Challenge: Architecture and Evolution of Language", Journal of Physics: Conference Series, 880 (1): 012015, Bibcode:2017JPhCS.880a2015C, doi:10.1088/1742-6596/880/1/012015, ISSN 1742-6588
- Elman, J., Bates, E. et al. Rethinking innateness. MIT Press, 1996.
- Harris, James. Hermes or A Philosophical Inquiry Concerning Universal Grammar. (1751, 1771.)
- Jespersen, Otto (1965) [1924], The Philosophy of Grammar, Norton
- Kliesch, C. (2012). Making sense of syntax – Innate or acquired? Contrasting universal grammar with other approaches to language acquisition. Journal of European Psychology Students, 3, 88–94.
- Lancelot, Claude; Arnauld, Antoine (1968) [1660], Grammaire générale et raisonnée contenant les fondemens de l'art de parler, expliqués d'une manière claire et naturelle, Slatkine Reprints
- "Of Universal Grammar". In "Grammar". Encyclopædia Britannica, (1771).
- Pesetsky, David. "Linguistic Universals and Universal Grammar". In The MIT Encyclopedia of the Cognitive Sciences. Ed. Robert A. Wilson and Frank C. Keil Cambridge, MA: MIT Press 1999.
- Sampson, G. The "Language Instinct" Debate. Continuum International Publishing Group, 2005. ISBN 0-8264-7384-9.
- Smith, Adam. "Considerations Concerning the First Formation of Languages". In Lectures on Rhetoric and Belles Lettres. Ed. J. C. Bryce. Indianapolis: Liberty Press, 1983, 203–226.
- Smith, Adam. "Of the Origin and Progress of Language". Lecture 3, Lectures on Rhetoric and Belles Lettres. Ed. J. C. Bryce. Indianapolis: Liberty Press, 1983, 9–13.
- Tomasello, M. Constructing a Language: A Usage-Based Theory of Language Acquisition. Harvard University Press, 2003. ISBN 0-674-01030-2.
- Valian, Virginia (1986), "Syntactic Categories in the Speech of Young Children", Developmental Psychology, 22 (4): 562–579, doi:10.1037/0012-1649.22.4.562
- Kottak, Conrad Phillip. Window on Humanity: A Concise Introduction to Anthropology. Ed. Kevin Witt, Jill Gordon. The McGraw-Hill Companies, Inc., 2005.
- White, Lydia. "Second Language Acquisition and Universal Grammar". Cambridge University Press, 2003. ISBN 0-521-79647-4
- Zuidema, Willem. How the poverty of stimulus solves the poverty of stimulus. "Evolution of Language: Fourth International Conference", Harvard University, March 2002.
Further reading
- Moro, Andrea (2016). Impossible Languages. The MIT Press. ISBN 978-0262034890.