
Rich languages from poor inputs. Ed. by Massimo Piattelli-Palmarini and Robert C. Berwick. Oxford: Oxford University Press, 2013. Pp. xiii, 313. ISBN 9780199590339. $110 (Hb).

Reviewed by Iris Berent

This collection is a celebration of the late Carol Chomsky’s bold, pioneering lifework. It is also an opportunity to reflect on the state of the art in linguistics and its sister disciplines—psycholinguistics and reading research—on three questions that are at the heart of her legacy: the richness of language acquired from impoverished input, its gradual development, and its role in reading and writing. The three parts of this volume address each of these questions in turn.

A rather humbling demonstration of the resilience of language to extreme sensory deprivation is presented by Carol Chomsky’s own work on the linguistic abilities of deaf-blind individuals who acquired language haptically, via the Tadoma method, in which the learner perceives speech by placing a hand on the speaker’s face and neck. Despite radical limitations in input, the linguistic capacities of these individuals are nearly intact; the detailed case studies are reprinted in the final chapter of this volume. Another linguistic triumph in the face of sensory adversity is the ability of blind children to infer the root meanings of verbs such as see and look from their unique syntactic structure and the putative universal rules linking syntax to semantics, a case documented with great clarity and elegance in the chapter by Lila Gleitman and Barbara Landau.

Poverty of stimulus, however, is not restricted to sensory deprivation, nor is it unique to the deaf and blind child. Indeed, big (linguistic) data do not provide discovery procedures for grammatical rules. Just as the impoverished sensory input available to the blind child fails to specify the semantics of see, so does the wealth of linguistic evidence available to typical children underdetermine which property of the input, linear word order or hierarchical syntactic structure, governs the formation of sentences. This conundrum, outlined by Noam Chomsky over four decades ago (1968), relates to the challenge of forming polar interrogatives with relative clauses (PIRC, see 1a), a task that children accomplish within the first four years of life (Crain & Nakayama 1987).

  (1) a. Is the little boy who is crying hurt?
      b. The little boy who is crying is hurt.
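To make the induction problem concrete, here is a minimal sketch (my illustration, not material from the volume) contrasting the two hypotheses a learner might entertain when turning (1b) into a question: a linear rule that fronts the first is in the string, and a structure-dependent rule that fronts the auxiliary of the main clause. The hand-coded marking of the relative clause is an assumption that stands in for genuine hierarchical structure.

```python
# Illustrative sketch only: two candidate rules for forming the question in (1a)
# from the declarative in (1b). Not code from the volume under review.

declarative = "the little boy who is crying is hurt"

def linear_rule(sentence: str) -> str:
    """Linear-order hypothesis: front the first 'is' in the string."""
    words = sentence.split()
    aux = words.pop(words.index("is"))   # leftmost 'is', regardless of structure
    return " ".join([aux] + words).capitalize() + "?"

def structure_rule(sentence: str) -> str:
    """Structure-dependent hypothesis: front the auxiliary of the main clause.
    The relative clause 'who is crying' is marked by hand here; a real grammar
    would identify it from hierarchical structure, not from the word string."""
    words = sentence.split()
    relative_clause = {3, 4, 5}          # indices of 'who is crying' (hand-coded assumption)
    i = next(k for k, w in enumerate(words) if w == "is" and k not in relative_clause)
    aux = words.pop(i)
    return " ".join([aux] + words).capitalize() + "?"

print(linear_rule(declarative))      # Is the little boy who crying is hurt?  (ill-formed)
print(structure_rule(declarative))   # Is the little boy who is crying hurt?  (matches 1a)
```

For simple declaratives such as 'The boy is hurt', the two rules coincide; only sentences like (1b) pull them apart, which is what makes the induction problem so pointed.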

But whether children do in fact lack the linguistic evidence necessary to solve the induction problem has been the subject of debate. The three chapters by Xuan-Nga Cao Kam and Janet Dean Fodor, by Robert Berwick, Noam Chomsky, and Massimo Piattelli-Palmarini, and by Noam Chomsky revisit this challenge.

Kam and Fodor’s detailed analysis of the word-based bigram models of Reali & Christiansen 2005 demonstrates that, absent an inherent bias to attend to syntactic structure, learners fail at even the simplest task of distinguishing well-formed sentences from ill-formed ones. Similar limitations are documented by Berwick, Chomsky, and Piattelli-Palmarini in two other models: a trigram version based on Reali & Christiansen 2005 and the ‘weak substitutability’ approach of Clark & Eyraud 2007. According to Berwick and colleagues, the insensitivity to structure (a property they distinguish from the representation of hierarchical structure) also persists in the Bayesian model-selection approach of Perfors et al. 2011.
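The flavor of the models under discussion can be conveyed with a toy bigram scorer. The sketch below is mine, not Kam and Fodor’s reanalysis or Reali and Christiansen’s materials; the miniature corpus and the add-one smoothing are invented for illustration. Its point is simply that such a model assigns probabilities from adjacent word co-occurrences alone, with no representation of the clause structure that actually distinguishes the well-formed question from its ill-formed counterpart.

```python
from collections import Counter
from math import log

# Toy illustration only: the corpus, smoothing, and test items are invented; they
# stand in for the child-directed-speech corpora and the bigram models of
# Reali & Christiansen 2005 that Kam and Fodor reanalyze.
corpus = [
    "the boy is hurt",
    "the boy who is crying is sad",
    "is the boy hurt",
    "who is crying",
]

bigram_counts = Counter()
context_counts = Counter()
for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]
    context_counts.update(words[:-1])
    bigram_counts.update(zip(words[:-1], words[1:]))

vocab = {w for s in corpus for w in s.split()} | {"<s>", "</s>"}

def bigram_logprob(sentence: str) -> float:
    """Add-one-smoothed bigram log probability: the model sees only which words
    are adjacent, never which words belong to the same clause."""
    words = ["<s>"] + sentence.split() + ["</s>"]
    return sum(
        log((bigram_counts[(a, b)] + 1) / (context_counts[a] + len(vocab)))
        for a, b in zip(words[:-1], words[1:])
    )

print(bigram_logprob("is the boy who is crying hurt"))   # well-formed PIRC
print(bigram_logprob("is the boy who crying is hurt"))   # ill-formed counterpart
```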

But, as Berwick and colleagues point out, the poverty-of-the-stimulus challenge goes far beyond the narrow task of sifting well-formed sentences from ill-formed ones, and beyond the PIRC construction or English. An explanation of these facts can only be given within a broader account of the syntactic operations that are allowable in human language. And such an explanation, suggests Noam Chomsky, must begin with a question—the puzzle of why syntactic operations appeal to structural distance, rather than the linear distance among elements. The emergence of this feature in all grammars must reflect the design of the language organ itself (specifically, the computational efficiency of the labeling algorithm), rather than properties of the linguistic input and the functional demands...
