Jeff Lidz


University of Maryland

Abstract


There is a paradox in language acquisition concerning the perception of the input. If learners can veridically parse the input, then there is nothing to learn from it; but if they cannot parse the input, then it is unclear how they avoid faulty inferences about structure, or even learn from it at all (Valian 1990, Fodor 1996). In this talk, I examine how children deal with their input, given only partial knowledge of the target grammar. Specifically, I focus on the intersection of wh-movement, transitivity, and verb learning. First, I show that infants as early as 16 months can use a verb’s distribution in transitive and intransitive clauses to draw inferences about its argument-taking properties. I also show that knowledge of transitivity precedes children’s ability to represent wh-dependencies. But this raises a puzzle about the acquisition of transitivity to begin with: how can infants identify a verb’s transitivity given the high rate of wh-questions in speech to children? If infants do not yet represent wh-questions like “what did Amy fix?” as involving a dependency, what stops them from erroneously inferring that “fix” does not require an object? We show computationally that infants who do not yet have that ability can learn to “filter” non-basic clauses out of the data they use for verb learning. Thus, learners may be able to overcome the limits of partial knowledge by filtering out data that would lead to faulty inferences about their grammar.