Parametric theories of stress (e.g., Dresher & Kaye 1990, henceforth DK) model typology using a small set of choices (parameters) specified in UG. One often-cited motivation for the parametric approach (Chomsky 1981) is its presumed advantage for the language-learning child, whose task is reduced to setting the values of these pre-specified parameters. Despite this apparent learning advantage, parameter setting is a difficult problem: parameters are interdependent, and determining the target grammar requires inferences across many word forms (DK). Existing parametric stress learners address these challenges by positing domain-specific learning mechanisms, such as setting parameters in a pre-specified order and/or associating parameters with substantive cues (pre-specified phonological configurations that trigger a particular parameter value; see DK). Moreover, Pearl (2011) argues that Yang's (2002) general-purpose statistical learner for parameters, the Naïve Parameter Learner (NPL), requires such domain-specific mechanisms to learn English stress.
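To make the learning problem concrete, the NPL updates parameter probabilities with a linear reward-penalty scheme: it samples a grammar from its current parameter probabilities, rewards the sampled values if that grammar accounts for the input, and penalizes them otherwise. The following minimal Python sketch illustrates the general idea only; the two-parameter toy "parser", the learning rate, and all names are assumptions for illustration, not DK's parametric system or Yang's implementation.

```python
import random

GAMMA = 0.1  # learning rate (assumed value for this sketch)

def sample_grammar(probs):
    """Sample a value (True/False) for each binary parameter independently."""
    return [random.random() < p for p in probs]

def update(probs, grammar, success):
    """Linear reward-penalty: reward sampled values on success, penalize on failure."""
    new = []
    for p, v in zip(probs, grammar):
        q = p if v else 1 - p          # probability mass on the sampled value
        if success:
            q = q + GAMMA * (1 - q)    # reward the sampled value
        else:
            q = (1 - GAMMA) * q        # penalize the sampled value
        new.append(q if v else 1 - q)
    return new

# Toy stand-in for "the grammar parses the input": the (hypothetical) target
# grammar sets parameter 0 to True and parameter 1 to False.
def parses(grammar):
    return grammar[0] and not grammar[1]

random.seed(0)
probs = [0.5, 0.5]
for _ in range(2000):
    g = sample_grammar(probs)
    probs = update(probs, g, parses(g))

print([round(p, 2) for p in probs])
```

Note that each parameter is updated based only on whether the whole sampled grammar succeeds, which is one source of the interdependence problem noted above: a correctly set parameter can be penalized because some other parameter in the sampled grammar was wrong.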

In this paper we introduce Expectation Driven Parameter Learning (EDPL), a general-purpose, incremental, statistical model for learning parameters that builds on Expectation Driven Learning (EDL; Jarosz 2015). We show that EDPL performs well on a representative subset of the languages in DK's parametric system. We also test the NPL and show that it does not succeed on the same learning problem. Our results weaken the case for domain-specific learning mechanisms, supporting a more modest UG.