Empirical models

We refine and extend prior views of the description, purposes, and contexts-of-use of acknowledgment acts through empirical examination of the use of acknowledgments in task-based conversation. We distinguish three broad classes of acknowledgments (other→ackn, self→other→ackn, and self+ackn) and present a catalogue of 13 patterns within these classes that account for the specific uses of acknowledgment in the corpus.

This book provides the state of the art in the investigation and in-depth analysis of hydraulic conductivity, from theoretical to semi-empirical models, as well as policy development associated with the management of land resources affected by drainage-problem soils. A group of international experts contributed to the development of this book. It is envisaged that this thought-provoking book will excite and appeal to academics, engineers, researchers and university students who seek broad and in-depth knowledge about hydraulic conductivity. ...

The second edition of "Model Predictive Control" provides a thorough introduction to theoretical and practical aspects of the most commonly used MPC strategies. It bridges the gap between the powerful but often abstract techniques of control researchers and the more empirical approach of practitioners. The book demonstrates that a powerful technique does not always require complex control algorithms. Many new exercises and examples have also been added throughout.

TWO ESSAYS IN INTERNATIONAL ECONOMICS: AN EMPIRICAL APPROACH TO PURCHASING POWER PARITY AND THE MONETARY MODEL OF EXCHANGE RATE DETERMINATION

I adopt a different strategy: I compare housing markets that differ in the strength of the residential location-school assignment link, and I develop simple reduced-form implications of parental valuations for the across-school distribution of student characteristics and educational outcomes as a function of the strength of this link.

We describe a simple variant of the interpolated Markov model with non-emitting state transitions and prove that it is strictly more powerful than any Markov model. Empirical results demonstrate that the non-emitting model outperforms the interpolated model on the Brown corpus and on the Wall Street Journal under a wide range of experimental conditions. The non-emitting model is also much less prone to overtraining. The remainder of our article consists of four sections.
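The interpolated baseline mentioned above mixes higher- and lower-order estimates. A minimal sketch of that idea (not the authors' code, and not the non-emitting variant) is a Jelinek-Mercer-style interpolated bigram model; the toy corpus and the weight `lam` are illustrative assumptions:

```python
# Hedged sketch: an interpolated bigram model that mixes the maximum-
# likelihood bigram and unigram estimates with a fixed weight lam.
# Corpus and lam are made-up illustrative values.
from collections import Counter

corpus = "the cat sat on the mat the cat ate".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
N = len(corpus)
lam = 0.7  # interpolation weight, an assumed constant

def p_interp(prev, word):
    """P(word | prev) = lam * ML bigram + (1 - lam) * ML unigram."""
    p_bi = bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0
    p_uni = unigrams[word] / N
    return lam * p_bi + (1 - lam) * p_uni

# For a history seen in training, the mixture sums to one over the vocabulary.
total = sum(p_interp("the", w) for w in unigrams)
```

The lower-order unigram term gives nonzero probability to bigrams never seen in training, which is the role interpolation plays in the models compared above.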

Mathematical modelling is the process of formulating an abstract model in mathematical language to describe the complex behaviour of a real system. Mathematical models are quantitative models, often expressed in terms of ordinary differential equations and partial differential equations. Mathematical models can also be statistical models, fuzzy-logic models and empirical relationships. In fact, any model described using mathematical language can be called a mathematical model.
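As a concrete instance of the ODE case mentioned above, the sketch below integrates the logistic growth model dx/dt = r·x·(1 − x/K) with the forward Euler method; the parameter values (r, K, x0) are illustrative assumptions, not taken from the text:

```python
# Minimal sketch: a mathematical model expressed as an ordinary
# differential equation, the logistic growth model
#   dx/dt = r * x * (1 - x / K),
# integrated numerically with the forward Euler method.

def logistic_euler(x0, r, K, dt, steps):
    """Integrate dx/dt = r*x*(1 - x/K) with forward Euler from x(0) = x0."""
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = x + dt * r * x * (1 - x / K)  # one Euler step
        trajectory.append(x)
    return trajectory

# Assumed parameters: growth rate r, carrying capacity K, initial state x0.
traj = logistic_euler(x0=1.0, r=0.5, K=100.0, dt=0.1, steps=2000)
```

Over a long horizon the trajectory approaches the carrying capacity K, the fixed point of the equation.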

INFORMATION SYSTEM QUALITY: AN EXAMINATION OF SERVICE-BASED MODELS AND ALTERNATIVES

If both peer group and school effectiveness are important to parents, then the Tiebout mechanism rewards effective administrators only when there are many districts. Model (3) suggests that in this case the test-score gap between high- and low-income schools will tend to be larger in markets with a great deal of interdistrict competition than in those with less Tiebout choice. I test for this in the empirical analysis below. ...

We develop and implement a framework in which prior views and empirical evidence about pricing models and managerial skill can be incorporated formally into the investment decision. Our framework relies on a set of passive indexes or "assets," consisting of nonbenchmark assets as well as the benchmark assets prescribed by a pricing model. A common interpretation of alpha, the intercept in a regression of the fund's excess return on the benchmarks, is that it represents the skill of the fund's manager in selecting mispriced securities.
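The regression interpretation of alpha described above can be illustrated directly: regress a fund's excess returns on a benchmark's excess returns and read off the intercept. The return series below is synthetic data generated for demonstration, not data from the paper:

```python
# Hedged sketch: alpha as the intercept of a least-squares regression
# of fund excess returns on benchmark excess returns.
import numpy as np

rng = np.random.default_rng(0)
benchmark = rng.normal(0.01, 0.04, size=240)   # 240 months of benchmark excess returns (synthetic)
true_alpha, true_beta = 0.002, 1.1             # assumed "true" parameters for the simulation
fund = true_alpha + true_beta * benchmark + rng.normal(0.0, 0.005, size=240)

# Design matrix: a column of ones (the intercept, i.e. alpha) plus the benchmark.
X = np.column_stack([np.ones_like(benchmark), benchmark])
coef, *_ = np.linalg.lstsq(X, fund, rcond=None)
alpha, beta = coef
```

With enough observations the estimated intercept recovers the simulated alpha, which is what the "skill" interpretation rests on.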

We investigate the empirical behavior of n-gram discounts within and across domains. When a language model is trained and evaluated on two corpora from exactly the same domain, discounts are roughly constant, matching the assumptions of modified Kneser-Ney LMs. However, when training and test corpora diverge, the empirical discount grows essentially as a linear function of the n-gram count. We adapt a Kneser-Ney language model to incorporate such growing discounts, resulting in perplexity improvements over modified Kneser-Ney and Jelinek-Mercer baselines. ...
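The "roughly constant" discounts that modified Kneser-Ney assumes come from closed-form estimates based on counts-of-counts (Chen and Goodman's formulas). A sketch of those estimates, with made-up illustrative count values, is:

```python
# Hedged sketch: the closed-form discount estimates used by modified
# Kneser-Ney smoothing, computed from counts-of-counts n1..n4
# (the number of n-gram types occurring exactly 1, 2, 3, and 4 times).

def modified_kn_discounts(n1, n2, n3, n4):
    """Return (D1, D2, D3plus): one fixed discount per count bucket."""
    Y = n1 / (n1 + 2.0 * n2)
    D1 = 1.0 - 2.0 * Y * n2 / n1
    D2 = 2.0 - 3.0 * Y * n3 / n2
    D3plus = 3.0 - 4.0 * Y * n4 / n3
    return D1, D2, D3plus

# Illustrative counts-of-counts, not from any real corpus.
D1, D2, D3plus = modified_kn_discounts(n1=1000, n2=400, n3=200, n4=120)
```

The abstract's point is that these fixed per-bucket discounts fit the in-domain case well, but cross-domain the empirical discount keeps growing with the n-gram count instead of leveling off.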

In this paper, we propose a linguistically annotated reordering model for BTG-based statistical machine translation. The model incorporates linguistic knowledge to predict orders for both syntactic and non-syntactic phrases. The linguistic knowledge is automatically learned from source-side parse trees through an annotation algorithm. We empirically demonstrate that the proposed model leads to a significant improvement of 1.55% in the BLEU score over the baseline reordering model on the NIST MT-05 Chinese-to-English translation task. ...

This paper reports the development of log-linear models for disambiguation in wide-coverage HPSG parsing. The estimation of log-linear models requires high computational cost, especially with wide-coverage grammars. Using techniques to reduce the estimation cost, we trained the models on 20 sections of the Penn Treebank. A series of experiments empirically evaluated the estimation techniques and examined the performance of the disambiguation models on the parsing of real-world sentences. ...

This paper presents empirical studies and closely corresponding theoretical models of the performance of a chart parser exhaustively parsing the Penn Treebank with the Treebank’s own CFG grammar. We show how performance is dramatically affected by rule representation and tree transformations, but little by top-down vs. bottom-up strategies.

We present an empirical study of the applicability of Probabilistic Lexicalized Tree Insertion Grammars (PLTIG), a lexicalized counterpart to Probabilistic Context-Free Grammars (PCFG), to problems in stochastic natural-language processing. Comparing the performance of PLTIGs with non-hierarchical N-gram models and PCFGs, we show that PLTIG combines the best aspects of both, with language modeling capability comparable to N-grams, and improved parsing performance over its nonlexicalized counterpart. Furthermore, training of PLTIGs displays faster convergence than PCFGs. ...

In this work I address the challenge of augmenting n-gram language models according to prior linguistic intuitions. I argue that the family of hierarchical Pitman-Yor language models is an attractive vehicle through which to address the problem, and demonstrate the approach by proposing a model for German compounds. In an empirical evaluation, the model outperforms the Kneser-Ney model in terms of perplexity, and achieves preliminary improvements in English-German translation.

This paper presents a new approach to partial parsing of context-free structures. The approach is based on Markov Models. Each layer of the resulting structure is represented by its own Markov Model, and output of a lower layer is passed as input to the next higher layer. An empirical evaluation of the method yields very good results for NP/PP chunking of German newspaper texts.
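One layer of the architecture described above can be sketched as a small hidden Markov model whose Viterbi decoding assigns a chunk tag per token; a higher layer would then take this tag sequence as its input. The states, transition and emission probabilities below are toy assumptions for illustration only:

```python
# Hedged sketch: Viterbi decoding for one Markov-model layer of an
# NP-chunking cascade. All probabilities are invented toy values.
import math

states = ["B-NP", "I-NP", "O"]
start = {"B-NP": 0.6, "I-NP": 0.0, "O": 0.4}
trans = {
    "B-NP": {"B-NP": 0.1, "I-NP": 0.6, "O": 0.3},
    "I-NP": {"B-NP": 0.1, "I-NP": 0.5, "O": 0.4},
    "O":    {"B-NP": 0.5, "I-NP": 0.0, "O": 0.5},
}
emit = {
    "B-NP": {"the": 0.5, "cat": 0.1},
    "I-NP": {"cat": 0.6},
    "O":    {"sleeps": 0.7},
}

def logp(p):
    """Log probability, with log(0) mapped to -inf."""
    return math.log(p) if p > 0 else float("-inf")

def viterbi(words):
    """Return the most probable state sequence for `words`."""
    scores = [{s: logp(start[s]) + logp(emit[s].get(words[0], 0.0)) for s in states}]
    back = []
    for w in words[1:]:
        col, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: scores[-1][p] + logp(trans[p][s]))
            col[s] = scores[-1][prev] + logp(trans[prev][s]) + logp(emit[s].get(w, 0.0))
            ptr[s] = prev
        scores.append(col)
        back.append(ptr)
    last = max(states, key=lambda s: scores[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

tags = viterbi(["the", "cat", "sleeps"])
```

In the cascade described by the paper, the decoded tags (rather than the raw words) would form the observation sequence for the next layer's Markov model.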

This paper introduces new methods based on exponential families for modeling the correlations between words in text and speech. While previous work assumed the effects of word co-occurrence statistics to be constant over a window of several hundred words, we show that their influence is nonstationary on a much smaller time scale.

I am a clinical psychologist. The guiding training model for clinical psychology is called "scientist-practitioner," which demands that any Ph.D. clinical psychologist be trained as both a scientist and a practitioner. Since receiving my Ph.D. ...