"Except Ross Douthat is not that kind of Catholic. He is a convert, whose ancestry runs right through the Protestant establishment; his great-grandfather was governor of Connecticut. Calling himself a Catholic in the discussion of historic power and opportunity was a Rachel Dolezal–grade feat of impersonation. To the extent there is a story to be told about the decline of the cultural dominance of the Protestant ruling class, it would be the story of how Ross Douthat came to identify as Catholic, without ceding any power or influence along the way."

And:

"Douthat presents that version of things as a speculative alternative history:
'So it’s possible to imagine adaptation rather than surrender as a different WASP strategy across the 1960s and 1970s. In such a world the establishment would have still admitted more blacks, Jews, Catholics and Hispanics (and more women) to its ranks … but it would have done so as a self-consciously elite-crafting strategy, rather than under the pseudo-democratic auspices of the SAT and the high school resume and the dubious ideal of “merit.” '
"What is the difference between “a self-consciously elite-crafting strategy” and “the SAT and the high school resume and the dubious ideal of ‘merit’”? This is exactly what the Ivies did: they adapted their conception of the elite to include more different demographic groups, whose elite status was to be measured with tests and resumes."

"This study presents methods for projecting population and migration over time in cases where empirical data are missing or undependable. The methods are useful for cases in which the researcher has details of population size and structure for a limited period of time (most obviously, the end point), with scattered evidence for other times. They enable estimation of population size, including its structure by age, sex, and status, either forward or backward in time. The program keeps track of all the details. The calculated data can be reported or sampled and compared with empirical findings at various times and places, or with expected values based on other estimation procedures.
"The application of these general methods that is developed here is the projection of African populations backward in time from 1950, since 1950 is the first date for which consistently strong demographic estimates are available for national-level populations all over the African continent. The models give particular attention to migration through enslavement, which was highly important in Africa from 1650 to 1900. Details include a sensitivity analysis showing the relative significance of input variables, and techniques for calibrating the various dimensions of the projection with one another. These same methods may be applicable to quite different historical situations, as long as the data conform in structure to those considered here."
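The core idea of projecting backward from a well-documented end point can be sketched in a few lines. This is a deliberately simplified, aggregate-level illustration, not the published model (which tracks age, sex, and status in detail); the growth rate and removal counts below are hypothetical.

```python
# Illustrative backward projection of a total population with removals
# through enslavement. All rates and counts here are hypothetical.

def project_backward(pop_end, growth_rate, captives_by_decade):
    """Reconstruct decade-start populations from a known end point.

    pop_end            : population at the final (best-documented) date
    growth_rate        : assumed natural growth per decade (e.g. 0.05)
    captives_by_decade : people removed by enslavement in each decade,
                         listed from earliest to latest decade
    """
    populations = [pop_end]
    # Walk backward in time: undo one decade of natural growth, then
    # add back the people removed during that decade.
    for removed in reversed(captives_by_decade):
        earlier = populations[-1] / (1.0 + growth_rate) + removed
        populations.append(earlier)
    populations.reverse()
    return populations

# Hypothetical example: 1,000,000 people at the end date, 5% natural
# growth per decade, removals in each of the three preceding decades.
series = project_backward(1_000_000, 0.05, [20_000, 15_000, 10_000])
```

Each backward step inverts the forward accounting identity (growth minus removals), which is why calibration matters: errors in the assumed rates compound with every decade projected.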

"For spatial and network data, we consider models formed from a Markov random field (MRF) structure and the specification of a conditional distribution for each observation. Fast simulation from such MRF models is often an important consideration, particularly when repeated generation of large numbers of data sets is required (e.g., for approximating sampling distributions). However, a standard Gibbs strategy for simulating from MRF models involves single-site updates, performed using the conditional distribution of each observation in a sequential manner, so that a Gibbs iteration may become computationally involved even for relatively small samples. As an alternative, we describe a general way to simulate from MRF models using Gibbs sampling with 'concliques' (i.e., groups of non-neighboring observations). Compared to standard Gibbs sampling, this simulation scheme can be much faster by reducing Gibbs steps and by independently updating all observations per conclique at once. We detail the simulation method, establish its validity, and assess its computational performance through numerical studies, where speed advantages are shown for several spatial and network examples."
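The conclique idea is easy to see on a grid: with a 4-nearest-neighbor dependence structure, the black and white cells of a checkerboard form two concliques, since no two same-colored cells are neighbors and so each color class can be updated in a single pass (or fully in parallel). A minimal sketch, assuming a Gaussian conditional distribution with mean proportional to the neighbor average (an illustrative choice, not the paper's specific models):

```python
import random

# Conclique-based Gibbs sampling for a Gaussian conditional MRF on an
# n x n grid with 4-nearest-neighbor dependence. The checkerboard
# colors are the two concliques. The dependence parameter eta is kept
# small so the joint model is well defined.

def conclique_gibbs(n=10, eta=0.2, sweeps=100, seed=1):
    random.seed(seed)
    x = [[0.0] * n for _ in range(n)]

    def neighbor_mean(i, j):
        nbrs = [x[p][q]
                for p, q in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                if 0 <= p < n and 0 <= q < n]
        return sum(nbrs) / len(nbrs)

    for _ in range(sweeps):
        for color in (0, 1):                  # one pass per conclique
            for i in range(n):
                for j in range(n):
                    if (i + j) % 2 == color:  # checkerboard membership
                        # conditional: N(eta * neighbor mean, 1)
                        x[i][j] = random.gauss(eta * neighbor_mean(i, j), 1.0)
    return x

field = conclique_gibbs()
```

Within a color pass the order of updates is irrelevant, because every neighbor of a cell belongs to the other conclique; that independence is exactly what lets the scheme replace n² sequential Gibbs steps with two block updates.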

"Quantile regression, as introduced by Koenker and Bassett (1978), may be viewed as an extension of classical least squares estimation of conditional mean models to the estimation of an ensemble of models for several conditional quantile functions. The central special case is the median regression estimator which minimizes a sum of absolute errors. Other conditional quantile functions are estimated by minimizing an asymmetrically weighted sum of absolute errors. Quantile regression methods are illustrated with applications to models for CEO pay, food expenditure, and infant birthweight."
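The "asymmetrically weighted sum of absolute errors" is the check (pinball) loss, ρ_τ(u) = u·(τ − 1{u < 0}). Minimizing it over a constant recovers the τ-th sample quantile; quantile regression does the same over a linear predictor. A minimal covariate-free sketch:

```python
# Check (pinball) loss: overestimates are weighted (1 - tau),
# underestimates are weighted tau.
def check_loss(u, tau):
    return u * (tau if u >= 0 else tau - 1.0)

def sample_quantile(ys, tau):
    # The objective is piecewise linear with kinks at the data points,
    # so its minimum is attained at an observed value and searching
    # those candidates suffices.
    return min(ys, key=lambda c: sum(check_loss(y - c, tau) for y in ys))

data = [1.0, 2.0, 3.0, 4.0, 100.0]
median = sample_quantile(data, 0.5)  # tau = 0.5: median regression case
```

Note the robustness the abstract alludes to: the outlier 100.0 barely moves the median, while it would drag a least-squares mean far to the right. Raising τ toward 1 shifts the minimizer toward the upper tail.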

"We present two new methods for estimating the order (memory depth) of a finite alphabet Markov chain from observation of a sample path. One method is based on entropy estimation via recurrence times of patterns, and the other relies on a comparison of empirical conditional probabilities. The key to both methods is a qualitative change that occurs when a parameter (a candidate for the order) passes the true order. We also present extensions to order estimation for Markov random fields."
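The second method's "qualitative change" can be illustrated directly: below the true order, conditioning on one extra symbol visibly changes the empirical next-symbol distribution; at or above it, the change falls to sampling noise. A minimal sketch, where the tolerance and the toy chain are illustrative choices and not the paper's estimator:

```python
from collections import Counter, defaultdict

def cond_probs(seq, k):
    """Empirical P(next symbol | last k symbols) from one sample path."""
    counts = defaultdict(Counter)
    for i in range(k, len(seq)):
        counts[seq[i - k:i]][seq[i]] += 1
    return {ctx: {a: c / sum(nxt.values()) for a, c in nxt.items()}
            for ctx, nxt in counts.items()}

def estimate_order(seq, max_order=3, tol=0.1):
    for k in range(max_order + 1):
        pk, pk1 = cond_probs(seq, k), cond_probs(seq, k + 1)
        # Largest change in any conditional probability when the
        # context is extended from k to k+1 symbols.
        gap = max(abs(probs[a] - pk[ctx[1:]].get(a, 0.0))
                  for ctx, probs in pk1.items() for a in probs)
        if gap < tol:
            return k  # extending the context no longer helps
    return max_order + 1  # no candidate order passed the test

# A strictly alternating sequence is a (degenerate) order-1 chain:
# the next symbol is determined by the current one, but not by the
# marginal frequencies alone.
order = estimate_order("ab" * 200)
```

For this sequence, k = 0 fails (P(b | a) = 1 differs from the marginal P(b) = 0.5), while k = 1 passes, so the estimate is 1.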

"Randomized neural networks are immortalized in this AI koan: In the days when Sussman was a novice, Minsky once came to him as he sat hacking at the PDP-6. 'What are you doing?' asked Minsky. 'I am training a randomly wired neural net to play tic-tac-toe,' Sussman replied. 'Why is the net wired randomly?' asked Minsky. Sussman replied, 'I do not want it to have any preconceptions of how to play.' Minsky then shut his eyes. 'Why do you close your eyes?' Sussman asked his teacher. 'So that the room will be empty,' replied Minsky. At that moment, Sussman was enlightened. We analyze shallow random networks with the help of concentration of measure inequalities. Specifically, we consider architectures that compute a weighted sum of their inputs after passing them through a bank of arbitrary randomized nonlinearities. We identify conditions under which these networks exhibit good classification performance, and bound their test error in terms of the size of the dataset and the number of random nonlinearities."
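The architecture described is small enough to sketch: inputs pass through a bank of randomized nonlinearities (random-weight ReLUs here), and only the output layer's weighted sum is trained. The sizes, seed, ReLU choice, and perceptron-style training rule below are all illustrative assumptions, not the paper's setup:

```python
import random

def random_features(x, weights):
    # One random ReLU unit per (w, b) pair; these weights are never
    # trained, matching the "randomly wired" bank of nonlinearities.
    return [max(0.0, sum(wi * xi for wi, xi in zip(w, x)) + b)
            for w, b in weights]

random.seed(0)
D = 50  # number of random nonlinearities
weights = [([random.gauss(0, 1), random.gauss(0, 1)], random.gauss(0, 1))
           for _ in range(D)]

# XOR: not linearly separable in the raw inputs, but typically
# separable in a sufficiently large random feature space.
X = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
y = [0, 1, 1, 0]
feats = [random_features(x, weights) for x in X]

# Train only the output weights, with the classic perceptron rule.
w_out = [0.0] * D
for _ in range(200):
    for f, target in zip(feats, y):
        pred = 1 if sum(wi * fi for wi, fi in zip(w_out, f)) > 0 else 0
        if pred != target:
            sign = 1 if target == 1 else -1
            w_out = [wi + sign * fi for wi, fi in zip(w_out, f)]

preds = [1 if sum(wi * fi for wi, fi in zip(w_out, f)) > 0 else 0
         for f in feats]
```

The paper's question is then quantitative: how large must D be, relative to the dataset, for the trained readout over random features to generalize well? Their bounds answer this via concentration of measure.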

"I introduce Forecastable Component Analysis (ForeCA), a novel dimension reduction technique for temporally dependent signals. Based on a new forecastability measure, ForeCA finds an optimal transformation to separate a multivariate time series into a forecastable and an orthogonal white noise space. I present a converging algorithm with a fast eigenvector solution. Applications to financial and macro-economic time series show that ForeCA can successfully discover informative structure, which can be used for forecasting as well as classification. The R package ForeCA (this http URL) accompanies this work and is publicly available on CRAN."
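The forecastability measure behind ForeCA can be sketched as spectral entropy: Ω(x) = 1 − H(normalized spectral density)/log(n_freq). White noise has a flat spectrum, so Ω is near 0; a pure sinusoid concentrates its spectrum in one bin, so Ω is near 1. The naive O(n²) DFT below is for clarity only, and this sketch covers just the measure, not the optimization over linear combinations that ForeCA performs on top of it:

```python
import cmath
import math
import random

def forecastability(x):
    """Omega(x) = 1 - spectral entropy / max entropy, in [0, 1]."""
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    # Periodogram at the nonzero Fourier frequencies (naive DFT).
    spec = []
    for k in range(1, n // 2 + 1):
        s = sum(v * cmath.exp(-2j * math.pi * k * t / n)
                for t, v in enumerate(x))
        spec.append(abs(s) ** 2)
    total = sum(spec) or 1.0
    p = [s / total for s in spec]      # normalized spectral density
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return 1.0 - entropy / math.log(len(p))

random.seed(0)
n = 128
noise = [random.gauss(0, 1) for _ in range(n)]                    # flat spectrum
sine = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]      # one spectral line
```

ForeCA then searches for the linear combination of the input series that maximizes this Ω, which is how it separates a forecastable subspace from the white noise remainder.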