
Conditioning is the generally agreed-upon method for updating probability distributions when one learns that an event is certainly true. But it has been argued that we need other rules, in particular the rule of cross-entropy minimization, to handle updates that involve uncertain information. In this paper we re-examine such a case: van Fraassen's Judy Benjamin problem [1987], which in essence asks how one might update given the value of a conditional probability. We argue that---contrary to the suggestions in the literature---it is possible to use simple conditionalization in this case, and thereby obtain answers that agree fully with intuition. This contrasts with proposals such as cross-entropy, which are easier to apply but can give unsatisfactory answers. Based on the lessons from this example, we speculate on some general philosophical issues concerning probability update.

1 INTRODUCTION

How should one update one's beliefs, represented as a probability distribution Pr over some ...
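As a minimal sketch of the update rule at issue (not the paper's own construction), the following illustrates simple conditionalization over a toy four-cell partition loosely modeled on the Judy Benjamin setup; the cell labels and the uniform prior are assumptions for illustration only:

```python
from fractions import Fraction

# Simple conditionalization: given a prior over a finite sample space,
# learning that event E is certainly true sends Pr to Pr(. | E).
# The four cells are a hypothetical stand-in for the Judy Benjamin regions
# (Blue/Red territory crossed with Headquarters/Second-Company areas).
prior = {
    ("blue", "hq"): Fraction(1, 4),
    ("blue", "2nd"): Fraction(1, 4),
    ("red", "hq"): Fraction(1, 4),
    ("red", "2nd"): Fraction(1, 4),
}

def condition(pr, event):
    """Return Pr(. | event) for an event given as a predicate on outcomes."""
    z = sum(p for w, p in pr.items() if event(w))  # Pr(event)
    return {w: (p / z if event(w) else Fraction(0)) for w, p in pr.items()}

# Learning with certainty "you are in Red territory":
posterior = condition(prior, lambda w: w[0] == "red")
# The two red cells are renormalized to 1/2 each; the blue cells drop to 0.
```

The hard part the paper addresses is that Judy learns only the *value of a conditional probability*, not the truth of an event like the one above; the sketch shows just the certain-evidence baseline that other rules generalize.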

... incorporate it? There is in fact a rich literature on the subject (e.g., see [Bacchus, Grove, Halpern, and Koller 1994; Diaconis and Zabell 1982; Jeffrey 1983; Jaynes 1983; Paris and Vencovska 1992; Uffink 1995]). Most proposals attempt to find the probability distribution that satisfies the new information and is in some sense the "closest" to the original distribution Pr. Certainly the best known and most...

...ty or of information? What is wrong with, say, the standard deviation? Indeed, there even exist examples in which the entropy does not seem to reflect one’s intuitive notion of information (see e.g., [5]). Other entropies, justified by a different choice of axioms, were subsequently introduced [6]-[9]. From our point of view the real limitation is that Shannon was not concerned with inductive inferen...


The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation values of certain functions with their empirical averages. There are, however, various other ways in which one can construct constraints from empirical data, and these can make the maximum entropy principle lead to very different probability assignments. This paper shows that an argument by Jaynes to justify the usual constraint rule is unsatisfactory and investigates several alternative choices. The choice of a constraint rule is also show...
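The usual constraint rule discussed above can be made concrete with Jaynes's classic dice example: take the empirical average of f(x) = x as an expectation constraint and maximize Shannon entropy. A hedged sketch, with the function name and the bisection solver being illustrative choices rather than anything from the paper:

```python
import math

# Maximum entropy over outcomes 1..6 subject to E[x] = target_mean.
# The maxent solution has the exponential-family form p_k ∝ exp(lam * k);
# we solve for the Lagrange multiplier lam by bisection, using the fact
# that the constrained mean is strictly increasing in lam.
def maxent_dice(target_mean, lo=-10.0, hi=10.0, iters=200):
    ks = range(1, 7)

    def mean(lam):
        w = [math.exp(lam * k) for k in ks]
        z = sum(w)
        return sum(k * wk for k, wk in zip(ks, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * k) for k in ks]
    z = sum(w)
    return [wk / z for wk in w]

# Empirical average 4.5 (above the fair-die mean 3.5) tilts the
# distribution toward the higher faces.
p = maxent_dice(4.5)
```

A different constraint rule applied to the same data (e.g., constraining a quantile instead of the mean) would yield a different maxent distribution, which is precisely the sensitivity the abstract points out.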


The relationship between three probability distributions and their maximizable entropy forms is discussed without postulating any entropy property. For this purpose, the entropy I is defined as a measure of uncertainty of the probability distribution of a random variable x by the variational relationship dI = d⟨x⟩ − ⟨dx⟩, a definition underlying the maximization of entropy for the corresponding distribution.

...ogorov-Sinai entropy for example). But the question remains open in the scientific community about whether or not Shannon entropy is the unique useful measure of statistical uncertainty or information [9]. The origin of this question can be traced back to the principle of maximum entropy (maxent) by Jaynes [10], who claimed the Shannon entropy was singled out to be the only consistent measure of uncerta...


Abstract—We present a mathematical formulation for the optimization of query forgery for private information retrieval, in the sense that the privacy risk is minimized for a given traffic and processing overhead. The privacy risk is measured as an information-theoretic divergence between the user’s query distribution and the population’s, which includes the entropy of the user’s distribution as a special case. We carefully justify and interpret our privacy criterion from diverse perspectives. Our formulation poses a mathematically tractable problem that bears substantial resemblance to rate-distortion theory.

Index Terms—Entropy, Kullback–Leibler divergence, privacy risk, private information retrieval, query forgery.
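The privacy criterion described in the abstract can be sketched numerically: the risk is the Kullback–Leibler divergence D(q‖p) between the user's query distribution q and the population's p, and when p is uniform this reduces to log n minus the entropy of q. The distributions below are made-up toy data, not from the paper:

```python
import math

# Privacy risk as KL divergence between the user's query distribution q
# and the population's distribution p (natural-log units).
def kl(q, p):
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

population = [0.25, 0.25, 0.25, 0.25]  # uniform over 4 query categories
user = [0.7, 0.1, 0.1, 0.1]            # skewed, hence identifiable

risk = kl(user, population)

# Special case mentioned in the abstract: for a uniform population over n
# categories, D(q || uniform) = log(n) - H(q), so minimizing the divergence
# amounts to maximizing the entropy of the user's distribution.
entropy = -sum(qi * math.log(qi) for qi in user if qi > 0)
```

Forging queries moves q toward p, trading traffic overhead for a lower divergence; that trade-off is what the rate-distortion-like formulation optimizes.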

...ith the rationale behind maximum-entropy methods, an involved topic not without controversy, which arose in statistical mechanics [35], [36], and has been extensively addressed by abundant literature [37] over the past half century. Some of the arguments advocating maximum-entropy methods deal with the highest number of permutations with repeated elements associated with an empirical distribution [38]...
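The permutation-counting argument alluded to above can be checked numerically: the number of length-n sequences with empirical counts (n1, ..., nk) is the multinomial coefficient n!/(n1!···nk!), whose logarithm is approximately n·H(p̂) with p̂i = ni/n, so the empirical distribution of maximal entropy is the one realized in the most ways. The counts below are illustrative:

```python
import math

# log of the multinomial coefficient n! / (n1! * ... * nk!),
# computed stably via log-gamma.
def log_multinomial(counts):
    n = sum(counts)
    return math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)

# Shannon entropy (natural log) of the empirical distribution n_i / n.
def entropy(counts):
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

counts = [500, 300, 200]
n = sum(counts)
# By Stirling's approximation, log_multinomial(counts) / n -> entropy(counts)
# as n grows, which is the combinatorial reading of maximum entropy.
```

For these counts the per-symbol log-count already agrees with the entropy to within about one percent, and the gap shrinks as n grows.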

by
Igor Douven, Timothy Williamson
- The British Journal for the Philosophy of Science


This paper is concerned with formal solutions to the lottery paradox on which high probability defeasibly warrants acceptance. It considers some recently proposed solutions of this type and presents an argument showing that these solutions are trivial in that they boil down to the claim that perfect probability is sufficient for rational acceptability. The argument is then generalized, showing that a broad class of similar solutions faces the same problem.

Over the past decades, there has been a steadily growing interest in utilizing probability theory to elucidate, or even analyze, concepts central to traditional epistemology. Special attention in this regard has been given to the notion of rational acceptability. Many have found the following thesis at least prima facie a promising starting point for a probabilistic elucidation of that notion: Sufficiency Thesis (ST) A proposition φ is rationally acceptable if Pr(φ) > t, where Pr is a probability distribution over propositions and t is a threshold value close to 1. Another plausible constraint is that when some propositions are rationally
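The lottery paradox that motivates ST can be stated in a few lines of code: in a fair n-ticket lottery with exactly one winner, each proposition "ticket i loses" clears any fixed threshold t < 1 for large enough n, yet the conjunction of all of them is certainly false. The specific n and t below are arbitrary illustrative choices:

```python
from fractions import Fraction

n = 1000
t = Fraction(99, 100)  # ST threshold, close to 1

# Each proposition "ticket i loses" has probability (n-1)/n.
pr_ticket_i_loses = Fraction(n - 1, n)      # 999/1000
each_acceptable = pr_ticket_i_loses > t     # every such proposition clears t

# But exactly one ticket wins, so "every ticket loses" has probability 0.
pr_all_lose = Fraction(0)
# ST thus licenses accepting each individual proposition while their
# conjunction is certainly false -- the tension the formal solutions
# discussed in the paper try to resolve.
```

Raising t only delays the problem: for any t < 1, choosing n > 1/(1 − t) reinstates it, which is why, as the paper argues, the surveyed solutions collapse into requiring perfect probability.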


We propose an extension of the principle of virtual work of mechanics to random dynamics of mechanical systems. The total virtual work of the interacting forces and inertial forces on every particle of the system is calculated by considering the motion of each particle. Then according to the principle of Lagrange-d’Alembert for dynamical equilibrium, the vanishing ensemble average of the virtual work gives rise to the thermodynamic equilibrium state with maximization of thermodynamic entropy. This approach establishes a close relationship between the maximum entropy approach for statistical mechanics and a fundamental principle of mechanics, and constitutes an attempt to give the maximum entropy approach, considered by many as only an inference principle based on the subjectivity of probability and entropy, the status of fundamental physics law.

...in a simple manner. However, in spite of its success and popularity, maxent has always been at the center of scientific and philosophical discussions and has raised many questions and controversies [4][5][6]. A central question is why a thermodynamic system chooses the equilibrium microstates such that the BGS entropy reaches its maximum. As a basic assumption of scientific theory, maxent is not directly ...