PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, 1994
We discuss two general issues concerning diverging sets of Bayesian (conditional) probabilities—divergence of “posteriors”—that can result with increasing evidence. Consider a set of probabilities typically, but not always, based on a set of Bayesian “priors.” First, incorporating sets of probabilities, rather than relying on a single probability, is a useful way to provide a rigorous mathematical framework for studying sensitivity and robustness in Classical and Bayesian inference. See: Berger (1984, 1985, 1990); Lavine (1991); Huber and Strassen (1973); Walley (1991); and Wasserman and Kadane (1990). Second, sets of probabilities arise in group decision problems. See: Levi (1982); and Seidenfeld, Kadane, and Schervish (1989). Third, sets of probabilities are one consequence of weakening traditional axioms for uncertainty. See: Good (1952); Smith (1961); Kyburg (1961); Levi (1974); Fishburn (1986); Seidenfeld, Schervish, and Kadane (1990); and Walley (1991).
Conditioning can make imprecise probabilities uniformly more imprecise. We call this effect “dilation”. In a previous paper (1993), Seidenfeld and Wasserman established some basic results about dilation. In this paper we further investigate dilation in several models. In particular, we consider conditions under which dilation persists under marginalization, and we quantify the degree of dilation. We also show that dilation manifests itself asymptotically in certain robust Bayesian models, and we characterize the rate at which dilation occurs.
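The effect named above can be seen in a minimal numerical sketch. The toy model below is my own, not one of the paper's examples: a credal set of joint distributions for a coin H and an event C whose dependence on H is left entirely open, constrained only so that the marginal P(H) equals 1/2 exactly. Conditioning on C dilates the precise value 1/2 toward the vacuous interval.

```python
import itertools

# Hypothetical toy model of dilation. Credal set: all joint distributions for a
# coin H and an event C with unknown dependence, constrained so P(H) = 1/2 exactly.
grid = [i / 20 for i in range(1, 20)]            # interior grid, keeps P(C) in (0, 1)

marginals, conditionals = [], []
for c, a in itertools.product(grid, grid):       # c = P(C), a = P(H | C)
    b = (0.5 - a * c) / (1 - c)                  # P(H | not C) forced by the constraint
    if 0 <= b <= 1:                              # keep only genuine distributions
        marginals.append(a * c + b * (1 - c))    # always 1/2 by construction
        conditionals.append(a)

print("P(H) over the set:", min(marginals), "to", max(marginals))
print("P(H | C) dilates to:", (min(conditionals), max(conditionals)))
```

On this 19-point grid the conditional probability already ranges over [0.05, 0.95]; refining the grid pushes it toward [0, 1], so every member of the set agrees on P(H) yet conditioning on C (or on its complement) is uniformly less informative.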
We contrast three decision rules that extend Expected Utility to contexts where a convex set of probabilities is used to depict uncertainty: Γ-Maximin, Maximality, and E-admissibility. The rules extend Expected Utility theory as they require that an option is inadmissible if there is another that carries greater expected utility for each probability in a (closed) convex set. If the convex set is a singleton, then each rule agrees with maximizing expected utility. We show that, even when the option set is convex, this pairwise comparison between acts may fail to identify those acts which are Bayes for some probability in a convex set that is not closed. This limitation affects two of the decision rules but not E-admissibility, which is not a pairwise decision rule. E-admissibility can be used to distinguish between two convex sets of probabilities that intersect all the same supporting hyperplanes.
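Two of these rules can be contrasted on a toy example with my own numbers (not the paper's): two states, three acts given as utility pairs, and a credal set with two extreme priors. Γ-Maximin picks the act whose worst-case expected utility is best; E-admissibility keeps the acts that maximize expected utility for some member of the set.

```python
# Toy contrast of Γ-Maximin and E-admissibility (illustrative numbers only).
# States: s1, s2. Acts given by utility pairs (u(s1), u(s2)).
acts = {"a": (1.0, 0.0), "b": (0.0, 1.0), "c": (0.45, 0.45)}
credal_set = [(0.2, 0.8), (0.8, 0.2)]   # two extreme priors (P(s1), P(s2))

def eu(act, p):
    return p[0] * act[0] + p[1] * act[1]

# Γ-Maximin: maximize the minimum expected utility over the credal set.
gamma_maximin = max(acts, key=lambda a: min(eu(acts[a], p) for p in credal_set))

# E-admissibility: keep acts that are Bayes (EU-maximal) for SOME member.
# In this example checking the two extreme points happens to suffice.
e_admissible = {a for a in acts
                if any(eu(acts[a], p) == max(eu(acts[x], p) for x in acts)
                       for p in credal_set)}

print(gamma_maximin)   # "c": safest worst case (0.45 vs 0.2 for a and b)
print(e_admissible)    # {"a", "b"}: "c" is Bayes for no probability in the set
```

The rules genuinely disagree here: the Γ-Maximin choice "c" is not E-admissible, since for every prior in the set (indeed, in its convex hull) either "a" or "b" has expected utility at least 0.5 > 0.45.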
We extend de Finetti's (1974) theory of coherence to apply also to unbounded random variables. We show that for random variables with mandated infinite prevision, such as for the St. Petersburg gamble, coherence precludes indifference between equivalent random quantities. That is, we demonstrate when the prevision of the difference between two such equivalent random variables must be positive. This result conflicts with the usual approach to theories of Subjective Expected Utility, where preference is defined over lotteries. In addition, we explore similar results for unbounded variables when their previsions, though finite, exceed their expected values, as is permitted within de Finetti's theory. In such cases, the decision maker's coherent preferences over random quantities are not even a function of probability and utility. One upshot of these findings is to explain further the differences between Savage's theory (1954), which requires bounded utility for non-simple…
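The St. Petersburg gamble mentioned above pays 2^k with probability 2^-k, so its expectation diverges and any coherent prevision for it must be infinite. The partial sums below illustrate the divergence (a standard textbook fact, not the paper's argument):

```python
# St. Petersburg gamble: win 2**k if the first head appears on flip k,
# which happens with probability 2**-k. Each term contributes exactly 1
# to the expectation, so truncated expectations grow without bound.
def truncated_expectation(n):
    """Expected payoff counting only the first n flips."""
    return sum((2 ** k) * (2 ** -k) for k in range(1, n + 1))

print([truncated_expectation(n) for n in (1, 10, 100)])  # [1.0, 10.0, 100.0]
```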
We give an extension of de Finetti's concept of coherence to unbounded (but real-valued) random variables that allows for gambling in the presence of infinite previsions. We present a finitely additive extension of the Daniell integral to unbounded random variables that we believe has advantages over Lebesgue-style integrals in the finitely additive setting. We also give a general version of the Fundamental Theorem of Prevision to deal with conditional previsions and unbounded random variables.
PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, 1990
The “Dutch Book” argument, tracing back to Ramsey (1926) and de Finetti (1974), offers prudential grounds for action in conformity with personal probability. Under several structural assumptions about combinations of stakes (that is, assumptions about the combination of wagers), your betting policy is consistent (coherent) only if your fair odds are probabilities. The central question posed here is the following one: Besides providing an operational test of coherent betting, does the “Book” argument also provide for adequate measurement (elicitation) of the agent’s degrees of belief? That is, are an agent’s fair odds also his/her personal probabilities for those events? We argue the answer is “No!” The problem is created by state-dependent utilities. The methods of elicitation proposed by Ramsey, by de Finetti, and by Savage (1954) are inadequate to the challenge of state-dependent values.
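A schematic calculation, with my own numbers rather than the paper's, shows how state-dependent utility drives a wedge between fair betting odds and probabilities. If the agent's marginal utility of money differs across states, the odds that make a small bet fair in utility terms are proportional to probability times state-wise marginal utility, not to probability alone:

```python
# Illustrative numbers (not from the paper). Two states s1, s2.
p = (0.5, 0.5)   # the agent's genuine degrees of belief
u = (1.0, 2.0)   # marginal utility of a unit payoff in each state (state-dependent)

# A small bet on s_i is fair in utility terms when odds track p_i * u_i.
weights = [pi * ui for pi, ui in zip(p, u)]
fair_odds = tuple(w / sum(weights) for w in weights)

print(fair_odds)   # (1/3, 2/3): perfectly coherent, yet not the probabilities (1/2, 1/2)
```

An elicitor who reads the agent's fair odds as probabilities would report (1/3, 2/3) for an agent whose beliefs are in fact (1/2, 1/2), which is the measurement failure the abstract describes.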
Let κ be an uncountable cardinal. Using the theory of conditional probability associated with de Finetti (1974) and Dubins (1975), subject to several structural assumptions for creating sufficiently many measurable sets, and assuming that κ is not a weakly inaccessible cardinal, we show that each probability that is not κ-additive has conditional probabilities that fail to be conglomerable in a partition of cardinality no greater than κ. This generalizes a result of Schervish, Seidenfeld, & Kadane (1984), which established that each finite but not countably additive probability has conditional probabilities that fail to be conglomerable in some countable partition.
It has long been known that the practice of testing all hypotheses at the same level (such as 0.05), regardless of the distribution of the data, is not consistent with Bayesian expected utility maximization. According to de Finetti's “Dutch Book” argument, procedures that are not consistent with expected utility maximization are incoherent and they lead to gambles that are sure to lose no matter what happens. In this paper, we use a method to measure the rate at which incoherent procedures are sure to lose, so that we can distinguish slightly incoherent procedures from grossly incoherent ones. We present an analysis of testing a simple hypothesis against a simple alternative as a case-study of how the method can work.
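Why a fixed 0.05 level conflicts with expected utility maximization can be seen in a standard textbook calculation (this is not the paper's loss-rate measure). For a simple-vs-simple test of H0: N(0,1) against H1: N(1,1) with equal priors and losses, the Bayes test rejects when the sample mean exceeds 0.5, so its implied type I error shrinks with the sample size rather than staying pinned at 0.05:

```python
from statistics import NormalDist

# Bayes test for H0: N(0,1) vs H1: N(1,1) from n iid draws, equal priors and
# losses: reject H0 when the sample mean exceeds 0.5. Its type I error varies
# with n, unlike a rule that tests at level 0.05 regardless of the data.
def bayes_type_I_error(n):
    # Under H0 the sample mean is N(0, 1/n).
    return 1 - NormalDist(0, (1 / n) ** 0.5).cdf(0.5)

for n in (1, 4, 16, 64):
    print(n, round(bayes_type_I_error(n), 4))
# The expected-utility-maximizing level falls toward 0 as n grows.
```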
Statistical decision theory, whether based on Bayesian principles or other concepts such as minimax or admissibility, relies on minimizing expected loss or maximizing expected utility. Loss and utility functions are generally treated as unit-less numerical measures of value for consequences. Here, we address the issue of the units in which loss and utility are settled and the implications that those units have on the rankings of potential decisions. When multiple currencies are available for paying the loss, one must take explicit account of which currency is used as well as the exchange rates between the various available currencies.
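A toy version of this currency dependence, with my own numbers rather than the paper's, makes the point concrete. Let the exchange rate (euros per dollar) be random, and compare an act that always loses one dollar with an act that always loses one euro: which act has the smaller expected loss depends on the currency in which the loss is settled.

```python
# Illustrative numbers only. Exchange rate R (euros per dollar) is random:
# 0.5 or 2.0, each with probability 1/2, so E[R] = E[1/R] = 1.25 > 1 (Jensen).
rates = [0.5, 2.0]

# Act A always loses $1; act B always loses 1 euro.
loss_in_dollars = {"A": [1.0, 1.0],         "B": [1 / r for r in rates]}
loss_in_euros   = {"A": [r for r in rates], "B": [1.0, 1.0]}

def expected(losses):
    return sum(losses) / len(losses)   # equiprobable rates

print({a: expected(l) for a, l in loss_in_dollars.items()})  # A: 1.0,  B: 1.25
print({a: expected(l) for a, l in loss_in_euros.items()})    # A: 1.25, B: 1.0
```

Settled in dollars, A has the smaller expected loss; settled in euros, B does. The ranking of the two decisions flips with the settlement currency, exactly the phenomenon the abstract flags.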