Probabilistic analysis of data

A widespread application of such analysis is weather forecasting: for more than a century, hundreds of weather stations around the world have recorded important parameters such as air temperature, wind speed, precipitation, and snowfall. Based on these data, scientists build models reflecting seasonal weather changes (depending on the time of year) as well as global trends - for example, temperature change over the last 50 years. Such models can be thought of as probability assessments, but they do not necessarily generate good ones. The very fact that there was correspondence about the gambles - and occasionally some disputes about them - indicated that people do not automatically assess probabilities in the same way, or accurately (e.g., in correspondence with relative frequencies, or by making good gambling choices).

The von Neumann and Morgenstern work, however, does involve some psychological assumptions that people can engage in 'good' probabilistic thinking. First, they must do so implicitly, so that their choices follow certain 'choice axioms' that allow the construction of an expected utility model - i.e., a model that represents choice as the maximization of implicitly expected utility; that in turn requires that probabilities at the very least follow the standard axioms of probability theory.
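The idea of representing choice as maximization of expected utility can be sketched in a few lines. The gambles, probabilities, and utility function below are purely illustrative assumptions, not taken from the text:

```python
# A minimal sketch of choice as expected-utility maximization.
# All gambles and the utility function here are hypothetical examples.

def expected_utility(gamble, utility):
    """Expected utility of a gamble given as (probability, outcome) pairs."""
    return sum(p * utility(x) for p, x in gamble)

# A concave (risk-averse) utility over money, chosen purely for illustration.
def utility(x):
    return x ** 0.5

sure_thing = [(1.0, 100)]            # $100 for certain
risky_bet = [(0.5, 0), (0.5, 250)]   # 50/50 chance of $0 or $250

# An expected-utility maximizer picks whichever gamble scores higher;
# with this concave utility, the sure thing wins despite its lower mean payoff.
choice = max([sure_thing, risky_bet], key=lambda g: expected_utility(g, utility))
```

Note that the probabilities inside each gamble must already obey the standard axioms (non-negative, summing to one over the outcomes) for the expected-utility representation to make sense.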

It also implies considering conditional probabilities in a rational manner, which is done only when implicit or explicit conditional probabilities are consistent with Bayes' theorem. Thus, the von Neumann and Morgenstern work required that people be 'Bayesian' in a consistency sense, although that term is sometimes used to imply that probabilities should at base be interpreted as degrees of belief.
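Consistency with Bayes' theorem just means that a posterior belief P(H|E) must equal P(E|H)P(H)/P(E). A small numerical sketch, with all probabilities invented for illustration:

```python
# Sketch of a Bayes'-theorem-consistent update. The numbers are illustrative.

p_h = 0.01              # prior probability of a hypothesis H
p_e_given_h = 0.9       # likelihood of observing evidence E if H is true
p_e_given_not_h = 0.05  # likelihood of E if H is false

# Total probability of the evidence (law of total probability).
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior via Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).
p_h_given_e = p_e_given_h * p_h / p_e
```

A judge whose implicit conditional probabilities violated this identity could be exploited, which is exactly the kind of inconsistency the Dutch Book argument below trades on.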

Another way in which probability assessment must be 'good' is that there should be at least a reasonable approximation between probabilities and long-term relative frequencies; in fact, under particular circumstances (of interchangeability and indefinitely repeated observations), the probabilities of someone whose beliefs are constrained by Bayes' theorem must approximate relative frequencies.

Consider an analogy with swimming. People generally swim well, but occasionally we drown, because there is a particular systematic bias in attempting to swim that makes it difficult: we want to hold our heads above water. When, however, we raise our heads to do so, we tend to assume a vertical position in the water, which is one of the few ways of drowning (aside from freezing, exhaustion, or being swept away in rough waters). Just as people occasionally drown by trying to hold their heads above water, people systematically deviate from the rules of probabilistic thinking. Again, however, the emphasis is on 'systematic.'

For example, there is now evidence that people's probabilistic judgments are 'subadditive': when a general class is broken into components, the judgmentally estimated probabilities assigned to the disjoint components that comprise the class sum to a larger number than the probability assigned to the class itself. That is particularly true of judgments from memory, where, for example, people may recall the frequency with which they were angry at a close friend or relative in the last month and the frequency with which they were angry at a total stranger, and the sum of these estimates is greater than an independent estimate of being angry, period (even though it is possible to be angry at someone who is neither a close friend or relative nor a total stranger). A clever opponent will then bet against the occurrence of each component but on the occurrence of the basic event, thereby creating a 'Dutch Book.'
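The Dutch Book construction can be made concrete. In the sketch below the component probabilities are invented numbers whose only relevant property is that they sum to more than the probability assigned to the whole class; the opponent then locks in a riskless profit exactly as the text describes:

```python
# Sketch of the Dutch Book against subadditive judgments. All probabilities
# are illustrative; only the subadditivity (components summing past the class)
# is essential.

components = {
    "angry at a close friend or relative": 0.5,
    "angry at a total stranger": 0.4,
    "angry at someone else": 0.3,
}
whole_class = 0.9   # judged probability of being angry at anyone at all

assert sum(components.values()) > whole_class   # the subadditive judgment

def opponent_profit(occurred):
    """Opponent's total profit given which component occurred (None = no anger)."""
    profit = 0.0
    for name, price in components.items():
        # Bet against each component at the judge's own price:
        # collect the price, pay $1 if that component occurs.
        profit += price - (1.0 if name == occurred else 0.0)
    # Bet on the basic event at the judge's price: pay the price,
    # collect $1 if any component occurred.
    profit += (1.0 if occurred is not None else 0.0) - whole_class
    return profit

# Whatever happens, the opponent's profit is the same: the pricing gap.
profits = [opponent_profit(c) for c in components] + [opponent_profit(None)]
```

Because the components partition the class, the $1 payoffs cancel in every state of the world, leaving the opponent a guaranteed gain equal to the excess of the component prices over the class price.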

