Short note on statistics

The calculation involving the ratio of the size of the difference (numerator) to the amount of error (denominator), illustrated in the table, produces a number termed a statistic. t, F, and chi-square (χ²) are statistics. This value becomes larger as the ratio of the difference (or relationship) to error increases, as the formulae indicate. A larger statistic is better.
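To make the ratio idea concrete, here is a minimal sketch in Python of a t-like statistic; the two groups of scores and the way the error is pooled are my own illustrative assumptions, not data or formulae taken from any particular study.

import math

# Two made-up samples (illustrative only): scores from two groups.
group_a = [5.1, 4.8, 5.5, 5.0, 5.3]
group_b = [4.2, 4.5, 4.0, 4.4, 4.1]

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Numerator: the size of the difference between the group means.
difference = mean(group_a) - mean(group_b)

# Denominator: the amount of error (standard error of the difference).
error = math.sqrt(sample_var(group_a) / len(group_a) +
                  sample_var(group_b) / len(group_b))

# The statistic is the ratio of difference to error; larger is better.
t = difference / error
print(f"t = {t:.2f}")

Widening the gap between the two means, or shrinking the spread within each group, both make the statistic larger.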

   The statistic is compared to a table of critical values to determine whether it is large enough to reject the null hypothesis. The reason for this is that larger samples give more confidence that the obtained sample is not unrepresentative of the population, that is, not a "fluke". Small samples are more likely to differ from the population simply by chance. Therefore, to be confident that the ratio is sufficiently large to reject the null hypothesis when the sample is small, the statistic must be larger.
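As a rough illustration of why small samples demand a larger statistic, the sketch below looks up two-tailed critical t values at alpha = 0.05 for a few sample sizes; the sample sizes are arbitrary and it assumes scipy is installed.

from scipy import stats

# Critical t values (two-tailed, alpha = 0.05) for different sample sizes.
# Smaller samples (fewer degrees of freedom) require a larger statistic
# before the null hypothesis can be rejected.
for n in (5, 10, 30, 100):
    df = n - 1
    critical = stats.t.ppf(1 - 0.05 / 2, df)
    print(f"n = {n:3d}  df = {df:3d}  critical t = {critical:.2f}")

The critical value falls from roughly 2.78 at n = 5 toward roughly 1.98 at n = 100, so the same obtained ratio clears the bar more easily when the sample is larger.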

   In a statistics course, we would learn the several formulae used to calculate the numbers we need in order to determine if we can reject the null hypothesis.

  A few of these formulae are presented here to point out their similarities. Conceptual, not actual, formulae are used. The different types of formulae are shown in the table, and a rough restatement of their shared structure is sketched below.
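The sketch that follows is my own restatement in Python of the common ratio structure behind the three statistics named above; the function and argument names are chosen for readability and are not standard terminology, and these are conceptual forms, not the full computational formulae.

def t_conceptual(difference_between_means, standard_error_of_difference):
    # t: the difference between group means over the error of that difference.
    return difference_between_means / standard_error_of_difference

def f_conceptual(variance_between_groups, variance_within_groups):
    # F: variability between groups over variability within groups (error).
    return variance_between_groups / variance_within_groups

def chi_square_conceptual(observed, expected):
    # chi-square: squared gaps between observed and expected counts,
    # scaled by the expected counts (the error term), then summed.
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

In each case the numerator carries the size of the difference or relationship and the denominator carries the error, which is the similarity the section is pointing out.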

