
This central chapter addresses a fundamental concept, namely the variance of a random variable. It gives the laws governing the variance of a sum of 2, or (especially) \(n\), random variables - and, even more importantly, the laws governing the variance of a difference of two random variables. So, the specific objectives are to truly understand the concept of the (expectation and) variance of a random variable, and why it is that, when dealing with the sum of two or more independent random variables, it is not their standard deviations that sum (add), but rather their variances. So, if there is one 'master' formula to pay attention to and to 'own', it is the one for the variance of a linear combination of random variables. The latter is central, not just to simple contrasts involving just 2 sample means or proportions, but also to the much wider world of regression, since the variance (sampling variability) of any regression slope can be viewed as the variance of a linear combination of random 'errors', or random deviations, or random variables.
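As a reference point, here is a sketch of the standard form of that 'master' formula, written with generic symbols \(X_1, \dots, X_n\) for the random variables and fixed constants \(a_1, \dots, a_n\) (these symbols are illustrative, not this chapter's own notation):

\[
\operatorname{Var}\!\left(\sum_{i=1}^{n} a_i X_i\right)
 \;=\; \sum_{i=1}^{n} a_i^{2}\,\operatorname{Var}(X_i)
 \;+\; 2\sum_{i<j} a_i a_j\,\operatorname{Cov}(X_i, X_j).
\]

When the random variables are independent, every covariance term is zero, so for two independent random variables \(X\) and \(Y\),

\[
\operatorname{Var}(X + Y) \;=\; \operatorname{Var}(X) + \operatorname{Var}(Y),
\qquad
\operatorname{Var}(X - Y) \;=\; \operatorname{Var}(X) + \operatorname{Var}(Y).
\]

In particular, even for a difference the variances still add, so the standard deviation of a difference is \(\sqrt{\operatorname{Var}(X) + \operatorname{Var}(Y)}\), not the difference of the standard deviations.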