Probability & Statistics

Our probability and statistics course provides students with a rigorous foundation in statistical theory and methods, building on techniques learned in calculus and linear algebra. Whether pursuing STEM subjects, economics, or other disciplines, this course equips students with the theoretical knowledge to analyze and interpret data effectively.

This comprehensive course covers fundamental topics such as elementary probability, combinatorics, random variables, expectation algebra, discrete and continuous probability distributions, and joint distributions. Real-world examples are integrated throughout, helping students connect theory to practice.

After gaining a solid understanding of elementary probability and random variables, students progress to statistical inference. This part of the course covers parametric inference, the central limit theorem and its applications, confidence intervals, hypothesis testing, analysis of variance, and regression, and introduces a selection of nonparametric methods, with real-world examples illustrating how each technique is applied in practice.

This course provides ideal preparation for exploring advanced topics such as Bayesian statistics, time series analysis, or machine learning.

Overview

The course begins by building a strong foundation in elementary probability and random variables, key concepts for understanding uncertainty and data analysis. Topics include Bayes' theorem, combinatorics, continuous random variables, and methods for calculating distributions of functions of random variables. By mastering these essential ideas, students are well-prepared to confidently engage with more advanced statistical inference concepts and tackle complex real-world problems.
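To give a flavor of the level involved: Bayes' theorem, one of the first results covered, reverses conditional probabilities, and combined with the law of total probability over a partition \(B_1, \ldots, B_n\) of the sample space it takes the form

\[
P(B_k \mid A) \;=\; \frac{P(A \mid B_k)\,P(B_k)}{\sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)}.
\]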

After studying well-known discrete and continuous random variables, students will delve deeper into the concept of moments. They will learn to calculate expectations, variances, and other fundamental properties of random variables using moments, gaining insight into the behavior and characteristics of distributions. The course also introduces moment-generating functions, exploring their properties and applications. In particular, students will learn how these functions are used to derive the distributions of combinations of random variables, providing a powerful tool for advanced statistical analysis.
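As a brief sketch of this machinery: the moment-generating function of a random variable \(X\) is \(M_X(t) = E[e^{tX}]\) (where the expectation exists), and its derivatives at zero recover the moments,

\[
E[X^n] \;=\; M_X^{(n)}(0).
\]

For independent random variables \(X\) and \(Y\), \(M_{X+Y}(t) = M_X(t)\,M_Y(t)\), which, together with the uniqueness property of MGFs, is how the distributions of sums of random variables are identified.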
Students also explore methods for combining random variables, including joint, marginal, and conditional distributions and how expectation, variance, and independence behave in the multivariate setting.
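For instance, for two discrete random variables \(X\) and \(Y\) with joint probability mass function \(p_{X,Y}\), the marginal and conditional distributions are obtained via

\[
p_X(x) \;=\; \sum_{y} p_{X,Y}(x, y), \qquad p_{Y \mid X}(y \mid x) \;=\; \frac{p_{X,Y}(x, y)}{p_X(x)},
\]

and \(X\) and \(Y\) are independent precisely when \(p_{X,Y}(x, y) = p_X(x)\,p_Y(y)\) for all \(x\) and \(y\).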

The second half of the course focuses on statistical inference, covering key topics such as point estimation, the central limit theorem, maximum likelihood estimation, confidence intervals, and hypothesis testing for one-sample and two-sample procedures. Additional topics include one-factor analysis of variance (ANOVA), correlation, regression, and an introduction to chi-square goodness-of-fit tests.
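The centerpiece of this half is the central limit theorem: for independent, identically distributed observations \(X_1, \ldots, X_n\) with mean \(\mu\) and finite variance \(\sigma^2\), the standardized sample mean is approximately standard normal for large \(n\),

\[
\frac{\bar{X}_n - \mu}{\sigma / \sqrt{n}} \;\xrightarrow{\;d\;}\; N(0, 1),
\]

a result that underpins the confidence intervals and hypothesis tests developed in the remaining units.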

Outcomes

Upon successful completion of this course, students will have mastered the following:

Probability & Random Variables

Expectation

Discrete Probability Distributions

Continuous Probability Distributions

Combining Random Variables

Parametric Inference

Confidence Intervals

Hypothesis Testing

Regression

Nonparametric Inference

Content

1. Probability & Random Variables (20 topics)
1.1. Probability
1.1.1. The Law of Total Probability
1.1.2. Extending the Law of Total Probability
1.1.3. Bayes' Theorem
1.1.4. Extending Bayes' Theorem
1.2. Combinatorics
1.2.1. Permutations With Repetition
1.2.2. K Permutations of N With Repetition
1.2.3. Combinations With Repetition
1.3. Random Variables
1.3.1. Probability Density Functions of Continuous Random Variables
1.3.2. Calculating Probabilities With Continuous Random Variables
1.3.3. Continuous Random Variables Over Infinite Domains
1.3.4. Cumulative Distribution Functions for Continuous Random Variables
1.3.5. Median, Quartiles and Percentiles of Continuous Random Variables
1.3.6. Finding the Mode of a Continuous Random Variable
1.3.7. Approximating Discrete Random Variables as Continuous
1.3.8. Simulating Random Observations
1.4. Functions of Random Variables
1.4.1. One-to-One Transformations of Discrete Random Variables
1.4.2. Many-to-One Transformations of Discrete Random Variables
1.4.3. The Distribution Function Method
1.4.4. The Change-of-Variables Method for Continuous Random Variables
1.4.5. The Distribution Function Method With Many-to-One Transformations

2. Expectation (16 topics)
2.5. Expectation of Random Variables
2.5.1. Expected Values of Discrete Random Variables
2.5.2. Properties of Expectation for Discrete Random Variables
2.5.3. Variance of Discrete Random Variables
2.5.4. Moments of Discrete Random Variables
2.5.5. Properties of Variance for Discrete Random Variables
2.5.6. Moments of Continuous Random Variables
2.5.7. Expected Values of Continuous Random Variables
2.5.8. Variance of Continuous Random Variables
2.5.9. The Rule of the Lazy Statistician
2.6. Moment-Generating Functions
2.6.1. Moment-Generating Functions
2.6.2. Calculating Moments Using Moment-Generating Functions
2.6.3. Calculating Variance and Standard Deviation Using Moment-Generating Functions
2.6.4. Constructing Moment-Generating Functions for Discrete Probability Distributions
2.6.5. Constructing Moment-Generating Functions for Continuous Probability Distributions
2.6.6. Properties of Moment-Generating Functions
2.6.7. The Uniqueness Property of MGFs

3. Discrete Probability Distributions (20 topics)
3.7. The Discrete Uniform Distribution
3.7.1. The Discrete Uniform Distribution
3.7.2. Mean and Variance of the Discrete Uniform Distribution
3.7.3. Modeling With Discrete Uniform Distributions
3.8. The Bernoulli Distribution
3.8.1. The Bernoulli Distribution
3.8.2. Mean and Variance of the Bernoulli Distribution
3.9. The Binomial Distribution
3.9.1. The Binomial Distribution
3.9.2. Modeling With the Binomial Distribution
3.9.3. Mean and Variance of the Binomial Distribution
3.9.4. The CDF of the Binomial Distribution
3.10. The Poisson Distribution
3.10.1. The Poisson Distribution
3.10.2. Modeling With the Poisson Distribution
3.10.3. Mean and Variance of the Poisson Distribution
3.10.4. The CDF of the Poisson Distribution
3.10.5. The Poisson Approximation of the Binomial Distribution
3.11. The Geometric Distribution
3.11.1. The Geometric Distribution
3.11.2. Modeling With the Geometric Distribution
3.11.3. Mean and Variance of the Geometric Distribution
3.12. The Negative Binomial Distribution
3.12.1. The Negative Binomial Distribution
3.12.2. Modeling With the Negative Binomial Distribution
3.12.3. Mean and Variance of the Negative Binomial Distribution

4. Continuous Probability Distributions (21 topics)
4.13. The Normal Distribution
4.13.1. The Z-Score
4.13.2. The Standard Normal Distribution
4.13.3. Symmetry Properties of the Standard Normal Distribution
4.13.4. The Normal Distribution
4.13.5. Mean and Variance of the Normal Distribution
4.13.6. Percentage Points of the Standard Normal Distribution
4.13.7. Modeling With the Normal Distribution
4.13.8. The Empirical Rule for the Normal Distribution
4.13.9. Normal Approximations of Binomial Distributions
4.13.10. The Normal Approximation of the Poisson Distribution
4.14. The Continuous Uniform Distribution
4.14.1. The Continuous Uniform Distribution
4.14.2. Mean and Variance of the Continuous Uniform Distribution
4.14.3. Modeling With Continuous Uniform Distributions
4.15. The Exponential Distribution
4.15.1. The Exponential Distribution
4.15.2. Modeling With the Exponential Distribution
4.15.3. Mean and Variance of the Exponential Distribution
4.16. Other Continuous Distributions
4.16.1. The Gamma Function
4.16.2. The Gamma Distribution
4.16.3. The Chi-Square Distribution
4.16.4. Student's T-Distribution
4.16.5. The F-Distribution

5. Combining Random Variables (30 topics)
5.17. Distributions of Two Discrete Random Variables
5.17.1. Joint Distributions for Discrete Random Variables
5.17.2. The Joint CDF of Two Discrete Random Variables
5.17.3. Marginal Distributions for Discrete Random Variables
5.17.4. Independence of Discrete Random Variables
5.17.5. Conditional Distributions for Discrete Random Variables
5.17.6. The Trinomial Distribution
5.17.7. The Multinomial Distribution
5.18. Distributions of Two Continuous Random Variables
5.18.1. Joint Distributions for Continuous Random Variables
5.18.2. Marginal Distributions for Continuous Random Variables
5.18.3. Independence of Continuous Random Variables
5.18.4. Conditional Distributions for Continuous Random Variables
5.18.5. The Joint CDF of Two Continuous Random Variables
5.18.6. Properties of the Joint CDF of Two Continuous Random Variables
5.18.7. The Bivariate Normal Distribution
5.19. Linear Combinations of Random Variables
5.19.1. Linear Combinations of Binomial Random Variables
5.19.2. Linear Combinations of Poisson Random Variables
5.19.3. Combining Two Normally Distributed Random Variables
5.19.4. Combining Multiple Normally Distributed Random Variables
5.19.5. I.I.D. Normal Random Variables
5.20. Expectation for Multivariate Distributions
5.20.1. Expected Values of Sums and Products of Random Variables
5.20.2. Variance of Sums of Independent Random Variables
5.20.3. Computing Expected Values From Joint Distributions
5.20.4. Conditional Expectation for Discrete Random Variables
5.20.5. Conditional Variance for Discrete Random Variables
5.20.6. The Rule of the Lazy Statistician for Two Random Variables
5.21. Covariance of Random Variables
5.21.1. The Covariance of Two Random Variables
5.21.2. Variance of Sums of Random Variables
5.21.3. The Covariance Matrix
5.21.4. The Correlation Coefficient for Two Random Variables
5.21.5. The Sample Covariance Matrix

6. Parametric Inference (21 topics)
6.22. Mean, Variance, and Proportion
6.22.1. The Sample Mean
6.22.2. Sampling Distributions
6.22.3. The Sample Variance
6.22.4. Pooled Variance
6.22.5. Variance of Sample Means
6.22.6. Sample Means From Normal Populations
6.22.7. Sampling Proportions From Finite Populations
6.22.8. The Method of Moments
6.22.9. The Method of Moments: Two-Parameter Distributions
6.23. The Central Limit Theorem
6.23.1. The Central Limit Theorem
6.23.2. Applications of the Central Limit Theorem
6.23.3. Finite Population Corrections for Sample Means
6.23.4. Point Estimates of Population Proportions
6.23.5. Finite Population Corrections for Sample Proportions
6.24. Maximum Likelihood
6.24.1. Product Notation
6.24.2. Logarithmic Differentiation
6.24.3. Likelihood Functions for Discrete Probability Distributions
6.24.4. Log-Likelihood Functions for Discrete Probability Distributions
6.24.5. Likelihood Functions for Continuous Probability Distributions
6.24.6. Log-Likelihood Functions for Continuous Probability Distributions
6.24.7. Maximum Likelihood Estimation

7. Confidence Intervals (15 topics)
7.25. One-Sample Procedures
7.25.1. Confidence Intervals for One Mean: Known Population Variance
7.25.2. Confidence Intervals for One Mean: Unknown Population Variance
7.25.3. Confidence Intervals for One Mean: Finite Population Correction
7.25.4. Confidence Intervals for One Proportion
7.25.5. Confidence Intervals for One Proportion: Finite Population Corrections
7.25.6. Confidence Intervals for One Variance
7.26. Two-Sample Procedures
7.26.1. Confidence Intervals for Two Means: Known and Unequal Population Variances
7.26.2. Confidence Intervals for Two Means: Equal and Unknown Population Variances
7.26.3. Confidence Intervals for Two Means: Unequal and Unknown Population Variances
7.26.4. Confidence Intervals for Two Proportions
7.26.5. Confidence Intervals for Paired Samples: Known Variances
7.26.6. Confidence Intervals for Paired Samples: Unknown Variances
7.27. Sample Size
7.27.1. Estimating Sample Sizes for Means
7.27.2. Estimating Sample Sizes for Proportions
7.27.3. Estimating Sample Sizes for Proportions: Finite Population Correction

8. Hypothesis Testing (19 topics)
8.28. One-Sample Procedures
8.28.1. Introduction to Hypothesis Testing
8.28.2. Hypothesis Tests for the Rate of a Poisson Distribution
8.28.3. Critical Regions for Left-Tailed Hypothesis Tests
8.28.4. Critical Regions for Right-Tailed Hypothesis Tests
8.28.5. Two-Tailed Hypothesis Tests
8.28.6. Type I and Type II Errors
8.28.7. Hypothesis Tests for One Mean: Known Population Variance
8.28.8. Hypothesis Tests for One Mean: Unknown Population Variance
8.28.9. Hypothesis Tests for One Variance
8.29. Two-Sample Procedures
8.29.1. Hypothesis Tests for Two Means: Known Population Variances
8.29.2. Hypothesis Tests for Two Means: Equal But Unknown Population Variances
8.29.3. Hypothesis Tests for Two Means: Unequal and Unknown Population Variances
8.29.4. Hypothesis Tests for Two Proportions
8.29.5. Hypothesis Tests for Two Means: Paired-Sample Z-Test
8.29.6. Hypothesis Tests for Two Means: Paired-Sample T-Test
8.29.7. Hypothesis Tests for Two Variances
8.30. Analysis of Variance
8.30.1. One-Factor Within-Groups and Between-Groups Variation
8.30.2. The Relationship Between SSW, SSB, and SST
8.30.3. One-Factor Analysis of Variance

9. Regression (11 topics)
9.31. Correlation and Regression
9.31.1. The Linear Correlation Coefficient
9.31.2. Linear Regression
9.31.3. Residuals and Residual Plots
9.31.4. Spearman's Rank Correlation Coefficient
9.31.5. Confidence Intervals for Linear Regression Slope Parameters
9.31.6. Confidence Intervals for Linear Regression Intercept Parameters
9.32. Linear and Nonlinear Regression With Matrices
9.32.1. The Least-Squares Solution of a Linear System (Without Collinearity)
9.32.2. The Least-Squares Solution of a Linear System (With Collinearity)
9.32.3. Linear Regression With Matrices
9.32.4. Polynomial Regression With Matrices
9.32.5. Multiple Linear Regression With Matrices

10. Nonparametric Inference (7 topics)
10.33. Goodness-of-Fit and Order Statistics
10.33.1. Introduction to Chi-Square Goodness-of-Fit
10.33.2. Testing Binomial Models Using Chi-Square Goodness-of-Fit
10.33.3. Testing Poisson Models Using Chi-Square Goodness-of-Fit
10.33.4. Testing Continuous Uniform Models Using Chi-Square Goodness-of-Fit
10.33.5. Testing Normal Models Using Chi-Square Goodness-of-Fit
10.33.6. Chi-Square Tests of Independence and Homogeneity
10.33.7. Introduction to Order Statistics