This is a bonus post for my main post on the binomial distribution. Here I want to give a formal proof of the mean and variance formulas for the binomial distribution that I previously showed you.
In today’s post, I’m going to give you intuition about the Bernoulli distribution. This is one of the simplest and yet most famous discrete probability distributions. Not only that, it is also the basis of many other, more complex distributions.
This post is part of my series on discrete probability distributions.
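As a quick preview of the idea, here’s a minimal Python sketch of the Bernoulli distribution; the function name and the example parameter are my own illustration, not taken from the post itself:

```python
# A Bernoulli random variable models a single trial with two outcomes:
# "success" (1) with probability p and "failure" (0) with probability 1 - p.
def bernoulli_pmf(k, p):
    """Probability mass of outcome k (0 or 1) for success probability p."""
    if k == 1:
        return p
    if k == 0:
        return 1 - p
    raise ValueError("A Bernoulli variable only takes the values 0 and 1.")

p = 0.3  # an arbitrary example parameter
# The two masses together cover the whole probability mass of 1.
print(bernoulli_pmf(1, p) + bernoulli_pmf(0, p))  # ≈ 1 (up to rounding)
```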
In my previous post I introduced you to probability distributions.
In short, a probability distribution simply takes the whole probability mass of a random variable and distributes it across the variable’s possible outcomes. Since every random variable has a total probability mass equal to 1, this just means splitting the number 1 into parts and assigning each part to some element of the variable’s sample space (informally speaking).
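To make the splitting idea concrete, here’s a small Python sketch; the fair six-sided die is just an example of mine, not something from the post:

```python
# The sample space of a fair six-sided die.
sample_space = [1, 2, 3, 4, 5, 6]

# Distribute the total probability mass of 1 equally across the outcomes.
distribution = {outcome: 1 / 6 for outcome in sample_space}

# The individual parts always add back up to the total mass of 1.
total_mass = sum(distribution.values())
print(total_mass)  # ≈ 1 (up to floating-point rounding)
```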
In this post I want to dig a little deeper into probability distributions and explore some of their properties. Namely, I want to talk about the measures of central tendency (the mean) and dispersion (the variance) of a probability distribution.
A few posts ago I introduced you to the “three M’s” of statistics — the concepts of mean, mode, and median. Today I want to talk to you about a related concept called variance.
While the three M’s measure the central tendency of a collection of numbers, the variance measures their dispersion. That is, it measures how different the numbers are from each other.
Measuring dispersion is another fundamental topic in statistics and probability theory. On the one hand, it tells you how much you can trust the central tendency measures as good representatives of the collection. High variance usually means a lot of the numbers in the collection will be far away from those measures.
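Here’s a small Python sketch of that point; the two example collections are my own, chosen so that they share the same mean but differ in spread:

```python
# Two collections with the same mean but very different dispersion.
low_spread = [9, 10, 11]
high_spread = [0, 10, 20]

def mean(values):
    return sum(values) / len(values)

def variance(values):
    """Average squared distance of each number from the mean."""
    m = mean(values)
    return sum((x - m) ** 2 for x in values) / len(values)

# Both collections have mean 10, but the second one's variance is far
# larger, so 10 is a much less trustworthy summary of high_spread.
print(mean(low_spread), variance(low_spread))
print(mean(high_spread), variance(high_spread))
```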