Shaping and Illustrating after my morning coffee...

To render the equations below, insert this MathJax script: https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.7/MathJax.js?config=TeX-AMS-MML_HTMLorMML

Bayes' Theorem: A Gary Guide

Bayes' Theorem is a fundamental concept in probability and statistics. It allows us to update our initial beliefs about an event based on new evidence. Here's a step-by-step breakdown of the theorem, its components, and a worked example to solidify understanding.


Key Concepts

  1. Independent Events: If events A and B are independent (they don't influence each other), then the probability that both occur is simply the product of their individual probabilities.

    \[ P(A \text{ and } B) = P(A) \times P(B) \]

  2. Dependent Events: When events are related or influence each other, the probability that both events occur depends on the probability of one event given that the other has occurred.

    \[ P(A \text{ and } B) = P(A) \times P(B|A) \]

    Here, \( P(B|A) \) is the probability that B occurs given that A has already happened.

  3. Bayes' Theorem: This theorem lets us reverse conditional probabilities. It gives us \( P(A|B) \), the probability that A occurs given that B is true, in terms of probabilities we already know (a short Python sketch follows this list).

    Formula

    \[ P(A|B) = \frac{P(A) \times P(B|A)}{P(B)} \]

    Where:

    • \( P(A|B) \): Posterior probability, the probability of A given B.
    • \( P(A) \): Prior probability, the initial probability of A before seeing B.
    • \( P(B|A) \): Likelihood, the probability of B given A.
    • \( P(B) \): Normalizing constant (the evidence), the total probability of B; dividing by it ensures the posterior is a valid probability between 0 and 1.
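To make the formula concrete, here is a minimal Python sketch of the update. The function name `bayes_posterior` and the example numbers are illustrative assumptions, not values from this guide:

```python
def bayes_posterior(prior: float, likelihood: float, evidence: float) -> float:
    """Bayes' theorem: P(A|B) = P(A) * P(B|A) / P(B)."""
    return prior * likelihood / evidence


# Independent events: P(A and B) = P(A) * P(B)
p_a, p_b = 0.5, 0.2
print(p_a * p_b)  # 0.1

# Bayes' theorem with made-up numbers: P(A) = 0.3, P(B|A) = 0.8, P(B) = 0.4
print(bayes_posterior(prior=0.3, likelihood=0.8, evidence=0.4))  # 0.6
```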

Understanding Bayes’ Theorem Through an Example

Example 1: The Cookie Jar Problem

Suppose you have two jars of cookies:

  • Jar 1 contains 10 chocolate chip cookies and 30 sugar cookies. So, 3/4 of the cookies in Jar 1 are sugar cookies.
  • Jar 2 contains 20 chocolate chip cookies and 20 sugar cookies. In Jar 2, only 1/2 are sugar cookies.

Now, a friend picks one of the jars at random, draws a cookie, and it turns out to be a sugar cookie. You want to know which jar the cookie most likely came from.

Problem Setup

  1. Hypotheses:

    • Hypothesis 1 (H1): The friend picked the cookie from Jar 1.
    • Hypothesis 2 (H2): The friend picked the cookie from Jar 2.
  2. Priors:

    • Since the friend could pick either jar randomly, both hypotheses are equally likely.
    • \( P(H1) = P(H2) = 0.5 \)
  3. Likelihoods:

    • \( P(\text{Sugar} | H1) \): Probability of picking a sugar cookie from Jar 1 = 0.75 (3/4).
    • \( P(\text{Sugar} | H2) \): Probability of picking a sugar cookie from Jar 2 = 0.5 (1/2).
  4. Normalizing Constant \( P(E) \): This is the probability of picking a sugar cookie overall, regardless of which jar it came from.

    \[ P(E) = (P(H1) \times P(\text{Sugar}|H1)) + (P(H2) \times P(\text{Sugar}|H2)) = (0.5 \times 0.75) + (0.5 \times 0.5) = 0.625 \]

Solution Using Bayes’ Theorem

Now, we can find the posterior probability that the friend picked the cookie from Jar 1 given they picked a sugar cookie:

\[ P(H1|\text{Sugar}) = \frac{P(H1) \times P(\text{Sugar}|H1)}{P(E)} = \frac{0.5 \times 0.75}{0.625} = 0.6 \]

So, there’s a 60% chance that the cookie came from Jar 1.
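The same arithmetic as a short, self-contained Python sketch (the variable names are mine; the numbers are the cookie counts from the example above):

```python
# Cookie jar problem from above, worked numerically.
p_h1, p_h2 = 0.5, 0.5        # priors: either jar is equally likely
p_sugar_given_h1 = 30 / 40   # P(Sugar | H1): 30 of 40 cookies in Jar 1 are sugar
p_sugar_given_h2 = 20 / 40   # P(Sugar | H2): 20 of 40 cookies in Jar 2 are sugar

# Normalizing constant P(E): overall probability of drawing a sugar cookie.
p_sugar = p_h1 * p_sugar_given_h1 + p_h2 * p_sugar_given_h2  # 0.625

# Posterior: probability the sugar cookie came from Jar 1.
p_h1_given_sugar = p_h1 * p_sugar_given_h1 / p_sugar
print(f"P(H1 | Sugar) = {p_h1_given_sugar:.2f}")  # P(H1 | Sugar) = 0.60
```

Because the priors are equal, they cancel out, and the 60/40 split comes entirely from the likelihood ratio 0.75 : 0.5 = 3 : 2.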


Interactive Tools

1. Animated Explanation with Manim

View a step-by-step animation of Bayes' Theorem: Watch Animation

2. Interactive Graph with Desmos

Explore probabilities visually: Open Interactive Graph
