Probability

The mathematical study of randomness and uncertainty

Definition

Probability is a branch of mathematics that deals with numerical descriptions of how likely an event is to occur. The probability of an event is a number between 0 and 1, where 0 indicates impossibility and 1 indicates certainty.

In mathematical terms, probability theory provides a framework for modeling random phenomena and quantifying uncertainty. It forms the foundation of statistics, risk assessment, and many fields of science and engineering.

Basic Concepts

Sample Space

The sample space (denoted as S or Ω) is the set of all possible outcomes of a random experiment.

Event

An event is a subset of the sample space. It represents one or more outcomes that we are interested in.

Probability of an Event

For a finite sample space whose outcomes are all equally likely, the probability of an event E is defined as:

P(E) = Number of favorable outcomes / Total number of possible outcomes

In set notation:

P(E) = |E| / |S|
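As a minimal sketch in Python (the die and the even-roll event are illustrative choices, not from the text above):

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die
S = {1, 2, 3, 4, 5, 6}

# Event: rolling an even number
E = {x for x in S if x % 2 == 0}

# P(E) = |E| / |S| for equally likely outcomes
p = Fraction(len(E), len(S))
print(p)  # 1/2
```

Using Fraction keeps the probability exact rather than a rounded float.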

Fundamental Formulas

Probability Axioms (Kolmogorov Axioms)

  1. Non-negativity: For any event E, P(E) ≥ 0
  2. Normalization: P(S) = 1 (probability of the entire sample space is 1)
  3. Countable additivity: For any sequence of pairwise mutually exclusive (disjoint) events E₁, E₂, ...:
    P(E₁ ∪ E₂ ∪ ...) = P(E₁) + P(E₂) + ...

Complement Rule

P(E') = 1 - P(E)

where E' is the complement of event E (the event that E does not occur).
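A quick numerical check of the complement rule in Python (the six-sided die is an illustrative choice):

```python
from fractions import Fraction

S = set(range(1, 7))   # fair six-sided die
E = {6}                # event: roll a six
p_E = Fraction(len(E), len(S))

# Complement: every outcome in S that is not in E
E_comp = S - E
assert Fraction(len(E_comp), len(S)) == 1 - p_E
print(1 - p_E)  # 5/6
```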

Addition Rule

For any two events A and B:

P(A ∪ B) = P(A) + P(B) - P(A ∩ B)

If A and B are mutually exclusive: P(A ∪ B) = P(A) + P(B)
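The addition rule can be verified by direct enumeration; here is a sketch with two arbitrary die events chosen for illustration:

```python
from fractions import Fraction

S = set(range(1, 7))   # fair six-sided die
A = {2, 4, 6}          # event: even roll
B = {4, 5, 6}          # event: roll greater than 3

def P(event):
    return Fraction(len(event), len(S))

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))  # 2/3
```

Note that A and B overlap here (on 4 and 6), so simply adding P(A) and P(B) would double-count the intersection.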

Conditional Probability

The probability of event A occurring given that event B has occurred:

P(A|B) = P(A ∩ B) / P(B), where P(B) > 0
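Conditional probability can also be computed by enumeration. A sketch with two rolls of a fair die (the specific events are illustrative):

```python
from fractions import Fraction

# All ordered outcomes of two rolls of a fair die
S = {(i, j) for i in range(1, 7) for j in range(1, 7)}
A = {(i, j) for (i, j) in S if i + j >= 10}   # sum is at least 10
B = {(i, j) for (i, j) in S if i == 6}        # first roll is a 6

def P(event):
    return Fraction(len(event), len(S))

# P(A|B) = P(A ∩ B) / P(B)
p_A_given_B = P(A & B) / P(B)
print(p_A_given_B)  # 1/2
```

Knowing the first roll is a 6 leaves only the second roll uncertain, so conditioning shrinks the effective sample space from 36 outcomes to 6.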

Multiplication Rule

P(A ∩ B) = P(A|B) × P(B) = P(B|A) × P(A)
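The multiplication rule chains a conditional probability with a marginal one. A sketch using the marble setup from Example 3 below (3 red, 2 blue, drawn without replacement):

```python
from fractions import Fraction

# Drawing two marbles from {3 red, 2 blue} without replacement
p_first_red = Fraction(3, 5)                   # P(B)
p_second_red_given_first_red = Fraction(2, 4)  # P(A|B)

# P(A ∩ B) = P(A|B) × P(B)
p_both_red = p_second_red_given_first_red * p_first_red
print(p_both_red)  # 3/10
```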

Bayes' Theorem

P(A|B) = P(B|A) × P(A) / P(B)
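As a sketch, the theorem is a one-line function; the numbers in the usage example are arbitrary placeholders, not from the text:

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """P(A|B) = P(B|A) * P(A) / P(B), assuming P(B) > 0."""
    return p_b_given_a * p_a / p_b

# Hypothetical values: P(B|A) = 0.8, P(A) = 0.3, P(B) = 0.5
print(bayes(0.8, 0.3, 0.5))  # 0.48
```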

Law of Total Probability

If B₁, B₂, ..., Bₙ form a partition of the sample space:

P(A) = Σ P(A|Bᵢ) × P(Bᵢ)
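A sketch of the law as a weighted sum; the factory scenario is a hypothetical example, not from the text:

```python
def total_probability(cond_probs, partition_probs):
    """P(A) = Σ P(A|Bᵢ) × P(Bᵢ) over a partition B₁, ..., Bₙ."""
    assert abs(sum(partition_probs) - 1.0) < 1e-9  # partition must cover S
    return sum(pa * pb for pa, pb in zip(cond_probs, partition_probs))

# Hypothetical: three factories produce 50%, 30%, 20% of all items,
# with defect rates 1%, 2%, 3% respectively.
p_defect = total_probability([0.01, 0.02, 0.03], [0.5, 0.3, 0.2])
print(p_defect)  # approximately 0.017
```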

Independence

Events A and B are independent if:

P(A ∩ B) = P(A) × P(B)

Equivalently (provided P(B) > 0): P(A|B) = P(A)
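Independence can be checked by enumeration. Two rolls of a fair die do not influence each other, so events about different rolls factor (the specific events are illustrative):

```python
from fractions import Fraction

S = {(i, j) for i in range(1, 7) for j in range(1, 7)}
A = {(i, j) for (i, j) in S if i % 2 == 0}   # first roll is even
B = {(i, j) for (i, j) in S if j == 6}       # second roll is a 6

def P(event):
    return Fraction(len(event), len(S))

# P(A ∩ B) = P(A) × P(B) holds exactly: 1/12 = 1/2 × 1/6
assert P(A & B) == P(A) * P(B)
```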

Types of Probability

Classical Probability

Based on equally likely outcomes. Used when all outcomes are known to be equally probable.

P(E) = Number of favorable outcomes / Total outcomes

Empirical Probability

Based on observed frequencies from experiments or data.

P(E) ≈ Frequency of E / Total number of trials
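A Monte Carlo sketch of empirical probability, estimating P(Heads) for a fair coin by simulation (the trial count and seed are arbitrary choices):

```python
import random

random.seed(0)  # fixed seed for reproducibility
trials = 100_000

# Simulate fair coin flips: random.random() < 0.5 is heads
heads = sum(random.random() < 0.5 for _ in range(trials))
print(heads / trials)  # close to 0.5
```

By the law of large numbers, the observed frequency converges to the true probability 0.5 as the number of trials grows.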

Subjective Probability

Based on personal judgment, experience, or belief about the likelihood of an event.

Examples

Example 1: Coin Toss

What is the probability of getting heads when flipping a fair coin?

Solution:

Sample space S = {Heads, Tails}

Favorable outcome = {Heads}

P(Heads) = 1/2 = 0.5 = 50%

Example 2: Dice Roll

What is the probability of rolling a sum of 7 with two fair dice?

Solution:

Total outcomes = 6 × 6 = 36

Favorable combinations: (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) = 6 ways

P(Sum = 7) = 6/36 = 1/6 ≈ 16.67%
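The count above can be confirmed by enumerating all 36 outcomes:

```python
from fractions import Fraction

# All ordered outcomes of rolling two fair dice
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
favorable = [o for o in outcomes if sum(o) == 7]

p = Fraction(len(favorable), len(outcomes))
print(len(favorable), p)  # 6 1/6
```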

Example 3: Conditional Probability

A bag contains 3 red and 2 blue marbles. Two marbles are drawn without replacement. What is the probability that the second marble is red given that the first was red?

Solution:

After drawing one red marble: 2 red and 2 blue remain

P(Second red | First red) = 2/4 = 1/2 = 50%
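The same answer falls out of enumerating all ordered two-marble draws (treating the five marbles as distinguishable positions):

```python
from fractions import Fraction
from itertools import permutations

marbles = ["R", "R", "R", "B", "B"]
draws = list(permutations(marbles, 2))   # 5 × 4 = 20 ordered draws

# Condition on the first marble being red, then count second-red cases
first_red = [d for d in draws if d[0] == "R"]
both_red = [d for d in first_red if d[1] == "R"]

p = Fraction(len(both_red), len(first_red))
print(p)  # 1/2
```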

Example 4: Bayes' Theorem

A disease affects 1% of the population. A test is 99% accurate (99% true positive rate, 99% true negative rate). If a person tests positive, what is the probability they actually have the disease?

Solution:

P(Disease) = 0.01, P(No disease) = 0.99

P(Positive|Disease) = 0.99, P(Positive|No disease) = 0.01

P(Disease|Positive) = (0.99 × 0.01) / [(0.99 × 0.01) + (0.01 × 0.99)]

P(Disease|Positive) = 0.0099 / 0.0198 = 0.5 = 50%
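The computation above, written out in Python (the denominator comes from the law of total probability):

```python
p_disease = 0.01
p_pos_given_disease = 0.99   # true positive rate
p_pos_given_healthy = 0.01   # false positive rate

# P(Positive) = P(Pos|Disease)P(Disease) + P(Pos|No disease)P(No disease)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # 0.5
```

The counterintuitive result (only 50% despite a "99% accurate" test) comes from the low base rate: among 10,000 people, roughly 99 true positives and 99 false positives are expected.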

Applications

  • Statistics: Hypothesis testing, confidence intervals, and data analysis
  • Finance: Risk assessment, portfolio management, and option pricing
  • Engineering: Reliability analysis and quality control
  • Computer Science: Machine learning, cryptography, and algorithm analysis
  • Medicine: Clinical trials and diagnostic testing
  • Physics: Quantum mechanics and statistical mechanics
  • Games: Gambling, sports analytics, and game theory
  • Insurance: Premium calculation and risk assessment

Key Properties

  • Probability values always range from 0 to 1 (inclusive)
  • The probability of an event and its complement sum to 1
  • Independent events do not influence each other's probability
  • Mutually exclusive events cannot occur simultaneously
  • The law of large numbers states that observed frequencies approach theoretical probabilities as the number of trials increases