A discrete random variable has a countable number of possible numeric outcomes.
Probability mass function (pmf):
\(P(X = x) = f(x)\)
Cumulative distribution function (cdf):
\(P(X \leq x) = F(x)\)
\(E(X) = \sum_{x \in S} x f(x)\)
\(Var(X)= E(X^2) - [E(X)]^2\)
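To make these formulas concrete, here is a small Python sketch that computes \(E(X)\) and \(Var(X)\) for a discrete random variable; the pmf values are made up purely for illustration.

```python
# Hypothetical pmf for a discrete random variable X with S = {0, 1, 2, 3};
# the probabilities are illustrative and just need to sum to 1.
pmf = {0: 0.1, 1: 0.2, 2: 0.4, 3: 0.3}

# E(X) = sum over S of x * f(x)
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E(X^2) - [E(X)]^2
second_moment = sum(x**2 * p for x, p in pmf.items())
variance = second_moment - mean**2

print(mean)      # 1.9
print(variance)  # 0.89
```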
If \(X\) is a random variable that takes the value 1 (success) with probability \(\pi\) and the value 0 (failure) with probability \(1-\pi\), then \(X\) follows a Bernoulli distribution.
\(S = \{0, 1 \}\)
\(X \sim \text{Bernoulli} (\pi)\)
\(E(X) = \mu = \pi\)
\(Var(X)=\sigma^2 = \pi(1-\pi)\)
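A quick numerical check of the Bernoulli mean and variance in Python (\(\pi = 0.3\) is an arbitrary choice for the example):

```python
# Bernoulli(pi): P(X = 1) = pi, P(X = 0) = 1 - pi; pi = 0.3 is arbitrary.
pi = 0.3
pmf = {0: 1 - pi, 1: pi}

mean = sum(x * p for x, p in pmf.items())
variance = sum(x**2 * p for x, p in pmf.items()) - mean**2

print(mean, pi)                 # both 0.3
print(variance, pi * (1 - pi))  # both ~0.21
```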
Let \(X\) be the number of failures observed before the first success in independent trials. Then \(X\) follows a geometric distribution.
Let \(X\) be the number of successes in \(n\) independent trials, where each trial has probability of success \(\pi\). Then \(X\) follows a binomial distribution.
Let \(X\) represent the number of occurrences of an event within a fixed interval of time or space. Then \(X\) follows a Poisson distribution.
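These three pmfs can be written directly from their formulas. A minimal Python sketch (the parameter values \(\pi = 0.3\), \(n = 10\), and the Poisson mean \(\lambda = 2.5\) are arbitrary choices made for this example; \(\lambda\) is not defined in the notes above):

```python
from math import comb, exp, factorial

pi, n, lam = 0.3, 10, 2.5   # illustrative parameter values

def geometric_pmf(k, pi):
    # P(k failures before the first success), k = 0, 1, 2, ...
    return (1 - pi) ** k * pi

def binomial_pmf(k, n, pi):
    # P(k successes in n independent trials), k = 0, ..., n
    return comb(n, k) * pi ** k * (1 - pi) ** (n - k)

def poisson_pmf(k, lam):
    # P(k occurrences in a fixed interval with mean count lam), k = 0, 1, 2, ...
    return exp(-lam) * lam ** k / factorial(k)

# Each pmf sums to 1 (approximately, where the infinite support is truncated).
print(sum(geometric_pmf(k, pi) for k in range(1000)))     # ~1.0
print(sum(binomial_pmf(k, n, pi) for k in range(n + 1)))  # 1.0
print(sum(poisson_pmf(k, lam) for k in range(100)))       # ~1.0
```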
A continuous random variable \(X\) has a sample space \(S_X\) that is uncountably infinite.
Let \(X\) be the proportion of bike owners on campus. Then \(S_X = [0, 1]\).
Let \(Y\) be the survival time after some surgery. Then \(S_Y = [0, \infty)\).
A probability density function \(f(x)\) gives the relative likelihood that the continuous random variable takes a value near \(x\). It satisfies:
\[f(x) \geq 0 \text{ for all } x \in S_X\]
\[\int_{x \in S_X} f(x)\,dx = 1\]
\[P(a \leq X \leq b) = \int_a^b f(x)\,dx\]
Let \(X\) be a continuous random variable. Then the cumulative distribution function \(F(x) = P(X \leq x)\) is defined as:
\[F(x) = P(X \leq x) = \int_{-\infty}^x f(t)\, dt\]
Example: suppose \(X\) has pdf \(f(x) = 12x^2(1-x)\) for \(x \in S_X = [0, 1]\). Check that \(f\) is a valid pdf.
Since \(x^2 \geq 0\) and \((1-x) \geq 0\) on \([0, 1]\), we have \(f(x) \geq 0\) for all \(x \in S_X\).
\(\int_{x \in S_X} f(x)\,dx = \int_0^1 12x^2(1-x)\,dx = 12\int_0^1 (x^2-x^3)\,dx = 12\Big[\frac{x^3}{3} -\frac{x^4}{4}\Big]_0^1 = 1\)
\(P(0.25<X<0.50) = \int_{0.25}^{0.50} 12x^2(1-x)\,dx = 12\int_{0.25}^{0.50} (x^2-x^3)\,dx = 12\Big[\frac{x^3}{3} -\frac{x^4}{4}\Big]_{0.25}^{0.50} = 0.2617188\)
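Both steps (checking that \(f\) is a valid pdf and computing the interval probability above) can be confirmed numerically; a sketch using scipy.integrate.quad:

```python
from scipy.integrate import quad

def f(x):
    # pdf from the example: f(x) = 12 x^2 (1 - x) on [0, 1]
    return 12 * x**2 * (1 - x)

# f is non-negative on a grid of points in [0, 1]
assert all(f(i / 1000) >= 0 for i in range(1001))

# f integrates to 1 over the sample space
total, _err = quad(f, 0, 1)
print(total)  # 1.0 (up to numerical error)

# P(0.25 < X < 0.50)
prob, _err = quad(f, 0.25, 0.50)
print(prob)   # 0.26171875
```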
\(P(X \in B) = \int_{x \in B} f(x)\,dx\)
\(P(X=0.40) = \int_{0.40}^{0.40} 12x^2(1-x)\,dx = 12\Big[\frac{x^3}{3} -\frac{x^4}{4}\Big]_{0.40}^{0.40} = 0\)
Because \(X\) is continuous, the probability that it equals any single value is 0.
\(P(X\leq x) = \int_{-\infty}^x f(t)dt\)
\(P(X\leq 0.70) = \int_{0}^{0.70} 12t^2(1-t)\,dt = 12\Big[\frac{t^3}{3} -\frac{t^4}{4}\Big]_{0}^{0.70} = 0.6517\)
Note: \(f(x) = \frac{dF(x)}{dx}\)
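A numerical version of this cdf calculation, plus a finite-difference check of the note \(f(x) = dF(x)/dx\) (a sketch only; the lower limit 0 is used because the support starts at 0):

```python
from scipy.integrate import quad

def f(x):
    return 12 * x**2 * (1 - x)

def F(x):
    # F(x) = P(X <= x) = integral of f(t) dt from 0 (start of support) to x
    value, _err = quad(f, 0, x)
    return value

print(F(0.70))  # 0.6517

# Central-difference check that F'(x) = f(x) at x = 0.70
h = 1e-6
print((F(0.70 + h) - F(0.70 - h)) / (2 * h))  # ~1.764 = f(0.70)
```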
\(E(X) = \int_{x \in S_X} xf(x)\,dx\)
\(E(X) = \int_0^1 x \cdot 12x^2(1-x)\,dx = 12\int_0^1 (x^3-x^4)\,dx = 12\Big[\frac{x^4}{4} -\frac{x^5}{5}\Big]_0^1 = 0.6\)
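The same expected value obtained by numerical integration (a sketch with scipy):

```python
from scipy.integrate import quad

def f(x):
    return 12 * x**2 * (1 - x)

# E(X) = integral of x * f(x) over [0, 1]
mean, _err = quad(lambda x: x * f(x), 0, 1)
print(mean)  # 0.6
```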
\(Var(X) = E(X^2)- [E(X)]^2\)
\(E(X^2) = ?\)
\(E(X^2) = \int_0^1 x^2 \cdot 12x^2(1-x)\,dx = 12\int_0^1 (x^4-x^5)\,dx = 12\Big[\frac{x^5}{5} -\frac{x^6}{6}\Big]_0^1 = 0.4\)
\(Var(X) = 0.4- 0.6^2 = 0.04\)
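And the variance, computed numerically from \(E(X^2) - [E(X)]^2\) (a sketch with scipy):

```python
from scipy.integrate import quad

def f(x):
    return 12 * x**2 * (1 - x)

mean, _err = quad(lambda x: x * f(x), 0, 1)              # E(X)   = 0.6
second_moment, _err = quad(lambda x: x**2 * f(x), 0, 1)  # E(X^2) = 0.4

print(second_moment - mean**2)  # Var(X) = 0.04
```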