Events - Let's start with a very simple thing. We have a die, roll it once, and see what kind of events could occur. We may roll a 1. It's also possible that we roll a 2. Then, it is also possible that before the die stops, a meteorite hits Earth and destroys the die, along with the entire human civilization.
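Here is a minimal sketch in plain Python of how events can be modeled as subsets of a sample space; the names sample_space, prob and the events are illustrative assumptions, not anything defined above.

```python
# A sample space and some events as plain Python sets.
sample_space = {1, 2, 3, 4, 5, 6}   # the possible results of one die roll

roll_a_one = {1}          # the event "we roll a 1"
roll_even = {2, 4, 6}     # the event "we roll an even number"

# With equally likely outcomes, an event's probability is favorable / total.
def prob(event):
    return len(event) / len(sample_space)

print(prob(roll_a_one))   # 0.1666...
print(prob(roll_even))    # 0.5
```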
Independence - In probability theory, to say that two events are independent means that the occurrence of one does not affect the probability of the other.
Mutually exclusive events - Two events are mutually exclusive if they cannot occur at the same time.
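Formally, A and B are independent exactly when P(A∩B) = P(A)·P(B), while they are mutually exclusive when P(A∩B) = 0. The following sketch checks both of the previous two notions on a single die roll; all names and events are illustrative assumptions.

```python
# One die roll: checking independence and mutual exclusivity numerically.
sample_space = {1, 2, 3, 4, 5, 6}

def prob(event):
    return len(event) / len(sample_space)

A = {2, 4, 6}     # "the roll is even"
B = {1, 2, 3, 4}  # "the roll is at most 4"
C = {1, 3, 5}     # "the roll is odd"

# Independent: P(A and B) equals P(A) * P(B).
print(prob(A & B), prob(A) * prob(B))  # 0.3333... and 0.3333...

# Mutually exclusive: A and C cannot occur together, so P(A and C) = 0.
print(prob(A & C))                     # 0.0
```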
Conditional probability - The conditional probability of A given B answers the question: what chance does A have if B definitely occurs? It is defined as P(A|B) = P(A∩B)/P(B), provided that P(B) > 0.
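A quick sketch of this formula on the die example, with hypothetical events:

```python
# P(A | B) = P(A and B) / P(B) on one die roll.
sample_space = {1, 2, 3, 4, 5, 6}

def prob(event):
    return len(event) / len(sample_space)

A = {6}        # "we roll a 6"
B = {2, 4, 6}  # "the roll is even"

# Knowing the roll is even, a 6 is one of three remaining outcomes.
print(prob(A & B) / prob(B))  # 0.3333...
```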
Total probability theorem - This formula is a fundamental rule relating marginal probabilities to conditional probabilities: if the events B1, B2, ..., Bn partition the sample space, then P(A) = P(A|B1)·P(B1) + P(A|B2)·P(B2) + ... + P(A|Bn)·P(Bn).
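A sketch on a made-up factory example, where three machines partition the production; every number below is an assumption for illustration.

```python
# Total probability: the events B1, B2, B3 ("made by machine k") partition
# the sample space, and A is "the part is defective".
p_machine = [0.5, 0.3, 0.2]          # P(B1), P(B2), P(B3); assumed numbers
p_defect_given = [0.01, 0.02, 0.05]  # P(A | Bk); assumed defect rates

# P(A) = P(A|B1)*P(B1) + P(A|B2)*P(B2) + P(A|B3)*P(B3)
p_defect = sum(pa * pb for pa, pb in zip(p_defect_given, p_machine))
print(p_defect)  # 0.021
```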
Bayes' theorem - We use this theorem if we want to calculate the probability of an earlier event (Bk) in light of a later occurring event (A): P(Bk|A) = P(A|Bk)·P(Bk) / P(A), where the denominator P(A) comes from the total probability theorem.
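Continuing the same made-up factory numbers: a part turned out to be defective (the later event A), and Bayes' theorem tells us which machine (the earlier event Bk) most likely produced it.

```python
p_machine = [0.5, 0.3, 0.2]          # P(Bk); assumed numbers
p_defect_given = [0.01, 0.02, 0.05]  # P(A | Bk); assumed defect rates

# The denominator P(A) comes from the total probability theorem.
p_defect = sum(pa * pb for pa, pb in zip(p_defect_given, p_machine))

# P(Bk | A) = P(A | Bk) * P(Bk) / P(A)
posteriors = [pa * pb / p_defect for pa, pb in zip(p_defect_given, p_machine)]
print(posteriors)  # [0.238..., 0.285..., 0.476...] -> machine 3 is the best bet
```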
Random variable - A random variable assigns a number to each outcome of an experiment; the result of a die roll, for example, is a random variable.
Distribution function - The distribution function of a random variable X is denoted by F(x) and defined as F(x)=P(X<x). But, before we fall victim to a fatal mistake, let's make it clear that x and X are two totally different things: X is the random variable itself, while x is simply a real number where we evaluate F.
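A small sketch of F(x) = P(X < x) when X is the result of one die roll; the function F below is illustrative, and it shows how x is just an ordinary number we plug in, while X lives inside the event.

```python
def F(x):
    favorable = [k for k in range(1, 7) if k < x]  # outcomes where X < x
    return len(favorable) / 6

print(F(1))    # 0.0: no outcome is below 1
print(F(3.5))  # 0.5: the outcomes 1, 2, 3
print(F(7))    # 1.0: every outcome is below 7
```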
Density function - The way the density function works is that probabilities are given by areas under its curve: P(a<X<b) is the area under the density between a and b.
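A sketch of this idea, numerically integrating the standard normal density between -1 and 1; scipy is assumed to be available, and the function name f is illustrative.

```python
# P(-1 < X < 1) for a standard normal X, as the area under the density curve.
from math import exp, pi, sqrt
from scipy.integrate import quad

def f(x):
    return exp(-x * x / 2) / sqrt(2 * pi)  # the density of N(0, 1)

area, _ = quad(f, -1, 1)
print(area)  # about 0.6827
```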
EXPECTED VALUE AND STANDARD DEVIATION
Expected value - We get the expected value by multiplying each value of X by its probability, and then summing these up: E(X) = x1·P(X=x1) + x2·P(X=x2) + ... (for a discrete random variable).
Standard deviation - The standard deviation tells us how large the fluctuation is around the expected value: D(X) = sqrt(E((X - E(X))^2)), which can also be computed as sqrt(E(X^2) - E(X)^2).
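A sketch computing both quantities for X = the result of one die roll, in plain Python with illustrative names.

```python
from math import sqrt

values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6          # each result has probability 1/6

# E(X) = sum of x * P(X = x)
expected = sum(x * p for x, p in zip(values, probs))

# D(X) = sqrt( E(X^2) - E(X)^2 )
second_moment = sum(x * x * p for x, p in zip(values, probs))
std_dev = sqrt(second_moment - expected ** 2)

print(expected)  # 3.5
print(std_dev)   # about 1.7078
```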
Markov's inequality - Markov's inequality makes a very simple statement, namely, that it is not very likely that a random variable X would get a lot bigger than its expected value: for a nonnegative X and any c > 0, P(X >= c) <= E(X)/c.
Chebyshev's inequality - Chebyshev's inequality says that a large distance from the expected value cannot be too likely: P(|X - E(X)| >= c) <= D^2(X)/c^2.
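A simulation sketch checking both bounds on die rolls, where E(X) = 3.5 and D^2(X) = 35/12; numpy is assumed to be available, and the constants c = 5 and c = 2 are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=100_000)  # 100000 fair die rolls

# Markov (X nonnegative, c = 5): P(X >= c) <= E(X) / c
print((rolls >= 5).mean(), "<=", 3.5 / 5)                    # ~0.333 <= 0.7

# Chebyshev (c = 2): P(|X - E(X)| >= c) <= D^2(X) / c^2
print((np.abs(rolls - 3.5) >= 2).mean(), "<=", 35 / 12 / 4)  # ~0.333 <= 0.729
```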
Binomial distribution - The binomial distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent yes/no experiments, each of which yields success with probability p.
Hypergeometric distribution - The hypergeometric distribution is a discrete probability distribution that describes the probability of getting k defective items in n draws, without replacement, from a finite population of size N containing exactly K defective items.
Poisson distribution - The Poisson distribution with parameter λ is the discrete probability distribution of the number of successes in a sequence of a very large number of yes/no experiments, each with a very small success probability, where λ is the expected number of successes.
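A sketch evaluating all three discrete distributions above with scipy.stats (assumed to be installed); the parameter values are arbitrary examples, and note that scipy's hypergeom takes its arguments as (M, n, N) = (population size, defective count, number of draws), which in the text's notation is (N, K, n).

```python
from scipy.stats import binom, hypergeom, poisson

# Binomial: k = 3 successes in n = 10 trials with success probability p = 0.3
print(binom.pmf(3, n=10, p=0.3))           # about 0.2668

# Hypergeometric: k = 2 defective in 5 draws from 50 items, 10 of them defective
print(hypergeom.pmf(2, M=50, n=10, N=5))   # about 0.2098

# Poisson: k = 2 successes when lambda = 3 are expected
print(poisson.pmf(2, mu=3))                # about 0.2240

# For large n and small p, binomial(n, p) is close to Poisson(n * p):
print(binom.pmf(2, n=1000, p=0.003), poisson.pmf(2, mu=3))
```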
Uniform distribution - The uniform distribution is a distribution that has constant probability density over an interval [a, b], and zero density outside it.
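A sketch of the continuous uniform distribution on the arbitrarily chosen interval [2, 6] with scipy.stats, which parameterizes it by loc = a and scale = b - a.

```python
from scipy.stats import uniform

a, b = 2, 6
U = uniform(loc=a, scale=b - a)

print(U.pdf(3))             # 0.25: the density is the constant 1 / (b - a)
print(U.pdf(7))             # 0.0:  zero outside the interval
print(U.cdf(5) - U.cdf(3))  # 0.5:  P(3 < X < 5) as an area under the flat line
```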