Probability and Combinatorics
Let's start with a very simple thing. We have a die, roll it once, and see what kind of events could occur.
We may roll a 1.
It's also possible that we roll a 2.
Then, it is also possible that before the die stops, a meteorite hits Earth and destroys the die, along with the entire human civilization.
Well, in this case the roll is invalid. At the beginning, we will only look at cases when the roll is valid, that is, when we get one of the six numbers.
This is called classical probability, and that's what we will discuss for a while. Meteorites will come later.
So, we have a total of six cases. These events are called elementary events.
There are events that consist of more than one elementary event. For example, rolling an even number.
Or, rolling a number greater than 2.
We will use uppercase letters to refer to events.
Every event has a probability. We compute that by counting how many elementary events are included in it, and divide that by the total number of elementary events.
Therefore, all probabilities are between 0 and 1.
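As a quick sketch of this counting rule, here is the die example in Python (the event definitions are just the examples above):

```python
from fractions import Fraction

outcomes = [1, 2, 3, 4, 5, 6]   # the six elementary events of one die roll

def probability(event):
    """Classical probability: favorable elementary events / all elementary events."""
    favorable = [o for o in outcomes if event(o)]
    return Fraction(len(favorable), len(outcomes))

p_even = probability(lambda o: o % 2 == 0)   # rolling an even number
p_gt_2 = probability(lambda o: o > 2)        # rolling a number greater than 2
print(p_even, p_gt_2)   # 1/2 2/3
```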
We can create new events from existing events.
Let's see what their probabilities look like.
Well, it is worth remembering these. Now, let's move on to something more interesting.
It is time for us to create a short summary of combinatorics.
There are n items. Either we choose all of them and arrange them in order, or we choose only k items from them, where either the order matters or it does not.
Number of permutations of n different items: n!
How many different ways can five people sit next to each other on a bench?
Number of permutations of k items chosen from n different items: n · (n-1) · · · (n-k+1) = n! / (n-k)!
How many different ways can three people out of five sit next to each other on a bench?
Number of combinations of k items chosen from n different items: C(n, k) = n! / (k! · (n-k)!)
How many ways can we choose three people out of five?
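The three bench-and-people counts above can be checked with Python's standard library:

```python
import math

# All 5 people, order matters: 5! ways to seat them on the bench
print(math.factorial(5))   # 120

# 3 out of 5, order matters: 5 * 4 * 3 ordered seatings
print(math.perm(5, 3))     # 60

# 3 out of 5, order does not matter: C(5, 3) possible groups
print(math.comb(5, 3))     # 10
```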
We pull five cards out of a 52-card standard deck.
What is the probability that the first and the third cards are aces?
Let's start with all cases.
We choose 5 out of the 52 cards. We need to know whether the order is important or not important.
Since phrases like "first card" and "third card" are mentioned, it seems the order is important.
Now, let's see the desired cases.
The first card is an ace, and that could be of 4 suits.
The next card could be anything from the remaining 51 cards.
Then the third card should again be an ace.
Let's see how many aces we have left.
We have no idea. If we pulled an ace for the second card, then we only have two left. However, if we didn't, then we have three.
This is a problem, indeed.
When we count the desired cases, we have to start with the wish.
Now the wish is that the first card is an ace, and the third card is an ace, too.
Next, we can think about the other cards.
There are 50 cards left for the second place.
Then 49 and 48.
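Putting the pieces together (4 choices of ace for the first card, 3 for the third, then 50 · 49 · 48 for the remaining places), a short check:

```python
import math
from fractions import Fraction

total = math.perm(52, 5)            # ordered ways to draw 5 of 52 cards
favorable = 4 * 3 * 50 * 49 * 48    # ace first, ace third, anything else elsewhere
p = Fraction(favorable, total)
print(p)   # 1/221
```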
What is the probability that only the first and the third cards are aces?
The order matters here, too.
The total number of cases is the same as before.
Let's see the desired cases.
We start with the wish, again.
But only the first and third cards may be aces, so the second card cannot be an ace.
Thus, it can only be one of the 48 non-ace cards.
Then 47 and 46.
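The same check for aces in exactly the first and third positions and nowhere else:

```python
import math
from fractions import Fraction

total = math.perm(52, 5)            # same ordered total as before
favorable = 4 * 3 * 48 * 47 * 46    # aces only in positions 1 and 3
p = Fraction(favorable, total)
print(float(p))   # about 0.00399
```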
What is the probability that there will be exactly two aces among the cards?
The order does not matter here, so we use combination.
We have to pull two out of the 4 aces.
Then we need 3 more cards that are not aces.
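Since positions no longer matter, the whole count runs on combinations:

```python
import math
from fractions import Fraction

total = math.comb(52, 5)                        # unordered 5-card hands
favorable = math.comb(4, 2) * math.comb(48, 3)  # 2 of the 4 aces, 3 non-aces
p = Fraction(favorable, total)
print(float(p))   # about 0.0399
```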
Well, this is splendid. Finally, let's see one more exercise.
A basketball team has 9 players, and 5 of them are on the court at the same time.
What is the probability that the two best players are on the court together?
The order of selection does not matter, only whom we select for the court.
Thus, we need combination.
Let's see how many cases there are altogether.
We choose five out of the 9 players.
The desired case is when the two best players are on the court, so we definitely choose them,
and then three others.
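Fixing the two best players on the court and choosing the remaining three from the other seven:

```python
import math
from fractions import Fraction

total = math.comb(9, 5)       # all ways to pick 5 of the 9 players
favorable = math.comb(7, 3)   # both stars in, 3 more from the other 7
p = Fraction(favorable, total)
print(p)   # 5/18
```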
What is the probability that only one of the two best players is on the court?
The total number of cases is the same here.
The desired case is when we choose one of the two best players
and then four more out of the miserable amateurs.
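One star in, one star out, and four players from the remaining seven:

```python
import math
from fractions import Fraction

total = math.comb(9, 5)
favorable = math.comb(2, 1) * math.comb(7, 4)   # one star, then 4 of the other 7
p = Fraction(favorable, total)
print(p)   # 5/9
```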
Events A and B are called independent if it holds that P(A ∩ B) = P(A) · P(B).
In the previous die-rolling example event A was rolling an even number, and event B was rolling a number greater than 2. Let's see if these are independent.
This seems to check out, so events A and B are independent.
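The check can be reproduced by enumerating the die's outcomes:

```python
from fractions import Fraction

A = {2, 4, 6}       # rolling an even number
B = {3, 4, 5, 6}    # rolling a number greater than 2

def P(event):
    return Fraction(len(event), 6)   # classical probability on one die roll

# Independence means P(A and B) equals P(A) * P(B)
print(P(A & B), P(A) * P(B))   # 1/3 1/3
```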
There is yet another event, C.
Let's see if B and C are independent.
Events A and B are called mutually exclusive if it holds that P(A ∩ B) = 0, that is, they cannot occur at the same time.
Let's see what it looks like for the events in our example.
Well, it seems these are not mutually exclusive.
On the other hand, A and C are mutually exclusive.
At an insurance company, 70% of the customers have car insurance, 60% have home insurance, and 90% have at least one of these two.
Let event A be that a customer has car insurance, and event B that a customer has home insurance. Are these two events independent?
Two events are independent if P(A ∩ B) = P(A) · P(B).
Well, let's see what P(A ∩ B) and P(A) · P(B) would be.
Based on these, they are not independent.
And they are not mutually exclusive either, because
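Numerically, using inclusion-exclusion to get the probability of having both insurances:

```python
p_a, p_b, p_union = 0.70, 0.60, 0.90
p_both = p_a + p_b - p_union   # inclusion-exclusion: P(A and B) = 0.40

print(abs(p_both - p_a * p_b) < 1e-9)   # False: 0.40 != 0.42, so not independent
print(p_both > 0)                       # True: they can occur together, not exclusive
```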
At another insurance company, 70% of the customers have car insurance, and 20% of the customers have home insurance without car insurance. What percent of customers have home insurance, if having car insurance and home insurance are independent events?
Well, there is such a thing as: if A and B are independent, then B and the complement of A are independent, too, so P(home insurance without car insurance) = P(B) · (1 - P(A)) = P(B) · 0.3 = 0.2.
So, P(B) = 0.2 / 0.3 = 2/3, that is, about 67% of customers have home insurance.
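A quick numeric check, assuming a car-insurance rate of 70% (the figure consistent with the 2/3 answer):

```python
p_car = 0.7                # assumed car-insurance rate, consistent with the 2/3 answer
p_home_and_no_car = 0.2    # home insurance without car insurance

# If A and B are independent, so are (not A) and B:
# P(B and not A) = P(B) * (1 - P(A))  =>  P(B) = 0.2 / 0.3
p_home = p_home_and_no_car / (1 - p_car)
print(round(p_home, 3))   # 0.667
```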
This is splendid, but let's continue with something really interesting.
We have a die that we roll once. Let event A be rolling an odd number, and event B rolling a number greater than 3.
We get the probability of A the usual way.
We count how many elementary events it contains and divide that by the total number of elementary events.
There is nothing exciting so far.
The real thrill comes now.
Let's see what the probability of A will be, if we know that event B will definitely occur.
Well, we only have 3 cases, because event B will definitely happen,
and the desired event is an odd number, which is only one of these.
Thus, this new probability is 1/3, and the following notation is used:
We read this as A given B, and it answers the question of what chance A has if B definitely occurs.
The probability of event A, if event B definitely occurs: P(A | B) = P(A ∩ B) / P(B)
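The die example from above, done by restricting the sample space to B:

```python
from fractions import Fraction

A = {1, 3, 5}   # rolling an odd number
B = {4, 5, 6}   # rolling a number greater than 3

# P(A | B): restrict the sample space to B and count A's cases inside it
p_a_given_b = Fraction(len(A & B), len(B))
print(p_a_given_b)   # 1/3
```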
Let's see what we can use this for.
In a town, out of 1000 residents, on average, 350 are smoking, 120 have some sort of cardiovascular trouble, and there are 400 who belong to at least one of these groups.
If a resident has cardiovascular troubles, what is the probability that he is a smoker?
Let's see the problem.
Cardiovascular trouble is definite, smoking is questionable.
We use the conditional probability formula: P(smoker | cardiovascular trouble) = P(both) / P(cardiovascular trouble) = (0.35 + 0.12 - 0.40) / 0.12 = 0.07 / 0.12.
Thus, a resident with cardiovascular troubles is a smoker with a 0.583 probability.
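The same computation as a one-liner check, with the overlap obtained by inclusion-exclusion:

```python
p_smoker, p_cardio, p_union = 0.35, 0.12, 0.40

p_both = p_smoker + p_cardio - p_union   # inclusion-exclusion: 0.07
p_smoker_given_cardio = p_both / p_cardio
print(round(p_smoker_given_cardio, 3))   # 0.583
```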
Here is another very exciting story.
According to a survey, 90% of TV watchers watch at least one of the morning or evening news. Those who watch the evening news have a 20% chance that they also watched the morning news. The morning news is watched by 30% of all TV watchers.
What is the probability that if someone watches the morning news, then he watches the evening news as well?
A=watches in the morning
B=watches in the evening
Let's try writing the question:
watches in the morning: definite
watches in the evening: questionable
So far so good.
Let's see what we know.
evening for sure, morning 20% chance
There are then these formulas.
90% of TV watchers watch at least one of the morning or evening news.
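Putting the three pieces of data together, the computation can be finished like this (a sketch of the remaining steps):

```python
p_morning = 0.30                 # P(A): watches the morning news
p_union = 0.90                   # P(A or B): watches at least one of the two
p_morning_given_evening = 0.20   # P(A | B)

# P(A and B) = P(A|B) * P(B), and P(A or B) = P(A) + P(B) - P(A and B)
# => 0.9 = 0.3 + P(B) - 0.2 * P(B), so P(B) = 0.6 / 0.8 = 0.75
p_evening = (p_union - p_morning) / (1 - p_morning_given_evening)
p_both = p_morning_given_evening * p_evening

# P(B | A): the chance of watching in the evening, given morning watching
p_evening_given_morning = p_both / p_morning
print(round(p_evening, 2), round(p_evening_given_morning, 2))   # 0.75 0.5
```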