In the following probability problems, problems #1-3 function as a set, problems #4-5 are another set, and problems #6-7 are yet another set. Within each set, the scenarios are similar and the answer choices are the same. What is going on there? Do all questions in the same set have the same answer? Do they all have different answers? What is happening?

1) In a certain game, you perform three tasks. You flip a quarter, and success would be heads. You roll a single die, and success would be a six. You pick a card from a full playing-card deck, and success would be picking a spades card. If any of these tasks is successful, then you win the game. What is the probability of winning?

2) In a certain game, you perform three tasks sequentially. First, you flip a quarter, and if you get heads you win the game. If you get tails, then you move to the second task. The second task is rolling a single die. If you roll a six, you win the game. If you roll anything other than a six on the second task, you move to the third task: drawing a card from a full playing-card deck. If you pick a spades card you win the game, and otherwise you lose the game. What is the probability of winning?

3) In a certain game, you perform three tasks. You flip a quarter, and success would be heads. You roll a single die, and success would be a six. You pick a card from a full playing-card deck, and success would be picking a spades card. If *exactly* one of these three tasks is successful, then you win the game. What is the probability of winning?

*The following information accompanies questions 4-5*

Johnson has a corporate proposal. The probability that vice-president Adams will approve the proposal is 0.7. The probability that vice-president Baker will approve the proposal is 0.5. The probability that vice-president Corfu will approve the proposal is 0.4. The approvals of the three VPs are entirely independent of one another.

4) Suppose Johnson must get VP Adams’s approval, as well as the approval of at least one of the other VPs, Baker or Corfu, to win funding. What is the probability that Johnson’s proposal is funded?

(A) 0.14

(B) 0.26

(C) 0.49

(D) 0.55

(E) 0.86

5) Suppose Johnson must get the approval of at least two of the three VPs to win funding. What is the probability that Johnson’s proposal is funded?

(A) 0.14

(B) 0.26

(C) 0.49

(D) 0.55

(E) 0.86

*The following information accompanies questions 6-7*

Johnson has a corporate proposal. The probability that vice-president Adams will approve the proposal is 0.6. If VP Adams approves the proposal, then the probability that vice-president Baker will approve the proposal is 0.8. If VP Adams doesn’t approve the proposal, then the probability that vice-president Baker will approve the proposal is 0.3.

6) What is the probability that one of the two VPs, but not the other, approves Johnson’s proposal?

(A) 0.12

(B) 0.24

(C) 0.28

(D) 0.48

(E) 0.72

7) What is the probability that at least one of the two VPs approves Johnson’s proposal?

(A) 0.12

(B) 0.24

(C) 0.28

(D) 0.48

(E) 0.72

Solutions will come at the end of this blog.

## Probability blogs

Here are some previous blogs on probability:

2) The Probability “At Least” Question

3) Probability and Counting Techniques

5) Probability DS Practice Questions

Each of the first four has a few practice questions, and the fifth article has 8 DS questions, so combined with the seven here, that’s a great deal of probability practice!

## A review of rules

One important idea in probability is **mutually exclusive** (a.k.a. “disjoint”). Two events are mutually exclusive if they can’t both happen at the same time: the very fact that one happens completely precludes the other from happening. For example, on a single coin toss, the results H and T are mutually exclusive. On a single die roll, the six numbers on the die are mutually exclusive.

If events F and G are mutually exclusive, then

**P(F and G) = 0**

and

**P(F or G) = P(F) + P(G)**

The first equation expresses mathematically what we expressed in words: it’s impossible for outcomes F & G to occur at the same time. The second rule is the pure form of the rough probability idea that “OR means add” — that rule is approximately true most of the time, but exactly true when the two events are mutually exclusive.
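Both rules are easy to verify by brute-force counting over a small sample space. Here is a quick sketch in Python (the specific events are just my illustration):

```python
from fractions import Fraction

# Sample space for one die roll: six equally likely faces.
faces = range(1, 7)

# F = "roll a 1" and G = "roll a 2" are mutually exclusive on a single roll.
p_F = Fraction(sum(1 for f in faces if f == 1), 6)          # 1/6
p_G = Fraction(sum(1 for f in faces if f == 2), 6)          # 1/6

# No face is both 1 and 2, so P(F and G) = 0 ...
p_F_and_G = Fraction(sum(1 for f in faces if f == 1 and f == 2), 6)

# ... and counting the OR event directly matches the plain sum 1/6 + 1/6.
p_F_or_G = Fraction(sum(1 for f in faces if f == 1 or f == 2), 6)
```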

If events A and B are just general events, not mutually exclusive, then

**P(A or B) = P(A) + P(B) – P(A and B)**

That is the **generalized OR rule**, a very important rule.
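For a concrete check of the generalized OR rule, consider drawing one card and asking for “a heart or a king”; the king of hearts is the overlap. A sketch in Python (my own example, not one of the problems above):

```python
from fractions import Fraction
from itertools import product

# A full 52-card deck: 13 ranks x 4 suits, all equally likely.
deck = list(product(range(1, 14), ["hearts", "spades", "diamonds", "clubs"]))

is_heart = lambda card: card[1] == "hearts"
is_king  = lambda card: card[0] == 13          # rank 13 stands for "king"

p_heart = Fraction(sum(map(is_heart, deck)), 52)   # 13/52
p_king  = Fraction(sum(map(is_king, deck)), 52)    #  4/52
p_both  = Fraction(sum(1 for c in deck
                       if is_heart(c) and is_king(c)), 52)   # 1/52

# Generalized OR rule: subtract the overlap so the king of hearts
# isn't counted twice.
p_either = p_heart + p_king - p_both
```

A direct count of the “heart or king” cards gives the same 16/52 = 4/13, which is the whole point of subtracting the overlap.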

Another important idea in probability is the idea of **independent**. Two events are independent if whether one happens has absolutely no bearing on whether the other happens; in other words, if we are told the outcome of one event, the fact that this outcome occurred gives us absolutely no information that would help us predict whether the other event will occur. In tossing coins, the separate coins are independent. In rolling dice, the separate dice are independent. In the real world, two absolutely unrelated things would be independent. Consider these two events:

Event A = the New York Mets win on a particular day

Event B = the Nikkei (the Japanese stock index) goes up on a particular day

Those two have absolutely no influence on one another, and even if we are given explicit information about the outcome of one, that would give us absolutely no insight into the outcome of the other.

If each of two variables is a numerical variable that could take a range of numerical values, then a synonym for “independent” would be “completely uncorrelated.” If two variables are correlated, then having information about the value of one allows you to make an informed prediction about the value of the other. If two variables are independent, then knowing the value of one doesn’t give us the foggiest idea of what the other could be.

If events X and Y are independent, then

**P(X and Y) = P(X)*P(Y)**

This rule is the pure form of the rough probability idea that “AND means multiply” — that rule is approximately true most of the time, but exactly true when the two events are independent. Notice, for independent events, we can simplify the generalized OR rule:

**P(X or Y) = P(X) + P(Y) – P(X)*P(Y)**
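Here are those two independent-event formulas applied to a coin flip and a die roll (a small sketch of my own):

```python
from fractions import Fraction

# Independent trials: one fair coin flip and one fair die roll.
p_heads = Fraction(1, 2)
p_six   = Fraction(1, 6)

# AND means multiply (exact here, because the trials are independent):
p_heads_and_six = p_heads * p_six                     # 1/12

# OR rule for independent events:
p_heads_or_six = p_heads + p_six - p_heads * p_six    # 7/12
```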

Those rules will get you through a great deal of the probability on the GMAT. BUT, what if we need an AND rule for two events that are not independent? What would be the “generalized” version of the AND rule for any two events, not just events that are independent? In order to talk about that, we need to introduce a new idea.

It’s worthwhile also mentioning — the conditions “mutually exclusive” and “independent” are special case scenarios. They are the opposite of common: they are relatively rare. Never make it your default assumption that either is true unless the question makes clear that it must be the case.

## Conditional probability

This is a term that, like many math terms, will not explicitly appear on the GMAT, and the notation I will show, standard in many probability textbooks, will not appear on the GMAT. Nevertheless, the idea of conditional probability does appear on the GMAT.

The notation we use is P(A|B). Event A is the main focus: we are interested whether or not A occurs. Event B is some kind of condition we impose: the idea is, we will pretend that we live in a world in which Event B is always true — under those conditions, what is the probability of A? P(A|B) is a “conditional probability”, a probability when we impose the condition of B. The notation P(A|B) is read “the probability of A, given B.”

Here are a few examples. Suppose

A = on a given day in Berkeley, CA, it rains

B = on a given day in Berkeley, CA, there are no clouds in the sky.

Here P(A) would be the probability that here in Berkeley we get rain on a randomly selected day; that would be approximately 0.10 or 0.15. By contrast, if we impose the condition “no clouds”, then the conditional probability, P(A|B), would have to be zero: how could it possibly rain when there are no clouds? This is an example of a condition lowering a probability; in the next example, the condition will elevate the probability.

Here’s another example, more socially relevant. Suppose

A = a randomly selected felony defendant is convicted

B = the defendant is African-American

P(A) looks at all folks in the USA accused of and tried for a felony, and regardless of any individual factors (race, age, evidence, crime, etc.) just asks: what percent, overall, are convicted? According to the BJS, this percent is P(A) = 0.68. In a world of perfect fairness and equality, P(A) and P(A|B) would equal exactly the same thing — in other words, a person’s race would play absolutely no role in whether that person were convicted of a felony. Most regrettably, in America in 2013, 148 years after the end of the US Civil War, 45 years after the death of Dr. Martin Luther King Jr., racism still has a large effect on American society and an overwhelming effect on the criminal justice system; in other words, P(A|B) > P(A). Conditional probability is *not* just a mathematical idea: it has profound social and moral implications for all kinds of issues of justice and fairness in the real world.

On a more mathematical note, notice if events X and Y are independent, then whether Y occurs or not should have absolutely no bearing on whether X occurs. In other words, for independent events X & Y, P(X|Y) = P(X).
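That last fact is easy to check by counting. In the sketch below, two dice events happen to be independent, so the conditional probability collapses to the plain probability (the events are my own choices):

```python
from fractions import Fraction
from itertools import product

# Roll two fair dice: 36 equally likely outcomes.
space = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(1 for s in space if event(s)), len(space))

X = lambda s: s[0] % 2 == 0        # first die is even
Y = lambda s: s[0] + s[1] == 7     # the two dice sum to 7

# Definition of conditional probability: P(X|Y) = P(X and Y) / P(Y)
p_X_given_Y = prob(lambda s: X(s) and Y(s)) / prob(Y)

# X and Y turn out to be independent here, so P(X|Y) equals P(X) = 1/2.
```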

## The generalized AND rule

Now that we have discussed conditional probability, we can discuss the generalized AND rule.

If events A and B are two general events, not mutually exclusive, not independent, then, as a general rule:

**P(A and B) = P(A)*P(B|A)**

**P(A and B) = P(B)*P(A|B)**

Either one of these is the **generalized AND rule**. Notice that AND still means multiply, though what we multiply here is a little different from what was multiplied in the special case with independent events.

For example, suppose we are going to pick two cards from a full 52-card deck, without replacement, and we want to know the probability of picking two heart-cards. The phrase “without replacement” means that when we pick the first card, we put it aside and do not return it to the deck, so that the second card is picked from a deck of *only 51* cards. That changes the probability. The words “without replacement” always mean the choices are NOT independent, because the outcome of the first choice has a big influence on the outcome of subsequent choices. For this example, let

A = first choice is a heart-card

B = second choice is a heart-card

There are four suits in a full deck, and each suit is equally represented, so P(A) = 1/4. Let’s think about P(B|A). If the first card was a heart-card and it was not replaced, that means the second choice is made from a deck of 51 cards that has 13 cards in each of the other three suits but only 12 heart-cards. Thus, P(B|A) = 12/51 = 4/17, and P(A and B) = (1/4)*(4/17) = 1/17.

The generalized AND rule is often used in sequential tasks such as this, in which there are earlier choices or trials, and the outcomes of these have various effects on later choices or trials. The generalized AND rule is most often not applicable in a more side-by-side choice, in which all the choices are available at the outset.
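As a sanity check on the two-hearts example, we can enumerate every ordered draw of two distinct cards and count; a quick sketch in Python:

```python
from fractions import Fraction
from itertools import permutations

# 52 cards as (rank, suit) pairs; "H" marks the hearts.
deck = [(rank, suit) for rank in range(13) for suit in "HSDC"]

# Every ordered draw of two cards without replacement: 52 * 51 outcomes.
draws = list(permutations(deck, 2))

both_hearts = sum(1 for first, second in draws
                  if first[1] == "H" and second[1] == "H")

p_two_hearts = Fraction(both_hearts, len(draws))   # (13*12)/(52*51) = 1/17
```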

## Summary

If you understand everything in this post, you are a GMAT Probability pro. If you had some insights while reading, you may want to give the seven problems at the top another look before reading the solutions below.

Here’s another probability question:

8) http://gmat.magoosh.com/questions/1038

If you would like to add anything or ask a clarifying question about anything I have said, please let me know in the comments section.

## Practice problem solutions

1) In this scenario, winning combinations would include success on any one task as well as any combination of two or three successes. In other words, there are several cases that constitute the winning combinations. By contrast, the only way to lose the game is to be unsuccessful at all three tasks. Let’s use the **complement rule**.

P(lose game) = P(quarter = T AND die ≠ 6 AND card ≠ spades)

= (1/2)*(5/6)*(3/4) = 5/16

P(win game) = 1 – P(lose game) = 1 – (5/16) = 11/16

Answer = **(D)**

Curious about why you can’t simply add P(A) + P(B) + P(C) to solve Problem #1? See the extended discussion in the blog comments below.
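A brute-force check of this answer: enumerate every equally likely (coin, die, suit) outcome and count the winners. In the sketch below I collapse the 52-card deck to its four equally likely suits, since only the suit matters:

```python
from fractions import Fraction
from itertools import product

# 2 coin faces * 6 die faces * 4 suits = 48 equally likely outcomes;
# "S" is spades.
outcomes = list(product("HT", range(1, 7), "SHDC"))

wins = sum(1 for coin, die, suit in outcomes
           if coin == "H" or die == 6 or suit == "S")

p_win = Fraction(wins, len(outcomes))   # 33/48 = 11/16
```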

2) In this scenario, there are several routes that would lead to winning the game. The only route that leads to losing the game is the route in which all three tasks are unsuccessful. We can do this precisely as we did the previous problem.

P(lose game) = P(quarter = T AND die ≠ 6 AND card ≠ spades)

= (1/2)*(5/6)*(3/4) = 5/16

P(win game) = 1 – P(lose game) = 1 – (5/16) = 11/16

Answer = **(D)**

3) This is very tricky. We have to think of three cases.

Case One: success with coin, no success with die or card

P(coin = H AND die ≠ 6 AND card ≠ spade) = (1/2)*(5/6)*(3/4) = 15/48

Case Two: success with die, no success with coin or card

P(coin = T AND die = 6 AND card ≠ spade) = (1/2)*(1/6)*(3/4) = 3/48

Case Three: success with card, no success with die or coin

P(coin = T AND die ≠ 6 AND card = spade) = (1/2)*(5/6)*(1/4) = 5/48

The winning scenario could be Case One OR Case Two OR Case Three. Since these are joined by OR statements and are mutually exclusive, we simply add the probabilities: P(win) = 15/48 + 3/48 + 5/48 = 23/48.

Answer = **(E)**
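The three-case sum can also be checked by brute force; summing the booleans counts the successes, and we keep only the outcomes with exactly one:

```python
from fractions import Fraction
from itertools import product

# Same 48 equally likely (coin, die, suit) outcomes as in problem #1.
outcomes = list(product("HT", range(1, 7), "SHDC"))

# Win only when exactly one of the three tasks succeeds.
wins = sum(1 for coin, die, suit in outcomes
           if (coin == "H") + (die == 6) + (suit == "S") == 1)

p_exactly_one = Fraction(wins, len(outcomes))   # 23/48
```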

4) We will use the abbreviation A = VP Adams approves, B = VP Baker approves, and C = VP Corfu approves.

P(funding) = P(A and (B or C)) = P(A)*P(B or C)

We can multiply because everything is independent of everything else. First look at P(B or C). These are not mutually exclusive, so we need to use the generalized OR rule:

P(B or C) = P(B) + P(C) – P(B and C)

Because B & C are independent, we can multiply to find P(B and C)

P(B or C) = (0.5) + (0.4) – (0.5)*(0.4) = 0.9 – 0.2 = 0.7

Now, multiply by P(A)

P(funding) = P(A)*P(B or C) = (0.7)*(0.7) = 0.49

Answer = **(C)**
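Here is the same computation as a short script, if you want to play with the numbers:

```python
# Approval probabilities from the prompt; all three are independent.
p_A, p_B, p_C = 0.7, 0.5, 0.4

# Generalized OR rule for the independent pair Baker, Corfu:
p_B_or_C = p_B + p_C - p_B * p_C        # 0.5 + 0.4 - 0.2 = 0.7

# Adams AND (Baker OR Corfu); independence lets us multiply:
p_funded = p_A * p_B_or_C               # 0.7 * 0.7 = 0.49
```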

5) For this one, we have to consider four different cases

P(A and B and (not C)) = (0.7)*(0.5)*(0.6) = 0.21

P(A and (not B) and C) = (0.7)*(0.5)*(0.4) = 0.14

P((not A) and B and C) = (0.3)*(0.5)*(0.4) = 0.06

P(A and B and C) = (0.7)*(0.5)*(0.4) = 0.14

These four are mutually exclusive and are joined by OR, so we add them.

P(funding) = 0.21 + 0.14 + 0.06 + 0.14 = 0.55

Answer = **(D)**
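Rather than listing the four cases by hand, we can loop over all eight approve/reject patterns and keep the ones with at least two approvals. A sketch (the weighting works because the VPs are independent, and the patterns are mutually exclusive):

```python
from itertools import product

probs = (0.7, 0.5, 0.4)   # approval probabilities for Adams, Baker, Corfu

p_funded = 0.0
for pattern in product((1, 0), repeat=3):   # 1 = approves, 0 = rejects
    if sum(pattern) >= 2:                   # at least two approvals
        weight = 1.0
        for approves, p in zip(pattern, probs):
            weight *= p if approves else 1 - p
        p_funded += weight                  # mutually exclusive, so add
```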

6) We will use the abbreviation A = VP Adams approves and B = VP Baker approves. We will consider two cases:

Case #1: Adams approves and not Baker

P(A and not B) = P(A)*P(not B|A) = (0.6)*(0.2) = 0.12

Case #2: Baker approves and not Adams

P(not A and B) = P(not A)*P(B|not A) = (0.4)*(0.3) = 0.12

These two cases are mutually exclusive and joined by OR, so we add them.

P(only one VP approves) = 0.12 + 0.12 = 0.24

Answer = **(B)**
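For anyone who likes to see the arithmetic laid out, here is the generalized AND rule in code form:

```python
# Conditional setup from the prompt: Baker's probability depends on Adams.
p_A            = 0.6
p_B_given_A    = 0.8
p_B_given_notA = 0.3

# Generalized AND rule, case by case:
p_only_A = p_A * (1 - p_B_given_A)          # 0.6 * 0.2 = 0.12
p_only_B = (1 - p_A) * p_B_given_notA       # 0.4 * 0.3 = 0.12

p_exactly_one = p_only_A + p_only_B         # mutually exclusive, so add
```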

7) Here, the combinations (A and not B), (not A and B), and (A and B) all count as at least one VP approving. The only one that doesn’t is the complement (not A and not B).

P(not A and not B) = P(not A)*P(not B|not A) = (0.4)*(0.7) = 0.28

P(at least one) = 1 – P(not A and not B) = 1 – 0.28 = 0.72

Answer = **(E)**
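And the complement-rule version of that calculation as a script:

```python
p_A            = 0.6
p_B_given_notA = 0.3

# Complement rule: the only losing combination is "neither approves".
p_neither      = (1 - p_A) * (1 - p_B_given_notA)   # 0.4 * 0.7 = 0.28
p_at_least_one = 1 - p_neither                      # 0.72
```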

Hi there Mike,

I don’t think you realize how awesome you are. I really enjoyed reading your posts. You usually touch on the history of a given subject, and it makes it apparent that you really enjoy teaching. I have also found that you are one of the rare people who is capable of thinking of the justice system in probabilistic terms (I first came across the concept in Fooled by Randomness – a book I am sure you will enjoy if you haven’t read it already). And finally, I found your posts to be immensely useful for my test prep. They are bite-sized, and they contain good practice problems. I can’t thank you enough.

Keep the good stuff coming.

Hi Sid,

Thanks so much for your kind words! I’ll make sure Mike sees them!

Happy studying!

Dani

Hi Mike,

I fell for a trap in the answer choices for Question 1 but I’m not sure what the problem is in my logic. It’s asking what the probability of winning is, and to win you need 1 success out of 3 attempts (unrelated to each other).

Let A = Flipping a head = (1/2)

Let B = Rolling a 6 = (1/6)

Let C = Drawing a spade card = (1/4)

P (A or B or C) = (1/2) + (1/6) + (1/4) = (11/12)

In this case, you wouldn’t have to subtract the overlap, such as flipping a head AND rolling a 6, because you still win the game either way so it doesn’t matter if they are counted twice.

Obviously I’m wrong, but I don’t get why… please explain if you can! I love this site by the way!

I am having the same issue as Carly. Can you help us out, Mike?

Thanks in advance.

I am also having the same issue. Can you please help?

Thanks!

Carly & Greg & DC,

I’m happy to respond. The very tricky thing about using the OR formula is that you have to think extremely carefully about overlapping cases. In the scenario in problem #1, for brevity, let’s say:

A = the event of flipping the quarter and getting heads — P(A) = 1/2

B = the event of rolling the die and getting 6 — P(B) = 1/6

C = the event of drawing a spade card from the full shuffled deck — P(C) = 1/4

If we flip the quarter, roll the die, and draw a card, there are eight possible outcomes:

Outcome #1 = A and B and C

Outcome #2 = A and B but not C

Outcome #3 = A and C but not B

Outcome #4 = B and C but not A

Outcome #5 = A but neither B nor C

Outcome #6 = B but neither A nor C

Outcome #7 = C but neither A nor B

Outcome #8 = neither A nor B nor C

Now, notice a few things.

P(A) = 1/2 = P(O#1) + P(O#2) + P(O#3) + P(O#5)

P(B) = 1/6 = P(O#1) + P(O#2) + P(O#4) + P(O#6)

P(C) = 1/4 = P(O#1) + P(O#3) + P(O#4) + P(O#7)

Now, if we simply add P(A) + P(B) + P(C), as you have suggested, we get a truckload of overlap — Outcomes #2 & #3 & #4 get double-counted, and Outcome #1 gets triple-counted!! That’s a lot of overlap that we would have to subtract! If we were going to do it this way, we would have to calculate the probabilities of the first four outcomes, then

P(win) = P(A) + P(B) + P(C) – P(O#2) – P(O#3) – P(O#4) – 2*P(O#1)

That would be complicated!!

Notice that any of the first seven outcomes would be a win, so the probability of winning would be the sum of the first seven outcomes. That still requires seven separate calculations before we can compute what we want. Still a lot of work!

This is why I chose the elegance of the complement rule. All I did in the solution above was calculate P(O#8) and subtract it from one: this automatically gives the sum of the probabilities of the first seven outcomes, which is the probability of a win. Using the complement is far easier and more efficient here than any other approach.
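If you want to see the complement-rule answer vindicated numerically, here is a quick Monte Carlo sketch (the simulation setup and names are just illustrative):

```python
import random

random.seed(0)                      # fixed seed so the run is repeatable
trials = 200_000

wins = sum(
    1 for _ in range(trials)
    if random.random() < 0.5        # heads on the quarter
    or random.randrange(6) == 5     # a six on the die
    or random.randrange(4) == 0     # a spade (one suit in four)
)

estimate = wins / trials            # should hover near 11/16 = 0.6875
```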

Does all this make sense?

Mike

I don’t know why it is the case, but almost always, in the independent and non-mutually-exclusive cases, we never add up all the positive possibilities with an OR. We always look at the (1 – AND) case.

The best way I can explain it is this: if you had two separate coin tosses (Toss A and Toss B), you would have a probability of 1/2 of Heads for each toss.

So if we asked, what’s the probability of toss A OR toss B being heads?

And you then said, well, it’s OR so ADD: 1/2 + 1/2 = 1

This is obviously nonsense, since we all know that the probability of A or B being heads is not 100% or guaranteed!

On the other hand, if we looked at the inverse case, we would say that “AT LEAST” 1 heads covers all cases except the one where both are tails.

P(A or B) = 1 – P(not A and not B) = 1 – (1/2)*(1/2) = 1 – 1/4 = 3/4, which makes sense!

Hope this helps.

Hello Mike,

Can you help me identify what is wrong in my method for solving Q.4?

P(Johnson’s proposal is funded) = P(A and B approve) + P(A and C approve) + P(A, B, and C approve)

= (0.7 * 0.5) + (0.7 * 0.4) + (0.7 * 0.5 * 0.4)

= 0.77

Thanks in advance

Dear Sagar,

I’m happy to respond. With all due respect, there are a number of problems in your approach. Part of the problem is that you jump immediately from the probability ideas to math without understanding the exact translation. Roughly, in probability a plus sign means OR, but in math, OR always includes the AND case. Thus, P(A and B approve) already includes some cases in which C also approves and some in which C doesn’t approve. P(A and C approve) already includes some cases in which B approves and some in which B doesn’t approve. The case in which all three approve was already double-counted between those two terms, and then you add that in again a third time! If in that third term you had subtracted it instead of adding it, subtracting to undo the double-count rather than adding to create a triple-count error, you would have gotten the right answer.

In probability, you can’t just plug into formulas. You have to understand the precise implication of each and every formula, because it’s extremely easy to intend one thing and then plug into a formula that implies something different. The formulas should NOT be the focus of your understanding: rather, you should focus on the precise details of the “AND” and “OR” scenarios.

Does all this make sense?

Mike

Hi Mike,

Thanks for putting up these examples. For the first VP example (problem 4), why can’t we have P(A and B) or P(A and C), instead of P(A and (B or C))? It would be great if you could help me out.

Regards

Greetings Mike,

before I ask my question, I do want to thank you and commend you for the helpful work you have posted. Also, on a few occasions you have replied to my comments with great help, which I also thank you for.

I had a question in regards to the proposal question. I know that the rule for the “at least” probability questions is to find the probability of the complementary event(s) and then subtract from 1. In the case of the proposal question, the reason why we don’t subtract from 1 is because, by logic, we already know the probability of one of the VPs deciding not to accept the proposal, correct? As opposed to the card question, where we have to calculate the probability of not winning and subtract it from 1.

Hope to hear from you soon.

Kind Regards,

Herpal Pabla

Dear Herpal,

I’m happy to respond. The point of the “at least” rule is a point of strategy. It is often 100% mathematically correct to calculate things either way — calculate (a) all the cases I do want, or calculate (b) all the cases that I don’t want and subtract that from 1. The point of strategy is that, sometimes, (b) is a much much easier calculation than (a). In #2, for example: calculate it both ways. Calculate the probability of all the scenarios that would lead to winning, and then separately calculate the probability of losing and subtract that from one. I think you will find that the first is a much much longer calculation. Please don’t just accept my word for it: do the two calculations yourself. The point of using the “at least” rule is to employ it when it will save time, when it will significantly reduce the calculation. That’s also true in #7.

In #6, both routes of calculations would be about equally complex. When there’s no significant time advantage to taking the “at least” solution route, then there’s no reason to take it. That’s why I didn’t even examine the “at least” solution for #6.

I assume you were asking about #6. You just said “the proposal question,” and there were two questions, very different, about the proposal scenario. It will help you to be as precise as you can be in your questions.

Does all this make sense?

Mike

Hi Mike,

In question number 2, I feel like the correct answer would be 23/48 if there were no order for moving on to the next task. Would that be incorrect? I feel like starting off with a 1/2 chance of winning and then having 2 more opportunities increases the player’s chances of winning above 1/2. Is there something I’m not understanding?

Hi Mike,

I think I just realized my error: I wasn’t calculating all the scenarios of winning but the (odds) of winning. I seem to mix those up a lot. Anywho, I really am enjoying your site, and thank you for the valuable info!

Josh,

You are quite welcome, my friend. Best of luck to you!

Mike

Hi Mike,

In question 4,

“Suppose Johnson must get VP Adams’s approval, as well as the approval of at least one of the other VPs, Baker or Corfu, to win funding”

aren’t you only considering the “Baker or Corfu” case?

Shouldn’t you also consider a case where all three approve?

It would be really helpful if you cleared this.

Thanks!

Dear Vidu,

I’m happy to help.

In mathematics in general and in probability in particular, the word OR is always what is known as the “inclusive OR” — it always includes the AND case. The OR rule formulas given on this page are all for the “inclusive OR”: they all automatically include the AND case.

You see, the way OR is used in mathematics is different from the way it is often used in colloquial speech, as an “exclusive OR” (called XOR in computer programming). “You can have cookies OR ice cream!” The implication, of course, is that having both is forbidden. This is the “exclusive OR”, common in colloquial English. The mathematical OR never means this.

Because the word OR is always inclusive, P(A and (B or C)) automatically includes the case in which all three approve.

Does all this make sense?

Mike

There seem to be some errors in the solution to question 4.

====================

We can multiply because everything is independent of everything else. First look at P(B or C). These are not mutually exclusive, so we need to use the generalized OR rule:

P(B or C) = P(B) + P(C) – P(A and B)

Because B & C are independent, we can multiply to find P(A and B)

P(B or C) = (0.5) + (0.4) – (0.5)*(0.4) = 0.9 – 0.2 = 0.7

====================

I think it should be P(B and C), not P(A and B).

Dear Mathguy,

Yes, that was a genuine typo, and I just corrected it. Thanks for pointing this out.

Mike

Dear Mike,

Excellent article. Probability is one of the trickiest (if I can say so) parts of the GMAT. This article really explains it in a very simple and efficient way.

Could you just check the solution for the 3rd example? I think the correct answer is (E) and not (D) as currently stated.

Cheers

Dear Milovan,

Yes, thank you for catching that typo. Great eye for detail! The answer to the third question is indeed (E). Best of luck to you.

Mike