math problem

Reply Fri 8 Aug, 2008 12:09 pm
i posted this in Math & Science but didn't get much of a response, so i'm trying it here. it's a type of math problem i came across in a book i read recently, "Fooled by Randomness". here's a good example: do try to solve it without googling, please.

"A cab was involved in a hit and run accident at night. Two cab companies, the Green and the Blue, operate in the city. 85% of the cabs in the city are Green and 15% are Blue.

A witness identified the cab as Blue. The court tested the reliability of the witness under the same circumstances that existed on the night of the accident and concluded that the witness correctly identified each one of the two colors 80% of the time and failed 20% of the time.

What is the probability that the cab involved in the accident was Blue rather than Green?"

here's a clue: the answer is not 80%. it would suffice to explain why it's not 80% rather than calculate the correct probability, although that's not too hard to do.

i also invite you to post any other common miscalculations of probability you can think of.

Reply Fri 8 Aug, 2008 01:38 pm
I'm guessing you're looking for something along the lines of:

Out of 100 randomly-selected cabs, the witness would identify 71 as Green and 29 as Blue.

However, you're not discussing 100 cabs; you're discussing one cab.
Reply Fri 8 Aug, 2008 01:49 pm

I set up a table of WAS vs. SAW.

G/G .68
G/B .17
B/G .03
B/B .12

Since B was seen, we only need to deal with G/B and B/B.
.12/.29 for was B
.17/.29 for was G
Reply Fri 8 Aug, 2008 01:56 pm
mark, you're spot on. :wink:

drewdad, it does simplify calculations to just say there are 100 cabs total, since we want a percentage answer. given there are 85 green cabs and they're correctly identified 80% of the time, that's .8x85 = 68, and given 15 blue cabs incorrectly identified 20% of the time, that's .2x15 = 3, for a total of 71 identified as green. without any more math, i conclude that the remaining 29 cabs are identified as blue. of the 29, 17 are actually green--that's 85 - 68--so the probability of the cab actually being blue is 12/29 = 41%.

while it's true only one cab is being discussed, we want a probability, which is an average: if this witness made the identification 100 times, he'd be right only about 41% of the time, a lot less than the 80% one would naively expect.
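here's a quick sketch of the same Bayes calculation in Python; the only inputs are the base rates and the witness accuracy from the problem statement:

```python
# Bayes' theorem for the cab problem.
p_blue = 0.15        # base rate: 15% of cabs are Blue
p_green = 0.85       # base rate: 85% of cabs are Green
p_correct = 0.80     # witness identifies either color correctly 80% of the time

# Total probability the witness says "Blue":
# correct ID of a Blue cab, plus mistaken ID of a Green cab.
p_says_blue = p_correct * p_blue + (1 - p_correct) * p_green  # .12 + .17 = .29

# Posterior: P(cab was Blue | witness said Blue)
p_blue_given_says_blue = (p_correct * p_blue) / p_says_blue

print(round(p_blue_given_says_blue, 4))  # ≈ 0.4138
```

which agrees with the 12/29 = 41% worked out by hand.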
Reply Fri 8 Aug, 2008 02:32 pm
yitwail wrote:
while it's true only one cab is being discussed, we want a probability, which is an average: if this witness made the identification 100 times, he'd be right only about 41% of the time, a lot less than the 80% one would naively expect.

It's been a while since my last probability/statistics course, but you've got a sampling error.
Reply Fri 8 Aug, 2008 03:44 pm
you're one up on me then, not having had the benefit of a probability course, so i'd appreciate it if you'd point out the sampling error. i got the problem & answer from wikipedia--not necessarily a reliable source--but it's an example of what's called the "base rate fallacy".

A math prof, John Allen Paulos, author of "Innumeracy", used similar reasoning to argue against the pentagon's proposed Total Information Awareness (TIA) program thus:

DO THE MATH: ROOTING OUT TERRORISTS IS TRICKY BUSINESS by John Allen Paulos appeared in the Sunday LOS ANGELES TIMES January 23, 2003. See also related piece on ABCNews.com.

PHILADELPHIA -- Let's start with a basic question: What is the purpose of the battle against terrorism? One answer, perhaps reflecting how most Americans see things, is that we want to feel safe. As philosopher Thomas Hobbes knew, what people want most from the state is protection, not freedom. To this end, and since terrorists appear relatively invulnerable to the usual deterrents, it follows that we would ideally intercept them before they carry out their attacks.

This is part of what is fueling policies like the incarcerations at Guantanamo, the massive sweeps by the Immigration and Naturalization Service, the registration programs we've seen since Sept. 11 and, more ominously, the Pentagon's proposed techno-surveillance system, Total Information Awareness (TIA). Headed by retired Vice Adm. John M. Poindexter of Iran-Contra notoriety, TIA will cost, by some estimates, upward of $200 million over three years. Initial funding of $10 million will help set up a system to "detect, classify, ID, track [and] preempt" future terrorists -- pre-perpetrators, if you will -- whom Poindexter hopes to spot before they do harm.

Using supercomputers, sophisticated software and data-mining techniques common in marketing, the TIA will maintain records on Americans' credit card purchases, plane flights, e-mails, prescriptions, book purchases, housing, legal proceedings, driver's licenses, rental permits and more, all in the hope of detecting suspicious patterns of activity -- buying certain chemicals, say, or renting crop-dusting planes.

Upon detecting these supposedly telltale patterns, law enforcement would hope to stop pre-perpetrators before they commit crimes. It's a worthy goal, but in pursuing it the government will collect, integrate and evaluate extensive personal data on all of us, greatly compromising our privacy and perhaps even our political liberty. Is it worth the cost to society?

Let's consider a mathematical approach to that question, one that derives from probability theory and the obvious fact that the vast majority of people of every ethnicity are not terrorists.

For the sake of argument, let's assume that eventually some system of information gathering and interpretation becomes so uncannily accurate that when it examines a future terrorist (someone with terrorist intentions), 99% of the time it will correctly identify him as a pre-perpetrator. Furthermore, when this system examines somebody who is harmless, 99% of the time the system will correctly identify him as harmless. In short, it makes a mistake only once every 100 times.

Now let's say that law enforcement apprehends a person using this technology. Given these assumptions, one might guess that the person would almost certainly be a terrorist. Right? Well, no. Even with the system's amazing data-mining powers, there would be only a tiny chance that the apprehended person would have gone on to commit a terrorist act if he had not been caught.

To see why this is so and to make the calculations easy, let's postulate a population of 300 million people of whom 1,000 are future terrorists. The system will correctly identify, we're assuming, 99% of these 1,000 people as future terrorists. Thus, since 99% of 1,000 is 990, the system will apprehend 990 future terrorists. Great.

But wait. There are, by assumption, 299,999,000 nonterrorists in our population, and the system will be right about 99% of them as well. Another way of saying this is that it will be wrong about 1% of these people. Since 1% of 299,999,000 equals 2,999,990, the system will swoop down on these 2,999,990 innocent people as well as on the 990 guilty ones, apprehending them all.

That is, the system will arrest almost 3 million innocent people, about 3,000 times the number of guilty ones. And that occurs, remember, only because we're assuming the system has these amazing powers of discernment. If its powers are anything like our present miserable predictive capacities, an even greater percentage of those arrested will be innocent.

Of course, this is an imagined scenario, and the numbers, percentages and assumptions are open to serious question. Nevertheless, the fact remains that since almost all people are innocent, the overwhelming majority of the people rounded up using any set of reasonable criteria will be innocent. And even though the system proposes only increased scrutiny rather than arrest of suspected future terrorists, such scrutiny might very well lead over time to a voluminously detailed government dossier on each of us. At the same time, since scrutiny without interdiction is unlikely to stop future terrorists from carrying out an attack, the system is likely to lead to little, if any, increase in security.

We want to feel safe as we go about our daily lives, but I submit that the proposed Total Information Awareness program is not conducive to a feeling of safety, much less to a feeling of freedom. Let's fight terrorism without ditching our commitment to privacy rights.


John Allen Paulos, a professor of mathematics at Temple University and adjunct professor of journalism at Columbia University, is the author of "Innumeracy" and the forthcoming "A Mathematician Plays the Market."
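paulos' arithmetic is easy to check with a short script; all the numbers are the ones assumed in the article:

```python
# False-positive arithmetic from the Paulos article.
population = 300_000_000   # assumed population
terrorists = 1_000         # assumed number of future terrorists
accuracy = 0.99            # system is right 99% of the time, either way

true_positives = accuracy * terrorists                        # 990 flagged terrorists
false_positives = (1 - accuracy) * (population - terrorists)  # 2,999,990 flagged innocents
flagged = true_positives + false_positives

# Probability that a flagged person is actually a terrorist
p_guilty = true_positives / flagged
print(f"{p_guilty:.5f}")   # ≈ 0.00033, roughly 1 in 3,000
```

so even with 99% accuracy both ways, a flagged person is overwhelmingly likely to be innocent, because innocents outnumber terrorists 300,000 to 1.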
Reply Fri 8 Aug, 2008 04:13 pm
That particular article is spot on. It's similar to analyzing the accuracy of pregnancy tests (which is the example they used back in college).

I just don't like the cab example, because we're not talking about a RANDOM sample of cabs.

Each time a car goes by, the observer is 80% likely to be correct. It is only over the course of a large number of samples that the error rate makes the count inaccurate.

Like I said, though, it's been a while.
Reply Fri 8 Aug, 2008 05:21 pm
i understand your critique now. in defense, i'd say that the taxi problem's meant to be illustrative, not necessarily rigorous in every particular.
