LieBabyCryBaby wrote on Dec 6th, 2006 at 7:53pm:
Digithead,
I simply don't follow your numbers. Sorry. If a process is accurate 90% of the time, that doesn't equate to being correct only 1 out of 12 times. Perhaps you have a theorem that accounts for this. If not, I'm sure you will at least make it sound impressive.
Nope, it's not impressive; it just requires the ability to do arithmetic. Understanding fractions and percentages would help too...
So to reiterate, the two numbers that matter here are called positive predictive value (PPV) and negative predictive value (NPV).
PPV is the probability that a person has the condition given that the test is positive; NPV is the probability that a person does not have the condition given that the test is negative.
Take your example of 90% accuracy and assume 1% of the population is deceptive and 99% is truthful. With 1000 examinees, that means 10 are deceptive and 990 are truthful.
That means you will have 0.9 x 10 = 9 true positives and 10 - 9 = 1 false negative. Notice the sneaky conversion from percentages to decimals.
It also means you will have 0.9 x 990 = 891 true negatives and 990 - 891 = 99 false positives. Ah, subtraction, addition's tricky friend.
So the total number of positives in this example, true and false combined, is 9 + 99 = 108. Are you still with me?
That means your PPV = 9/108 = 8.3%, i.e., an 8.3% probability that the person is deceptive given that the test is positive. Great accuracy if the test is positive.
Warning, division coming up.
In other words, 99/9 = 11 people will be falsely accused for every person correctly identified. To put it another way, only 1 out of every 12 positives (11 false positives + 1 true positive) will be a correct call.
That's some fancy arithmetic, wouldn't you say?
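If you'd rather let a computer do the arithmetic, here's a minimal Python sketch of the same calculation. The numbers and variable names are just the hypothetical example above (90% accuracy, 1% base rate, 1000 examinees), nothing more:

```python
# Minimal sketch of the arithmetic above, using the same hypothetical numbers:
# 90% accuracy, 1% base rate of deception, 1000 examinees.
n = 1000
deceptive = 10                    # 1% of 1000
truthful = n - deceptive          # the other 990
accuracy = 0.9                    # the assumed 90% accuracy figure

true_positives = accuracy * deceptive           # 9 deceptive examinees correctly flagged
false_negatives = deceptive - true_positives    # 1 deceptive examinee missed
true_negatives = accuracy * truthful            # 891 truthful examinees correctly cleared
false_positives = truthful - true_negatives     # 99 truthful examinees falsely flagged

total_positives = true_positives + false_positives    # 108
ppv = true_positives / total_positives                # 9 / 108

print(f"PPV = {ppv:.3f}")                                                   # 0.083
print(f"False accusations per correct call: {false_positives / true_positives:.0f}")  # 11
```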
If the base rate is 99% deceptive instead, the same arithmetic plays out for the negatives: NPV = 9/108 = 8.3%, meaning 11 liars are cleared for every truthful examinee correctly cleared.
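That symmetry is easier to see in closed form. Here's a sketch of the same calculation via Bayes' rule; like the example above, it assumes a single accuracy figure that applies equally to deceptive and truthful examinees (the function names ppv and npv are just illustrative):

```python
def ppv(accuracy, base_rate):
    # P(deceptive | test positive), assuming the same accuracy for both groups
    return (accuracy * base_rate) / (
        accuracy * base_rate + (1 - accuracy) * (1 - base_rate))

def npv(accuracy, base_rate):
    # P(truthful | test negative), under the same assumption
    return (accuracy * (1 - base_rate)) / (
        accuracy * (1 - base_rate) + (1 - accuracy) * base_rate)

print(f"{ppv(0.9, 0.01):.3f}")   # 0.083 -> only 1 in 12 positives is really deceptive
print(f"{npv(0.9, 0.99):.3f}")   # 0.083 -> the mirror image: only 1 in 12 negatives is really truthful
```

Swap the base rate and PPV and NPV simply trade places, which is exactly why screening a mostly truthful population produces so many false accusations.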
LieBabyCryBaby wrote on Dec 6th, 2006 at 7:53pm:
Here's an interesting article about forensic "science."
http://men.msn.com/articlepm.aspx?cp-documentid=808224>1=8883
Most people don't realize it, but many of the forensic tools used in police work aren't as accurate as shows like "CSI" would have us believe. There aren't very many of them that you could stake a case on and be 100% sure of making the right call. As a district attorney is quoted in the article, "Hair analysis, fiber analysis, bite marks--you don't want to base too much of a case on those. Some prosecutors succumb to the temptation to rest their case on a fiber or a hair. But a good case is made up of a bunch of little things." Even fingerprints are said to be inaccurate a significant percentage of the time.
But would we throw out these methods that are not 100% accurate, and use eyewitness testimony alone? Which, by the way, is also nowhere near 100% accurate.
I absolutely agree. But we can pursue more accurate methods, discard ones that don't work, seek supporting evidence that in totality reduces error, and continuously improve the system through science. The CQT polygraph is not based on science and will never get more accurate; therefore, it should be discarded...
LieBabyCryBaby wrote on Dec 6th, 2006 at 7:53pm:
I will agree with any "anti-" person on this forum that polygraph charts alone should not determine guilt or whether a person should be hired for a job. But knowing from experience that the polygraph is usually right, I would also argue, as many agencies do, that we should keep it as a useful tool, despite the fact that it is not 100% accurate. Remember, those of you who claim to be "false positives": In law enforcement you use the best tools you have until something better comes along. If you pass the screening process and get the law enforcement job you want, you will in fact be using many of those same tools that are not 100% accurate, thereby creating your own "false positive" victims while you're right most of the time, not all of the time. Ironic, then, that many of you who want those law enforcement jobs are sitting here arguing against an imperfect law enforcement tool that is widely accepted, by many, many people and agencies, as one of those good tools.
If one million people do a foolish thing, it's still a foolish thing...