“Amid Surge in Popularity, Lie Is Put to the Polygraph”

Los Angeles Times staff writer Charles Piller reports. Excerpt:

The demand to cast an ever-wider net of security across the country has created a rush to embrace technologies that have demonstrated a sometimes staggering propensity for snaring the innocent.

Since last year’s terrorist attacks on the World Trade Center and the Pentagon, hijackers, spies and snipers are seen as blending invisibly into airports, public buildings and city streets — even infiltrating the very agencies that guard against attacks. With almost no one above suspicion, security agencies increasingly are looking for screening technologies that can peer into the thoughts of thousands or even millions of people.

Artificial-intelligence software that plucks terrorist needles from haystacks of unrelated data, facial recognition stations that see hijackers behind newly grown beards at airport checkpoints, and electronic identification systems for travelers are being implemented despite clear signs that the error-prone systems may do more harm than good, experts say.

The latest example in this trend is the 100-year-old polygraph. In a scathing report this month, the nation’s most respected scientific society, the National Academy of Sciences, debunked the use of polygraphs to catch spies and screen employees. The study called polygraph tests so flawed as to be “a danger to national security.”

But even before reviewing the rigorous assessment, a wide range of police and federal security agencies said they have no plans to abandon the device. And unlike experimental, high-tech security tools that are not yet widely deployed, the “lie detector” is used daily by thousands of police departments and federal security agencies.

Security officials cite a lack of alternative technologies and, despite the report’s findings, an abiding faith that it is better to suspect many in order to detect one or two terrorists or criminals.

Always an Error Margin

“There’s always a margin of error,” said Wayne Jones, a recruiter for the San Jose Police Department. “But is it a good indicator? Yes. It’s not a fishing expedition.”

Experts view such widespread support for a discredited technology as a distressing sign of lowered standards of protection as the nation races to catch not only spies and terrorists, but those who might merely be contemplating a criminal act. It signals, they say, a growing disconnect between scientific certainty and security imperatives in the post-9/11 world.

“A key problem is the illusion of control. A lot of technology is marketed to make people think they know more than they do, and can do more than they can,” said Edward Tenner, author of “Why Things Bite Back: Technology and the Revenge of Unintended Consequences.”

“Not only will these technologies be a distraction, but there is an even greater danger — that terrorists may be able to work around them.”

Intelligence and military agencies use the polygraph extensively. In a growing trend, more than 62% of large police departments test job applicants and many test criminal suspects. It’s an effort born of frustration. Security professionals are trying to satisfy public pressure to preempt acts of terrorism and other crimes. No technology can read minds.

But there is the polygraph.

Jones, of the San Jose police, credits polygraph testing of job applicants with saving his city from a Rampart-like scandal, in which crooked Los Angeles cops terrorized lawbreakers and innocents alike during the 1990s.

A Ringing Endorsement

“I put a lot of stock into it — and I’ve been in the business for 25 years,” he said.

Jones’ remarks were echoed by police examiners from Los Angeles, Dallas, Chicago and San Diego, as well as some with the CIA, Customs Service and Secret Service.

Yet the National Academy report could hardly have been more dismissive of the practice.

Overconfidence in the polygraph actually reduces security because many loyal employees are judged deceptive while most spies escape notice, the report noted.
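A hypothetical worked example makes the trade-off concrete. The numbers below are assumptions chosen for illustration, not figures from the academy’s report: with only a handful of spies among thousands of employees, any screening threshold either floods investigators with falsely accused loyal employees or lets most spies pass.

    # Hypothetical base-rate sketch. All numbers are assumptions for
    # illustration; they are not figures from the academy's report.

    def screen(employees, spies, sensitivity, false_positive_rate):
        """Count who a given threshold flags: (spies caught, innocents flagged)."""
        spies_caught = spies * sensitivity
        innocents_flagged = (employees - spies) * false_positive_rate
        return spies_caught, innocents_flagged

    # Aggressive threshold: catches 8 of 10 spies but flags ~2,000 loyal employees.
    print(screen(10_000, 10, sensitivity=0.80, false_positive_rate=0.20))   # (8.0, 1998.0)

    # Lenient threshold: flags only ~50 innocents, but 8 of 10 spies now pass.
    print(screen(10_000, 10, sensitivity=0.20, false_positive_rate=0.005))  # (2.0, 49.95)

Under either assumed threshold the outcome matches the report’s warning: tune the test to catch spies and it buries investigators in false accusations; tune it to spare the innocent and most spies escape notice.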

“National security is too important to be left to such a blunt instrument,” said Stephen E. Fienberg, a statistician at Carnegie Mellon University in Pittsburgh and chairman of the academy panel.
