Science/Tech

Title: Brain Scans May Be Used As Lie Detectors
Source: NY Post via Big News Network
URL Source: http://feeds.bignewsnetwork.com/red ... 4c0924b70&cat=c08dd24cec417021
Published: Jan 28, 2006
Author: Malcolm Ritter
Post Date: 2006-01-28 21:45:02 by Zipporah
Keywords: Detectors, Brain, Scans
Views: 19
Comments: 1

CHARLESTON, S.C. (AP) -- Picture this: Your boss is threatening to fire you because he thinks you stole company property. He doesn't believe your denials. Your lawyer suggests you deny it one more time - in a brain scanner that will show you're telling the truth.

Wacky? Science fiction? It might happen this summer.

Just the other day I lay flat on my back as a scanner probed the tiniest crevices of my brain and a computer screen asked, "Did you take the watch?"

The lab I was visiting recently reported catching lies with 90 percent accuracy. And an entrepreneur in Massachusetts is hoping to commercialize the system in the coming months.

"I'd use it tomorrow in virtually every criminal and civil case on my desk" to check up on the truthfulness of clients, said attorney Robert Shapiro, best known for defending O.J. Simpson against murder charges.

Shapiro serves as an adviser to entrepreneur Steven Laken and has a financial interest in Cephos Corp., which Laken founded to commercialize the brain-scanning work being done at the Medical University of South Carolina.

That's where I had my brain-scan interrogation. But this lab isn't alone. Researchers at the University of Pennsylvania have also reported impressive accuracy through brain-scanning recently. California entrepreneur Joel T. Huizenga plans to use that work to start offering lie-detecting services in Philadelphia this July.

His outfit, No Lie MRI Inc., will serve government agencies and "anybody that wants to demonstrate that they're telling the truth," he said.

Both labs use brain-scanning technology called functional magnetic resonance imaging, or fMRI. It's a standard tool for studying the brain, but research into using it to detect lies is still in early stages. Nobody really knows yet whether it will prove more accurate than polygraphs, which measure things like blood pressure and breathing rate to look for emotional signals of lying.

But advocates for fMRI say it has the potential to be more accurate, because it zeros in on the source of lying, the brain, rather than using indirect measures. So it may someday provide lawyers with something polygraphs can't: legal evidence of truth-telling that's widely admissible in court. (Courts generally regard polygraph results as unreliable, and either prohibit such evidence or allow it only if both sides in a case agree to let it in.)

Laken said he's aiming to offer the fMRI service for use in situations like libel, slander and fraud where it's one person's word against another, and perhaps in employee screening by government agencies. Attorneys suggest it would be more useful in civil than most criminal cases, he said.

Of course, there's no telling where the general approach might lead. A law review article has discussed the legality of using fMRI to interrogate foreigners in U.S. custody. Maybe police will use it as an interrogation tool, too, or perhaps major companies will find it cheaper than litigation or arbitration when an employee is accused of stealing something important, other observers say.

For his part, Shapiro says he'd switch to fMRI from polygraph for screening certain clients because he figures it would be more reliable and maybe more credible to law enforcement agencies.

In any case, the idea of using fMRI to detect lies has started a buzz among scientists, legal experts and ethicists. Many worry about rushing too quickly from the lab to real-world use. Some caution that it may not work as well in the real world as the early lab results suggest.

And others worry that it might.

Unlike perusing your mail or tapping your phone, this is "looking inside your brain," Hank Greely, a law professor who directs the Stanford Center for Law and the Biosciences, told me a few days before my scan.

It "does seem to me to be a significant change in our ability ... to invade what has been the last untouchable sanctuary, the contents of your own mind," Greely said. "It should make us stop and think to what extent we should allow this to be done."

But Dr. Mark George, the genial neurologist and psychiatrist who let me lie in his scanner and be grilled by his computer, said he doesn't see a privacy problem with the technology.

That's because it's impossible to test people without their consent, he said. Subjects have to cooperate so fully - holding the head still, and reading and responding to the questions, for example - that they have to agree to the scan.

"It really doesn't read your mind if you don't want your mind to be read," he said. "If I were wrongly accused and this were available, I'd want my defense lawyer to help me get this."

So maybe the technology is better termed a "truth confirmer" than lie detector, he said.

Whatever you call it, the technology has produced some eyebrow-raising results. George and his colleagues recently reported that using fMRI data, a computer was able to spot lies in 28 out of 31 volunteers.

I joined an extension of that study. That's why I found myself lying on a narrow table in George's lab while he and his assistants pulled a barrel-shaped framework over my head like a rigid hood. As it brushed the tip of my nose and blotted out the light from the room, I looked straight ahead to see a computer screen, which would be my interrogator.

Then the table eased into the tunnel of the fMRI scanner, a machine the size of a small storage shed. Only my legs stuck out.

As I focused on the questions popping up on the computer screen, the scanner roared like a tractor trying to uproot a tree stump.

It was bombarding me with radio waves and a powerful magnetic field to create detailed images of my brain and detect tiny changes in blood flow in certain areas. Those changes would indicate those areas were working a bit harder than usual, and according to research by George and others, that would in turn indicate I was lying.

Some questions that popped up on that screen were easy: Am I awake, is it 2004, do I like movies. Others were a little more challenging: Have I ever cheated on taxes, or gossiped, or deceived a loved one. As instructed, I answered them all truthfully, pushing the "Yes" button with my thumb or the "No" button with my index finger.

Then, there it was: "Did you remove a watch from the drawer?"

Just a half-hour or so before, in an adjacent room, I'd been told to remove either a watch or a ring from a drawer and slip it into a locker with my briefcase. This was the mock crime that volunteers lied about in George's study. So I took the watch. As I lay in the scanner I remembered seizing its gold metal band and nestling it into the locker.

So, the computer was asking, did I take the watch?

No, I replied with a jab of my finger. I didn't steal nuthin'.

I lied again and again. Other questions about the watch popped up seemingly at random during the interrogation. Is the watch in my locker? Is it in the drawer? Did I steal it from the drawer?

The same questions came up about the ring, and I told the truth about those.

It would be a different computer's job to figure out which I was lying about, the watch or the ring. It would compare the way my brain acted when I responded to those questions versus what my brain did when I responded truthfully to the other questions. Whichever looked more different from the "truthful" brain activity would be considered the signature of deceit.
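
The comparison described above can be sketched in code. The following is a minimal, illustrative sketch only, not the lab's actual analysis pipeline: the function name, the simple mean-difference score, and the example activation numbers are all assumptions made for clarity.

```python
import numpy as np

def flag_likely_lie(truthful_answers, watch_answers, ring_answers):
    """Toy sketch of the comparison described in the article.

    Each argument is a list of activation measurements (one per question)
    from the brain regions being monitored. The topic whose answers deviate
    most from the truthful baseline is flagged as the likely lie. The
    mean-difference score is an illustrative assumption, not the study's
    real statistical method.
    """
    baseline = np.mean(truthful_answers)
    watch_deviation = abs(np.mean(watch_answers) - baseline)
    ring_deviation = abs(np.mean(ring_answers) - baseline)
    return "watch" if watch_deviation > ring_deviation else "ring"

# Made-up numbers: the watch answers sit well above the truthful baseline,
# so they would be flagged as the deceptive topic.
print(flag_likely_lie(
    truthful_answers=[1.0, 1.1, 0.9, 1.0],
    watch_answers=[1.6, 1.8, 1.7],
    ring_answers=[1.0, 1.1, 1.05],
))
```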

Finally, after answering 160 questions over the course of 16 minutes - actually, 80 questions asked twice each - I was done. The machine returned me to the bright light of the scanning room.

The computer's verdict? That would take a few days to produce, since it required a lot of data analysis. I didn't mind waiting. It's not like the result would help get me fired, or lose a lawsuit, or send me to jail.

Nobody in George's studies faced consequences like that, which is one reason the lab results may not apply to real-world situations. George has already begun another study in which volunteers face "a little more jeopardy" from the mock crime. He declined to describe it because he didn't want prospective volunteers to hear about it ahead of time. That work is funded by the Department of Defense Polygraph Institute.

Other questions remain. How would this work on people with brain diseases? Or people taking medications? How would this work on people outside the 18-to-50 age range included in George's recent work?

How about experienced liars? George hopes eventually to study volunteers from prisons.

And then there's the matter of the three people who got away with lying in his recent study. For some reason, the computer failed to identify the object they'd stolen. George says he doesn't know what went wrong.

But in a real-world situation, he said, the person being questioned would go through an exercise like the ring-or-watch task as well as being quizzed about the topic at hand. That way, if the computer failed in the experimental task, it would be obvious that it couldn't judge the person's truthfulness.
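
A rough way to picture that safeguard, under the assumption (not spelled out in the article) that the result is simply withheld whenever the control task fails:

```python
def report_verdict(control_call: str, control_truth: str, main_call: str) -> str:
    """Illustrative sketch of the control-task safeguard described above.

    If the scan cannot correctly recover the known answer to the
    ring-or-watch exercise, no judgment is offered on the real question.
    The function and its all-or-nothing rule are assumptions for clarity.
    """
    if control_call != control_truth:
        return "inconclusive: control task failed"
    return main_call

# Example: the control task was missed, so the main answer is not trusted.
print(report_verdict(control_call="ring", control_truth="watch", main_call="truthful"))
```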

Because of that, George said, he's comfortable with entrepreneur Laken's plans to introduce the scanning service to lawyers, though just on a limited basis, by the middle of this year. Lab studies are obviously necessary, he said, but "at a certain point you really have to start applying and see how it works. And I think we're getting close."

But Jennifer Vendemia, a University of South Carolina researcher who studies deception and the brain, said she finds Laken's timetable premature. So little research has been done on using fMRI for this purpose that it's too soon to make any judgment about how useful it could be, she said.

Without studies to see how well the technique works in other labs - a standard procedure in the scientific world - its reliability might be an issue, said Dr. Sean Spence of the University of Sheffield in England, who also studies fMRI for detecting deception.

Speaking more generally, ethical and legal experts said they were wary of quickly using fMRI for spotting lies.

"What's really scary is if we start implementing this before we know how accurate it really is," Greely said. "People could be sent to jail, people could be sent to the death penalty, people could lose their jobs."

Greely recently called for pre-marketing approval of lie-detection devices in general, like the federal government carries out for medications.

Judy Illes, director of Stanford's program in neuroethics, also has concerns: Could people, including victims of crimes, be coerced into taking an fMRI test? Could it distinguish accurate memories from muddled ones? Could it detect a person who's being misleading without actually lying?

Her worries multiply if fMRI evidence starts showing up in the courtroom. For one thing, unlike the technical data from a polygraph, it can be used to make brain images that look simple and convincing, belying the complexity of the data behind them, she said.

"You show a jury a picture with a nice red spot, that can have a very strong impact in a very rapid way.... We need to understand how juries are going to respond to that information. Will they be open to complex explanations of what the images do and do not mean?"

There's also a philosophical argument in case fMRI works all too well. Greely notes that four Supreme Court justices wrote in 1998 that if polygraphs were reliable enough to use as evidence, they shouldn't be admitted because they would usurp the jury's role of determining the truth. With only four votes, that position doesn't stand as legal precedent, but it's "an interesting straw in the wind" for how fMRI might be received someday, he said.

It didn't take any jury to find the truth in my case.

"We nabbed ya," George said after sending me the results of my scan. "It wasn't a close call."

I was ratted out by the three parts of my brain the technique targets. They'd become more active when I lied about taking the watch than when I truthfully denied taking the ring.

Those areas are involved in juggling the demands of doing several things at once, in thinking about oneself, and in stopping oneself from making a natural response - all things the brain apparently does when it pulls back from blurting the truth and works up a whopper instead, George said.

Of course, nobody is going to make me or anybody else climb into an fMRI scanner every time they want a statement verified. The procedure is too cumbersome to be used so casually, George says.

But he figures that if a perfect lie detector were developed, that practical consideration might not matter. The mere knowledge that one is available, he said, might provoke people to clean up their acts.

"My hope," George said, "would be that it might make the world operate a little bit more openly and honestly."

---

On the Web:

Cephos Corp.: http://www.cephoscorp.com

No Lie MRI, Inc.: http://www.noliemri.com

fMRI information: http://www.radiologyinfo.org/content/functional-mr.htm




#1. To: Zipporah (#0)

"What's really scary is if we start implementing this before we know how accurate it really is," Greely said. "People could be sent to jail, people could be sent to the death penalty, people could lose their jobs."

This is wonderful technology.

Let's apply it to all governments, everywhere.

Immediately.

Lod posted on 2006-01-28 22:08:22 ET

