The following facts were noted in Farwell’s report. He followed the standard procedures, preparing to record the results for law enforcement. Before the test, Grinder was given a pretest in which details of the crime that he had already described were flashed on a computer screen. He was instructed to press one button when a designated “target” stimulus appeared and a different button when anything else appeared, a category that included both the probe stimuli and irrelevant information. If he was guilty, the probe stimuli should evoke the same brain response as the target stimuli.
The test itself was divided into blocks of twenty-four stimuli, each of which was presented three separate times. Grinder’s responses were graphed and calculated in such a way that his responses to “relevant” stimuli could be mathematically compared with his responses to “irrelevant” stimuli. He participated in seven separate tests with five different sets of probes, sitting in a chair in front of the screen in his orange prison jumpsuit and the sensor-equipped helmet that measured activity over the parietal, frontal, and central regions of his brain. If the test proved that Grinder was the perpetrator, and the testing was allowed into court (something that was not yet known), he faced a capital conviction and a possible death sentence.
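The comparison at the heart of the analysis, whether responses to probes look more like responses to targets (“information present”) or more like responses to irrelevants (“information absent”), can be illustrated with a toy sketch. The waveforms below are synthetic numbers, not real EEG, and the bootstrap-correlation rule is a simplification of the published method:

```python
import random
import statistics

def correlate(a, b):
    """Pearson correlation between two equal-length waveforms."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def resample_average(epochs, rng):
    """Average a with-replacement bootstrap resample of single-trial waveforms."""
    picks = [rng.choice(epochs) for _ in epochs]
    return [statistics.mean(samples) for samples in zip(*picks)]

def information_present_confidence(probe, target, irrelevant, n_boot=500, seed=1):
    """Fraction of bootstrap resamples in which the probe average looks
    more like the target average than like the irrelevant average."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_boot):
        p = resample_average(probe, rng)
        if correlate(p, resample_average(target, rng)) > \
           correlate(p, resample_average(irrelevant, rng)):
            hits += 1
    return hits / n_boot

# Synthetic single-trial waveforms: targets and (for a guilty subject)
# probes carry a P300-like bump; irrelevants are noise around a flat line.
rng = random.Random(42)
bump, flat = [0, 1, 3, 6, 3, 1, 0], [0] * 7

def noisy(base):
    return [v + rng.gauss(0, 0.5) for v in base]

target = [noisy(bump) for _ in range(20)]
probe = [noisy(bump) for _ in range(20)]
irrelevant = [noisy(flat) for _ in range(20)]

confidence = information_present_confidence(probe, target, irrelevant)
```

With these synthetic “guilty” probes, nearly every resample favors “information present”; for an innocent subject, whose probes would look like the irrelevants, the fraction would hover near chance.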
After forty-five minutes, the results seemed clear: the computer analysis showed “information present” for the probe stimuli, at a computed statistical confidence of 99.9 percent. There was no question about Grinder’s guilt: he had quite specific concealed knowledge about the crime. “What his brain said,” Farwell told reporters of the Fairfield Ledger, “was that he was guilty. He had critical detailed information only the killer would have. The record of the murder of Julie Helton was stored fifteen years ago when he committed the murder.” As far as Farwell was concerned, he had proven his theory: the record of a crime can be stored indefinitely in the brain of the perpetrator, and technology can detect it, thus excluding other suspects. “We can use this technology to put serial killers in prison where they belong.”
Six days after submitting to brain fingerprinting, Grinder was in court to plead guilty. He was then taken to Arkansas to face a hearing. Sheriff Dawson acknowledged in a letter that the evidence provided by the test had been instrumental in obtaining the confession and guilty plea.
Arkansas authorities accompanied Grinder from Missouri so he could point out where he had murdered Cynthia. He showed them an area in the Ozark National Forest, but it was difficult to find anything so many years after the crime. For his guilty plea in Arkansas, he was sentenced to life in prison.
Brain Science
Clearly, brain fingerprinting is potentially revolutionary, although critics insist it needs more testing. Farwell has since been involved in other criminal cases, one with positive results for the future of the procedure, one without. In both cases, however, the procedure was not used in a jury trial but only during an appeal. At this writing, brain fingerprinting has yet to be allowed into court.
The issue of the admissibility of new technologies purporting to be science is complex, dating back to 1923, when the District of Columbia Court of Appeals issued the first guidelines. In Frye v. United States, the defense counsel wanted to enter evidence about a device that measured the blood pressure of a person who was lying. The court had to scramble to figure out what to do, as there were no guidelines yet in place, and more and more attorneys were looking for scientific evidence to help convict or exonerate a defendant. The judge decided that the “thing” from which such testimony is deduced must be “sufficiently established to have gained general acceptance in the particular field in which it belongs.” In addition, the information offered had to be beyond the general knowledge of the jury.
This Frye standard became general practice in most courts for many years, vague though it was. Over the decades, critics claimed that it excluded theories that were unusual but nevertheless well supported. Each attempt to revise the Frye standard had its own problems.
In some jurisdictions now, including the federal courts, the Frye standard has been replaced by the standard set out in the Supreme Court’s 1993 decision in Daubert v. Merrell Dow Pharmaceuticals, Inc., which emphasizes the trial judge’s responsibility as a gatekeeper. The Court decided that “scientific” means grounded in the methods and procedures of science, and that “knowledge” is more reliable than gut instinct or subjective belief. The judge’s evaluation should focus on the methodology, not the conclusion, and on whether the evidence so deduced applies to the case. In other words, when scientific testimony is presented, judges have to determine whether the theory can be tested, whether the potential error rate is known, whether it has been reviewed by peers, whether it has attracted widespread acceptance within a relevant scientific community, and whether the opinion is relevant to the issue in dispute.
Many attorneys look to these guidelines to distinguish between “junk science” and work performed with controls, scientific methodology, and appropriate precautions. But some guidelines are slippery, particularly when it comes to examining a suspect’s state of mind at the time of the crime. Subjective interpretations are the norm: a clinical practitioner uses a battery of objective assessments to examine the defendant’s background and activities just before the crime. The problem is that equally qualified practitioners can derive opposing conclusions from the same tests and observations, so state of mind at the time of the crime often comes down to whom the jury believes. There are certainly records of cases in which a defendant ably duped a practitioner about his ability to commit the crime in question. If Farwell’s invention can deliver on its promise, it could reduce reliance on subjective evaluation and perhaps increase the accuracy of assessments, at least as far as guilt is concerned.
“I have every reason to believe it will be viewed the same as DNA,” Farwell states. “We’re not reading minds, just detecting the presence or absence of specific information about a specific crime.” He believes that in the future, his device will dramatically alter the way suspects are interrogated. He views it as a way to reduce the number of false confessions and convictions that postconviction DNA testing has revealed over the past decade.
In November 2000, in an appeal for postconviction relief, an Iowa district court held an eight-hour hearing on the admissibility of this technology. The accused, imprisoned for twenty-two years, submitted to brain fingerprinting to try to prove he was innocent of committing a murder, and he passed. The MERMERs supported his alibi but not his participation in the crime. He then sought to have his conviction overturned, but the court said that the results of the test would not have affected the verdict. Yet, in the process, District Judge Timothy O’Grady ruled that the P300 theory met the Daubert standard for admissible scientific evidence. But when the key witness heard about the test results, he admitted he had lied in the original trial, so the convicted man appealed to the Iowa Supreme Court and was freed, based on a legal technicality and lack of evidence. While brain fingerprinting received a legal stamp of approval in this case, it has not yet been truly tested in the legal system.
As of early 2008, with more than 170 tests performed (80 of which were real-life situations as opposed to laboratory assessments), brain fingerprinting has proven reliable and accurate. There was not a single error in either “information present” or “information absent” determinations. The CIA has given a generous grant for the work to continue. “I have high statistical confidence in it,” Farwell said.
One flaw, also relevant to the use of fingerprints, has been the inability of the test to distinguish between a person who was at a crime scene but did not commit the crime and one who was in the same place and is guilty.
Also, Farwell does not deal with memory research that demonstrates that memory is not a “storage tank” but an actively constructed process that can even result in false memories that are qualitatively as vivid as actual memories. More research must be done to accommodate issues such as age, substance abuse, stress, and memory disorders, all of which can affect the memory of a criminal. In addition, the subjective nature of the way investigators put together a case for test development—the basis for the probe stimuli—makes some scientists question its reliability.
In reports, Farwell says, “It is inevitable that the brain will take its rightful place as a central facet of criminal investigations.” While he’s probably correct in this prediction, there is still cause for concern. Brain fingerprinting might help to convict the guilty and exonerate the innocent, but its use might reinforce the use of other types of neurological testing that may appear to mitigate guilt. It could also force a reexamination of so many cases that this revolution could shake up the entire justice system. Whether we’re ready or not, scientific measurements, even ambiguous ones, will be proposed in future criminal cases.
My Brain Made Me Do It
Philosophers and many proponents of cognitive psychology hold that moral judgments are within our control, and thus people who choose to commit crimes, barring delusions, know what they are doing and that it is wrong. The legal system depends on this notion. However, recent research suggests that damage to an area of the brain just behind the eyes can transform the way people make moral decisions. The results indicate that the ventromedial prefrontal cortex, implicated in the feeling of compassion, may be the foundation for moral regulations, assisting us in inhibiting (or not) harmful treatment of others. Failure in its development, or damage to it, might alter the way a person perceives the moral landscape, which will thus affect his or her actions. If juries include information of this kind in their deliberations, it could mitigate the harshness of the sentences they impose on convicted criminals. While more research must be done, other types of brain scans are being entered as evidence in the trials of some heinous crimes to show that the perpetrator could not help what he did.
Stephen Stanko kidnapped a young woman and served nearly a decade in prison for his crime. While he was incarcerated, he collaborated with two professors on a book about his experience. After his release, he got involved with a librarian, whom he later murdered before raping her teenage daughter. He also killed a seventy-four-year-old man, and went on the run, stopping here and there to socialize in bars. He was soon captured, already courting his next victim. Despite his vow to be a good citizen, he had proven only that he was a dangerous psychopath. That meant he lacked empathy for others, had a strong tendency toward self-interest, was able to charm and con others, and felt no remorse for his actions. He used people up for his own purposes and committed more crimes without inhibition.
During his trial, the prosecutor said Stanko was a man who lacked all remorse for his actions. He was a psychopath who knew what he was doing when he did it but felt no remorse afterward. The defense attorney, William Diggs, wanted to prove that Stanko had no control over his actions, the result of a brain defect, and was therefore insane. He hired Dr. Thomas H. Sachy, founder of Georgia Pain and Behavioral Medicine, to test Stanko. Sachy scanned Stanko’s brain with a PET-scan machine, and testified that it “showed decreased function in the medial orbital frontal lobes.” He explained to the jury that one region of the brain directly above the eyes and behind the eyebrows did not function normally, and “this [is the] area of the brain that essentially makes us human.”
Although the jury rejected Dr. Sachy’s notion and convicted Stanko, the age of neuroimaging had arrived. Defense experts will keep improving both their testing and their testimony until someone, somewhere, convinces a jury that psychopaths can no more help what they do than psychotic people can; both groups are mentally ill, and both would then be accorded the same freedom from criminal culpability.
Another effort to employ brain scans for deception detection comes out of Germany, from the Berlin Bernstein Center for Computational Neuroscience. As reported in 2007, people tested in MRI machines performed simple tasks while scientists attempted to “read” their intentions before those intentions became actions. Because these machines can identify different types of brain activity and link them to certain brain states or behaviors, the scientists believed they could distinguish and accurately predict what a person would decide to do. Thus far, the accuracy rate on a small sample of subjects has been just over 70 percent, better than chance. As the tests are refined, that accuracy is expected to improve.
Researchers at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, are also involved in making such predictions. Subjects were asked to decide whether to add or subtract numbers before the numbers were shown on a computer screen. Brain patterns indicated that these processes were different. Bursts of activity in the prefrontal cortex—“thought signatures”—helped researchers predict results. While the decisions involved were simple, which means the claims for this science must remain limited, more disturbing possibilities are on the horizon. Scientists might one day be able to tell, without a person’s consent, what he or she is thinking or feeling. This may assist in criminal investigations, but it might also have unpleasant repercussions at other levels of society.
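In miniature, the kind of pattern classification behind these predictions can be sketched as follows. The three-number “activity patterns” are synthetic stand-ins for scan data, the two class signatures are invented for illustration, and a nearest-centroid rule is only one of many decoders such studies use:

```python
import random

def centroid(vectors):
    """Mean feature vector for one class of training patterns."""
    return [sum(vals) / len(vals) for vals in zip(*vectors)]

def predict(pattern, centroids):
    """Label the pattern with the class whose centroid is nearest."""
    def dist(label):
        return sum((x - y) ** 2 for x, y in zip(pattern, centroids[label]))
    return min(centroids, key=dist)

rng = random.Random(0)

def sample(base):
    """A noisy 'activity pattern' scattered around a class signature."""
    return [v + rng.gauss(0, 0.4) for v in base]

# Hypothetical signatures for the two intentions: adding vs. subtracting.
add_base, sub_base = [1.0, 0.2, 0.8], [0.2, 1.0, 0.3]
centroids = {"add": centroid([sample(add_base) for _ in range(30)]),
             "subtract": centroid([sample(sub_base) for _ in range(30)])}

# Fresh trials: predict each intention from the pattern alone.
trials = [(sample(add_base), "add") for _ in range(50)] + \
         [(sample(sub_base), "subtract") for _ in range(50)]
accuracy = sum(predict(p, centroids) == truth for p, truth in trials) / len(trials)
```

The decoder only has to beat chance (50 percent here) to claim it is “reading” the decision; how far above chance it lands depends on how noisy and how separable the patterns are, which is exactly what limits the real studies.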
Our final case involves many areas of forensic science and psychology. Such coordinated teamwork is certainly the wave of the future. While a collective effort may take longer to find the truth than the use of a brain scan, especially when evidence and testimony are tenuous, such work can pay off.
Sources
Akin, David. “Brain Wave: A Test That Can Detect Whether Someone Has Seen Something Before Is Being Promoted as a Tool to Screen for Terrorists.” Globe and Mail, November 3, 2001.