The Human Element: Addressing Bias in Forensic Analysis

Laolu Korede
4 min read · Sep 28, 2023


Imagine you’re a detective trying to solve a big puzzle. This puzzle is a crime, and you need to find clues to figure out what happened. That’s what forensic science is all about — finding clues to solve crimes.

But here’s the thing: sometimes, the people who look for clues can make mistakes because they have certain thoughts or feelings that are not quite fair. Let me explain it in a simpler way.

Thinking Shortcuts: When detectives and scientists look for clues, they may take mental shortcuts. It’s like when you see your favorite color and start noticing it everywhere: they may look only for clues that fit what they already believe. If they think a certain person did it, they might focus on the clues that make that person look guilty. For example, a forensic investigator is called to the scene of a crime where a young woman has been found murdered. A police officer tells the investigator that the suspect is a young Black man. As the investigator collects evidence, they may be more likely to focus on evidence that supports the suspect’s guilt, such as fingerprints or DNA found at the scene, and more likely to ignore evidence that contradicts it, such as an eyewitness who saw someone else leaving the scene. In forensic terms, this is called cognitive bias.

Only Seeing What They Want: Imagine you have a favorite toy and really want to play with it; you might ignore the other toys. In the same way, detectives and forensic scientists can sometimes see only the clues that agree with what they believe and ignore the ones that say something different. This is called confirmation bias. In the Central Park Five case, five young Black men were wrongfully convicted in 1989 of raping a white woman in Central Park, based on coerced confessions and a flawed investigation. The police and prosecution had a preconceived idea of who the perpetrators were and focused their investigation on these individuals, disregarding evidence that pointed to other suspects. The investigators’ confirmation bias led to the wrongful conviction of the Central Park Five, who were later exonerated by DNA evidence. The case highlights why confirmation bias must be recognized and addressed in every part of the criminal justice system, including forensic analysis, so that justice is served and innocent people are not wrongly convicted.
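To see how confirmation bias can tilt a conclusion, here is a deliberately simplified toy simulation. The numbers and weights are invented purely for illustration and do not model any real forensic procedure; the point is only that when confirming clues count more than contradicting ones, the same evidence produces a different conclusion:

```python
# Toy illustration of confirmation bias as unequal weighting of evidence.
# All numbers are invented for demonstration; this is not a real forensic model.

def update_belief(belief, evidence_supports, weight):
    """Nudge a 0-to-1 'guilt belief' up or down by `weight`, clamped to [0, 1]."""
    delta = weight if evidence_supports else -weight
    return min(1.0, max(0.0, belief + delta))

# Alternating supporting / contradicting clues: the evidence is balanced.
evidence = [True, False, True, False]

# An unbiased analyst weights both kinds of evidence equally.
fair = 0.5
for supports in evidence:
    fair = update_belief(fair, supports, weight=0.1)

# A biased analyst over-weights confirming clues and largely ignores the rest.
biased = 0.5
for supports in evidence:
    weight = 0.1 if supports else 0.02  # contradicting evidence barely counts
    biased = update_belief(biased, supports, weight)

# Same balanced evidence, different conclusions: `fair` returns to 0.5,
# while `biased` has drifted toward guilt.
```

The evidence list is perfectly balanced, yet the biased analyst ends up more convinced of guilt simply because disconfirming clues were under-weighted.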

The Way Things Look: Sometimes the way things look can trick forensic analysts, like when you think your teddy bear is a monster in the dark, but it’s not. Analysts can see clues differently because of what they know about the people involved; knowing someone’s religion or where they come from can affect how they read the evidence. This is called contextual bias. In 2004, Brandon Mayfield, an American lawyer, was wrongly arrested and accused of involvement in the Madrid train bombings after the FBI mistakenly concluded that he was the source of a fingerprint found at the crime scene. The FBI had access to additional contextual information that may have biased its analysis, including the fact that Mayfield was a Muslim man, and that knowledge may have made examiners more likely to see his fingerprint in the crime-scene print. Another example of contextual bias is the case of Amanda Knox, an American student convicted in Italy of the 2007 murder of her roommate, Meredith Kercher. Knox was eventually acquitted on appeal, but she spent four years in prison. Some experts believe her conviction was influenced by contextual bias, such as the fact that she was a young, attractive woman from a wealthy family, and by the Italian media, which played a significant role in the case and often portrayed her in a negative light.

Influencing Others: This one is also called experimenter bias. Have you ever played a game where you want a friend to guess what you’re thinking, and you give them hints without realizing it? Scientists can do that too. If they want a certain result, they might accidentally give hints to the people helping them, and that can change the outcome. For example, if a scientist expects a certain outcome, they may unintentionally communicate that expectation to a participant, leading the participant to behave in a way that confirms the scientist’s hypothesis.

Now, let’s talk about how we can make sure detectives and scientists do a fair job:

Teaching Them About Bias: We can teach them about these thinking shortcuts and how they can lead to mistakes, just like when you learn about being fair and not taking sides in a game.

Making Rules: We can have rules for how detectives and scientists look for clues, like playing a game with rules so everyone has a fair chance. Objectivity is a cornerstone of forensic analysis, aimed at providing legal decision-makers with unbiased and accurate results.

Not Telling Them Everything: Sometimes, we can keep some information secret from detectives and scientists. This way, they won’t be influenced by what they know about the people involved.
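As a loose sketch of this idea, imagine a filter that strips task-irrelevant context from a case file before it reaches the analyst. The field names and structure below are hypothetical, not taken from any real case-management system; real blinding procedures are far more involved:

```python
# Minimal sketch of "blinding" in a forensic workflow.
# All field names here are hypothetical, invented for illustration.

# Information the analyst actually needs to perform the comparison.
TASK_RELEVANT = {"evidence_id", "evidence_type", "sample_data"}

def redact_for_analyst(case_file):
    """Return only task-relevant fields, hiding potentially biasing context
    such as suspect descriptions or an investigator's theory of the case."""
    return {k: v for k, v in case_file.items() if k in TASK_RELEVANT}

case_file = {
    "evidence_id": "E-1042",
    "evidence_type": "latent_fingerprint",
    "sample_data": "scan-ref-0042",
    "suspect_description": "young man, prior record",  # biasing context
    "detective_theory": "suspect was at the scene",    # biasing context
}

blinded = redact_for_analyst(case_file)
# The analyst now sees the print itself, not the theory about who left it.
```

The design choice is simple: decide in advance which fields are needed for the analytical task, and withhold everything else until the analysis is documented.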

Getting Help from Others: It’s like a routine check where someone else, ideally an independent body, makes sure you’re being fair. We can do the same with detectives and scientists: have someone else verify their work to make sure bias isn’t leading them into mistakes.
