Can A.I. solve rape cases? To find out, a Cleveland professor programmed a computer to analyze thousands of police reports

CLEVELAND, Ohio – Artificial intelligence is suddenly everywhere: Robots are producing college essays and legal briefs, while machines comb through troves of data to make lightning-fast predictions.

The criminal justice system has been slower to adapt. But a Cleveland State University criminologist is pushing that boundary, demonstrating how machine learning can help solve a notoriously low-tech crime: rape.

In a pair of articles published in the September issue of the Journal of Criminal Justice, professor Rachel Lovell detailed how she used a computer to analyze thousands of incident reports written by Cleveland police officers over two decades in response to sexual assault calls.

Specifically, Lovell programmed the computer to measure officer bias in each report. The algorithms proved so adept at evaluating the language that they could predict which reports led to prosecutions and which ones died.

The implications, while uncertain, are still significant: If a computer can detect a form of writing that can influence the fate of a rape case, perhaps software engineers can develop a program to help officers write reports on the day of a service call.

The results were a shocker to Lovell. She expected the more objective reports — those written with a “just-the-facts” tone — to result in more prosecutions. Instead, she found the opposite: The more subjective reports were the ones that succeeded.

Looking back, Lovell says the results make sense. The matter-of-fact reports failed to convey the brutality of rape, while the subjective reports were rich with personalized details, elicited from the victims.

“The officers are making those reports victim-centric and really capturing the sexual assault,” said Lovell.

On a broader level, the studies demonstrate the limitations of human researchers. Though Lovell had access to thousands of incident reports, reading and coding each one was a near-impossible task.

Enter the computer. Criminal justice researchers are now tapping into big-data concepts more familiar to Silicon Valley and corporate America, discovering patterns imperceptible to the human eye.

“We’ve just scratched the surface,” said Scott Mourtgos, a deputy chief for the Salt Lake City Police Department and doctoral candidate who has published numerous articles on policing and advanced quantitative methods.

In a recent study, Mourtgos used a text-mining application to analyze hundreds of survey responses from U.S. officers about their attitudes on de-policing, sometimes called the Ferguson Effect. De-policing can occur when an officer thinks the public is unfairly critical of the profession, Mourtgos discovered, but not necessarily if an officer thinks the public is merely ignorant of the job’s realities.

Mourtgos is excited by the possibilities A.I. brings to the field.

“There is a mountain of text data available to police departments and other government agencies that’s full of information that hasn’t been looked at,” he said. “Sky’s the limit.”

A pattern of doubts

Back in 2006, an 18-year-old woman exited a bus on Cleveland’s East Side. She was accosted by a man and raped in a backyard.

When she told her story to police, the responding officer seemed to have doubts. The woman “was very vague about the incident,” the officer reported. “…Victim’s clothing not dirty or disheveled.”

Officers administered a rape kit, which was later shelved. The case was soon closed.

In 2013, Cuyahoga County prosecutors launched an initiative to address a rape kit backlog. The 18-year-old’s kit was among thousands in storage sent out for DNA testing.

The following year, police caught a rapist in the act and sampled his DNA. Not only did it match the 18-year-old’s assailant, it also matched DNA from an attack on another woman three weeks later. The suspect turned out to be a convicted sex offender, The Plain Dealer revealed.

RELATED: How to catch a rapist? Study finds Cleveland, Cuyahoga authorities failed to collect DNA from nearly 15,000 suspects over 7 years

By then, Lovell was lead researcher for the county’s rape kit initiative, affording her access to thousands of incident reports tied to sexual assaults in Cleveland. As she perused them, she noticed that many patrol officers seemed to doubt the victim, almost subconsciously.

“Juvenile has had sex in the past,” one report read.

“Victim is a known prostitute and crack cocaine abuser,” read another.

“We observed no bruises, contusions on the female.”

“Victim was unable to keep eye contact, laughed during questioning…obviously being deceptive.”

Lovell got to the 18-year-old’s report: “Victim’s clothing not dirty or disheveled.”

To Lovell, the bias was striking. She wondered if it ever affected cases down the line.

Surprising results

Incident reports, typically written by patrol officers after service calls, are the first records of a crime. Detectives often study them before consulting anyone, and the narratives can sway prosecutors, defense attorneys, judges and juries.

Despite their importance, Lovell calls report-writing “a time-intensive, dreaded task.” She recalls learning the term “three-finger report,” used by a former police chief to describe a narrative block no taller than three fingers.

Sexual assault reports, in particular, are often poorly written, some experts say, partly because officers treat rapes like any other violent crime.

Yet sexual assaults are unique, owing to their personal nature. Rather than fight their rapist, for example, many victims freeze up or dissociate, which affects their memory. Police officers expect victims to behave emotionally following an attack, when the more typical response is detachment.

If patrol officers don’t get the type of information they’re used to — my attacker’s shirt was maroon, for instance — it can feed what experts call rape myths, leading officers to doubt the assault occurred.

“If a victim isn’t forthcoming with details, an officer might be trained to think maybe a crime didn’t happen, but with sexual assault, it’s pretty normal for victims not to want to talk to a stranger about this horrible thing that happened to them,” said Bradley Campbell, a criminologist with the Southern Police Institute at the University of Louisville who studies officer training and sexual assault.

If bias existed in police reports, it most likely existed in rape reports, Lovell reasoned. In 2018, she set out to determine if it could be measured. She had at her disposal more than 5,000 incident reports linked to sexual assault kits, written between 1991 and 2015. She knew which cases were prosecuted and which weren’t.


“There is a massive distrust of victims among police and the system as a whole,” said Cleveland State University criminologist Rachel Lovell. “Victims expect the system to blame them. They often already blame themselves.” Photo: John J. Lawrence

At the time, Lovell was familiar with a machine-learning technique called text mining, which businesses use to analyze customer-satisfaction data.

“Here’s all this technology that’s transforming the way we work, and police haven’t updated how they fill out reports since the days of typewriters,” Lovell said.

The professor assembled a team of social and computer scientists who fed the incident reports — totaling 4 million words — into a computer system. The team instructed the machine to measure variables like subjectivity and asked it to identify relevant three-word clusters, such as “unlawful sexual conduct.”
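
Lovell’s papers do not publish the team’s code, but the two measurements described above map onto standard text-mining tools. Here is a minimal sketch in Python, assuming TextBlob for subjectivity scoring and scikit-learn for extracting three-word clusters; both library choices are illustrative stand-ins, not the team’s actual pipeline:

```python
# A sketch of the two measurements described above: subjectivity scoring
# and three-word cluster (trigram) extraction. TextBlob and scikit-learn
# are assumptions; the studies don't name the tooling used.
from textblob import TextBlob
from sklearn.feature_extraction.text import CountVectorizer

reports = [
    "Doe explains that she is scared to death of the named suspect.",
    "Victim was very vague about the incident. Clothing not dirty or disheveled.",
]

# Subjectivity runs from 0.0 (just-the-facts) to 1.0 (highly subjective).
subjectivity_scores = [TextBlob(text).sentiment.subjectivity for text in reports]

# Count every three-word cluster across the corpus, the kind of feature
# that surfaced phrases like "unlawful sexual conduct."
vectorizer = CountVectorizer(ngram_range=(3, 3))
trigram_counts = vectorizer.fit_transform(reports)

for phrase, count in zip(vectorizer.get_feature_names_out(),
                         trigram_counts.sum(axis=0).tolist()[0]):
    print(f"{phrase}: {count}")
```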

To the team’s surprise, the computer revealed that the more subjective reports — “Doe explains that she is scared to death of the named suspect,” one might say — led to more prosecutions.

It also linked certain word clusters to successes and failures. The reports forwarded to prosecutors tended to cite assault and rape statutes — “issued for rape” or “Ohio Revised Code,” for example. They also described action steps, like “male was charged” or “victim comes forward.”

Dead-end reports hinted at victim inaction, like “victim did not” or “did not wish,” and they used closing language like “insufficient evidence” or “no further leads.”

They were also shorter by 61 words, on average. Sometimes officers express bias by saying little, Lovell reasoned.
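
In machine-learning terms, the prediction the studies describe is a text-classification problem: given a narrative’s language, guess whether the case was forwarded to prosecutors. A minimal sketch, again with illustrative tooling (scikit-learn) and toy data rather than Lovell’s actual pipeline:

```python
# A toy classifier in the spirit of the studies: learn which language
# separates forwarded cases from dead ends, then score a new report.
# The model choice, features and example narratives are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

narratives = [
    "Issued for rape per Ohio Revised Code. Male was charged.",
    "Victim did not wish to pursue. Insufficient evidence, no further leads.",
]
forwarded = [1, 0]  # 1 = sent to prosecutors, 0 = case died

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 3)),  # single words up to three-word clusters
    LogisticRegression(),
)
model.fit(narratives, forwarded)

# Estimate the probability that a new report leads to prosecution.
new_report = ["No further leads. Case closed, insufficient evidence."]
print(model.predict_proba(new_report)[0][1])
```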

Glimpsing the future

A.I. is not completely new to law enforcement, which has used it for surveillance tools like facial recognition, social media monitoring and gunshot detection. But Lovell’s research represents a new frontier among criminologists, who have begun analyzing data at scale to study things like domestic violence, terrorism and crime hotspots.

In practice, mainstream adoption of A.I. is still a decade away for police, according to Andrew Wheeler, a North Carolina-based data scientist and police consultant who has used algorithms to predict crime across jurisdictions and identify problematic behaviors by cops in the field.

“There are maybe 18,000 police departments in the U.S., so uptake will be uneven, slow and idiosyncratic,” he said, noting that many public agencies lack resources for in-house data scientists. (Cleveland police are planning to hire several new crime analysts, city officials have said.)

In the meantime, Lovell said her studies establish the importance of incident reports, particularly in sexual assault cases, which are saddled with extremely low conviction rates.

Patrol officers, said Lovell, should embrace the subjective perspective of victims and use descriptive words reflecting the reality of rape. If a victim doesn’t remember certain things, an officer should ask what she does remember — what she was thinking, hearing, smelling, even tasting.

“A rape report should not be written using the same tone as a report of a stolen bicycle,” Lovell said.

Her research arrives as law enforcement leaders are showing greater awareness of bias in gender-violence cases. The Department of Justice and the International Association of Chiefs of Police have produced guidelines on adopting a trauma-informed approach, and the DOJ has acknowledged officer misconceptions about victims.

In the future, Lovell believes A.I. could help patrol officers by flagging biased language in reports or prompting them on rape statutes, a form of self-monitoring that is “pretty promising,” said Wheeler, the data scientist from North Carolina. Recently, Lovell developed a pilot glossary of rape-specific words and phrases, including those that signal bias, based on her findings.
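
What might such a flagging tool look like in its simplest form? A hypothetical sketch that checks a draft narrative against a small glossary of doubt-signaling phrases; the phrases below echo report language quoted in this article, and the glossary and function are illustrations, not Lovell’s pilot version:

```python
# A hypothetical report flagger: scan a draft narrative for phrases
# associated with doubting the victim. The glossary entries echo
# report excerpts quoted in this article; the tool itself is a sketch.
BIAS_PHRASES = [
    "obviously being deceptive",
    "known prostitute",
    "not dirty or disheveled",
    "did not wish",
]

def flag_bias(draft: str) -> list[str]:
    """Return any glossary phrases that appear in the draft."""
    lowered = draft.lower()
    return [phrase for phrase in BIAS_PHRASES if phrase in lowered]

draft = "Victim's clothing not dirty or disheveled. Case was closed."
for phrase in flag_bias(draft):
    print(f"Flagged: '{phrase}' may signal doubt of the victim.")
```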

Ultimately, a ChatGPT-like robot could help draft parts of an incident report, saving hours of time. Rather than depersonalizing the process, Lovell believes it would do the opposite: Victims would know that nothing is left to chance, helping them stay engaged throughout the investigation.
