AI Study Shows What Phrasing Can Tank Rape Prosecutions

An analysis of police reports found biased language led to more convictions
By Gina Carey, Newser Staff
Posted Oct 8, 2023 12:00 PM CDT
(Photo: Getty / Motortion)

While AI is often in the news for how it might destroy humankind (or at the very least, take our jobs), it also has potential for good, like helping solve crimes. Per Cleveland.com, criminologist Rachel Lovell of Cleveland State University published two articles in the Journal of Criminal Justice demonstrating how she used machine learning to analyze 20 years of sexual assault reports written by Cleveland police. Lovell trained AI to examine those reports for bias, then to predict which ones led to prosecutions. She was surprised to find that officers whose writing showed bias, rather than sticking objectively to the facts, ended up having their cases solved more often.
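The articles don't walk through Lovell's model, but a minimal sketch of the general approach, training a text classifier on report narratives labeled by case outcome and then inspecting which phrases predict failure, might look like the following. This is a hypothetical illustration in Python using scikit-learn's TfidfVectorizer and LogisticRegression; the sample reports, labels, and pipeline are placeholders, not her actual method.

    # Hypothetical sketch: predict prosecution outcome from report text.
    # This is NOT Lovell's actual pipeline; data and labels are placeholders.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Placeholder data: the narrative text of each incident report, and
    # whether the case was ultimately prosecuted (1) or not (0).
    reports = [
        "Victim reported assault; detailed account of events taken at scene.",
        "Complainant uncooperative; insufficient evidence; no further leads.",
    ]
    prosecuted = [1, 0]

    # Word and two-word-phrase features feeding a simple linear classifier,
    # so phrase weights can later be read off as potential bias cues.
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=1),
        LogisticRegression(max_iter=1000),
    )
    model.fit(reports, prosecuted)

    # List the phrases that push predictions toward "not prosecuted";
    # on a real corpus, wording like "insufficient evidence" or
    # "victim did not" would be expected to surface here.
    vec = model.named_steps["tfidfvectorizer"]
    clf = model.named_steps["logisticregression"]
    for phrase, weight in sorted(
        zip(vec.get_feature_names_out(), clf.coef_[0]), key=lambda p: p[1]
    )[:5]:
        print(f"{phrase!r}: {weight:+.3f}")

With only two toy examples the weights are meaningless; the point is the shape of the technique, which scales to a corpus the size of Lovell's 20 years of reports.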

"The officers are making those reports victim-centric and really capturing the sexual assault," Lovell said. She concluded that officers who captured the trauma of rape included more detail, which ultimately helped the cases. Incident reports, Cleveland.com writes, can sway the detectives, prosecutors, and judges who pick up the cases. Reports that weren't successful were typically shorter and included phrasing like "insufficient evidence" or "no further leads," as well as wording that suggested victims were not active, such as "victim did not" or "did not wish." Lovell created a glossary to help choose wording and argues that an AI program can help write incident reports that yield better conviction rates. "A rape report should not be written using the same tone as a report of a stolen bicycle," she said.

Police departments are already using AI in different ways. Per Esquire, technology on display at a police conference in Dubai far exceeded the capabilities of facial recognition software, including sentiment analysis software to interpret moods during interrogations and a Segway outfitted with a machine gun. But such technology consistently shows bias against minorities and, given the lack of diversity in STEM jobs, "is apt to generate more built-in biases against people of color, the same people who are overpoliced and underprotected," Lovell said. (AI may also play a big role in detecting breast cancer.)
