
Police Agencies Sued Over Predictive Policing Programs


Predictive policing programs have been a tool at the disposal of law enforcement agencies nationwide. But the practice has come under legal assault in America’s largest cities.

The strategy uses computer programs that predict where crimes will occur and who will commit them. The data have been helpful to agencies deploying CompStat and similar methodologies. However, the programs are now under fire in legal cases nationwide.

Predictive policing uses algorithms to crunch data and create lists of people and neighborhoods for officers to target.

The largest departments — New York, Chicago and Los Angeles — are all being sued for not releasing information about their “predictive policing” programs, reported the Los Angeles Times.

Some smaller departments also have been brought to court and before public records agencies.

The groups that filed the lawsuits say they want the departments to release the information so they can determine whether the programs are fair, according to wlsam.com.

Police say some information about the programs is proprietary, and they also have privacy and safety concerns over data already in the systems.

Queens officers received the CompStat Award for their rigorous investigative work and clever use of NYPD technology while responding to a fatal shooting, 2017. (NYPD)

A top concern among opponents is that the computer programs perpetuate the problem of minorities being arrested at higher rates than whites. If arrest and crime location data that show such biases are fed into the algorithms, they argue, police will continue targeting minorities and minority neighborhoods at higher rates.
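To make that feedback-loop argument concrete, the following minimal sketch uses invented numbers, not data from any real program, to show how it could play out if a model's only input is where crime was previously recorded and recorded crime depends in part on where officers are sent.

```python
# Hypothetical illustration of the feedback loop critics describe.
# The numbers are invented; they only show the mechanism.

recorded = {"A": 120, "B": 80}     # year-0 recorded incidents (reflects past patrols)
true_rate = {"A": 100, "B": 100}   # assume equal underlying crime for the illustration

for year in range(1, 6):
    total = sum(recorded.values())
    # "Predictive" allocation: patrol each neighborhood in proportion to last year's records.
    patrol_share = {n: c / total for n, c in recorded.items()}
    # Recorded crime is roughly (true crime) x (chance an officer is present to record it).
    recorded = {n: true_rate[n] * 2 * patrol_share[n] for n in recorded}
    print(year, {n: round(v) for n, v in recorded.items()})
```

Under these assumptions, the neighborhood that starts with more recorded crime keeps drawing more patrols year after year even though the underlying rates were set equal; that self-reinforcement, rather than any single year's numbers, is what critics say they worry about.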

However, that criticism overlooks the fact that the officers present in those same areas are also there protecting minority residents.

Several groups and organizations have taken police agencies to court in an effort to find out what data is being fed into the programs, how the algorithms work and exactly what the end results are, including which people and areas are on the lists and how police are using the data, reported Dave Collins of the Associated Press.

“Everybody is trying to find out how it works, if it’s fair,” said Jay Stanley, a senior policy analyst for the American Civil Liberties Union. “This is all pretty new. This is all experimental. And there are reasons to think this is discriminatory in many ways.”

The programs are developed by private companies such as Palantir and PredPol and can tell police where and when crimes are likely to occur by analyzing years of crime location data. Other, more heavily criticized programs produce lists of likely criminals and victims based on people’s criminal history, age, gang affiliation and other factors.
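As a rough illustration of the first, place-based kind of program, the sketch below simply counts historical incidents per map grid cell and flags the highest-count cells as hot spots for extra patrol. It is a deliberately simplified stand-in, not how Palantir’s or PredPol’s products actually work, and the coordinates are invented for the example.

```python
from collections import Counter

# Hypothetical historical records: (latitude, longitude) of past incidents.
incidents = [
    (34.0522, -118.2437), (34.0525, -118.2440), (34.0610, -118.2300),
    (34.0523, -118.2439), (34.0612, -118.2301), (34.0700, -118.2500),
]

def grid_cell(lat, lon, cell_size=0.001):
    """Snap a coordinate to a coarse grid cell (roughly 100 m on a side)."""
    return (round(lat / cell_size), round(lon / cell_size))

counts = Counter(grid_cell(lat, lon) for lat, lon in incidents)

# Flag the cells with the most past incidents as "hot spots" to prioritize for patrol.
for cell, n in counts.most_common(2):
    print(f"cell {cell}: {n} past incidents")
```

The person-based lists work on a different principle, scoring individuals on factors such as criminal history and gang affiliation, and that is the part of the practice that has drawn the sharpest criticism.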

Some cities are spending hundreds of thousands of dollars, even millions, on predictive policing programs, with many of the costs covered by state and federal law enforcement grants. Several dozen U.S. police departments use some form of predictive policing, and more than a hundred others are considering or planning to start such programs, according to counts and estimates compiled by various groups, Collins reported.

Police officials say they can’t release some information about their predictive programs because of citizen privacy and safety concerns. Moreover, some data is proprietary. Regardless of the criticism, the programs are helping to reduce crime and better deploy officers in a time of declining budgets and staffing, they argue.

Some studies have arrived at conflicting conclusions about whether predictive policing is effective or biased, but there has not been definitive research yet, experts say.

Critics say they’ve already seen what they believe is evidence of biases in predictive policing, including increased arrests in neighborhoods heavily populated by blacks and Latinos and people on computer-generated lists being repeatedly harassed by police.

Mariella Saba believes predictive policing has labeled her Los Angeles neighborhood, Rose Hill, as a crime hot spot because of the heavy law enforcement activity she has seen there. Friends and neighbors, many of them Latino, have been stopped by police multiple times, she claimed.

One friend, Pedro Echeverria, was shot three times by a police officer last year but survived. Prosecutors ruled the shooting justified, saying Echeverria had a gun and fought with officers. Police said they decided to stop him as he was walking on a street because he was in Rose Hill, a “known hangout” for gang members, according to a prosecutor’s report.

“It’s traumatic. It creates trauma,” said Saba, 30, of the increased police activity. “I know better to never normalize this or see this as normal. I’m about to burst.”

Saba said she can’t be certain whether Rose Hill is the subject of predictive policing because police won’t release that information. A group she co-founded, the Stop LAPD Spying Coalition, sued the police department in February seeking data about its program.

LAPD gang investigator. (Photo courtesy Chris Yarzab)

The LAPD has released some data to the group but hasn’t handed over other information, including copies of “chronic offender bulletins” that list people of interest to police. The lawsuit is pending.

The LAPD can’t release some information because of concerns about citizens’ privacy, and other data sought by Saba’s group doesn’t exist, said Josh Rubinstein, a police spokesman.

“We’re not trying to dodge anything,” he said. “They’re making assumptions about what we’re doing that aren’t true.”

The LAPD uses a data-mining program developed by Palantir Technologies, which was co-founded by tech financier and PayPal co-founder Peter Thiel with backing from an investment arm of the CIA. The company has helped the military in Iraq and Afghanistan.

Challenges in other cities, according to Collins:

— In Hartford, police are facing a complaint by the Connecticut ACLU to the state public records commission for not releasing information about analytical software for the city’s surveillance camera system that officials say will help predict crime and capture suspects.

— Journalists sued Chicago last year in an effort to get information on what data goes into its so-called “heat list,” which ranks certain people on how likely they are to become perpetrators or victims of crime. The case remains pending.

“People are rightfully skeptical of the government using computers to predict who’s going to commit a crime,” said Matthew Topic, a lawyer for the journalists. “Maybe this heat list is a legitimate tool. Maybe it could be used better. The whole point of having transparency laws is we, as the public, get to second-guess everything government does.”

— A judge in December ordered New York City police to release records about its predictive policing tools after officials declined to disclose documents requested by the Brennan Center for Justice at New York University School of Law. The center is seeking information about the department’s use of Palantir’s products and other records.

— Kentrell Hickerson, who is appealing his convictions on gang-related charges, is seeking information about New Orleans’ predictive policing program in court. A judge said in April that Hickerson can subpoena city officials for information on whether data from the program were used in his case. The case remains pending.
