Philadelphia

Should your race or ZIP code determine jail time?

The central vestibule at Philadelphia's House of Correction - Photo by Mark Stehle

Philadelphia Mayor Jim Kenney ran for office last year with the slogan that equality “should not be defined by ZIP code,” race or ethnic origin. Turns out, that’s a tough promise to keep sometimes.

Exhibit A: The city’s court system, which is in the midst of developing a computer program designed to predict future criminality – and deciding whether those very factors should help determine who gets locked up and who is released on bail.

That system would generate a “risk score” for people arrested in Philadelphia based on a battery of data like arrest records, age, gender and employment history. The resultant score would then inform judges about how high to set bail or whether to release a defendant on their own recognizance, based on their predicted likelihood of reoffending or skipping court.
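For readers curious about the mechanics, here is a deliberately simplified sketch of how such a score could be produced in principle. The factors, weights and cutoffs below are invented purely for illustration; the actual Philadelphia model’s specifications have not been made public.

```python
# Toy illustration only: a logistic-style "risk score" built from the kinds of
# factors the article lists (prior arrests, age, employment). The weights and
# cutoffs are invented and do not describe the actual Philadelphia system.
import math

def risk_score(prior_arrests: int, age: int, employed: bool) -> float:
    """Return a 0-100 score from a weighted sum passed through a logistic curve."""
    # Hypothetical weights: prior arrests raise the score; age and employment lower it.
    z = 0.35 * prior_arrests - 0.04 * (age - 18) - 0.8 * (1 if employed else 0) - 1.0
    return 100 * (1 / (1 + math.exp(-z)))

def bail_recommendation(score: float) -> str:
    """Map the score to the kind of coarse recommendation a magistrate might see."""
    if score < 30:
        return "release on recognizance"
    if score < 60:
        return "bail with standard conditions"
    return "high bail / detention review"

print(bail_recommendation(risk_score(0, 35, True)))   # low score: release on recognizance
print(bail_recommendation(risk_score(6, 20, False)))  # high score: detention review
```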

University of Pennsylvania Professor of Criminology Richard Berk said he’s advised officials involved in a $6.5 million partnership with the MacArthur Foundation to reform Philadelphia’s First Judicial District that they should also include factors like race and home ZIP codes in the new pre-trial program. Berk helped pioneer a similar program currently used by the city’s Adult Probation and Parole Department, which determines how intensively the city’s 45,000 probationers should be monitored based on predicted recidivism. 

Not surprisingly, his recommendations have been met with resistance.

“The pre-trial specifications [are] still being determined, but I am very confident that race will not be included, and probably not ZIP code,” Berk said. “The price, of course, is that there will be more forecasting errors. More folks will be mistakenly detained and more folks will be mistakenly released. That is not my decision; it is a political decision.”

These computer programs, sometimes called “risk assessment algorithms,” have grown in popularity as a tactic for reducing jail overcrowding by separating out less-risky suspects ahead of trial. They are already in use in nearly a dozen states, and Pennsylvania and Maryland are looking to implement similar programs that draw from Berk’s research.

Advocates like Berk say that, ideally, the systems are more effective than relying solely on the individual judgment – and biases – of a bail magistrate. Even better, the algorithms underpinning these systems are built to teach themselves to become more effective through periodic accuracy tests that compare outcomes against control groups.

On paper, it can sound great. But as the use of “risk assessment” tools has expanded, so, too, has a debate over whether or not to include divisive factors like race to improve accuracy, and concerns that the computerized systems can develop their own complex forms of racism from biased data. 

“It is better than a judge making a split-second decision,” acknowledged David Robinson, a consultant with Team Upturn, a firm that specializes in data ethics. “But does that mean we should be happy?”

His group is promoting a set of best practices recommendations for new risk-scoring programs, created by the Arnold Foundation. The guidelines are fairly simple: They largely recommend excluding factors like race and home address – but also other potentially controversial factors, like gender, drug use history and employment status.

Robinson said some information, like a subject’s ZIP code, can serve as “stand-ins” for race, given the highly segregated nature of many US cities. And racially biased policing methods, like stop-and-frisk, can taint other data, like the number of times a subject was previously arrested.

“It’s important to ask exactly what it is we’re predicting the risk of. Often it’s re-arrest,” he said. “Arrests are a proxy for bad behavior, but you’re using arrests in your model, and then you’re making more arrests based on that model.”

In other words, arrests can sometimes be a false proxy for criminality. If there are more police making more arrests in high-crime neighborhoods, those populations will automatically score higher simply because of where they live. 

“You’re going to patrol more on certain blocks where people have been found with guns. You’re going to look for more guns – and you’ll find more. Pretty soon, you’re in an echo chamber,” Robinson said, in reference to the trouble with using ZIP codes as an indicator. 

Similarly, if Philadelphia police are predisposed to arrest minorities without good cause, as studies suggest they frequently are, a computer program could falsely interpret arbitrary arrests as a sign of increased criminality amongst minority groups.

“You can end up in a feedback loop if someone is already at a high risk of re-arrest,” said Robinson. “Then you impose more exacting conditions on that person, and they’re arrested even more. And the cycle repeats itself.”
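The dynamic Robinson describes can be demonstrated with a toy simulation: two neighborhoods with identical underlying offense rates, where a naive model simply sends more patrols wherever arrests are already highest. The numbers below are invented for illustration and are not drawn from any real policing data.

```python
# Toy simulation of the feedback loop: arrests feed the model, the model directs
# more enforcement, which produces more arrests. All figures are invented.
import random

random.seed(0)
true_offense_rate = 0.10                   # identical underlying behavior in both zones
patrol_intensity = {"A": 0.5, "B": 0.5}    # chance an offense is detected
arrests = {"A": 0, "B": 0}

for year in range(5):
    for zone in arrests:
        # Arrest counts depend on behavior *and* on how heavily the zone is patrolled.
        arrests[zone] += sum(
            random.random() < true_offense_rate * patrol_intensity[zone]
            for _ in range(1000)
        )
    # A naive "risk model" sends more patrols wherever arrests are already highest.
    hot = max(arrests, key=arrests.get)
    patrol_intensity[hot] = min(1.0, patrol_intensity[hot] + 0.2)

print(arrests)  # the zone that got an early random bump ends up with far more arrests
```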

It’s already happened. A recent ProPublica investigation found that a risk-assessment algorithm in Broward County, Fla., had developed a bias against black offenders using the “machine learning” process hailed by advocates. Low-level black offenders were routinely scored as “riskier” than hardened white criminals. ProPublica found that system, known as COMPAS, had only a 20 percent accuracy rate at predicting future offenses.

And lower accuracy can actually be by design. Berk said all risk scoring systems are weighted in favor of higher risk scores, just to be safe. This is done by establishing a “cost ratio” that pushes the system toward higher bail, lockup or more intensive court surveillance, even at the risk of detaining a potentially innocent person. In Berk’s initial system, built for the city’s probation department, the ratio was 20 to 1 in favor of tougher surveillance or detention.

“In general, it is seen as more costly to falsely conclude someone is a good bet than to falsely conclude that someone is a bad bet,” he said. “Some mistakes are worse than others.”
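In statistical terms, that trade-off can be expressed as a standard expected-cost rule: the higher the cost ratio, the lower the predicted probability of reoffending at which the system tips toward detention. The short sketch below, using the 20-to-1 figure from the probation system as an example, shows how stark the shift can be; it is a textbook formulation, not a description of Berk’s actual code.

```python
# Sketch of how an asymmetric cost ratio shifts a decision threshold.
# Assumes a simple expected-cost rule; only the 20:1 figure comes from the article.

def flag_threshold(cost_false_release: float, cost_false_detain: float) -> float:
    """Probability of reoffending above which expected cost favors detention.

    Flag when p * cost_false_release > (1 - p) * cost_false_detain,
    i.e. p > cost_false_detain / (cost_false_detain + cost_false_release).
    """
    return cost_false_detain / (cost_false_detain + cost_false_release)

# Symmetric costs: flag only if reoffending is more likely than not.
print(flag_threshold(1, 1))    # 0.5
# A 20-to-1 ratio like the probation system's: flag at roughly a 4.8% predicted risk.
print(flag_threshold(20, 1))   # ~0.048
```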

Berk has publicly argued that concerns about discrimination are outweighed by other, deadlier risks. In a recent Inquirer editorial about the risk scoring system, he was quoted pushing back against those criticisms.

“People who stress that point forget about the victims,” he said in the editorial. “How many deaths are you willing to trade for something that's race-neutral?”

Not all of Berk’s colleagues in the field of statistics agreed with that assessment.

“I found that comment quite jarring. If someone fails to appear in court, of course it’s bad, but that doesn’t always mean someone dies,” said Robinson. “Even if the question involves someone risking their lives, the idea that the impact on fairness shouldn’t be considered is a troubling way to run a criminal justice system. We could eliminate the Bill of Rights and put a lot more potential criminals in prison. But we’re not going to do that.”

City officials involved in Philadelphia’s reform process, who spoke on the condition of anonymity, confirmed that they were erring on the side of excluding the most controversial factors from the new algorithm – race and ZIP codes. However, the system is still under development; it hasn’t been determined how other aspects, like arrests or cost ratios favoring false positives, would be handled.

Geoffrey Barnes, a Cambridge professor who conducted a technical analysis of Berk’s initial system prior to its implementation by the city’s probation department, said that risk assessment systems were accurate without racial data or ZIP codes.

“In very basic terms, however, any predictor variable that is correlated with the outcome that is being forecasted can only benefit this kind of model,” said Barnes. “Just because a variable boosts forecasting accuracy, however, does not mean that it makes a huge or even a noticeable contribution. Some of these contributions can be rather small. When this happens, the variables are usually dropped from future versions of the model.”

Barnes has been involved with ongoing accuracy tests of the Philadelphia probation department’s scoring system. To his point, he said that ZIP codes had actually been eliminated as a risk factor in that system in May. 

But the probation department declined to release a complete list of risk factors or accuracy reports without a Right-to-Know request. In a previous report by City&State, probation officials said the risk factors in their system were too “complicated” to explain.  

This ambiguity speaks to the ultimate fear expressed by ethics advocates like Robinson, who worries that people will not be able or allowed to understand the complex forces shaping their treatment by judges or the larger criminal justice system.

“The problem is that people who are not statisticians are not able to be critical consumers of the product,” he said. 
