Screening & Scoring
EPIC’s Screening & Scoring Project produces comprehensive resources that identify instances of scoring and screening in everyday life, articulate common issues with these tools, analyze potential violations of existing law in their use, and work to protect the public from the algorithmic harms these tools may cause.
Automated decision-making tools are systems that analyze data in order to aid or replace human decision-making. They range from simple algorithms and scoresheets to machine learning models, and can be developed in-house by local or state agencies or purchased from private companies.
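As a minimal illustration of the scoresheet end of this spectrum, the sketch below sums a few weighted factors into a single score and maps it to a recommended action. The factor names, weights, and threshold are entirely hypothetical and do not reflect any real agency's or vendor's system.

```python
# Hypothetical scoresheet-style decision tool. All factors, weights, and
# the threshold are invented for illustration only.

WEIGHTS = {
    "prior_incidents": 3,
    "missed_appointments": 2,
    "unstable_housing": 4,
}
THRESHOLD = 6  # scores at or above this trigger a recommended action

def risk_score(record: dict) -> int:
    """Sum weighted factors into a single numeric score."""
    return sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)

def recommend(record: dict) -> str:
    """Map the score onto a recommended action."""
    return "flag for review" if risk_score(record) >= THRESHOLD else "no action"

print(recommend({"prior_incidents": 1, "unstable_housing": 1}))  # flag for review
print(recommend({"prior_incidents": 1}))                         # no action
```

Even in a system this simple, a single weight or threshold change can silently flip many people's outcomes — which is part of why transparency about how such tools are configured matters.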
Everyone is Screened and Scored
Without notice or consent, people are often screened and scored at important junctures in their lives.
When applying for a job, applicants encounter dubious facial or voice analysis, “emotion detection,” games designed to gauge “fit,” and/or resume scanners.
Students and employees are subject to monitoring software aimed at controlling behavior and detecting potential wrongdoing. In some districts across the country, students are frequently allocated to schools based on opaque automated screening algorithms. Algorithms used in more than 100 high schools in New York City, for instance, reportedly consider variables like test scores, attendance, and behavioral records.
Prospective tenants are screened through credit scoring, reputation scoring, and/or tenant screening tools. Third-party companies offering these tenant screening tools collect, store, and select records for housing providers to use in evaluating tenants.
Individuals, particularly medical patients, must often interact with algorithms that help determine public benefit eligibility, flag suspected fraud, prioritize medical care, aid diagnosis, and more.
Child welfare agencies use risk assessment systems to help identify children who are at high risk of experiencing abuse or neglect. These risk assessments are used to determine whether the agency should initiate a family intervention, which could range from connecting parents with resources to removing children from the home.
Many government offices use risk assessments and other automated tools to screen potential contracting partners. Commercial financial risk assessments evaluate a business’s financial health and creditworthiness. Much like individual credit scores, business credit reports affect how much credit a business may get, what interest rates and repayment terms attach to a loan, and what insurance premiums a company will have to pay.
Most people in the U.S. have a credit score and have had to use it to obtain services. Credit reporting agencies have broad access to consumer information, and this data is used in a myriad of decisions beyond lending. Employers, utility providers, insurance companies, and landlords, among others, use credit scores to decide whether to offer their services to you.
Controversial risk assessment tools are used in the criminal justice system to set bail and determine criminal sentences, and can even contribute to determinations of guilt or innocence. Predictive policing tools are used to over-surveil and over-police communities of color.
These Tools are Problematic
As more entities rely on these tools to supplement and replace human decision-making, the inherent risks associated with these tools become more urgent. Scoring and screening tools are problematic for a variety of reasons, including:
- Bias and Accuracy Issues: These tools often have significant bias and accuracy problems. Assumptions built into a system can reinforce existing biases and inequalities, and issues with data sources, such as limited data sets or a lack of external validation studies, often produce inaccurate screening and scoring outcomes.
- Insufficient Transparency and Accountability: These tools have woefully insufficient transparency and accountability requirements. There is often a lack of transparency about how these tools work, what data points are being used, and the logic behind each automated decision. It is often unclear whether these tools have undergone proper validation studies for accuracy and reliability.
- Opacity and Secrecy: These tools obscure the decision-making process. Because many of these tools are proprietary third-party systems, companies often refuse to disclose information about how their systems work.
- Lack of Notice and Denial of Procedural Due Process: Entities using these systems often fail to provide adequate notice, if any, to the individuals being scored and screened. The use of these tools can erode procedural due process protections because people do not know how these tools make determinations or what types of data are used. These tools also limit an individual’s ability to challenge outcomes affecting their eligibility for services, jobs, housing, or benefits because they don’t understand how these tools work.
- Discrimination: These tools lead to flawed decisions that contribute to inequity, often replicating and exacerbating bias. These tools are often not designed to account for discrimination and instead use biased training data and modeling to predict outcomes. They illustrate how inconsistent and highly opaque algorithms can marginalize communities of color and people living with disabilities.
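To see how biased training data replicates past inequity, consider a toy simulation (all groups and numbers are invented): a “model” that simply learns historical approval rates per group carries the old disparity forward into new decisions.

```python
# Hypothetical illustration: a model fit to biased historical decisions
# reproduces the disparity in that history. All data here is invented.

historical = (
    [("A", True)] * 80 + [("A", False)] * 20 +  # group A: approved 80% of the time
    [("B", True)] * 40 + [("B", False)] * 60    # group B: approved 40% of the time
)

def learn_rates(data):
    """Learn the historical approval rate for each group."""
    rates = {}
    for group in {g for g, _ in data}:
        outcomes = [approved for g, approved in data if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

rates = learn_rates(historical)
print(rates)  # {'A': 0.8, 'B': 0.4} — the old gap becomes the new prediction
```

Nothing in the code mentions a protected trait or intends to discriminate, yet the output mirrors the historical gap exactly — one reason validation studies and bias audits of these tools are essential.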
These Tools Need to be Regulated
Although civil rights laws exist to protect members of protected classes against certain types of disparate impact, there is insufficient regulation of these screening and scoring tools. Lawmakers must develop standards for the regulation of these tools and hold the entities deploying them accountable. There must be oversight and enforcement capable of mitigating and remedying the harms these tools cause, along with greater transparency and accountability mechanisms.
EPIC’s Screening & Scoring Work
Many areas of EPIC’s work involve aspects of screening and scoring. Whether in the criminal justice, surveillance and profiling, consumer protection, housing, or credit scoring context, EPIC has advocated for algorithmic transparency and the regulation of these tools. EPIC has utilized open records laws and has litigated for the release of information about the government’s screening and scoring tools. EPIC has filed complaints with the Federal Trade Commission and the D.C. Attorney General to investigate companies peddling these tools. EPIC has also submitted public comments to government agencies about the use of these tools.
Some examples of EPIC’s work:
EPIC published Liberty at Risk: Pre-Trial Risk Assessments in the U.S., a report that provides an overview of risk assessment tools that practitioners and scholars can use to understand the nature of these systems and the contexts in which they are used, and to help focus their evaluations of the fairness of these systems. As part of its reporting, EPIC also obtained several documents about states’ use of criminal justice algorithms.
EPIC sued the Justice Department for records concerning the government’s use of risk assessments and predictive policing techniques. EPIC’s case led to the disclosure of hundreds of pages of relevant records and revealed the existence of a previously-unknown DOJ report to the White House about the use of predictive analytics in law enforcement.
Through public records requests in six states, EPIC obtained documents about TrueAllele, a DNA forensic program whose source code is kept secret. Law enforcement used TrueAllele test results to establish guilt, even though the individuals accused of crimes had no access to the source code that produces those results. EPIC obtained validation studies, technical specifications, and other records about this controversial DNA forensic technique.
Surveillance and Profiling
EPIC sued Customs and Border Protection for records about the agency’s Analytical Framework for Intelligence (AFI), an analytic tool used to assign risk assessments to travelers, including U.S. citizens traveling domestically. The agency uses AFI to analyze personally identifiable information from a variety of sources, such as government databases, commercial data brokers, and the internet. EPIC eventually obtained documents about this secretive scoring program.
EPIC sued the Department of Homeland Security for records about the agency’s Future Attribute Screening Technology (FAST) program. The FAST program was a “Minority Report”-style initiative that tried to determine the probability that an individual would commit a future crime. EPIC obtained the program’s Privacy Threshold Analysis, project presentation, technical requirements, and test-site installation records, among others.
EPIC has sent several complaints to the FTC asking the agency to investigate screening and scoring tools related to education, hiring, student athlete profiling, and commercial data brokers. EPIC has urged the agency to investigate HireVue, the Universal Tennis Rating score, and ChoicePoint, and to regulate the commercial use of AI.
EPIC also filed a complaint with the D.C. Office of the Attorney General calling on the AG to take enforcement action against five online test proctoring companies.
EPIC has urged the FTC to investigate Airbnb’s use of “Trustworthiness” scores. EPIC also commented on the Department of Housing and Urban Development’s implementation of the Fair Housing Act’s Disparate Impact Standard.
EPIC sent comments to the Consumer Financial Protection Bureau about revising its regulations and/or providing new guidance to financial institutions about their use of AI and machine learning systems.
EPIC FOI Documents: District of Columbia
EPIC utilizes open government requests to gain an understanding of automated decision-making tools in high-risk functions in U.S. jurisdictions. EPIC is currently focusing on creating a comprehensive picture of scoring and screening tools within the District of Columbia and aims to expand to other jurisdictions in the future. So far, EPIC has sent a number of records requests to D.C. agencies and will continue to update this section as new information is released.
D.C. Housing Authority
- Awaiting Response
D.C. Public Schools
- Awaiting Response
D.C. Department of Human Services
- Pondera Proposal (July 19, 2021)
- Pondera Contract (July 19, 2021)
- D.C. OCP Solicitation Letter: Fraud Case Management and Data Analytics System Subscription (July 19, 2021)
- D.C. DHS-Pondera Bilateral Modification Agreement (July 30, 2021)
D.C. Department of Health Care Finance
- Awaiting Response
D.C. Child and Family Services Agency
- Awaiting Response
D.C. Office of Contracts and Procurement
- Awaiting Response
D.C. Department of Transportation
- Awaiting Response
D.C. Department of Youth Rehabilitation Services
- 2017 Family Resource Guide (Nov. 30, 2021)
- Structured Decision Making Demographic Breakdown (Nov. 30, 2021)
- 2017 Care Planning and Coordination Handbook (Nov. 30, 2021)
- Youth Level of Service/Case Management Inventory 2.0 User’s Manual (Nov. 30, 2021)
- 2020 Risk Offense Chart (Nov. 30, 2021)
- 2020 Presentation Examining the Structured Decision Making Risk Assessment Tool (Nov. 30, 2021)
- 2018 Memorandum Clarifying Aspects of the Structured Decision Making Tool (Nov. 30, 2021)
- 2020 Office of Research and Evaluation’s Memorandum on the Structured Decision Making Tool (Nov. 30, 2021)
- 2017 Structured Decision Making Overview and Procedures Presentation (Nov. 30, 2021)
- 2017 Proposed DYRS Risk Re-Assessment Scoresheet (Nov. 30, 2021)
- 2016 Structured Decision Making Overview (Nov. 30, 2021)
Recent Documents on Screening & Scoring
Racial Segregation and The Data-Driven Society: How our Failure to Reckon with Root Causes Perpetuates Separate and Unequal Realities
Rashida Richardson | 2021
Defining and Demystifying Automated Decision Systems
Rashida Richardson | 2021
Understanding Transparency in Algorithmic Accountability
Margot Kaminski | 2020
Power, Process, and Automated Decision-making
Ari Ezra Waldman | 2019
Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice
Richardson, Schultz, Crawford | 2019
The Right to Explanation, Explained
Margot Kaminski | 2019
Impoverished Algorithms: Misguided Governments, Flawed Technologies, and Social Control
Sarah Valentine | 2019
Artificial Intelligence Policy: A Primer and Roadmap
Ryan Calo | 2017
The Scored Society: Due Process for Automated Predictions
Danielle Citron & Frank Pasquale | 2014