Algorithmic Transparency: End Secret Profiling
- Supreme Court Won't Review Ruling on Secretive Sentencing Algorithms: The Supreme Court has declined to review a state court ruling that upheld the use of a secret algorithm to determine a criminal sentence. The petitioner, Loomis, argued that he was unable to assess the fairness or accuracy of the legal judgment, and that the secret "risk assessment" algorithm therefore violated his fundamental due process rights. EPIC has pursued several related cases to establish the principle of algorithmic transparency in the United States. In EPIC v. DHS, EPIC obtained documents about secret behavioral algorithms that purportedly determine an individual's likelihood of committing a crime. In a series of state FOI cases, EPIC obtained records from state agencies about the use of proprietary DNA analysis tools to determine guilt or innocence. EPIC is currently litigating EPIC v. CBP before the DC Circuit Court of Appeals, a case concerning the secret scoring of airline passengers by the federal government. (Jun. 26, 2017)
- Court Rules Secret Scoring of Teachers Unconstitutional: A federal district court has held that firing public school teachers based on the results of a secret algorithm is unconstitutional. The case, Houston Federation of Teachers vs. Houston Independent School District, concerned a commercial software company's proprietary appraisal system that was used to score teachers. Teachers could not correct their scores, independently reproduce their scores, or learn more than basic information about how the algorithm worked. "When a public agency adopts a policy of making high stakes employment decisions based on secret algorithms incompatible with minimum due process, the proper remedy is to overturn the policy," the court wrote. EPIC recently filed a complaint asking the FTC to stop the secret scoring of young tennis players. EPIC has pursued several cases on "Algorithmic Transparency," including one for rating travelers and another for assessing guilt or innocence. (Jun. 13, 2017)
- EPIC to Congress: Data Protection Needed for Financial Technologies (Jun. 9, 2017)
- EPIC Asks FTC to Stop System for Secret Scoring of Young Athletes (May 17, 2017)
- In Merger Reviews, EPIC Advocates for Privacy, Algorithmic Transparency (May 9, 2017)
- European Parliament Adopts Resolution on Big Data (Mar. 24, 2017)
- EPIC Urges Senate Commerce Committee to Back Algorithmic Transparency, Safeguards for Internet of Things (Mar. 22, 2017)
- EPIC Sues Justice Department Over "Risk Assessment" Techniques (Mar. 7, 2017)
- Pew Research Center Releases Report on Algorithms (Feb. 8, 2017)
- Aspen Institute Report Explores Artificial Intelligence (Jan. 30, 2017)
- The Verge Features EPIC FOIA Docs on Secret Profiling System (Dec. 21, 2016)
- European Parliament Explores Algorithmic Transparency (Nov. 7, 2016)
- EPIC Urges Massachusetts High Court to Protect Email Privacy (Oct. 24, 2016)
- EPIC Promotes "Algorithmic Transparency" at Annual Meeting of Privacy Commissioners (Oct. 20, 2016)
- White House Releases Reports on Future of Artificial Intelligence (Oct. 13, 2016)
- Presidential Science Advisors Challenge Validity of Criminal Forensic Techniques (Sep. 8, 2016)
- Wisconsin Supreme Court Upholds Use of Sentencing Algorithms, But Recognizes Risks (Jul. 16, 2016)
- White House Report Points to Risks with Big Data (May 5, 2016)
- At UNESCO, EPIC's Rotenberg Argues for Algorithmic Transparency (Dec. 8, 2015)
- EPIC Pursues Public Release of Secret DNA Forensic Source Code (Oct. 14, 2015)
- EPIC Pursues Lawsuit about Secret Government Profiling Program (Aug. 11, 2015)
- Facebook Applies for Patent to Collect Users' Credit Scores (Aug. 5, 2015)
- EPIC Pursues Documents about Secret Government Profiling Program (Jul. 1, 2015)
- White House Report on "Big Data" Explores Price Discrimination, Opaque Decisionmaking (Feb. 5, 2015)
- Senators Challenge Verizon's Secret Mobile Tracking Program (Jan. 30, 2015)
- EPIC Urges House to Safeguard Consumer Privacy (Jan. 26, 2015)
White House Report on the Future of Artificial Intelligence
In May 2016, the White House announced a series of workshops and a working group devoted to studying the benefits and risks of AI. The announcement recognized the "array of considerations" raised by AI, including those "in privacy, security, regulation, [and] law." The White House established a Subcommittee on Machine Learning and Artificial Intelligence within the National Science and Technology Council.
Over the next three months, the White House co-hosted a series of four workshops on AI:
- Legal and Governance Implications of Artificial Intelligence, May 24, 2016, in Seattle, WA
- Artificial Intelligence for Social Good, June 7, 2016, in Washington, DC
- Safety and Control for Artificial Intelligence, June 28, 2016, in Pittsburgh, PA
- The Social and Economic Implications of Artificial Intelligence Technologies in the Near-Term, July 7, 2016, in New York City
EPIC Advisory Board members Jack Balkin, danah boyd, Ryan Calo, Danielle Citron, Ed Felten, Ian Kerr, Helen Nissenbaum, Frank Pasquale, and Latanya Sweeney each participated in one or more of the workshops.
The White House Office of Science and Technology Policy issued a Request for Information in June 2016 soliciting public input on the subject of AI. The RFI indicated that the White House was particularly interested in "the legal and governance implications of AI," "the safety and control issues for AI," and "the social and economic implications of AI," among other issues. The White House received 161 responses.
On October 12, 2016, the White House announced two reports on the impact of Artificial Intelligence on the US economy and related policy concerns: Preparing for the Future of Artificial Intelligence and the National Artificial Intelligence Research and Development Strategic Plan.
Preparing for the Future of Artificial Intelligence surveys the current state of AI, its applications, and emerging challenges for society and public policy. As Deputy U.S. Chief Technology Officer and EPIC Advisory Board member Ed Felten writes for the White House blog, the report discusses "how to adapt regulations that affect AI technologies, such as automated vehicles, in a way that encourages innovation while protecting the public" and "how to ensure that AI applications are fair, safe, and governable." The report concludes that "practitioners must ensure that AI-enabled systems are governable; that they are open, transparent, and understandable; that they can work effectively with people; and that their operation will remain consistent with human values and aspirations."
The companion report, the National Artificial Intelligence Research and Development Strategic Plan, proposes a strategic plan for federally funded research and development in AI. The plan identifies seven priorities for federally funded AI research, including strategies to "understand and address the ethical, legal, and societal implications of AI" and "ensure the safety and security of AI systems."
The day after the reports were released, the White House held a Frontiers Conference co-hosted by Carnegie Mellon University and the University of Pittsburgh. Also in October, Wired magazine published an interview with President Obama and EPIC Advisory Board member Joi Ito.
EPIC has promoted Algorithmic Transparency for many years and has litigated several cases on the front lines of AI. EPIC's cases include:
- EPIC v. FAA, which EPIC filed against the Federal Aviation Administration for failing to establish privacy rules for commercial drones
- EPIC v. CBP, in which EPIC successfully sued U.S. Customs and Border Protection for documents relating to its use of secret analytic tools to assign "risk assessments" to travelers
- EPIC v. DHS, to compel the Department of Homeland Security to produce documents related to a program that assesses "physiological and behavioral signals" to determine the probability that an individual might commit a crime.
EPIC also has a strong interest in algorithmic transparency in criminal justice. Secrecy of the algorithms used to determine guilt or innocence undermines faith in the criminal justice system. In support of algorithmic transparency, EPIC submitted FOIA requests to six states to obtain the source code of "TrueAllele," a software product used in DNA forensic analysis. According to news reports, law enforcement officials use TrueAllele test results to establish guilt, but individuals accused of crimes are denied access to the source code that produces the results.
- ACM Public Policy Council, Statement on Algorithmic Transparency and Accountability (Jan. 12, 2017)
- Kate Crawford and Ryan Calo, There is a blind spot in AI research (Oct. 13, 2016)
- We Robot 2017
- We Robot 2016
- Ryan Calo, A. Michael Froomkin, and Ian Kerr, Robot Law (Edward Elgar 2016)
- EPIC: Algorithms in the Criminal Justice System
- Alessandro Acquisti, Why Privacy Matters (Jun. 2013)
- Alessandro Acquisti, Ralph Gross, Fred Stutzman, Faces of Facebook: Privacy in the Age of Augmented Reality (Aug. 4, 2011)
- Alessandro Acquisti, Price Discrimination, Privacy Technologies, and User Acceptance (2006)
- Steven Aftergood, “Secret Law and the Threat to Democratic Government,” Testimony before the Subcommittee on the Constitution of the Committee on the Judiciary, U.S. Senate (Apr. 30, 2008)
- Phil Agre, Your Face Is Not a Bar Code: Arguments Against Automatic Face Recognition in Public Places
- Ross Anderson, The Collection, Linking and Use of Data in Biomedical Research and Health Care: Ethical Issues (Feb. 2015)
- James Bamford, The Shadow Factory: The NSA from 9/11 to the Eavesdropping on America (2009)
- Grayson Barber, How Transparency Protects Privacy in Government Records (May 2011) (with Frank L. Corrado)
- Colin Bennett, Transparent Lives: Surveillance in Canada
- danah boyd, Networked Privacy (2012)
- David Burnham, The Rise of the Computer State (1983)
- Julie E. Cohen, Power/play: Discussion of Configuring the Networked Self, 6 Jerusalem Rev. Legal Stud. 137-149 (2012)
- Julie E. Cohen, Configuring the Networked Self: Law, Code, and the Play of Everyday Practice (New Haven, Conn.: Yale University Press 2012)
- Julie E. Cohen, Privacy, Visibility, Transparency, and Exposure (2008)
- Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 Washington Law Review 1 (2014)
- Cynthia Dwork & Aaron Roth, The Algorithmic Foundations of Differential Privacy, 9(4) Theoretical Computer Science 211 (2014)
- David J. Farber & Gerald R. Faulhaber, The Open Internet: A Consumer-Centric Framework
- Ed Felten, Algorithms can be more accountable than people, Freedom to Tinker
- Ed Felten, David G Robinson, Harlan Yu & William P Zeller, Government Data and the Invisible Hand, 11 Yale Journal of Law & Technology (2009) 160
- Ed Felten, CITP Web Privacy and Transparency Conference Panel 2
- A. Michael Froomkin, The Death of Privacy, 52 Stanford Law Review 1461 (2000)
- Urs Gasser et al., eds., Internet Monitor 2014: Reflections on the Digital World, Berkman Center for Internet and Society
- Urs Gasser, Regulating Search Engines: Taking Stock and Looking Ahead, 9 Yale Journal of Law & Technology 124 (2006)
- Jeff Jonas, Using Transparency as a Mask, (Aug. 4, 2010)
- Jeff Jonas & Ann Cavoukian, Privacy by Design in the Age of Big Data (Jun. 8, 2010)
- Ian Kerr, Privacy, Identity and Anonymity (Sep. 1, 2011)
- Ian Kerr, Prediction, Presumption, Preemption: The Path of Law After the Computational Turn (Jul. 30, 2011)
- Rebecca MacKinnon, Where is Microsoft Bing’s Transparency Report?, The Guardian (Feb. 14, 2014)
- Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information (Jan. 5, 2015)
- Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 Washington Law Review 1 (2014) (with Danielle Citron)
- Frank Pasquale, Restoring Transparency to Automated Authority, 9 Journal on Telecommunications & High Technology Law 235 (2011)
- Frank Pasquale, Beyond Innovation and Competition: The Need for Qualified Transparency in Internet Intermediaries, 104 Northwestern University Law Review 105 (2010)
- Frank Pasquale, Internet Nondiscrimination Principles: Commercial Ethics for Carriers and Search Engines, 2008 University of Chicago Legal Forum 263 (2008)
- Bruce Schneier, Accountable Algorithms (Sep. 21, 2012)
- Latanya Sweeney, Privacy Enhanced Linking, ACM SIGKDD Explorations 7(2) (Dec. 2005)
- Tim Wu, TNR Debate: Too Much Transparency? New Republic