As part of EPIC's ongoing lawsuit seeking cell phone surveillance orders issued by federal prosecutors, the Department of Justice identified 75 orders and warrants for cell phone location data issued under § 2703(d) by the U.S. Attorney's Office for the Virgin Islands from 2016 to 2019. During the same period, the office handled 283 criminal cases. The U.S. Attorney's Office for the Virgin Islands is one of the smallest districts in the country. In February, EPIC obtained the number of location data requests for the District of Delaware, the first of five districts that the DOJ has agreed to search for location data requests. EPIC is still waiting for responses from three of the agency's other prosecutors' offices and will continue to update its comparative table as each district releases more information. Currently, federal prosecutors do not release any comprehensive or uniform data about their surveillance of cell phone location data. In 2018, the U.S. Supreme Court ruled in Carpenter v. United States that the warrantless collection of cell phone location data violates the Fourth Amendment. The case is EPIC v. DOJ, No. 18-1814 (D.D.C.).
The FTC announced Monday that the sale or use of racially biased algorithms is an unfair and deceptive trade practice in violation of the FTC Act. In a blog post, the Commission warned companies to ensure fairness and equity in their use of AI. The FTC advised companies to "Start with the right foundation," "Watch out for discriminatory outcomes," "Embrace transparency and independence," "Don't exaggerate what your algorithm can do or whether it can deliver fair or unbiased results," "Tell the truth about how you use data," "Do more good than harm," and "Hold yourself accountable, or be ready for the FTC to do it for you." The FTC cited its 2016 report on big data analytics and machine learning; its 2018 hearing on algorithms, AI, and predictive analytics; and its 2020 business guidance on AI and algorithms. The post also cited a recent study from the Journal of the American Medical Informatics Association finding that AI may worsen healthcare disparities for people of color, even when an AI system is meant to benefit all patients. In 2019, EPIC filed a complaint with the FTC asking the Commission to investigate HireVue's use of opaque, unproven AI and to require baseline protections for AI use. Last year, EPIC petitioned the FTC to conduct a rulemaking on commercial uses of AI, including protections against discrimination and unfair bias.
A leaked draft of the European Commission's proposed AI regulation includes a ban on social scoring and strict limits on mass surveillance and other "high-risk" uses of AI. The draft regulation would generally prohibit AI that "manipulates human behaviour, opinions or decisions" to a person's detriment or that "exploits information or prediction about a person or group of persons in order to target their vulnerabilities[.]" The draft also requires notice to individuals when they interact with AI, prior authorization for the use of remote biometric identification tools (including facial recognition), and data impact assessments for "high-risk" systems. The draft is broadly worded and subject to exceptions, including exemptions for "investigating serious crime and terrorism," but would impose a penalty of up to 4% of annual revenue on companies that violate the regulation. The official release of the proposed regulation is expected on April 21. EPIC has called for prohibitions on secret scoring, mass surveillance, and facial recognition.