
Testimony and Statement for the Record of

Chris Jay Hoofnagle
Director and Senior Counsel, Electronic Privacy Information Center West Coast Office

Data Security: The Discussion Draft of Data Protection Legislation

House Committee on Energy and Commerce
Subcommittee on Commerce, Trade, and Consumer Protection

July 29, 2005
2123 Rayburn House Office Building


Introduction

Chairman Stearns, Ranking Member Schakowsky, and Members of the Subcommittee, thank you for extending the opportunity to testify on data security legislation.

My name is Chris Hoofnagle and I am Senior Counsel to the Electronic Privacy Information Center, and director of the group's West Coast office, located in San Francisco. Founded in 1994, EPIC is a not-for-profit research center established to focus public attention on emerging civil liberties issues and to protect privacy, the First Amendment, and constitutional values.

EPIC has been on the forefront of the issues being considered in today's hearing. For instance, "commercial data brokers," companies that extract sensitive information from many sources and sell it as a "dossier" to others, have long been a matter of public concern.1 EPIC has engaged in extensive use of the Freedom of Information Act to determine the extent of interaction between the government and data brokers such as Lexis-Nexis, Acxiom, InfoUSA, and Merlin.2

We applaud the Members of the Committee and others who have crafted legislation to address security standards for companies that maintain personal information. In my testimony today, I will provide comment on the Discussion Draft of Data Protection Legislation. The Discussion Draft is a good first step in addressing the security risks presented by companies with personal information, but fails to fully confer upon individuals the tools they need to avoid misuse of personal information. I therefore recommend that the Committee move this legislation, with reasonable enhancements including: an option for credit freeze, a requirement that security measures include audit trails, and public reporting of security breaches to the Federal Trade Commission. I further recommend that the Committee go beyond security issues and consider the privacy risks raised by data brokers.


Data Insecurity

Well before the recent news of the Choicepoint debacle became public, EPIC had been pursuing the company and had written to the FTC to express deep concern about its business practices. On December 16, 2004, EPIC urged the Federal Trade Commission to investigate Choicepoint and other data brokers for compliance with the Fair Credit Reporting Act (FCRA), the federal privacy law that helps ensure personal financial information is not used improperly.3 The EPIC letter said that Choicepoint and its clients had performed an end-run around the FCRA and were selling personal information to law enforcement agencies, private investigators, and businesses without adequate privacy protection.

Since the Choicepoint breach, there has been a steady stream of news articles and public announcements concerning other companies that have failed to secure the personal information of individuals. The Privacy Rights Clearinghouse, a San Diego-based group, has posted a Chronology of these data breaches.4 As of this writing, this Chronology notes 60 different incidents where a company or government entity reported a security breach involving the Social Security number, drivers license number or financial account number. The Privacy Rights Clearinghouse estimates that 50,000,000 individuals have been affected by these known breaches.

This Chronology is worth revisiting for at least three reasons. First, it demonstrates the diversity of entities that store sensitive personal information and yet have experienced a security incident. While there have been major security breaches at commercial data brokers such as Lexis-Nexis and Merlin, there have also been security problems at banks, schools, government entities such as motor vehicle administrations, and retailers. This demonstrates the need for intervention across a broad array of entities.

A privacy-friendly approach would first emphasize the need for reducing the amount of personal information collected and maintained. Where retention of personal information is necessary, these entities should be subject to a framework of "Fair Information Practices." Fair Information Practices, or "FIPs," constitute a framework of rights and responsibilities that require entities to minimize the amount of information they collect, to use it only for purposes specified by the individual, to hold it in a secure manner, and to provide the individual access to, and the ability to correct, their personal data.

Second, the Chronology demonstrates that security breaches may occur for reasons other than to commit identity theft. For instance, insiders at Bank of America, Wachovia, PNC Bank and Commerce Bank sold customers' personal information to attorneys and others who were engaged in debt collection efforts.5 That breach affected the records of over 600,000 accountholders. Sometimes systems are compromised for voyeuristic purposes, such as obtaining the contact information or communications data of celebrities or law enforcement officials.6 Security breaches may be motivated by a company attempting to obtain information about a competitor. Finally, extortion may motivate someone to obtain and disclose an individual's personal information. For instance, in 2003, a Pakistani clerical worker performing transcription services for an American hospital threatened to release medical records if she was not paid for her services.7 Accordingly, legislation passed by Congress should recognize that identity theft is not the only harm to be avoided; security breaches may be motivated by a number of crimes unrelated to attempted identity theft.

Third, the Chronology demonstrates that entities that maintain personal information are subject to many different security risks. While we typically think of outsiders, such as malicious computer hackers, as the prime security risk, the Chronology shows that dishonest employees are a major security problem. Accordingly, Congress' approach should include measures likely to catch insiders who sell information. Audit trails—a requirement that entities record who accesses and discloses personal information—would go far in deterring and detecting dishonest insiders.

The Draft Should Contain Credit Freeze Language

In the Senate, Members are considering legislation that will prevent identity theft by allowing individuals to "freeze" their credit. Under these proposals, individuals can opt to erect a strong shield against identity theft by preventing the release of their credit report to certain businesses. Because a credit report is always pulled before a business issues a new line of credit, a freeze will make it very difficult for an impostor to obtain credit in the name of another person.8

According to US PIRG, 10 states have credit freeze laws enacted.9 The New Jersey law offers consumers the most benefit—any resident may freeze their credit report at minimal cost, and consumer reporting agencies must make the thaw mechanism work quickly, so that individuals can take advantage of instant credit offers.

We believe that a credit freeze is a good approach that will minimize security risks and reduce the risk of identity theft. Simply stated, this provision will make it more difficult for others to use a consumer’s credit report without their consent. Consumers will always have the ability to provide their credit reports in those transactions that they initiate.
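To illustrate how the mechanism operates, the following is a minimal sketch, in Python, of the logic a credit freeze imposes on the release of a credit report. The data structures, the "thaw" PIN, and the example values are our own illustrative assumptions, not provisions of any state law or of the Discussion Draft.

```python
# Conceptual sketch only: a bureau releases a report for a new line of credit
# only if the file is not frozen, or the consumer has lifted ("thawed") the
# freeze for a transaction the consumer initiated.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CreditFile:
    consumer_id: str
    frozen: bool = False
    thaw_pin: Optional[str] = None   # PIN the consumer uses to lift the freeze

def release_report(f: CreditFile, purpose: str, pin: Optional[str] = None) -> bool:
    if purpose != "new_credit":
        return True                  # e.g., account review by an existing creditor
    if not f.frozen:
        return True
    return pin is not None and pin == f.thaw_pin   # consumer-authorized thaw

consumer = CreditFile("consumer-0087", frozen=True, thaw_pin="4821")
print(release_report(consumer, "new_credit"))              # False: impostor is blocked
print(release_report(consumer, "new_credit", pin="4821"))  # True: consumer-initiated
```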

The Need to Consider General Privacy Protections

The Discussion Draft would establish important security safeguards for all businesses with personal information, and heightened duties on information brokers. But while the Discussion Draft addresses security concerns, it does not contemplate whether general privacy restrictions are appropriate.

Information brokers have operated under a self-regulatory schema, known as the Individual Reference Service Group ("IRSG") Principles. Through these principles, the industry conferred upon itself the authority to sell detailed dossiers to almost anyone for almost any purpose. It was the promiscuity of these principles that led to the most recent Choicepoint breach, because the principles allowed data brokers to choose who is a "qualified" buyer of personal information, and allowed sale to anyone with a "legitimate" business purpose.

A serious inquiry should be made into the purposes for which these dossiers are being sold. Congress should set limits on the contexts in which personal information can be sold, and when data is sold, limit the secondary uses of personal information.

The Discussion Draft of Data Protection Legislation

Section 2 Requirements for Information Security: All Companies

This section directs the Federal Trade Commission ("Commission") to promulgate regulations to require companies to implement policies and procedures to protect personal information. Companies would have to develop a security policy and statement on use of personal information. Companies would have to identify an employee as being responsible for information security. Finally, companies would have to develop processes to take preventive and corrective action to address security vulnerabilities, including the use of encryption.

We applaud the Members for encouraging the use of encryption to protect personal information. However, we wish to emphasize that once data is encrypted, it may still be vulnerable. For instance, the company may choose a poor encryption method that can be decoded easily. There is also the risk that a malicious actor, especially when he is an insider, will have the key or password to decode the encryption. Accordingly, an entity that uses encryption should not automatically be exempt from other data security responsibilities, such as the requirement to provide security breach notices.
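The short sketch below illustrates this point. It assumes the third-party Python "cryptography" package and uses entirely fictitious data; it is an illustration of the principle, not an implementation prescribed by the Discussion Draft.

```python
# Illustrative sketch (assumes the third-party "cryptography" package).
# Encryption protects data only so long as the key is protected: an insider
# who holds the key can read the "protected" record freely.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                    # in practice, held in a key vault
record = b"SSN=000-00-0000; DOB=1970-01-01"    # fictitious personal data

token = Fernet(key).encrypt(record)            # all an outside attacker would see
print(token[:16], b"...")                      # opaque ciphertext

# An insider with access to the key decrypts it trivially.
print(Fernet(key).decrypt(token))
```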

We suggest three improvements to this section:

First, this section could be significantly enhanced by a requirement that companies employ audit trails to deter and detect insider misuse of personal information. An audit trail would record who accessed individuals' information, the purposes for which it was accessed, whether it was disclosed, and to whom it was disclosed. Simply put, encryption will be most effective at protecting data from outsiders; auditing will be a strong deterrent to insiders.
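To make the suggestion concrete, here is a minimal sketch, in Python, of the kind of entry an audit trail would capture each time personal information is accessed or disclosed. The field names and example values are our own illustration, not statutory language.

```python
# Illustrative sketch of an audit-trail entry recording who accessed personal
# information, for what purpose, and to whom it was disclosed.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class AuditRecord:
    employee_id: str                # who accessed the file
    subject_id: str                 # whose personal information was accessed
    purpose: str                    # stated business purpose for the access
    disclosed_to: Optional[str]     # recipient, if the data left the company
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

audit_log: List[AuditRecord] = []   # in practice, an append-only, tamper-evident store

def record_access(employee_id: str, subject_id: str, purpose: str,
                  disclosed_to: Optional[str] = None) -> None:
    """Append an entry every time personal data is accessed or disclosed."""
    audit_log.append(AuditRecord(employee_id, subject_id, purpose, disclosed_to))

# Example: an employee pulls a consumer file and forwards it to a client.
record_access("emp-4412", "consumer-0087", "account verification",
              disclosed_to="Example Collections LLC")
```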

Second, where possible, companies should require customers to establish a password system for access to their file. Currently, many entities with sensitive personal information will give access to files based on the provision of simple biographical information, such as billing address, phone number, date of birth, or Social Security number. The problem is that these biographical identifiers often are found in publicly-available databases, such as phone books, public records, or the Internet.

Passwords have some disadvantages. Sometimes people choose poor passwords, but an institution can correct this by requiring the password to be a certain length. Sometimes individuals forget passwords, and in cases where that is a concern, a "shared secrets" password system could be employed. In such a system, the customer and business agree upon a series of questions that can be asked to verify identity. These could include asking the customer what street they lived on as a child, the name of their first pet, or their favorite book or sports team. The questions are periodically rotated to prevent an impostor from learning these secrets.
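As a concrete illustration, the sketch below shows one way such a shared-secrets check might be implemented in Python. The questions, answers, and parameters are hypothetical; this is a sketch of the idea, not a recommended product.

```python
# Illustrative sketch of a "shared secrets" check: answers are stored only as
# salted hashes, compared in constant time, and the question asked can be
# rotated so an impostor cannot simply memorize one answer.
import hashlib, hmac, os, random

SALT = os.urandom(16)

def hash_answer(answer: str) -> bytes:
    # Normalize so "Elm Street" and "elm street" both match.
    return hashlib.pbkdf2_hmac("sha256", answer.strip().lower().encode(), SALT, 100_000)

# Hypothetical secrets agreed upon by the customer and the business at enrollment.
secrets = {
    "What street did you live on as a child?": hash_answer("Elm Street"),
    "What was the name of your first pet?":    hash_answer("Rex"),
    "What is your favorite book?":             hash_answer("Moby-Dick"),
}

def verify(question: str, given_answer: str) -> bool:
    return hmac.compare_digest(secrets[question], hash_answer(given_answer))

# Rotate which question is asked on each verification attempt.
question = random.choice(list(secrets))
print("Ask:", question)

print(verify("What was the name of your first pet?", "rex"))    # True
print(verify("What was the name of your first pet?", "Fido"))   # False
```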

Third, some companies are using automatic number identification ("ANI"), a form of caller ID, to identify or authenticate customers. ANI offers additional security over caller ID, but it now appears that ANI too can easily be "spoofed," or falsified, through the use of VOIP telephony.

In crafting security guidelines, the Commission will have to consider that new technologies may pose new risks to security systems. Accordingly, we recommend that the Commission be directed to periodically review security requirements, and new threats to personal data.

Section 2 Requirements for Information Security: Special Requirements for Data Brokers

This section would require information brokers to be audited by the Commission. It would also require data brokers to allow individuals to obtain their dossier annually at no cost.

We applaud these requirements. Individuals should be able to obtain personal information held by data brokers at no charge. Currently, industry practice on providing individuals access to their personal information varies widely. For instance, it is not clear whether information brokers provide the complete file of personal information when an individual makes a request for access. Choicepoint provides free access, and in a recent study where 11 people requested their files, the company provided individuals with their dossiers in a timely fashion. However, the study also showed that many errors were found in the Choicepoint dossiers.10 Acxiom charges $20 for access, but in the study, the company only fulfilled half of the requests made and took an average of 89 days to comply. A legal mandate for free and timely access is needed.

Section 3 Notification of Database Security Breach

This section specifies the instances when a company must disclose to individuals that their personal information has been obtained by an unauthorized person. It defines breach of security as "the compromise of the security, confidentiality, or integrity of data that results in, or there is a reasonable basis to conclude has resulted in, the acquisition of personal information by an unauthorized person that may result in identity theft." It specifies how a company must give notice, and what the notice must contain. It specifies that a company with a security breach must provide three credit reports and a year of credit monitoring service to victims.

There are several critical aspects to this portion of the legislation. First, of course, is the severity of events that constitute a "breach of security." The language in the Discussion Draft tracks the California standard, except that the Discussion Draft includes the requirement that the security breach "may result in identity theft."

As we explained above, identity theft is only one risk from unauthorized access to personal information. Unauthorized access may be gained for other purposes that cause harm to the individual, such as stalking, obtaining information for debt collectors, corporate espionage, extortion, or mere voyeurism. The purpose of data security breach legislation is not just to warn individuals of a risk of identity theft; it is also designed to shine a light on poor data practices.

More importantly, as identity theft expert Beth Givens has argued, companies often cannot tell whether a security breach may result in identity theft. The motives of a person who gained access are not always clear. Identity theft can also occur months or even years after a security breach.

There has been much discussion of whether to give companies discretion to determine whether notice to the public is justified. No such discretion is given by the California law, and Congress should carefully consider the consequences of extending discretion at the federal level. Indeed, one information broker, Acxiom, has already engaged in acrobatics to avoid giving notice of a 2003 security breach that reportedly involved 20 million records.11

Because it is difficult to gauge the risk of identity theft, because there are harms other than identity theft which may result from security breaches, and because there is already evidence that companies will go to great lengths to avoid giving security breach notices, we recommend eliminating the language that gives companies discretion not to give notice based on a determination whether the breach "may result in identity theft."

If Congress chooses to give some measure of discretion, it should set a standard that requires notice where there is a "reasonable risk or reasonable basis to believe that such access could lead to misuse of personal information." This standard recognizes that breach notification should focus on "misuse" of personal information rather than identity theft alone, and it would allow companies to forgo notice where there is no reasonable risk of harm. There should also be a duty to thoroughly investigate suspected breaches. The standard set should not give data holders incentives to ignore these incidents.

The second critical factor is the scope of businesses that will be subject to the notification requirement. We think the standard set forth by the bill—any company that owns or possesses data—is the appropriate one. The California standard—any company that owns or licenses data—misses the mark in that some companies merely process data for others, but may still experience a breach.

A third critical factor is the form of notice. The California security notice legislation was in effect a type of "Freedom of Information Act" for security standards. Consumers and policymakers have benefited from learning more about security standards and breaches, but there have also been significant limitations—in many cases, only the victims learn of the breach. Consumers and policymakers would benefit from hearing of all breaches through a website that could be operated by the Commission. We would recommend that the following language be added to the legislation, so that there will be public reporting of security breaches:

"Information submitted to the Commission under sections 2(b)(1) and 3(a)(2) shall be posted at a publicly available website operated by the Commission."

Section 4 Enforcement by the Federal Trade Commission

This section specifies that the Commission will enforce the law, under its authority to address unfair and deceptive trade practices.

We recommend adding enforcement powers so that state Attorneys General can also enforce the law.

We further recommend that the Commission's authorization and appropriation be increased to account for the burdens associated with enforcing this law. The Commission must oversee a plethora of business practices—from deception in funeral businesses to "power output claims for amplifiers utilized in home entertainment products."12 This wide range of responsibility requires adequate funding.

Section 5 Definitions

This section defines the many terms in the legislation, including identity theft and information broker.

The definition of "identity theft" is narrow and does not encompass the full range of activities normally understood as identity theft. The current definition focuses on the use of others' personal information for the purpose of engaging in "commercial transactions." This does not recognize the problem of "criminal identity theft," where an individual uses the personal information of another in his interactions with law enforcement, leaving the victim with a criminal record. Accordingly, we recommend that if the law continues to include this term, that it be broadened to recognize other activities commonly understood to be "identity theft."

Defining "information broker" is a challenge. Many companies are engaged in the transmission of personal information to third parties. In some cases, this occurs within the individual's expectation, such as when information must be transferred to execute a transaction requested by a consumer. In others, the transfer of personal information raises unique privacy risks, and such businesses should be included in the definition of "information broker."

Further complicating this matter is the qualifier "whose business is to collect, assemble, or maintain personal information." Information brokerage is just a small percentage of the business of a company like Lexis-Nexis or even Choicepoint. Lexis-Nexis is a huge company; most of its information products have no bearing on privacy, such as the company's legal and scholarly research databases. According to Choicepoint, only about 11% of its operations consist of information brokerage outside the Fair Credit Reporting Act. Can it be said that Lexis-Nexis and Choicepoint are entities "whose business is to collect, assemble, or maintain personal information" for provision to third parties?

There have been many attempts to define an information broker, and thus far, we think the best is contained in S. 1332:

The term 'data broker' means a business entity which for monetary fees, dues, or on a cooperative nonprofit basis, regularly engages, in whole or in part, in the practice of collecting, transmitting, or otherwise providing personally identifiable information on a nationwide basis on more than 5,000 individuals who are not the customers or employees of the business entity or affiliate.

This definition limits the scope of the law to companies that regularly engage in maintaining large databases on non-customers for the purpose of providing them to a third party. It provides a good starting point for further discussion.

Congress should also consider giving the Commission rulemaking authority to address circumvention of this definition through corporate restructuring or technological tweaks. In passing the Fair and Accurate Credit Transactions Act, Congress included a provision that prohibits "technological circumvention" of the Fair Credit Reporting Act's provisions. The concern was that through database design or corporate reorganization, a consumer reporting agency may escape obligations to provide a free credit report. We think that a similar provision would be appropriate here to avoid a situation where a company simply reorganizes to avoid security or privacy responsibilities.

The definition of "personal information" in the Discussion Draft is narrower than the California law. Under the California law, personal information "means an individual's first name or first initial and last name in combination with…" a Social Security number, drivers license number, or account number. The Discussion Draft would require the individual's first and last name, instead of just the first initial. We think that the federal legislation should be as broad as the California definition in this regard.

We further recommend that section 5(5)(A)(iii) should be modified. That section treats an account number in combination with an access code as "personal information." As currently written, it gives credit card companies an out from giving notice by claiming that the three-digit security code on the card must be present for a breach to occur. That is, even though the three-digit code is not necessary to make charges, they will claim that a breach does not require notice unless that code is included in the compromised files. We accordingly recommend that this section be changed to:

"(iii) Financial account number, or a credit card number, or a debit card number in combination with any required security code."

Section 6 Effect on Other Laws

This section specifies that all state laws concerning breaches of security or notification to individuals of breaches of security would be preempted.

The preemption language in the Discussion Draft is overly broad; it risks unintentionally preempting many different state laws that address security, but are not the target of this law. Data security needs are too varied to accommodate a nationwide uniform standard. Floor preemption is more appropriate here.

In privacy and consumer protection law, federal ceiling preemption is an aberration. Historically, federal privacy laws have not preempted stronger state protections or enforcement efforts. Federal consumer protection and privacy laws, as a general matter, operate as regulatory baselines and do not prevent states from enacting and enforcing stronger state statutes. The Electronic Communications Privacy Act, the Right to Financial Privacy Act, the Cable Communications Privacy Act, the Video Privacy Protection Act, the Employee Polygraph Protection Act, the Telephone Consumer Protection Act, the Driver's Privacy Protection Act, and the Gramm-Leach-Bliley Act all allow states to craft protections that exceed federal law.13 Even the Fair Credit Reporting Act is largely not preemptive.14

Although the federal government has enacted privacy laws, most privacy legislation in the United States is enacted at the state level. Many states have privacy legislation on employment privacy (drug testing, background checks, employment records), Social Security Numbers, video rental data, credit reporting, cable television records, arrest and conviction records, student records, tax records, wiretapping, video surveillance, identity theft, library records, financial records, insurance records, privileges (relationships between individuals that entitle their communications to privacy), and medical records.15

Finally, the data industry is in a weak position to argue that it cannot comply with state laws. This is an industry that "segments" or groups people by characteristics at the zip+4 level. They know where you live now, and where you lived ten years ago. No other industry is better equipped to use technology to comply with state law than the data brokers.

Section 7 Effective Date and Sunset

This section specifies that the act will take effect a year after enactment, and sunset 10 years from enactment.

While Congress and the Commission should continue to revisit data security issues, security requirements and rights in personal information should not automatically sunset. We suggest striking the sunset provision.

Section 8 Authorization of Appropriations

This section would authorize a yet to be determined amount to the Commission. For reasons explained above, we support greater funding of the Commission.

Conclusion

Mr. Chairman and Members of the Committee, thank you for inviting me to testify on the Discussion Draft of Data Protection Legislation. The Discussion Draft is a good first step in addressing security risks presented both by ordinary companies and information brokers. We recommend that the Committee move the legislation, with reasonable enhancements, including an option for credit freeze, requirements that security measures include audit trails, and public reporting of security breaches to the Federal Trade Commission.


1 See Chris Jay Hoofnagle, Big Brother's Little Helpers: How ChoicePoint and Other Commercial Data Brokers Collect, Process, and Package Your Data for Law Enforcement, 29 N.C.J. Int'l L. & Com. Reg. 595 (Summer 2004), available at http://www.epic.org/privacy/choicepoint/cp_article.pdf.

2 EPIC Choicepoint Page, available at http://www.epic.org/privacy/choicepoint/.

3 Letter from Chris Jay Hoofnagle, Associate Director, EPIC, and Daniel J. Solove, Associate Professor, George Washington University Law School, to Federal Trade Commission, Dec. 16, 2004, available at http://www.epic.org/privacy/choicepoint/fcraltr12.16.04.html.

4 Privacy Rights Clearinghouse, A Chronology of Data Breaches Reported Since the ChoicePoint Incident, available at http://www.privacyrights.org/ar/ChronDataBreaches.htm (last visited Jul. 24, 2005).

5 Jonathan Krim, Banks Alert Customers of Data Theft, Washington Post, May 26, 2005, available at http://www.washingtonpost.com/wp-dyn/content/article/2005/05/25/AR2005052501777.html.

6 Kelly Martin, Hacker breaches T-Mobile systems, reads US Secret Service email and downloads candid shots of celebrities, SecurityFocus, Jan. 12, 2005.

7 David Lazarus, A tough lesson on medical privacy: Pakistani transcriber threatens UCSF over back pay, San Francisco Chronicle, Oct. 22, 2003, available at http://www.sfgate.com/article.cgi?file=/c/a/2003/10/22/MNGCO2FN8G1.DTL.

8 Chris Hoofnagle, Putting Identity Theft on Ice: Freezing Credit Reports to Prevent Lending to Impostors, in Securing Privacy in the Internet Age (Stanford University Press, forthcoming 2006), available at http://ssrn.com/abstract=650162.

9 US PIRG, State Breach and Freeze Laws, available at http://www.pirg.org/consumer/credit/statelaws.htm.

10 PrivacyActivism, Data Aggregators: A Study of Data Quality and Responsiveness, May 18, 2005, available at http://www.privacyactivism.org/Item/222.

11 Robert O'Harrow, Jr., No Place to Hide 71-72 (Free Press 2005); DOJ, Milford Man Pleads Guilty to Hacking Intrusion and Theft of Data Cost Company $5.8 Million, Dec. 18, 2003, available at http://www.usdoj.gov/criminal/cybercrime/baasPlea.htm; DOJ, Florida Man Charged with Breaking Into Acxiom Computer Records, Jul. 21, 2004, available at http://www.usdoj.gov/opa/pr/2004/July/04_crm_501.htm.

12 See generally Title 16 of the Code of Federal Regulations, available at http://www.access.gpo.gov/nara/cfr/waisidx_05/16cfrv1_05.html.

13 Respectively at 18 U.S.C. § 2510 et seq., 12 U.S.C. § 3401, 47 U.S.C. § 551(g), 18 U.S.C. § 2710(f), 29 U.S.C. § 2009, 47 U.S.C. § 227(e), 18 U.S.C. § 2721, and Pub. L. No. 106-102, §§ 507, 524 (1999).

14 See 15 U.S.C. § 1681t.

15 See generally Robert Ellis Smith, Compilation of State and Federal Privacy Laws (Privacy Journal 2002).

 

