
FTC Advisory Committee on Online Access and Security

Final Report - First Draft

03 May 2000


Table of Contents

Introduction

Section 1: Online Access

What is Access?
Type of Access
Defining Personal Information
Additional Considerations
Inferred and Derived Data
Non-unique data
Axis of consideration
Covered Entities
Ease of Access
Fees
Usability of the access and correction system
Privacy implications
Access Options
Access Option 1: Default to Consumer Access
Access Option 2: Total Access Approach
Access Option 3: Case-by-Case Approach (Including Sectoral Considerations)
Access Option 4: "Access for Correction"
Authentication
Ways of addressing the authentication problem
Account Subscribers
Cookies, identifiers, and partially personalized data

Section 2: Security

Competing Considerations in Computer Security
Directing Computer Security - Preliminary Considerations
Notice and Education
Notice
Consumer Education
Options for Setting Website Security Standards
Security Option 1 - Rely on Existing Remedies
Security Option 2 - Maintain a Security Program
Security Option 3 - Rely on Industry-Specific Security Standards
Security Option 4 - "Appropriate Under the Circumstances" Standard of Care
Security Option 5 - Sliding Scale of Security Standards
Security Recommendation
Other Considerations not Addressed
Wireless Technologies
Inter-Industry Data Sharing
Enforcement Options
Rely on Existing Enforcement Options
Third-Party Audit or Other Assurance Requirements
Create Express Private Cause of Action
Government Enforcement Program

Terms

Introduction

The purpose of the Advisory Committee on Online Access and Security ("ACOAS" or the "Advisory Committee") is to give advice and recommendations to the Federal Trade Commission ("FTC") concerning providing online consumers reasonable access to personal information collected from and about them by domestic commercial Web sites, and maintaining adequate security for that information.

In particular, the Charter of ACOAS directs that the Advisory Committee "will consider the parameters of reasonable access to personal information and adequate security and will present options for implementation of these information practices in a report to the Commission." (Charter of the Federal Trade Commission Advisory Committee on Online Access and Security "Charter," attached hereto as addendum A).

This is the final report of ACOAS. The Advisory Committee considered access and security as it relates to online information. Its work relates to the online world and should not be seen as a specific road map for off-line records.

A wide range of issues was discussed in four formal meetings of ACOAS and in numerous subcommittee working groups held outside the presence of any FTC official. All substantive proposals have been made available to members of ACOAS and to members of the public by being promptly posted on the FTC's Web site for ACOAS, http://www.ftc.gov/acoas.

The advice of this Advisory Committee and the options presented are in the context of implementation of Fair Information Practices by commercial Web sites. The Charter neither requested nor precluded suggestions for legislation or mandatory regulation. Access to private sector records, in the view of some on the Advisory Committee, is not yet appropriate for legislative recommendation. Others on the Advisory Committee believe that there should be immediate legislative implementation of some of the options. It is, therefore, not possible for this Committee to reach a consensus on legislative recommendations.

The Committee's charge was not to provide consensus options for legislation, mandatory regulation, or self-regulation. Rather, the Advisory Committee presents a range of options that have been identified as ways to implement the Fair Information Practice principles of access and security. The report is silent on whether these recommendations should be implemented voluntarily, by industry self-regulation, or by legislation. Each access option has some support from at least one Committee member but does not represent a majority position; no consensus was therefore reached on any access option. There was consensus on one security recommendation. Each option is accompanied by a brief discussion of its pros and cons.

To some on the Advisory Committee, the options identified here should be further examined, tested, and applied before they are enacted with the force of law. To others, these options, or at least some of them, provide a road map to legislative action.

The value of this report is that it reflects a review of the issues of access and security by a wide range of experts, practitioners, and advocates from all sides of the issue. It provides an analysis of the issues and an identification of options that, we hope, will be useful to the FTC in its continued efforts on privacy and the application of Fair Information Practice principles.

Section 1: Online Access

"Access" to personal data is frequently invoked as a fundamental part of any privacy program. But as the Advisory Committee's deliberations revealed, this apparently simple concept hides layers of complexity - and in many cases disagreement. This Section seeks to unpack the concept of access in a way that helps Web sites and policymakers understand the difficult questions that must be answered in fashioning an access policy.

We first identify the questions that must be answered in defining access - does it mean only an ability to review the data or does it include authority to challenge, modify, or even delete information?

We next ask another apparently simple question, "Access to what?" Businesses gather a wide variety of data from many sources. Sometimes information is provided by the individual consumer, sometimes by a third party. Sometimes it is derived by the business itself, using its own judgment or processes. Sometimes the data is imperfectly personalized - it relates to a computer that may or may not be used only by one person. How much of that data is covered by the access principle?

With those concepts in mind, the Committee lays out four illustrative options that show the many different ways in which the access principle could be implemented - ranging from "total access" to a narrow access option aimed at ensuring correction of important information.

Finally, this section addresses three additional questions that must be answered before any access policy can be implemented.

  • First, whom does the policy cover? Just the entity that gathered the data? Its corporate affiliates and agents? Or every company to which the data may have been passed?
  • Second, how easy should access be? In particular, should Web sites charge a fee to cover some or all of the cost of providing access? Is it fair to impose limits on multiple or duplicative access requests?
  • Third, how does a Web site know it is providing access to the right person? Giving access to the wrong person could turn a privacy policy into an anti-privacy policy. The final section of this report examines the difficulties and possible solutions inherent in trying to authenticate requests for access to personal data.

What is Access?

Access is the individual's ability to view, edit(1), and/or delete(2) his or her personally identifying information. The scope of access will vary across the access options put forth in the following sections and with other considerations, such as whether the website in question is a covered entity and the type of authentication deemed appropriate.

Both consumers and businesses have a shared interest in the provision of reasonable access to consumer personal information. Reasonable access benefits individuals, society and business due to the openness and accountability it helps to promote. If done properly, the provision of access can also help reduce the costs to businesses and consumers of improper decision-making due to poor data quality. Moreover, increased access may help promote consumer trust and deeper customer relationships, which benefit both consumers and businesses. However, the manner in which to provide access and to what degree access should be provided are complex questions given the numerous types of non-personally identifiable and personally identifiable information, the "sensitivity" of that information, the sources of that information, and the various costs and benefits associated with providing access.

The method by which access is provided should be consistent with the information's storage and use by the business. For example, if the business stores the information in online storage such that it is instantly available for use by the business (e.g., as part of an online transaction processing system or a web-based e-commerce system), then instantaneous online access should be provided to consumers via an appropriate online terminal (e.g., web browser, ATM, telephone voice response unit).

Should the ability to view, edit or correct data vary with the use of the data?

a) Yes; there is no need to provide access to, or editing or correction of, data that is not actively used for anything or is merely maintained for system integrity, troubleshooting, or auditing.

b) Yes; access, editing, and correction need only be allowed for data that is used to make important decisions, such as financial, medical, or employment decisions.

c) Yes; where the information is collected from a public record source, a credit reporting agency, or another entity that is responsible as the source of the information, edits and corrections should be directed to the source of the data. Access requests may also be directed to the source where required.

d) No; the consumer should have the right to access, edit, or correct any data collected and maintained about them, so long as the holder of the data can reasonably make it accessible.

Some members of the Committee thought that the use of the data should not be a factor in determining whether to grant a consumer the ability to view, edit, or correct data maintained about them. In their view, although the way the data is being used is an important consideration, relying on use is a slippery slope: data that is collected today and not used might be used in the future, and a use or decision that some consider unimportant, others might consider very important.

Should the provision of access be determined in terms of the type of data?

For purposes of this discussion we felt it was best to group data into three broad classes, namely:

a) Whatever data the company maintains.
 
b) All but inferred data, with the exception of inferred data handled under separate laws or regulations (e.g., credit or loan decisions).
 
c) Only physical contact information, online contact information, biometric identifiers, financial account identifiers, sensitive medical data, transactional data and image (or other information linked to these categories).

There is a case for not having to provide a customer access to inferred data, as this information may be the result of a proprietary model that gives the company a competitive advantage, e.g., an indicator of a customer's future purchase behavior. The only counter would be when the derived data is used to make a decision about the customer that would result in an important denial of services, e.g., the granting of a loan. However, it should be noted that consumers might be more interested in information that is derived about them than in the detailed information that was used to derive it in the first place.

With specific regard to correction, some Committee members believed that ascertaining whether inferences are right or wrong would be difficult and costly. Also, many inferences are not presumed by the inferrer to be correct; they are used to draw general conclusions rather than conclusions of fact, and therefore this category of information cannot practically be corrected by the consumer. Other Committee members believe this is information formulated about a consumer and used in ways that affect their interactions with businesses. These members believe consumers have a strong interest in being able, at the very least, to view all the information about them in the hands of businesses.

The costs of providing access to other types of information, such as click stream or log data, could be considerable. In addition, some of the above options would create substantial authentication hurdles.

There are costs and benefits to both businesses and consumers that must be considered here. Consumers face a higher cost in not having correct data for certain types of information (credit information vs. marketing information, for instance). Some Committee members believe that there is a benefit in providing access in general to all types of information held by all businesses, and these benefits must be weighed against the costs.

Type of Access

View
Consumers can view information to which they have access.
 
Edit
Consumers can edit information to which they have access that is not certified by the business or a third party.
 
The business should provide a process by which consumers can challenge the correctness of the certified information and request changes to the information. The business is not obligated to change information that it believes is correct per its own certification (e.g., the record of a purchase transaction) or the certification of a third party, but should provide a process by which disagreements concerning the correctness of the information can be arbitrated.
 
Delete
Consumers can delete consumer-contributed information.
 
The business should provide a process by which consumers can challenge the correctness or appropriateness of information from other sources and request deletion of the information. The business is not obligated to delete third-party-sourced or self-sourced information that it believes is correct and appropriate to retain, but should provide a process by which disagreements concerning the accuracy and appropriateness of the information can be arbitrated.

Means of Access

Access should be provided via a means appropriate for the type of information and consistent with its storage and use by the business. If the business stores the information in online storage such that it is instantly available for use by the business (e.g., as part of an online transaction processing system or a web-based e-commerce system), then instantaneous online access should be provided to consumers via an appropriate online terminal (e.g., web browser, ATM, telephone voice response unit).

If the business stores the information in storage for processing by batch processing systems (e.g., a batch billing system), then the information should be available to consumers via a frequently (e.g., once per week) scheduled batch process (e.g., a report run at regularly scheduled intervals and mailed to the consumer).

If the business stores the information in offline storage (e.g., magnetic tapes stored offsite), then the information should be available to consumers via an ad-hoc batch process (e.g., scheduled on demand).
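
The tiering described above can be summarized as a simple mapping from storage mode to access channel. The sketch below is purely illustrative; the function and category names are hypothetical and are not part of the Committee's text.

    from enum import Enum

    class Storage(Enum):
        ONLINE = "online"    # live transaction-processing or web-based e-commerce systems
        BATCH = "batch"      # data handled by scheduled batch jobs (e.g., batch billing)
        OFFLINE = "offline"  # archives such as magnetic tapes stored offsite

    def access_channel(storage: Storage) -> str:
        """Suggest an access method consistent with how the data is stored and used."""
        if storage is Storage.ONLINE:
            return "instantaneous online access (web browser, ATM, voice response unit)"
        if storage is Storage.BATCH:
            return "frequently scheduled batch report (e.g., weekly) mailed to the consumer"
        return "ad-hoc batch retrieval scheduled on demand"

    for tier in Storage:
        print(tier.value, "->", access_channel(tier))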

Defining Personal Information

An important first step in considering whether, how, and under what circumstances to provide individuals with access to information is defining the information at issue. Defining the term personal information is central to the task of considering various options for providing access. The Committee considered several approaches to defining the information to be considered personal information for the purpose of providing individuals access. The options below are listed in descending order from broad to narrow. As discussed in the access options, access to data covered by each of these definitions could still be limited under the "default" or "case by case" options due to mitigating circumstances. The options below illustrate several approaches to defining the scope of information under discussion. The charts provide for easy comparison between the various options. Green indicates that the information is included in the definition. Red indicates that the information is excluded from the definition.

Access should be provided to:

Information maintained by a business and attached to the individual or a proxy for the individual.

This definition includes all information regardless of the medium (online v. offline), method (passive v. active), or source (data subject v. third party) from which it is obtained. It covers information tied to traditional identifiers such as names and addresses, and it envisions the development of online identifiers - such as mobile device or other unique identifiers - that provide the same ability to collect information about particular individuals and use it to make decisions that affect the individual in the online environment. This would include global and local unique identifiers. The definition reflects the concepts that 1) information need not be unique to be considered capable of identifying an individual; and 2) the concept of "identifying" is rapidly changing in the online environment.

Type of Identifier:   Traditional | Non-traditional | Both
Medium of collection: Online | Offline | Both
Method of collection: Passive | Active | Both
Source of data:       Subject | Third-party | Both
Type of data:         Factual/Observed | Derived/Inferred | Both

Information maintained by a business about an individual that identifies him or her using a traditional identifier.

This definition includes all such information regardless of the medium (online v. offline), method (passive v. active), or source (data subject v. third party) from which it is obtained. It would provide access to all information tied to an email address or a physical address, but would not provide access to information tied to a unique numeric identifier in the absence of additional identifying information. For example, click stream data tied to a unique number would not meet this definition unless it was associated with a name, email address, or other traditional identifier.

Type of Identifier:   Traditional | Non-traditional | Both
Medium of collection: Online | Offline | Both
Method of collection: Passive | Active | Both
Source of data:       Subject | Third-party | Both
Type of data:         Factual/Observed | Derived/Inferred | Both

Information collected online about an individual that identifies him or her using a traditional identifier.

This definition further narrows option two by limiting the medium of collection to "online." Information collected by the organization through offline methods is not covered. However, data actively or passively collected online would be covered.

Type of Identifier:   Traditional | Non-traditional | Both
Medium of collection: Online | Offline | Both
Method of collection: Passive | Active | Both
Source of data:       Subject | Third-party | Both
Type of data:         Factual/Observed | Derived/Inferred | Both

Information collected online from an individual that identifies him or her using a traditional identifier.

This definition further narrows option three by limiting: 1) the medium of collection to "online"; and, 2) the source of collection to the individual. Information collected by the organization through offline methods is not covered. Information collected online or offline from a source other than the individual would not be covered. However, data actively or passively collected from the individual online would be covered.

Type of Identifier:   Traditional | Non-traditional | Both
Medium of collection: Online | Offline | Both
Method of collection: Passive | Active | Both
Source of data:       Subject | Third-party | Both
Type of data:         Factual/Observed | Derived/Inferred | Both

Additional Considerations

There are situations where providing individuals with access to information (depending on the definition selected above) conflicts with other interests. The Committee identified and considered several interests that in some members' opinions merit consideration in access determinations.

Inferred and Derived Data

Inferred or derived data is information that the business has not "collected," either passively or actively, about the individual, but rather has inferred from observed behavior. It is the assumptions or conclusions that a business makes about an individual, not the factual record of the individual's actions or behavior.

Briefly, inferred data can be defined as information gathered from sample data, not the data subject, that is calculated to result in a value applied to the data subject. Derived data can be defined as information gathered from the subject that is calculated to result in a value applied to the data subject.
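
To make the distinction concrete, the following is a purely illustrative sketch; the values and the scoring formula are invented for this example and are not drawn from the report.

    # Factual data collected from the data subject.
    purchases = [19.99, 42.50, 7.25]

    # Derived data: calculated from the subject's own data and applied to the subject.
    average_order_value = sum(purchases) / len(purchases)

    # Inferred data: a value applied to the subject using a model built from sample
    # data about other customers; the coefficients here are made up for illustration.
    def repeat_purchase_score(avg_order: float, order_count: int) -> float:
        return min(1.0, 0.10 + 0.002 * avg_order + 0.05 * order_count)

    score = repeat_purchase_score(average_order_value, len(purchases))
    print(round(average_order_value, 2), round(score, 2))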

The disclosure of inferred or derived data raises important considerations. Advocates of providing access to such data argue that it is used to make decisions about individuals and should therefore be available to individuals. Critics of providing access to such data argue that disclosing the assumptions or conclusions a business makes undermines competition by inviting competitors to attempt reverse engineering to unearth proprietary operations and allowing competitors to free-ride off the analytic work of rivals.

Non-unique data

If we accept the concept that information can be both personal to an individual and not unique to that individual, we must grapple with the privacy considerations of providing access to such data. In some instances we tolerate "over-disclosure": with telephone bills, for example, all calls made from a number are disclosed to the individual named on the account, even though in multi-person households this reveals the calls of other family members. While providing access to data associated with a computer or browser could be considered loosely analogous to the phone number situation, the information collected about an individual's use of the Web can on its face reveal more about the individual than the numbers dialed. Therefore, in the online environment it is important to consider the potential risks that access to such data poses to others' privacy, and to attempt to design access methods that promote privacy in all respects.

Axis of consideration

Identifier: Traditional | Non-traditional | Both
Medium:     Online | Offline | Both
Method:     Passive | Active | Both
Source:     Subject | Third-party | Both
Type:       Factual/Observed | Derived/Inferred | Both

Covered Entities

Should access obligations be extended to all parties with whom information has been shared?

The Committee felt that there were only three reasonable alternatives regarding which entities could be required to provide customers access to data maintained about them:

  • All entities that collect information from a data subject or receive that same information from the original data collector. This only applies to information that is actively maintained in a database of consumer information that can be linked/associated with individual consumers and/or consumer households.
  • The entity the consumer reasonably believes is the Data Collector and its agents (entities acting for the Data Collector and restricted in their use and transfer of the data). Notice of transfer to other third parties would be required, but access would not be required.
  • The data collector, its parents and subsidiaries, and recipients, including information intermediaries.

The Committee generally agreed that corporations should provide access to the data held by their agents (as defined above). However, several members of the Committee thought that managing other third parties would be unduly burdensome, and that consumers were better protected by requiring companies to provide notice of the parties with whom they will share information. Other members of the Committee held that relying on notice places an undue burden on consumers, who are unlikely to be aware of the existence of such third parties, let alone know how to contact those companies and exercise access.

Still other members of the Committee believed the issue depended on whether the parent and/or subsidiaries are using the information. If they are, then they should make it accessible and protect it; if not, then access is not required. With respect to "information intermediaries," it depends on how they treat and handle the data. If they use the information, view it, and permanently store it, then they should make it accessible and protect it. If not, then access is not required.

Is there an obligation to propagate corrections to incorrect data to other entities?

Whether or not third parties provide access to personal information, it is still possible for the original data collector - assuming it does provide the ability to access and correct data - to propagate any corrections made to those third parties. The Committee presented the following three options: (a) no obligation; (b) propagate when reasonable; (c) companies should always propagate corrections when passing information on to third parties that are receiving that information for the first time.

It would be desirable for a company when correcting errors to propagate these corrections to other entities, but it is recognized that the company may not be in a position to know all the entities that are currently maintaining related information about that individual, nor the state of that data (whether it has already been corrected or continues to be in error).

Should the access obligations of third parties vary with the use of the data?

With regards to different possible uses of data, the Committee puts forth the following four options:

(a) Yes; there is no need to provide access to, or editing or correction of, data that is not actively used for anything or is merely maintained for system integrity, troubleshooting, or auditing;

(b) Yes; access, editing, and correction need only be allowed for data that is used to make important decisions, such as financial, medical, or employment decisions;

(c) Yes; where the information is collected from a public record source, a credit reporting agency, or another entity that is responsible as the source of the information, edits and corrections should be directed to the source of the data. Access requests may also be directed to the source where required;

(d) No; the consumer should have the right to access, edit, or correct any data collected and maintained about them, so long as the holder of the data can reasonably make it accessible.

However, some members of the Committee thought that the use of the data should not be a factor in determining whether to grant a consumer the ability to view, edit, or correct data maintained about them. In their view, although the way the data is being used is an important consideration, relying on use is a slippery slope: data that is collected today and not used might be used in the future, and while some might consider a particular use or decision unimportant, others might consider it very important.

Should the access obligations of third parties vary with the type of data?

With regard to different possible types of data, the Committee puts forth the following three categories:

(a) Whatever data the company maintains;
 
(b) All but inferred data, with the exception of inferred data handled under separate laws or regulations (e.g., credit or loan decisions);
 
(c) Only physical contact information, online contact information, biometric identifiers, financial account identifiers, sensitive medical data, transactional data and image (or other information linked to these categories).

There is a case for not having to provide a customer access to inferred data, as this information may be the result of a proprietary model that gives the company a competitive advantage, e.g., an indicator of a customer's future purchase behavior. The costs of providing access to other types of information, such as click stream or log data, could be considerable. The only counter would be when the derived data is used to make a decision about the customer that would result in an important denial of services, e.g., the granting of a loan. However, it should be noted that consumers might be more interested in information that is derived about them than in the detailed information that was used to derive it in the first place.

Consumers face a higher cost in not having correct data for certain types of information (credit information vs. marketing information, for instance). Some Committee members believe that there is a benefit in providing access in general to all types of information held by all businesses, and these benefits must be weighed against the costs.

Ease of Access

Ease of access includes issues surrounding both whether access fees should be allowed, and the degree of effort required by the data access provider to ensure that the information can be easily accessed, understood and corrected by the consumer. It also includes non-economic costs of access, such as potential risks to privacy.

Fees

The Committee discussed whether businesses should charge consumers a fee for access. The range of options identified include:

  • Never charge any fee; no costs should be incurred by the consumer to access their information.
  • Selectively charge fees at nominal cost.
  • Fees commensurate with the type of data being accessed.
  • Fees commensurate with the use of the data being accessed.
  • Fees commensurate with the amount of data being accessed.
  • Fees commensurate with the frequency with which a user accesses the data.
  • Fees commensurate with the nature of the access requirement (e.g., if the customer wants real-time access to the data when normal access is not real-time, such as access normally provided within 24 hours).
  • The service provider is free to charge any reasonable fee, but the fee must be kept within specified ceilings and floors.
  • Always charge a fee.
  • Do not charge where an adverse decision was based upon the information, but charge a nominal fee, not greater than cost, for access in other cases.

Proponents would argue that charging consumers a fee for access would:

  • Allow businesses to recover (some of) the cost of providing access
  • Provide a means to shift the cost of providing access to the consumers who used access (rather than passing it through to all consumers indirectly through higher prices)
  • Provide a deterrent to frequent nuisance access requests

Opponents would argue that:

  • Any fee may unduly limit the ability of consumers to access their information
  • Any fee may lessen the attractiveness of accessing personal information
  • There should be no charge to consumers for reasonable access to information about them

Usability of the access and correction system

The effort required of the data access provider to ensure that information is easily accessible by consumers might include:

  • An interface that is easy to use, requiring no special training for a non-technical lay person; e.g., it should be no harder to use than any of the services provided by the service provider.
  • Information presented in a legible and intelligible manner (e.g., no difficult-to-decipher codes).
  • Reasonably available access and correction systems (e.g., 24-hour, 7-day-per-week availability).
  • Adequate notice given to the consumer of what information is available for access and how to access and correct this information.

Proponents would argue that:

  • Consumer access should be designed to facilitate access by lay persons

Opponents would argue that:

  • Businesses should not be required to create and maintain new and/or special capabilities to provide access

Privacy implications

Centralization

As many companies that hold personal information are part of a larger corporate entity that may possess other data through different subsidiaries, there is a concern that access to information held by the parent company may force centralization of previously separated information. In some cases, combining and centralizing information poses an increased threat to personal privacy.

Centralizing and linking personal information is not a purpose or goal of access. The most expansive interpretation of access should not have the indirect effect of creating a new file or record on an individual. While concerns about avoiding centralization must be heeded, they should not prevent parent companies from implementing procedures that ease the ability of consumers to access information. This can be accommodated by providing central points to serve consumers' access requests without actually centralizing the maintenance or storage of data. For example, parent companies could create a central page (or phone number, etc.) through which consumers could file requests to be processed by the various subsidiaries, or through which they can easily identify and make requests directly to the subsidiaries. Subsidiaries may have different pieces of personal information in their records. Even this simple integration of information might increase the vulnerability of an individual's information to compromise - e.g., a bad actor who can guess the password can get access to all of the customer's private information from one convenient location - and therefore must be accompanied by a risk assessment and installation of appropriate security. Also, such a central point may be difficult to manage for companies that regularly acquire and divest subsidiaries.(3)
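
For illustration only, the "central point, decentralized storage" idea might be sketched as follows; the subsidiary names and handler functions are hypothetical, and a real system would add the authentication and security controls discussed elsewhere in this report.

    # Hypothetical sketch: a parent company's central access point relays a consumer's
    # request to each subsidiary, which answers from its own records; no combined
    # customer file is created or stored at the parent.

    def retail_unit(request):
        return "retail subsidiary responds from its own records"

    def finance_unit(request):
        return "finance subsidiary responds from its own records"

    SUBSIDIARY_HANDLERS = {"retail": retail_unit, "finance": finance_unit}

    def route_access_request(consumer_request):
        """Fan the request out to subsidiaries without centralizing their data."""
        responses = {}
        for name, handler in SUBSIDIARY_HANDLERS.items():
            responses[name] = handler(consumer_request)
        return responses

    print(route_access_request({"account": "example-consumer"}))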

Authentication devices

The Committee wishes to emphasize the difference between authentication and identification. As we seek to provide individuals with access to personal information we must not move toward increased identification of individuals.

Maintaining the ability of individuals to be anonymous on the Internet is a critical component of privacy protection. Access systems should not require identification in all instances. Biometrics raise additional privacy concerns that must be explored and addressed. Finally, third-party authentication systems raise important privacy concerns by creating additional records of individuals' access requests. Inserting a third party into the relationship creates an additional opportunity (and at times a responsibility) to collect and maintain information about the individual's interactions. What policies govern these entities' use of personal information? On the other hand, third parties - intermediaries - can also play a role in the protection of identity. Currently, several companies have established themselves as intermediaries that act as protectors of identity and privacy between the individual and other entities.

Access Options

Access Option 1: Default to Consumer Access

The "default to consumer access" approach works from the presumption that consumers should have access to their personal information. The presumption of access is limited where considerations such as the privacy of another individual, the proprietary nature of certain information, or the cost of providing access outweigh the individual's interest in access.

The "default to consumer access" approach recognizes that consumer access to personal information serves multiple purposes, including but not limited to ensuring accuracy. By promoting openness, access promotes awareness of business information practices, aids the compliance process, and promotes greater trust between businesses and their customers. The openness about data collection promoted by the "default to consumer access" approach may increase consumer awareness of the trustworthiness and responsibility of the businesses that collect information about them, or it may lead consumers to call for more limited collection of information. In this regard, the "default to consumer access" approach may act to promote privacy-sensitive business practices by encouraging businesses to limit the information they collect about customers.

The over-arching principle of the "default to consumer access" approach is that:

Businesses should establish a mechanism whereby personal information maintained by a business that is retrievable in the ordinary course of business is made available to the individual.

There are exceptions to this rule. Information is not considered retrievable in the ordinary course of business unless providing access involves only steps that the business already takes on a regular basis with respect to the information, or steps that the organization is capable of taking with the procedures it uses on a regular basis. This limitation is designed to ensure that businesses need not build new systems that compile personal information solely to provide access. The creation of new systems that link personal information with transactional and other data that is not on its own tied to, or used to make decisions about, the individual would create additional risks to privacy. However, this means that some information that may be defined as personal information may be out of reach under this option.

In addition, personal information is not retrievable in the ordinary course of business if retrieval would impose an "unreasonable burden" on the business. This is a narrow, but important, exception to access. It allows for a purpose or cost-benefit analysis in rare situations where the ability to retrieve the information would be very costly or disruptive to the business and the information at issue is of marginal significance. It is here that the sensitivity of the data, the uses of the data, the purpose of the request, and similar factors could be considered. If an organization uses this exception to limit access, it must refer the individual to the provisions in its privacy notice that discuss its data collection, use, and consent/choice policies, or provide the individual with information equivalent to the privacy notice.

Proponents of this approach would argue:

This approach provides broad access rights and reasonably matches consumer expectations that they can access personal information collected about them.
 
Economic, proprietary, and other interests can be considered in decisions to limit access. However, these considerations occur within a framework that favors access, ensuring that they will not be misused or expanded.
 
It ensures the growth of electronic commerce by limiting access where it would be unreasonably burdensome due to cost or other considerations.
 
It protects the competitive marketplace by ensuring that businesses can protect proprietary information. This provides a framework for access that also limits the potential for competitors to use access to gain knowledge about internal processes.

Opponents of this approach would argue:

Creating exceptions to access demands that someone is vested with the authority to sanction denials of access. It is critical that such determinations be made in a fair, open, and accountable manner.
 
Although the rule may be very straightforward for the majority of situations, difficulties in determining whether or not a particular business falls within the exceptions would require businesses to become experts in this area. In this regard, the rule may make access unduly complex.
 
Because businesses would not be required to provide access unless personal information is "retrievable in the ordinary course of business," access rights could vary quite a bit from business to business, or across different types of businesses. Businesses may try to use nuances in the interpretation of "retrievable in the ordinary course of business" to avoid providing access. Potentially, a business could set up its data structures so that the data could be used to make decisions about consumers without being retrievable as a separate bit of information.
 
The proprietary exception may inappropriately limit access. Consumers may have a significant interest in seeing derived or inferred data. This data may be used to make decisions about consumers. Limiting access will undermine consumer awareness by limiting information about what is being communicated and the potential impact of this information.

Access Option 2: Total Access Approach

The "total access" approach works from the presumption that consumers would benefit from having access to all their PII in the possession of commercial websites and that there is no information that should remain off-limits or confidential. Not only does this verify the accuracy of that data, it also places the consumer in the position of knowing how his or her personal information is collected and used. In keeping with the purpose of providing consumers as much access as possible, businesses would provide initial access for free, while charging for repetitive access requests or terminating access upon unduly repetitive access requests.

The "total access" approach should be interpreted to only apply to existing records. For example, off-line information in the possession of a data collector but not yet linked or joined with online information would not be subject to access. In addition, creating more comprehensive records of individuals should not be done in order to establish "total access".

The over-arching principle of the "total access" approach is that:

Businesses should be completely transparent in their information collection and use practices.

Proponents would argue:

By providing greater access rights, businesses could increase the reliability and accuracy of data, could build consumer confidence and trust, could experience a public relations benefit, could make better decisions based on better data, could expand markets by giving consumers greater confidence in online privacy, and could experience greater efficiencies if they limit information collection to only what is necessary.
 
Consumers might experience an enriched understanding of data collection practices, increased confidence in the online environment, more control over the accuracy of personal information, the ability to identify inaccurate data before it harms them, the ability to make better privacy decisions in the marketplace (including decisions to protect anonymity), and the ability to better police businesses for compliance with any stated policies.

Opponents would argue:

For businesses, this approach would lead to a substantial increase in costs, including: any required modifications or new design requirements placed on existing systems, new storage costs, new personnel costs, new legal costs and losses due to the disclosure of internal practices and proprietary information. It also affects the confidentiality of procedures companies use to make decisions and assumptions about user data.
 
Consumers would also experience additional costs, such as: pass through costs for system upgrades, new personnel, etc., and potential opportunity costs of businesses not investing in new products.
 
Both businesses and consumers could be harmed by unauthorized access to a greater amount of information. Businesses may face a higher liability in this case and consumers may be risking more of their privacy.

Access Option 3: Case-by-Case Approach (Including Sectoral Considerations)

A third approach would be to treat different information differently depending on a calculation that takes into consideration, among other things, the content of the information, the holder of the information, the source of the information, and the likely use of the information. This approach is necessarily more complex, recognizing that whether access is appropriate depends on a variety of factors. Different sectors, record-keeping systems, and types of data raise different issues. The challenge, therefore, would be to develop an administrable set of rules.

Unlike the Default option that is premised on a presumption of access, under the case-by-case or sectoral approach there is no presumption for or against access. Rather, the access inquiry requires an analysis of the relevant factors. This approach would differ from others under consideration in that it most likely would not afford access to information that, if inaccurate, is unlikely to have an adverse effect. This determination will depend both upon the nature of the data in question (e.g., information regarding children) and the record-keeping system in question. On the other hand, it is clear that under this third approach, there would be categories of data to which access is more limited than in the other approaches. For example, inferred data, "non-factual data" or internal identifiers may be less accessible under this approach than under the other approaches.

Proponents would argue:

Flexibility: The approach affords greater flexibility than the Default or Total Access approaches.
 
Focusing on Need: By considering each type of data and industry sector on its merits, the approach may more accurately balance factors bearing on whether to provide access.
 
Reflecting Expectations: This approach may more realistically address the expectations of both consumers and businesses, because few consumers are interested in obtaining access to information that will not make a material difference in their lives.
 
Fair Apportioning of Costs: In circumstances in which consumer access fees do not fully cover the costs of providing access, this approach would reduce costs to consumers and more fairly apportion the costs of access. Specifically, it would avoid the problem, present under a broad-based approach that encourages access to most data, of forcing consumers uninterested in obtaining access to bear the costs of creating the access infrastructure.

Opponents would argue:

Administrability: The approach may involve far too many factors to allow a comprehensible set of rules to emerge. Moreover, many of the factors, e.g., sensitivity, are difficult to assess objectively.
 
Inconsistency: The complexity of this approach may yield inconsistent results if different decision makers are assessing issues such as sensitivity.
 
Reduced Consumer Education: This approach does not recognize that the predominant purpose of providing access is to inform consumers of what information is "out there" about them.

Although an approach involving a default access rule would be easier to apply, it may not reflect the real purposes behind providing access. As a substantial number of participants in the larger committee meetings and this subgroup indicated, the purpose for providing access may be more limited than promoting consumer awareness. Access itself may not enhance "consumer privacy" per se, but rather ensure the accuracy of data and protect against adverse decisions based upon incorrect data. Indeed, this notion of access also is consistent with the Online Privacy Alliance Guidelines, which address access to help assure the accuracy of information.

In fact, the purpose may be as limited as providing consumers an opportunity to correct erroneous data, and not to provide consumers an opportunity simply to know what's out there. A sectoral or case-by-case approach also may allow a more precise weighing of whether access to a particular type of data is warranted, in light of the nature of the data and the sector involved, the consumer's reasonable expectations about the data, and the costs of providing access. As with all options, the cost of providing access is a consideration that must be factored into the analysis. As the purpose for the data use becomes more significant, however, cost may become less of a factor.

This approach is consistent with U.S. privacy laws, which adopt a sector-by-sector approach to data privacy based upon the sensitivity of the information and whether it is likely to be used in a way that could adversely affect the data subject. For example, access and correction are the norm with regard to financial information used to make credit granting and employment decisions (under the Fair Credit Reporting Act), and medical record information.

In addition, according to the U.S. Privacy Protection Study Commission, which addressed access issues in the mid-1970s, decisions about access to information are unique to the particular information system.

This approach would assign different access rights to different sectors or types of data. Depending on the number of factors in the calculation, the permutations could be extensive. This approach would afford access to all sensitive data such as financial information, health information, or information relating to children, and other data in sectors of the economy that may affect individuals in a materially adverse way if it is inaccurate. In these instances, it would yield the same result as the Default Rule and Total Access approaches.

Access Option 4: "Access for Correction"

This option takes a relatively narrow view of what constitutes "reasonable access." It is at the opposite end of the spectrum from the "total access" approach of Option 2.

The approach begins by asking why access to personal data is important to consumers. One reason for allowing access - correcting errors - is of interest to both the individual and the Web site. If the Web site uses personal data to grant or deny some significant benefit to consumers, then errors in the Web site's files could cause real harm to the consumer. Giving the consumer access to the data allows the consumer to challenge or correct errors. Both the consumer and the Web site have an interest in the accuracy of such data, so allowing access and correction helps both parties. Thus, even if allowing access increases the Web site's costs, as it often will, and even if the costs cannot be passed on to consumers, the Web site itself will get some benefit from access designed to improve the accuracy of important data.

Error-correction is not the only justification for allowing access. A second justification goes under a variety of headings - "accountability," or "education," or even "consciousness-raising." These labels reflect a view that, if consumers can see the files maintained about them, Web sites will be more cautious about gathering sensitive or unnecessary information. An even more sweeping justification for access assumes that individuals retain some rights to data about themselves, even in the hands of others - "That's my data, and I have a right to see it".

In these justifications, the interests of Web site and consumer are no longer aligned. Maintaining an access system has costs, and the Web site gets relatively little benefit from these costs. While the Web site may benefit in a general way from consumer education, it can educate consumers less expensively by providing a detailed notice about what data it collects, rather than maintaining an individual access system.

The "access for correction" option treats correction as the touchstone for defining reasonable access. While the other justifications have force, this option treats them as insufficient to outweigh the costs of access - in money for the Web site and in privacy risks for consumers. Under this option, a Web site would grant access to personal data in its files only after answering two questions in the affirmative: (1) Does the Web site use personal data to grant or deny significant benefits to an individual? (2) Will granting access improve the accuracy of the data in a way that justifies the costs?

The first question resolves many of the issues that are more difficult to resolve under the other options presented here. Examples of information that is used to grant or deny significant benefits include credit reports, financial qualifications, and medical records. By focusing on whether the Web site uses personal data to grant or deny a significant benefit, the approach would largely exclude personal data that is not tied closely to an individual, because it is unlikely (though not impossible) that significant benefits will be granted anonymously. Similarly, the approach calls for access only to information that is collected and retrieved in the ordinary course of business. With some qualifications set out below, however, the approach would allow access to information that has been provided by a third party, as long as the information is used to grant or deny significant benefits.

The second question - whether allowing access to correct errors justifies the cost - raises a variety of possible exceptions to access. Inferred data, such as judgments made about the consumer by third parties or Web site employees or even expert information systems, are not usually susceptible to direct correction, although the underlying data can be corrected. Access is not justified if it would reveal trade secrets or would compromise the confidential communications of Web site employees or third parties. In general, as the likelihood of improving the accuracy of personal data declines and the cost of providing secure access increases, the cost-effectiveness of access also declines and the case for access grows weaker.

Proponents of this option would argue:

1) The most obvious goal served by allowing access to personal data is correction of erroneous data that may be used to make important decisions about an individual. The access principle has typically been implemented in contexts where correction of data is essential - such as credit reports and other instances where errors have a direct impact on the individual and where those errors can be reduced by allowing access.
 
2) The other goals advanced for access -- education, accountability, consciousness-raising -- are better served by consumer notice than by an expensive and little-used access system. These reasons for access do not justify the costs of providing access, which include not just time and expense but, in some circumstances, a very real risk to privacy if access itself results in the compromise of personal data. Rather than pay those costs and take those risks for a large body of mostly insignificant data, we should concentrate on providing access to data that is important and correctable.
 
3) There is no compelling reason to provide access to uncorrectable data, unless the real goal is to raise the cost of maintaining personal data so high that Web sites just give up and stop gathering the information. Those who want to restrict information gathering should argue for that goal explicitly and not try to achieve it indirectly through unnecessarily broadening the access principle.

Opponents of this option would argue:

1) Access serves broader purposes of openness, accountability, and fairness. This option reduces the principle of access to a single purpose - correction of errors.
 
2) Access is necessary for informed consumer choice. Only broad access will create the kind of accountability that is required among Web sites. Once consumers truly know what Web sites are collecting about them, they will force those sites to adopt responsible data collection policies.
 
3) Industry should bear the costs of handling data responsibly. If the cost of providing access outweighs the value of the information to a Web site, the site should revisit its information-gathering practices. Cost-benefit analyses of this type are good for privacy and for businesses. Just as industries that can't afford new pollution controls close their old factories, businesses that are incapable of handling information responsibly should close their doors.

Authentication

Complex as the access issue may seem at this point in our report, it is about to become even more so. For unlike the other Fair Information Practice principles, the access principle sometimes pits privacy against privacy.

Simply stated, the problem is this - On the one hand, privacy is enhanced if consumers can freely access the information that commercial Web sites have gathered about them. On the other hand, privacy can be destroyed if access is granted to the wrong person - an investigator making a pretext call, a con man engaged in identity theft, or, in some instances, one family member in conflict with another.

How can consumers get the benefits of access to their personal data without running the risk that others will also gain access to that data? The answer is to employ techniques that adequately authenticate consumers - that is, techniques that provide some proof of the consumer's identity or authority before the consumer is given access to personal data.

But how much proof is enough? If the consumer must produce three picture IDs, privacy will be protected, but access will be difficult. If the standard of proof is set too low, access will be encouraged, but the risk of compromise will grow. This section of our report attempts to illustrate the ways in which these competing interests can be addressed. In the end, it will be clear that there is no single answer to the dilemma described above. The proper level of authentication depends on the circumstances.

To take one example, the level of authentication may depend upon whether the consumer will simply view the information or will correct or amend it as well. Allowing the wrong individual to view someone else's data is a violation of privacy - and may lead to additional harm ranging from embarrassment to loss of employment - but allowing the wrong person to "correct" that personal information can result in devastating consequences. For example, some criminals have gained access to an individual's credit card accounts by filling out a change of address card with the post office and diverting the individual's credit card statements to another location. With access to the individual's bank statements and credit card bills the crook has ample information to impersonate the victim. (The Postal Service has recently initiated changes to make this more difficult.(4)) For this reason, where correction or amendment is provided, an audit trail should be maintained to aid in identifying potential problems.
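
As a purely illustrative sketch of such an audit trail (the record fields and file format are assumptions, not recommendations from the report), each correction or amendment could be written to an append-only log:

    import json
    import time

    def log_correction(audit_file, account_id, field, old_value, new_value, authenticated_by):
        """Append one record per correction or amendment to an append-only audit log."""
        record = {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "account_id": account_id,
            "field": field,
            "old_value": old_value,
            "new_value": new_value,
            "authenticated_by": authenticated_by,  # how the requester was authenticated
        }
        with open(audit_file, "a") as f:
            f.write(json.dumps(record) + "\n")

    log_correction("audit.log", "example-123", "mailing_address",
                   "10 Old St", "22 New Ave", "password plus emailed confirmation code")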

In judging the proper level of authentication, it is necessary to bear in mind that the risk of liability will heavily influence the Web site's choices. Sooner or later, whatever level is set, the Web site's authentication requirements will turn out to be either too strict or not strict enough. A business runs a risk of liability if it allows the wrong person to access personal information. Although it is not clear what specific remedy an individual might have under existing law, the lack of certainty regarding liability presents a problem for both individuals and businesses. If strict liability is placed upon businesses, they may raise the barrier to access very high, burdening individuals' access in an effort to avoid liability. On the other hand, if existing legal remedies do not provide sufficient penalties for inappropriate access, individuals' privacy may suffer. How to strike an appropriate balance that spurs good practices, encourages the deployment of robust authentication devices, and does not overly burden access is the question.

Another question that must be asked is how the effort to achieve strong authentication will affect anonymity on the Internet. Anonymity is a feature of modern life and one we have all relied upon to provide a level of privacy. Pseudonymity (the use of a pseudonym such as a "screen name") is a step away from anonymity but provides important privacy protections for identification information in the online environment. Authentication is in some cases the opposite of anonymity. It is important to preserve both anonymity and pseudonymity in our quest for authentication tools and procedures. Technologies such as biometrics might provide very strong authentication, but at a cost in terms of anonymity. Similarly, relying on data held by a third party to authenticate a consumer may provide that third party with even more details about the consumer than before. This means that access and authentication raise questions about the privacy policies of third parties that may be involved in the authentication process. At the same time, third parties - intermediaries - can also play a role in the protection of identity. Currently several companies have established themselves as intermediaries protecting identity and privacy between the individual and other entities.

Ways of addressing the authentication problem

So, how can Web sites choose an authentication policy? There is no one right answer. In this section we look at two case studies to identify ways in which commercial Web sites might strike a balance in addressing the authentication problem. Often, the solutions chosen will depend on the Web site's relationship with the consumer, as well as the kind of data to which access is provided.

Account Subscribers

Perhaps the fewest difficulties arise where a subscriber establishes an account with a Web site. In many cases, the individual may be given access to information about his or her account if he or she simply provides the information that was required to establish and secure the account. But relying on information such as name, address and phone number to authenticate the identity and authorization of an account holder is risky because the information is so widely available. In fact, many of the most common "shared secrets" (such as social security numbers or mother's maiden name) have been compromised by widespread use.

For this reason, it is common practice both offline and online to require some additional piece of information that is thought to be more difficult to compromise. Many businesses require individuals to use a shared secret (password) to access an account.

Even a password requirement, with all its inconveniences and costs, suffers from serious security flaws. Many consumers use the same password at multiple places, or leave themselves reminders on yellow stickies, or use obvious passwords (e.g., "password") - all of which compromise the integrity of the authentication system. Authenticating identity has become a far more complex endeavor than it once was.

Where an account requires a password, it would be appropriate to provide access to the data when the requester claims to be the account holder, has the password, and presents some verifiable information about recent account activity. Such an approach would provide a layered method of authentication, but preserve the privacy offered by the initial account.
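As a concrete, hedged illustration, the sketch below (Python) checks both a stored password and the requester's knowledge of recent account activity before granting access; the record layout and function names are hypothetical and are not drawn from this report.

    import hashlib
    import hmac
    import secrets

    def make_credential_record(password, recent_events):
        # Store only a salted hash of the password, plus a list of recent
        # account events (e.g., approximate dates of recent orders).
        salt = secrets.token_bytes(16)
        pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return {"salt": salt, "pw_hash": pw_hash, "recent_events": list(recent_events)}

    def verify_access_request(stored, supplied_password, claimed_recent_activity):
        # Grant access only if the password matches AND the requester can
        # name a recent account event known to the site.
        candidate = hashlib.pbkdf2_hmac(
            "sha256", supplied_password.encode(), stored["salt"], 100_000
        )
        password_ok = hmac.compare_digest(candidate, stored["pw_hash"])
        activity_ok = claimed_recent_activity in stored["recent_events"]
        return password_ok and activity_ok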

The Committee discussed the feasibility of using authentication devices as a method for obtaining consumer access to personal data. Some Committee members expressed concern that "perfect" authentication tools may be prohibitively expensive or too cumbersome for widespread use. However, the Committee did hear from authentication vendors who said that a wide range of authentication solutions are available today that solve the password 'problem' described above. These solutions take the form of hardware tokens that are as easy to use as an ATM card or software tokens that can be downloaded easily to a PC, PDA or cell phone. Even so, the risks associated with misuse and misappropriation of such devices remain.

Case study
 
A subscriber opens an email account with a free mail service. Establishing the email account does not require the subscriber to disclose personal information. The subscriber is assigned an email address and asked to establish a password to protect the account. If the subscriber requests access to personal information held by the service, how should the service determine whether to authorize access? What level of authentication should be required?
 
Options
 
a. Require the same information for access (account name and password). This approach errs on the side of ease of use for the account holder. But in doing so it relies upon one token (the account name), which is frequently shared with others (it may be the email address, for example), and another token (the password), which is (as our discussions indicate) relatively easy to compromise.
 
b. Require the account name, the password, and information about recent account activity. This method adds some protection against unauthorized access. By asking for the account name (an identifier), the password (something the account holder knows), and recent account activity (something the account holder knows that is dynamic and unlikely to be known or discernible by others), it adds an additional layer of protection.
 
c. Require either of the above sets of information and send the requested information to the account.
 
d. Require either of the above sets of information.
 
e. Require that the request for access be made through the account, and send to the account a one-time access code. This approach would build in an additional precaution against unauthorized use. By requiring the request to come from the account (similar to credit card authorization that must come from the registered phone of the account holder) and returning a one-time access key to the account the system could further limit unauthorized access. This feature might cause a minor delay, but it does not require the individual to remember additional pieces of information.
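To illustrate option e above, the following is a minimal Python sketch of issuing and redeeming a one-time access code delivered to the account itself; the in-memory store, expiry window, and delivery callback are illustrative assumptions only.

    import secrets
    import time

    # Hypothetical in-memory store of outstanding codes; a real site would
    # persist and rate-limit these. Keys are account names; values are
    # (code, expiry time in seconds since the epoch).
    PENDING_CODES = {}

    def issue_access_code(account, send_to_account):
        # Generate a short-lived, single-use code and deliver it to the
        # account (for example, by email to the address on file).
        # 'send_to_account' is a hypothetical delivery callback.
        code = secrets.token_urlsafe(8)
        PENDING_CODES[account] = (code, time.time() + 600)  # valid 10 minutes
        send_to_account(account, "Your one-time access code is: " + code)

    def redeem_access_code(account, supplied_code):
        # Allow access only if the code matches, has not expired, and has
        # not been used before.
        code, expires = PENDING_CODES.get(account, (None, 0.0))
        if code is None or time.time() > expires:
            return False
        if not secrets.compare_digest(code, supplied_code):
            return False
        del PENDING_CODES[account]  # single use
        return True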

Cookies, identifiers, and partially personalized data

A harder authentication problem arises if the Web site seeks to provide access to data that is not tied to an individual subscriber's account. Sometimes, data is gathered about the activities of a particular computer, through the use of cookies or other unique identifiers. But such data may be only partially personalized -- the computer may have more than one user. The consequences of disclosing information about an individual's use of a Web site or click stream data to another person (family member, co-worker, other) could be damaging.

In such circumstances, how can a service authenticate that the individual is the person to whom the data relates? Can Web sites provide access in a fashion that reflects the potential adverse consequences of disclosing information to someone other than the subject of that information? Should the level of access authorized be lowered due to the complexities of connecting the user to the data? Are there other policies that would address the privacy interest and have a lower risk of unintentionally disclosing data to the wrong individual? Does this concern vary from Web site to Web site?

Again, there is no single answer to these questions, as our case study shows.

Case study:
 
A Web site assigns each visitor a unique identifier - a cookie - that is used to track and retain data about the visitor's activities at the site. The Web site does not request or gather information about specific visitors' identities. A visitor requests access to information that the Web site has about her use of the site. How should the Web site proceed?
 
Options
 
a. Require only the identifier (the cookie). This would make it quite easy for the user to get access to personal data; but if the identifier is tied to an imperfect proxy for the individual (such as a computer) it is possible that other individuals may gain access to the individual's personal information. If a cookie attached to a browser by a specific Web site were used to provide access, it could allow all members of a family, or other group, who share a computer to access each other's information. We tolerate this "over disclosure" in certain cases, such as telephone calls, where we disclose all calls made from a number back to the individual named on the account despite the fact that in households with multiple members this discloses other family members' calls. However, in the online environment over disclosure could be more damaging because the information collected about an individual's use of the Web can on its face reveal more about the individual. For these reasons the identifier alone may be insufficient to grant access in many situations. But there may be instances where the identifier alone is sufficient proof of account ownership to grant access. For example, if the Web site is a general interest site that only retains information about how often a visitor returns, providing access to someone other than the person who visited may raise little concern. But if the Web site is focused on a specific disease, then providing any information to the wrong person, even information about number of visits, could be quite harmful.
 
b. Require the individual to open an account and allow access to data collected from this point forward. This certainly sounds more secure, but it may not limit inappropriate access. For example, if the account is browser-based and there are several individuals who use the browser, this option would allow one individual to access all the data and prevent the others from accessing any.
 
c. Require the identifier but limit the scope of access. This option acknowledges the risk of inappropriate access and it seeks to mitigate the harm by limiting the information provided. For example, a Web site could provide categories of information it has collected rather than the actual information. (Note that at this point the Web site is providing something more like notice than access.) In some instances, disclosing to the wrong individual the mere fact that a Web site has information tied to a unique identifier could be harmful. For example, if the Web site's subject is a sensitive or revealing topic the mere fact that it has any information tied to the identifier could cause damage in the wrong hands.
 
d. Delete the file and commit not to collect additional data. This option acknowledges the risk of inappropriate access and seeks to provide for the individual's privacy interest in another fashion. This solution is something other than access. In essence, it protects the user's general privacy interest by deleting the information instead of providing access. Even this solution is not risk-free, however. It could allow someone other than the subject of the data to have the data deleted.
 
e. Disassociate the data from the identifier and use it for statistical, aggregate or other non-individually based purposes. This option, too, provides something other than access. Unlike option d above, it recognizes the site's commercial interest in utilizing the data - though in non-identified or anonymous form. Again, while this option does not directly serve the individual's access interest, it does protect the consumer's general privacy interest by removing information that connects the data to them. (A sketch of this approach follows after this list.)
 
f. Require no identifier but provide only a general description of the kinds of data collected. This solution also provides notice rather than access; it provides notice of the kinds of data that the site gathers. It errs on the side of limiting the impact of inappropriate disclosures and acknowledges that even the fact that a browser has an identifier associated with a specific site or service could in some circumstances be revealing and potentially harmful.
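As a hedged illustration of option e above, the short Python sketch below strips the unique identifier from clickstream records and retains only aggregate page counts; the record layout is a hypothetical assumption used for illustration.

    from collections import Counter

    def disassociate_and_aggregate(clickstream_records):
        # Each record is assumed to look like
        # {"visitor_id": "abc123", "page": "/products", "timestamp": ...}.
        # The returned summary contains no visitor identifiers at all.
        page_counts = Counter(record["page"] for record in clickstream_records)
        return dict(page_counts)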

Section 2: Security

The Advisory Committee examined how to ensure the security of personal data held by commercial Websites. This section first describes competing considerations in computer security. After then looking at some possibilities for regulating computer security in online systems, it discusses the importance of notice and education as supplements to standards for protecting personal data. It presents competing options for setting Website security standards and recommends a specific solution to protect the security of personal data.

Competing Considerations in Computer Security

Most consumers - and most companies - would expect companies that collect and hold personal data to provide security for that data. But security, particularly computer security, is difficult to define in a regulatory or quasi-regulatory context. Identifying the most effective and efficient solution for data security is a difficult task. Security is application-specific and process-specific. Different types of data warrant different levels of protection.

Security - and the resulting protection for personal data - can be set at almost any level depending on the costs one is willing to incur, not only in dollars but in inconvenience for users and administrators of the system. Security is contextual: to achieve appropriate security, security professionals typically vary the level of protection based on the value of the information on the systems, the cost of particular security measures, and the costs of a security failure in terms of both liability and public confidence.

To complicate matters, both computer systems and methods of violating computer security are evolving at a rapid clip, with the result that computer security is more a process than a state. Security that was adequate yesterday is inadequate today. Anyone who sets detailed computer security standards - whether for a company, an industry, or a government body - must be prepared to revisit and revise those standards on a constant basis.

When companies address this problem, they should develop a program that is a continuous life cycle designed to meet the needs of the particular organization or industry. The cycle should begin with an assessment of risk and continue with the establishment and implementation of a security architecture, policies, and procedures based on the identified risk; training programs; regular audit and continuous monitoring; and periodic reassessment of risk. These essential elements can be designed to meet the unique requirements of organizations regardless of size.
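As a hedged illustration of the first step of this cycle, the Python sketch below scores risk per category of data and maps the score to a protection level; the categories, weighting, and thresholds are illustrative assumptions, not guidance from the Advisory Committee.

    def assess_risk(data_inventory):
        # data_inventory is assumed to be a list of dicts such as
        # {"category": "email address", "sensitivity": 2, "volume": 3, "exposure": 2},
        # where each factor is rated from 1 (low) to 3 (high).
        plan = {}
        for item in data_inventory:
            score = item["sensitivity"] * item["volume"] * item["exposure"]
            if score >= 18:
                plan[item["category"]] = "strong controls (encryption, strict access limits, audit trail)"
            elif score >= 8:
                plan[item["category"]] = "standard controls (passwords, access limits, monitoring)"
            else:
                plan[item["category"]] = "baseline operational safeguards"
        return plan

The output of such an assessment would then drive the implementation, training, audit, and reassessment phases described above.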

In our advice to the FTC, we attempt to reflect this understanding of security. Our work, and this report, reflect the various types of on-line commercial sites and the fact that they have different security needs, different resources, and different relationships with consumers. The report seeks to identify the range of possibilities for balancing the sometimes-competing considerations of security, cost, and privacy.

Directing Computer Security - Preliminary Considerations

Before turning to the options it is worthwhile to comment on several issues that the Committee considered but did not incorporate directly into its list of options.

First, we considered whether guidelines or regulations on security should contain some specific provision easing their application to smaller, start-up companies or newcomers to the online environment, but we ultimately determined that new entrants should not receive special treatment when it comes to security standards. In part, this is because organizations that collect personal data have an obligation to protect that data regardless of their size. In part, this is because we concluded that any risk assessment conducted to evaluate security needs should take into account the size of the company (or, more appropriately, the size of a company's potential exposure to security breaches). In many cases (but not all), a smaller Website or less well-established company will have fewer customers, less data to secure, and less need for heavy security. A smaller site may also have an easier time monitoring its exposure manually and informally. And of course, even a small site may obtain security services by careful outsourcing.

Second, we noted that several of the proposed options depend on or would be greatly advanced by inter-industry cooperation and consultation on appropriate and feasible security standards. Often, there are significant barriers to sharing information about adverse events, including fears of antitrust actions and liability exposure. In the past, the government's willingness to provide clarity on antitrust rules has allowed useful cooperation among firms, and similar guidance encouraging industry members to cooperate in the development or enforcement of security standards and procedures without fear of antitrust liability would be helpful here.

Third, it is vital to keep in mind that companies need to protect against internal as well as external threats when considering solutions designed to secure customers' personal data. Many companies have already implemented information security policies that protect sensitive corporate data (i.e., compensation information) by limiting access to only those employees with a "need to know." Companies need to implement similar measures that protect customer data from unauthorized access, modification or theft. At the same time, mandated internal security measures can pose difficult issues. For example, it is not easy to define "unauthorized" employee access; not every company has or needs rules about which employees have authority over computer or other data systems. And many companies that have such rules amend them simply by changing their practices rather than rewriting the "rule book." Even more troubling is the possibility that internal security requirements that are driven by a fear of liability could easily become draconian - including background checks, drug testing, even polygraphs. We should not without serious consideration encourage measures that improve the privacy of consumers by reducing the privacy of employees.

Fourth, we are concerned about the risks of regulation based on a broad definition of "integrity." Some concepts of security - and some legal definitions - call for network owners to preserve the "integrity" of data. Data is typically defined as having integrity if it has not been "corrupted either maliciously or accidentally" [Computer Security Basics (O'Reilly & Associates, Inc., 1991)] or has not been "subject to unauthorized or unexpected changes" [Issue Update on Information Security and Privacy in Network Environments (Office of Technology Assessment, 1995, US GPO)]. These definitions, issued in the context of computer security rather than legal enforcement, pose problems when translated into a legal mandate. If integrity were read narrowly, as a legal matter, it would focus on whether a Website has some form of protection against malicious corruption of its data by external or internal sources. If the definition is read broadly, it could lead to liability for data entry errors or other accidental distortions to the private personal information it maintains.
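A hedged illustration of the narrower, technical sense of integrity is the checksum sketch below (Python): a site can record a fingerprint of a stored record and later detect unexpected changes, whether malicious or accidental. The function names are hypothetical, and nothing here speaks to the legal questions raised above.

    import hashlib

    def fingerprint(record_bytes):
        # Compute a checksum of a stored record at a known-good point in time.
        return hashlib.sha256(record_bytes).hexdigest()

    def has_unexpected_change(record_bytes, saved_fingerprint):
        # Re-compute and compare; a mismatch signals corruption or an
        # unauthorized or unexpected change.
        return fingerprint(record_bytes) != saved_fingerprint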

Authentication and authorization controls for access to information are integral parts of system security. To establish appropriate authentication and authorization, businesses must consider the value of the information on their systems to both themselves and the individuals to whom it relates, the cost of particular security measures, the risk of inside abuse and outside intrusion, and the cost of a security failure in terms of both liability and public confidence. This discussion of security pertains both to information in transit and information in storage.

Notice and Education

After considerable discussion, the Advisory Committee has developed a wide range of possible options for setting standards for protecting personal data held by commercial Websites. Before presenting these options, we will address two policy options that the group considered but determined were unsatisfactory on their own. While insufficient standing alone, consumer education programs on security issues and a requirement that companies post notices describing their security measures are, in the Advisory Committee's view, approaches that should be examined as possible supplements to some of the options in the Security Options below.

Notice

Notice is viewed as an appropriate tool for informing individuals about the information practices of businesses. It is critical to the consumer's ability to make informed choices in the marketplace about a company's data practices. In the area of security, as in the area of privacy, there is not necessarily a meaningful correlation between the presence or absence of a security notice statement and the true quality of a Website's actual security. A security notice would be more useful if it allowed consumers to compare security among sites in an understandable way. But since it is difficult to convey any useful information in a short statement dealing with a subject as complex as the nuts and bolts of security, most such notices would be confusing and convey little to the average consumer. Further, providing too many technical details about security in a security notice could serve as an invitation to hackers. As was discussed at some length by the Advisory Committee, these considerations also mean that it is not possible to judge the adequacy of security at Websites by performing a "sweep" that focuses on the presence or absence of notices.

Notice is important in triggering one of the few enforcement mechanisms available under existing law. If a posted notice states a policy at variance with the organization's practices, the FTC may exercise its enforcement powers by finding the organization liable for deceptive trade practices. But security notices are ineffective standing alone. At the same time, we believe that they could be useful in conjunction with one of the other options discussed below under "Options for Setting Website Security Standards." The form such notice should take will vary depending upon the option selected.

Consumer Education

In addition to notice, consumer education campaigns are also useful to alert consumers about security issues, including how to assess the security of a commercial site and the role of the consumer in assuring good security. Regardless of what security solutions the FTC decides to recommend, it would be extremely valuable for the FTC, industry associations, state attorneys general, and others to sponsor consumer education campaigns aimed at informing Internet users about what to look for in evaluating a company's security. In addition, no system is secure against the negligence of users, so consumers must be educated to take steps on their own to protect the security of their personal data.

Options for Setting Website Security Standards

The Advisory Committee has identified two sets of options for those seeking to set security standards. In essence, these options address two questions: How should security standards be defined? And how should they be enforced?

The question of how security standards should be defined requires consideration of the parties responsible for the definition as well as issues of the scope and degree of flexibility and changeability of the standards. The entities that could be responsible for setting security standards include government agencies, courts, and standards bodies. Furthermore, it could be left up to Websites themselves to develop security programs (perhaps with a requirement that each site develop some security program), or it could be left to market forces and existing remedies to pressure Websites into addressing security at an appropriate level.

In this section, we set forth five options for setting security standards that fall along a continuum from most regulatory to most laissez faire. Each of the proposals reconciles the three goals of adequate security, appropriate cost, and heightened protections for privacy in a different manner. For each option, we have presented the arguments deemed most persuasive by proponents and opponents of the option.

Security Option 1 - Rely on Existing Remedies

Before requiring any particular security steps, wait to see whether existing negligence law, state attorneys general, and the pressure of the market induce Websites that collect personal information to generate their own security standards. It is worth noting that the insurance industry has started to insure risks associated with Internet security. The emergence of network security insurance may force companies to address security issues seriously, as the presence or absence of adequate security will be taken into account in the underwriting process used to determine premium rates.

Proponents would argue:

  • Consumers who suffer harm as the result of negligence can typically bring tort actions. There is no reason to think that consumers who are harmed by a breach would lack a remedy for any specific injury they may suffer.
  • Damages are often quantifiable (e.g., credit card charges or lost work time due to identity theft). And even when they are not quantifiable (disclosure of embarrassing medical data, for example), the problem is no more difficult for juries to resolve than similar intangible wrongs routinely resolved by juries today (e.g., libel damages or "false light" claims).
  • It is therefore reasonable to wait for such litigation and to correct any gaps that may emerge in the law when and if the lack of a remedy has been demonstrated.

Opponents would argue:

  • This approach does nothing proactive to advance good practices in the marketplace, and will result in a long delay before security issues are addressed and consumers are protected. It will take some time before litigation based on existing negligence law results in judgments. And it will take time for the market to respond to this, if that even happens at all.
  • If relying on existing remedies fails to work, we will then be in the same or a worse position than we are in now, and many more consumers will have had their privacy violated due to security breaches.
  • In the meantime, businesses that would welcome guidance from experts may be left to flounder and face lawsuits because of a lack of awareness, even if they are well intentioned.

Security Option 2 - Maintain a Security Program

Require all commercial Websites that collect personal information to develop and maintain (but not necessarily post) a security program for protecting customers' personal data. This option could take one of two forms:

a. The contents and methodology of the security program could be specified, and businesses could be required to post a brief notice indicating their compliance.

b. The requirement could be limited to a simple mandate that the Website adopt a security strategy without specifying the details or requiring that it be posted.

Proponents would argue:

  • A security program is necessary for a commercial Website of any size that collects personally identifiable information and wishes to keep that information confidential.
  • The scope of the program may vary depending upon the size of the company; in the case of a very small business, one person may be able to handle security effectively on a part-time basis. However, just as marketing, human resources, and accounting are considered essential business functions for companies of any size, maintaining a security program is also critical to any company's operations.

In support of option a above, security professionals believe that any effective program, even if managed by only one person part time, should involve the elements of risk assessment, implementation of controls based on the risks, testing and monitoring of controls, and periodic reassessment of risks.

Also in support of option a above, a statement that the company maintains a security program that assesses risks and implements appropriate controls to address those risks need not be incomprehensible to consumers or too burdensome for businesses to comply with, and it assures consumers and businesses that security has been considered in the system design.

Opponents would argue:

  • Developing and maintaining a program -- but not testing it or otherwise verifying or assuring that the organization is complying with the program -- will only result in an illusion of security.
  • The costs of development, testing, verification, and assurance (especially for small or less technically savvy businesses) will be significant, diverting resources from the main business purpose. Many firms would not know where to turn or how to take the first step in developing such a program.
  • If the plan description is posted, much of it may both be incomprehensible to non-technical users and all-too-clear to technically savvy attackers.

Security Option 3 - Rely on Industry-Specific Security Standards

All businesses operating online that collect personal information could be required to adhere to security standards adopted by a particular industry or class of systems. There are three quite different options for how the standards could be developed:

a) A government-authorized third party could develop standards through a process that encourages public participation (notice and comment) and may include governmental review.

b) The standards could be established by any third party, but the FTC could require that the standards address specific topics (e.g., access, data integrity, notice, authentication, authorization, etc.).

c) The standards could be developed by any third party as long as the identity of the standard-setting organization is revealed to consumers (this is in effect a security "seal" program).

Proponents would argue:

  • No government agency is smart enough or fast-moving enough to set network security standards for a particular industry. Industry-specific standards should be set by industry because each sector has different computer security needs and methodologies.
  • Industry groups will have a strong incentive to avoid setting too low a bar. Every company with a brand name is held accountable for the products sold under that name. So too with security standards-setting organizations; those that are associated with serious security breaches will lose the confidence of the public.

The three options presented under this heading are quite different, and c) is significantly better than the others. It associates a security standard with a "brand name" so that consumers can decide whether security at the site is sufficient. Option b) simply adds a requirement that the standards address certain issues. In most cases this will be unnecessary and in other cases insufficient. Option a) requires that the government license standard-setting organizations; it also requires notice and comment and perhaps government review for such standards. This option is nearly indistinguishable from requiring government-written standards and will require that the FTC or some other body make hundreds if not thousands of individualized decisions about what security practices should be required in which industries, decisions that will have to be remade every three months as security standards and challenges evolve.

Opponents would argue:

  • Allowing industry to develop (and police) itself invites lax standards and under-enforcement. Self-regulatory organizations that are composed solely of the industry at issue will not develop robust standards because doing so may subject their members to additional implementation costs and expose them to greater liability.
  • The insular nature of the standard-setting process does not adequately assess and address the needs and values of other parties - other industries, the public, and policy makers. In the absence of other stakeholders, industry will fail to address important concerns or will craft proposals that undercut other important public policies.
  • The standard setting process lacks public accountability. It is inappropriate to develop substantive policy through entities and processes that lack institutional mechanisms for ensuring public accountability and oversight.

Opponents will find that options a) through c) do not address their general concerns with industry-generated standards. However, opponents may find that proposal "a" partially responds to the first two criticisms above because it constructs a process for soliciting public and policy maker input and review and, to a limited extent, addresses concerns about industry capture and stakeholder participation. However, because it does not permit other stakeholders to participate in the formulation of the standards, it is unlikely to fully ameliorate these concerns. In addition, since the item to be protected, personal information, is likely to be considered less valuable by the business than by the individuals to whom it relates, the concern about lack of representation is heightened. Opponents may find that proposal "b" (while weaker than "a") provides some restraint on the standard-setting process by allowing outside interests to decide what issues must be addressed. Option "c" will garner the greatest opposition, as it fails to address any of the concerns outlined above.

Security Option 4 - "Appropriate Under the Circumstances" Standard of Care

Require all commercial Websites holding personal information to adopt security procedures (including managerial procedures) that are "appropriate under the circumstances." "Appropriateness" would be defined through reliance on a case-by-case adjudication to provide context-specific determinations. As the state of the art evolves and changes, so will the appropriate standard of care. An administrative law judge of the FTC or another agency or a court of competent jurisdiction could adjudicate the initial challenge.

Proponents would argue:

  • This approach allows for an assessment of security tied directly to considerations of circumstance and knowledge. It is impossible to summarize in any detail the balance that must be struck between security and usability; even for the most sensitive data, such as medical information, it may be necessary to lower security standards in order to assure prompt treatment for the injured.
  • The creation of a general standard that is informed by the security practices of others similarly situated at a certain date and time allows for flexibility and growth while encouraging ongoing progress. A similar approach is found in judging medical treatment - doctors are not regulated by an elaborate rulebook but rather by the requirement that they practice medicine in accordance with accepted professional standards. The law leaves definition of those standards to the particular case.

This approach is designed to encourage increasingly strong security practices. If a bright line rule is adopted, there is little doubt that the pace of technical change will leave the adequacy of regulation in the dust, and what was intended to be a regulatory floor will become a ceiling in practice. Rising tides do raise all boats, except those that are anchored to the bottom.

Opponents would argue:

  • In the absence of clear minimum-security standards, courts and companies will lack guidance, because there are no universally accepted security standards.
  • For consumers, the absence of any clear definition of what is sufficient security may put their personal information at risk from companies that do not share the same risk assessment about what is "appropriate under the circumstances."
  • For commercial Websites, there are also disadvantages to this approach; their security precautions will not be judged until after a breach has occurred, which means that the precautions are more likely to be viewed as inadequate in hindsight.
  • An after-the-fact security standard could lead many Websites to ignore security until they are sued.

Security Option 5 - Sliding Scale of Security Standards

Require commercial Websites that collect personal information to adhere to a sliding scale of security standards and managerial procedures in protecting individuals' personal data. This scale could specify the categories of personal data that must be protected at particular levels of security and could specify security based upon the known risks of various information systems. In the alternative or as part of the standard, there could be minimum-security standards for particular types of data. The sliding scale could be developed by a government agency or a private sector entity and could incorporate a process for receiving input from the affected businesses, the public, and other interested parties.
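One hedged way to picture such a sliding scale is the short Python sketch below, which maps hypothetical data categories to hypothetical protection levels; the categories and levels are illustrative assumptions only and are not proposed by the Committee.

    # A hypothetical sliding scale, expressed as a simple mapping from data
    # category to a required protection level.
    SLIDING_SCALE = {
        "health or financial data": "level 3: encryption in storage and transit, strong authentication, audit trail",
        "identified contact data": "level 2: password-protected access, access logging",
        "non-identifying usage data": "level 1: baseline operational safeguards",
    }

    def required_protection(category):
        # Unknown or new categories default to the most protective level.
        return SLIDING_SCALE.get(category, SLIDING_SCALE["health or financial data"])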

Proponents would argue:

  • A sliding scale allows for the matching of consumer protection risk to data source, thereby allowing companies to develop a more efficient compliance and technology infrastructure.
  • A sliding scale provides commercial flexibility in the way Websites comply with security standards.

Opponents would argue:

  • This option will embroil the agency or private sector entity in trying first to gauge the sensitivity of numerous, different types of data and then to match the sensitivity with particular security measures. It is an impossible task, and the results will be a mess.
  • If the sliding scale is produced at a high level of generality, it will be unenforceable and probably incomprehensible; if it is made specific enough to enforce, it will be a straitjacket for many businesses and a series of loopholes for others.

Even if it could be prepared properly the first time, a sliding scale would have to be updated almost constantly, a task for which bureaucracies are ill suited.

Security Recommendation

The great majority of the Committee believes that the best protection for the security of personal data would be achieved by combining elements from Options 2 and 4. (Of course, existing remedies would not be supplanted by this solution.) We therefore recommend a solution that includes the following principles:

  • Each commercial Website should maintain a security program that applies to personal data it holds.
  • The elements of the security program should be specified (e.g., risk assessment, planning and implementation, internal reviews, training, reassessment).
  • The security program should be appropriate to the circumstances. This standard, which must be defined case by case, is sufficiently flexible to take into account changing security needs over time as well as the particular circumstances of the Website -- including the risks it faces, the costs of protection, and the data it must protect.

Other Considerations not Addressed

Wireless Technologies

There are significant new technologies appearing that will have privacy implications. For example, a unique identifier may or may not link to a specific individual. But with new wireless technologies, it may be linked to an individual identified by a phone number, location information, or another identifier.

Inter-Industry Data Sharing

In general, the Committee agrees there are significant barriers to inter-industry sharing of information about adverse events, including fears of antitrust actions and liability exposure. During the discussion of the security section of the report, the issue of information sharing between the government and private sector was raised. There was disagreement about the inclusion of this issue in the report, mostly surrounding the need to clarify the Freedom of Information Act (FOIA) to encourage businesses to share information with governmental entities. The Committee took no position on this issue.

Enforcement Options

The Advisory Committee was asked to provide its views on access and security in the context of the Fair Information Practice principles and industry self-regulation. We did not examine legislative or enforcement options in any detail, but it was difficult to address some of the access and security issues without giving some thought to the question of enforcement. As part of the security discussion, in particular, we assembled a range of representative options for enforcement of security principles. Some of these options are consistent with self-regulation, and others would require government intervention. We record them here, not for the purpose of recommending any particular course of action but to show the range of possibilities open to industry and government.

Rely on Existing Enforcement Options

Many of the options include the publication of the Website's security procedures or its adherence to particular standards. Such postings are subject to traditional FTC and state enforcement if the statements are false. It is also of course possible for consumers to bring their own actions for fraud, false statements, or underlying negligence in the handling of the data.

Third-Party Audit or Other Assurance Requirements

Rely on independent auditors to ensure compliance with standards. This structure could require security standards to be verified by an external body and could require public disclosure of the findings. This option would provide more flexibility and could adjust faster to the changing threat environment. It would, however, introduce an additional cost and overhead that may not be justified by all industries and for all levels of risk exposure. It would, on the other hand, introduce a neutral, objective assessment of a company's security infrastructure relative to its industry.

Create Express Private Cause of Action

Congress could establish a private right of action enabling consumers to recoup damages (actual, statutory, or liquidated) when a company fails to abide by the security standard established through one of the options set out above.

Government Enforcement Program

The FTC or another agency could enforce compliance with standards using its current enforcement power or using newly expanded authority. The enforcement regime could establish civil or criminal fines, or both, as well as other equitable remedies. (This option is, in some respects, modeled after the regulations governing the financial services industry as enforced by the Federal Financial Institutions Examination Council (FFIEC). The FTC could establish a similar enforcement regime for other industries.)

Terms

The Committee found it useful to generate a table of terms in order to further our discussions. These terms have proven useful, but are not to be considered generally accepted definitions of the Committee or the online industry.

Term: Possible Definition(s)

Access: The mechanism(s) by which individuals can view data specific to themselves.
Access/Participation: The mechanism(s) by which individuals can view data specific to themselves AND edit and update that data for accuracy and completeness.
Access; authorized: The mechanisms by which access to data is granted by challenges to the requesting entity to assure proper authority, based on the identity of the individual, the level of access to the data, and rights to manipulate that data.
Access; reasonable: To be defined as part of the advisory committee's work. Generally regarded as meaning that access cannot be constrained by artificial barriers set by interfaces, frequency, or cost of access.
Access; unauthorized: Access by an entity that does not have proper authority to access the information in the manner it is being accessed (view, modify, delete, etc.).
Alias: A name, usually short and easy to remember and type, that is translated into another name or string, usually long and difficult to remember or type. Commonly used as a single name for a list of e-mail addresses or hyperlink redirects.
Anonymous: Describes an entity whose identity is unknown.
Authentication: Process by which an entity's identity becomes known.
Authentication technique: Method(s) by which an entity's identity can become known. Weak authentication includes weak passwords only. Strong authentication includes complex passwords combined with tokens (physical, biometric, or electronic).
Authorization: Process by which a known (not anonymous) entity gains specified privileges such as access, read or write rights, system administration rights, etc.
Biometric identifier: A method of authentication based on the requester's unique biological traits, such as retinal patterns, handprint, thumbprint, voiceprint, facial details, etc.
Biometrics: The science of determining, storing, comparing, and validating the identity of an entity based on biometric identifiers.
Certificate authorities: Entities that are empowered to sign digital certificates in order to add credibility to the certificates.
Choice/Consent: One of the five elements of Fair Information Practices; choice indicates that consumers, once provided Notice/Disclosure, have options of choice over the data being requested and the use of that data, including secondary use and sharing with third parties.
Choice; Opt-in/Opt-out: Opt-out is a mechanism that states data collection and/or use methods and provides the user a choice to decline such collection and/or use. Opt-in is a mechanism that states data collection and/or use methods and provides the user a choice to accept such collection and/or use.
Commercial Web site: A for-profit entity operating an Internet Web site.
Consumer: An individual (anonymous or identified) who interacts with commercial entities for personal benefit.
Cookie: A general mechanism that server-side connections can use to both store and retrieve information on the client side of the connection. In essence, cookies are small data files written to a computer's hard drive by Web sites when that computer views the site using a Web browser. The information these files contain can be extremely simple, consisting only of non-identifying data or non-personally identifiable identifiers (GUIDs), or can include such things as passwords, lists of pages you've visited, and the date when you last looked at a certain page. Internet standards require that Web sites can read only those cookie files that they have issued.
Data collection: The processes and sources used by a commercial entity to accumulate information about consumers. There are many methods of collection, including automated methods (see Cookie), direct or indirect entry by consumers, and third-party sources.
Data practices: The methods by which a commercial entity manages information; specifically, the policies and methods used by a commercial entity in the collection, storage, access, security and distribution of customer information.
Digital certificate: A statement signed by an independent and trusted third party. That statement usually follows a very specific format laid down in a standard called X.509, but it does not have to. Digital certificates consist of three parts: information about the object being certified (name, etc.), the public key of the entity being certified, and the signature of the certifying authority.
Electronic signature (aka digital signature): Information or data in electronic form, attached to or logically associated with an electronic record, and executed or adopted by a person or an electronic agent of a person, with the intent to sign a contract, agreement, or record. Alternative: an electronic sound, symbol, or process attached to or logically associated with a record and executed or adopted by a person with the intent to sign the record.
Encryption: Any procedure used in cryptography to convert plain text into ciphertext in order to prevent any but the intended and/or authorized recipient from reading that data.
Enforcement/Redress: Mechanisms to ensure compliance (enforcement) and appropriate means of recourse by injured parties (redress).
Fair Information Practices: A set of principles designed to guide commercial entities in their data practices for customer and consumer information. See also Notice, Consent, Access, Security, and Enforcement.
GUID (Globally Unique Identifier): Typically a long string of alphanumeric characters assigned in such a manner that it is guaranteed to be unique within a well-defined context. These identifiers are generally, but not always, assigned using a standard protocol called the UUID (Universally Unique Identifier); however, numbers such as social security numbers could also be considered GUIDs.
Host enterprise: The entity controlling the systems storing personal information to be accessed by consumers.
Identity: The collection of information that uniquely identifies and/or locates an individual. Usually some combination of first and last name, mailing address, email address, phone number, and age can be used to uniquely identify an individual.
Identity, verifying: Process by which an individual's identity is proven. Verification is different from authentication: Jane Doe can enter data identifying herself as Joan Smith and can be authenticated as such, but her identity can only be verified as being Jane Doe, unless she impersonates the identity of Joan Smith.
Internet: A giant network of servers and client computers interconnected through a set of protocols and distributed throughout the world.
Metadata: Data about data; in particular, a description of the data, its history, and authentication data (a digital signature, for example) related to that history and/or description.
Notice/Awareness: One of the five Fair Information Practices; notice of an entity's data policies and practices must be provided to consumers prior to collection of personal information.
Password: A set of characters either assigned to or chosen by a consumer to be used to gain access to information and/or services. There are various methods of password construction, ranging from weak (alpha only, not case sensitive - johndoe) to strong (alpha, numeric, and special characters required, case required - J0hEn_D0).
Personal information collected online: Information that is overtly collected from an individual via online media(5) (e.g., the World Wide Web), i.e., the individual contributes the information, and information that is collected from an individual by observation while the individual surfs the Web (visits Web sites), with or without the consumer's knowledge.
Personal information; inferred: Information that is not directly contributed by an individual or automated collection process, but is calculated as a result of analysis of the collected data. For example, a new attribute of "potential new car buyer" can be inferred by seeing that a consumer has recently and frequently visited car-selling sites, auto loan sites, and product-rating sites for automobiles.
Personally Identifiable Information (PII): Those data elements that enable the identification and/or location of a unique individual. PII can be established by a single data point (such as an e-mail address) or by a combination of data points (first name, last name, postal address).
Personally Identifiable Information; sensitive: A classification of data used in the EU data directive that specifies certain information as deserving special treatment due to its sensitive nature, including financial, health, religious, and sexual data. Sensitive data generally requires higher standards of authentication, authorization, choice, security, and distribution.
PIN: Personal identification number; a code known by an individual that is used to gain access to controlled resources. For example, a PIN is used in conjunction with a magnetically coded bankcard to gain access to bank accounts via an ATM.
Prospect: A prospective, or unrealized, customer. A prospective customer may be a person who has expressed an interest in a product or service or a related product or service, who was referred by another commercial entity, or who has received a gift of a product or service from a customer of the business.
Security: One of the five Fair Information Practices; security assures that information shall be protected from unauthorized access, use, or distribution and shall not suffer quality degradation or loss.

 

Contact Info for Drafters:

Deirdre Mulligan deirdre@cdt.org 202/637-9800
Ron Plesser ron.plesser@piperrudnick.com 202/861-3969
James Allen jallen@ecustomers.com 512/306-5103
Stewart Baker
Lance Hoffman hoffman@seas.gwu.edu 301/656-6205 (home; on sabbatical), (w) 202/994-5513

Andrew Shen Shen@epic.org
Rick Lane Rlane@uschamber.com 202-463-5804
Richard Purcell richarp@microsoft.com 425/936-9332

Endnotes:

1. Businesses should provide a process by which consumers can challenge the correctness of certified information and request changes to the information. Businesses should not be obligated to change information that is correct per the business' own certification, but should provide a process by which disagreements concerning the correctness of the information can be arbitrated.

2. Businesses should provide a process by which consumers can challenge the correctness or appropriateness of information from other sources and request deletion of the information. Businesses should not be obligated to delete third-party-sourced or self-sourced information that the business believes is correct and appropriate to retain, but should provide a process by which disagreements concerning the accuracy and appropriateness of the information can be arbitrated.

3. As general background on the issues raised in this document, the subcommittee recommends study of the Department of Commerce's European Union Directive on Data Protection FAQ #8. The current version of this FAQ can be found at http://www.ita.doc.gov/td/ecom/RedlinedFAQ8Access300.htm

4. See The Privacy Rights Clearinghouse http://www.privacyrights.org/AR/id_theft.htm for more information.