
Online Access & Security Committee

I. Introduction and Summary

II. Access

Both consumers and businesses have a shared interest in the provision of reasonable access to consumer personal information. Reasonable access benefits individuals, society and business due to the openness and accountability it helps to promote. If done properly, the provision of access can also help reduce the costs to businesses and consumers of improper decision-making due to poor data quality. Moreover, increased access may help promote consumer trust and deeper customer relationships, which benefit both consumers and businesses. However, the manner in which to provide access and to what degree access should be provided are complex questions given the numerous types of non-personally identifiable and personally identifiable information, the "sensitivity" of that information, the sources of that information, and the various costs and benefits associated with providing access.

There is an extremely broad range of policy options on how access should be provided, from a very simplified "default rule" approach to a much more complex approach that subjects the scope of access to a calculation based on the sensitivity of personal data and the use of that data. We have identified three basic approaches, which we discuss in more detail below. They are: 1) the default rule approach, 2) the total access approach, and 3) the case-by-case approach.

Once a decision has been made that the individual should be provided access to information about them maintained by a business, the question is how to ensure that only the individual, and no one else, can gain access. Authentication devices provide a means of limiting access to authorized individuals - in this case the subject of the information. The Committee worked to identify the authentication options that would best ensure access to information is provided only to the individual to whom the information pertains.

During its meetings the Committee concluded that where information is tied to a specific identifier - for example a name, address, or unique identifier - access could be provided. Thus, the discussion of access covers both information tied to a specific individual's name and address and information tied to a unique identifier that has been assigned to the individual or his or her browser. However, as discussed below, the second case raises additional authentication concerns that must be acknowledged and addressed.

Where a decision has been made to extend access rights to a consumer, what is the scope of those rights, and which entities are obligated to allow the consumer to exercise them? Does access ever or always include the ability to correct, amend, or delete data? Should the answer vary according to the data and other considerations? Which entities are required to provide access to data? All those capable of providing access, or some subset? These issues are discussed in section III.

Finally, the Advisory Committee examined how to ensure the security of personal data gathered by commercial websites. Computer security is difficult to define, particularly in a regulatory or quasi-regulatory context. Identifying the most effective and efficient solution for data security is a difficult task: security is application-specific, and different types of data warrant different levels of protection. Despite these difficulties, the Committee has proposed a recommendation for ensuring adequate security for personal data gathered by commercial websites.

A. A Default Rule Approach

Under a "default rule approach" based on the principles outlined by the BBBOnLine seal program, the scope of access is guided by the premise that consumers should be given as much access to their personally identifiable information (PII) as practicable. This approach would establish a default rule that PII collected online is generally accessible, with some limitations or exceptions when the cost of providing access far outweighs the benefits, and for derived data. The "default rule approach" recognizes that consumers have reasons to view the information collected by businesses about them beyond ensuring its accuracy. Indeed, the fairly broad access rights under a "default rule approach" may promote awareness of business information practices as much as they promote accuracy. Under one theory, this broad access could affect businesses and consumers by increasing consumer awareness of the trustworthiness and responsibility of the businesses that collect information about them. Conceivably, this broad access could show the extent of information held about consumers, possibly making them wary and leading them to call for more limited collection of information. In this regard, broad access under a "default rule approach" may act to promote privacy by dampening the interest of businesses in collecting more information than they need from consumers.

The over-arching rule of the "default rule approach" is that businesses should establish a mechanism whereby "personally identifiable information" (PII) and "prospect information" that the business maintains with respect to an individual is made available to the individual on request.(1)

PII (and prospect information) is information collected from an individual online (actively or passively) and is information that when associated with an individual can be used to identify him or her.(2)

As an example, click-stream data is not "PII" unless it is linked to a name, email address or similarly identifying information.

Information is not PII unless it is "retrievable in the ordinary course of business." Information is retrievable in the ordinary course of business if it can be retrieved by taking steps that are taken on a regular basis in the business with respect to the information, or that the organization is capable of taking with the procedures it uses on a regular basis.

Information is not retrievable in the ordinary course of business if retrieval would impose an "unreasonable burden."(3) A purpose or cost-benefit analysis would be done only in the rare situations where the ability to retrieve the information would be very costly or disruptive; in that situation access could be denied if the need for the information was marginal. It is here that the sensitivity of the data, the uses of the data, the purpose of the request, and similar factors would be considered.

Some other aspects of the "default rule approach" rules are:

  • "reasonable terms" may be placed on access, such as frequency limits and fees, except that requests may not be limited to fewer than one request per year and charges of greater than $15 per request are not allowed;
  • organizations are not required to set up new systems to maintain information beyond a time when it no longer serves the organization's purposes;
  • organizations are not required to provide access to derived data or data collected from outside sources;
  • steps to assure accuracy of data and processes to correct inaccuracies must be established;
  • organizations have flexibility to decide how to make "PII" available, i.e., in what form;
  • "proper identification" (undefined) may be required; and
  • there is no explicit requirement for access to be provided individuals by third party transferees.

As explained in the BBBOnLine policies:

The term "individually identifiable information" is intended to encompass information that, when associated with an individual, can be used to identify him or her, for instance, email addresses and other information that is compiled and linked to an email address. Account, billing, and online transactional information are examples of individually identifiable information. Information need not be unique to be considered capable of identifying an individual. Consequently, addresses, telephone numbers, and dates of birth constitute individually identifiable information. Information must be capable of identifying an individual, however. Consequently, data generated by passively browsing an online site (also known as navigational or click-stream data) does not constitute individually identifiable information unless it is linked to a name, email address, or similar information that identifies an individual.

In addition, the information must be information collected by the organization from the individual online. Information received by the organization, online or offline, that was collected online from the individual by others (who are not making the collection as an agent or contractor of the organization) is not itself individually identifiable information in the hands of the organization. This includes, for example, public records information in the possession of the organization that was collected online from the individual by the government agency.

Information is retrievable in the ordinary course of business only if it can be retrieved by taking steps that are taken on a regular basis in the conduct of the business with respect to that information or that the organization is capable of taking with the procedures it uses on a regular basis in its conduct of its business. Information is not retrievable in the ordinary course of business if retrieval would impose an unreasonable burden.

An organization is not required to set up any new systems to maintain information or to maintain individually identifiable information or prospect information beyond a time when it no longer serves the organization's purposes.

An organization must establish a mechanism whereby, upon request and proper identification of the individual, it makes available to the individual the individually identifiable information or prospect information it maintains with respect to the individual. The information subject to this requirement tends to be, but is not limited to, (i) account or application information, for example, name, address, and level of service subscribed to, and (ii) billing information and similar data about transactions conducted online, for example, date and amount of purchase, and credit card account used.

If an organization can not make information that it maintains available because it can not retrieve the information in the ordinary course of business, it must provide the individual with a reference to the provisions in its privacy notice that discuss the type of data collected, how it is used, and appropriate choices related to that data, or provide the individual with materials on these matters that are at least as complete as the information provided in the privacy notice.

Organizations have substantial flexibility in deciding how best to make the individually identifiable information or prospect information available to the individual. For example, an organization may choose the form in which it discloses this information to the individual. Monthly statements from banks and credit card companies are examples of appropriate mechanisms to satisfy this disclosure obligation, even though they may reveal more than the individually identifiable information that the individual submitted to the organization online. The organization also determines the reasonable terms under which it will make such information available such as limits on frequency and the imposition of fees. Frequency limits that require intervals of more than a year between requests and/or fees of more than $15 for a response to an annual request would not be reasonable except in extraordinary circumstances.

The "default rule approach" or BBBOnLine approach is similar to the access principle adopted as part of the Safe Harbor discussions proposed by the U.S. Department of Commerce. The Safe Harbor was developed in response to the European Union Directive on Data Protection which, among other things, mandated that consumers be provided reasonable access to their personal information.(4)

The Safe Harbor's access principle is as follows: Individuals must have access to personal information about them that an organization holds and be able to correct, amend, or delete that information where it is inaccurate, except where the burden or expense of providing access would be disproportionate to the risks to the individual's privacy in the case in question, or where the rights of persons other than the individual would be violated. Despite some language differences, the "default rule approach" and the Safe Harbor access approach are extremely similar. They both stand for the proposition that access should be provided unless the costs are too high.

Analysis of "default rule approach"

Proponents of this approach would argue:

  • This approach provides broad access rights and reasonably matches consumer expectations that they can access personal information collected about them. Consumers would be able to view all of the account information that they have provided and be able to correct such information if it is inaccurate.
  • Consumers would be able to view passively collected information (such as click-stream data) only when that information is linked to information that identifies them. When data is not personally identifiable, it poses much less of a privacy risk and may not need to be as accurate.
  • Businesses would not be forced to provide access if it turns out to be too costly. The growth of electronic commerce would not be burdened to the extent it could be by the costs of providing full access.
  • Because businesses would not have to provide access to derived data, businesses could protect proprietary information. These businesses could avoid negative consequences if their competitors could otherwise use broad access to gain knowledge about how their internal processes work.
  • Compliance with this approach would likely satisfy the requirements of the European Union Directive on Data Protection.
  • The means of providing access would provide flexibility for businesses while giving consumers access without undue delay. This system would be consistent with the storage and use practices of businesses.

Opponents of this approach would argue:

  • One difficulty with the "default rule approach" lies with the determination of when the costs of providing access outweigh the benefits of that access. Specifically, who makes such a determination and how is that determination made?
  • Although the rule may be very straightforward for the majority of situations, difficulties in determining whether or not a particular business falls within the exceptions would require webmasters to stay familiar with the latest determinations in those fringe areas. In this regard, the rule may make access unduly complex. Some examples of real-world issues are:
  • Click-stream Data: Click-stream data compilation is difficult and expensive. It may be personally identifiable if a consumer becomes a customer of the commercial web site. An example might be where a consumer surfs around a web site and then decides to purchase an item or open an account. The consumer would then provide information to the web site operator that could be traced back to the click-stream data. One web session could result in numerous entries in each of these categories:
    • URL lists;
    • Web addresses;
    • Web session/duration levels;
    • Log-in or web page access information.
  • Interactive Fora Data: Information entered by a consumer in a chat room, interactive forum or Webcast would be difficult and expensive to compile. This type of information can be personally identifiable if the interactive event calls for registration or a log in process. The information may be scattered throughout a broadcast or transmission, resulting in small bits of data.
  • Customer Preferences Data: Preference information that is controlled by the consumer would be difficult and expensive to compile. As an example, a consumer of a commercial web site might become a customer and establish a shopping cart or a watch list for securities. The information might be changed at any time by the consumer (e.g., watch list can be revised at will). It would be very difficult and expensive to compile and provide access to such information on a historical basis.

Because businesses would not be required to provide access unless PII is "retrievable in the ordinary course of business," access rights could vary quite a bit from business to business, or across different types of businesses. Businesses may try to use nuances in the interpretation of "retrievable in the ordinary course of business" to avoid providing access. Potentially, a business could set up its data structures so that the data could be used to make decisions about consumers without being retrievable as a separate bit of information.

1) Consumers may have a significant interest in seeing data derived from information collected about them. As this data is what is used to make decisions based on their behavior, providing access may increase consumer awareness about what is being communicated about them and the potential impact of this information.

2) Limiting access to only that information which is collected online from that consumer does not allow the consumer to see the scope of any profiling that may be undertaken. Consumers will not be aware of what information is being used by businesses to make decisions.

3) Although this approach provides access to click-stream information when linked to PII, click-stream information attached to a Globally Unique Identifier (GUID) also poses a risk to personal privacy. Consumers may expect to be able to see how they might be targeted based on this non-identifiable, yet personal, collection of data.

4) The exceptions for providing access are too broad and unfairly limit individual access in favor of business interests. While rights of access should be weighed against other considerations, the current access principles allow the entity least likely to consider the rights of the data subject - the data collector - to make that determination. The current access principle allows numerous situations in which access may be refused on the basis of expense or burden….(5)

5) Any fee (limited to $15 under this approach) may unduly limit the ability of consumers to access their information or it may lessen the attractiveness of accessing personal information.

Authentication considerations of the "default rule approach": Authentication requirements for account affiliated data

Where an account has been established, access to the information retained in that account can generally be provided. In general, the individual's ability to access information about the account should not be burdened by intrusive requests for information beyond what was required to establish and secure the account. However, it is common practice, both offline and online, to require some additional piece of information that is thought to be more difficult to compromise.

Where an account has been opened and is activated through a password, it would be appropriate to provide access to the data to a person who appears to be the account holder, has the password, and presents some verifiable information about recent account activity. Such an approach would provide a two-factor method of authentication, but preserve the privacy offered by the initial account.

I subscribe to an Internet Service Provider providing them my name, address, and billing information. At a later date I request access to data they retain about my usage of the account. What should be used to authenticate that I am the account holder? Should that same authentication grant me complete access to data, or should an additional level of protection be afforded to certain data?

My name, address, and billing information are useful for authentication; however, they are also widely available from other sources, and therefore may not be sufficient to grant access. Many businesses require individuals to use a shared secret (a password, a mother's maiden name) to access an account. Concerns have been raised that passwords are hard to maintain, that individuals frequently resort to simple ones or place them in easily accessed places (the yellow sticky note), and that some shared secrets have become so widely used they are no longer secret (social security numbers). The move to dynamic shared secrets (such as Amazon.com's use of two recent purchases) would be a positive step. It provides a "something you know" token, but allows it to be dynamic (a benefit for security and privacy) and varied between services (because it is service-based, it is unlikely to be used by multiple systems).
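The dynamic shared secret described above can be sketched in code. The following is a hypothetical illustration only, not a description of any actual service's implementation: the account identifier, the purchase records, and the requirement of two matching answers are all assumptions made for the example.

```python
import hmac

# Hypothetical account records held by the service (illustrative data only).
RECENT_PURCHASES = {
    "acct-1001": ["garden hose", "poetry anthology", "usb cable"],
}

def _secret_eq(a: str, b: str) -> bool:
    """Compare answers case-insensitively without leaking timing information."""
    return hmac.compare_digest(a.strip().lower(), b.strip().lower())

def verify_dynamic_secret(account_id: str, claimed_items: list,
                          required_matches: int = 2) -> bool:
    """Grant access only if the requester can name enough recent purchases.

    The secret is dynamic: it changes as account activity changes, so a
    compromised answer goes stale, unlike a static password.
    """
    on_record = RECENT_PURCHASES.get(account_id, [])
    matches = sum(
        any(_secret_eq(claim, item) for item in on_record)
        for claim in claimed_items
    )
    return matches >= required_matches

# The genuine account holder can name two recent purchases:
print(verify_dynamic_secret("acct-1001", ["USB cable", "garden hose"]))  # True
# A guesser who knows only one item is refused:
print(verify_dynamic_secret("acct-1001", ["garden hose", "stapler"]))    # False
```

Because the challenge is drawn from the service's own transaction records, the same answers are useless at other services, which is the "varied between services" property noted above.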

Case study:

I open an email account with a free service. Establishing the email account does not require me to disclose personal information. I am assigned an email address and am asked to establish a password to protect my account. If I request access to personal information held by the service, how should they determine whether to authorize my access? What level of authentication should be required?

Options
 
a. Require the same information for access (account name and password). This approach errs on the side of ease of use for the account holder. But in doing so it relies upon one token (account name) which is frequently shared with others (email address for example) and another token (password) which is (as our discussions indicate) relatively easy to compromise.
 
b. Require my account name, my password, and information about recent account activity. This method adds some protection against unauthorized access. By asking for the account name (an identifier), my password (something I know), and recent account activity (something I know that is dynamic, and unlikely to be known or discernible by others), it adds an additional layer of protection.
 
c. Require either of the above sets of information and send the requested information to the account.
 
d. Require either of the above sets of information and require that the request for access be made through the account, and send to the account a one-time access code. This approach would build in an additional precaution against unauthorized use. By requiring the request to come from the account (similar to credit card authorization that must come from the registered phone of the account holder) and returning a one-time access key to the account the system could further limit unauthorized access. This feature might cause a minor delay, but it does not require the individual to remember additional pieces of information.
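Option (d) can be sketched as a simple one-time-code flow. This is a hedged illustration of the mechanism described above, not a prescribed implementation; the function names, the ten-minute expiry, and the delivery details are assumptions.

```python
import secrets
import time

CODE_TTL_SECONDS = 600  # assumed expiry: codes go stale after ten minutes
_pending = {}           # account -> (code, time issued)

def request_access(account: str, came_from_account: bool):
    """Issue a one-time code, but only for requests made through the account."""
    if not came_from_account:
        # Mirrors the credit-card analogy: the request must originate from
        # the registered channel, here the account itself.
        return None
    code = secrets.token_urlsafe(8)
    _pending[account] = (code, time.time())
    # In a real service the code would be delivered to the account's inbox.
    return code

def redeem_code(account: str, code: str) -> bool:
    """Allow access exactly once, and only while the code is fresh."""
    entry = _pending.pop(account, None)  # pop: each code can be used only once
    if entry is None:
        return False
    issued_code, issued_at = entry
    return (secrets.compare_digest(code, issued_code)
            and time.time() - issued_at < CODE_TTL_SECONDS)

code = request_access("user@example.net", came_from_account=True)
print(redeem_code("user@example.net", code))   # True: first use succeeds
print(redeem_code("user@example.net", code))   # False: the code is one-time
```

As the text notes, this adds a round trip (a minor delay) but asks the individual to remember nothing new: the code is generated by the service and returned to the account.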

B. Total Access Approach

The Federal Trade Commission could also consider an expanded version of the "default rule approach" under which access would also be provided to derived data(6) and to data collected in both off-line and on-line environments. Under the "default rule approach," access is granted to off-line information only when that information is merged with on-line information. Under this "total access approach," access would be granted to information gathered off-line if it could be linked to information collected on-line. Furthermore, access to non-PII could also be provided if the non-PII was linked to a GUID. Under this approach, if a business has the ability to provide access, the business should provide access. Some exceptions could be allowed, such as when proprietary information would be unreasonably jeopardized. In keeping with the purpose of providing consumers as much access as possible, businesses would provide initial access for free, while charging for repetitive access requests or terminating access upon unduly repetitive requests.

This approach would implicate the full range of costs and benefits for businesses and consumers. For businesses, this approach would lead to a substantial increase in costs, including: any required modifications or new design requirements placed on existing systems, new storage costs, new personnel costs, new legal costs, and potential increased liability. Consumers would also experience additional costs, such as: pass-through costs for system upgrades and new personnel, potential opportunity costs of businesses not investing in new products, potential loss of privacy if someone other than the consumer wrongly accesses this personal information, and the potential privacy threat posed by the aggregation of personal data that would not otherwise be aggregated.

On the other hand, this broad access could significantly benefit businesses. By providing greater access rights, businesses could increase the reliability and accuracy of data, build consumer confidence and trust, experience a public relations benefit, make better decisions based on better data, expand markets by giving consumers greater confidence in online privacy, and achieve greater efficiencies if they limit information collection to only what is necessary. Consumer benefits are also increased by a total access approach. Consumers might experience an enriched understanding of data collection practices, increased confidence in the online environment, more control over the accuracy of personal information, the ability to identify inaccurate data before it harms them, the ability to make better privacy decisions in the marketplace (including decisions to protect anonymity), and the ability to better police businesses for compliance with any stated policies.

Proponents would argue:

  • Consumers should be able to access all of these types of information that are being collected about them. Without these types of information, consumers will not know the extent of profiling that is occurring. Moreover, information collected off line may pose just as much of a privacy risk as that collected on line.
  • Derived data may have a life impact and should be accessible to consumers.
  • Individuals should have the right to see data derived (given the ability to identify and authenticate users) from information collected from them. As this data is used to make decisions based on their behavior, it is critical that it also be made available. Access to this derived data could, but does not necessarily, include the ability to review the algorithms used to derive it.

Opponents would argue:

  • Providing this scope of access would be extremely costly to businesses and may result in too little consumer gain. Some of the other fair information practices (such as notice and choice) may be more important in protecting privacy.
  • Access to click-stream data and derived data will do little to improve its accuracy, which should be a predominant consideration when deciding access rights.
  • Access to derived data may jeopardize businesses by forcing them to disclose internal practices and proprietary information.
  • Companies in an off-line setting do not have to provide access to how they make decisions - on-line businesses should not be treated disparately.
  • Click-stream data, when not attached to PII, poses very little privacy risk. Providing access to this information would be counterproductive because of the need to authenticate such information.
  • Providing access to derived data would affect the confidentiality of the procedures companies use to make decisions and assumptions about user data. Without this confidentiality, some companies and industries would be unable to maintain their current market viability.

Authentication considerations of the "total access approach": Complications of authentication requirements for non-account affiliated data:

Because this option would provide broader access rights to consumers, it raises additional, but not insurmountable, authentication concerns. In providing access to non-account affiliated data, businesses must take additional steps to limit inappropriate access. Consider the situation where an individual has not opened an account with a service, but the service has collected data about the individual (or some proxy for the individual) and her activities. Under the "total access approach," how can a service authenticate that the individual is the person to whom the data relates? Should the level of access authorized be lowered due to the complexities of authenticating the individual's connection to the data? Are there other policies that would address the privacy interest and carry a lower risk of unintentionally disclosing data to the wrong individual? Does this concern vary from Web site to Web site?

Case study:

A Web site assigns each visitor a unique identifier that is used to track and retain data about the visitor's activities at the site. The Web site does not request or gather information about specific visitors' identities. A visitor requests access to information that the Web site has about her use of the site. How should the Web site proceed?

How can a site authenticate that the person requesting access is the person on whom they have collected a unique-identifier-based profile? How can Web sites provide access to what they have in a fashion that reflects the potential adverse consequences of disclosing information to someone other than the subject of that information? The consequences of disclosing information about an individual's use of a Web site to another person (family member, co-worker, other) could be quite damaging. Depending upon the type of information or service the Web site provides, inappropriate access to click-stream data could be quite harmful.

Options

a. Require the identifier (presenting the cookie). This would make it quite easy for the user to gain access; however, if the identifier is tied to an imperfect proxy for the individual (such as a computer), it is possible that other individuals may gain access to the individual's personal information. If a cookie attached to a browser by a specific Web site was used to provide access, it could allow all members of a family, or other group, who share a computer to access each other's information. We tolerate this "over-disclosure" in certain cases, such as telephone calls, where we disclose all calls made from a number back to the individual named on the account despite the fact that in multi-family homes this discloses other family members' calls. However, in the online environment over-disclosure could be more damaging because the information collected about an individual's use of the Web can on its face reveal more about the individual. For these reasons the identifier alone may be insufficient to grant access in many situations. But there may be instances where the identifier alone is sufficient proof of account ownership to grant access. For example, if the Web site is a general interest site that only retains information about how often a visitor returns, providing access to someone other than the person who visited may raise little concern. But if the Web site is focused on a specific disease, then providing any information to the wrong person, even information about the number of visits, could be quite harmful.

b. Require the individual to open an account and allow access to data collected from this point forward. This may or may not limit inappropriate access. For example, if the account is browser based and there are several individuals who use the browser this would allow one individual to access all the data and prevent the others from accessing any.

c. Require the identifier but limit the scope of access. This option acknowledges the risk of inappropriate access and, to mitigate the harm it may cause, limits the information provided. For example, a Web site could provide the categories of information it has collected rather than the actual information. In some instances, disclosing to the wrong individual the mere fact that a Web site has information tied to a unique identifier could be harmful. For example, if the Web site's subject is a sensitive or revealing topic, the mere fact that it has any information tied to the identifier could, in the wrong hands, cause damage.

d. Delete the file and commit not to collect additional data. This option acknowledges the risk of inappropriate access and seeks to provide for the individual's privacy interest in another fashion. While this does not directly serve the individual's access interest, it would protect their general privacy interest by deleting the information. It poses no threat of inappropriate access. However, it could allow someone other than the subject of the data to have the data deleted.

e. Disassociate the data from the identifier and use it for statistical, aggregate, or other non-individually based purposes. This option acknowledges the risk of inappropriate access, but also recognizes the commercial interest in utilizing the data in non-identified or anonymous form. While this does not directly serve the individual's access interest, it would protect their general privacy interest by removing information that connects the data to them.

f. Require no identifier but provide only a general description of the kinds of data collected. This errs on the side of limiting the impact of inappropriate disclosures and acknowledges that even the fact that a browser has an identifier associated with a specific site or service could in some circumstances be revealing and potentially harmful.
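The tension between options (a) and (c) above can be illustrated in code. The following is a minimal sketch (in Python; the store, identifier, and category names are all hypothetical) of an access handler that, presented only with a cookie identifier, discloses the categories of data held rather than the data itself:

```python
# Hypothetical sketch: given only a browser cookie identifier, disclose the
# *categories* of data held (option c above) rather than the raw data,
# limiting the harm if someone other than the data subject presents the cookie.

# Example store keyed by a unique identifier a site might set in a cookie.
PROFILE_STORE = {
    "uid-12345": {
        "visit_count": 42,
        "pages_viewed": ["/articles/intro", "/articles/advanced"],
        "search_terms": ["privacy", "access"],
    }
}

def describe_categories(cookie_id):
    """Return only the kinds of data held for this identifier, not the values."""
    profile = PROFILE_STORE.get(cookie_id)
    if profile is None:
        return []
    return sorted(profile.keys())

def full_access(cookie_id):
    """Option (a): the identifier alone unlocks everything -- easy for the
    user, but risky when several people share the browser holding the cookie."""
    return PROFILE_STORE.get(cookie_id)
```

Under this sketch, a family member presenting the shared browser's cookie learns only that visit counts and search terms are on file, not their contents; whether even that much is acceptable depends on the sensitivity of the site, as noted above.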

C. A Case-by-Case Approach

A third approach would be to treat different information differently, depending on a calculus involving the content of the information, the holder of the information, the source of the information, and the likely use of the information. This approach is necessarily more complex, recognizing as it does that each type of data raises different issues. The challenge therefore would be to develop an administrable set of rules.

Why this approach?

While an approach establishing a default rule of access enjoys easier application, it may not reflect the real purposes behind providing access. We have heard, both in the larger committee meetings and our subgroup meetings, that the purpose behind providing access may be more limited than promoting consumer awareness. For example, the purpose may not be to enshrine "consumer privacy" but rather to protect data and ensure its accuracy. In fact, the purpose may be as limited as providing consumers an opportunity to correct erroneous data (and not to provide consumers an opportunity simply to know what's out there). A case-by-case approach may allow a more precise weighing of whether access to a particular type of data is warranted, considering the nature of the data, the consumer's reasonable expectations about the data, and the costs of providing access to the data.

How would this approach work?

Essentially this approach would assign different access rights to different data. Given the many factors in the calculus, the permutations are extensive. The following is one example of this approach (in italics). Although a case-by-case approach can be very complex, the following example shows how a case-by-case approach could result in a manageable rule. The outcome of the following example is also very similar to the outcome of the "default rule approach," even though it may have involved a different analysis.

Consumers should be provided access to information about them and about their relationship with the business. Information about the consumer includes information that describes them (e.g., identity, contact information, consumer-specified personal preferences) and information that describes their relationship with the business (account numbers, account balances, etc.).

Information about the consumer's relationship with the business includes information that describes the history of their commercial transactions with the business (e.g., purchases, returns), and information about accounts maintained for the consumer with the business.

Consumers should only be given access to information for which it is possible to unambiguously authenticate that the person requesting access is the person the information is about.

The consumer needn't be given access to metadata used by the business solely for the purpose of facilitating an ongoing relationship with the consumer (e.g., GUIDs), temporary or incidental data maintained by the business solely for the purpose of maintaining the integrity of interactions with the consumer (e.g., transaction audit records), or inferences the business has derived from other information (e.g., inferred preferences).
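The example rule above amounts to a simple classification: fields describing the consumer or the business relationship are accessible, while internal metadata, temporary integrity data, and inferences are not. A hypothetical sketch (the field names and categories are illustrative, not drawn from any actual system):

```python
# Hypothetical sketch of the case-by-case example above: fields describing
# the consumer or the business relationship are accessible; internal metadata,
# temporary integrity data, and derived inferences are withheld.

ACCESSIBLE_KINDS = {"identity", "contact", "preference", "transaction", "account"}
WITHHELD_KINDS = {"metadata", "audit", "inference"}

# Illustrative record: (field name, kind, value)
RECORD = [
    ("name", "identity", "A. Consumer"),
    ("email", "contact", "a@example.com"),
    ("purchases", "transaction", ["book", "lamp"]),
    ("guid", "metadata", "f81d4fae-7dec"),
    ("txn_audit", "audit", "0xdeadbeef"),
    ("inferred_segment", "inference", "likely gardener"),
]

def accessible_view(record):
    """Return only the fields the consumer would see under this rule."""
    return {name: value for name, kind, value in record
            if kind in ACCESSIBLE_KINDS}
```

The design choice is that the rule lives in the category assigned to each field, not in per-field logic, which keeps the policy auditable as new fields are added.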

How does this approach differ from the other approaches?

It may be that much of the data is treated similarly under each of the approaches. On the other hand, it is clear that under this third approach there will be categories of data to which access is more limited than under the other approaches. For example, inferred data, "non-factual" data, or internal identifiers may be less accessible than under the other approaches. This approach does, however, afford the flexibility to alter the calculus: if the decision is to protect so-called sensitive information (financial, health, or relating to children), then this information, regardless of its provenance, should be accessible.

Proponents would argue:

  • By allowing each type of data to be considered separately, we can undertake a more accurate balance of the propriety of providing access.
  • This approach provides a more realistic way to vindicate both consumer and business expectations.
  • This approach would depress costs to all consumers. With a broad-based approach that encourages access to most data, consumers who are less interested are forced to bear the costs of creating the infrastructure. A narrower approach would allow costs to be more fairly apportioned.

Opponents would argue:

  • There are far too many factors involved to allow a comprehensible set of rules to emerge. Moreover, many of the factors, e.g., sensitivity, are difficult to assess objectively.
  • This approach does not recognize that the predominant purpose of providing access is to inform consumers of what is "out there" about them.
Type of Access

View

Consumers should be able to view all information to which they have access.

Edit

Consumers should be able to edit all information to which they have access that is not certified by the business or a 3rd party.

The business should provide a process by which consumers can challenge the correctness of the certified information and request changes to the information. The business is not obligated to change information that it believes is correct per its own certification (e.g., the record of a purchase transaction) or the certification of a 3rd party, but should provide a process by which disagreements concerning the correctness of the information can be arbitrated.

Delete

Consumers should be able to delete all consumer-contributed information.

The business should provide a process by which consumers can challenge the correctness or appropriateness of information from other sources and request deletion of the information. The business is not obligated to delete 3rd-party-sourced or self-sourced information that it believes is correct and appropriate to retain, but should provide a process by which disagreements concerning the accuracy and appropriateness of the information can be arbitrated.
Means of Access

Access should be provided via a means appropriate for the type of information and consistent with its storage and use by the business. If the business stores the information in online storage such that it is instantly available for use by the business (e.g., as part of an online transaction processing system or a web-based e-commerce system), then instantaneous online access should be provided to consumers via an appropriate online terminal (e.g., web browser, ATM, telephone voice response unit).

If the business stores the information in storage for processing by batch processing systems(7) (e.g., a batch billing system), then the information should be available to consumers via a frequently (e.g., once per week) scheduled batch process (e.g., a report run at regularly scheduled intervals and mailed to the consumer).

If the business stores the information in offline storage (e.g., magnetic tapes stored offsite), then the information should be available to consumers via an ad-hoc batch process (e.g., scheduled on demand).

Cost to Consumers

There should be no charge to consumers for reasonable requests for view, edit and delete access to online information about them.

Consumer requests for access made no more frequently than the rate at which the information changes under normal circumstances are considered reasonable requests for access. A business may assess a reasonable charge to cover its expenses for more frequent requests to online information.

Businesses may also assess reasonable charges to cover their expenses for batch access requests and requests to offline information.
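The fee rule above can be read as a frequency test: requests arriving no more often than the data normally changes are free, while more frequent online requests and batch or offline requests may carry a reasonable charge. A hypothetical sketch (the fee amounts and parameter names are invented for illustration):

```python
# Hypothetical sketch of the fee rule above: view/edit/delete requests made no
# more often than the data normally changes are free; more frequent requests
# to online data may carry a reasonable charge, as may batch/offline requests.
# All fee amounts are placeholders, not recommendations.

def access_fee(days_since_last_request, data_change_interval_days,
               storage="online", batch_fee=5.00, frequent_fee=2.00):
    """Return the fee (in some currency unit) for an access request."""
    if storage != "online":
        return batch_fee            # batch or offline retrieval may be charged
    if days_since_last_request >= data_change_interval_days:
        return 0.00                 # reasonable frequency: no charge
    return frequent_fee             # more frequent than the data changes
```

For example, a consumer requesting a monthly-updated record once a month pays nothing, while a daily request against the same record could be charged.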

Authentication considerations of the "case-by-case approach"

The authentication issues arising in this approach would depend upon the data that was deemed appropriate for access. Thus the discussions under the previous two options are relevant.

This approach may allow considerations of data sensitivity and use to be valued in considering the provision of access. In turn, decisions surrounding security and perhaps authentication needs would vary depending upon the sensitivity of the data the business maintains. Particularly in the difficult area of non-account information, the risks of inappropriate access to sensitive information might benefit from the case-by-case review prior to establishing access procedures. There are important privacy interests on both sides that must be respected.

In addition, the definition of access - does it include correction and deletion rights? - creates questions of sensitivity. Where access also connotes the right to amend or correct the information, it may be important to heighten or readjust the authentication requirements.

III. The meaning of Access: access, correct, amend, delete

A. Should the ability to access, edit or correct data vary with the use of the data?

a. Yes, no need to access, edit or correct data that is not actively used for anything, or merely maintained for system integrity, troubleshooting, or auditing.
 
b. Yes, only need to allow access to, editing of, and correction of data that is used to make important decisions, such as financial, medical, or employment decisions.
 
c. Yes, where the information is collected from a public record source, a fair credit reporting agency, or another entity that is responsible as the source of the information, edits and corrections should be directed to the source of the data. Access requests may also be directed to the source where required.
 
d. No, the consumer should have the right to access, edit, or correct any data collected and maintained about them, so long as it can reasonably be made accessible by the holder of the data.
 
Costs and Benefits Discussion:

Many members of the sub-committee thought the use of the data should not be a factor in determining whether or not to grant a consumer the ability to access, edit, or correct data maintained about them. Although the way the data is being used is an important consideration, it is a slippery slope. What is collected today and not used might be used in the future. What is considered an unimportant use or decision by some might be considered very important by others. Who should decide what decisions are "important," and what is the basis for that distinction? Furthermore, if data is not really used, or if care is not taken to ensure its accuracy, then why go through the expense of collecting and maintaining it?

Some sub-committee members stated that privacy is not a process but a "commitment." These sub-committee members believed the "process" definition causes companies not to properly narrow their uses of personal information.

B. Should the ability for a consumer to edit or correct data be determined by the type of data?

a. This could be viewed as part of the case-by-case approach, but could be integrated in to other approaches to access.
 
For purposes of this discussion we felt it was best to group data into three broad classes, namely:
 
a. Whatever data the company maintains
 
b. All but inferred data, with the exception of inferred data handled under separate laws or regulations (e.g., a credit or loan decision)
 
c. Only physical contact information, online contact information, biometric identifiers, financial account identifiers, sensitive medical data, transactional data and image (or other information linked to these categories)

What should be done in situations where derivations are a source of competitive advantage, as in the case of credit scoring or risk assessment? There is a case for not having to provide a customer access to inferred data, as this information may be the result of a proprietary model that provides the company a competitive advantage, e.g., an indicator of a customer's future purchase behavior. The only counter would be when the derived data is used to make a decision about the customer that results in an important denial of services, e.g., the granting of a loan. However, it should be noted that consumers may be more interested in information that is derived about them than in the detailed information that was used to derive it in the first place.

There are costs and benefits to both business and consumers that must be considered here. Consumers face a higher cost in not having correct data for certain types of information (credit information vs. marketing information, for instance). Some sub-committee members believe that there is a benefit to providing access in general to all types of information held by all businesses, and these benefits must be weighed against the costs.

Who should be allowed to edit or correct data? An authenticated user only? An authenticated user or an agent acting on their behalf?

Should entities requesting that information be corrected have to provide proof that the information is wrong? Yes, corrected information should be verifiable.

Should consumers be able to correct any wrong information? Yes, why not? It is important for both the service provider and the consumer to work from a common base of correct information. The only caveat is that the information must be verified: just as we require proof that the information being corrected is wrong, the new information must be verified as correct.

Should users be able to correct an inference? Some sub-committee members stated that ascertaining whether inferences are right or wrong will be difficult and costly. Also, many inferences are not presumed by the inferrer to be correct, but instead are useful for drawing general conclusions rather than conclusions of fact; therefore this category of information is not practical for the user to correct. Other sub-committee members believe this is information formulated about a consumer and used in ways that affect their interactions with businesses. These members believe consumers have a strong interest in being able, at the very least, to view all the information that describes them in the hands of businesses.

What about clickstream information or log data? Such information could be wrong in one part per million. Providing the ability to edit or amend this information could be a considerable undertaking and fantastically expensive.

Must companies retain a record of the information that was incorrect after it has been corrected? Why would a company want to, except perhaps as a record of decisions and transactions that might have been made erroneously based upon the incorrect data prior to correction? Certainly, companies should be allowed to maintain a record of the information that was incorrect after it has been corrected, but not required to do so. What should be done in the event that the accuracy of the data is disputed and irreconcilable? Unless there is room for reasonable doubt and disagreement (e.g., an inference), an investigation should take place.

There is a distinction between indicating which information is incorrect and actually correcting the information. Which do we want? One can't be too careful about correcting data: we must be sure that the correcting source is authenticated and that the corrected information is verifiably correct.

Some sub-committee members believe the entire principle of access lays a framework for the correction of data. Access alone provides some benefit to consumers, but a more powerful right is the ability to correct the data.

Concern was expressed by several members of the sub-committee that some options would create substantial authentication hurdles (e.g., to whom do you give access to all of the clickstream and navigation data connected with a particular LUI?).

C. Authentication Considerations of Access, Correction, Amendment

The level of authentication required to safeguard personal information may vary depending upon whether access permits the record subject only to view information or allows the information to be corrected or amended as well. While providing access to the wrong individual violates the record subject's privacy - and may lead to additional harm ranging from embarrassment to loss of employment - allowing personal information to be corrected or amended by the wrong individual can result in other forms of harm. Where correction or amendment is provided, an audit trail should be maintained to aid in identifying potential problems.
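The audit-trail suggestion above can be illustrated with a minimal append-only log of corrections. This is a sketch under assumed names; a real system would also record the authentication evidence the requester presented:

```python
# Hypothetical sketch of an audit trail for corrections: every change to a
# record is appended to a log with the old value, new value, who made it,
# and when, so inappropriate amendments can later be identified and unwound.

from datetime import datetime, timezone

AUDIT_LOG = []

def correct_field(record, field, new_value, actor):
    """Apply a correction and append an audit entry; never overwrite the log."""
    entry = {
        "time": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "field": field,
        "old": record.get(field),
        "new": new_value,
    }
    record[field] = new_value
    AUDIT_LOG.append(entry)
    return entry
```

Because entries are only appended, the log preserves the prior value even after a fraudulent amendment, which is what makes the change-of-address style of attack discussed below detectable after the fact.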

The inappropriate correction or amendment of information could lead to faulty decision-making by those who rely on the record. In circumstances where the record is relied upon for important substantive decisions, such as financial and health decisions, inappropriate changes can have devastating consequences. For example, some criminals have gained access to individuals' credit card accounts by changing the individual's mailing address. The crook would fill out a change-of-address card with the post office, diverting the individual's mail to another location. With access to the individual's bank statements and credit card bills, the crook had ample information to impersonate the victim. The Postal Service has recently initiated changes to make this more difficult.(8)

Therefore, in considering what form of authentication a business should employ the level of authorization conveyed by authentication must be considered.

IV. Access: Where, when and at what cost?

A. Which Entities are required to provide access to data

1. All entities that collect information from a data subject and actively maintain a database of consumer information that can be linked/associated with individual consumers and/or consumer households.
 
2. The entity the consumer reasonably believes is the Data Collector and its agents (entities acting for the Data Collector and restricted in their use and transfer of the data). Notice of transfer to other entities would be required, but access would not be required.
 
3. Data collector, parents, subsidiaries, and recipients including information intermediaries

The committee felt that there were only three reasonable alternatives regarding which entities could be required to provide customers access to data maintained about them.

Obviously, entities that don't possess the data cannot offer access to it.

Clearly, a company collecting information from consumers should, where such data is maintained in a form which can be linked back to an individual consumer or consumer household, make it accessible to the consumer under reasonable conditions of access, unless there is some legitimate reason for refusing (see later sections).

The sub-committee agreed that, for general purposes, at a maximum, access should be provided only for information that is maintained online and to which the customer can practically be provided access. For example, information collected but not maintained (e.g., demographic data used to determine candidates for a direct mail solicitation, but not retained after the mailing address list is generated) would not be reasonable to provide access to. However, the sub-committee understands that certain areas of sensitive information (e.g., medical, financial) may necessitate additional rights of access. Another example would be information collected to conform to legal, regulatory, or audit requirements, and maintained off-line, on tapes, or in serial files, which would be difficult and costly to provide access to. As noted in many of the other comments, many members of the sub-committee thought the ability to access was one factor to consider, but that there are other factors which should allow a data collector not to have to provide access (e.g., type of information, use, cost, etc.).

The issue, and a point of contention for the sub-committee, was whether this requirement should be extended to include the parent and all the subsidiaries of the corporation, and whether the right of access should be extended to all parties with whom information has been shared, including information intermediaries hired to assist the data collector - for example, when the customer data management function is outsourced to third parties. Some members of the sub-committee thought this extension of access to third-party recipients was necessary for sufficient consumer protection. The sub-committee generally agreed that corporations should provide access to the data held by their agents (as defined above). However, several members of the sub-committee thought managing other third parties would be unduly burdensome, and that consumers were better protected by requiring companies to provide notice of with whom they will share the information. Other members of the sub-committee held that the provision of notice is unduly burdensome to consumers, who will be less likely to be aware of the existence of such third parties, let alone how to contact those companies and exercise access.

Still other members of the sub-committee believed the issue depended on whether the parent and/or subsidiaries are using this information. If they are, then they should make it accessible and protect it. If not, then no. With respect to "information intermediaries," it depends on how they treat and handle the data. If they use the information, view it and permanently store it then they should make it accessible and protect it. If not, then access is not required.

B. Ease of access.

This includes issues surrounding both whether access fees should be allowed, and the degree of effort required by the data access provider to ensure that the information can be easily accessed, understood and corrected by the consumer. It also includes non-economic costs of access, such as potential risks to privacy.

1. Fees

i. Never charge any fee. No costs should be incurred by the consumer to access their information.

ii. Selectively charge nominal fees:

1) Fees commensurate with type of data being accessed.

2) Fees commensurate with the use of data being accessed.

3) Fees commensurate with the amount of data being accessed.

4) Fees commensurate with frequency which a user accesses the data.

5) Fees commensurate with the nature of the data access requirement (e.g., the customer wants real-time access to the data when normal access is not real-time, such as access normally provided within 24 hours).

iii. The service provider is free to charge any reasonable fee, but the fee must be kept within specified ceilings and floors

iv. Always charge a fee

2. Usability of the access and correction system

i. Interface is easy to use and does not require any special training for a non-technical lay person; e.g., it should be no harder to use than any of the services provided by the service provider.

ii. Information is legible and intelligible (e.g., no difficult-to-decipher codes)

iii. The access and correction systems should both be reasonably available.

Adequate notice should be made to the consumer of what information is available for access and how to access and correct this information.

Costs and Benefits Discussion:

Should fees be waived if there is a hardship?

Privacy implications

A. Centralization

As many companies that are holding personal information are part of a larger corporate entity that may possess other data through different subsidiaries, would access to all the information held by the parent company necessarily bring together all this previously separated information? And, would this combining of information in itself pose an increased threat to personal privacy?

Sub-committee members agreed the goal of access is not to centralize more personal information. The most expansive interpretation of access should not have the indirect effect of creating a new file or record on an individual. Under this hypothetical expansive interpretation, the individual would have access to all available personally identifiable information existing at the time of the request.

However, some sub-committee members believe that these concerns should not prevent parent companies from implementing procedures increasing ease of access. One proposal, made by Rob Goldman of Dash.com, is to have parent companies create a central page that would direct consumers to the various subsidiaries that may hold different pieces of personal information in their own distinct records. Even this simple integration of information, however, might increase the vulnerability of an individual's information to compromise - e.g., a bad guy who can guess the password can get access to all of the customer's private information from one convenient location. Also, such a linked page may be extremely difficult to manage for companies which regularly acquire and divest subsidiaries.

As general background on the issues raised in this document, the subcommittee recommends study of the Department of Commerce's European Union Directive on Data Protection FAQ #8. The current version of this FAQ can be found at http://www.ita.doc.gov/td/ecom/RedlinedFAQ8Access300.htm

B. Authentication devices

The Committee wishes to emphasize the difference between authentication and identification. As we seek to provide individuals with access to personal information, we must not move toward greater identification of individuals.

Maintaining the ability of individuals to be anonymous on the Internet is a critical component of privacy protection. Access systems should not require identification in all instances. Biometrics raise additional privacy concerns that must be explored and addressed. Finally, third-party authentication systems raise important privacy concerns (creating additional records of individuals' access requests). Inserting a third party into the relationship creates an additional opportunity (at times it may be a responsibility) to collect and maintain information about the individual's interactions. What policies govern these entities' use of personal information? On the other hand, third parties - intermediaries - can also play a role in the protection of identity. Currently, several companies have established themselves as intermediaries, positioning themselves as protectors of identity and privacy between the individual and other entities.

V. Security

The Advisory Committee also examined how to ensure the security of personal data gathered by commercial websites.

A. Competing Considerations in Computer Security

Security has often been treated as an obligation of companies that handle personal data. But security, particularly computer security, is difficult to define, especially in a regulatory or quasi-regulatory context. Identifying the most effective and efficient solution for data security is a difficult task. Security is application-specific. Different types of data warrant different levels of protection.

Security - and the resulting protection for personal data - can be set at almost any level depending on the costs one is willing to incur, not only in dollars but in inconvenience for users of the system. Security is contextual: to achieve appropriate security, security professionals typically vary the level of protection based on the value of the information on the systems, the cost of particular security measures, and the costs of a security failure in terms of both liability and public confidence.

To complicate matters, both computer systems and methods of violating computer security are evolving at a rapid clip, with the result that computer security is more a process than a state. Security that was adequate yesterday is inadequate today. Anyone who sets detailed computer security standards - whether for a company, an industry, or a government body - must be prepared to revisit and revise those standards on a constant basis.

When companies address this problem, they should develop a program that is a continuous life cycle designed to meet the needs of the particular organization or industry. The cycle should begin with an assessment of risk, followed by the establishment and implementation of a security architecture and the management of policies and procedures based on the identified risk; training programs; regular audit and continuous monitoring; and periodic reassessment of risk. These essential elements can be designed to meet the unique requirements of organizations regardless of size.

In our recommendations to the FTC, we attempt to reflect this understanding of security. Our work, and this report, reflect the various types of on-line commercial sites, and the fact that they have different security needs, different resources, and different relationships with consumers. The report reflects this understanding and seeks to identify the range of different possibilities for balancing the sometimes competing considerations of security, cost, and privacy.

B. Regulating Computer Security - Preliminary Considerations.

Before turning to the options it is worthwhile to comment on several issues that the Committee considered but did not incorporate directly into its list of options.

First, we considered whether guidelines or regulations on security should contain some specific provision easing their application on smaller, start-up companies or newcomers to the online environment, but we ultimately determined that new entrants should not receive special treatment when it comes to security standards. In part, this is because organizations that collect personal data have an obligation to protect that data regardless of their size. In part, this is because we concluded that any risk assessment conducted to evaluate security needs should take into account the size of the company (or, more appropriately, the size of a company's potential exposure to security breaches). In many cases (but not all), a smaller website or less well-established company will have fewer customers, less data to secure, and less need for heavy security. A smaller site may also have an easier time monitoring its exposure manually and informally. And of course, even a small site may obtain security services by careful outsourcing.

Second, we noted that several of the proposed options depend on or would be greatly advanced by inter-industry cooperation and consultation on appropriate and feasible security standards. In conjunction with the adoption of any of the proposed options, we urge the FTC or the Department of Justice to make assurances to industry members that cooperation in the development or enforcement of security standards and procedures will not result in antitrust liability.

Third, it is vital to keep in mind that companies need to protect against internal as well as external threats when considering solutions designed to secure customers' personal data. Many companies have already implemented information security policies that protect sensitive corporate data (i.e., compensation information) by limiting access to only those employees with a "need to know." Companies need to implement similar measures that protect customer data from unauthorized access, modification or theft. At the same time, mandated internal security measures can pose difficult issues. For example, it is not easy to define "unauthorized" employee access; not every company has or needs rules about which employees have authority over computer or other data systems. And many companies that have such rules amend them simply by changing their practices rather than rewriting the "rule book." Even more troubling is the possibility that internal security requirements that are driven by a fear of liability could easily become draconian - including background checks, drug testing, even polygraphs. We should not without serious consideration encourage measures that improve the privacy of consumers by reducing the privacy of employees.

Fourth, we are concerned about the risks of regulation based on a broad definition of "integrity." Some concepts of security - and some legal definitions - call for network owners to preserve the "integrity" of data. Data is typically defined as having integrity if it has not been "corrupted either maliciously or accidentally" [Computer Security Basics (O'Reilly & Associates, Inc., 1991)] or has not been "subject to unauthorized or unexpected changes" [Issue Update on Information Security and Privacy in Network Environments (Office of Technology Assessment, 1995, US GPO)]. These definitions, issued in the context of computer security rather than legal enforcement, pose problems when translated into a legal mandate. If integrity is read narrowly, as a legal matter it would focus on whether a Website has some form of protection against malicious corruption of its data by external or internal sources. If the definition is read broadly, it could lead to liability for data entry errors or other accidental distortions of the personal information a site maintains. Authentication controls, which govern who may access information, are an integral part of system security. To establish appropriate authentication, businesses must therefore consider the value of the information on their systems to both themselves and the individuals to whom it relates, the cost of particular security measures, the risk of inside abuse and outside intrusion, and the cost of a security failure in terms of both liability and public confidence.
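The technical side of the integrity definitions quoted above can be made concrete. A minimal sketch of change detection, assuming a site keeps a keyed tag (an HMAC) alongside each stored record; the record contents and key handling here are hypothetical, and a real deployment would protect the key in a separate key store:

```python
import hmac
import hashlib

SECRET_KEY = b"server-side secret"  # hypothetical; in practice kept in a protected key store

def seal(record: bytes) -> bytes:
    """Compute an integrity tag for a stored record."""
    return hmac.new(SECRET_KEY, record, hashlib.sha256).digest()

def verify(record: bytes, tag: bytes) -> bool:
    """Return True only if the record is unchanged since it was sealed."""
    return hmac.compare_digest(seal(record), tag)

record = b"name=Jane Doe;pref=books"
tag = seal(record)
assert verify(record, tag)              # unmodified data passes
assert not verify(record + b"x", tag)   # any alteration is detected
```

Note that the tag flags both malicious corruption and accidental change without distinguishing between them - which is precisely why a legal mandate built on the broad reading of "integrity" risks sweeping in ordinary data entry errors.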

C. Notice and Education

After considerable discussion, the Advisory Committee has developed a wide range of possible options for setting standards for protecting personal data gathered by commercial websites. Before presenting these options, we will address two policy options that the group considered but determined were unsatisfactory on their own. While insufficient standing alone, the Advisory Committee concluded that development of programs to educate consumers on security issues and a requirement that companies post notice describing their security measures are approaches that should be examined as possible supplements to some of the options in Section D.

Notice. Notice is viewed as an appropriate tool for informing individuals about the information practices of businesses, and it is critical to the consumer's ability to make informed choices in the marketplace about a company's data practices. In the area of security, however, as in the area of privacy, there is not necessarily a meaningful correlation between the presence or absence of a security notice and the true quality of a Website's actual security. A security notice would be more useful if it allowed consumers to compare security among sites in an understandable way. But because it is difficult to convey useful information in a short statement on a subject as complex as the nuts and bolts of security, most such notices would be confusing and convey little to the average consumer. Further, providing too many technical details in a security notice could serve as an invitation to hackers. (As was discussed at some length by the Advisory Committee, these considerations also mean that it is not possible to judge the adequacy of security at Websites by performing a "sweep" that focuses on the presence or absence of notices.)

Notice is important in triggering one of the few enforcement mechanisms available under existing law. If a posted notice states a policy at variance with the organization's practices, the FTC may exercise its enforcement powers by finding the organization liable for deceptive trade practices. But security notices are ineffective standing alone and should not be adopted as a standalone option. At the same time, we believe they could be useful in conjunction with one of the other options discussed in Section D. The form such notice should take will vary depending upon the option selected.

Consumer Education. In addition to notice, consumer education campaigns are also useful to alert consumers about security issues, including how to assess the security of a commercial site and the role of the consumer in assuring good security. Regardless of what security solutions the FTC decides to recommend, it would be extremely valuable for the FTC or industry associations to sponsor consumer education campaigns aimed at informing Internet users about what to look for in evaluating a company's security. In addition, no system is secure against the negligence of users, so consumers must be educated to take steps on their own to protect the security of their personal data.

D. Options for Setting Website Security Standards

The Advisory Committee has identified two sets of options for those seeking to set security standards. These security recommendations apply both to information in transit and information in storage. In essence, these options address two questions: How should security standards be defined? And how should they be enforced?

The question of how security standards should be defined requires consideration of the parties responsible for the definition as well as issues of the scope, flexibility, and changeability of the standards. The entities that could be responsible for setting security standards include government agencies, courts, and standards bodies. Alternatively, it could be left to websites themselves to develop security programs (perhaps with a requirement that each site develop some security program), or to market forces and existing remedies to pressure websites into addressing security at an appropriate level.

In this section, we set forth five options for setting security standards that fall along a continuum from most regulatory to most laissez faire. Each of the proposals reconciles the three goals of adequate security, appropriate cost, and heightened protections for privacy in a different manner. Policy makers should consider this when selecting a course of action. For each option, we have presented the arguments deemed most persuasive by opponents and proponents of the option.

1. Government-Established Sliding Scale of Security Standards - Require commercial Websites that collect personal information to adhere to a sliding scale of security standards and managerial procedures in protecting individuals' personal data. This scale could specify the categories of personal data that must be protected at particular levels of security and could specify security based upon the known risks of various information systems. In the alternative or as part of the standard, there could be minimum security standards for particular types of data. The sliding scale could be developed by the FTC or another government agency and incorporate a process for receiving input from the affected businesses, the public, and other interested parties.

Proponents would argue:

1) A sliding scale allows for the matching of consumer protection risk to data source, thereby allowing companies to develop a more efficient compliance and technology infrastructure.
 
2) A sliding scale provides commercial flexibility in the way Websites comply with security standards.

Opponents would argue:

1) This option will embroil the FTC in trying first to gauge the sensitivity of numerous, different types of data and then to match the sensitivity with particular security measures. It is an impossible task, and the results will be a mess.

2) If the sliding scale is produced at a high level of generality, it will be unenforceable and probably incomprehensible; if it is made specific enough to enforce, it will be a straitjacket for many businesses and a series of loopholes for others.

3) Even if it could be prepared properly the first time, a sliding scale would have to be updated almost constantly, a task for which bureaucracies are ill-suited.

2. "Appropriate Under the Circumstances"/"Standard of Care" - Require all commercial Websites holding personal information to adopt security procedures (including managerial procedures) that are "appropriate under the circumstances." "Appropriateness" would be defined through reliance on a case-by-case adjudication to provide context-specific determinations. This standard would operate in a manner similar to that governing medical malpractice for physicians: as the state of the art evolves and changes, so does the appropriate standard of care. An administrative law judge of the FTC or another agency or a court of competent jurisdiction could adjudicate the initial challenge.

Proponents would argue:

1) This approach allows for an assessment of security tied directly to considerations of circumstance and knowledge. It is impossible to summarize in any detail the balance that must be struck between security and usability; even for the most sensitive data, such as medical information, it may be necessary to lower security standards in order to assure prompt treatment for the injured.

2) The creation of a general standard that is informed by the security practices of others similarly situated at a certain date and time allows for flexibility and growth while encouraging ongoing progress. A similar approach is found in judging medical treatment: doctors are not regulated by an elaborate rulebook but rather by the requirement that they practice medicine in accordance with accepted professional standards. The law leaves definition of those standards to the particular case.

3) This approach is designed to encourage increasingly strong security practices. If a bright line rule is adopted, there is little doubt that the pace of technical change will leave the adequacy of regulation in the dust, and what was intended to be a regulatory floor will become a ceiling in practice. Rising tides do raise all boats, except those that are anchored to the bottom.

Opponents would argue:

1) In the absence of clear minimum security standards, courts and companies will lack guidance, because there are no universally accepted security standards.

2) For consumers, the absence of any clear definition of what is sufficient security may put their personal information at risk from companies who do not share the same risk assessment about what is "appropriate under the circumstances."

3) For commercial websites, there are also disadvantages to this approach; their security precautions will not be judged until after a breach has occurred, which means that the precautions are more likely to be viewed as inadequate in hindsight.

4) An after-the-fact security standard could lead many websites to ignore security until they are sued.

3. Rely on Industry Specific Security Standards - All businesses operating online that collect personal information could be required to adhere to security standards adopted by a particular industry or class of systems. There are three quite different options for how the standards could be developed:

a. The standards could be developed by a government-authorized third party through a process that encourages public participation (notice and comment) and may include governmental review.
 
b. The standards could be established by any third-party but the FTC could require that the standards address specific topics (e.g. access, data integrity, notice, authentication, etc.).
 
c. The standards could be developed by any third-party as long as the identity of the standard-setting organization is revealed to consumers (this is in effect a security "seal" program).

Proponents would argue:

1) No government agency is smart enough or fast-moving enough to set network security standards for a particular industry. Industry-specific standards should be set by industry because each sector has different computer security needs and methodologies.

2) Industry groups will have a strong incentive to avoid setting too low a bar. Every company with a brand name is held accountable for the products sold under that name. So too with security standards-setting organizations; those that are associated with serious security breaches will lose the confidence of the public.

3) The three options presented under this heading are quite different, and c. is significantly better than the others. It associates a security standard with a "brand name" so that consumers can decide whether security at the site is sufficient. Option b. simply adds a requirement that the standards address certain issues. In most cases this will be unnecessary and in other cases insufficient. Option a. requires that the government license standard-setting organizations; it also requires notice and comment and perhaps government review for such standards. This option is nearly indistinguishable from requiring government-written standards and will require that the FTC or some other body make hundreds if not thousands of individualized decisions about what security practices should be required in which industries, decisions that will have to be remade every three months as security standards and challenges evolve.

Opponents would argue:

1) Allowing industry to develop (and police) its own standards invites lax standards and under-enforcement. Self-regulatory organizations comprised solely of the industry at issue will not develop robust standards, because doing so may subject their members to additional implementation costs and expose them to greater liability.

2) The insular nature of the standard setting process does not adequately assess and address the needs and values of other parties - other industries, the public, policy makers. In the absence of other stakeholders industry will fail to address important concerns or craft proposals that undercut other important public policies.

3) The standard setting process lacks public accountability. It is inappropriate to develop substantive policy through entities and processes that lack institutional mechanisms for ensuring public accountability and oversight.

4) Opponents will find that options a-c do not address their general concerns with industry-generated standards. However, opponents may find that proposal "a" partially responds to criticisms 1 and 2 because it constructs a process for soliciting public and policy-maker input and review and, to a limited extent, addresses concerns about industry capture and stakeholder participation. Because it does not permit other stakeholders to participate in the formulation of the standards, however, it is unlikely to fully ameliorate these concerns. In addition, because the item to be protected - personal information - is likely to be valued less by businesses than by individuals, the concern about lack of representation is heightened. Opponents may find that proposal "b," while weaker than "a," provides some restraint on the standard-setting process by allowing outside interests to decide what issues must be addressed. Option "c" will garner the greatest opposition, as it fails to address any of the concerns outlined above.

4. Maintain a Security Program - Require all commercial Websites that collect personal information to develop and maintain (but not necessarily post) a security program for protecting customers' personal data. This option could take one of two forms:

a. The contents and methodology of the security program could be specified, and businesses could be required to post a brief notice indicating their compliance.
 
b. The requirement could be limited to a simple mandate that the website adopt a security strategy without specifying the details or requiring that it be posted.

Proponents would argue:

1) A security program is necessary for a commercial website of any size that collects personally identifiable information and wishes to keep the information confidential.
 
2) The scope of the program may vary depending upon the size of the company; in the case of a very small business, one person may be able to handle security effectively on a part-time basis. However, just as marketing, human resources, and accounting are considered essential business functions for companies of any size, maintaining a security program is also critical to any company's operations.
 
3) In support of option 4 a., security professionals believe that any effective program, even if managed by one person part time, should involve the elements of risk assessment, implementation of controls based on the risks, testing and monitoring of controls, and periodic re-assessment of risks.
 
4) Also in support of option 4 a., a statement that the company maintains a security program that assesses risks and implements appropriate controls to address those risks need not be incomprehensible to consumers or too burdensome for businesses to comply with, and it assures consumers and businesses that security has been considered in the system design.

Opponents would argue:

1) Developing and maintaining a program -- but not testing it or otherwise verifying or assuring that the organization is complying with the program -- will only result in an illusion of security.

2) The costs of developing, testing, verification, and assurance (especially to small or not technically savvy businesses) will be significant, diverting resources from the main business purpose. Many firms would not know where to turn or how to take the first step in developing such a program.
 
3) If the plan description is posted, much of it may both be incomprehensible to non-technical users and all-too-clear to technically savvy attackers.

5. Rely on Existing Remedies - Before requiring any particular security steps, wait to see whether existing negligence law, state attorneys general, and the pressure of the market induce Websites that collect personal information to generate their own security standards. It is worth noting that the insurance industry has started to insure risks associated with Internet security. The emergence of network security insurance may force companies to seriously address security issues, as the presence or absence of adequate security will be taken into account in the underwriting process used to determine premium rates.

Proponents would argue:

1) Consumers who suffer harm as the result of negligence can typically bring tort actions. There is no reason to think that consumers who are harmed by a breach would lack a remedy for any specific injury they may suffer.
 
2) Damages are often quantifiable (credit card charges or lost work time due to identity theft for example). And even when they are not quantifiable (disclosure of embarrassing medical data, for example), the problem is no more difficult for juries to resolve than similar intangible wrongs routinely resolved by juries today (libel damages, for example, or "false light" claims).
 
3) It is therefore reasonable to wait for such litigation and to correct any gaps that may emerge in the law when and if the lack of a remedy has been demonstrated.

Opponents would argue:

1) This approach does nothing proactive to advance good practices in the marketplace, and will result in a long delay before security issues are addressed and consumers are protected. It will take some time before litigation based on existing negligence law results in judgments. And it will take time for the market to respond to this, if that even happens at all.
 
2) If relying on existing remedies fails to work, we will then be in the same or a worse position than we are in now, and many more consumers will have had their privacy violated due to security breaches.

3) In the meantime, businesses that would welcome guidance from experts may be left to flounder and face lawsuits because of a lack of awareness, even if they are well intentioned.

E. Security Recommendation

The great majority of the Committee believes that the best protection for the security of personal data would be achieved by combining elements from Options 2 and 4. We therefore recommend a solution that includes the following principles:

a) Each commercial website should maintain a security program that applies to personal data it has collected.
 
b) The elements of the security program should be specified (e.g., risk assessment, planning and implementation, internal reviews, training, reassessment).

The security program should be appropriate to the circumstances. This standard, which must be defined case by case, is sufficiently flexible to take into account changing security needs over time as well as the particular circumstances of the website -- including the risks it faces, the costs of protection, and the data it must protect.

Enforcement Options

Government Enforcement Program - The FTC or another agency could enforce compliance with standards using its current enforcement power or newly expanded authority. The enforcement regime could provide for civil fines, criminal fines, or both, as well as other equitable remedies. (This option is, in some respects, modeled after the regulations governing the financial services industry as enforced by the Federal Financial Institution Examination Council (FFIEC). The FTC could establish a similar enforcement regime for other industries.)

Third-Party Audit or Other Assurance Requirements - Rely on independent auditors to ensure compliance with standards. This structure could require security standards to be verified by an external body and could require public disclosure of the findings. This option would provide more flexibility and could adjust faster to the changing threat environment. It would, however, introduce an additional cost and overhead that may not be justified by all industries and for all levels of risk exposure. It would, on the other hand, introduce a neutral, objective assessment of a company's security infrastructure relative to its industry.

Create Express Private Cause of Action - Congress could establish a private right of action enabling consumers to recoup damages (actual, statutory, or liquidated) when a company fails to abide by the security standard established through one of the options set out in Section D.

Rely on Existing Enforcement Options - Many of the options include the publication of the website's security procedures or its adherence to particular standards. Such postings are subject to traditional FTC enforcement if the statements are false. It is also of course possible for consumers to bring their own actions for fraud, false statements, or underlying negligence in the handling of the data.

UNUSED TEXT --

D. Security of authentication devices

Authentication devices vary, and so does the likelihood of unauthorized use, loss, and theft. The Committee discussed the problems with over-reliance on passwords - use of one password at multiple places, passwords written on yellow stickies, easily guessed common passwords - all of which compromise the integrity of the authentication system. Similarly, in the offline world, reliance on widely available information such as name, address, and phone number to authenticate the identity and authorization of an account holder is risky. The use of shared secrets (such as Social Security numbers) that have been compromised by widespread use raises additional concerns about the strength of authentication devices. Authenticating identity has become a far more complex endeavor than it once was.
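Some of the password weaknesses noted above can be screened for at enrollment. A minimal sketch, with an illustrative deny-list of common passwords and a hypothetical record of previously used ones:

```python
COMMON_PASSWORDS = {"password", "123456", "letmein", "qwerty"}  # illustrative deny-list

def password_is_acceptable(password: str, previously_used: set) -> bool:
    """Reject short, common, or reused passwords - the failure modes noted above."""
    if len(password) < 8:
        return False                      # too short to resist guessing
    if password.lower() in COMMON_PASSWORDS:
        return False                      # widely known password
    if password in previously_used:
        return False                      # reuse across accounts or sites
    return True

assert not password_is_acceptable("letmein", set())
assert not password_is_acceptable("hunter2", set())            # too short
assert password_is_acceptable("correct horse battery", set())
```

Such checks mitigate only the enrollment-time weaknesses; they cannot prevent a user from writing a strong password on a yellow sticky, which is why the Committee's discussion extends beyond password policy to other authentication devices.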

E. Feasibility of authentication devices

The full Committee also discussed the feasibility of authentication devices. The Committee expressed concern that "perfect" authentication tools may be prohibitively expensive or too cumbersome for widespread use. However, the Committee has heard from authentication vendors that a wide range of authentication solutions available today solve the password "problem" described above. These solutions take the form of hardware tokens that are as easy to use as an ATM card or software tokens that can be downloaded easily to a PC, PDA, or cell phone. The Committee notes that questions of liability for misuse and misappropriation of such devices remain.
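Tokens of the kind described typically derive a short one-time code from a secret shared between the device and the server plus a moving factor such as a counter. A simplified, HOTP-style sketch for illustration only; real tokens implement standardized algorithms, and this is not a deployable implementation:

```python
import hmac
import hashlib
import struct

def one_time_code(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a short one-time code from a shared secret and a counter."""
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % (10 ** digits)).zfill(digits)

secret = b"shared-token-secret"
code = one_time_code(secret, 1)
# Token and server each advance the counter, so a captured code is useless later.
assert len(code) == 6 and code.isdigit()
assert one_time_code(secret, 1) == code   # deterministic for a given counter value
```

Because the code depends on a secret the user never types and changes with every use, this design sidesteps the reuse and yellow-sticky problems, at the cost of distributing and managing the devices.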

F. Liability

The allocation of liability for inappropriate access and for inappropriate use, loss, or theft of authentication devices is an important consideration. While there is no explicit statutory assessment of liability, a business could currently be held liable for allowing the wrong person to access personal information. At the same time, if a company allows an unauthorized individual other than the data subject to access personal information, it is unclear whether the individual would have a remedy under existing law. This lack of certainty presents a problem for both individuals and businesses. If liability is strict and placed upon businesses, they may raise the barrier to access very high, burdening individuals' access rights in an effort to avoid liability. While there are public relations and other market forces to consider, if there is no express liability for inappropriate access, businesses may not take appropriate care in establishing robust authentication systems, and individuals' privacy may suffer as a result. The question is how to strike a balance that spurs good practices, encourages the deployment of robust authentication devices, and does not overly burden access. This issue is part of the question of how best to facilitate the development of robust and risk-appropriate security and access procedures. As mentioned above, it is an important component of ensuring data integrity and limiting unauthorized access, and it must be expressly considered and addressed within companies' security plans.

Glossary

View Access
The ability of a consumer to examine a piece of information.
 
Edit Access
The ability of a consumer to change a piece of information.
 
Delete Access
The ability of a consumer to remove a piece of information.
 
Challenge Access
The ability of the consumer to request that a piece of information be changed or deleted (usually because the consumer considers the information incorrect or unnecessary for the business to retain).
 
Consumer Certified Information
Information about a consumer that the consumer has asserted is correct. For example, shopping preferences submitted by the consumer.
 
3rd Party Certified Information
Information about a consumer that a 3rd party has asserted is correct. For example, a medical diagnosis provided by a physician.
 
Self-Certified Information
Information about a consumer that the business asserts is correct. For example, the information associated with a transaction between the consumer and the business.
 
Uncertified Information
Information collected by the business that is not certified by the consumer, a 3rd party, or the business. For example, click stream information that may or may not represent the actions of a particular consumer.
 
Consumer Contributed Information
Information the consumer has explicitly provided directly to the business. For example, the consumer's credit card number as entered by the consumer in the course of completing a transaction.
 
3rd Party Sourced Information
Information provided to the business by a 3rd party. For example, a credit report provided by a credit reporting agency.
 
Self-Sourced Information
Information collected by the business without the active participation of the consumer. For example, click stream data.

Endnotes:

1. "Personally identifiable information" is substituted for the BBBOnLine's term "individually identifiable information." "Prospect information," a term borrowed by BBBOnLine from the Direct Marketing Association, is information provided by a third party, such as when ordering a gift.

2. Information collected online by others than the organization to whom the access request is made, or collected offline, is not "III." However, if "III" is merged with other non-III data, the access request would cover the merged data.

3. This was carefully constructed language that borrowed from a concept in the Americans with Disabilities Act, which requires certain accommodations if not an "unreasonable burden," generally interpreted roughly to mean "do it unless the cost is very great and that cost far outweighs the benefits."

4. The Directive states, in relevant part, that "Member States shall guarantee every data subject the right to obtain from the controller:

(a) without constraint at reasonable intervals and without excessive delay or expense: - confirmation as to whether or not data relating to him are being processed and information at least as to the purposes of the processing, the categories of data concerned, and the recipients or categories of recipients to whom the data are disclosed, - communication to him in an intelligible form of the data undergoing processing and of any available information as to their source, - knowledge of the logic involved in any automatic processing of data concerning him at least in the case of the automated decisions referred to in Article 15 (1);
 
(b) as appropriate the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive, in particular because of the incomplete or inaccurate nature of the data;
 
(c) notification to third parties to whom the data have been disclosed of any rectification, erasure or blocking carried out in compliance with (b), unless this proves impossible or involves a disproportionate effort.

5. Commentary by the Trans Atlantic Consumer Dialogue on the Safe Harbor Access policy.

6. Derived (or inferred) information has been defined by the Online Access & Security Committee as: "information attributed to an individual that is derived from other information known or associated with the individual. Imputed data can be data generated through the application of a mathematical program to known data, or it can be information such as census data that can be imputed to a range of individuals based on residence or some other trait (commonly called overlay data)" and "deductive information inferred from detailed data which has proprietary value based upon the unique business logic applied to raw data (e.g. profile information)." Derived data is similar to credit scores in the context of credit reports.

7. Rather than debate what is meant by "online information," I've chosen to include all information that could have been collected online or used online, even if it is no longer stored in an "online" system.

8. See The Privacy Rights Clearinghouse http://www.privacyrights.org/AR/id_theft.htm for more information.