Privacy Principles for California




Copyright © 1998-2014
Privacy Rights Clearinghouse
Posted March 3, 1998

Draft -- for Discussion Purposes
Note:  These principles were drafted for discussion purposes by the Joint Legislative Task Force.
They were not adopted.

Prepared for the Joint Legislative Task Force on Personal Information and Privacy
Senator Steve Peace, Chair

Summary of Privacy Principles

Definition: The word "organization" is used broadly to also mean government agency, business, nonprofit, association, etc.

1. Principle of proactivity. Privacy implications are recognized explicitly and shall be considered when personally identifiable information is to be collected, accessed, stored, merged or otherwise manipulated, and when the application of any new information technology is introduced.

2. Principle of secondary use. Personal information shall not be used or disclosed for purposes other than those for which it was collected. Secondary use is permitted only with the affirmative consent of the individual.

3. Principle of access. An organization shall make specific information available to individuals about its policies and practices relating to the handling of personal information. Individuals shall have reasonable means to learn about, obtain and review, and when necessary, correct and amend information about themselves.

4. Principle of affirmative consent. The knowledge and consent of the individual are required for the collection, use or disclosure of personal information.

5. Principle of relevance. The collection of personal information shall be limited to that which is necessary for the transaction with the individual and purposes identified by the organization. The purpose for which personal information is collected should be specified at the time of collection. Personal information shall be retained only as long as necessary for the fulfillment of those purposes.

6. Principle of accuracy. Personal information shall be as accurate, complete and up-to-date as is necessary for the purposes for which it is to be used.

7. Principle of security. Reasonable physical, technical and administrative safeguards will be taken to protect personal information against the risk of unauthorized access, collection, use, disclosure or disposal.

8. Principle of accountability. An organization is responsible for personal information under its control and shall designate a person who is accountable for the organization's compliance with the principles. A mechanism for oversight and enforcement shall be established to ensure the observance of these principles. An individual shall be able to challenge compliance with the above principles with the person who is accountable within the organization.

9. Principle of progress. As information technologies advance, privacy considerations are likely to change. The principles will be reviewed on a regular basis to ensure their adequacy.

* * * * * * * * * * * * * *

Analysis of Privacy Principles

Introduction

California is a uniquely privacy-conscious state. It is one of only ten states with a constitutional right to privacy. The constitutional provision not only prohibits the state from committing privacy intrusions, but applies to the private sector as well (Article 1, Section 1, California Constitution, amended in 1972).

California has many privacy-related laws on the books. These address government agency information use, telephone records and wiretapping, credit reporting, telemarketing, medical records, employment records, cable television viewing patterns, video rental records, merchant information gathering, insurance record-keeping, and identity theft. In many instances, California has led the nation in the creation of such laws.

Californians themselves take extra steps to protect their privacy. For example, over 50% of households have unlisted telephone numbers. That figure reaches nearly 70% in Sacramento, San Diego, Los Angeles, San Jose, Fresno and Oakland. The national average, in contrast, is 24% of households. (Survey Sampling Inc., 1997)

The Privacy Rights Clearinghouse itself is uniquely Californian, the only program of its kind in the country. It was established with funding from the California Public Utilities Commission's Telecommunications Education Trust (TET), a grant program established in the late 1980s from a fine levied upon Pacific Bell for deceptive and abusive marketing practices. At the time, both the fine and the TET grant program were unprecedented in the nation. (The TET program ceased operation in 1995.)

California is also home to many high-tech companies that have pioneered privacy-enhancing technologies such as encryption. These companies continue to lead the world in developing a broad range of software products that enable individuals, corporations and government agencies to safeguard personally identifiable information.

It is therefore fitting that California's leadership in the development of sophisticated information technologies is matched by its attention at this time to the development of a set of privacy principles to guide government agencies and private sector entities alike in the handling of personal information.

It is fitting that privacy principles are being discussed at this time for another reason. The European Union's (EU) Data Protection Directive takes effect in October 1998. The Directive states that the transmission of personally identifiable information from one of the member countries to any country without "adequate" privacy protection will be prohibited. (Chapter IV, Article 25: "Member States shall provide that the transfer to a third country of personal data ... may take place only if ... the third country in question ensures an adequate level of protection.")

The United States lacks an omnibus privacy protection law. We have instead taken the sectoral approach, adopting specific laws for industries such as credit reporting, cable television, and video records. Policy analysts predict that multinational corporations based in EU countries, for example, will be restricted from transmitting personally identifiable data to the U.S. because we have no overarching privacy law. Nor do we have a Privacy Commission as do the European nations, Canada, New Zealand and Australia. By adopting privacy principles, perhaps California can be the first state in the nation that passes the "adequacy" test of the European Union vis-a-vis its Data Protection Directive.

Source of principles: The following nine principles have been adapted from several existing sets of "fair information practices" developed since the early 1970s. Key among these are the U.S. Department of Health, Education and Welfare Fair Information Practices (1973), the international principles of the Organisation for Economic Co-operation and Development (OECD, 1980), and the Canadian Standards Association Model Privacy Code (1996).

Format of principles: For each of the nine principles, the following issues are discussed: (1) why it is needed; (2) the issue(s)/scenarios addressed by the principle, including situations where the principle was not applied and harm occurred; and (3) precedents, where the principle has been applied in law and/or industry practice.

Definition: The word "organization" is used broadly to also mean government agency, business, nonprofit, association, etc.

* * * * * * * * * * * * * *

1. Principle of proactivity. Privacy implications are recognized explicitly and shall be considered when personally identifiable information is to be collected, accessed, stored, merged or otherwise manipulated, and when the application of any new information technology is introduced.

Why it's needed: It is important that privacy implications be considered proactively rather than reactively. Actions taken by state or local government in which personally identifiable information is at issue can have significant impacts on individuals' privacy. This includes legislative action, regulatory agency decisions, court decisions, and government-funded programs administered by private businesses under contract with a government agency.

Likewise, the personally identifiable information collected and disseminated by private sector entities can also impact individuals' lives. Information compiled in a variety of commercial data bases is used to make decisions about employment, insurance, health care, and credit, to name just a few applications.

The privacy implications of new technologies and new applications of existing technologies must be considered before the information infrastructure is developed. Ignoring privacy implications up front leads to retrofitting the system after the fact -- an expensive proposition, not only in economic terms, but also societal and personal. Once privacy has been lost, it is difficult, and often impossible, to restore.

Issue(s)/scenarios addressed: In the past five years, there have been numerous examples of what happens when information technologies are introduced without first considering the privacy implications.

  • In 1996, Lexis-Nexis launched its P-Trak "people finding" service, which included the sale of individuals' Social Security numbers. The resulting public outcry prompted Lexis-Nexis to alter the product to exclude the SSN.

  • In February 1998, the Washington Post reported that two large supermarket chains with in-store pharmacies, Giant and CVS, were selling customers' prescription information to a Massachusetts company. That company, Elensys, in turn arranged with drug manufacturers to pay the pharmacies to have "educational" messages and solicitations mailed to the customers with particular ailments. Within two days of the news story, both Giant and CVS announced they were curtailing the practice due to the negative response from their customers.

  • In March 1997, the U.S. Social Security Administration began offering access to individuals' Personal Earnings and Benefit Estimate Statement (PEBES) on its Internet web site, www.ssa.gov. Due to immediate public criticism regarding insufficient safeguards to prohibit access by unauthorized users, the SSA suspended operation just one month later. It then launched a series of public forums in which it invited input from private sector high-tech businesses, consumer advocates and public officials. Its resulting implementation plan can be considered a model for any entity which introduces applications of information technologies involving personally identifiable information. The SSA implementation plan includes

(1) the development of a privacy and security policy for online services,

(2) ad hoc advisory assistance from experts,

(3) the preparation of privacy impact assessments on significant projects, and

(4) publication of a periodic privacy review for public dissemination.

[Privacy and Customer Service in the Electronic Age: Report to Our Customers, U.S. Social Security Administration, Publication Number 03-012, Sept. 1997, p. 32.]

The growing practice of government agency "data matching" deserves some special consideration here regarding the principle of proactivity. On both the state and local government levels, government agencies in California are engaged in initiatives which involve the merger of data across several agencies. In San Diego County, for example, the Department of Health and Human Services is developing a "Consolidated Client Index." Another such endeavor is Project Heartbeat, which would merge the various data bases of agencies that serve at-risk youth. The latter is the subject of Assembly Bill 1801 (Davis), which would enable the agencies to share data with one another via a pilot project in San Diego County.

At the state level, the California Department of Education has been studying ways to develop standards for the handling of K-12 student records to enable records to be more efficiently merged and transferred when a student moves to another school. This program is conducted in conjunction with the development of a nationwide standard called SPEEDE/ExPRESS, with the ultimate purpose of developing a longitudinal data base of public school students' records, wherever they might have attended school.

The merger of government agency records, while desirable from the standpoint of cost savings and efficiency of services, is often viewed as the ultimate in "Big Brother" endeavors -- a cradle-to-grave dossier on each of us. If the principle of proactivity is to be applied anywhere, it should certainly be applied in any data matching programs proposed by government agencies. By conducting a privacy impact assessment, government will be able to determine the consequences, both intended and unintended, of such programs. And by applying the remainder of the privacy principles as discussed below, policy makers will be able to further evaluate whether or not the particular data matching program is sound public policy vis-a-vis personal privacy considerations.

Precedent: There are many situations where the principle of proactivity is required and practiced. When legislation is crafted, the fiscal implications must be charted. In the construction of highways and other major building projects, the environmental impact must be assessed. The City of San Diego adopted a privacy policy in October 1996 (900-13) which includes "consideration of privacy effects" as a principle when "introducing and using information technologies."

The implementation of Caller ID in California serves as an excellent example of the principle of proactivity at work. When the local telephone companies proposed to implement Caller ID in the early 1990s, the California Public Utilities Commission (CPUC) held several public forums around the state. It learned that privacy was a significant concern of those who participated. About half of those who testified and submitted written comments thought Caller ID would severely invade their privacy. The other half thought it would enhance their privacy.

The CPUC's resulting decision in 1992 (92-06-065) required that the local telephone providers (Pacific Bell, GTE and several regional companies) educate consumers about the privacy implications of the service before Caller ID could be implemented. Indeed, when the service was introduced in 1996, an extensive consumer education campaign was launched to raise Californians' awareness of the privacy implications of the service and to inform them of their number blocking options.

2. Principle of secondary use. Personal information shall not be used or disclosed for purposes other than those for which it was collected. Secondary usage of information is permitted only with the affirmative consent of the individual.

Why it's needed: The restriction on secondary uses of information is at the heart of all privacy policies dating back to the original code of Fair Information Practices, adopted by the U.S. Department of Health, Education and Welfare in 1973. There is a tremendous temptation, especially in this age of powerful computers and decreasing costs of operation, to find additional uses of information. Author Erik Larson, in The Naked Consumer, discusses the universal laws governing the flow of data collected about individuals. One such law is that "data always will be used for purposes other than originally intended." (Henry Holt and Co., 1992, p. 14)

The individual who enters into a transaction with a service provider such as a merchant fully expects the information that is divulged in that transaction to be used solely to complete the transaction, and perhaps to do business with that entity in the future. Such transactional data might include a credit card number, name and address, a Social Security number, a driver's license number, sizes of clothing, brand names and quantity of supermarket goods, tastes in music, titles of books, videos rented, and so on.

Such data has considerable value beyond the initial transaction. And therein lies the temptation of secondary use -- the use of that information for purposes other than the original reason for gathering it.

Given the increasing digitization of our various daily transactions, the possibility exists that the totality of information collected will be merged to form a comprehensive picture of each of us, an "electronic dossier." The secondary uses of this comprehensive data set are virtually limitless, with the most troubling of them involving surveillance and social control.

As with many of these privacy principles, the principle of secondary usage works in conjunction with other principles, especially relevance and affirmative consent, discussed below. Because information often has value for secondary uses, the information gatherer is likely to want to collect additional information to enhance the value. The principle of relevance (number five) states that only the information necessary for the matter at hand shall be collected. And if information is to be sold, exchanged or otherwise made available for secondary uses, permission from the individual must be obtained -- the principle of affirmative consent (number four).

Issue(s)/scenarios addressed: There are many examples in which personally identifiable information, collected for one purpose, is put to secondary uses without the consent of the individual.

  • Perhaps the most prevalent example of a violation of the principle of secondary usage concerns Social Security numbers. The SSN was originally developed as a record-keeping number for the Social Security Administration's management of U.S. citizens' retirement benefits. The federal Privacy Act of 1974 placed some restrictions on secondary uses of the SSN, but those restrictions apply only to local, state and federal government agencies, and they have been watered down over the years.

  • But no restrictions have been placed on private sector entities, resulting in significant secondary use of the SSN and tremendous harm to individuals. Insurance companies use the SSN as subscriber ID numbers. Cable companies use it to identify their customers. Employers use it as the employee ID. Colleges and universities use the SSN as the student number. The financial industry (credit, banking) uses the SSN as a customer identifier, and even as a PIN.

  • It is the financial industry's use of SSNs that has contributed to the crime of identity theft, estimated to victimize half a million Americans a year. Imposters who obtain someone's SSN -- a relatively easy matter given the SSN's profligate use throughout society -- can apply for credit in the victim's name and open bank accounts. They rack up thousands of dollars of expenses in the victim's name before reaching the credit limits on each account, and then move on to someone else's identity. Victims are left with a ruined credit history, and must spend months and even years regaining their financial health. Identity theft results in billions of dollars in fraud annually, not to mention the loss in productivity to the victims who must spend many hours and days cleaning up their credit.

  • Had the SSN been prohibited from being used as an identifier by private sector entities, and had it not been adopted as the key to consumers' finances, we would likely not be experiencing the explosion of identity theft crimes today.

  • Another example of secondary use is the credit "header," which has taken on a life of its own as a "people-finding" tool. Yet, its original purpose was simply to provide the necessary identification for credit grantors using the individual's credit report to make a decision regarding the extension of credit.

  • The so-called "product registration" form is another example of violation of the secondary use principle. While the ostensible use of this form is to identify the person who has just purchased a product which has a warranty, the information collected on the form -- hobbies, income, education, home ownership -- is used by marketers to solicit that individual with other goods and services. It should be noted that this form usually includes a disclosure and consent statement at the end. But it is written in such small type and vague language that it is questionable whether the statement truly meets the standard of informed consent as required in this principle.

Precedent: There are many precedents in law regarding secondary usage.

  • The Privacy Act of 1974 (5 USC 552a), which applies to federal government agencies, states that information collected for one purpose shall not be used for other purposes without first obtaining permission from the individual.

  • California's own version of the Privacy Act, the Information Practices Act (Civil Code 1798) states in its "legislative declaration and findings" that "in order to protect the privacy of individuals, it is necessary that the maintenance and dissemination of personal information be subject to strict limits."

  • Recently, the Federal Communications Commission issued a ruling which restricts secondary use of telephone records. Telecommunications companies must obtain customer consent before they can use customers' records, calling patterns and other personal information to market new services to them.

  • California voter registration records have been restricted to political- and research-related uses.

  • California Department of Motor Vehicles records have been restricted to specific uses since the actress Rebecca Schaeffer was murdered in 1989 by a stalker who obtained her residential address.

3. Principle of access. An organization shall make specific information available to individuals about its policies and practices relating to the handling of personal information. Individuals shall have reasonable means to learn about, obtain and review, and when necessary, correct and amend information about themselves.

Why it's needed: Information in a wide variety of data bases is used to make critical decisions about each and every one of us. For example, when an individual is turned down for a job, an apartment, or a loan, he or she must be able to determine whether or not the information used to make that decision was accurate. Access to that information is necessary to make this assessment.

Access is also at the foundation of laws regarding government compilation of personally identifiable information. It is especially critical in a democratic society that citizens have a right of access to information about them in order to prevent abuse of power.

An often overlooked aspect of access is education about how access can be obtained. Reasonable efforts must be made to educate individuals about the existence and use of personally identifiable information held by the organization. Education efforts should include how personal information is obtained, used, stored and disclosed, as well as individuals' rights as expressed in these privacy principles.

It should be noted that California has a long tradition of encouraging and requiring consumer education to lessen the harmful impacts of the introduction of information technologies (for example, the California Public Utilities Commission's Telecommunications Education Trust). When consumers are informed, they are better able to make decisions to safeguard their personal information.

Issue(s)/scenarios addressed: While access is a major part of several privacy-related laws, there are still significant gaps where consumers lack access.

  • The compilers of consumer profile information in the direct marketing industry do not afford consumers access to records about them. Companies such as Metromail, Polk, and Database America, to name a few, have not developed the necessary infrastructure to enable consumers to learn what their records contain.

  • The California Information Practices Act, which gives citizens the right to obtain the records compiled about them by state government agencies, has been weakened significantly since the Office of Information Practices was defunded and closed in the early 1990s. That office was instrumental in monitoring access procedures and informing citizens of their right of access. Without this office, the state has no idea of the totality of data bases containing personally identifiable information compiled by state government agencies. And individuals have no "one-stop shopping" clearinghouse for information about their access rights.

Precedent: There are a number of laws addressing access to both private and public sector data bases.

  • Access is at the heart of both the federal Freedom of Information Act and the Privacy Act of 1974. California's equivalents to those acts also include strong rights of access: the Public Records Act and the Information Practices Act.

  • The federal Fair Credit Reporting Act and its California equivalent give consumers a right to obtain their credit report and correct errors. That right also includes being told who else has accessed that report.

  • The federal Cable Communications Policy Act gives subscribers the right to inspect and correct errors in their account record. California also has its version of this law.

  • Several other California laws provide individuals with access to their personal records held by private sector entities. Californians have a right of access to their medical records, something not available in about half the states. We also have a right of access to our employment records. And the Insurance Information and Privacy Protection Act provides access to insurance records.

4. Principle of affirmative consent. The knowledge and consent of the individual are required for the collection, use or disclosure of personal information.

Why it's needed: The definition of informational privacy revolves around control -- the right of individuals to determine when, how and to what extent they will divulge personal information about themselves to others. [Adapted from Alan Westin, Privacy and Freedom, Atheneum, 1967] The foundation of the principle of affirmative consent is the ability of the individual to control his or her personal information.

With the proliferation of data bases containing ever increasing amounts of information about us, the principle of affirmative consent becomes even more important. The totality of information in data bases is being referred to by some as our "digital persona," an entity that is taking on a life of its own. Certainly, when we are represented in a multitude of transactions by a virtual being that is comprised of many discrete bits of data about us, we must be able to control the development of our digital persona through the principle of affirmative consent.

This principle is also at the foundation of a democratic society, both in our relationship to government entities and in our transactions with the private sector. With affirmative consent fully in force, secret information collecting is restricted.

Issue(s)/scenarios addressed: The word "affirmative" has been added to "consent" in this principle for a specific reason. Often, consent is obtained invisibly. It is a "negative option" hidden in the fine print, for example, the Conditions of Use language that individuals receive when they become a customer of a bank or credit card company. Few people read or understand such language. As a result, they give consent for their data to be disclosed to third parties without realizing it.

Another way consent is obtained is through coercion. The Privacy Rights Clearinghouse has learned of numerous companies which refuse to provide service to individuals who will not disclose their Social Security number: some cable TV companies, some medical clinics, some insurance providers, and virtually all cellular phone companies, for example. The PRC has also received complaints from individuals who have been fingerprinted at banks as a requirement to cash noncustomer checks. If they do not consent to being fingerprinted, they are not able to cash the check.

Electronic communications afford many ways for personal data to be compiled without the individual knowing it. Here are some examples:

  • A few years ago, a pharmaceutical company advertised an 800 number which allergy sufferers could call to obtain the pollen count in their city that day. Unbeknownst to them, their phone numbers were collected by the drug company through a process called Automatic Number Identification (there is no way to block the transmission of phone numbers when calling 800 and 900 numbers, even when the Caller ID blocking code *67 is used). Using a reverse directory, the drug company obtained the names and addresses of those callers and sent them mail solicitations about its new allergy medication.

  • Electronic mail addresses are "harvested" by software programs and then used by marketers to send "junk" e-mail solicitations, also called "spam," to millions of Internet users. Such solicitations are sent without consent. They clog the information superhighway and are the source of numerous complaints to Internet providers and lawmakers. Bills currently before the California Legislature and Congress would require consent.

  • When individuals "surf" the Internet, their "clickstream" can be compiled invisibly. These electronic footprints can then be merged into a detailed profile of the individual's interests. Many web sites require that users register before proceeding further, revealing their names, addresses and other personal information. Few such web sites provide privacy policy statements which disclose what is done with such information and whether or not there is the option to restrict third party use of the data. A recent survey conducted by the Federal Trade Commission found that only a small percentage of sites disclose their information collection activities. In a majority of sites, disclosure and consent are absent.

The debate over "opt-in" vs. "opt-out" is at the heart of the affirmative consent principle. When opt-out is practiced, the entity that gathers and disseminates the information assumes consent unless the individual indicates that the information is not to be used, a kind of "negative option." The personal information is disseminated until the time that the individual exercises the opt-out option, if at all.

When opt-in is practiced, it is assumed that the individual does not consent to information being collected and used. Affirmative consent must be provided before the individual's personal information is used. Opt-out is the norm in this country. If the principle of affirmative consent were truly practiced, opt-in would guide the collection and dissemination of personal information.

Generally, the opt-in option is used in situations where sensitive personal information is compiled and used, such as medical records. But California's medical records confidentiality law contains a troubling example of opt-out. The opt-out clause is the opening phrase of the provision quoted below.

California Civil Code 56.16. "Unless there is a specific written request by the patient to the contrary, nothing in this part shall be construed to prevent a provider, upon an inquiry concerning a specific patient, from releasing at its discretion any of the following information: the patient's name, address, age, and sex; a general description of the reason for treatment ...; the general nature of the [treatment], the general condition of the patient; and any information that is not medical information as defined in subdivision (c) of Section 56.05."

How many patients will think to carry a written document to the hospital which instructs staff not to disclose information about their treatment there? That is the nature of opt-out, and the reason the word "affirmative" is added to the principle of consent. In an opt-out scenario, the burden is on the consumer to prohibit the release of information about them. Few consumers are sufficiently aware and proactive to take such action. The practice of opt-in places the burden on the information user to obtain consent from the individual. This enables the individual to be fully informed of the uses to be made of the information and to have the opportunity to exercise the option without coercion.

We close the discussion of affirmative consent by looking at an emerging technology that will test the very efficacy of this and the other principles. Advancements in the technology of video surveillance present several challenges to the principle of affirmative consent. Satellites are now capable of recording images on the ground to the resolution of a few square feet. (Government satellites engaged in intelligence gathering can record images to the resolution of a few inches, although these are not available for commercial use.) Back on earth, video surveillance cameras are becoming commonplace in business establishments, public places, schools, and the workplace. Digital video technology already exists, although not yet for broad commercial use, that can scan faces and obtain "face prints," similar to fingerprints in that they comprise a unique individual identifier. With such technology, a crowd could be scanned and the identities of those present could be known, certainly a chilling possibility, one rife with implications for our civil liberties.

The privacy implications of video and satellite technology are just now emerging. These sophisticated technologies bring into question the principle of affirmative consent, as well as the other principles. They present a challenge for policy makers attempting to limit the many privacy intrusions facing citizens in this information age.

Precedent: A recent amendment to the federal Fair Credit Reporting Act provides individuals with an important right of affirmative consent. When an employer conducts a background check on a job applicant, it must obtain the affirmative consent of that person. (And if a negative hiring decision is made, the applicant must be able to obtain a copy of the report, an example of the access principle, discussed above.)

California law requires consent of all parties before a telephone conversation can be recorded. We are one of only 12 states with an all-party recording law.

Another example of consent is the medical records release form signed by patients to enable their health care provider to release information about the patient to other health care providers and to the insurance company, required under California's medical records confidentiality law, Civil Code 56. Unfortunately, many such release forms are worded broadly. And the element of coercion can be present in such transactions -- no signature, no service.

5. Principle of relevance. The collection of personal information shall be limited to that which is necessary for the transaction with the individual and purposes identified by the organization. The purpose for which personal information is collected should be specified at the time of collection. Personal information shall be retained only as long as necessary for the fulfillment of those purposes.

Why it's needed: The advancement of computer and telecommunications technologies enables companies to collect tremendous amounts of personal information, merge it with other data, and analyze it to find previously unknown relationships. Witness the growth of "data mining" and "data warehousing" in the service and financial industries for the purpose of "database marketing." This growing practice enables the company to learn as much as possible about its customers in order to target products and services to them. Government agencies are also examining the practice of data warehousing, merging the contents of numerous data bases of many agencies into one record, to be used for a variety of applications.

With these vast data collection capabilities comes the temptation to use the data for other purposes, and to take advantage of the data's commercial value by selling it to third parties. The particular danger that arises from the commercial use of government information should be noted here. In order to increase the monetary value of their data, cash-strapped government agencies might collect more information than is needed for the matter at hand. Indeed, government agencies are already selling public records information to private sector information vendors, who in turn package the information and sell it to any number of users. Without strict adherence to the principle of relevance, government agencies may be encouraged to increase the revenue potential of their data by collecting more than is needed.

The principle of relevance places restrictions on the amount of time information can be retained and disseminated. This brings the notion of "social forgiveness" into the discussion. Should, for example, someone's graffiti vandalism conviction at age 19 prevent that person from getting a job 10 years later when his life has been turned around and earlier misdeeds are no longer relevant?

The principle of relevance also limits the element of coercion in the provision of services. In both the private and public realms, individuals are often required to divulge specific information to obtain a good or service -- for example, name and address for telephone service. The principle of relevance prohibits the collection of extra information, unrelated to that good or service.

The principle of relevance works in conjunction with other principles, notably secondary usage. Issues surrounding this principle have been discussed in number two above and are summarized here:

  • These two principles collectively prevent the overcollection of information.

  • They inhibit the development of extensive dossiers on individuals.

  • They require the organization or agency to specify the purpose for which it is collecting the data at the time of collection.

Issue(s)/scenarios addressed: Significant related discussion has already been presented under the principle of secondary usage above. We focus the discussion here on the ultimate question regarding relevance, whether there are situations in which it is appropriate to gather no personally identifiable information -- the issue of anonymity.

We are swiftly approaching the time when virtually all of our daily transactions are conducted via systems in which data is collected. Technology forecasters describe the day when we will all carry multi-purpose "smart cards." These will take the place of cash, driver's licenses, credit and debit cards, library cards, employee ID cards, highway toll-booth recording systems, student IDs, phone cards, health insurance cards, and so on. Smart cards might also hold our medical records, our vital documents such as birth and marriage certificates, and our voter registration form, to name a few applications.

We can all no doubt understand the significant convenience of smart cards. But the privacy-related threats are evident as well. The totality of "electronic bread crumbs" that we leave along our life-long path comprises a comprehensive dossier of who we are. The potential uses of such robust data dossiers give rise to chilling scenarios of surveillance and social control.

The ultimate test for anyone applying the principle of relevance is to ask the question: Is any personally identifiable data required for the application under consideration? Does the cash component of a smart card require the disclosure of one's identity when used at the supermarket, the newspaper vending machine, the subway, the soda machine, the phone booth, or the parking meter? Does the automated highway toll collection booth need to know that Jane Smith passed the recording station at such and such a time? Or is it sufficient to note that account number 54321 was just debited for the amount of the toll? Should data be collected only because it can be gathered, given the power, ease and affordability of computer technology? Or shall we build anonymity into information technologies when appropriate? That will be the true test of the principle of relevance.

Precedent: There are several federal and state laws which embody the principle of relevance, or collection limitation.

  • The federal Privacy Act, for example, requires the agency to determine that the information it has on file is relevant to the mission of the agency. California's version of the Privacy Act, the Information Practices Act, requires that the purpose for which the information is gathered be specified in a notice provided to the individual when the information is collected. But the IPA does not appear to contain a collection limitation clause per se. It should be noted that the IPA only pertains to state government records, not to personally identifiable information at the local government level.

  • California law places limits on the information collected by merchants when customers pay for goods or services by credit card. California Civil Code 1747.08 prohibits the collection of information other than that provided on the credit card -- name, account number and expiration date -- with specific exemptions for situations where, for example, a product must be delivered to a residence. The credit card transaction can be successfully completed with the limited information provided on the card. This law prevents the collection of additional information in order to deter fraud and to protect the privacy of the card holder. It restricts the collection of address and phone number information that might enable the merchant to compile a data base of customers for solicitation purposes.

  • Another example of the principle of relevance can be found in labor law. Employment laws at the federal and state levels prohibit certain questions from being asked of job applicants in order to prevent discrimination based on age, sex, marital status, race and other factors.

The principle of relevance also contains a clause regarding retention of records. Both federal and state credit reporting laws place limits on the length of time negative information can be indicated on one's credit report. Debts more than seven years old must be removed, and bankruptcy information must be removed after ten years. Another example of records retention limitations involves driving records. The California Vehicle Code places limits on the length of time certain driving violations can be part of one's record.

6. Principle of accuracy. Personal information shall be as accurate, complete and up-to-date as is necessary for the purposes for which it is to be used.

Why it's needed: Many important decisions are based on information held in data bases -- whether one gets a job, can rent an apartment, receives insurance, is extended an automobile loan or a mortgage, or is granted a credit card, to name a few situations. For decisions to be made fairly, information must be accurate and up to date.

This principle functions in conjunction with the principle of access. The individual must be able to obtain and review his or her record in order to know whether or not it is accurate.

Issue(s)/scenarios addressed: The annals of the Privacy Rights Clearinghouse are replete with stories of individuals who have been harmed by erroneous information -- a credit report which contains an imposter's credit history, for example; a record in the Medical Information Bureau data base which has been miscoded, indicating a serious medical condition when there is none; a criminal record pulled for the wrong John Smith, resulting in employment denials; to name a few.

The Bronti Kelly case has recently received attention in the news. It provides a classic example of the harm that can befall individuals when data bases contain erroneous information. The Kelly case also exemplifies the importance of adhering to the principle of access (number three).

In the early 1990s, Bronti Kelly was employed by a major department store for a time, and was laid off. He then attempted to obtain employment at many other department stores in Southern California to no avail. With no income, he became homeless. Kelly finally landed a job at a department store, but after one week was laid off. In frustration, he pressed the human resources manager for the reason he had been let go. He was told that a few days after he started work, the company accessed an employment background check service which listed Kelly as having been convicted of shoplifting. Kelly then realized that an imposter who had stolen his wallet years before had been carrying his identification documents when committing crimes, thereby giving Kelly a criminal record.

During his long period of unemployment, Kelly had not been notified about the data base that had been used to make negative employment decisions about him. He therefore had no opportunity to alert employers to his erroneous criminal record. The privacy principles of both accuracy and of access had not been followed by the companies to which he had applied for work. Kelly sued both the department store company and the background check service. He won a small judgment against them in January 1998.

It should be noted that as of October 1997, amendments to the federal Fair Credit Reporting Act requiring disclosure and consent in the background check process would prevent the Bronti Kelly case from happening today.

Precedent: There are strong precedents in both federal and state laws regarding the accuracy of personally identifiable information. The federal Privacy Act and California's Information Practices Act contain provisions which enable individuals to correct or amend erroneous information in government agency records.

In the private sector, the Fair Credit Reporting Act and its California equivalent contain provisions for the correction of erroneous information, and sanctions if records are not corrected. The federal Fair Credit Billing Act contains a similar provision. Additional federal laws in which the individual is able to challenge the accuracy of records and have them corrected include the Family Educational Rights and Privacy Act (FERPA) and the Cable Communications Policy Act. There are similar laws at the state level.

Additional laws at the state level which give individuals the right to have records corrected or amended are the Insurance Information and Privacy Protection Act and the Confidentiality of Medical Information Act.

7. Principle of security. Reasonable physical, technical and administrative safeguards will be taken to protect personal information against the risk of unauthorized access, collection, use, disclosure or disposal.

Why it's needed: The thought of computer hackers often comes to mind when information security is considered. Likewise, the necessity for encryption of data is often seen as the cure-all for information security. Indeed, the use of privacy-enhancing technologies like encryption is essential in order to safeguard data, especially personally identifiable information that is particularly sensitive. But such high-tech applications are not the total solution.

Security goes much further than protecting computer systems from outside intruders. There is also the threat of "insiders" who take advantage of their access to personal information to browse records without proper authorization, and perhaps to provide information to illegitimate users for a price. There are also those within the organization who are unwittingly "conned" into disclosing information.

An all-too-common security breach is disposing of records without shredding them. Investigators of identity theft cases often report that the imposter was able to obtain the information needed to commit the crime by retrieving unshredded credit card transaction slips and loan applications from dumpsters. Indeed, many of the privacy abuses reported to the Privacy Rights Clearinghouse are the result of "low-tech" security breaches.

Information has value to a multitude of users who are not authorized to have access to it. The principle of security is therefore central to any code of privacy protection.

Issue(s)/scenarios addressed: We are reminded daily of the implications of inadequate security of personally identifiable information. Here are a few examples which illustrate both low-tech and high-tech security breaches.

  • A student requested a copy of her file from the Student Aid Commission. When it arrived, she was shocked to find that in addition to her file, the names, addresses, Social Security numbers and phone numbers of 20 other people were in the same envelope. They had been included by accident by a Commission employee.

  • Before departing the office of a singles dating service, a fired employee stole a computer diskette containing the supposedly confidential mailing list of all the service's clients. He sold the list to other dating services in the area.

And these stories of medical records security breaches come from news reports:

  • An employee of an AIDS assistance center downloaded the names of 4,000 HIV-positive patients and mailed the computer diskettes to two Florida newspapers.

  • A psychiatric hospital employee anonymously faxed the medical records of a member of Congress to the New York Post on the eve of her Congressional primary election. She awoke to a front-page story of her attempted suicide. She won the race, despite the story.

A crime that perhaps best symbolizes the nationwide absence of a "culture of confidentiality" is identity theft. This fast-growing crime is largely a result of society-wide security lapses:

  • Many credit grantors do not check the identity of applicants adequately.

  • Credit bureaus are allowed to sell credit "headers," which include the Social Security number, without restriction.

  • Many businesses do not shred documents before putting them in the dumpster.

  • Organizations of all types allow employee access to sensitive information, including employees' and customers' Social Security numbers, without password protection or need-to-know authorization.

  • Even the federal government has been instrumental in this crime by allowing the SSN to be used as health insurance account numbers and as student ID numbers in countless schools, colleges and universities. The SSN is thereby carried on plastic cards in tens of millions of wallets.

Precedent: There are several laws in which security is mandated. But such laws are generally limited in effectiveness because of loopholes and weak sanctions. Clearly, for the security principle to be effective, it must be grounded in policies and procedures with significant sanctions.

The federal Fair Credit Reporting Act and its state equivalent contain clauses which prohibit individuals' credit histories from being obtained by anyone without a "legitimate business purpose." There are penalties, albeit limited, for illegitimate access.

California's Confidentiality of Medical Information Act restricts who can access individuals' health care information, although with some loopholes, and with limited sanctions for improper disclosure.

Outside the realm of laws, security-minded organizations, including government agencies, corporations and nonprofit organizations, have adopted privacy policies which include the requirement that employees who handle personal information sign confidentiality agreements. This practice is expected to grow as organizations realize the high costs of lawsuits, lost productivity and low morale that accompany inadequate security.

8. Principle of accountability. An organization is responsible for personal information under its control and shall designate a person who is accountable for the organization's compliance with the principles. An individual shall be able to challenge compliance with the above principles with the person who is accountable within the organization. A mechanism for oversight and enforcement shall be established to ensure the observance of these principles.

Why it's needed: A common complaint of consumers is their inability to find someone within the organization, whether it's a government agency or a company, who is responsible for the handling of personally identifiable information and who can take action when privacy abuses have occurred. They are further frustrated at not being able to gain redress for the invasion of their privacy. Clearly, there exists an "accountability void" in public and private sectors alike.

Issue(s)/scenarios addressed: Included in Precedent, below.

Precedent: It is difficult to find strong precedents for information accountability in the operation of government agencies and in corporate practices. This is perhaps a result of the absence of an omnibus privacy protection law in this country.

The accountability principle is crucial to the successful implementation of the totality of the privacy principles. But there are few role models for applying this principle. A significant challenge facing those who adopt the privacy principles is crafting an effective mechanism for accountability.

Both the federal Privacy Act and the California Information Practices Act are weak vis-a-vis the principle of accountability. Neither law clearly requires an agency to develop an infrastructure for accountability. The Information Practices Act was significantly weakened in this regard in the early 1990s when the Office of Information Practices was defunded by the Legislature. An additional weakness of the IPA is that it only pertains to state government agencies, and does not extend to the local level where there are significant compilations of personally identifiable information. On the federal level, the U.S. Office of Management and Budget oversees the Privacy Act, but is virtually invisible in this function.

Clear accountability mechanisms for information handling in the private sector are also difficult to find. Consumers who have complaints about how their own information was handled are often referred from department to department. Many give up the exercise of gaining redress for their grievance when they realize their only recourse is to sue the company. But that's hardly an option for most individuals, who are no match for the corporation's "deep pockets" and extensive legal resources.

Perhaps the answer to the "accountability void" lies in nontraditional strategies. Alternative dispute resolution, for example, might well prove to be highly effective in resolving many of the disputes consumers have with companies and government agencies alike. Indeed, the Privacy Commissioners in New Zealand and in the province of Quebec, where there are omnibus privacy laws covering both the private and public sectors, report that the majority of all privacy disputes which they handle are resolved through mediation.

Public disclosure of privacy abuses might also prove effective -- perhaps a web site that individuals can access to find out whether company X or government agency Y engages in practices contrary to these privacy principles or even its own privacy code.

Such nontraditional mechanisms as alternative dispute resolution and an online complaints web site would likely need some type of oversight body. In this decidedly non-regulatory era, this too calls for creative thinking. Is there a place for a "privacy monitor," either within state government or established in the nonprofit sector? Can an entity be created which provides incentives for responsible information practices, such that sanctions for bad practices are required in only the most grievous instances where mediation and public disclosure have failed? Such an entity might have the following functions:

  • Promote alternative dispute resolution.

  • Shine the spotlight on good privacy practices as well as bad, whether in the public, private or nonprofit sectors.

  • Foster extensive consumer education to make individuals more privacy vigilant.

  • Encourage adoption of privacy principles and serve as a stamp of approval.

  • Conduct research and publish reports on policy alternatives, uses of technology, and other issues. Convene public forums to discuss controversial issues.

9. Principle of progress. As information technologies advance, privacy considerations are likely to change. The principles will be reviewed on a regular basis to ensure their adequacy.

Why it's needed: Information technologies are advancing at breakneck speed, spawning applications that are ever-changing. By the same token, the privacy implications of information-based services are also changing. The principles adopted today may not be effective for the technology of tomorrow. Therefore, periodic review of the principles is necessary to ensure their relevance and efficacy.

Issue(s)/scenarios addressed: The need for ongoing review and revision of the privacy principles can be illustrated by the evolution of Caller ID in California. Caller ID was implemented by local telephone companies in 1996, after many years of wrangling over the issue. The service allows the subscriber to view the telephone number of the calling party on a display device next to the phone.

The California Public Utilities Commission required that extensive public education about the privacy implications of this service be conducted by the phone companies and by numerous community-based organizations which were awarded grants. The education campaign focused on the use of blocking mechanisms, available free to consumers. As a result of the education campaign, over half of households chose the strongest form of privacy protection, Complete Blocking.

But Caller ID is not a static service. Now, the phone companies want to add the person's name to the telephone number, so that when the called person looks at the display device next to the phone, both the number and the name of the caller will be shown. This represents a significant enhancement, or privacy intrusion, depending on your point of view. Whether the CPUC addresses this addition to Caller ID with the attention it initially focused on the fledgling service remains to be seen. It is not likely, however, that the same level of care will be taken to educate Californians about the privacy implications of "advanced Caller ID."

Precedent: The principle of review is well-established in public policy proceedings. For example, "sunset review" is a mechanism the Legislature uses to comprehensively review the need for and performance of administrative agencies that it has created.

If privacy principles are adopted through legislation or through informal agreements, a recurring requirement should be established to review the principles vis-a-vis enhanced information technologies and other developments that we cannot foresee at this time. If the principles are established through legislative action, the review process could perhaps be delegated to an entity like the Little Hoover Commission or the Bureau of State Audits.

 

 

 


