Eight Reasons to be Skeptical of a "Technology Fix" for Protecting Privacy




Copyright © 2000-2014
Privacy Rights Clearinghouse
Posted October 14, 2000

[in the spirit of Jerry Mander's Four Arguments for the Elimination of Television]

 
Presentation by Beth Givens
Computer Professionals for Social Responsibility
Annual Conference
University of Pennsylvania, Philadelphia

 

Thank you for the opportunity to speak today. It's a pleasure to participate in your annual conference.

The Public Opinion Landscape

The Privacy Rights Clearinghouse is a nonprofit consumer education and advocacy organization. In addition to publishing many practical guides on ways to safeguard privacy, we also operate a hotline -- and now an e-mail inquiry service -- to answer consumers' questions and take their complaints. Since we began operating in 1992, we have talked with tens of thousands of individuals about a wide variety of issues. During this time, we have seen the privacy issue explode.

  • A recent Lou Harris poll found that nearly 90% of individuals are concerned about threats to their privacy.
  • 80% think their personal information is out of control.

Concerns about threats to privacy when using the Internet are especially high.

  • In another Lou Harris poll, 70% of those who said they were not Internet users said they would be if they could be assured that their privacy would not be threatened.
  • A recent Pew poll found that 86% want Internet companies to be required to ask their permission before collecting their personal information. That's called an "opt-in policy," and is contrary to the opt-out policy favored by the Clinton Administration and by industry. www.pewinternet.org

The Pew poll also found that a majority of Net users (54%) believe that web tracking is harmful because it invades their privacy.

Interestingly, this poll found that 94% want Internet privacy violators to be punished, with penalties ranging from sending the company president to jail to fining the owners, shutting down the site, or listing the errant site on a register of fraudulent web sites.

An Overview of Privacy Technologies

The view from inside the beltway is considerably different from these public opinion poll findings. Three weeks ago I attended the Privacy Technology Fair in the U.S. Capitol (Sept. 20, 2000), where a number of companies were showing how their products could protect consumer privacy. The message was clear: Let the marketplace provide. Let these resourceful companies give us any number of technology fixes, and we won't have to rely on the heavy, burdensome hand of the government to protect citizens' privacy. Industry self-regulation and technology are the answer.

I would like to lend a skeptic's voice to that message. That's why I've titled my presentation "Eight Reasons to be Skeptical of a 'Technology Fix' for Protecting Privacy" ... in the spirit of Jerry Mander's classic book Four Arguments for the Elimination of Television. (Wm. Morrow & Co., 1978)

I'm going to begin by listing just a few of the many technology-based services that are available now or are soon to be launched. I will then list eight reasons why I am not convinced that these services are the entire answer to safeguarding consumer privacy.

Then, to close, I will discuss approaches that I believe are more likely to provide meaningful protection of our personal privacy.

I want to say up front that as a skeptical voice, I am not stating that these services and technologies should not be developed or that they are somehow devious. What I will stress now and in the conclusion is that they should not be seen as the only answer to the critical issue of privacy protection.

Let me return to the Privacy Technology Fair in the Capitol on September 20, 2000:

When we entered the room in the Capitol where the technology demonstrations were held, we were handed a poster entitled "Privacy Technology in the Digital Age." The poster lists 27 companies offering products that safeguard privacy. Not all 27 companies were represented, but many were. We were also given a booklet produced by the Senate Judiciary Committee entitled "Know the Rules -- Use the Tools. A Resource for Internet Users."

Both of these documents list many, although not all, of the privacy-enhancing services on the market today, or that are soon to be launched. I'll provide web addresses of where you can get these lists in a moment.

  • The largest category of privacy products is the cookie managers, with products like Cookie Crusher and Cookie Terminator.

  • Another category is products that enable users to surf anonymously, like the Anonymizer and the Zero Knowledge product Freedom. AT&T has a product called Crowds.

  • There are products that are able to serve you with personalized advertisements based on your profile. Persona is one such product. Privada is another. These services include other identity management tools as well, and in general promise to keep your identity separate from your profile. AllAdvantage is a surf-for-money service that gathers profile data while you surf. These products have varying degrees of privacy protection. It's beyond the scope of my presentation to go into all the differences.

  • Another category of products enables the user to purchase items anonymously, or to use temporary credit card numbers so as not to place their credit account at risk for identity theft. Such products are offered by American Express, IPrivacy and Incogno SafeZone.

  • There are products for securing electronic mail, like Disappearing Inc., PGP encryption, and Hushmail.

  • Another category of products enables you to take advantage of the soon-to-be-launched P3P -- or Platform for Privacy Preferences. You will be able to set your own privacy preferences on your browser. Then when you visit sites that have translated their privacy policies into the preference categories, your computer can determine to what extent your wishes are going to be honored at those sites. Microsoft and AT&T are teaming up to offer a P3P-enabler. Other preference managers are IDcide and YOUpowered Orby.

This is not a complete list by any stretch of the imagination -- with my apologies to those whose products I've not included or have glossed over too quickly. And the categories are somewhat fluid, as many products do several things. Here are some websites where you can find more complete lists:

  • Senate Judiciary Committee's "Know the Rules: Use the Tools" at http://permanent.access.gpo.gov/lps16318/privacy.pdf

  • Privacy Leadership Initiative, www.understandingprivacy.com. (This website is currently under construction.) This is the industry coalition that produced the poster I mentioned earlier, distributed to participants in the Technology Fair and the U.S. Department of Commerce’s conference held the previous day. (U.S. Dept. of Commerce’s Online Privacy Technologies Workshop and Technology Fair, Sept. 19, 2000)

  • "Online Privacy Technologies," an information-packed PowerPoint presentation by Dr. Lorrie Faith Cranor of AT&T Labs-Research, given at the September 19th Dept. of Commerce Workshop in Washington, D.C. www.research.att.com/~lorrie

A Skeptic’s Concerns

Now ... the eight reasons to be skeptical of a "technology fix" for protecting privacy:

1. The first reason is consumer confusion. In a self-regulatory environment, which as I said earlier is the direction that has been taken by the Clinton Administration, the burden is on consumers to protect their own privacy. I've spent the last nine years talking with consumers who have privacy problems. Trust me. Most consumers are going to find these services to be a significant challenge.

2. Many of these services -- loosely called "infomediaries" -- enable customized ads and offers to be delivered to web surfers. The ads are served based on their interests. Profiles are obtained when individuals fill out survey forms when they become users of the infomediary service. In addition, profiles are compiled on the fly, as the individual surfs the Net. My skepticism lies in the fact that these services are legitimizing the collection of data -- and legitimizing it in an environment of minimal legal protection, a topic that I will revisit in the conclusion.

3. Related to reason two is what I call the dilemma of "secondary use." If personally identifiable data is collected, it will be used for other purposes. You can count on it, especially in an environment of weak legal protection which we have with the self-regulatory mode here in the U.S. Let me give you two examples of secondary use, both from the "offline" world.

Example one: Smith's Foods is a grocery store chain headquartered in Utah, with supermarkets located in the Southwest states. It has a discount club card program that is used by a high percentage of consumers who shop there. Smith's was subpoenaed by the U.S. Drug Enforcement Administration (DEA) for data from its club card data base on specific individuals they were investigating. Were they looking for high-volume purchases of certain over-the-counter medications that are used in the manufacture of "meth"? No, they were seeking evidence that these individuals were purchasing large quantities of plastic "baggies." Now, how many people can you think of who have reason to buy lots of plastic bags? Girl Scout troop leaders wrapping sandwiches for their troop outings, for example. ("Bargains at a price: shoppers' privacy. Cards let supermarkets collect data," by Robert O'Harrow, Washington Post, Dec. 31, 1998, p. A01)

Subpoenaing data is a secondary use that the company has little control over, especially if it has a court-ordered warrant from law enforcement for a criminal matter. Civil subpoenas are somewhat different in that the company can alert the data subject to the subpoena and give him or her the ability to hire an attorney to fight the subpoena. I think it's only a matter of time before divorce attorneys view these vast storehouses of customer profiles as ripe for the plucking to support, say, their client's case that the spouse is a bad parent and shouldn't have custody of the children.

By the way, if you use an infomediary that claims to separate your identity from your profile -- claiming that the twain shall never meet -- do not rest in confidence. Computer scientist Latanya Sweeney of Carnegie Mellon conducted research in which she determined that 87% of the American populace can be uniquely identified by only three pieces of data -- date of birth, gender and the five-digit ZIP code. ("It doesn’t take much to make you stand out," Newsweek, Oct. 16, 2000)
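The intuition behind Dr. Sweeney's finding can be illustrated with a toy sketch. This is not her actual method or data -- the records below are fabricated -- but it shows how "quasi-identifiers" such as birth date, gender, and ZIP code can single people out: count how many records share each combination, and anyone whose combination is unique can be re-identified even if their name was stripped from the file.

```python
# Toy illustration of re-identification via quasi-identifiers.
# Records are fabricated; the real finding (87% of Americans unique
# on date of birth + gender + 5-digit ZIP) comes from Sweeney's research.
from collections import Counter

records = [
    # (date_of_birth, gender, zip_code)
    ("1960-03-14", "F", "92103"),
    ("1960-03-14", "F", "92103"),   # two people share this combination
    ("1972-11-02", "M", "92103"),
    ("1985-07-30", "F", "19104"),
    ("1990-01-21", "M", "19104"),
]

# Count how many records share each (dob, gender, zip) combination.
counts = Counter(records)

# Anyone whose combination appears exactly once stands out uniquely.
unique = sum(1 for r in records if counts[r] == 1)
print(f"{unique}/{len(records)} records uniquely identified "
      f"({100 * unique / len(records):.0f}%)")
# → 3/5 records uniquely identified (60%)
```

In a nationwide data set with millions of rows, the same counting logic yields Sweeney's striking result: most combinations occur exactly once, so stripping names alone provides very little anonymity.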

Let me give one more example of secondary use: Last week the Wall Street Journal hosted a Technology Summit. One of the participants, Rochelle Lazarus, CEO of the large advertising company Ogilvy & Mather, said: "If you wanted to find fat people to market Weight Watchers to, you asked the local DMV for a list of heights and weights." That's a classic example of secondary use. ("Marketers, privacy forces are no closer to consensus," by Jason Anders, Wall Street Journal, Oct. 4, 2000)

4. Many of these services give their users a false sense of security and the illusion of control. Let's look a little closer at that illusion.

Privacy policies can change overnight. DoubleClick is one such example. Amazon.com is another.

The company can go bankrupt. And its data base can then be sold as a company asset. We've recently witnessed this situation with Toysmart.com.

Companies can merge with other companies, thereby enabling the now-affiliated entities to share and merge their respective customer files. For example, what is going to happen now that banks can affiliate with insurance companies and brokerage firms under one corporate roof through the implementation of the 1999 Financial Services Modernization Act? Or if AOL is allowed to merge with Time-Warner, imagine the rich data profiles that can be shared and combined from their customer files comprising millions of Americans.

In the worst case scenario, we could experience a breakdown of the social structure because of economic collapse or political instability, or both. In a situation where civil liberties are suspended, no company is going to be able to assure its customers that its data bases containing comprehensive profiles won't be used to monitor dissidents and maintain social control.

5. The Federal Trade Commission, which is the federal agency that comes closest to being a U.S. privacy watchdog and enforcement agency, does not have the resources, or even the mission, to track down the full range of privacy abusers and penalize them. As you no doubt know, the U.S. does not have a Privacy Office akin to the European countries' data protection commissions, Canada's Privacy Commissioners, the similar agencies in Australia and New Zealand, and many other countries. [See David Banisar, Privacy and Human Rights: An International Survey of Privacy Laws and Developments (EPIC and Privacy International, 2000), www.privacyinternational.org/survey/index.html]

6. The biases of the company that produces the product are embedded in the technology. The writings of Phil Agre and other academicians make this point in much more depth than I have time to do in this presentation. [Philip E. Agre and Marc Rotenberg, eds. Technology and Privacy: The New Landscape (MIT Press, 1997)]

Let me illustrate this point by posing a question: When P3P is operational and bundled into browsers, will the preference settings be pre-loaded to favor the collection of user data (opt-out), or will they be set at the maximum privacy protection level (opt-in)? I think we know the answer to that question.
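Why do the default settings matter so much? At its core, a P3P-style preference manager just compares what the user will permit against what the site declares it does. The sketch below is a hypothetical simplification -- the category and purpose names are invented, and the real P3P specification defines its own XML vocabulary -- but it shows the mechanism, and why whoever pre-loads `user_prefs` effectively decides the outcome for the vast majority of users who never change the defaults.

```python
# Hypothetical sketch of P3P-style preference matching.
# Category/purpose names are illustrative, not from the P3P spec.

# The user's preferences: for each data category, the purposes
# they are willing to permit. A privacy-protective (opt-in) default
# would start these sets empty; an industry-friendly (opt-out)
# default would start them full.
user_prefs = {
    "contact_info": {"order_fulfillment"},
    "browsing_history": set(),          # never share browsing history
    "purchase_history": {"order_fulfillment", "personalization"},
}

# A site's declared policy: the purposes it claims for each category.
site_policy = {
    "contact_info": {"order_fulfillment", "marketing"},
    "purchase_history": {"order_fulfillment"},
}

def policy_conflicts(prefs, policy):
    """Return (category, purpose) pairs the site claims
    but the user has not permitted."""
    conflicts = []
    for category, purposes in policy.items():
        allowed = prefs.get(category, set())
        for purpose in sorted(purposes - allowed):
            conflicts.append((category, purpose))
    return conflicts

print(policy_conflicts(user_prefs, site_policy))
# → [('contact_info', 'marketing')]
```

A browser could surface such conflicts as a warning before accepting the site's cookies. Notice that the code is neutral; the privacy outcome is determined entirely by the data placed in `user_prefs` before the user ever sees it.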

7. Relying on technology fixes to protect our privacy takes us another step farther away from the belief that privacy is a basic human right that should be protected by law. All the privacy-enhancing services in the world are virtually useless without the protection grounded in law. I don't mean the sectoral approach that we have here in the U.S., with a patchwork of laws covering just a few industries -- like credit reporting, video rental records, telemarketing, and cable TV. But, rather, I’m referring to an omnibus approach to protecting personal data, taken by virtually all other industrialized nations.

8. These technologies are not likely to affect companies' offline practices. Do you remember when AOL decided that it would use its customer information to telemarket to them? Online privacy policies are not likely to cover such offline practices.

Recommendations for Meaningful Privacy Protection

What is the answer, if technology fixes are not? I will mention four ways to ensure more meaningful protection of privacy.

1. The first is the codification of the Fair Information Principles into law, to create a firm legal foundation for privacy protection. These principles are notice, consent, access, security, collection limitation, purpose specification, use limitation, compliance, and accountability. There are several variations on this theme, but the Fair Information Principles usually encompass these provisions. (See our paper, "A Review of the Fair Information Principles" at www.privacyrights.org/ar/fairinfo.htm.)

2. The second recommendation is that privacy impact assessments be conducted before services and technologies are designed, cemented into place, and launched.

Would Intel have developed the Pentium III chip with its embedded unique identification code had it fully explored the implications of putting such a technology on the market?

In 1997 would the Social Security Administration have launched the online version of its Personal Earnings and Benefits Estimate Statement in the format that it did, had it first conducted an impact assessment? As it was, the SSA removed the PEBES service from its website within days of introducing it because of the controversy that it generated. It then conducted workshops around the country inviting the input of scientists, consumer advocates, government officials, and citizens before launching a much more secure service later on.

3. The third recommendation is to develop technologies and services using a design process called "value-sensitive design," a term that I believe was coined by Professor Batya Friedman of the University of Washington. There are a number of proponents of this process, including Helen Nissenbaum of Princeton. You can get more information about the proponents and the concept from the Value-Sensitive Design website at www.ischool.washington.edu/vsd.

I do not profess expertise in this field of research -- only the realization that this is an answer to the problems that we have all observed in this awkward and at times destructive adolescence of the Internet, where we seem to be experiencing privacy crises on a weekly basis.

Here is how Dr. Friedman describes Value-Sensitive Design:

"Value-Sensitive Design refers to an approach to the design of technology that accounts for human values in a principled and comprehensive manner throughout the design process. It is primarily concerned with values that center on human well-being, human dignity, justice, welfare, and human rights.

Specific values include trust, accountability, freedom from bias, access, autonomy, privacy, and consent. VSD connects the ... [designers] with the people who think about and understand the values of the stakeholders who are affected by the systems. ... VSD requires that we broaden the goals and criteria for judging the quality of technological systems to include those that advance human values." www.ischool.washington.edu/vsd

Dr. Friedman uses the example of computerized medical systems that have been designed with such direct stakeholders as the insurance companies and hospitals in mind, but that have not taken into consideration the values -- especially the privacy values -- of the indirect stakeholders, the patients.

There was a news clip earlier this week that illustrates this topic well. In a House Subcommittee hearing, the CEO of MoneyForMail.com, Larry Chiang, warned that the way web-based businesses are designed and launched is a privacy scandal waiting to happen. He said many Internet companies are headed by young CEOs who are running cash-strapped companies, and are often oblivious to privacy concerns. At the same time, these start-ups control consumers' personal information, worth a great deal of money. Obviously Value-Sensitive Design is not a factor in these situations. (Internet Wire: "Internet is a 'privacy scandal waiting to happen' says MoneyForMail.com CEO Larry Chiang before Congress today," Oct. 10, 2000, biz.yahoo.com)

4. My fourth and final recommendation is that such values be introduced in the education system in the early grades, following students through their schooling into college-level business schools, computer science programs, communications courses, and other disciplines. Children must be taught critical thinking and analytic skills, a difficult task considering the commercializing influences in schools today -- Whittle's Channel One, for example, and ZapMe!, an online service aimed at young people in the classroom. (According to recent news stories, ZapMe has since discontinued its school-based program.)

Conclusion

With that I close with some of my favorite bumper stickers.

You know the oft-seen bumper sticker on old Volkswagen vans -- "Question Authority" -- from the 1970s.

And of course you know your own organization’s motto, "Question Technology."

I'll leave you with another -- "Question Design."

Thank you for your attention.

 

 


