Privacy

Merriam-Webster's Dictionary defines privacy as:

1. a: the quality or state of being apart from company or observation: seclusion

b: freedom from unauthorized intrusion: "one's right to privacy"

2. archaic: a place of seclusion

3. a: secrecy

b: a private matter: secret

According to Yael Onn et al. in Privacy in the Digital Environment:

The right to privacy is our right to keep a domain around us, which includes all those things that are part of us, such as our body, home, thoughts, feelings, secrets, and identity. The right to privacy gives us the ability to choose which parts in this domain can be accessed by others, and to control the extent, manner, and timing of the use of those parts we choose to disclose.[1]

Privacy, defined colloquially, seems subjective rather than systematic or governed by objective, pragmatic requirements; privacy is certainly contextual, and its cultural and time-sensitive contexts introduce variability and complexity. What one person feels is the appropriate level of privacy can change with the situation, and one person's sense of the appropriate level for a given situation may differ from another's. Further complicating this, cultural values and social norms vary widely across the world. Finally, the same person's notions and sensitivities may change over time and context: what one may want to share at one point in his or her life may change as life progresses, just as it changes with the environment.

Consider, as an example, the act of wearing a bathing suit. An office worker would probably feel that his or her sense of privacy was being violated if a condition of employment were to wear a bathing suit to work; but this is not so for a swimming pool lifeguard. External social and cultural norms would also be violated in the former instance (contextual). However, even for a lifeguard, the type and cut of bathing suit is a factor in acceptability, social normative value, and sense of well-being (subjective).

The challenge of privacy engineering is to architect and design products, processes, or systems that are sufficiently configurable to allow this sort of control.
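To make "sufficiently configurable" concrete, here is a minimal sketch of what per-context disclosure preferences might look like. The class, method, and attribute names are invented for illustration and are not drawn from the text:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: per-user, per-context disclosure preferences.
# Names and structure are illustrative assumptions, not from the book.

@dataclass
class PrivacyPreferences:
    """What one individual permits to be disclosed, keyed by context."""
    allowed: dict = field(default_factory=dict)

    def permit(self, context: str, attribute: str) -> None:
        self.allowed.setdefault(context, set()).add(attribute)

    def may_disclose(self, context: str, attribute: str) -> bool:
        # Default-deny: nothing is disclosed unless the individual opted it in.
        return attribute in self.allowed.get(context, set())


prefs = PrivacyPreferences()
prefs.permit("pool", "appearance_in_swimwear")
assert prefs.may_disclose("pool", "appearance_in_swimwear")
assert not prefs.may_disclose("office", "appearance_in_swimwear")
```

The design choice worth noting is default-deny: an attribute is disclosable in a context only if the individual has affirmatively permitted it there, which mirrors the bathing-suit example's dependence on context.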

An Operational Definition of Privacy

Data privacy may be defined as the authorized, fair, and legitimate processing of personal information. Much of the activity resulting from this functional definition will appear to focus on the philosophies and policies of organizations and their management, but it must always be remembered that the individual data subject—literally the subject matter of the information (i.e., the individual to whom the data applies)—remains the ultimate requirement-setting entity. To the extent feasible, flexibility built into privacy-engineered solutions will always be critical to properly govern that very human variability. Note, too, that it is not always possible to make everyone happy.

Although this operational definition may seem deceptively simple, breaking it down into its components reveals the beginnings of a pragmatic framework: one that not only defines data privacy but also lets us begin to build it from these foundations (Figure 2-1).

Figure 2-1. What is privacy?

We have already discussed and defined personal information, so now let's turn to what is meant by processing, authorized, and fair and legitimate.

Processing of Personal Information

Data is processed by any action or inaction that can be performed in relation to that data or dataset. Processing personal information includes, but is not limited to, the collection, storage, use, sharing, organization, display, recording, alignment, combination, disclosure by transmission, copying, consultation, erasure, destruction, and alteration of personally identifiable information and any data related to it.
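Because this list of operations is long and easy to handle inconsistently, one plausible engineering move is to enumerate the operations explicitly and record each one as it happens. The sketch below does exactly that; the operations come from the list above, but the audit-record shape is an assumption, not something the text prescribes:

```python
from datetime import datetime, timezone
from enum import Enum, auto

# The operations enumerated in the definition above.
class ProcessingOp(Enum):
    COLLECTION = auto()
    STORAGE = auto()
    USE = auto()
    SHARING = auto()
    ORGANIZATION = auto()
    DISPLAY = auto()
    RECORDING = auto()
    ALIGNMENT = auto()
    COMBINATION = auto()
    DISCLOSURE_BY_TRANSMISSION = auto()
    COPYING = auto()
    CONSULTATION = auto()
    ERASURE = auto()
    DESTRUCTION = auto()
    ALTERATION = auto()

def record_processing(audit_log: list, subject_id: str,
                      op: ProcessingOp, purpose: str) -> None:
    """Append one processing event, preserving an auditable trail for the PI."""
    audit_log.append({
        "subject": subject_id,
        "operation": op.name,
        "purpose": purpose,
        "at": datetime.now(timezone.utc).isoformat(),
    })

log = []
record_processing(log, "subject-123", ProcessingOp.COLLECTION, "reservation")
```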

By Martin Abrams, Executive Director and Chief Strategist for the Information Accountability Foundation

The slogan "Keep Austin Weird" works really well for that swinging Texas city, but the culture in Hebron, Texas, would likely not be associated with "weird," at least not in the same way. Local cultures are reflected in the way people interact with one another, and privacy is one of those areas where culture is reflected.

Privacy scholars such as Alan Westin, who established the basis for modern privacy management, understood that privacy culture is a function of how a society balances the autonomy of the individual against the interests of the group, and then factors in the way a society defines a space reserved for the individual, free from observation by others. Although residents of both Hebron and Austin might have similar views on concepts of space, the balance between individual expression and community cohesiveness would be very different. Understanding cultural diversity and applying it to privacy is difficult enough when deciding what is an appropriate use in Texas; now think about a global program that needs to work in Germany, Japan, weird Austin, and stern Hebron. How does an engineer begin building application requirements that fit the cultural context of diverse populations?

Let's use an example. Millions of smartphones are sold each year in places as diverse as Galesburg, Illinois; Bangalore, India; and Frankfurt, Germany. Each smartphone has a unique signature, just as each of us has distinct fingerprints. All smartphones are designed to run on Wi-Fi networks. This design factor saves consumers money on their monthly mobile bills. It is no surprise that most consumers want to save money, so they set their phones to look for available Wi-Fi networks.

An innovative engineer quickly figured out that one can track a device through a physical space, such as a store, by equipping the space with Wi-Fi. Furthermore, the engineer can see how much time the individual spends within a physical quadrant and can then link that information to the activities that take place in that quadrant. If it is a store, the activity is most likely shopping. For example, if the mobile device is in a home improvement store, the engineer now knows how long the device spends in the paint department and when it moves from paint to window treatments. Maybe he or she can even link the shopping activity to the items purchased and track what the device buys over time. Of course, it is not the device that buys the item; it is the individual holding the device, and while the device might not have a cultural perspective, the individual does. It really doesn't make any difference whether we know the name of the individual: the actions we take based on tracking the device are particular to that individual. So the privacy question becomes: is it appropriate to take actions based on the predicted behavior of the individual holding the device?

The answer is: it depends.

In the United States we have many conflicting values. First and foremost, we believe that we are free to observe what we are free to see and hear within the public commons. In the physical world, we, as a society, have defined the public commons: pretty much, it is anything outside one's home. It is the public street, the shopping mall, the front yard, and the courtyard, if one is flying over in an airplane. Furthermore, we are free to express ourselves based on how we process what we have observed. Making a sales offer is a form of expression. This value is captured by the First Amendment to the US Constitution.

The American people also cherish seclusion. That means, in our private space, we are free to do what we will do and think what we will think without fear of others observing and using what they hear and see. Our home is our castle, and it is not part of the public commons. You may watch me in my front yard, but you may not look in my window and invade my seclusion.

In the United States, the Wi-Fi-enabled store is the public commons. The observation of a device in a public space is probably okay, even if some might consider it obnoxious. Furthermore, we are free to think about what we have learned and apply that knowledge for practical ends such as increasing sales.

The preeminent nature of observation based on free expression doesn't enjoy the same deference in other cultures. In those cultures, the sense is that privacy as a fundamental right trumps the recording of what we observe and the use of that information. This is particularly so for most other Western cultures. In Germany or France, the collection of the device signature, if it is easily linkable to an identifiable individual, is probably subject to data protection law. Such a collection would be a processing of personal information that requires permission from either the individual or the law. Furthermore, any additional processing of that information, even storage, would also require permission from the law or the individual. We are talking about the same activity in different locations and two different takes on whether the use is appropriate.

US culture puts a premium on free observation in the public commons, while societies with traditional data protection regimes have no such deference for free observation. So, if an engineering team were to develop an observation model for a client that depends on observing devices in a physical space, the application would probably work in US stores but would violate both societal norms and laws in stores in Western Europe. The analysis might be entirely different in Asia, where rights to seclusion are limited but where such observation might be seen as violating norms necessary for a crowded society where physical space is limited.

The laws are different because the cultures are different.

These differences in privacy culture have shaped digital public policy for more than 30 years. Justice Michael Kirby, formerly of the High Court of Australia, led the experts who developed the Organisation for Economic Co-operation and Development (OECD) Privacy Guidelines between 1978 and 1980. He said the most difficult issue he had to overcome in leading that group was the huge deference Americans give to free expression. Even though these differences are understood, we tend to default to what feels comfortable to each of us. Business concepts based on monetizing the fruits of observation have been developed in the United States, but when the same applications are applied outside the United States, we tend to see friction.

Ninety percent of the privacy issues that concern both individuals and regulators are the same no matter where the activity takes place. These include ensuring security, accommodating transparency, and not facilitating illegal behavior by others. If one deals with these issues, one can have a fairly high level of certainty that an application is okay. Moving beyond what is the same, one can anticipate key cultural markers. One such marker is the age at which an individual reaches maturity, which influences the consent children and adults are able to grant.

Lastly, one needs to be truly sensitive to cultural differences related to observation.

You know when a technology tracks behavior, so tracking is an indicator that a cultural review is necessary when a technology is taken from one geographic market to another. Such applications probably require a privacy impact assessment (discussed in Chapter 10) with experts who understand the cultural frame. Finally, there are cultural aspects to automated decision making. If applications make decisions without human intervention that affect an individual's ability to gain employment, get credit or insurance, or travel, one should check cultural norms related to such decision making.

Just be sensitive to the fact that what is appropriate where you are may not be appropriate somewhere else; if you keep this in mind, you should be successful in your data-use initiatives.

Authorized

Authorized processing of personal information happens only where the person or organization processing it has the appropriate privilege to do so. Additionally, there is a chain of custody and a sense of fiduciary responsibility that must follow the PI throughout the lifecycle of its processing. For example, anyone who can access a system containing PI must be authenticated as the person he or she claims to be and must also be acting within a role that allows him or her to process the data within that system.
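The paragraph describes two distinct checks: authentication (is this actually the claimed person?) and role-based authorization (may this role perform this kind of processing?). A minimal sketch of the pair, with invented role names and a deliberately default-deny policy; none of these identifiers come from the text:

```python
from dataclasses import dataclass

# Hypothetical sketch of the two checks described above: authenticate
# the actor, then confirm the actor's role permits this processing.

@dataclass
class Actor:
    name: str
    role: str
    is_authenticated: bool  # identity already verified (e.g., via SSO)

# Which processing operations each role may perform; anything absent is denied.
ROLE_PERMISSIONS = {
    "support_agent": {"USE"},
    "data_steward": {"USE", "ALTERATION", "ERASURE"},
}

def is_authorized(actor: Actor, operation: str) -> bool:
    """Both checks must pass: a verified identity and a permitting role."""
    if not actor.is_authenticated:
        return False
    return operation in ROLE_PERMISSIONS.get(actor.role, set())

assert is_authorized(Actor("ana", "data_steward", True), "ERASURE")
assert not is_authorized(Actor("bob", "support_agent", True), "ERASURE")
```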

The type of data, the nature of the processing, as well as local laws and regulations will determine the nature and level of permission that may be required. The four primary protocols for permission gathering are:

• Opt out/Opt in

• Implied consent

• Informed consent

• Express consent

Opt out allows processing of PI unless or until an individual rejects data processing in the context at hand. Opt in (the logical twin of opt out) allows no processing unless and until permission is granted. These concepts are relatively new under the law, particularly in common law jurisdictions, as discussed below.

Context, narrowness of purpose, and transparency practices can make opt out or opt in relatively effective mechanisms.

Implied consent is a relatively straightforward concept where the context of collection and other processing is deemed so routine, obvious, and expected that permission for processing within this context may be implied by an individual's very participation in the contextual relationship. An example of implied consent is when PI is used for necessary processes (business or otherwise). When you give your name and telephone number for a reservation, permission to use them to hold your table and for the maître d' to call you is implied, because that use is necessary and within the scope of the function for which the information was given. However, if the maître d' chose to send text messages to the reservation number to solicit donations to his favorite charity, he would be violating the implied consent to use contact information.

Informed consent relates to a very well-established and understood area of contract and tort law in which a data subject has all relevant and timely facts needed to make a reasonable choice of whether, how, how much, and for what purpose data will be processed. A good example of informed consent in a nondata context is the difference between accepting the risks of skiing and consenting to medical treatment from a trained doctor. In the former case, an individual is physically aware of his condition, standing on a snowy mountain on two small skis. There may still be unexpected risks, so a disclaimer may be printed on his ticket, but that disclaimer may be in small type and carry no individualized explanation. In the latter case, the doctor and patient have very different levels of expertise, the procedures and risks may be unfathomable to the reasonable layperson, and the side effects may be unknowable without specific clarification. The type and depth of disclaimer and exposition of risks and rewards are much different and far more extensive in this case.

Informed consent requires some responsibility and action on the part of the data subject and so may never become universally accepted as the standard for gaining or maintaining authorization, but its longevity in other fields of risk management and conflict resolution and the various aspects that allow breaking informed consent into measurable components make this form of consent particularly attractive to the budding privacy engineer.

Express consent is simply where a person takes a specific observable action to indicate and confirm permission for his or her information to be processed. An example is checking a box that says "yes" on an online form.

So that it does not go unrecognized, express consent and informed consent are both subspecies of the opt in.

The strength and validity of any of these permission forms and types depend on the clarity, conspicuousness, and proximity of the notice to the data processing it is intended to govern. For the permission to be valid, it must be clear that the user knew, when permission was granted, what was being accepted. Similarly, permission must be freely given and not under duress for the data processing to be authorized to the appropriate degree.
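To make the taxonomy and these validity conditions concrete, here is a hedged sketch that models the permission forms (splitting opt out and opt in into separate entries) together with the clarity and duress conditions just described. All names are illustrative assumptions:

```python
from dataclasses import dataclass
from enum import Enum

# Illustrative model of the permission forms discussed above; the
# validity fields mirror the stated conditions (clarity, no duress).

class ConsentForm(Enum):
    OPT_OUT = "opt out"    # allowed unless/until the individual rejects processing
    OPT_IN = "opt in"      # nothing allowed unless/until permission is granted
    IMPLIED = "implied"    # inferred from a routine, obvious, expected context
    INFORMED = "informed"  # subject had all relevant and timely facts
    EXPRESS = "express"    # a specific observable action, e.g., a checked box

@dataclass
class ConsentRecord:
    form: ConsentForm
    purpose: str              # the processing this permission covers
    clearly_presented: bool   # clear, conspicuous, proximate to the processing
    freely_given: bool        # granted without duress
    withdrawn: bool = False

def permits(record: ConsentRecord, purpose: str) -> bool:
    """A permission is only as strong as its clarity and voluntariness."""
    return (record.purpose == purpose
            and record.clearly_presented
            and record.freely_given
            and not record.withdrawn)

c = ConsentRecord(ConsentForm.EXPRESS, "newsletter",
                  clearly_presented=True, freely_given=True)
assert permits(c, "newsletter")
assert not permits(c, "profiling")  # the consent never covered this purpose
```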

The other key ingredient is that all these different forms of permission must be presented before personal information is collected and before it is processed. For example, there has been much debate about the ability of web site operators to use cookies on the first page of a web site where notice is presented about the possibility of data collection through electronic means. In fact, the difficulty of ensuring that data subjects know and understand the potential and actuality of data privacy in a clear, conspicuous, and proximate fashion is one of the many reasons that those processing the data, governing bodies, and users are skeptical that a governance and enforcement regime focused on "Notice and Consent" is effective in today's data-enriched environment.
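The ordering rule in the previous paragraph, permission first and collection second, can be enforced mechanically. A self-contained sketch under that assumption, with hypothetical names throughout:

```python
# Hypothetical gate enforcing the ordering rule above: a valid permission
# must already exist before any personal information is collected.

consents = {}  # (subject_id, purpose) -> True once a valid permission is recorded

def collect(subject_id: str, attribute: str, purpose: str) -> dict:
    if not consents.get((subject_id, purpose), False):
        raise PermissionError(f"no prior consent for {purpose!r}; refusing to collect")
    # Only now may collection proceed (and it should itself be logged).
    return {"subject": subject_id, "attribute": attribute, "purpose": purpose}

consents[("subject-123", "reservation")] = True
collect("subject-123", "phone_number", "reservation")   # permitted
# collect("subject-123", "phone_number", "marketing")   # would raise PermissionError
```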

Permission is only one component of ensuring that PI is processed with authorization. In addition to ensuring that one has permission to use the data, one also has to manage and prevent unauthorized use of, or access to, the data. This requires controls and measures to ensure PI and related data are processed in an authorized and legitimate manner. These controls and measures can take the form of administrative, logical, technical, or physical routines, or a combination of all of these, as discussed later in this chapter and in Chapter 6.

By Eduardo Ustaran, Data Protection Lawyer and Author of The Future of Privacy

Is individual choice still the essence of data privacy law? In the early days of data protection as a regulated activity, putting people in control of their information was thought to be what mattered the most. From the 1980 OECD Guidelines to the latest version of the EU e-Privacy Directive, consent has been a cornerstone across legal regimes and jurisdictions. European data protection law is based on the principle that an individual's consent is the most legitimate of all legitimate grounds to use information about people. But does this approach still hold true? Can we—as individuals—attempt to have a meaningful degree of control over the vast amount of information we generate as we go about our lives?

Information about who we are, what we do, what we are like, and how we behave is captured every single second of the day. From the moment we turn on the light (or the BlackBerry or our smartphone) in the morning to the moment we turn it off in the evening, every action that involves using technology is recorded somewhere.

The internet has maximized this in such an unprecedented way that the value of the information we generate by simply using it makes other more traditional identifying factors look trivial. From a legal perspective, this phenomenon has entirely distorted the meaning and scope of personal data, but the point is that information about us is constantly flowing around the world without our knowledge, let alone our consent.

Let's face it: attempting to put people in control of their own information by giving them the power to consent to the uses made by others is simply unachievable.

The concept of consent should not be underestimated. The ability to make choices is what makes people free. However, pretending that we can take a view in any meaningful way as to how information about us is gathered, shared, and used by others is wishful thinking. We cannot even attempt to recognize what personal information is being made available by us in our daily comings and goings, so how could we possibly decide whether or not to consent to every possible use of that information? Consent might have been a valid mechanism to control data handling activities in the past, but not anymore.

So what now? Is data privacy dead? I hope not. But in the same way that our ability to control our own information is moving away from us, our responsibility to decide what others can know about us is also receding. Our privacy is less than ever in our own hands, because the decision-making power is not really ours. Any legal regime that puts the onus on individuals (who are meant to be protected by that regime) is bound to be wrong. The onus should not be on us to decide whether a cookie may reside in our computer when hardly anyone in the real world knows what a cookie does. What the law should really do is put the onus on those who want to exploit our information by assigning different conditions to different degrees of usage, leaving consent to the very few situations where it can be truly meaningful.

The law should regulate data users, not data subjects. Like it or not, individuals have a limited role in the data-handling decision-making process. That is a fact, and regulation should face up to that fact. Technology is more and more complex, while our human ability to decide remains static. Feeding us with more detailed and complex privacy policies does not change that. In the crucial task of protecting our personal information and our privacy, consent can only have a residual role.

Continuing to give consent a central role in the protection of our privacy is not only unrealistic, but also dangerous because it becomes an unhelpful distraction for individuals, organizations, and regulators. The emphasis must simply be put elsewhere.

Fair and Legitimate

Of all the concepts that underpin the notion of data privacy, the ability to provide information handling that is “fair and legitimate” is probably the most complex and difficult to reduce to a scientific rule or even an approximate measurable metric.

The concept of fair and legitimate processing is not limited to the organizational view of fair as "necessary" (or, more often, "desired") processing. However, a series of principles called the Fair Information Practice Principles (FIPPs), as embraced by the OECD in the OECD Privacy Guidelines, is a useful prism through which to look at the notion of fairness and legitimacy.

  • [1] Yael Onn et al., Privacy in the Digital Environment. Haifa Center of Law & Technology, 2005.
 