
Designing a Privacy Policy

Some organizations begin taking action to mitigate business risks before an official Privacy Policy is published, but defining the policy should be a high priority. Sadly, many enterprises copy policies they find on other companies' web sites and post what amounts to an ad hoc policy of their own before any due diligence has been exercised with regard to the requirements of their personnel, processes, or technology. It is a sad fact, but a vast majority of enterprises own what we call “complianceware”—stuff that they purchase, license, or otherwise “acquire” just in case there is a data breach or a regulatory inquiry at a later date but that they never actually completely deploy.

An example of this is an enterprise that purchases an identity management suite of products and sets the roles to “employee” or “nonemployee” without regard to a good policy that would illuminate why individuals require access to process data or how the roles, and the employees themselves, should be protected and governed. A good privacy policy should be closely linked to this type of deployment, setting its requirements before deployment or, better yet, before purchase (or before development, if the identity solution is homegrown).
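To make this concrete, the following minimal sketch contrasts the coarse roles just described with roles driven by a policy statement about purpose. The role names, data categories, and purposes are illustrative assumptions, not taken from any particular product:

```python
# A sketch of policy-driven roles: access is defined by why a person needs a
# category of personal data, not merely by whether they are an "employee".
# All role names, data categories, and purposes are hypothetical.

from dataclasses import dataclass


@dataclass(frozen=True)
class Role:
    name: str
    permitted: dict  # data category -> the purpose for which it may be processed


ROLES = {
    "support_agent": Role("support_agent", {"customer_contact": "case_handling"}),
    "payroll_clerk": Role("payroll_clerk", {"employee_salary": "payroll"}),
    # The coarse role criticized above would look like this: access with no
    # stated purpose, which is exactly what a good policy should rule out.
    # "employee": Role("employee", {"*": "*"}),
}


def may_access(role_name: str, data_category: str, purpose: str) -> bool:
    """Allow access only when the role's policy names both the data category
    and the purpose for which it is being processed."""
    role = ROLES.get(role_name)
    return role is not None and role.permitted.get(data_category) == purpose


# A support agent may read customer contact data to handle a case,
# but not to run a marketing campaign.
assert may_access("support_agent", "customer_contact", "case_handling")
assert not may_access("support_agent", "customer_contact", "marketing")
```

The point of the sketch is simply that the access check can only be written once the policy has answered the “why” question; reversing the order leaves the deployment with roles that say nothing about purpose.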

The next section describes the key considerations for crafting an effective privacy policy as well as how to maintain it.

What Should Be Included in a Privacy Policy?

Policies must be designed to meet a complex set of competing needs:

• Local and international legal, jurisdictional, and regulatory necessities, depending on the scope of the enterprise

• Organization or business requirements

• Permission for marketing, customer relationship management, or business intelligence

• Brand identity

• Industry standards

• Usability, access, and availability for end users of information systems

• Economic pressure to create value through efficient sharing or relationship building

• Enforceability and compliance

• Ethical obligations

• Realistic technology capabilities and limitations

Everything with a digital heartbeat is connected through dynamically formed relationships governed by privacy, security, and trust policies. This means there may be multiple interactive or cascading privacy policies based on the role of the various parties of interest:

• Customers

• Employees or contractors

• Third parties impacted by the enterprise

• Intellectual property owners

• Data types
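The cascading effect can be pictured as a lookup from party role and data type to the set of policies that govern the relationship. The following is a minimal sketch; the policy names, roles, and data types are hypothetical placeholders rather than a prescribed taxonomy:

```python
# A sketch of cascading policies: the policies that apply are resolved from
# the party's role and the type of data involved. All names are hypothetical.

APPLICABLE_POLICIES = {
    # (party role, data type) -> policies governing that relationship
    ("customer", "contact_details"): ["customer_privacy_policy",
                                      "marketing_permissions_policy"],
    ("employee", "payroll_record"): ["employee_privacy_policy",
                                     "records_retention_policy"],
    ("third_party", "shared_analytics"): ["customer_privacy_policy",
                                          "data_sharing_agreement_policy"],
}


def policies_for(party_role: str, data_type: str) -> list:
    """Return every policy governing this party/data combination, falling back
    to the enterprise baseline when nothing more specific is defined."""
    return APPLICABLE_POLICIES.get((party_role, data_type),
                                   ["enterprise_baseline_policy"])


print(policies_for("customer", "contact_details"))
# ['customer_privacy_policy', 'marketing_permissions_policy']
```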

Each privacy policy should start with the data type and its anticipated lifecycle and be aligned with the enterprise brand and the enterprise standards of conduct. The policy should add value in managing data by:

• Respecting and managing compliance with regulatory and industry standards

• Using personal information and confidential data related to it safely and ethically

• Reconciling differences and leveraging synergies between overlapping or competing enterprise policies and goals for other areas, such as audit or litigation data preservation, records management, and physical and IT security

• Establishing a basis for objective respect and trust between an enterprise and its customers, employees, and other impacted groups

As discussed in Chapters 2 and 3, there are several sets of external standards and guidelines defining privacy requirements, including the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, GAPP, PbD, sectoral and competition laws in the United States, APEC privacy accountability frameworks, and the European Union (EU) Data Protection Directive (and member states' implementation of its requirements).[1] These external guidelines and principles can provide a framework for ensuring that the Privacy Policy will offer compliance within the related jurisdictional area.

It should, of course, be noted in the privacy requirements that:

• Not all laws are granular enough to provide one objective interpretation that must be instantiated

• Not all rules and regulations can be harmonized to be free of directly conflicting standards and so-called best practices

What is possible is an objective working framework that will become the policy for the enterprise and, ultimately, the basis for process and technology policies, as described in the sidebar.
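As one illustration of how a statement in that working framework can flow down into a technology policy, consider a retention rule expressed as data that an application can enforce. The rule, the retention periods, and the names below are assumptions made for the example only:

```python
# A sketch of a framework statement carried into a technology policy: a prose
# retention rule becomes data that a batch job can enforce. The rule and the
# periods are illustrative assumptions.

from datetime import datetime, timedelta, timezone

# Framework statement (prose): "Customer contact data is kept no longer than
# 24 months after the last interaction."
RETENTION_RULES = {
    "customer_contact": timedelta(days=730),  # roughly 24 months
    "support_ticket": timedelta(days=365),
}


def is_expired(data_type: str, last_interaction: datetime) -> bool:
    """True when a record has outlived the retention period its policy allows."""
    limit = RETENTION_RULES.get(data_type)
    if limit is None:
        return False  # no rule defined; flag for review rather than delete
    return datetime.now(timezone.utc) - last_interaction > limit


# A nightly job could use this check to queue expired records for deletion.
print(is_expired("customer_contact",
                 datetime(2020, 1, 1, tzinfo=timezone.utc)))  # True (long past)
```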

By Dr. Mark Watts, Head of Information Technology Law, Bristows

Europe is not a country. It isn't. And while this will be blindingly obvious to most people reading this book, it's surprising how often I hear it assumed that Europe is essentially a country, with a single, homogenous data privacy law that sets out the rules applicable across the entire region (50 or so countries). If only life were that simple. If only European privacy rules were that simple. Sadly, they're not. And the point here is not to ridicule anyone's understanding of European geography or laws, but rather to make the point that, although when working “internationally” in privacy we all make assumptions—we have to, to rationalize the almost overwhelming legal complexity involved—making the wrong assumptions can quickly cause a project to go astray.

Perhaps the most common working assumption I see crossing my desk is that the data privacy laws of a particular country are either (i) completely and utterly different from those that apply at “home” (usually the country of the parent company) so none of our existing data privacy policy can possibly apply, or (ii) absolutely identical to those that apply at home and so we don't need any special consideration or handling in the privacy policy; in other words, the international privacy policy can simply be the same as a domestic one. Unfortunately, most of the time, neither “working” assumption works particularly well. A sensible, well-drafted data privacy policy written to meet, say, North American legal requirements will contain much of relevance and application to Europe and beyond because good information handling practices, such as transparency, data quality, and security, are just that—good practices that should transcend country borders. But equally, to assume that that's all there is to it and that, say, North American laws can be exported globally would be complacent and would be to ignore significant cultural differences and priorities, not to mention historical sensitivities. Many an international company has come unstuck making this assumption.

For example, assuming the laws that relate to monitoring employee communications in, say, Finland are the same and so just as permissive as those in the United States (an assumption we see a lot) could easily land a company in hot water. Equally, for a European-headquartered company to assume that there are no security breach notification laws in the United States simply because there are so few at home in Europe at the moment can be just as problematic. A privacy policy built on shaky, overly broad assumptions can put a company, even a company that is trying very hard to do the right thing, in breach of applicable law, despite it following its privacy policy to the letter. Perhaps more worryingly, sometimes a breach can occur precisely because a company followed a privacy policy—admittedly, a poor privacy policy—to the letter.

Shaky assumptions can lead to another, more subtle but equally problematic risk—the risk of unnecessary overcompliance. Now, this isn't to suggest that companies should develop policies requiring only the minimum amount of compliance required by local law (essentially as little as the company can get away with), but would a company really want to apply the highest common denominator—the strictest standard anywhere—to all of its operations worldwide? Surely not. For example, would it really be wise to export the highly restrictive Finnish laws on monitoring employee communications to every country where a company does business?

Most unlikely, because although this approach would ensure compliance with the communication monitoring laws of almost all other countries where the company has employees, it could seriously hamper its business operations in countries with more permissive regimes. This isn't a risk of noncompliance; it isn't a risk of breach. It's a risk of overcompliance that can fetter existing business processes, potentially inhibit sales, and, just as importantly for the privacy professional and privacy engineer, can damage their internal credibility within the company. All in all, overcompliance can be as much of a problem for the company as undercompliance.

The problem here is not that broad “international” assumptions are being made. They have to be. A global company with operations subject to the data privacy laws of hundreds of different countries cannot realistically be expected to identify every last detailed requirement of every last applicable law because, at least from a regulatory point of view, the world is still a very big place. So developing an international privacy policy (including all procedures, consent statements, contracts, and other supporting documents that go with it) has to involve making certain assumptions. It's just that they have to be the right assumptions. You have to know when it's safe to assume (or indeed, force) conformity between countries at a privacy policy level and when to leave enough room to accommodate important local differences in countries' laws.

Where does one start? As good a place as any for most companies is to think carefully about what the company actually wants its international privacy policy to do. Is it meant to be some all-singing, all-dancing document that seeks to set out the various compliance requirements for each of the countries where the company does business? Or is it intended to be something with less lofty ambitions, merely a common set of requirements that will improve compliance everywhere while accepting that in certain countries there will be a “delta” between the requirements of the policy and those of applicable law?

Well-advised companies adopt the second approach, prioritizing the simplicity of a common, global policy that leads to a “good” (and hopefully even “very good”) level of compliance everywhere over the more comprehensive and unwieldy, not to mention expensive, approach directed at full compliance everywhere, at least on paper and most likely only on paper. By adopting the second approach, companies are recognizing that there will inevitably be some specific (but hopefully minor) country legal requirements that are not covered by the policy in detail and that may be complied with only in spirit rather than to the letter. In an attempt to plug the most significant of any known “gaps” like this, companies often develop country-specific annexes or sections in their privacy policy. An example of this would be a section specific to data collected in Switzerland that extends the privacy policy's requirements to information about legal entities (e.g., companies) as well as individuals (i.e., human beings). To include such an onerous requirement in the main body of the data privacy policy would be to export the Swiss requirement globally and unnecessarily, requiring all companies to apply the policy in full to information about legal entities even though it is not legally required where they operate. Including the obligation in an additional annex to the policy and restricting it to data collected in Switzerland enables compliance with the local requirement while limiting its impact geographically.

But—tweaking the facts slightly—what if the parent company developing the privacy policy is, say, a Swiss bank? In this case it may be desirable or even essential to require its global operations to handle data about legal entities as if they were all subject to Swiss data privacy law. This would suggest that the “Swiss” provision should be included in the body of the privacy policy rather than being buried in an annex limited to data collected in Switzerland.

And this is how international privacy works; there are few if any invariably true assumptions that can be built into any global privacy policy. They always have to be considered and reconsidered on the particular facts for the company developing the policy. Done well, the result can be a robust privacy policy with a good degree of conformity from country to country, capable of generating clear technical requirements that give the privacy engineers a chance of coding “privacy.” Done poorly, the result can be a policy that's unnecessarily strict, or with too many exceptions, or which is simply too vague to be useful, any one of which can require last minute changes to the Privacy Policy (and consequently any technical requirements based on it), something which, in my experience, coders really don't seem to like.
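The country-annex approach described in the sidebar lends itself to a simple technical expression: a baseline scope that applies everywhere, plus annex entries that widen the scope only for data collected in a given country. The following sketch is illustrative only; the structure and names are assumptions rather than anything prescribed by the sidebar:

```python
# A sketch of the country-annex idea: the policy body protects individuals'
# data everywhere, and an annex widens the scope to legal entities only for
# data collected in Switzerland. Structure and names are assumptions.

BASELINE_SCOPE = {"individual"}

COUNTRY_ANNEXES = {
    # Data collected in Switzerland also covers information about legal entities.
    "CH": {"individual", "legal_entity"},
}


def in_scope(subject_kind: str, country_of_collection: str) -> bool:
    """Is data about this kind of subject treated as personal data under the
    policy, given where it was collected?"""
    scope = COUNTRY_ANNEXES.get(country_of_collection, BASELINE_SCOPE)
    return subject_kind in scope


assert in_scope("legal_entity", "CH")      # the annex applies
assert not in_scope("legal_entity", "US")  # baseline only
assert in_scope("individual", "US")
# For the Swiss-bank variant described above, "legal_entity" would simply
# move into BASELINE_SCOPE, applying the requirement globally.
```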

  • [1] OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data are available at oecd.org/document/18/0,3343,en_2649_201185_1815186_1_1_1_1,00.html. A downloadable version of the Generally Accepted Privacy Principles (GAPP), along with additional information about the development and additional privacy resources, can be found at aicpa.org/privacy. Information about the European Union's Directive on Data Protection is available at ec.europa.eu/justice_home/fsj/privacy/index_en.htm.
 