The Privacy Engineer's Manifesto

Internal vs. External Policies

Data protection standards such as the OECD Guidelines and GAPP, among others, require that privacy policies be published both internally within enterprises and externally (though externally it is usually a statement or notice of an enterprise's practices that is posted, not the actual policy) to give notice to users of systems, customers, or other data subjects interacting with the enterprise. Failure to comply with the enterprise's public notices can lead to:

Dissatisfied customers: Customers and other users will expect compliance with the privacy protection actions indicated in the notice; it may be considered an implied contract. If there is a breach, users will tend to move to safer sites. If a user discovers identity theft that appears to stem from personal information collected by an enterprise, that user will blame the enterprise maintaining the site that failed them.

Regulatory investigations: Where an enterprise has not lived up to its notice commitments, regulators from one or more jurisdictions will likely investigate the problems and may take criminal or civil action, or both, against the enterprise and, conceivably, against employees within the enterprise.

Bad publicity: Forty-six US states, the District of Columbia, plus other US territories have security breach notification laws that involve personal information. There are comparable laws throughout the world. The media keep a lookout for such notifications and determine when breaches are significant. Any breach scares people, and serious breaches equal bad publicity.

Litigation: Potential liability in privacy-related lawsuits has been increasing steadily in recent years. This expanding legal exposure has been fueled by plaintiffs' class action lawyers targeting privacy litigation as a growth area. Moreover, federal and state government agencies, as well as data protection agencies throughout Europe and Asia, are becoming increasingly aggressive in their efforts to investigate and respond to privacy and data security concerns and incidents. The Federal Trade Commission (FTC) is imposing stricter standards on businesses, while state attorneys general are pursuing enforcement actions and conducting high-profile investigations in response to data breaches and other perceived privacy violations.

Harm to brand: For most enterprises, the equity invested in their brands is an invaluable but fragile asset. When privacy protection problems occur, the enterprise's reaction is crucial to maintaining a positive brand.

Weak innovation: Effective innovation comes from making improved products that deliver what people want. Finding out what customers and potential customers want requires the collection of data. An enterprise that does not protect the privacy of data weakens its ability to collect the data needed to determine where innovation is required.

Employee distrust: Just as customers can be turned off when privacy notice failures occur, employees can begin to distrust their enterprise when their data is not protected as the privacy notice promises.

An enterprise should consider creating training based on internal privacy rules that are more granular, specific, and restrictive than externally posted notices. These internal policies should be coordinated with a human resources policy team to ensure that staff and business partners know exactly what to do, how to get help when they need it, and how and when these rules will be enforced and encouraged.

These policies must all be reflected and instantiated in product and systems development, as discussed further in Chapters 5 and 6.

By Dr. Annie I. Antón, Professor in and Chair of the School of Interactive Computing at the Georgia Institute of Technology

Peter Swire, Nancy J. and Lawrence P. Huang Professor, Scheller College of Business, Georgia Institute of Technology

In March 2013 we participated in a panel titled “Re-Engineering Privacy Law” at the International Association of Privacy Professionals Privacy Summit. The topic of the panel closely matches the topic of this book: how to bring together and leverage the skill sets of engineers, lawyers, and others to create effective privacy policy with correspondingly compliant implementations. As a software engineering professor (Antón) and a law professor (Swire), we consider four points: (1) how lawyers make simple things complicated; (2) how engineers make simple things complicated; (3) why it may be reasonable to use the term “reasonable” in privacy rules but not in software specifications; and (4) how to achieve consensus when both lawyers and engineers are in the room.

1. How lawyers make simple things complicated. A first-year law student takes Torts, the study of accident law. A major question in that course is whether the defendant showed “reasonable care.” If not, the defendant is likely to be found liable. Sometimes a defendant has violated a statute or a custom, such as a standard safety precaution. More often, the answer in a lawsuit is whether the jury thinks the defendant acted as a “reasonable person.” The outcome of the lawsuit is whether the defendant has to pay money or not. We all hope that truth triumphs, but the operational question hinges on who can prove what in court.

The legal style is illustrated by the famous Palsgraf case.[1] A man climbs on a train pulling out of the station. The railroad conductor assists the man into the car. In the process, the man drops a package tucked under his arm. It turns out the package contains fireworks, which explode, knocking over some scales at the far end of the platform. The scales topple onto a woman, causing her injury.

From teaching the case, here is the outline of a good law student answer, which would take several pages. The answer would address at least four issues. For each issue, the student would follow IRAC (Issue, Rule, Analysis, Conclusion) form, discussing the issue, the legal rule, the analysis, and the conclusion: (1) Was the man negligent when he climbed on the moving train? (2) When the railroad conductor helped the man up, was the conductor violating a safety statute, thus making his employer, the railroad, liable? (3) When the man dropped the fireworks, was it foreseeable that harm would result? (4) Was the dropping of the package the proximate cause of knocking over the scales? In sum, we seek to determine whether the railroad is liable. The law student would explain why it is a close case; indeed, the actual judges in the case split their decision 4-3.

Engineers design and build things. As such, they seek practical and precise answers. Instead of an IRAC form, engineers seek to apply scientific analytic principles to determine the properties or state of the “system.” The mechanisms of failure in the Palsgraf case would be analyzed in isolation: (1) The train was moving; therefore, the policy of only allowing boarding while the train is stopped was not properly enforced, thereby introducing significant safety risk into the system. (2) The scales were apparently not properly secured, so a vibration or simple force would have dislodged them, introducing safety risk into the system. Is the railroad liable? An engineer would conclude that the compliance violation and the unsecured scales mean that it is liable. The engineering professor would congratulate the engineering student for the simple yet elegant conclusion based on analysis of isolated components; in engineering, simplicity is the key to elegance.

The lawyer may agree in theory that simplicity is the key to elegance, but law students and lawyers have strong reasons to go into far more detail. The highest score on a law school exam usually spots the greatest number of issues; it analyzes the one or two key issues, but also creates a research plan for the lawyers litigating the case. For example, the railroad has a safety rule that says the conductor shouldn't help a passenger board when the train is moving, but surely there are exceptions? In the actual case (or the law school exam), the lawyer would likely analyze what those exceptions might be, especially because finding an applicable exception will free the railroad from liability. The good exam answer may also compare the strange chain of events in Palsgraf to other leading cases, in order to assess whether the plaintiff can meet her burden for satisfying the difficult-to-define standard for showing proximate cause.

In short, lawyers are trained to take the relatively simple set of facts in Palsgraf and write a complex, issue-by-issue analysis of all the considerations that may be relevant to deciding the case. The complexity becomes even greater because the lawyer is not seeking to find the “correct” answer based on scientific principles; instead, the lawyer needs to prepare for the jury or judge, and find ways, if possible, to convince even skeptical decision-makers that the client's position should win.

2. How engineers make simple things complicated. A typical compliance task is that our company has to comply with a new privacy rule. For lawyers, this basically means applying the Fair Information Privacy Principles (FIPPs), such as notice, choice, access, security, and accountability. The law is pretty simple.

The engineer's response is: How do we specify these rules so that they can be implemented in code? Stage one: specify the basic privacy principles (FIPPs). Stage two: specify commitments expressed in the company privacy notice. Stage three: specify functional and nonfunctional requirements to support business processes, user interactions, data transforms and transfers, security and privacy requirements, as well as corresponding system tests.

As an example, some privacy laws have a data minimization requirement. Giving operational meaning to “data minimization,” however, is a challenging engineering task, requiring system-by-system and field-by-field knowledge of which data are or are not needed for the organization's purposes. Stuart Shapiro, Principal Information Privacy & Security Engineer, The MITRE Corporation, notes that an implementation of data minimization in a system may have 50 requirements and 100 associated tests. Input to the system is permitted only for predetermined data elements. When the system queries an external database, only the approved data fields may be accessed. There must be executable tests: apply them to test data first, then confirm that data minimization is achieved under various scenarios.

For the lawyer, it is simple to say “data minimization.” For the engineer, those two words are the beginning of a very complex process.
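To make the contrast concrete, the allowlist-and-test pattern described above can be sketched in a few lines. This is only an illustrative sketch, not MITRE's or anyone's actual implementation; the field names, purposes, and the `minimize` helper are all hypothetical.

```python
# Hypothetical sketch of one data-minimization control: an allowlist
# filter that strips any field not approved for a given purpose.
# Purposes and field names are invented for illustration only.

APPROVED_FIELDS = {
    "billing": {"name", "postal_code", "card_token"},
    "analytics": {"postal_code"},  # no directly identifying fields
}

def minimize(record: dict, purpose: str) -> dict:
    """Return only the fields approved for this purpose; reject unknown purposes."""
    if purpose not in APPROVED_FIELDS:
        raise ValueError(f"no approved field set for purpose: {purpose}")
    allowed = APPROVED_FIELDS[purpose]
    return {k: v for k, v in record.items() if k in allowed}

# Executable tests of the kind Shapiro describes: run against test data
# and confirm that minimization holds under various scenarios.
record = {"name": "Ada", "postal_code": "30332", "card_token": "tok_1", "ssn": "000"}

assert minimize(record, "billing") == {
    "name": "Ada", "postal_code": "30332", "card_token": "tok_1"}
assert minimize(record, "analytics") == {"postal_code": "30332"}
assert "ssn" not in minimize(record, "billing")  # unapproved field never passes
```

A production system would multiply this across every input path, query, and transfer, which is how two words in a statute become dozens of requirements and a hundred tests.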

3. Why it may be reasonable to use the term “reasonable” in privacy rules. Swire was involved in the drafting of the HIPAA medical privacy rule in 1999–2000. Antón, the engineer, has long chastised Swire for letting the word “reasonable” appear over 30 times in the regulation. Words such as “promptly” and “reasonable” are far too ambiguous for engineers to implement. For example, consider HIPAA §164.530(i)(3): “the covered entity must promptly document and implement the revised policy or procedure.” Engineers can't test for “promptly.” They can, however, test for 24 hours, 1 second, or 5 milliseconds. As for “reasonable,” the rule requires “reasonable and appropriate security measures”; “reasonable and appropriate policies and procedures” for documentation; “reasonable efforts to limit” collection and use “to the minimum necessary”; a “reasonable belief” before releasing records relating to domestic violence; and “reasonable steps to cure the breach” by a business associate.

The engineer's critique is: How do you code for “promptly” and “reasonable”? The lawyer's answer is that the HIPAA rule went more than a decade before being updated for the first time, so the rule has to apply to changing circumstances. The rule is supposed to be technology neutral, so drafting detailed technical specs is a bad idea, even though that's exactly what engineers are expected to do to develop HIPAA-compliant systems. There are many use cases and business models in a rule that covers almost 20% of the US economy. Over time, the Department of Health and Human Services can issue FAQs and guidance as needed. If the rule is more specific, then the results will be wrong. In short, lawyers believe there is no better alternative in the privacy rule to saying “reasonable.”

The engineer remains frustrated by the term “reasonable,” yet accepts that the term is intentionally ambiguous because it is for the courts to decide what is deemed reasonable. If the rule is too ambiguous, however, it will be inconsistently applied, and engineers risk legal sanctions on the organization for developing systems not deemed to be HIPAA compliant. In addition, “promptly” is an unintentional ambiguity that was preventable in the crafting of the law. By allowing engineers in the room with the lawyers as they decide the rules that will govern the systems the engineers must develop, we can avoid a lot of headaches down the road.

4. How to achieve happiness when both lawyers and engineers are in the same room. Organizations today need to have both lawyers and engineers involved in privacy compliance efforts. An increasing number of laws, regulations, and cases, often coming from numerous states and countries, place requirements on companies. Lawyers are needed to interpret these requirements. Engineers are needed to build the systems.

Despite their differences, lawyers and engineers share important similarities. Both are highly analytic. Both can drill down and get enormously detailed in order to get the product just right. And each is glad when the other gets to do those details. Most engineers would hate to write a 50-page brief. Most lawyers can't even imagine specifying 50 engineering requirements and running 100 associated tests.

The output of engineering and legal work turns out to be different. Engineers build things. They build systems that work. They seek the right answer. Their results are testable. Most of all, it “works” if it runs according to spec. By contrast, lawyers build arguments. They use a lot of words; “brief” is a one-word oxymoron. Lawyers are trained in the adversary system, where other lawyers are trying to defeat them in court or to get a different legislative or regulatory outcome. For lawyers, it “works” if our lawyers beat their lawyers.

Given these differences, companies and agencies typically need a team. To comply, you need lawyers and engineers, and it helps to become aware of how to create answers that count for both the lawyers and the engineers. To strike an optimistic note, in privacy compliance the legal and engineering systems come together. Your own work improves if you become bilingual: if you can understand what counts as an answer for each profession.

We look forward to trying to find an answer about how to achieve happiness when both lawyers and engineers are in the room. Antón presumably is seeking a testable result. Swire presumably will settle for simply persuading those involved. However, we both agree that the best results come from collaboration, because of the value, knowledge, and expertise that both stakeholder groups bring to the table.

Policies, Present, and Future

Policies have to be living documents that can be readily changed as a business changes or as the regulatory environment changes; however, they should not be changed lightly or on a whim. There is overhead associated with policy changes, especially in the privacy space. For instance, a change in policy may indicate a change in the use of data, which may then require an enterprise to provide notice of the change to those whose data is affected and to get permission for the new uses of the data. Even without a pressing need for change, it is important to review policies on a regular basis, perhaps annually, to determine whether change is necessary.

A good policy needs to be forward looking and, at the same time, accurate to the current state. It should be sufficiently detailed as to give direction and set parameters, but not so detailed as to be overly specific or to require excessive change. Each enterprise will need to find the balance between what is communicated as “policy” and what is communicated as an underlying standard or guideline for meeting the requirements of the policy. Key stakeholders should review policies and practices at least annually to see if revisions are warranted.

Engineered privacy mechanisms can ease the change and improvement of the policies, especially with the specific procedures, standards, guidelines, and privacy rules that need to change if there are policy revisions. The privacy component discussed in Chapters 6, 7, 8, and 9 addresses this crucial need.


Privacy policies are powerful tools in the overall privacy engineering process. Privacy professionals, lawyers, and compliance teams can use them to communicate expected behaviors and leverage them to create accountability measures. In the process of policy creation, internal and external requirements and expectations (including those of systems' users and regulators) must be gathered. These same requirements and expectations, in the traditional lexicon, can also be leveraged as engineering requirements in the privacy engineering model and execution sense. We will explore how such requirements fit into a system's model in Chapters 5 and 6. In the remaining chapters of Part 2, we will continue to call on these policy requirements in the context of discrete tools and features that rest in the privacy engineering toolkit.

  • [1] Palsgraf v. Long Island Railroad Co., 248 N.Y. 339 (N.Y. 1928).