Privacy Requirements Derived from Privacy Frameworks

In Chapters 2 and 3, we explained how the privacy frameworks provide guidance as to which privacy requirements should be included in the requirements statements, be they in a use case, in the user-experience definition, or in the data-model metadata. The following outline provides some privacy requirements to be considered:

Purpose: Collect and process for purposes that are relevant to the services being provided. PI must not be collected or used for purposes that are materially different from the original purpose for which the data were provided:

• What purposes do the PI data serve?

• Does each personal information data attribute have a direct relationship to the purpose for which it was collected and processed?

• What privacy rules are needed to ensure that the purpose principle is satisfied? Are there other metadata that support the purpose principle? (A minimal purpose-binding sketch follows this list.)

• Is there a chance that a data subject, whether an individual or an enterprise, would be embarrassed or damaged by the processing or publication of the personal data?
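To make the purpose principle concrete, the following is a minimal Python sketch of purpose-binding metadata, in which each PI attribute carries the purposes declared at collection and any other use is refused. The attribute names and purpose labels are illustrative assumptions, not part of any framework:

# Hypothetical attribute-to-purposes map; a real system would keep this
# in the data-model metadata rather than in code.
ATTRIBUTE_PURPOSES = {
    "email": {"account_management", "service_notifications"},
    "shipping_address": {"order_fulfillment"},
}

def check_purpose(attribute: str, purpose: str) -> bool:
    """Allow processing only if this use matches a declared collection purpose."""
    return purpose in ATTRIBUTE_PURPOSES.get(attribute, set())

assert check_purpose("email", "service_notifications")
assert not check_purpose("shipping_address", "marketing")  # materially different use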

Notice: System creators, owners, and fiduciaries must explain to users how their information will be used, collected, protected, retained, kept accurate, accessed, corrected, or otherwise processed before any processing occurs:

• Does the requirements statement define a complete notice that satisfies the notice principle?

• Is the intended processing included in the notice (some types of processing may require supplemental or just-in-time notices)?

• How and when will the notices be presented to the user (and how, or whether, the user will need to acknowledge or accept them)?

• Are there statutory or common law requirements concerning notice in every jurisdiction the system touches?

• Is the notice clear, consistent, relevant, and current?

• Can innovative presentation techniques be used to explain the notice requirements in a way that encourages review and facilitates understanding (for instance, would animation or a pop-up video make the notice more appealing and clearer)?

Choice/consent: Data subjects must consent to the collection and use of their personal information:

• Choices must be shown clearly.

• Choices must not be easily bypassed or ignored.

• Defaults must be explained clearly.

• Prechecked boxes should be avoided.

• Defaults should either lessen the sharing of PI or be so clearly tied to the notice and the context that the only reasonable expectation any user could have is that the information will be shared (a form of implied or informed consent).

• Tools that let the privacy engineer record, audit, and correct the choices made by the data subject should be considered during all phases of the PI data lifecycle (a minimal consent-ledger sketch follows this list).

• Limiting the types of data that may be collected and segmenting more sensitive PI (or disassociating identifying attributes from aggregate data) are technical and managerial as well as policy and legal decisions.
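To suggest what recording, auditing, and correcting choices might look like in practice, here is a minimal Python sketch of an append-only consent ledger; all names and fields are hypothetical:

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentEvent:
    subject_id: str
    purpose: str      # what the data subject chose to allow or refuse
    granted: bool     # True = opt in, False = opt out or withdrawal
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ConsentLedger:
    """Append-only log: later events supersede earlier ones, nothing is erased."""
    def __init__(self) -> None:
        self._events: list[ConsentEvent] = []

    def record(self, event: ConsentEvent) -> None:
        self._events.append(event)            # the auditable history is preserved

    def current_choice(self, subject_id: str, purpose: str) -> bool:
        for event in reversed(self._events):  # the most recent decision wins
            if event.subject_id == subject_id and event.purpose == purpose:
                return event.granted
        return False                          # default: no sharing without consent

Because the ledger is append-only, a correction is just a new event, so the original choice remains available for audit along the way.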

Transfer: Data should not be transferred to third parties for their own use without the data subject's permission:

• Data transferred to and from a third party must be “adequately protected” by contract, administrative, technical, logical, and physical means.

• The transfer of data to different geographic areas, such as member-states of the European Union, may require an additional legal mechanism (such as Safe Harbor Certification or Model Contracts) to make the transfer legitimate.

• PI should not be transferred to third parties without the proper procedures included as part of the overall architecture. Are the proper procedures in place for all types of third-party transfers and all impacted jurisdictions?

• No PI should be transferred to a third party or geographic area without appropriate agreements and approved security provisions that detail how the data will be processed and how they will be protected. As part of vendor management and the sourcing or procurement process, ensure third parties are vetted from a privacy and security controls perspective before data feeds are enabled.

• Encryption and obfuscation techniques are the obvious tools to leverage when a system owner wishes to prevent an attack on data in motion; a minimal TLS sketch follows this list.
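As one minimal sketch of protecting data in motion, the following Python snippet uses only the standard library's TLS support; "example.com" is a placeholder endpoint, not a recommendation:

import socket
import ssl

context = ssl.create_default_context()   # verifies the server certificate by default
with socket.create_connection(("example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        # Anything written here is encrypted on the wire, and the server's
        # identity has been checked against the system trust store.
        tls_sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
        print(tls_sock.recv(1024))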

Access, correction, deletion: Data subjects must have a means of accessing the personal information that has been collected about them. They are also entitled to amend or delete false or inaccurate data:

• Can data be segmented (grouped together) so that different segments can be handled with different privacy, encryption, or security rules? (See the segmentation sketch after this list.)

• Can roles be defined so that privacy risks can be managed by means of privacy rules?

• Are rules concerning correction and deletion in compliance with the laws or regulations of all jurisdictions impacted by the system or process or by the enterprise policies?

• Although there is currently heavy debate over proposals to include a yet-to-be-defined “right to be forgotten,” a right of deletion is not absolute as of this date. (We will approach this subject again in Chapter 14 in our discussion of how the future may look on a broad scale.[1]) For the privacy engineer, engineering tactics that allow for the search and removal of common media such as photos or video, along with some ability to add metadata, would be a wise addition as this debate widens.
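As an illustration of segmentation and role-based privacy rules, consider this Python sketch; the segment names, roles, and rules are assumptions made for the example:

SEGMENT_RULES = {   # different segments carry different privacy/security rules
    "identifiers":   {"roles": {"privacy_officer"}, "encrypt": True},
    "contact_info":  {"roles": {"support", "privacy_officer"}, "encrypt": True},
    "usage_metrics": {"roles": {"analyst", "support", "privacy_officer"}, "encrypt": False},
}

RECORD_SEGMENTS = {  # which attributes live in which segment
    "identifiers":   ["ssn", "passport_no"],
    "contact_info":  ["email", "phone"],
    "usage_metrics": ["page_views", "last_login"],
}

def visible_attributes(role: str) -> list[str]:
    """Return only the attributes a given role is permitted to see."""
    allowed = []
    for segment, rule in SEGMENT_RULES.items():
        if role in rule["roles"]:
            allowed.extend(RECORD_SEGMENTS[segment])
    return allowed

print(visible_attributes("analyst"))  # -> ['page_views', 'last_login']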

Security: Use appropriate technical, logical, and administrative measures to ensure only authorized access and use of data:

• Do you leverage ISO and other standards for information and physical security (see ISO framework as discussed in Chapter 3) and work with information security teams within your enterprise?

• Are the security and encryption rules defined for each data attribute? (A field-level encryption sketch follows this list.)

• Are security rules covered for data transfers, especially across jurisdictional lines?
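One way to express per-attribute encryption rules is the following Python sketch, which assumes the third-party cryptography package; the choice of which attributes count as sensitive is illustrative:

from cryptography.fernet import Fernet

ENCRYPT_ATTRIBUTES = {"ssn", "health_status"}  # per-attribute security rule
key = Fernet.generate_key()                    # in practice: a key from a managed KMS
cipher = Fernet(key)

def store_value(attribute: str, value: str) -> bytes:
    """Apply the attribute's encryption rule before the value is persisted."""
    if attribute in ENCRYPT_ATTRIBUTES:
        return cipher.encrypt(value.encode())
    return value.encode()

token = store_value("ssn", "123-45-6789")
assert cipher.decrypt(token) == b"123-45-6789"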

Minimization: Collect and process the minimum necessary data to achieve the identified, legitimate intended purposes. The minimization principle is closely related to the purpose limitation requirement where only the necessary PI is collected and processed to achieve a legitimate purpose:

• For each piece of personal information being collected, the following statement must be true: without data attribute X, I cannot do legitimate task Y, and I need no less than X to do Y. Is each personal information attribute being collected needed to accomplish the solution being designed, or is it being collected “just in case”? (A minimal collection check is sketched after this list.)

• If data are being collected for potential big data purposes, can big data analysis be used to identify a person, thus raising a potential privacy issue?
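The “without X, I cannot do Y” test can be mechanized. The following Python sketch flags attributes collected “just in case”; the task-to-attribute mappings are purely illustrative:

REQUIRED_FOR_TASK = {   # hypothetical map of legitimate tasks to required PI
    "order_fulfillment": {"name", "shipping_address"},
    "payment":           {"card_token"},
}

def minimal_attributes(tasks: set[str]) -> set[str]:
    """Union of what the declared tasks strictly require, and nothing more."""
    needed: set[str] = set()
    for task in tasks:
        needed |= REQUIRED_FOR_TASK[task]
    return needed

proposed = {"name", "shipping_address", "date_of_birth"}  # a form under review
excess = proposed - minimal_attributes({"order_fulfillment"})
print(excess)  # -> {'date_of_birth'}: collected "just in case", so drop it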

Proportionality: Data collection should be legitimately proportional to need, purpose, and sensitivity of data. This requirement can be abstracted one step further to connect those data to quality and value:

• As with minimization and purpose, can collection be limited wherever possible to only what is required?

• Think in terms of a Venn diagram that parses the proposed data, asking: what is the minimum data set that remains proportional to the intended purpose?

• Is the amount of data and the character of data itself proportional to the purpose, the sensitivity of the data, or the risk to a data subject should it be compromised?

• Do the data subject and the data fiduciaries keep a common perspective in which risk and value are balanced? The following comparison may be considered for overall data protection and governance efforts, but it also fits well into the discussion of proportionate collection and use: Data Value (DV) > Data Risk (DR) = Success (a minimal sketch follows this list).
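Read as a predicate, the comparison simply says that collection succeeds only while value exceeds risk. A minimal Python sketch, with scores that are arbitrary illustrations since the text prescribes no scoring scale:

def proportionate(data_value: float, data_risk: float) -> bool:
    """Collection is justified only while Data Value (DV) exceeds Data Risk (DR)."""
    return data_value > data_risk

# A sensitive attribute with modest analytic value fails the test.
print(proportionate(data_value=3.0, data_risk=7.5))  # -> False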

Retention: Retain data only as long as they are required:

• Are archiving rules for each data attribute well established?

• Instead of determining how long data can be kept, determine how soon the data (in whole or in part) can be deleted, and act accordingly; wherever possible, define controls to automate deletion (a retention sketch follows this list). If automation is not feasible from a business or technical perspective, have data inventory review and deletion processes (archiving rules) been created?

• Have data destruction tactics such as degaussing, permanently encrypting and destroying the keys, or overwriting data after a specific deadline been considered?
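Automated deletion driven by per-attribute archiving rules might look like the following Python sketch; the retention periods are illustrative, not legal guidance:

from datetime import datetime, timedelta, timezone

RETENTION = {   # hypothetical archiving rule per attribute
    "ip_address":    timedelta(days=30),
    "order_history": timedelta(days=365 * 7),
}

def purge(records: list[dict]) -> list[dict]:
    """Drop each attribute as soon as its retention clock runs out.
    Attributes with no rule are deleted immediately (delete by default)."""
    now = datetime.now(timezone.utc)
    purged = []
    for rec in records:   # each record carries a 'collected_at' timestamp
        age = now - rec["collected_at"]
        purged.append({k: v for k, v in rec.items()
                       if k == "collected_at" or age <= RETENTION.get(k, timedelta(0))})
    return purged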

Act responsibly: Put a privacy program in place:

• Is the privacy team included on the project team?

• Has a privacy-oriented data governance or data stewardship program been established?

By Peggy Eisenhauer, Founder of Privacy & Information Management Services, Margaret P. Eisenhauer, P.C.

In 2008, my son received a fun game, Fluxx,[2] for his birthday.

Fluxx calls itself “The Card Game with Ever-Changing Rules!” It could also call itself “The Card Game that Provides a Perfect Metaphor for Privacy Requirements!”

Fluxx game play is quite simple. There are four kinds of cards: Actions, Rules, Keepers, and Goals. The Basic Rule says that, on each turn, you draw one card from the deck and play one card. To win, you collect the Keepers and meet the requirements of the Goal.

The twist with Fluxx is that everything can change while you're playing. Players take Actions, such as stealing a Keeper. Players change the Rules. Instead of draw 1, you have to draw 3. Or play all the cards in your hand. One possible Rule prevents you from winning unless you meet the Goal and have the Radioactive Potato card. Some of the Actions change the Rules. For example, one Action card lets you “Trash a Rule.” Any player can change the Goal as well. Instead of needing the “Milk” and “Cookies” Keepers, now you need the “Rocket” and the “Moon.” And nonplayers can join the game at any time by picking up three cards from the top of the deck.

The ever-changing nature of Fluxx illustrates the challenges that we face in defining privacy requirements. When setting privacy requirements, we consider the data elements and proposed processing activities and establish rules to address four sets of mandates: (1) specific legal requirements for privacy and security, (2) requirements driven by internal policies, (3) requirements driven by industry standards, and (4) requirements that likely exist as a matter of stakeholder expectations for appropriate processing. At a particular point in time, the first three types of requirements can be objectively known. The fourth type of requirement (addressing consumer and regulator expectations for appropriate use) is subjective. For example, consumers generally feel that processing to prevent fraud is appropriate, but they disagree as to whether it is appropriate for companies to scan all the data off their driver's licenses in connection with a retail product return. Nonetheless, privacy professionals can collaborate with their product design teams to document a solid set of privacy requirements for any proposed activity.

As is always the case with requirements engineering, however, the requirements change over time. This is especially true for privacy requirements, due to the rapid evolution of legal standards and industry codes, mercurial consumer and regulatory views about privacy, and the dynamic nature of internal policies (which must, at a minimum, keep up with the laws). Privacy requirements are also challenged by changes within the business itself. Within any given company:

Business objectives constantly evolve, creating new goals. Companies want to wring every last ounce of value from their data, leading to new uses for existing databases, novel types of analytics and business intelligence processes, and increasing pressure to leverage customer or consumer data assets to benefit partners and create incremental revenue streams.

Business requirements evolve, requiring new rules. Companies move from controlled technology deployment to BYOD (bring your own device) programs. Customers and consumers are increasingly empowered to engage via social media platforms and mobile apps.

Routine actions have consequences. Companies outsource, requiring new rules for vendor management and data transfers. They enhance data and create inferred data, pushing the boundaries of what may be considered appropriate by consumers and requiring new rules for notice, choice, and access.

Companies have security breaches. Even the actions of other entities can have consequences, as we know from revelations about government programs that demand access to consumer data.

Privacy professionals and product designers also need to recognize that some business attributes hinder achievement of some privacy objectives, and even privacy objectives sometimes compete. Let's consider a real-life scenario: a company may be committed to providing more transparency, but this may trigger an expectation that consumers will have additional choice. For example, the company may disclose that it is using cookies to recognize consumers' devices, but consumers will then want the ability to opt out of having the cookies placed. However, providing additional choice may make it more difficult to meet security requirements, for instance, if the cookies are used as part of a multifactor authentication process.

Additionally, as in Fluxx, the actions of various stakeholders (and the orders of actions) are not predictable. Nonplayers (such as regulators) routinely take actions that affect the business. Nonplayers (especially regulators) can also change the rules. It is thus critical to have a deep understanding of not only the legal requirements for data processing but also the more subjective opinions about appropriateness of the processing held by the myriad stakeholders: employees, consumers, industry groups, regulators. Because the rules are rapidly changing, companies must anticipate new requirements so they can implement new rules effectively and efficiently.

Consider, for example, the legal requirements for collecting children's data. The “Basic Rule” in the United States under the Children's Online Privacy Protection Act (COPPA) required verifiable parental consent in order to collect personal information from children under 13 years old via a commercial web site. COPPA regulated web sites that targeted kids as well as general interest web sites, if the site operators knew that kids were submitting personal information. Although COPPA was one of the most effective (and enforced) privacy laws, concerns persisted about the collection of personal information from children. These concerns intensified when studies revealed that web sites targeting children were collecting vast amounts of data via passive technologies, such as cookies, without triggering COPPA requirements.

In 2012, the Federal Trade Commission revised the COPPA rules to add new requirements. The new rule expands the definition of “personal information” to include persistent identifiers, device IDs, geolocation data, and images. It also expands the definition of “website or online service directed to children” to clarify that a plug-in or ad network is covered by the Rule when it knows or has reason to know that it is collecting personal information through a child-directed web site or online service. All web sites that appeal to children must age-gate users. Companies operating general interest web sites online are now playing a very different game.
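A neutral age gate of the kind the revised rule pushes sites toward might be sketched as follows in Python; the under-13 threshold comes from COPPA, while the function itself is a hypothetical illustration:

from datetime import date
from typing import Optional

def needs_parental_consent(birth_date: date, today: Optional[date] = None) -> bool:
    """True if the visitor is under 13, so no personal information (including
    persistent identifiers) may be collected without verifiable parental consent."""
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    return age < 13

print(needs_parental_consent(date(2012, 6, 1), today=date(2024, 1, 1)))  # -> True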

Like every great game, Fluxx has expansion packs and variant versions, such as Eco-Fluxx, Zombie Fluxx, and (our personal favorite) Monty Python Fluxx.

Expansion packs for “Privacy Fluxx” should be coming as well. Information security requirements are constantly evolving and becoming more prescriptive. The “InfoSec Monster Mandate Fluxx” will require very specific types of security controls and impose even stricter liability for data breaches. We can see this trend today in jurisdictions such as Russia and South Korea.

“Consumer Protection Fluxx” is already imposing new rules based on evolving concepts of privacy harm. Expect new purpose limitation requirements and “minimum necessary” standards as well as opt-in consent for secondary uses of personal information. Additional limits on the collection, use, and disclosure of personal information based on the nature of the technology used (such as cookies) are also featured in this version.

Companies that operate in a multijurisdictional environment know the challenges associated with “Global Data Protection Fluxx” quite well. These companies will face exponentially greater complexity as they define privacy requirements for systems and processing activities that touch data from multiple countries. These companies must account for all possible local requirements and implement controls to meet the most restrictive requirements. As in the United States, international data protection authorities are focused on data security and consumer protection. International regulators seek to achieve privacy goals by limiting data retention periods and cross-border data transfers.

  • [1] It should be noted that no current proposal requires notifying the victims of the thoughtless or malicious behavior that now seems to dominate legislators' attention and has catapulted the legal discussion into historical data revision. To be fair, legislators act with the innocent intent of letting people remove silly or youthful indiscretions. From an engineering perspective, it is worth thinking creatively about how a system should react when a picture or court filing is removed from public view even though that media or filing specifically affects a victim or other third-party beneficiary. Perhaps a metadata attribute cataloging the individuals identified in the media is needed, so that the person requesting removal would have to document the approval of impacted parties before holders of valid information were forced to remove it. This is worthy of debate and a great deal of design before such a notion is enacted into law.
  • [2] Fluxx is a product from Looney Labs, available online at looneylabs.com/games/fluxx or at local retailers.

 