
The Dawning of the Personal Information Service Economy

The Information Age, the service economy, and the ability to provide and derive value from personal information are combining like never before. Accordingly, a new class of services is unfolding: "personal information services." There are currently at least two classes of personal information services.

The first type comprises services that aspire to help individuals manage and protect personal information. Security tools, single sign-on/identity management services, "do not track" technologies and policies, and compliance solutions for managing web-based cookies are all examples of this kind of personal information service.

The second type consists of services that use personal information to provide value—sometimes to the individual and often to the enterprise. Examples of such services are personalization tailored to individual wants and desires, device recovery, and data retrieval and cloaking services. Clearly, there is overlap between data management and value-based services, and a near-infinite possibility for combining value propositions for personal data in emerging business, cultural, and individual value scenarios.

As individuals contribute more about what they want to do and what they want their communities to do (either socially or economically), the combination of all these actions will impact the whole economy. Personal information services may become a pivotal economic resource that can drive or measure an economy.

Data-Centric and Person-Centric Processing

There is a powerful movement toward data-centric and person-centric computing. "Data-centric" implies that data, and the information processed from it, are primary design drivers.

"Person-centric" implies that a person is also a primary design driver. Taken together, data-centric and person-centric processing involves the processing of personal information (PI) and thus potential privacy concerns. Privacy engineering is a crucial competency when designing and implementing data-centric and person-centric systems. Data-centric and person-centric design and execution require a proactively engineered system architecture because:

• It takes data to protect data. We need to collect data from customers and those with whom customers may interact to determine whether privacy rules based on statutory or enterprise privacy policies apply.

• The scope of PI is expanding. What was once considered just “machine” data (i.e., not personal) is being recognized as something else.

• DV > DR = Success. A well-designed system ensures that data value (DV) exceeds data risk (DR).

• Privacy engineering is about user experience, brand definition and augmentation, and meeting customer satisfaction.

• Privacy engineering also translates into repeatable engineering principles rather than handcrafted one-off design and execution.
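The first and third bullets above can be made concrete in code. The following sketch is purely illustrative — the policy table, category names, and 0–10 scoring scale are all invented for this example, not drawn from any real compliance framework — but it shows the shape of checking whether a privacy rule applies to a category of collected data and whether the DV > DR condition holds:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DataAsset:
    """A unit of personal information an enterprise might collect (illustrative)."""
    category: str       # e.g., "email", "location", "device_id"
    value_score: float  # estimated data value (DV) on an arbitrary 0-10 scale
    risk_score: float   # estimated data risk (DR) on the same scale

# Hypothetical policy table mapping data categories to the rule that governs them.
# A real system would derive this from statutory and enterprise privacy policies.
POLICY_RULES = {
    "email": "enterprise opt-in policy",
    "location": "statutory consent requirement",
}

def applicable_rule(asset: DataAsset) -> Optional[str]:
    """Return the privacy rule governing this asset's category, if any.

    This is the "it takes data to protect data" step: the category itself
    is data we must collect in order to decide which rules apply.
    """
    return POLICY_RULES.get(asset.category)

def value_exceeds_risk(asset: DataAsset) -> bool:
    """The DV > DR test: collect and retain only when value outweighs risk."""
    return asset.value_score > asset.risk_score

email = DataAsset("email", value_score=7.0, risk_score=3.0)
print(applicable_rule(email))     # -> enterprise opt-in policy
print(value_exceeds_risk(email))  # -> True
```

In practice both checks would be far richer (consent state, jurisdiction, retention schedules), but the design point stands: rule applicability and the value/risk comparison are decisions the architecture must support from the start, not bolt on later.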

By John Berard, Founder, Credible Context

Got your attention? Privacy is certainly a problem that can be solved. But first our mindset needs to change. Privacy—encompassing the transparent collection, secure storage, meaningful use, and scheduled deletion of personally identifiable data—has no single right answer.

That's because privacy is not a single, static goal. Unlike the strength of a bridge, the yaw of a yacht, or a recurring field in a relational database, privacy is a lock that opens with a combination, not a key. In designing systems to deliver on a commitment to privacy, the variables of time, place, platform, and intended use are only a few of the constantly changing inputs that can overwhelm a more traditional, linear engineering approach.

Whereas a bridge needs to accommodate the weight of cars and trucks, a yacht must navigate the hull pressure of tide and wind, and a database seeks to create order out of business chaos, privacy hopes to deliver on something even more challenging: the expectations of people. Worse, the need is to meet the expectations of people not as a group but as individuals. There is no engineering table for privacy.

This is guidance that can be drawn from former IBM executive Irving Wladawsky-Berger,[1] who made a career of applying technology solutions to new classes of highly complex problems, "many based on disruptive innovations which we have not encountered before." Sound familiar? No development has been quite as disruptive as the Internet and the digital data stream it creates.

This is why many find “privacy by design” such a compelling concept. As described by the course catalog at Carnegie Mellon, which offers a privacy engineering degree, the emphasis of “privacy by design” is on “safeguards that can be incorporated into the design of systems and products from the very beginning of the development process.”[2]

Rather than retrofit systems with data protection and privacy attributes, the notion is, “Wouldn't it be great to build them in at the start?” But is that the answer?

The difficulty is in defining what “them” are.

To engineer a solution to meet the demands of consumers, business, and government for more transparent, informed, and value-driven use of data, we need to think holistically. Data protection and privacy engineering cannot only be about structured collection, hardened storage, authenticated access, and clear use, but must also accommodate the kind of variability that is human behavior.

We know that most existing systems running on data are designed as point solutions. The Internal Revenue Service (IRS) needs data to ensure the size of our refund, supermarkets need data to stock their shelves, and colleges see Social Security numbers as a way to manage applicant and student files.

But what happens when the IRS can see new enforcement value in our data? What happens when the supermarkets see more value in selling our shopping preferences to advertisers? And what happens when college entrance decisions become based not on scores but on social context?

In each case, the very best engineered solution to what had been the present problem did not have the ability to anticipate what insights might arise, or the flexibility to cover whatever might come next—and something always comes next.

Data can now be connected, collated, and queried in ways previously unimagined. The results can be of great benefit. But the dark cloud in all this is our inability to predict what stories our data will tell, which increases anxiety over who gets to tell them. Systems able to manage this uncertainty and unease are a tall order whose solution requires that we flip the engineering model on its head. To be blunt, we need to begin at the end rather than the beginning.

In many respects, the model for effective privacy engineering cannot be public works like the Hoover Dam, concerned with resistance, rebar strength, and the heat of hydration of concrete, or a software program like Microsoft Word, built, in part, to correct spelling and grammar.

If the goal is to deliver on privacy, especially at the edge of the network as represented by the smartphone in the hands of its owner, a quite different model must emerge, one as fluid in its approach and design as it is hard and fast in its results. The one that comes most to mind was devised more than 60 years ago to solve the problem of delivering supplies to a constantly moving army. The result was the birth of Operations Research and an end to World War II.

As studied at Cornell University, Operations Research "deals with decisions involved in planning the efficient allocation of scarce resources to achieve stated goals and objectives under conditions of uncertainty and over a span of time."[3] That describes pretty well what may be the best approach to delivering on a promise of privacy—contingent upon shifting variables of time, place, platform, and purpose.

What is telling is that at the start, Operations Research was not a single discipline but rather a matrix of many. The necessity of many fields—manufacturing, transportation, topography, finance, and communications, to name a few—working together to solve a new problem on the battlefield may be the perfect metaphor for managing our privacy relationships today.

Although data are far from scarce, the usable insights they generate are exceptional; so much so that their pursuit has spawned whole new industries (e.g., Big Data).

And if privacy, by its definition, is personal, then each transaction we make must be tagged at a different level of care and concern.

The implications for privacy engineering are as clear as they are counterintuitive.

By focusing on the outcome of data use—less expensive health care, quicker oil and gas exploration, the most suitable advertising for an individual consumer—we can begin to design systems to be both focused and flexible.


Privacy engineering in the intelligence stage is crucial because information provided by or gathered about individuals often determines:

• What we build

• How we build it

• How it works

• How our customers use it

• How well it protects our customer or other persons involved

• The risks it may pose to our business and to future markets

Privacy engineering uses engineering principles and processes to build privacy controls and measures throughout system and data lifecycles (development, production, and retirement). Privacy is important to people impacted by the systems; privacy protection encourages trustworthiness and other factors that people expect when working with an enterprise or with an enterprise's systems. Privacy engineering will further assist in:

• Protection of customers and other people impacted by our systems and their data

• Improving the trust of people impacted by enterprises and their systems

• Developing secure and respectful computing that may drive more data sharing and engagement

• Gathering better information that will help create better tools

• Greater innovation and opportunity in the marketplace

All of these areas will be examined in this book. We begin our journey in Chapter 2 with a look at the foundation concepts and frameworks of privacy engineering.
