Bonus Content: Privacy’s Meaningful Purpose

A few years ago, I dreamed up the concept of “meaningful privacy” to better define the discussion around the broad topic of privacy. I noticed that not every piece of data is equal. Some things are kept private because there is a concern of actual harm if the information is publicized. Other things are kept private because of societal or cultural norms and traditions. Privacy is not an end in itself- we have it for the purpose of protecting information. However, different data has different value. Therefore, the value of privacy is relative, varying according to the data in question. One effect of this concept is to treat different breaches differently, according to the type (or value) of the data involved.

There is a huge and illuminating problem with this idea of “meaningful privacy”: just because someone didn’t steal anything from your house doesn’t mean you feel comfortable about a break-in. Although privacy is not an end in itself, it is intrinsically upsetting when our privacy is violated. The biggest fear is the potential for future violations: the fact that no harm came of one violation is no guarantee about the next. Furthermore, a past violation of privacy reveals a vulnerability, and thus the potential for future violations. With a diminished expectation of privacy, there is diminished privacy. Privacy is of little use if it cannot be relied upon.

Horizon: The Dawn of Zero Privacy?

Horizon: Zero Dawn is a problem because I don’t know which game I have to slide out of my top 5 in order to fit it into that list. (It might have to replace “Child of Light,” which pains me, but replacing any of them would pain me… maybe “Outlaws” will move to #6…) It’s an incredible game in its own right, with beautiful artwork, well-written characters, and genuinely fun gameplay. I find its story especially fascinating—and particularly relevant as we grapple with a framework for governing and living in an age of digital information and interconnected devices. Though its central technological focus is on Artificial Intelligence and the future of humanity, it touches a multitude of topics- including data privacy.

Although Judge Richard Posner famously decried privacy as a way for bad people to get away with bad things, privacy is important for personal development and free association. Privacy is essential to our culture, and it is only valuable inasmuch as it is protected and reliable. Our expectations of privacy follow us into our digital extensions. However, one of the best methods of securing privacy is impractical in the face of consumer demands for interconnection and convenience.

I. Can We Have Privacy by Design When We Demand Designs that Compromise our Privacy?

The Federal Trade Commission’s favored method for protecting privacy is “Privacy by Design.” In simple terms, this often means designing a product so that it relies on privacy as little as possible. After all, if no data is collected, there is no data to steal. However, there are serious questions about the feasibility of this approach in the face of consumer expectations for interconnected devices.
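
To make that intuition concrete, here is a minimal sketch (in Python, with invented field names and data) of what “collect less so there is less to protect” can look like in practice: data that never enters the system cannot later be stolen from it.

```python
# A hypothetical sketch of data minimization: strip out everything a
# feature does not strictly need *before* anything is stored.
# The event fields and allow-list below are invented for illustration.

ALLOWED_FIELDS = {"page", "timestamp"}  # the only fields this feature needs

def minimize(event: dict) -> dict:
    """Keep only allow-listed fields; location and device ID never persist."""
    return {key: value for key, value in event.items() if key in ALLOWED_FIELDS}

raw_event = {
    "page": "/checkout",
    "timestamp": "2017-12-01T10:15:00Z",
    "gps": (41.4993, -81.6944),   # not needed by the feature, so never stored
    "device_id": "a1b2c3",        # not needed by the feature, so never stored
}

stored_event = minimize(raw_event)
print(stored_event)  # {'page': '/checkout', 'timestamp': '2017-12-01T10:15:00Z'}
```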

Privacy by Design is a much better idea than the sophomoric approach of simply piling on security measures. Designing a house not to be broken into is better than just putting a good lock on the front door. To put it another way: think of it as building a dam without holes rather than trying to plug all of the holes after you finish building.

I’ve heard tech entrepreneurs talk about “The Internet of Things” at conferences for many years now. They talk about it like it’s a product currently in development, with an upcoming launch date we should be excited about- as though we could line up outside a retail store hours before the doors open to be the first to get some new tech device. This is not how our beloved internet was created. Massive networks are created piece by piece- one node at a time, one connection at a time. The Internet of Things isn’t a tech product that will abruptly launch in Q3 of 2019. It’s a web of FitBits, geolocated social media posts, hashtags, metadata, smart houses, Alexas and Siris, searches, click-throughs, check-ins, etc. The “Internet of Things” is really just the result of increasingly tech-savvy consumers living their lives while making use of connected devices.

That’s not to diminish its significance or the challenges it poses. Rather, it highlights that this “Coming Soon” feature is already here, growing organically. Given that our society is already growing this vast network of data, Privacy by Design seems like an impossible and futile task. The products and functions that consumers demand all require some collection, storage, or use of data: location, history, log-in information- all for a quick, convenient, personalized experience. One solution is for consumers to choose between optimizing convenience and optimizing privacy.

II. A Focus on Connected Devices

Horizon: Zero Dawn is a story deliberately situated at the boundary of the natural world (plants, water, rocks, trees, flesh and blood) and the artificial world (processed metals, digital information, robotics, cybernetics). As a child, Aloy falls into a cavern and finds a piece of ancient (21st century) technology. A small triangle that clips over the ear, this “Focus” is essentially a smart phone with Augmented Reality projection (sort of… JawBone meets GoogleGlass and Microsoft Hololens). This device helps to advance the plot, often by connecting with ancient records that establish the history of Aloy’s world (it even helps with combat and stealth!).

It’s also a privacy nightmare. The primary antagonist first sees Aloy, without her knowledge, through another character’s Focus. Aloy’s own Focus is hacked several times during the game. A key ally even reveals that he hacked Aloy’s Focus when she was a child and watched her life unfold as she grew up. (This ultimately serves the story as a way for the Sage archetype to have a sort of omniscience about the protagonist.) For a girl who grew up as an outcast from her tribe, living a near-solitary life in a cabin on a mountain, with the only electronic device within a hundred miles, she manages to run into a lot of privacy breaches. I can’t imagine what would happen if she tried to take an Uber from one village to the next.

Our interconnected devices accumulate astonishing volumes of data- sometimes deeply personal data. In a case heard by the Supreme Court this month, a man in Ohio had his location determined by his cell phone provider; the police obtained this information and used it in his arrest and subsequent prosecution. The case turns on whether law enforcement needs a warrant to access that kind of cell phone data. (This is different from the famous stalemate between the FBI and Apple after the San Bernardino shooting, when Apple refused an order to unlock the iPhone of a deceased criminal.) As connected devices become omnipresent, questions about data privacy and information security permeate nearly every facet of our daily lives. We don’t face questions about data the way that one “faces” a wall; we face these questions the way that a fish “faces” water.

From cell phone manufacturers to social media platforms, the government confronts technology and business in a debate about the security mechanisms that should be required (or prohibited) to protect consumers from criminals in myriad contexts and scenarios. In this debate, the right answer for one scenario is often the wrong answer for the next.

Conclusion: Maybe We Don’t Understand Privacy In a New Way, Yet

The current cycle of consumer demand for risky designs followed by data breaches is not sustainable. Something will have to shift for privacy in the 21st century. Maybe we will rethink some part of the concept of privacy. Maybe we will sacrifice some of the convenience of the digital era to retain privacy. Maybe we will rely more heavily on security measures after a breakthrough in computing and/or cryptography. Maybe we will find ways to integrate the ancient privacy methods of the 20th century into our future.

 

Privacy (as the Withholding of Information) in the Information Age

Business professionals in e-commerce talk about information like it is today’s fundamental commodity. Yet information— raw data— is less helpful than we tend to think. Privacy becomes harder to maintain in an era in which business and government think that more data is always better and that accruing data will solve problems. Information is necessary, but not sufficient, for solving problems and pushing progress along.

Lots of entities want information: governments want information about their citizens, employers want information about their employees, corporations want information about their consumers, etc. Such entities have always wanted information, but only recent technological developments have made it practical to obtain and organize that information. The biggest remaining barrier to such information collection is the ethical and legal concept of privacy. My contention is that the mere gathering of data is less helpful than the gatherers might think.

One way to think of this issue is to see human action as having two components: 1) an internal motivation or attitude and 2) an external display of action. So, if I purchase a large supply of plastic drinking cups, the store computers may recognize my purchase and correlate it to the kinds of other items people purchase with drinking cups: plastic cutlery, snack food, soda, and so forth. The store wants to predict my motivation by examining my action, correlating it with similar actions, and using inductive reasoning to sell me more things. But what if my motivation in buying many cups is to have a cup-stacking competition? Or to have a 2nd grade class plant lima beans? The problem with relying heavily on gathering information is that you can only make guesses about the internal state of the actor.
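
A minimal, hypothetical sketch of that style of inference is below. The transactions, item names, and the co_purchases helper are all invented for illustration; the point is that this kind of reasoning surfaces correlations in what was bought, while the buyer’s motivation never appears in the data at all.

```python
# A toy illustration of co-purchase correlation. All data here is invented.
from collections import Counter

transactions = [
    {"plastic cups", "plastic cutlery", "soda", "snack food"},  # a party
    {"plastic cups", "soda", "napkins"},                        # a party
    {"plastic cups", "plastic cutlery", "snack food"},          # a party
    {"plastic cups", "potting soil", "lima beans"},             # a 2nd grade science project
]

def co_purchases(target: str, baskets) -> Counter:
    """Count how often other items appear in baskets that contain `target`."""
    counts = Counter()
    for basket in baskets:
        if target in basket:
            counts.update(basket - {target})
    return counts

# The store sees *what* sells alongside cups...
print(co_purchases("plastic cups", transactions).most_common(3))
# ...but nothing in these records reveals *why* anyone bought them:
# a party, a cup-stacking competition, and a classroom project look the same.
```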

The debatable assertion is this: humans cannot be captured by data sets. Some (who probably favor Hume) may say they can, but even they must concede that the data set would have to be extremely, extremely large. Perhaps more importantly, some elements essential to that data set cannot be collected through transaction records, e-mails, Facebook “likes”, tweets, and all other collectable data. Seen in this way, a reasonable fear emerges: as entities gather data, they act on that data as though it were a more complete picture than it actually is. Another way to state this issue is “data does not explain itself.”

There are a few important takeaways about the limits of the power of data:

1) You don’t get to know people from their Facebook profiles.

2) Stores know what people buy, but not always why they buy them.

3) Privacy can protect both parties from an incomplete picture.

4) Data is a raw material. It must be processed with understanding, refined through meaning and context, and crafted with wisdom into usable information and then into intelligence.

5) Computer systems can record observations of fact and interact according to algorithm, but cannot “understand” any “significance” or “meaning” of any data.

NOTE: There is so much to this subject! I expect to return to it (probably repeatedly) in more specific settings to explore deeper nuances and applications of these issues.