Bonus Content: Privacy’s Meaningful Purpose

A few years ago, I dreamed up the concept of “meaningful privacy” to better frame the discussion around the broad topic of privacy. I noticed that not every piece of data is equal. Some things are kept private because there is a concern of actual harm if the information is publicized. Other things are kept private because of societal or cultural norms and traditions. Privacy is not an end in itself- we have it for the purpose of protecting information. However, different data has different value. Therefore, the value of privacy is relative, varying according to the data in question. One effect of this concept is to treat different breaches according to the type (or value) of the data in question.

There is a huge and illuminating problem with this idea of “meaningful privacy”: just because someone didn’t steal anything from your house doesn’t mean you feel comfortable about a break-in. Although privacy is not an end in itself, it is intrinsically upsetting when our privacy is violated. The biggest fear is the potential for future violations: the fact that no harm came of one violation is no guarantee about the next. Furthermore, a past violation of privacy indicates a vulnerability, and thus the potential for future violations. With a diminished expectation of privacy, there is diminished privacy. Privacy is of little use if it cannot be relied upon.


Horizon: The Dawn of Zero Privacy?

Horizon: Zero Dawn is a problem because I don’t know which game I have to slide out of my top 5 in order to fit it into that list. (It might be have to replace “Child of Light,” which pains me, but replacing any would pain me… maybe “Outlaws” will move to #6 …) It’s an incredible game in its own right, with beautiful artwork, well-written characters, and genuinely fun gameplay. I find its story especially fascinating—and particularly relevant as we grapple with a framework for governing and living in an age of digital information and interconnected devices. Though its central technological focus is on Artificial Intelligence and the future of humanity, it touches a multitude of topics- including data privacy.

Although Judge Richard Posner famously decried privacy as a way for bad people to get away with bad things, privacy is important for personal development and free association. Privacy is essential to our culture, and it is only valuable inasmuch as it is protected and reliable. Our expectations of privacy follow us into our digital extensions. However, one of the best methods of securing privacy is impractical in the face of consumer demands for interconnection and convenience.

I. Can We Have Privacy by Design When We Demand Designs that Compromise our Privacy?

The Federal Trade Commission’s favored method for protecting privacy is “Privacy by Design.” In simple terms, this often means designing a product to collect and retain as little data as possible. After all, if no data is collected, there is no data to steal. However, there are serious questions about the feasibility of this approach in the face of consumer expectations for interconnected devices.

Privacy by Design is a much better approach than the sophomoric one of simply piling on security measures. Designing a house not to be broken into is better than just putting a good lock on the front door. To put it another way: think of it as building a dam without holes rather than trying to plug all of the holes after you finish building.
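To make the dam metaphor concrete, here is a minimal sketch of the data-minimization idea at the heart of Privacy by Design. It is only an illustration- the feature, field names, and threshold are hypothetical- but it shows the principle: answer the one question a feature actually needs answered, and store a token instead of the raw identity.

```python
# A minimal sketch of data minimization (hypothetical feature and names).
# The "store check-in" feature only needs a yes/no answer, so precise
# coordinates are never recorded, and the stored record holds a
# pseudonymous token rather than a name.

import hashlib

def is_near_store(user_lat: float, user_lon: float,
                  store_lat: float, store_lon: float,
                  radius_deg: float = 0.01) -> bool:
    """Answer only the question the feature needs: 'near enough?'"""
    return (abs(user_lat - store_lat) <= radius_deg
            and abs(user_lon - store_lon) <= radius_deg)

def record_check_in(user_id: str, store_id: str) -> dict:
    """Keep a pseudonymous token; the raw identity never enters the log."""
    token = hashlib.sha256(f"{user_id}:{store_id}".encode()).hexdigest()[:16]
    return {"store": store_id, "visitor_token": token}

# The decision happens on the device; only the minimal record is kept.
if is_near_store(40.7128, -74.0060, 40.7130, -74.0055):
    print(record_check_in("user-1138", "store-42"))
```

A breach of that log exposes far less than a breach of a log full of names and GPS traces- which is the whole point of building the dam without holes in the first place.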

I’ve heard tech entrepreneurs talk about “The Internet of Things” at conferences for many years now. They talk about it like it’s a product currently in development with an upcoming launch date we should be excited about- like something we can line up for outside a retail store hours before the doors open so we can be the first to get some new device. This is not how our beloved internet was created. Massive networks are created piece by piece- one node at a time, one connection at a time. The Internet of Things isn’t a tech product that will abruptly launch in Q3 of 2019. It’s a web of FitBits, geolocated social media posts, hashtags, metadata, smart houses, Alexas and Siris, searches, click-throughs, check-ins, etc. The “Internet of Things” is really just the result of increasingly tech-savvy consumers living their lives while making use of connected devices.

That’s not to diminish its significance or the challenges it poses. Rather, this highlights that this “Coming Soon” feature is really already here, growing organically. Given that our society is already growing this vast network of data, Privacy by Design seems like an impossible and futile task. The products and functions that consumers demand all require some collection, storage, or use of data: location, history, log-in information- all for a quick, convenient, personalized experience. One solution is for consumers to choose between optimizing convenience and optimizing privacy.

II. A Focus on Connected Devices

Horizon: Zero Dawn is a story deliberately situated at the boundary of the natural world (plants, water, rocks, trees, flesh and blood) and the artificial world (processed metals, digital information, robotics, cybernetics). As a child, Aloy falls into a cavern and finds a piece of ancient (21st-century) technology. A small triangle that clips over the ear, this “Focus” is essentially a smartphone with Augmented Reality projection (sort of… Jawbone meets Google Glass and Microsoft HoloLens). This device helps to advance the plot, often by connecting with ancient records that establish the history of Aloy’s world (it even helps with combat and stealth!).

It’s also a privacy nightmare. The primary antagonist first sees Aloy- without her knowledge- through another character’s Focus. Aloy’s own Focus is hacked several times during the game. A key ally even reveals that he hacked Aloy’s Focus when she was a child and watched her life unfold as she grew up. (This ultimately serves the story as a way for the Sage archetype to have a sort of omniscience about the protagonist.) For a girl who grew up as an outcast from her tribe, living a near-solitary life in a cabin on a mountain, with the only electronic device in a hundred miles, she manages to run into a lot of privacy breaches. I can’t imagine what would happen if she tried to take an Uber from one village to the next.

Our interconnected devices accumulate astonishing volumes of data- and sometimes very personal data gets captured. In a case heard by the Supreme Court this month, a man in Ohio had his location determined by his cell phone provider; the police obtained this information and used it as part of his arrest and subsequent prosecution. The case concerns whether law enforcement needs a warrant to access cell phone data. (This is different from the famous stalemate between the FBI and Apple after the San Bernardino shooting, when Apple refused an order to unlock the iPhone of a deceased criminal.) As connected devices become omnipresent, questions about data privacy and information security permeate nearly every facet of our daily lives. We don’t face questions about data the way that one “faces” a wall; we face these questions the way that a fish “faces” water.

From cell phone manufacturers to social media platforms, the government confronts technology and business in a debate about the security mechanisms that should be required (or prohibited) to protect consumers from criminals in myriad contexts and scenarios. In this debate, the right answer to one scenario is often the wrong answer for the next scenario.

Conclusion: Maybe We Don’t Understand Privacy In a New Way, Yet

The current cycle of consumer demand for risky designs followed by data breaches is not sustainable. Something will have to shift for privacy in the 21st century. Maybe we will rethink some part of the concept of privacy. Maybe we will sacrifice some of the convenience of the digital era to retain privacy. Maybe we will rely more heavily on security measures after a breakthrough in computing and/or cryptography. Maybe we will find ways to integrate the ancient privacy methods of the 20th century into our future.


ISPs Tell Two Lies: “This is Fair” and “This Will Work”

Intro: The Parable of the Watermelon Stand

Once upon a time, two folks (Alphonzet and Balantanoid) decided to sell watermelons at a roadside stand. The two-step business model was: 1) buy watermelons for $1 apiece from a farm, then 2) transport them in their pickup truck to the roadside stand, where they sold the watermelons at a retail price of $1 apiece. After some time, accountant Balantanoid informed business partner Alphonzet that, due to the price of gasoline and other incidental business costs, they were actually losing money. Alphonzet reviewed the numbers and pondered, and then ventured a solution:

“Do you think we need a bigger truck?”

Businesses looking to buy consumer information from ISPs are like the characters in this story considering a bigger truck. More data isn’t what businesses need, and there is danger in believing otherwise. Furthermore, ISPs unjustly shirk the responsibility that ought to come with the entitlement to the data they intend to sell.
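To spell out the parable’s arithmetic (with hypothetical numbers, obviously): when the per-melon margin is zero and every trip costs money, no truck is big enough.

```python
# Hypothetical numbers from the parable: buy at $1, sell at $1, and pay
# a fixed cost per trip for gas and incidentals. A "bigger truck" changes
# only the volume, and volume cannot rescue a zero per-unit margin.

def trip_profit(melons: int, buy_price: float = 1.00,
                sell_price: float = 1.00, trip_cost: float = 20.00) -> float:
    return melons * (sell_price - buy_price) - trip_cost

for truck_size in (50, 500, 5000):
    print(f"{truck_size} melons -> profit {trip_profit(truck_size):.2f}")
# Every truck size loses exactly the trip cost; more cargo fixes nothing.
```

Swap “melons” for “data points” and the same logic applies: buying more inputs does not repair a business model that cannot turn those inputs into margin.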

I. Background.  Internet Service Providers Aren’t Satisfied With a de facto Monopoly

Internet service providers have no competitors and provide a borderline necessity. They can charge anything (and do) and provide a low-quality product and service (as they do), and customers will still pay them (and they do). This isn’t enough for them. The telecommunications industry has successfully lobbied Congress into repealing an FCC order that previously prevented the sale of tracked, identifiable consumer data to third parties.

Of course, ISPs are the only ones who can risk fighting their customers. Service providers operating on the internet can’t antagonize their customers because they are subject to fundamental concepts of free market capitalism: If they anger their customers, their customers will go elsewhere. ISPs don’t have “customers” in the traditional sense. They have “victims” or “hostages”- so it makes sense that ISPs wouldn’t worry about treating them like customers.

II. “This Will Work.” ISPs Already Lie to Consumers and Government- Now They Get to Lie to Businesses

I don’t know how many lies the telecommunications industry had to tell Congress to get the FCC’s rule repealed. Probably not many- after some generous donations, Congress rarely asks many questions or cares about the answers. But the lie that ISPs are relying on now is aimed at third-party companies: that (in the context of the aforementioned parable) a bigger truck will turn their watermelon business profitable. There are two likely outcomes of this business arrangement: either advertising will get better, more efficient, more streamlined, more effective, and benefit both advertiser and consumer, OR advertising will become more obnoxious, noisier, less useful, less relevant, more intrusive, and worse for consumers and advertisers alike.

In his NYT Op-Ed on this legislation, former FCC Chairman Tom Wheeler gives the example of ISPs selling data to car dealerships about which customers are visiting car websites, thus allowing car dealers to target more likely customers. One interpretation is that this will help car dealers only target relevant audiences, and customers will get better opportunities and information as customer-business connectivity is optimized. My experience is that this is supremely unlikely.

My most recent experience with targeted advertising suggests that the business model is not effective. I spent an evening looking for a new pair of shoes from online stores. The next day, ads for shoes showed up in my Facebook feed. But I had already bought shoes. I was no longer a potential customer in that market. No amount of advertising was going to persuade me to make a purchase, because the purchase was already complete.

More data doesn’t mean you understand your customer better. You need the right data- and ISPs just can’t provide that. Data science simply isn’t good enough yet. The algorithms consistently fail to capture human thought, intent, and desire. The greater danger in the loss of this privacy isn’t in other parties knowing who you are- it’s in other parties THINKING they know who you are.

This example reveals two facts that render third-party purchases of consumer data useless: first, a single data point or grouping of data points doesn’t tell you all of the important things about a consumer; and second, consumers move faster than companies. For the same reasons that all of us receive junk mail addressed to people who haven’t lived at an address for years (or even addressed to deceased persons), companies’ efforts to use consumer data are routinely ineffective. The myriad problems with over-reliance on big data are their own subject, but one that informs this issue.

The effort to make money off of violating privacy won’t work because companies aren’t equipped to turn data into sales.

III. “This Is Fair.” Justice Requires That ISPs Pick A Single Classification: Common Carrier or Private Enterprise

There is a doctrine in tort law that common carrier services like buses and trains have reduced duties to customers. Private carriers have more discretion about how to run their business, but have increased liability. In the famous tort case Palsgraf v. Long Island Railroad, a railroad company was not held liable when a passenger’s explosive package accidentally detonated, causing injuries. Part of the reasoning relied on the notion that the railroad was a common carrier, and such service providers are not liable for some acts of their customers because they have less discretion regarding their customers than a private carrier has.

This reasoning ought to be applied to internet service providers: ISPs can be either a common carrier or a private carrier, but must accept the responsibilities and limitations of whichever classification they choose.

If ISPs want the benefits of being private enterprises, they need to take on the liability commensurate with those benefits. The concept of safe harbors in the DMCA is predicated on the notion that ISPs are a sort of public utility or common carrier. ISPs that want the benefits of private business need to be liable for crimes and damages that common carriers would not be liable for.

ISPs believe they have a right to the data of their individual customers, such as their browser histories and app usage rates. If they are so interested in the private information of their customers, they should take on criminal liability for crimes committed by their customers, from piracy to identity theft to terrorism or child pornography. This is the burden of responsibility. If an ISP is truly entitled to the content of a customer’s online activity, they are responsible for that content. There is no entitlement without responsibility. This is a fundamental precept of justice that permeates the law.

If the ISP does not want to be liable for the crimes committed using their services, they must opt for the common carrier approach to providing internet and information services. The idea of ISP access to consumer data without responsibility to the consumer is not just offensive to privacy or comfort- it is offensive to the very concept of justice and fairness. It is the ISP getting something extra from a consumer in return for nothing. Forcibly taking from someone in exchange for nothing is the clearest possible understanding of theft.

Conclusion

The data that ISPs will sell to third parties is unlikely to make advertising substantially better, due to the challenges in execution. The larger issue is settling the classification of ISPs in the context of telecommunications law. ISPs can be either private enterprises or common carriers. They cannot continually shift their classification from moment to moment to suit convenience, reaping rewards and rejecting responsibility.

Update: ISPs earn their place… And they really have a cultural status.

Employer Facebook Checks: How the Law Struggles with Culture and Ignores Metaphysics

Questions of privacy in cyberspace cover a vast range of applications. One that I find interesting is the use of social media as a tool by potential employers to research prospective employees. This is interesting because it sits at the intersection of culture, technology, law, and metaphysics.

It is increasingly common for employers to check on a prospective employee’s Facebook page (or other social media). I like to use the case study of Stacy Snyder in this NYTonline article: http://goo.gl/bMw0Kl

The issue is that a student-teacher was dismissed over a photo on her MySpace page (that dates the example a bit, eh?) captioned “Drunken Pirate.” This situation becomes the image of concern: an employer delving into your personal (yet published) photo album to look for objectionable material.

Let me divide up the issues:

1) The legal and/or cultural claim to privacy. Before Facebook or MySpace, it would have been extraordinary for an employer to ask to see photos from your latest party as part of the application process (barring government security clearance checks). Although social media has allowed us to share such personal material with a wider range of friends, we are not culturally comfortable simply surrendering previously private/personal material to the entire public sphere.

2) Context is everything. Bill Watterson’s iconic character, the rascally six-year-old Calvin, once explained that people are wrong to assert that “photos never lie,” for, in fact, all they do is lie. To illustrate, Calvin clears one area of his room and puts on a tie to have himself photographed as a clean, tidy young boy (he is normally dressed in a t-shirt and has a notoriously messy bedroom). So it may be argued with Facebook photos, Tweets, etc.: Can a single snapshot, sentence, or post represent an individual- even partially? Can it be completely incorrect? Without further explanation, how badly can it be misinterpreted? This claim speaks not only to the protection of the poster, but also raises the question of whether investigating an applicant’s social media is truly helpful in obtaining accurate data about the applicant. A related issue here is the notion of performing identity online (see: Life on the Screen by Sherry Turkle). Many posts and photos may be uploaded not as a reflection of actual identity, but as an effort to entertain or amuse a particular audience.

3) The metaphysical puzzles of being and identity over time. One of the core points of the NYTonline article linked above is that the internet makes possible the storage of everything we say or do- FOREVER. One question is whether applicants ought to be judged by high school or college photos or posts. Indeed, the question is founded on an ancient metaphysical quandary: what is one’s relationship to one’s self over time? We have a cultural concept of “not being the same person” at age 15 as at age 30. Yet right now, many 30-year-old job applicants could be in the position of defending the digital traces left by their 15-year-old selves.

The final point to note here is that Facebook was not designed to be a massive social media platform through which employers scouted and screened applicants. It was a way for college (and later high school) students to communicate and make limited broadcasts to a select audience. It was a kid’s toy, really. To me, it still is- I think that’s why my generation sometimes feels weird that our parents have Facebook profiles. The platform was never made for “grown-ups” or “grown-up things.” That was an accident, and treating it otherwise is a mistake.

Privacy (as the Withholding of Information) in the Information Age

Business professionals in e-commerce talk about information like it is today’s fundamental commodity. Yet information- raw data- is less helpful than we tend to think. Privacy becomes harder to maintain in an era in which business and government think that more data is always better and that accruing data will solve problems. Information is necessary, but not sufficient, for solving problems and pushing progress along.

Lots of entities want information: governments want information about their citizens, employers want information about their employees, corporations want information about their consumers, etc. Such entities have always wanted information, but only recent technological developments have made it practical to obtain and organize that information. The biggest remaining barrier to such information collection is the ethical and legal concept of privacy. My contention is that the mere gathering of data is less helpful than the gatherers might think.

One way to think of this issue is to see human action as having two components: 1) an internal motivation or attitude and 2) an external display of action. So, if I purchase a large supply of plastic drinking cups, the store computers may recognize my purchase and correlate it to the kinds of other items people purchase with drinking cups: plastic cutlery, snack food, soda, and so forth. The store wants to predict my motivation by examining my action and correlating my action with similar actions and using inductive reasoning to sell me more things. But what if my motivation in buying many cups is to have a cup stacking competition? Or to have a 2nd grade class plant lima beans? The problem with relying heavily on gathering information is that you can only make guesses about the internal state of the actor.
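As a rough sketch of the inductive guesswork described above (the baskets and item names here are made up, not any store’s actual system), a recommender can only correlate visible actions; the internal motivation never appears in the data.

```python
from collections import Counter

# Hypothetical purchase baskets; each set is one shopper's transaction.
baskets = [
    {"plastic cups", "plastic cutlery", "soda", "snack food"},   # party host
    {"plastic cups", "soda", "napkins"},                         # party host
    {"plastic cups", "plastic cutlery", "snack food"},           # party host
    {"plastic cups", "lima bean seeds", "potting soil"},         # 2nd-grade class
]

def likely_companions(item: str, top_n: int = 3) -> list:
    """Guess what else a buyer of `item` wants, purely from co-occurrence."""
    counts = Counter()
    for basket in baskets:
        if item in basket:
            counts.update(basket - {item})
    return [other for other, _ in counts.most_common(top_n)]

# The store sees only the external action ("bought plastic cups").
print(likely_companions("plastic cups"))
# A cup-stacking competitor and a birthday-party host look identical here;
# the data records the purchase, not the reason for it.
```

The guess is often commercially useful, but it remains a guess built from external actions, not knowledge of the actor’s internal state.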

The debatable assertion is this: Humans cannot be captured by data sets. Some (who probably favor Hume) may say they can, but it must be conceded that the data set would have to become extremely, extremely large. Perhaps more importantly, some elements essential to that data set cannot be collected through transaction records, e-mails, Facebook “likes”, tweets, and all other collectable data. Seen in this way, a reasonable fear emerges: as entities gather data, they act on that data as though it is a more complete picture than it actually is. Another way to state this issue is “data does not explain itself.”

There are a few important takeaways about the limits of the power of data:

1) You don’t get to know people from their Facebook profiles.

2) Stores know what people buy, but not always why they buy them.

3) Privacy can protect both parties from an incomplete picture.

4) Data is a raw material. It must be processed with understanding, refined through meaning and context, and crafted with wisdom into usable information and then into intelligence.

5) Computer systems can record observations of fact and interact according to algorithm, but cannot “understand” any “significance” or “meaning” of any data.

NOTE: There is so much to this subject! I expect to return to it (probably repeatedly) in more specific settings to explore deeper nuances and applications of these issues.

Cheat Codes, Privacy, and Disobedience: Generation Y’s Perspective.

I saw a YouTube video of some teens who had devised a most wise and useful way to spend their time. They call it “Gallon Smashing.” The idea is simple: walk through a not-too-crowded supermarket with one or two plastic gallon jugs of some liquid (milk will do). When no one is looking, smash the gallons on the floor and immediately pretend to have tripped and caused the messy spill accidentally. It’s a way to destroy property and get away with it- in fact, you come out of it looking like the hapless, innocent victim. It has all of the hallmarks of Generation Y: dastardly but petty destructiveness while deceiving others into letting you take on the role of the innocent victim.

My assessment of at least part of this phenomenon is that GenY has the power-fighting spirit of recent generations without the willingness to come face to face with that power. Generation X would have walked into the store, smashed the gallon of milk without any pretense, given the finger to onlookers, and walked back out or gone peacefully with the security officer (because GenX was indifferent enough not to care about getting in trouble). The protesters that preceded GenX would have entered the store with a megaphone, announced their destruction of the gallon of milk and the political motivations behind it, burned a draft card and/or bra, and continued to make a scene until several police officers used tear gas and a fire hose to subdue and detain them.

But Generation Y? We’re used to anonymity. We feel entitled, not to fight against things or to get things, but to get away with things. We don’t fight the system head on, we don’t glare indifferently at the system with our middle fingers raised, but we certainly don’t support the system more than any previous generation did. When our adolescents fight authority, they do it with a smile and overtly expressed support, while sneaking decisively and quietly behind authority’s back. I think this goes hand-in-hand with a generation that grew up with the anonymity of the internet. Our generation grew up with screen names and cheat codes. Previous generations saw a need to be in only one place at a time and to be only one person at a time (though you could be a different person hour-to-hour). This generation attempts multiple existences, multiple states of mind, simultaneously. Perhaps formative years spent embracing a Cyberspace that bends previously accepted rules of time and space lead to a sense of duplicity as commonplace and identity as detachable and replaceable. Perhaps, as children, we used too many “God Mode” cheats: we became too used to invulnerability and doing whatever we wanted. (see also: Brené Brown’s TED talk on “The Power of Vulnerability”)

Of course, for all their deceptive ploys and crafty planning, GenY still posts their exploits to YouTube and Facebook, so it’s hard to be too worried about them as a threat. There is a contradiction in the rising generation: a need to publicize the secretive. The dilemma for them is the paradoxical need to be underground megastars, widely known among only a select few. They might be more deceitful and dishonest than previous generations, but they also feel a need to overshare exploits.

“Everyone wants to pull off the crime of the century…. /And get away with it. Get away with it. We Americans are freedom loving people, and nothing says freedom like getting away with it. /We went from Billy the Kid to Richard Nixon, Enron, Exxon, O.J. Simpson… /We used to dream about heroes, but now it’s just how to beat the system.” – Guy Forsyth, “It’s Been A Long, Long Time”