In Both Overcooked And The GDPR, Execution Matters More Than Ingredients

I deliberately avoided playing Overcooked for a long time because so many reviews joked about the fights it causes among friends. Now that I’ve played it, I barely understand why it’s such a divisive experience for so many people. The game is charming and delightfully fun. Players work together in kitchens filled with obstacles (food and tables often move during the round, forcing players to adapt) to prepare ingredients and assemble meals for a hungry restaurant– though the diners are sometimes floating on lava floes and sometimes… the diners are penguins. The game is about coordinating and communicating as you adapt to changes within the kitchen. Maybe the reason so many people throw rage fits during this game is that they are not good at coordinating an effort and communicating effectively. In any case, the game isn’t about food so much as it’s about kitchens: it doesn’t focus on the ingredients so much as it teaches the importance of working together in chaotic situations.

People are focusing a lot on the ingredients of the new EU data privacy law– particularly the consumer protection rights enumerated in it. However, there is very little talk about the bulk of the law, which is aimed at coordinating the enforcement and monitoring mechanisms that will try to secure those consumer rights. The rights listed in the GDPR are great ingredients– but as Overcooked teaches, it takes both execution and ingredients to make a good meal.

Supervisory Authority: How We Get From Ingredients to Meal

I’ve read a lot of articles about the General Data Protection Regulation, and I notice two common points in almost all of them: 1) the GDPR lists data privacy rights for consumers, and 2) this is a positive thing for consumers. However, after reading the entire law, I think this is a gross oversimplification. The most obvious point that should be added is the overwhelming portion of the statute devoted to “Supervisory Authorities.” The GDPR may list a lot of consumer rights, but it also specifically details how these rights are to be enforced and maintained. The law prescribes a coordinated effort between controllers, processors, supervisory authorities, and the European Data Protection Board.

As described in Article 51(1), a supervisory authority is a public authority “responsible for monitoring the application of this Regulation, in order to protect the fundamental rights and freedoms” that the GDPR lists. Each EU member state is required to “provide for” such an authority. I can only speculate that this would look like a small, specialized government agency or board. The supervisory authority is required to work with the various companies that hold and process data (“controllers” and “processors” in the GDPR) to ensure compliance and security. It is responsible for certifications, codes of conduct, answering and investigating consumer complaints, monitoring data breaches, and other components of a comprehensive data privacy program. The supervisory authority must constantly and actively ensure that the rights in the GDPR are made real.

If the supervisory authority can’t coordinate the effort with the controllers and processors, the rights in the GDPR are just delicious ingredients that were forgotten about and burned up on the stove.


Tooth And Tail: Lessons in Planning With Realistic Expectations

Tooth and Tail is simple. It has to be, because the game designers had a very challenging goal: make a real-time strategy game that is reasonably playable on a console. Real-time strategy games are notorious for demanding high-speed, complex inputs (professional Starcraft players’ fingers perform over 400 actions per minute) that are simply not possible within the constraints of a console controller (even with all of the buttons they’ve added since Nintendo produced the perfect game controller in 1990). But the designers were smart: they looked realistically at the constraints of the system and crafted the game to fit them. The result is a playable, enjoyable game about a Soviet-revolution-inspired rodent uprising on a farm. The designers of in-house corporate programs and databases need to learn the same realism about the actual uses of their programs.

I. Lesson in Project Design: Accept the Probabilities of Disaster so you can Plan for Prevention; Don’t Plan for Immortality and Invulnerability. (#dontbeateen)

In the digital age, there is an increased focus on preventing and eliminating problems and errors. The promised outcomes of flawless perfection are enticing, but the reality of inevitable problems demands that more effort be put into managing problems and recovering from disasters.

Computers amplify the speed and scale of what people can do. They make it easier to do more, faster– and that includes making bigger mistakes. Years after a British woman got 15 minutes of fame for accidentally ordering 500kg of chicken wings, Samsung accidentally made a $105 billion ghost.

Samsung Securities Co (a financial services company owned by conglomerate Samsung Group) tried to pay a dividend to its employees, but accidentally gave them shares instead. The 1,000 WON dividend became a 1,000 SHARE distribution, creating over $100 billion in new shares. Then some employees immediately sold those shares. A lot of safety measures failed in this story. The program should have been able to calculate that this order totaled over 100 trillion WON, more than 30 times the entire company’s value. A second human should have checked the work for simple, obvious errors wherever there is potential for this level of damage (anything at a company-wide level for a publicly-traded international corporation would certainly qualify). Several departments should have reviewed the work (compliance, risk, accounting, finance, legal—almost anyone!). Samsung’s own internal compliance should also have prevented the sale of the ghost shares.
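To make that first failure concrete, here is a minimal sketch (illustrative numbers and names of my own invention, not Samsung’s actual system) of the kind of plausibility check that should have halted the order before execution:

```python
# A hypothetical plausibility check: block any company-wide distribution whose
# total value is implausibly large relative to the company's market cap.

MARKET_CAP_KRW = 3_600_000_000_000  # assumed market cap in won, purely illustrative

def validate_distribution(recipients: int, units_each: float, unit_value_krw: float) -> None:
    """Raise an error (forcing human review) if an order's total value dwarfs the company."""
    total_krw = recipients * units_each * unit_value_krw
    if total_krw > MARKET_CAP_KRW:
        ratio = total_krw / MARKET_CAP_KRW
        raise ValueError(
            f"Order totals {total_krw:,.0f} KRW ({ratio:.0f}x market cap); "
            "halting for human review."
        )

# A 1,000-won-per-share dividend passes quietly; a 1,000-share-per-share
# distribution (at tens of thousands of won per share) fails loudly instead
# of settling into employee accounts.
```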

II. A Lesson in Categorical (Or Macro) Errors: Some Mistakes are Annoying, Others Are Fatal. Design to Catch and Prevent, Not Headline and Damage Control. (#dontbeaceleb)

Mistakes happen a lot when computers are involved. Sometimes it’s the user, sometimes it’s a problem in the code. But when a user catches a problem, they can assess it in a broader context and determine just how bad the mistake is. A bigger mistake is simply more obvious to a human than to a computer.

Many years ago, a friend of mine got on a flight and found someone else sitting in his designated seat. Not wanting to cause trouble, he simply took the empty seat next to his designated one and prepared for the flight.  As the crew prepared for taxi and takeoff, a flight attendant welcomed passengers to their non-stop service to their destination city. Upon hearing this announcement, the woman next to my friend hurriedly gathered her belongings and fled the plane.

She wasn’t in the wrong seat. She was on the wrong plane.

Computer programs don’t intuitively differentiate between the severity of errors: the wrong plane and the wrong seat are just two errors if you’ve never flown and don’t have a broad concept of travel or the context of moving around a world. To a computer, being in the right seat is still pretty good, just like executing a financial order with the correct number is pretty good– even if the number is in the wrong field or tied to the wrong variable. What humans easily grasp, computers are often unlikely to infer. The right detail at a micro-level cannot remedy a catastrophic error at a macro-level.
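A toy sketch of that gap (hypothetical names and flight numbers throughout): a validator that only checks the micro-detail approves a macro-catastrophe without blinking.

```python
# Hypothetical boarding checks: the micro-check passes even on the wrong plane.

def seat_is_correct(ticket_seat: str, occupied_seat: str) -> bool:
    """Micro-level check: is the passenger in the seat printed on the ticket?"""
    return ticket_seat == occupied_seat

def journey_is_correct(ticket_flight: str, boarded_flight: str,
                       ticket_seat: str, occupied_seat: str) -> bool:
    """Macro-level check first: the right seat on the wrong plane is still wrong."""
    return ticket_flight == boarded_flight and seat_is_correct(ticket_seat, occupied_seat)

print(seat_is_correct("14C", "14C"))                       # True, even on the wrong plane
print(journey_is_correct("UA101", "UA202", "14C", "14C"))  # False: wrong plane caught
```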

User errors are inevitable. Programming errors are likely. The more we rely on computers, programs, and apps for the things that allow our lives to function, the more likely it is that our lives will be disrupted by programmer or user errors.

III. The Solution: Make The Programs Flexible, and Make Problems Fixable.

Tooth and Tail’s success is rooted in the realism of its designers, who sacrificed dreams of a more complex (and unplayable) game for the right game– one that fit the actual constraints and experience of the player. Designing with the actual user’s experience in mind, with special consideration for what can go wrong, becomes more important for project designers and programmers every day.

There is an increasing drive to use computers to prevent any errors, mistakes, or problems. However, these solutions only make problems worse because they decrease flexibility in and around the program. The solution is to move in the opposite direction: programs should play less of a role in trying to self-regulate and self-repair, while users and programmers take a larger role in guiding and overseeing them.

But wouldn’t this much red tape be time-consuming? Wouldn’t it be inefficient to invest so much effort in a simple dividend payment? It would take time and resources, yes– but efficiency is relative to scope (among other factors). It certainly appears inefficient if 6 people spend 10 minutes each looking at the same work and find no error; we would conclude that a full hour of productivity was wasted. However, if those 6 people took 10 minutes each and found a problem that would have cost 1,000 hours of productivity had it gone undiscovered, we would conclude a net gain of 999 hours of productivity.
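Put as a back-of-the-envelope expected-value calculation (the error rate below is my assumption, purely for illustration), the review step pays for itself whenever costly errors aren’t vanishingly rare:

```python
# Back-of-the-envelope: is a human review step worth its cost? (Illustrative numbers.)
reviewers, minutes_each = 6, 10
review_cost_hours = reviewers * minutes_each / 60   # 1.0 hour, paid on every review
error_cost_hours = 1_000                            # productivity lost if an error ships
p_error = 0.01                                      # assumed: 1% of orders contain one

expected_net_hours = p_error * error_cost_hours - review_cost_hours
print(f"Expected net savings per reviewed order: {expected_net_hours:+.1f} hours")  # +9.0
```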

Although problems like these cannot be entirely prevented or eliminated, they can be contained and managed. If a person is on the wrong plane, they can quickly recognize the mistake and work on a solution. People will still board the wrong plane from time to time, but they don’t have to end up in the wrong city as a result. Similarly, employees will make occasional typos and errors in accounting and payroll, but financial markets don’t have to be rocked as a result.

Bonus Content: Privacy’s Meaningful Purpose

A few years ago, I dreamed up a concept of “meaningful privacy” to better define the discussion around the broad topic of privacy. I noticed that not every piece of data is equal. Some things are kept private because there is a concern of actual harm if the information is publicized. Other things are kept private because of societal or cultural norms and traditions. Privacy is not an end in itself– we have it for the purpose of protecting information. However, different data has different value, so the value of privacy is relative, varying according to the data in question. One effect of this concept is to treat different breaches according to the type (or value) of data involved.

There is a huge and illuminating problem with this idea of “meaningful privacy”: just because someone didn’t steal anything from your house doesn’t mean you feel comfortable about a break-in. Although privacy is not an end in itself, it is intrinsically upsetting when our privacy is violated. The biggest fear is the potential for future violations: that no harm came of one violation guarantees nothing about the next. Furthermore, a past violation of privacy indicates a vulnerability, and thus the potential for future violations. With a diminished expectation of privacy, there is diminished privacy. Privacy is of little use if it cannot be relied upon.

Horizon: The Dawn of Zero Privacy?

Horizon: Zero Dawn is a problem because I don’t know which game I have to slide out of my top 5 in order to fit it into that list. (It might have to replace “Child of Light,” which pains me, but replacing any would pain me… maybe “Outlaws” will move to #6…) It’s an incredible game in its own right, with beautiful artwork, well-written characters, and genuinely fun gameplay. I find its story especially fascinating– and particularly relevant as we grapple with a framework for governing and living in an age of digital information and interconnected devices. Though its central technological focus is on Artificial Intelligence and the future of humanity, it touches a multitude of topics– including data privacy.

Although Judge Richard Posner famously decried privacy as a way for bad people to get away with bad things, privacy is important for personal development and free association. Privacy is essential to our culture, and it is only valuable inasmuch as it is protected and reliable. Our expectations of privacy follow us into our digital extensions. However, one of the best methods of securing privacy is impractical in the face of consumer demands for interconnection and convenience.

I. Can We Have Privacy by Design When We Demand Designs that Compromise our Privacy?

The Federal Trade Commission’s favored method for protecting privacy is “Privacy by Design.” In simple terms, this often means designing a product to collect and rely on as little data as possible. After all, if no data is collected, there is no data to steal. However, there are serious questions about the feasibility of this approach in the face of consumer expectations for interconnected devices.

Privacy by Design is a much better idea than the sophomoric approach of simply piling on security measures. Designing a house not to be broken into is better than just putting a good lock on the front door. To put it another way: think of it as building a dam without holes rather than trying to plug all of the holes after you finish building.
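As a minimal sketch of the principle (a hypothetical example of mine, not the FTC’s formal framework): a service that needs to recognize returning users doesn’t have to store who they are. A keyed one-way hash yields a stable token with nothing worth stealing behind it.

```python
# Data minimization sketch (hypothetical): store an opaque token derived from
# an identifier instead of storing the identifier itself.
import hashlib
import hmac

SERVICE_KEY = b"per-service-secret"  # assumed: unique per service, stored separately

def returning_user_token(email: str) -> str:
    """One-way token: recognizes a returning user without retaining their email."""
    return hmac.new(SERVICE_KEY, email.lower().encode(), hashlib.sha256).hexdigest()

# Only the token is stored. If the database leaks (but the key does not),
# no email addresses leak with it.
```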

I’ve heard tech entrepreneurs talk about “The Internet of Things” at conferences for many years now. They talk about it like it’s a product currently in development, with an upcoming launch date we should be excited about– like we can line up outside a retail store hours before the doors open to be the first to get some new device. That is not how our beloved internet was created. Massive networks are created piece by piece– one node at a time, one connection at a time. The Internet of Things isn’t a tech product that will abruptly launch in Q3 of 2019. It’s a web of FitBits, geolocated social media posts, hashtags, metadata, smart houses, Alexas and Siris, searches, click-throughs, check-ins, etc. The “Internet of Things” is really just the result of increasingly tech-savvy consumers living their lives while making use of connected devices.

That’s not to diminish its significance or the challenges it poses. Rather, it highlights that this “Coming Soon” feature is already here, growing organically. Given that our society is already growing this vast network of data, Privacy by Design can seem like an impossible, futile task. The products and functions consumers demand all require some collection, storage, or use of data– location, history, log-in information– all for a quick, convenient, personalized experience. One solution is for consumers to choose between optimizing for convenience and optimizing for privacy.

II. A Focus on Connected Devices

Horizon: Zero Dawn is a story deliberately situated at the boundary of the natural world (plants, water, rocks, trees, flesh and blood) and the artificial world (processed metals, digital information, robotics, cybernetics). As a child, Aloy falls into a cavern and finds a piece of ancient (21st-century) technology. A small triangle that clips over the ear, this “Focus” is essentially a smartphone with augmented-reality projection (sort of… Jawbone meets Google Glass and Microsoft HoloLens). The device helps advance the plot, often by connecting with ancient records that establish the history of Aloy’s world (it even helps with combat and stealth!).

It’s also a privacy nightmare. The primary antagonist first sees Aloy– without her knowledge– through another character’s Focus. Aloy’s own Focus is hacked several times during the game. A key ally even reveals that he hacked Aloy’s Focus when she was a child and watched her life unfold as she grew up. (This ultimately serves the story as a way for the Sage archetype to have a sort of omniscience about the protagonist.) For a girl who grew up as an outcast from her tribe, living a near-solitary life in a mountain cabin with the only electronic device within a hundred miles, she manages to run into a lot of privacy breaches. I can’t imagine what would happen if she tried to take an Uber from one village to the next.

Our interconnected devices accumulate astonishing volumes of data– sometimes very personal data. In a case heard by the Supreme Court this month, a man in Ohio had his location determined from his cell phone provider’s records; the police obtained this information and used it as part of his arrest and subsequent prosecution, and the case turns on when law enforcement needs a warrant to access cell phone data. (This is different from the famous stalemate between the FBI and Apple after the San Bernardino shooting, when Apple refused an order to unlock the iPhone of a deceased criminal.) As connected devices become omnipresent, questions about data privacy and information security permeate nearly every facet of our daily lives. We don’t face questions about data the way one “faces” a wall; we face them the way a fish “faces” water.

From cell phone manufacturers to social media platforms, the government confronts technology and business in a debate about the security mechanisms that should be required (or prohibited) to protect consumers from criminals in myriad contexts and scenarios. In this debate, the right answer to one scenario is often the wrong answer for the next scenario.

Conclusion: Maybe We Don’t Understand Privacy in a New Way Yet

The current cycle of consumer demand for risky designs followed by data breaches is not sustainable. Something will have to shift for privacy in the 21st century. Maybe we will rethink some part of the concept of privacy. Maybe we will sacrifice some of the convenience of the digital era to retain privacy. Maybe we will rely more heavily on security measures after a breakthrough in computing and/or cryptography. Maybe we will find ways to integrate the ancient privacy methods of the 20th century into our future.

 

What the Internet of Things Can Learn from “The Order: 1886”

Great (Sounding, Looking) Potential

The Order: 1886 has great graphics but is a poor game. Just because the technology involved is cutting-edge doesn’t mean the final product is good. The Internet of Things relies on some cutting-edge technology and novel ideas, but that doesn’t mean the final product is worthwhile.

I’ve been hearing about the “Internet of Things” for several years now. Middle-aged entrepreneurs are just sure that this is “the next big thing”– except it’s going to be bigger than the car or the light bulb. From what I’ve seen, IoT is a glossy, shiny, pretty gimmick that hasn’t shown it’s poised to solve problems consumers actually feel they have. So far, we don’t think a fridge that buys eggs for us is really what’s missing in our lives.

Having sophisticated technology isn’t the same as having a great (or even marketable) tech product, just as having glossy graphics isn’t the same as having a good (or even marketable) game. Both IoT and The Order: 1886 are impressive at a glance, but fail to live up to expectations as one spends more time with them.

Burger King Sets Itself Up For Trolling

The broad IoT idea continues to reveal vulnerabilities and half-thought-out applications. A few months ago, Burger King aired an ad in which an actor asked viewers’ smartphones to read aloud the first paragraph of the Wikipedia page about Burger King’s flagship product, The Whopper. The completely predictable result was that people started vandalizing the Wikipedia page in question, leading the ad to tell people that The Whopper contained humans and cyanide.

There’s a lot I could go into about this example, especially about troll behavior and the weaknesses of IoT’s reliance on unsecured nodes. I want to highlight that the problem wasn’t about hacking into Burger King or Android systems. There are some concerns with IoT and that sort of hacking, but there’s another problem: entrepreneurs rely on the web without knowing what 4chan is, or without ever having been verbally abused by a stranger for the length of an entire League of Legends match. That is a mistake.

This example also illustrates why IoT hasn’t gotten traction: it’s still a gimmick that breaks often. Even at its best, IoT is a fun and surprising answer to a question no one asked. The best case for Burger King’s commercial is that it surprised a few consumers– but it also stirred fears about privacy and security in doing so. The success of IoT still hangs on the uncomfortable reality of diminishing personal privacy, and many consumers haven’t completely reconciled leaving the past with entering the future.

The Order: 1886 Fails as a Game; IoT Still Fails as a Tech Product

One of the reasons people were so angry about The Order: 1886 is that the trailers looked so good. People bought into the promise and the hype, and the game failed to deliver in meaningful ways. Similarly, the glossier the presentations about IoT get, the more consumers will feel the gap when using it produces no meaningful impact.

It’s the applications that go on top of the tech that really matter. Platforms and apps that balance consumers’ feelings about privacy and security are the only things that can bring about the kind of pervasive, omnipresent IoT about which I keep hearing (excited and vague) presentations.

Things that look really good but don’t do anything are called art. Things that do something useful are called products. Usefulness is not the sole factor in a product’s quality or marketability, but it is important– especially for anything that wants to be more than a fad or gimmick that ends up with a discount sticker in the bargain bin.

Are Trademarks a Data Security Alternative to Sad, Weak, Outdated Copyrights?

If you’ve been on the web for a while, you’ve seen an advertisement that mimics the user interface of the website you’re viewing– or maybe an ad with a fake close button, where clicking it just navigates you to the advertised page. These are blatant ways to trick consumers into taking actions they don’t want to take. Sometimes, these inadvertent actions can expose users to security threats such as malware.

Despite all of the focus on applying copyright law to the internet, I wonder if there are hints of trademark and trade dress protections that could become relevant to data privacy issues. I will cautiously, even timidly, explore a few of those possibilities (which several others have explored over the last few years).

I. Trademarks: When it Comes to Data Privacy, Accept No Imitations.

Trademarks have a simple purpose: to let consumers know the origin of a good or service. Trademarks are often a word, phrase, or image (logo), but can also be a sound or smell (on rare occasion, they can get a bit more abstract).

A major category of trademark infringement is counterfeiting. That $20 “ROLEX” watch from the guy in the alley? That’s a counterfeit (sorry), and one of the legal issues involved in the sale of that watch is the use of a trademark without the legal right to use it. There haven’t been a lot of counterfeit websites on the internet, especially since SSL and other authentication processes got better. However, there are plenty of imitation apps and games. One of the reasons such apps and games fail and are quickly removed from distribution is that they infringe trademarks.

However, some countries do not have the same standards of trademark (or copyright) enforcement. Consider an imitation League of Legends game, lampooned here. At the end of the video, the player says, “Oh, and it’s also a virus,” as his security software reports malware after playing the game. This humorously underscores the point that many infringing* products pose a security and privacy threat. Using trademark law to limit the proliferation of readily accessible, easily confused knock-offs is a valuable way to maintain computer security for consumers.

II. Trade Dress: No One Really “Owns” That Icon… But You Know Who Owns That Icon.

Trade dress is a sort of sub-category of trademark. It’s rarely talked about or used, but it can be thought of as the totality of design and aesthetics that go into a product, place, or service and lead consumers to identify the source. Color palette, patterns, shapes, and other factors go into the evaluation of trade dress. Crucially (and perhaps fatally), elements of a trade dress must be considered “non-functional.” For example, the major case in trade dress, Two Pesos v. Taco Cabana, concerned a Tex-Mex restaurant that used the same colors and layout as another Tex-Mex restaurant.

Here’s the controversial idea I think deserves consideration: could misleading, camouflaged web content be considered an infringement of trade dress? (Think of the kinds of ads that make you believe you’re clicking not on an ad but on some piece of actual content on the site– especially fake navigation buttons that match the site’s real navigation icons.)

The reason I look to trade dress for a solution is that icons and interfaces, even stylized ones, generally fall outside trademark, copyright, and patent protections. Furthermore, websites are increasingly treated as the digital equivalent of a business’s stores and offices– so much so that designs and layouts can come to function as that business’s trade dress. Thus, there is a gap in the legal protection of user interfaces, and a need to cover that gap.

(Treating websites as subject to trade dress might have the added benefit of discouraging UX and UI designers from fiddling with the location and arrangement of navigation tools every other month just to justify their paycheck. And that’s the kind of change this world really needs.)

Conclusion: Trademark Protection is Already Working, Trade Dress is Still Vague and Untested

Trademark law is already quietly making the digital ecosystem a little safer by weeding out threatening knock-off games and apps. I think there’s a case to be made for applying trade dress to websites and UIs, but it would be a novel application, and courts may be hesitant to apply the law so creatively.

 

* “300 Heroes” infringes both copyrights and trademarks, but it’s the funniest example.

 

Her Data Is Part of Her Story, But Her Story is not Just Her Data.

“Her Story” is a great example of how piecing together bits of information can create a picture of a person or an event. It is also an example of some of the limits of that picture.

Hack Her Data, Hack Her Story

“Her Story” is difficult to describe or classify as a game. It’s a little like trying to find and organize the pieces of a detective novel. The game doesn’t give the player much direction; part of the game is the discovery of the game itself. The player searches a police database for short video clips from several police interviews with a woman. No context is given for why the woman was interviewed or why the player is searching the database. However, by finding and watching clips, the player gains clues that enable new searches. This cycle of searching and discovery is the core mechanic of the game.
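The loop is easy to picture as a toy keyword index (a sketch of the mechanic as described, with invented transcripts– not the game’s actual code): each search returns the clips whose transcripts contain the term, and words heard in those clips become the next queries.

```python
# Toy version of the search-and-discover loop (hypothetical transcripts).
clips = {
    "clip_01": "i met him at a bar the mirror behind the bottles",
    "clip_02": "the mirror in the guest room was broken that week",
    "clip_03": "she said the guitar in the attic was never hers",
}

def search(term: str) -> list[str]:
    """Return the IDs of clips whose transcript mentions the search term."""
    return [cid for cid, text in clips.items() if term.lower() in text.split()]

print(search("mirror"))  # ['clip_01', 'clip_02'] -> watching these suggests "guest"
print(search("guest"))   # ['clip_02'] -> and the chain of clues continues
```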

Hacking to Learn

Hacking can mean a lot of things, but it is broadly about investigation (sometimes, an investigation that breaks some laws). It can be done for a wide range of reasons and can take many different forms– many of them legal, some of them even legitimate businesses. Regardless of the specific details, hacking always involves exploring the possibilities and limits of a system in order to learn or discover something. In “Her Story,” the hacking is learning what the in-game database can reveal that will help the player piece together a coherent string of events and characters.

The Limits of Hacking

Even after hacking together all of “Her Story,” something about the picture is incomplete. Why is the player watching these interviews? The game answers this after the player pieces together enough of “Her Story,” but hacking a person’s data doesn’t necessarily answer all of the questions about that person. For most criminal hackers, the pieces of data hold enough of the story: credit card numbers, bank accounts, social security numbers, addresses, birth dates, etc. But sometimes we need more than a collection of data about a person, and those are often the cases where believing data too blindly causes problems– from legal decisions in courts and policies to judgments in our interpersonal relationships. As mountains of data pile up for each of us, the temptation to describe and explain people using that data also grows. The data has a lot of appeal because it can measure and evaluate some things very effectively. But this effort to make life more efficient brings at least two potential drawbacks: first, the data can be misleading in myriad ways; second, the data seems so powerfully scientific and sound that questioning it (or its interpretation) can become almost taboo.

Her Story

There will always be hackers trying to steal financial information and identities. But that threat is known and recognized, so experts fight against it and consumers take protective measures. The data we give to companies, employers, and governments is riddled with subtler pitfalls, and blind faith in big data will amplify those problems. In “Her Story,” twists emerge as the player pieces the plot together. After enough of the story is assembled, the game asks the player if “you understand why [the woman] did what she did.” I’m not sure any collection of data can ever really answer that.