Richards applied the “Page 99 Test” to his new book, Why Privacy Matters, and reported the following:
From page 99:

Sometimes there is little or nothing we can do to prevent others from disclosing information about us. This can happen when companies set up pricing systems that rely on information disclosure, like “safe driver” discounts for car insurance contingent on your agreeing to have a black-box data recorder in your car, especially if such boxes were to become standard in passenger cars. Or when your child’s school decides to use a “learning management system” or other software that has privacy practices only the school can agree to. Or when a company voluntarily discloses data it collected about you to the government. Or when someone discloses their genetic data to a company, which, since blood relatives have very high genetic similarities, means they have also shared sensitive information about their close family members.

This last example is how the notorious murderer and rapist known as the Golden State Killer was caught in 2018, using data from GEDMatch, a basic Florida website that allowed people to upload their genetic profiles to help with genealogical searches and filling in blank spots in family trees. Police took old genetic material from crime scenes and uploaded the sequenced genome to GEDMatch. That produced a pool of potential relatives of the killer, which the police used to identify Joseph James DeAngelo, a seventy-two-year-old man living in Citrus Heights, California. (They also confirmed his identity using DNA of his on a tissue they found in his trash, but that’s a different privacy issue.) And while it might be hard to muster much sympathy for the privacy travails of a serial killer, the Golden State Killer example nicely illustrates how such privacy unraveling could also be used to reveal paternity, disposition to genetically linked diseases like breast cancer, and many other facts about us that (unlike being a serial killer) are no fault of our own. Perhaps even more important, the phenomenon of unraveling happens entirely outside the realm of consent, a gap that is just further evidence that Privacy as Control cannot bear the tremendous weight that has been placed on it.

The limitations of Privacy as Control are thus numerous. Simple to state and noble in theory, it nevertheless operates in practice as a smokescreen under which companies control humans rather than humans controlling their data.

If someone were to open my book Why Privacy Matters randomly to page 99, they would get an excellent idea of what my book is about – and of why privacy matters! My book is about privacy, about human information, and about how our information confers power over us. To put it simply, privacy matters because in an information society, privacy is power. Privacy advocates, lawmakers, and even companies have argued for many years that the best way to protect privacy is to put us in control of our privacy by giving us choices about how our information is used. But in Why Privacy Matters, I show how control is an illusion in our modern, networked world. Thinking about privacy as control might seem great in the abstract, but in practice it is overwhelming, with innumerable choices about privacy for every one of the dozens of digital services and hundreds of websites the typical consumer uses or visits in a year. And on page 99, I argue that even if we could somehow solve the problem of overwhelming choice, control is insufficient to protect our privacy because other people’s choices can affect what others know or do with our information. The use of leaked DNA data to catch the Golden State Killer is one example of how other people’s choices about information can affect us, though there are many others. Instead of treating privacy as merely an individual value that can be bartered or frittered away by individuals, we need to recognize that the set of rules we put in place for how our human information is collected and used has social consequences for the society that we – and our children and grandchildren – will inhabit. We need better privacy rules, ones that advance the human values of letting us develop our political and personal identities as humans, securing our political and democratic freedoms as citizens, and protecting us as consumers and full members of the digital economy and society.

Learn more about Why Privacy Matters at the Oxford University Press website.
--Marshal Zeringue