Privacy In The Age of Genomics, http://issuu.com/genewatchmagazine/docs/genewatch_27-1

Private Accommodations
by Patricia J. Williams

Published in GeneWatch magazine, vol. 27, no. 1, Spring 2014

How much of our privacy will be challenged by decoding the human genome is an open, evolving question. Translating the coded “text” of the double helix is only half the task. Translating that information into the taxonomies of public policy requires balancing the contrasting values of scientific conventions and legal discourse.
For most people, the most obvious concern is that the indiscriminate compilation of DNA databanks here and in other countries potentially compromises not just privacy but presumptions of innocence, as well as the right to be free from unreasonable searches and seizures. This is particularly so where databases sort people using categories overlaid with biases about social history, race, genetic determinism or inborn aggression. In 2013, for example, it was revealed that police in southern Sweden had compiled a registry of over 4,000 “travellers” (or Roma people), with personal information about whole families, going back to the 1800s. The database included no other Swedes, whatever their criminal history. Nor did it pretend to be just a list of Roma criminals—it included artists, athletes, civic leaders, and over 1,000 children, some as young as two years of age. Their only commonality was that they were Roma.[1] (Sweden being Sweden, the police department involved ultimately turned itself in, reporting its own violation of a number of laws, including the European Convention on Human Rights.)
In the United States, suspect profiling in largely black and Latino neighborhoods is so widely and disproportionately practised that DNA registries are inherently racially loaded and coded, if in nominally more subtle ways than Sweden’s. And in the courts, there is a good deal of disingenuousness among the judges who have dealt with the issue thus far. In Maryland v. King, for example, the Supreme Court characterized DNA simplistically and quite flatly as a means of identification similar to fingerprints or photographs.[2] But of course DNA reveals much more than fingerprints or photographs do—not just about ourselves but about our families. And in the future, it is likely to reveal much more still.
The state of Maryland at least destroys the data if it was collected from an arrestee who is subsequently not convicted. That is not the case in many other states: California, for example, keeps DNA samples, even those from people merely arrested and subsequently released, unless and until the retention is challenged by an individual request to destroy them. Yet in March of 2014, the 9th Circuit upheld California’s practice in the case of Haskell v. Harris.[3] The court dismissed concerns about privacy, breezily dubbing buccal swabs “routine” and ubiquitous “throughout the nation.” The swab itself was viewed as a “minor intrusion,” for the court weighed “intrusiveness” only by the physical ease of cheek swiping; there was absolutely no consideration of the deep and lifelong intrusiveness at stake in the medical and familial information DNA may reveal, so exponentially beyond the sort of “identification” that fingerprints or photos can provide.
The profiling and curatorial instinct of government functionaries—from courts to police to armies—is not something to be lightly written off. For those who doubt the potential dystopian uses of genetic material, the recent film “DNA Dreams,” available on YouTube at http://www.youtube.com/watch?v=jf3MrVHkKxk, provides sober ground for thought. Although focused on the enormous Chinese genomic research conglomerate BGI, it highlights the porousness—indeed the evaporation—of the boundary between what we are accustomed to thinking of as public and private structures and institutions in a transnational, globalized economy.
Nor is it DNA collection alone that threatens privacy; it is the ability to pair that information with the planetary tracking of every other aspect of our lives. Books like Julia Angwin’s “Dragnet Nation” and Robert McChesney’s “Digital Disconnect” document the degree to which every last intimacy of our lives is accessible to strangers: through online hospital records or the trails we leave by our shedding of hair and skin, through street cameras or what our address implies about us, through our use of cellphones, webcams, Facebook, fitness wristbands, and credit cards. In addition to the information that is gathered in legal or unregulated ways—say by neighbors, employers, corporations, and governments—we also contend with illegal or ethically vexed invasions from a fairy-tale-sounding litany of anonymous grifters, hackers, trolls, cookies, and infinitely proliferating forms of malware.
Despite this, we sigh a bit, shrug with the cliché of it all—there are no secrets anymore!—and press “I agree” without thinking when purchasing everything from iTunes to banking services to airline tickets. Agree to what? is an inquiry routinely evaded, a closed door unshadowed by curiosity.
So it should not be a complete surprise when those unread terms come back to haunt us in seriously constraining ways. When Microsoft recently suspected an employee of stealing software code protected as trade secrets, it simply combed through users’ private emails and instant messages—not only those of the employee, but those of a journalist who had blogged about communications received from the employee. Microsoft claims it was authorized to do this by the terms of service to which all Hotmail accounts are subject. The New York Times reported that Microsoft’s actions were technically “within the boundaries of the Electronic Communications Privacy Act, which allows service providers to read and disclose customers’ communications if it is necessary to protect the rights or property of the service provider.” This is indeed a relatively unregulated realm, one that stretches the interpretive bounds of traditional private contract law and creates the fairly dubious presumption that consumers have a choice in the matter and have willingly given up their privacy rights in exchange for the service of the internet. Microsoft’s power invades not only individual privacy but also the ability of journalists to protect their sources. As civil rights attorney Nate Cardozo observed, “To see Microsoft using this right to essentially look through a blogger’s email account for evidence of wrongdoing and then turn it over on a silver platter for law enforcement, it is extremely undesirable…”[4]
Whether we bother to read the invisible contracts that govern so much of our lives or not, the truth is that almost all service providers leave consumers with little in the way of privacy rights—it’s just rare that a company like Microsoft admits it so openly. As Edward Wasserman, Dean of UC Berkeley’s Journalism School, stated, “Microsoft essentially decided that whatever privacy expectation that its own customers supposedly had was basically a dead letter. It simply decided that in its own corporate interest, it can intrude on a person’s email.”
This very broad power should be considered against the backdrop of how it might be used in the context of DNA dragnets conducted not just by governments but by global corporations accountable to no interest but their own private profit. If most of us are at least vaguely aware of the potential for misuse when genetic data is taken by law enforcement agencies, we seem entirely willing to simply give it away through the easy carelessness of such unread agreements with ancestry-tracking services, direct-to-consumer health companies and so-called “spit parties.”
In this over-exposed new universe, there are many who insist, “I have nothing to hide.” But the law’s protection of privacy does not depend upon a felt necessity to hide. Privacy is a space as well as an idea. It is the distance we give each other to be happy hogs wallowing in our own mud. It is the shelter we need to be creative, to think or write or compose on our own terms, to say nothing of outside the box. It is the freedom to make mistakes and to improve on first efforts. It is the ability to decide when to publish an observation, or whether to broadcast a considered narrative of our own experience. On a personal level, it is the right to hold at bay the prurient humiliation or judgmental gaze of others who might desire to catch us, literally or figuratively, with our pants down. And as a matter of citizenship, it puts distance between us and the potentially totalizing power of government functionaries who may be motivated—however beneficently or banally—to regulate political thought by acting as arbitrary censors, just-curious home invaders, authoritarian gatekeepers, or whimsical jailers.
The concept of autonomy is central to American—and most Western—juridical and political constructs of democracy. As largely invisible data aggregators amass evidence of our every purchase, movement and heartbeat, our identity as unique individuals will be subsumed by the much greater emphasis placed on our relation to some spectrum of actuarial expectation. Increasingly we will be advertised to, deflected from, assessed for criminality, disease probability or financial risk, assigned emotional valence, sorted, tagged, boxed, confined.
On top of this, social media networking has not lived up to its promise of replacing traditional interpersonal communities, emerging instead as a force that fragments human engagement as much as it coheres. Along with global media monopolies, it too often herds erstwhile polities into imaginary “teams” and embattled formations of hype, tabloidization, disinformation and fear.
All these forces conspire to create a world and a citizenry of fewer and fewer upwardly mobile “speaking subjects.” Instead we become locked into a shell-like status fixed by carelessly composed data sets, as well as by un-interrogated correlations made by invisible bureaucrats. Without oversight or due process, it will be harder and harder to challenge, never mind find out, why we came to be labelled “a this” rather than “a that.” A flight risk or a cancer risk? A quick learner or a big spender?…Like it or not, willingly or not, these are the identity groupings by which we will be judged and from which we will struggle vainly to escape. In writing about Sethe, the main character in Toni Morrison’s Beloved, Avery Gordon notes that “[w]hen she hears schoolteacher’s directions to his pupil that he should put her animal characteristics on the right and her human characteristics on the left, she does not know what the word characteristic means and has to ask, but she nonetheless understands the conjuncture of power and epistemology that is the very stakes of her representability.”[5]
However reductive, the tiny particulate markers of our identity are valuable as intellectual property; they become monetized nuggets in the “knowledge economy”—little lego assemblages of data used to construct the avatars and facsimiles that stand in for us in a world repositioned as efficiently heuristic rather than participatorily democratic. Genes, cells, fingerprints, blood or isolated phenotypes become “immortal” ciphers, or fixed character properties. Governments, pharmaceutical companies, and, yes, Microsoft, attribute to incremental pieces of ourselves a separate life that engulfs or becomes more important than our complex embodied selves. It is a peculiar de-forming of our lives; just for the sake of argument I will analogize to Gordon’s description of Sethe’s fear: “Within the violence of an economy in which Sethe made the ink used to write her into a book that would literally measure her alterity, the equation literacy equals power unmasks its sinister shadow.”[6]
Law professor Jessie Allen, who writes the blog “Blackstone Weekly,”[7] a contemporary take on the 18th-century English jurist William Blackstone, observes that what is happening now may be very similar, as a conceptual matter, to the beginnings of the corporation. In his “Commentaries on the Laws of England,” Blackstone described early entrepreneurs’ concern with the basic limits of their own humanity: People die. That’s bad for business. The invention of the corporation effectively created an immortal legal subject, untethered to human frailty. That immortality is, in effect, a way of extruding from particular bodies a use value that can be assigned to the ether of a legal fiction—a fiction that “speaks” through articles of incorporation, and whose profit may be divvied up among distant, abstract shareholders.
There is a similar process of dispossession in the mining of our habits, our bodies, our preferences and dispositions. It’s framed as “not about us,” at least not as individuals—even though it may be used to powerfully confine us as individuals, and to mark us, even as it can rarely be claimed by us. It’s rather about one’s group, one’s place, one’s “anonymized” metrics.
Along with this expansion of Big Data, there is a shift, as described by legal academics David Singh Grewal and Jedediah Purdy, away from liberalism’s vaunting of the autonomous legal subject and toward a neoliberal “moral vision of the person and of social life that emphasizes consumer-style choice, contract-modelled collaboration…”[8] This reconfiguration of the righted subject into what is effectively an instantiation of the person as corporation has two implications for how privacy is perceived. First, the value of the individual is rewritten as alienable rather than inalienable, as cost-benefit, profit-motivated and value-added. (This explains, I suppose, the conversation I overheard among a group of high school students on the subway, busily working on a homework assignment in which they had been asked to “brand” themselves, to give that brand a catchy name, and to sell that brand in no more than five sentences, because with more than five sentences “you lose your audience.”)
Second, corporatized people don’t need healing; indeed, the rules of corporate law bend away from the idea of justice as individually remedial or personally restorative. A corporate being looks to the law not for civil rights but for the predictive, the risk-minimizing, the future-controlling immortality of guaranteed return. Through that lens, any legal system based on consent, or on individual cases and controversies, begins to look cumbersome in comparison to the speedy efficiency of stochastic models. Consider again the Microsoft case: how we in the industrialized world, who conduct most of our work and play—indeed our entire lives—with the assistance of computers, are always pressing little buttons that say “I agree” to terms of service, conditions of usage, and privacy limitations that we never bother to read. Consider how ritualized that behaviour has become, the act of consent rendered thoughtlessly, invisibly performative in a way that “disappears” any need for negotiated participation. The surface language of contract effectively marks only a site of erasure.
It is not as though the terms of those agreements do not exist, however. If one bothers to print out the actual contracts to which those little buttons refer, the monolithic imbalance of bargaining power rises before one like the dark cliffs of Sauron’s castle walls in The Lord of the Rings. Such contracts often run to thirty or forty pages of language that leave corporations with no responsibilities and individual consumers with no rights, and the utter lack of public engagement with their terms means that there is virtually no consumer movement or pushback against the accumulated wealth being mined from the data that most such contracts assign to huge entities like Google or Apple or Amazon. Here is just one paragraph from the agreement that gene-tracking company 23andMe proffers:
“YOU EXPRESSLY ACKNOWLEDGE AND AGREE THAT: (1) YOUR USE OF THE SERVICES ARE AT YOUR SOLE RISK. THE SERVICES ARE PROVIDED ON AN “AS IS” AND “AS AVAILABLE” BASIS. 23ANDME EXPRESSLY DISCLAIMS ALL WARRANTIES OF ANY KIND, WHETHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT. (2) 23ANDME MAKES NO WARRANTY THAT (a) THE SERVICES WILL MEET YOUR REQUIREMENTS; (b) THE SERVICES WILL BE UNINTERRUPTED, TIMELY, UNFAILINGLY SECURE, OR ERROR-FREE; (c) THE RESULTS THAT MAY BE OBTAINED FROM THE USE OF THE SERVICES WILL BE ACCURATE OR RELIABLE; (d) THE QUALITY OF ANY PRODUCTS, SERVICES, INFORMATION, OR OTHER MATERIAL PURCHASED OR OBTAINED BY YOU THROUGH THE SERVICES WILL MEET YOUR EXPECTATIONS AND (e) ANY ERRORS IN THE SOFTWARE WILL BE CORRECTED. (3) ANY MATERIAL DOWNLOADED OR OTHERWISE OBTAINED THROUGH THE USE OF THE SERVICES IS DONE AT YOUR OWN DISCRETION AND RISK AND THAT YOU WILL BE SOLELY RESPONSIBLE FOR ANY DAMAGE TO YOUR COMPUTER SYSTEM OR LOSS OF DATA THAT RESULTS FROM THE DOWNLOAD OF ANY SUCH MATERIAL. (4) NO ADVICE OR INFORMATION, WHETHER ORAL OR WRITTEN, OBTAINED BY YOU FROM 23ANDME OR THROUGH OR FROM THE SERVICES SHALL CREATE ANY WARRANTY NOT EXPRESSLY STATED IN THE TOS. (5) YOU SHOULD ALWAYS USE CAUTION WHEN GIVING OUT ANY PERSONALLY IDENTIFYING INFORMATION ABOUT YOURSELF OR THOSE FOR WHOM YOU HAVE LEGAL AUTHORITY. 23ANDME DOES NOT CONTROL OR ENDORSE ANY ACTIONS RESULTING FROM YOUR PARTICIPATION IN THE SERVICES AND, THEREFORE, 23ANDME SPECIFICALLY DISCLAIMS ANY LIABILITY WITH REGARD TO ANY ACTIONS RESULTING FROM YOUR PARTICIPATION IN THE SERVICES.
…WITHIN THE LIMITS ALLOWED BY APPLICABLE LAWS, YOU EXPRESSLY ACKNOWLEDGE AND AGREE THAT 23ANDME SHALL NOT BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, CONSEQUENTIAL, OR EXEMPLARY DAMAGES, INCLUDING BUT NOT LIMITED TO, DAMAGES FOR LOSS OF PROFITS, GOODWILL, USE, DATA OR OTHER INTANGIBLE LOSSES (EVEN IF 23ANDME HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES), RESULTING FROM: (a) THE USE OR THE INABILITY TO USE THE SERVICES; (b) ANY ACTION YOU TAKE BASED ON THE INFORMATION YOU RECEIVE IN THROUGH OR FROM THE SERVICES, (v) YOUR FAILURE TO KEEP YOUR PASSWORD OR ACCOUNT DETAILS SECURE AND CONFIDENTIAL, (d) THE COST OF PROCUREMENT OF SUBSTITUTE GOODS AND SERVICES RESULTING FROM ANY GOODS, DATA, INFORMATION, OR SERVICES PURCHASED OR OBTAINED OR MESSAGES RECEIVED OR TRANSACTIONS ENTERED INTO THROUGH OR FROM THE SERVICES; (e) UNAUTHORIZED ACCESS TO OR ALTERATION OF YOUR TRANSMISSIONS OR DATA; (f) THE IMPROPER AUTHORIZATION FOR THE SERVICES BY SOMEONE CLAIMING SUCH AUTHORITY; or (g) STATEMENTS OR CONDUCT OF ANY THIRD PARTY ON THE SERVICES.”

By the same token, the popular massive open online courses, or so-called MOOCs, can come with very troubling terms vis-à-vis privacy interests as well as the transferred wealth of data: Platforms like edX, while billing themselves as “free,” require participants to consent to being research subjects for neuroscientific studies about how students learn, or, as the privacy terms as of March 27, 2014 (since all terms are “subject to change at any time”) put it: “…for purposes of scientific research, particularly, for example, in the areas of cognitive science and education.” To that end, edX and other MOOCs have begun to publish research studies based on quietly mining the learning patterns of what is, in effect, their online global laboratory of students: “…[W]e sometimes present different users with different versions of course materials and software. We do this to personalize the experience to the individual learner (assess the learner’s level of ability and learning style, and present materials best suited to the learner), to evaluate the effectiveness of our course materials, to improve our understanding of the learning process and to otherwise improve the effectiveness of our offerings. We may publish or otherwise publicize results from this process, but, unless otherwise permitted under this Privacy Policy, those publications or public disclosures will not include Personal Information.” Hmm. “Unless otherwise permitted…”? As a student of contract law, I like to hope that all such terms would be interpreted through a filter of implied reasonableness and conscionability, yet that is not necessarily the jurisprudential trend. Here is just one other paragraph from edX’s terms of service:
License Grant to edX. By submitting or distributing User Postings to the Site, you hereby grant to edX a worldwide, non-exclusive, transferable, assignable, sublicensable, fully paid-up, royalty-free, perpetual, irrevocable right and license to host, transfer, display, perform, reproduce, modify, distribute, re-distribute, relicense and otherwise use, make available and exploit your User Postings, in whole or in part, in any form and in any media formats and through any media channels (now known or hereafter developed).

I began this essay by framing the issue as one of privacy, in particular genetic privacy; let me place that concern against the cultural backdrop of our general, if radically rosy, technophiliac faith in the inevitable good of what genetic information will divulge. This is a remarkable moment, surely, with technology transforming human relations as profoundly as the printing press did. Technology is progressing so rapidly and sweepingly that it is almost impossible not to allow the imagination free rein, to push past what the science actually reveals. It is hard to resist romanticizing its possibility as enhancement beyond all known history. We are headed towards an era of superhumans, mechanical Übermenschen! We cannot fail! Throw out the old! Bring on the bionics!
But I remain intrigued by that notion of neoliberalism as pushing humans into corporatized boxes, and of those boxes as ciphers for the ancient hubris of sought immortality—the immortality of a figurative body; the crafting of a fictional, controllable or ideal mechanism that can be cobbled together from pieces and parts. Alas, I do not believe in immortality. There is only the intimately creative integrity of an embodied self. If we fail to nurture that generative space, of which privacy is the guardian, we put distance between heart and head, and our flourishing becomes unmoored from any investment in the self that is not situated in a global marketplace of invisible, soul-crushing number crunchers.
