They, the people

Social Darwinism in the United States

By Patricia J. Williams, Times Literary Supplement, August 6, 2021

A Eugenics Society exhibit, c.1930 | © Wellcome Library


FACING REALITY
Two truths about race in America
By Charles Murray
168pp. Encounter Books. $25.99.

In 1853, Joseph-Arthur, Comte de Gobineau, published a tract entitled “Essay on the Inequality of the Human Races”, in which he argued that “history springs only from contact with the white races” and that miscegenation with “commoners” as well as with “yellow” and “black” races leads to the “demise of civilization”. Gobineau is widely credited with popularizing the concept of the Aryan master race, and his thoughts found purchase among pro-slavery Americans and, eventually, became an ideological cornerstone for the American Breeders’ Association, the American Eugenics Society and the Nazi party. His royalist assertions of elite white “bloodlines” have been enduringly influential in the United States.

Few have had a greater hand in carrying Gobineau’s notions into the present than Charles Murray. Along with J. Philippe Rushton, Arthur Jensen, Richard Herrnstein and Nicholas Wade, Murray has championed social Darwinism, an ideology that divides humans into biologically distinct “races” with distinct genetic propensities that neatly predict everything from penis size to brain function and criminal disposition. Their bottom line is that dark-skinned people are evolutionarily limited and thus unsuited to modern life.

Hard scientific evidence contradicts the narrative. Yet the reason I find myself writing this review in 2021 is that lots of people still believe it, want to believe it, and remain committed to the disparagements such colonial conceptions invite. Little for them has changed since Gobineau’s day, and since the Victorian Francis Galton, who coined the word “eugenics” in 1883, postulated that “there exists a sentiment, for the most part quite unreasonable, against the gradual extinction of an inferior race”.

The Bell Curve: Intelligence and class structure in American life, co-authored by Murray and Herrnstein, ignited an infamously bitter round of the so-called culture wars when it first appeared in 1994. While Herrnstein died shortly after publication, Murray continued to carry the torch, despite his theories being repeatedly debunked. (The best-known refutation is probably the revised edition of Stephen Jay Gould’s The Mismeasure of Man, 1996.) Still Murray rises, every few years a new book, a new version of the same old narrative. He is a potent pundit whose convictions are spread gleefully by the Wall Street Journal and Fox News.

But Murray’s latest rehash, Facing Reality: Two truths about race in America, drops into the conversation at a particularly flammable moment. In brief, the American “reality” Murray presents is a construct of “race” as a category of unyielding genetic difference, a sealed box of capability, disposition and destiny. The first “truth” he abstracts from the box is that “American Whites, Blacks, Latinos, and Asians, as groups, have different means and distributions of cognitive ability”. The second “truth” is that “American Whites, Blacks, Latinos, and Asians, as groups, have different rates of violent crime”. Murray claims that “the numbers show” that those of African descent are biologically less intelligent than whites and more prone to violence. “The numbers show”, he insists, that such predispositions are unaffected to any significant degree by environment or nurture. Thus, he argues, it is a waste of time and money to be concerned by racial disparities in our societies, because unequal social outcomes are attributable to immutable difference.

Murray’s work is not really about biology and does not withstand scrutiny as such. His adventure began as an attack on the post-civil-rights era remedy of what he calls “aggressive affirmative action” in the US, and his persistent aim has been to devalue the inclusion of African Americans in mainstream American society, whether in higher education, the professions or government service. His ideas have also been underwritten and promoted by right-wing and ultra-libertarian foundations such as the Manhattan Institute, the American Enterprise Institute and the Hoover Institution, which generally advocate shrinking all public spending, including in education. But Murray is not largely concerned with belt-tightening. He is specifically intent on proving differences between those perceived as white in American society and those perceived to be Black. In some of his books, he ranks Jews, women and “model minorities” as well, but there is always a singular drumbeat: that African Americans suffer from mental deficiencies that simply can’t be fixed. Be nice on an individual basis, he says, but collectively “they” are “different”. Moreover, Murray’s analyses do not merely presuppose white superiority and Black inferiority as standalones: the two are positioned in zero-sum relation to each other.

When I was first asked to write this review, I declined. I was reminded of Bertrand Russell’s eloquent response to an invitation to debate Oswald Mosley, the founder of the British Union of Fascists:

It is not that I take exception to the general points made by you but that every ounce of my energy has been devoted to an active opposition to cruel bigotry, compulsive violence, and the sadistic persecution which has characterized the philosophy and practice of fascism. I feel obliged to say that the emotional universes we inhabit are so distinct, and in deepest ways opposed, that nothing fruitful or sincere could ever emerge from association between us … It is not out of any attempt to be rude that I say this but because of all that I value in human experience and human achievement.

I thought, too, of the Princeton genomics professor David Botstein’s denunciation of The Bell Curve as “so stupid that it is not rebuttable”. To decline seemed the better path to sanity.

My mind was changed by the florescence, especially after January’s attack on the Capitol, of references to “fear of white replacement”. The idea, a favourite of the ultra-right, has re-entered the mainstream, as expressed by Republican members of Congress, such as Representatives Marjorie Taylor Greene and Scott Perry, and by popular television pundits including Tucker Carlson; it has been used to justify border walls and cages for migrants, as well as hate crimes and mass shootings. While the demographics of the US are indeed changing – forty years from now, those who are thought of as white today may be outnumbered by those who are thought not to be – we have been here before. In the first half of the twentieth century, “White Anglo-Saxons” felt at risk of being outnumbered by Italian, Greek, Armenian, Jewish, Russian, Latvian, Estonian and other Eastern European immigrants, who were not considered “white”. Then too people spoke of a “crisis”. But the siege mentality has taken a decidedly dark turn of late, focused on minority voters, imagined hordes of unsavoury “critical race theorists”, migrants and refugees. It has been accorded increased volume by some combination of the ungoverned corners of the internet, pandemic derangement, profound economic anxiety, and a renewed fascination with notions of genetic “purity”, the latter fuelled in no small part by the marketing of home DNA testing kits which translate the tens of thousands of haplotype groups and infinitely varied markers of human diaspora into reductive “percentages” of purported racialized heritage. (Dorothy Roberts’s Fatal Invention: How science, politics, and big business re-create race in the twenty-first century, 2012, is particularly good on this.)

I decided to write the review because today the US is close to a kind of free enterprise civil war in which the very definition of criminality has been raced, as in the casually reiterated defamation that “blacks commit all the crimes”. This assertion often contrasts with wild rationalizations about a broad range of white criminality. According to a recent Reuters poll, for example, nearly half of Republicans believe that the attack on the Capitol was “largely a non-violent protest or the handiwork of left-wing activists ‘trying to make Trump look bad’”. Although she was later censured, Virginia State Senator Amanda Chase described the people who stormed the Capitol – and who beat and injured at least 140 police officers, defecated in the halls, broke into offices and stole files and computers – as “patriots”. While conceding “some acts of vandalism”, Representative Andrew Clyde minimized the siege which cost five people their lives: “You know, if you didn’t know the TV footage was from January the 6th, you would actually think it was a normal tourist visit”.

My goal here is not to attempt yet another refutation of Murray’s theories. (For that, there’s Troy Duster’s Backdoor to Eugenics, 2003; as well as Harriet Washington’s A Terrible Thing to Waste: Environmental racism and its assault on the American mind, 2019.) Instead, I will point out the most dangerous bits of Murray’s political agenda, while providing more grounded bibliographical sources – before they disappear, given the sudden proliferation of state laws suppressing the teaching of anti-racism or other “controversial” topics. Since January this year, twenty-nine states have proposed bills to restrict or ban such teaching; nine have enacted legislation to that effect.

Murray starts with what he styles as “the struggle for America’s soul”. “You have to be quite old to remember how uncomplicated it seemed to many of us, White and Black, in 1963”, he says, wistfully. With the passage of the Civil Rights Act of 1964, “We had done it. We had set things right … The act had to be a good and necessary thing. As a college junior at the time, I certainly thought so.”

I was not yet a teenager in 1963, a year that spanned the Alabama Governor George Wallace’s “segregation now, segregation tomorrow, and segregation forever” speech, Martin Luther King’s “Letter from Birmingham Jail”, Bull Connor’s deployment of firehoses and police dogs against civil rights protesters, and the assassination of JFK. No sentient being could have thought things “uncomplicated”. In a rhetorical tic, Murray resorts to such nostalgic tropes to sprinkle fairy dust on his arguments. His use of the past tense, the insistently well-intentioned “I used to think”, signals an older-wiser tone of innocence betrayed, difficult “truths” revealed, and the regrettable “reality” that “we have kidded ourselves that the differences are temporary and can be made to go away”. This pure-hearted stance underwrites crocodile tears about how much it pains Murray to become a warrior of unpleasant revelation; it takes “courage” and “bravery” to stand firm, even if you get called “a racist and hateful person”. Don’t be afraid, he instructs “us”, to say that vast racial disparities in American society exist because they should. They are what they are because it is what it is.

This “is-ness” allows Murray to assert “group difference” without ever interrogating how such group identities came to be formulated and normalized, specifically in the American context. If population data sorted as “White”, “Black”, “Asian”, or “Latino” is “naturalized” as genetic, we obscure the imprint of policy decisions that created such demographic lumps to begin with: including anti-miscegenation statutes, anti-literacy laws, the history of homesteading in creating raced geographies of wealth, the legacies of financial red-lining and legally enforced housing segregation, economic migration, moral panics about ethnicity and resistance to linguistic diversity. These are complications worthy of studied consideration. Since you won’t find mention of them in Facing Reality, I recommend a dive into deep history with Charles Mann’s 1491: New revelations of the Americas before Columbus (2005); Forget the Alamo: The rise and fall of an American myth, published earlier this year by Bryan Burrough et al; Howard Zinn’s A People’s History of the United States (2015); Richard Rothstein’s The Color of Law: A forgotten history of how our government segregated America (2018); Bill Ong Hing’s Making and Remaking Asian America through Immigration Policy, 1850–1990 (1994); Richard R. W. Brooks and Carol Rose’s Saving the Neighborhood: Racially restrictive covenants, law, and social norms (2013); and Mitchell Duneier’s Ghetto: The invention of a place, the history of an idea (2016).

Murray does not seriously consider the disruptive stresses of racially segregated communities living on top of toxic waste dumps, or evicted multitudes trying to subsist on the streets; he doesn’t address stop-and-frisk policies directed at some but not all. (See Michelle Alexander’s The New Jim Crow, 2013; and Eugene Jarecki’s documentary film of 2012, The House I Live In.) He does not consider vast racial disparities in government responses to the devastation of Hurricane Katrina; or to man-made catastrophes such as that befalling Flint, Michigan, where the entire population (more than half of which is Black) was poisoned by a lead-contaminated water supply imposed on it by a Republican governor interested in “cost-cutting”. (For documentation of how lead poisoning irreparably damages children’s brains, read Gerald Markowitz and David Rosner’s Lead Wars: The politics of science and the fate of America’s children, 2013; and Rob Nixon’s Slow Violence and the Environmentalism of the Poor, 2013.)

Murray certainly does not consider the educational limitations of racially segregated life in the US: who has access to advanced placement classes (where some high school students are taught college-level curricula), who has science labs or foreign-language classes, school nurses or guidance counsellors; who as a kindergartener is likely to be sent to the police rather than to sit in the corner and to be expelled before the age of ten; whose school has more police patrolling the halls than books in the library. This parallel “reality” is ignored.

Under slavery it was illegal to teach slaves to read. Under Jim Crow, Black students were legally segregated with little access to public resources. Under today’s “colour-blind” regimes, de facto segregation is frequently enforced by ostensibly “race-neutral” laws against “boundary-hopping” between school districts. One of the saddest uses of the legal system in our post-civil rights era is in cases where poor, almost always Black, parents are sued or jailed for lying about their address in order to send their child to a well-resourced school in a better (almost always whiter) neighbourhood. The charge is called “theft of education”. Consider the case of Kelley Williams-Bolar, who used her father’s address in a nearby suburb to move her children from a dilapidated school in Akron that met only four of Ohio’s educational standards to one that exceeded all twenty-six of the guidelines. The school hired investigators and found that the children were living with their grandfather only five days a week, returning to their mother on weekends. In 2011, Williams-Bolar and her father were charged with felonies: falsification of records and theft of public education. She was given two five-year jail terms, suspended; she ended up serving nine days in jail, plus three years of probation and eighty hours of community service.

Without a nod to social history of this or any other sort, Murray skips to human assortment. Claiming to seek more “clarifying”, “dispassionate” nomenclature that will do away with unintended “semantic baggage”, to make it “easier to look at some inflammatory issues with at least a little more detachment”, he declares: “I substitute European for White, African for Black, Latin for Latino, and Amerindian for Native American. Asian remains Asian”. This trick of labels allows Murray to ignore slavery’s inheritance, leap over any question of epigenetics, nurture or environment, and generalize his convictions into the imagined form of biologically segmented humanity. (To understand how inheritance laws of hypo- and hyperdescent affected perceptions of who is related to whom in the breeding system of racial property, see Jennifer Morgan’s Reckoning with Slavery: Gender, kinship and capitalism in the early black Atlantic, 2021; and Ariela Gross, What Blood Won’t Tell: A history of race on trial in America, 2008.) And, at first, Murray is coy: “There is something intuitively wrong about calling American Whites Europeans when American Whites are so clearly not like Europeans in Europe. The same is true of American Blacks compared to Africans living in Africa, American Latins living in the United States compared to Latins living in Latin America, and, for that matter, American Asians compared to Asians living in Asia”. But then he drives his larger point home: “Better that you be jarred, I decided, if it might improve my chances that you understand the sentence as I intend it to be understood”.

For the like-minded, Murray has laid a nice clear path. “I don’t ask for much … I will be gratified if researchers are buffered from accusations of racism because they entered IQ scores as an independent variable of regression analysis.” He simply wants “policy analysts to incorporate race differences into their analyses”. It follows: “don’t assume that successful reforms must raise test scores” of Black and Latino children. (He believes it can’t be done.) It follows: don’t trust the seemingly outstanding resumés of Black and Latino candidates. (Liberal schools belch out minorities of consistently inferior calibre, Murray says.) “If you are researching racial discrimination in the job market, recognize that controlling for educational attainment isn’t good enough … Control for IQ as well.” In assigning genetic causation to differences between the IQ scores of “Africans” (Black Americans) and “Europeans” (white Americans), Murray writes off the fact that racial gaps have narrowed considerably over the past few decades, in line with greater literacy, better education, better diet, better housing, better health. He takes issue with well-established findings, such as the “Flynn effect”, according to which IQ has been rising for all since the beginning of the twentieth century, in part because of fewer stressors including war, famine and infection. He ignores how culturally specific IQ tests are, yet also how ungoverned and methodologically incommensurable. (This incoherence is the topic of Aaron Panofsky’s detailed study, Misbehaving Science: Controversy and the development of behavior genetics, 2014.) He ignores the way that international comparisons unsettle some of his geneticized stereotypes. In 2010, for example, Israel’s national IQ, not including the West Bank, was lower than that of the state of Mississippi, and as of 2019, Mongolia’s was higher than Sweden’s. He ignores the fact that “European” Americans score lower than most Europeans in Europe. 
He does not consider that high-scoring China tests only a well-chosen 500,000 of its billion-plus citizens, while the much smaller US tests 1.5 million. Finally, he dismisses the documented effects of early childhood enrichment programmes and scoffs at the disempowering effects of stigma.

Murray invokes data: “Africans, at 13 percent of the population accounted for only 3.6 percent of CEOs, 3.7 percent of physical scientists, 4.4 percent of civil engineers, 5.1 percent of physicians and 5.2 percent of lawyers”. But, contemplating these figures, “your inferences could be completely wrong” unless you take into account how much dumber Blacks are: in Murray’s opinion, the numbers show overrepresentation because minorities only “get through” – he always uses the language of contaminants – the educational pipeline because of “preferential treatment”. Hence, there ought, really, to be fewer. (For more about how wrong, and politically subsidized, Murray’s manipulation of such data is, there is Angela Saini’s excellent Superior: The return of race science, 2020.)

This weighted presumption of white superiority reinscribes precisely the disparagement that affirmative action law was designed to overcome and re-legitimizes the knocking-off of points and opportunities for the historically disenfranchised. Murray declares that “race differences in cognitive ability and crime” should be a consequential part of policy decisions affecting “income inequality, efficacy of preschool and jobs programs, the causes of residential segregation, the voting behavior of the working class”. He casts any dissent as mere “identity politics”, distracting “us” from “warding off the jungle. It is the jungle, the primitive sense of ‘us against them’ pressing in on the garden”.

Lest there be any doubt about what Murray is vaunting, consider the recent lawsuit against the National Football League for damages suffered by players because of the NFL’s active denial and suppression of data linking concussion and long-term brain damage, including dementia. Because the lawsuit joined the claims of thousands of former players, the litigation resulted in a billion-dollar settlement. But, as recently reported by the Associated Press, distribution of the award has run into problems: the NFL “has insisted on using a scoring algorithm on the dementia testing that assumes Black men start with lower cognitive skills. They must therefore score much lower than whites to show enough mental decline to win an award”. The practice, overlooked until 2018, has rendered Black former players less likely to qualify for compensation. In May this year, a petition signed by more than 50,000 former players and supporters asked the court to make public the metrics by which payouts were being allocated. Most importantly, the petition demanded an end to the practice of “race norming”, a broad medical practice in America (and elsewhere), where some treatments continue to rely on uninterrogated, centuries-old assumptions. (For a dissection of one such myth, see Lundy Braun’s recent Breathing Race into the Machine: The surprising career of the spirometer, from plantation to genetics, which shows how beliefs about the superior lung capacity of white people have led to the automated “race correction” of pulmonary readings.)

Murray does not address the NFL’s scoring of attributed cognitive value and yet it conforms precisely to his idea of a good policy – one that makes race an “added, independent variable”. (Still, he denies that anything structural is at work.) If Blacks are simply genetically less intelligent, then the justice system’s slashing of the settlement awards is “fair” and “common sense” (words Murray adores).

Separate but unequal is Murray’s glib conclusion. Treat minorities “as though” they were equal, but “save the soul” of America by imposing policies that “recognize” their inequality as “immutably real”. The most painful circularity of Murray’s thought is that when “you” – the you of his address is always “European” – encounter a minority who is “equal”, or even more accomplished than you, you can rest assured of your superiority by reminding yourself that they are “exceptions” to the rule. A particularly capable minority specimen is posited as a separate kind of “equal”, marked by the abnormal superpower of having measured up to a “European” norm of intelligence and behaviour. While advocating that the reader “resist generalizing”, in the next breath he asks that you suppose yourself to be a “White” (by the last chapter he has reverted to “White” and “Black”) who is,

living in a multiracial working-class or middle-class neighborhood in a megalopolis. The great majority of crimes are committed by minorities. Most of the children in the bottom of the class in your child’s school are minorities. These observations are not the products of a racist imagination. They are the facts of your lived experience. There are exceptions to be sure – your daughter’s super-smart minority classmate, the minority couple down the street who provide loving care for foster children, the minority cop you watched deftly defuse an escalating confrontation. But your lived experience tells you that these are not typical. Is it OK for you to generalize that minorities are criminal and dumb? Obviously not. The obviously correct answer is that a difference in means exists, but that we must insist on treating people as individuals.

There are so many things wrong with this passage. For one thing, the US is still a majority white nation in which most crimes are committed by whites, even if there are disproportions in rates of who is arrested and convicted. But let’s begin with the frame: an invitation into a normative “White” brain imaginatively constructed as floating through a “typical” “lived” “multiracial” experience. The scene is, in fact, atypical for most white Americans. According to the Brookings Institution, the average white resident of metropolitan America lives in a neighbourhood that is 71 per cent white, 8 per cent Black, 12 per cent Latino or Hispanic, with a statistically unclear percentage of Asians and “others”. Even this is misleading when one takes into consideration the further lack of contact imposed by racial separations that are trackable block by block, job by job, school by school and building by building.

But Murray resents any suggestion that he has misapprehended things: “Blacks, constituting 13 percent of the population, are telling Whites, 60 percent of the population, that they are racist, bad people, the cause of Blacks’ problems, and they had better change their ways or else”. He casts as “victims” those journalists and academics who (like himself) have been criticized for racist or sexist commentary. To those mirrors of himself, he counsels “bravery” in resisting “woke culture” while warning that “new ideologies of the far left are akin to the Red Guards of Mao’s Great Proletarian Cultural Revolution of the 1960s, and they are coming for all of us”. The culture wars into which we have all been co-opted are unwinnable when apostasy is at stake rather than good faith. As David Botstein warned, beliefs that are profoundly wrong yet widely held may add up to an “unrebuttable” stupidity.

So here we have it: a book published in 2021 that could have been written in 1921, or 1821. A book that forces the reader to confront the basics of white supremacy and take a stand. Murray’s constructed “reality” poses a simple set of direct questions: do you believe not in the magnificent randomness and infinite mutability of human potential, but rather that our most significant variability is transmitted through melanin? Can you “naturalize” the histories of segregation, poverty and social trauma by reassuring yourself that such disparities are the rational fallout of inherent genetic defect? A moral crossroads is laid before the reader: a choice. But, writing this, I can change no minds. Walking into the conceptually gated community of Charles Murray’s “us”, “I” become “you people”.

Patricia J. Williams is the author of Giving a Damn: Racism, romance and Gone with the Wind, 2021


Untethered, or The Year of Living Virtually

by Patricia J. Williams

Scenes from a Pandemic, #61, published as part of a series with The Nation Magazine and the Kopkind Colony

New York City

When baseball legend Ted Williams died in 2002, it came to light that he had directed that his body be cryogenically frozen so he and his children would “be able to be together in the future, even if it is only a chance.” At the time, it seemed strange to me, a desire for immortality so intense that one would slow the body’s decomposition to molecular silence, the breath held in wait for the perfect cure.

Global pandemic has helped me better understand that determined longing for biostasis. In mid-March of 2020, friends began to die, and I began to lose my mind. Today, post-vaccination, and nearly 4 million global deaths later, I am slowly waking up, like Rip van Winkle, much more than merely a year older, and not at all the same. I feel as though I have been preserved by a shock of flash-freezing, and I am thawing now—slushy and watery and uncertain in my body.

It was the sensory deprivation I found hardest to bear. Early on in this plague, as my contacts with the outside world had retreated into the numbed realm of the “remote,” I vowed to try to find grace in isolation. I would meditate and listen to what I imagined might be some lost store of poetic inner quiet. Like so many, I was determined to “make the best of it”; I would gussy it up as a writing retreat, a prolonged snow day, a space to hibernate for a bit.

But the sequence of death derailed the project. More people sickened, more friends passed, more relatives of friends, more acquaintances I no longer thought of as “casual” but essential. How are you? became an existential question. I Zoomed, I Skyped, I learned to use Teams. Images of other human beings were delivered in digitized boxes, algorithmic animations with sharp rectangular edges recalling The Hollywood Squares, the flesh tones odd, and no smells of the living. I watched the incense at a Zoomed funeral; I watched the bitter herbs at a Zoomed seder; I watched a bouquet of white roses tossed at the livestream of a Zoomed wedding.

I experienced all of this as fictional, surreal, perhaps because my sense of reality depends on the echo of how a real voice in a real room hits the ear. Or how a happy person smells. Or how a handshake or a hug stimulates the nervous system. Or how looking directly into someone’s eyes reveals small inflections that enhance the meaning of words as they are spoken.

I make my living as a teacher. In a bricks-and-mortar classroom, I rely on the presence of students to read the room, on subtle expressions—a head tilted in questioning, a slouch of boredom, an excited buzzing among ones who’ve made an important connection… On Zoom, their tiny heads were lined up like figures on an Advent calendar. When they wished to speak, their little yellow hands, like cartoon Mickey Mouse mittens, went up and down. Their voices were muted and unmuted, on and off, like a sound faucet. When I divided students into problem-solving subgroups, there was no collegial hum. Using the chat feature, everyone just dropped out of sight, out of sound and existence, a timer at the bottom of the screen blipping down the seconds till they would reappear, bursting to the surface like divers from the deep. (I have a friend who, while his students disappeared into their 15-minute chat-worlds, would hop on his treadmill for a refreshing workout.)

I felt diminished by the disconnection. In order to perform myself, I had to stand within an exoskeleton of myself, a prosthetic, a platform, to translate myself, to project the three-dimensionality one takes for granted intra personas. I felt as though I were manipulating a marionette of myself, trying to get my limbs to work just right, to avoid getting tangled or lost in the strings and buttons, the lighting, the filters. Worst of all, the architecture of Zoom requires that in every encounter I had to watch my own face, sallow and flattened, in a constancy of self-regard. It was the material enactment of double consciousness: watching myself as I watched others watching me.

Yes, it was better than nothing, and we all made do. But a year of such mediation was disembodying in all those literal ways.

The word “parasocial” occurs to me as I survey this year of lost-minded time. Parasociality is a one-sided relationship with another who exists at a distance—most often a celebrity. The relation is not only one-sided but illusory, an attributed sense of intimacy or proximity, such as a crush on a pop star, or the daydream of an imaginary friend. Parasociality is the projection one places on someone who does not reciprocate, or who may not even know you exist. I am co-opting the word, I suppose—it’s a technical term in media studies—but there is something powerful about the idea of life imagined as living among others, while without them in reality. In that way, a year on Zoom was sometimes like talking to the dead. Some days navigating the geography of our miniature screen-world was like floating through gardens of computer-generated ghosts. Sealed in my home office, I would toss a bottle of my ideas into that imaginary sea, trusting that it would find shore, and be released like a religious revelation upon the screens of extant others. A clutching neediness sustained my reaching out to partial people through this ritual Zoom communion. I call them “partial people”; I mean people who exist somewhere in the present tense but whom I could apprehend only as bumblebees captured in a jar; wings beating against the glass, they buzzed with the threat and the promise to break through as real.

As the days grew darker, as the economy spiraled downward, as the political scene grew more disordered, I too grew scattered, anxious, sad. I bought a stationary bike. I wore masks and plastic gloves to collect the mail. I studied the instructive dictates of astronauts, and hermits, and Oprah Winfrey. I forced myself out of bed in the morning, I updated my will, made wish lists and to-do lists—things that are supposed to inspire a sense of purpose. I counted my many blessings. I wrote down what I had eaten, and what I should be eating. Too much Twitter was in my head to think, to feel, so I switched off all electronics for five hours a day.

Of course, it’s impossible to turn off the world entirely; the sounds of catastrophe leaked through the walls. Ambulances streaked through the streets; I wore earplugs to dull the overhead thwumping of medical helicopters. As the months rolled by, medical helicopters were joined by police helicopters, and chanting filled the streets. The National Guard materialized, and personnel carriers mustered round the city.

Last June, I hung a picture of Nelson Mandela’s stone room of a prison, where he passed some 25 years in solitary confinement. If he could do it, maybe I would make it to whatever future lay beyond. In September, I added a portrait of the late Ruth Bader Ginsburg. And after January 6, I completed the gallery with a drawing I made of a bright happy balloon that was well-tethered to a stake in the ground. This was in response to a dream that I was a balloon that had lost its mooring. A child had let go of my string, and I was being carried away by a strong, angry wind—blowing away from everything I knew, disappearing higher and higher into a dense fog, the sky around me a grey and endless opacity. I woke up with the need to draw myself down to solid ground.

The whole world will need a lot of mooring post-pandemic. I fear that one of the costs of sustained parasociality is an inability to come back down to earth, to stop and listen to what real others are really saying. Perhaps the perpetual state of emergency has unhinged us all. Awakening into a changed world, I am wobbly and in need of repair. I fear the wobbliness of others—particularly the great and growing numbers of lives given over to slushy accumulated moral panic. The pandemic has been a horrendous rupture of time, a trauma requiring reinvention of purpose. We will need some link between the fear-pod of deadliness and the redemptive reassurance of regeneration.

The threat of contagion is far from over; the virus mutates and disperses itself inequitably through the lacunae of bad public policies and cultivated fears. But to the extent that there is the promise of vaccination, at least for now, I am aware of how much my watery, pulsing interior rejoices at having survived to see this moment. I open my door early each morning. I look up at the dawn sky and remember how big and how beautiful and how unimaginable the world truly is. I taste the air. I set the table and reheat the uneaten dinner that has been waiting for you, my friends; I have missed you all. I settle anew into an embodiment of vulnerable exposures, pain points, and joy, a body absolutely certain that she, this lover of life, this I, this infinite ontography, will carry on and on and on without end.

Patricia J. Williams, a regular contributor to The Nation, is the author of Giving a Damn: Racism, Romance and Gone With The Wind (Harper Collins, 2021). She was a guest speaker at Kopkind in 2000 and 2009. This piece originally appeared on The Nation's website on July 14, 2021.


Book Launch: La Marr Jurell Bruce’s How to Go Mad Without Losing Your Mind


The Guardian: Opinion, “The ingrained fear of blackness,” May 27, 2021

Patricia Williams

The deployment of wildly unreasonable subjective fear is often sufficient to justify a wide range of reactions, even murder

People lay flowers at a memorial in George Floyd Square in Minneapolis. Photograph: Anadolu Agency/Getty Images

It has been a year since George Floyd last drew breath. It has been a year since the multiple videos of his death spread worldwide; since passionate demonstrations swept cities and towns; since personnel carriers filled with soldiers crawled through American streets; since “saying” his or her name became a ubiquitous incantation, an infinitely unspooling litany of death. In the year since, Derek Chauvin, the police officer whose coldly dispassionate gaze riveted our own, was convicted on all counts. It was hard to unsee. And we saw.

Moreover, the witnesses against him included the chief of police; the instructor in techniques of restraint at the academy where Chauvin had trained; the police dispatcher who was watching remotely and thought her screen was frozen because he stayed on top of Floyd for so long; the emergency medical technician who had to reach around Chauvin’s knee to take a pulse (there was none) because Chauvin refused to move even after the ambulance had arrived; Floyd’s weeping (white) girlfriend who testified to his gentle, generous and prayerful nature; the sheer number of bystanders who “called the police on the police”; the crying children; the shopkeepers; the passing martial arts professional who shouted at Chauvin repeatedly, telling him that he was killing Floyd. I began my own career as a prosecutor and I have never seen a stronger case.

There simply was no question.

And yet … there was. Indeed, there was such great collective apprehension about whether Chauvin actually would be convicted that thousands of troops were called to the streets of Minneapolis before the verdict was read. That apprehension was a testament to how rare it is that police are convicted of even egregious misbehavior. Indeed, if Chauvin hadn’t been convicted, the biggest issue would not have been the much-discussed potential for riots; the larger emergency would have been whether there exists any legally enforceable limit at all to the police’s exercise of deadly force.

Derek Chauvin listens as the verdict is read in his trial for the death of George Floyd on 20 April. Photograph: AP

A year on, any optimism I harbor is built on our aversion of that existential crisis. And yet I continue to worry because there are other cases. I worry because there is such a strong acculturated sense about who is presumed innocent or not in racial encounters, about who may be categorized as inherently “angry” or “threatening”. (Part of Chauvin’s unsuccessful defense rested on trying to depict the protesting onlookers as distractingly “angry”, “threatening” and unruly.)

At this vexed moment, it is a truism that Americans of different races, ethnicities and religions are tense, wary of one another; but it is white fear of blackness that has the longest history, that is most intractable, and that still underwrites majoritarian tendencies to forgive even lethal police misconduct, and to rationalize punitive forms of segregation in housing, education and employment.

In the domain of criminal procedure, that generalized fear is an evidentiary problem. Not just police officers, but self-appointed citizen vigilantes are often not prosecuted or charged at all when they allege mere free-floating decontextualized fear. If such cases actually proceed to trial – again, the Chauvin trial was a rarity – the deployment of wildly unreasonable subjective fear is often sufficient to justify killing innocent, unarmed people. I feared for my life. Who are you to judge?

These are the two forces that we must bring into contention as a widespread pattern of response. “I feared …” as a subjective standard of self-exoneration. And then the follow-up banishment of any juridical review of that fear: “Who are you to judge?”

This pair of immunizing assertions is built into the very structure of recent so-called “stand your ground” laws, which expand self-defense by licensing shooting to kill, unqualified by any duty to retreat, in public places even where there are other non-violent options. Although such laws are race-neutral in language, dominant American assumptions about who can claim a sidewalk or public street as ground that is “yours” make this a highly raced proposition.

Consider Mark McCloskey, a personal injury lawyer from Missouri who, days ago, announced his intention to run for US Senate in 2022. McCloskey achieved memed infamy last July when journalists captured pictures of him and his wife, Patricia, brandishing guns and aiming them at a group of mostly black protesters who were passing their house en route to the mayor’s residence located farther down the street. McCloskey’s campaign website describes the couple as having “held off a violent mob through the exercise of their 2nd Amendment rights”.

But there was no violence: the McCloskeys were upset because the crowd had transgressed a wrought-iron gate at the top of the street which the couple felt marked a boundary line between the entire neighborhood complex as a “private” venue and the public thoroughfare beyond. (There are at least 285 gated streets in St Louis, part of what urban planner Oscar Newman called “defensible space theory”, designed to create a sense of privacy even when residents benefit from publicly subsidized police, sewer, fire and water services.)

Patricia McCloskey and her husband, Mark, draw their firearms on protesters passing their home in St Louis on 28 June. Photograph: Lawrence Bryant/Reuters

Once the protesters passed through that gate, McCloskey claimed that they “may as well have been in my living room … I was frightened. I was assaulted.” He felt that it was “like the storming of the Bastille”. “They’re angry, they’re screaming. They’ve got spittle coming out of their mouths. They’re coming towards our house.” The couple’s decision to display and point guns – an AR-15 assault rifle no less – was dressed in the legal terminology of immediacy, McCloskey insisting he was in “imminent fear” that “they would run me over, kill me”, and that “we’d be murdered within seconds. Our house would be burned down, our pets would be killed.”

Again, the protesters were not in Mark McCloskey’s living room. They never so much as set foot on his lawn. They walked past his house. They did not “storm” his home. They passed, chanting loudly but entirely peacefully.

In some neighborhoods any black person may be looked upon with suspicion. As a friend describes it, “it’s an almost magical power: we can inspire fear just by appearing.” I wish I could shrug off the McCloskeys’ exaggerated fear as idiosyncratic or delusional, but it’s not isolated. Fox News pundits such as Tucker Carlson or Sean Hannity give round-the-clock voice to the many white people who share the McCloskeys’ effusive and globalized fear of black people on “their” streets. That fear has inspired the proliferation of dozens of “anti-riot” bills initiated in state legislatures, apparently aimed at Black Lives Matter protests (but not at the largely white crowds who broke into public buildings across the country, culminating in the assault on the US Capitol on 6 January).

Consider just one bill, introduced in Alabama’s legislature in February 2021: had it been enacted, it would have “provide[d] that if an active riot is occurring within 500 feet of the premises, a person in lawful possession or control of the premises may use deadly physical force to defend the premises from criminal mischief or burglary”. Happily enough, the bill was defeated. But it illustrates a too-ubiquitous sentiment percolating through our polity. Authorizing not just the police but private citizens like the McCloskeys to use lethal force – based on subjective perceptions of danger to things, not just to persons – is a rationalized mitigation of culpability almost never extended in practice to black people.

Mark McCloskey, meanwhile, has sworn fealty to “Donald Trump’s agenda” in announcing his new “call to public duty”. As his website summarizes it: “God came knocking on my door last summer disguised as an angry mob.”

In the year since George Floyd died, I am relieved that the jury found Derek Chauvin guilty of murder. But anti-protest laws and anti-Black Lives Matter backlash have been significant responses to the attempted reform movement that his death inspired. I cannot yet dismiss the thrumming of those who insist that it is “natural”, obvious, rational and “reasonable” to apportion our collective fears by “colorblind” but coded hierarchies of racial “threat”. Fear should not govern everything. Fear cannot excuse everything. Fear heals nothing. And healing is the hard road still ahead.


Theoretically Speaking — A Panel Discussion on the Film Coded Bias

May 7, 2021 11:00 am – 1:00 pm 

Ashia Wilson (MIT, moderator)

Panelists: Seny Kamara (Brown University Department of Computer Science and Aroki Systems), Shalini Kantayya (7th Empire Media and UC Berkeley Graduate School of Journalism), Sendhil Mullainathan (University of Chicago Booth School of Business), Omer Reingold (Stanford University Computer Science Department), and Patricia Williams (Northeastern University School of Law)


The Free Speech Project: “Will Free Speech and Human Rights Survive the Worldwide Pandemic?”

A forum co-sponsored by Georgetown University (USA) and Oxford University (UK). 

When (if) the worldwide COVID-19 pandemic subsides, what toll will it have taken on free speech and human rights around the globe? Have governments, both democratic and authoritarian, taken advantage of it to consolidate their power? Have civil liberties been sacrificed in order to try to restore public health safeguards? What explains the extreme impact of the disease on disadvantaged minorities? This international dialogue brought together academic experts, human rights activists, and a field worker for Catholic Relief Services to explore these and related issues.

This event was co-sponsored by the Free Speech Project and the Future of the Humanities Project at Georgetown University.


Pia Jolliffe

Pia Jolliffe grew up in Vienna, has worked in the human rights field, and now teaches Japanese history in the Oxford University Department of Continuing Education. 

Michael West Oborne

Michael West Oborne, who lives in Paris, was a university professor in the United States and France before serving as a senior official with the Organization for Economic Cooperation and Development. 

Patricia Williams

Patricia Williams is University Distinguished Professor of Law and Humanities at Northeastern University in Boston. In 2000, she was named a MacArthur Fellow (the awards known as “genius grants”).

Alex Woelkers

Alex Woelkers grew up in Montana and has volunteered or worked with humanitarian and development organizations around the world. He has served as an international relief worker in Kenya, Somalia, Uganda, and now Bangladesh.

Michael Scott

Professor Michael Scott (moderator) is Senior Dean and Fellow of Blackfriars Hall, University of Oxford, college adviser for postgraduate students, and a Member of the Las Casas Institute.

Sanford J. Ungar

Sanford J. Ungar (moderator), president emeritus of Goucher College, directs the Free Speech Project at Georgetown University, which documents challenges to free expression in education, government, and civil society in the United States.


Gone, but not forgotten…

The cover of the April 23, 2021 issue (no. 6160) of The Times Literary Supplement was devoted to my new book, Giving a Damn (Harper Collins), under the title “Gone, but not forgotten: Patricia Williams on the malign racial legacy of a literary classic,” as described by TLS editor Martin Ivens.

An extract of the book was also published in the same issue under the title: “To the north: Race, migration and violence in the United States of America.” 


Giving A Damn: Racism, Romance and Gone With The Wind

by Patricia J. Williams

published by Harper Collins, 2021


Public Health Law Watch: volume II of the COVID-19 Policy Playbook: Legal Recommendations for a Safe and Equitable Future, Chapter 38: “Closing Reflection: The Way Forward,” published March 19, 2021

Patricia J. Williams, JD, Northeastern University School of Law

An Entanglement of Policies

One of the most difficult challenges facing the Biden administration will be undoing a profound confusion of terms. To understand the true dimension of that problem, it helps to look at the document that most succinctly captures the thinking behind Trump’s federal policy during most of 2020: the Great Barrington Declaration. Although it was not published until October of that year, it summarized the thinking of the administration’s most hyper-libertarian advisors, including Dr. Scott Atlas and then-Secretary of Health and Human Services Alex Azar. The authors, a loose collective of epidemiologists and doctors, were proponents of a strategy they called Focused Protection. They asserted that “current lockdown policies” were causing “irreparable damage, with the underprivileged disproportionately harmed.” It is worth noticing that in this version of reality, the more active agent of such harm is not the actual virus but “lockdown.” The expressed goal of the authors was “reaching herd immunity” by opening up everything—everything, period—and soldiering through. According to them, literally encouraging community spread would supposedly “allow those who are at minimal risk of death to live their lives normally to build up immunity to the virus through natural infection, while better protecting those who are at highest risk.” Sunetra Gupta, one of the three principal authors, told The Daily Telegraph: “we’re saying, let’s just do this for the three months it takes for the pathogen to sweep through the population.”[1] And Martin Kulldorff, another principal author, told Canada’s National Post what he envisioned: “…anybody above 60, whether teacher or bus driver or janitor I think should not be working—if those in their 60s can’t work from home they should be able to take a sabbatical (supported by social security) for three, four, or whatever months it takes before there is immunity in the community that will protect everybody.”[2]

There are innumerable ethical questions raised by such a proposition, not least its unproved assumption that the human population is anywhere near the happy status of “building up” immunity. There’s the thoughtlessly impractical description of what “better protection” for those at higher risk would look like: “Nursing homes should use staff with acquired immunity”—as though there’s a workforce of the certifiably immune just waiting to be hired. Even though vaccines provide hope, the slow, even chaotic roll-out is the result of assuming that “acquired immunity” would be a cheaper option than actual preparation for mass production and distribution of vaccines, to say nothing of PPE. The document also made the casual assertion that “Retired people living at home should have groceries and other essentials delivered to their home. When possible, they should meet family members outside rather than inside”—as though there’s a world in which “retired” people come neatly segregated in separate homes, apart from non-retired family. Indeed, even the use of the term “retired” as a cipher for age seemed to focus on those no longer contributing to the economy; it skirted around the degree to which many people over the age of 65 have to keep working because social security does not cover the costs of living, even before the pandemic became a factor.

Most astonishing was this throw-away: “A comprehensive and detailed list of measures, including approaches to multi-generational households, can be implemented, and is well within the scope and capability of public health professionals.” But to a hungrily contagious virus, any in-person mingling—school, bar, gym, office—is the absolute equivalent of a “multigenerational household.” This reality of unbounded human sociality is, of course, the crux of the problem, and precisely what’s missing from the Declaration’s analysis as well as the Trump administration’s response: if there is such a “list of measures,” we should have had it posted on every public billboard long ago. If the development of guidelines is “well within” the scope and capability of public health officials, there ought to have been urgent endorsement of the same from the highest national office. If there had been clearly enunciated and vehemently endorsed protocols all along, perhaps there wouldn’t have been so many lost souls drinking disinfectants and plotting to kidnap the governor of Michigan.

Instead, the Declaration called for nothing more specific than “simple hygiene measures such as handwashing…” Mask-wearing was not even mentioned in the Declaration. Maintaining physical distance was not mentioned. True to its libertarian origins, the plan treated a pandemic as something that could be effectively contained by individual decision-making—and that is a mind-set that will take a lot of public education to reform. Within this ideological filter, the elderly and the sick were left to exercise their right to self-isolate “if they wish,” configured as autonomous actors for whom rational choice is uncomplicated, a mere mental commitment to self-removal from public space. The good choice for everyone else was merely to get back out in the world, back to school, back to work, back to “normal”. Not mentioned in the Declaration is the CDC’s data showing that Blacks and Latinos, disproportionately employed as low-level “essential workers,” constitute 43 percent of all deaths from COVID-19, although they represent only 12.5 and 17 percent respectively of the population of the United States.[3] In other words, the employment and living conditions of people of color are as important mortality risks as age; Dr. Uche Blackstock, CEO of Advancing Health Equity, observes: “It’s almost as if living in a country with racism ages people…to the point where even people who are not elderly…are still susceptible to dying from this virus in a way that’s very similar to people who are elderly.”[4] These long-standing health disparities among racial minorities have been incalculably exacerbated by Trump’s neglectful policies. Nor is this catastrophe merely one of unequal health outcomes: the fall-out includes disproportionate burdens of debt, job loss, homelessness, educational deficits, child welfare, trauma, and grief. The cascading consequences of such social disruption will be one of the greatest challenges facing the new administration.

One of the most appalling aspects of the Declaration was its substitution of the term “herd immunity” for the “community spread” it was actually proposing. Herd immunity more accurately refers to a contagion against which widespread programs of vaccination have been made available—typically covering between 60 and 80 percent of a population.[5] That in turn depends upon the existence of a scientifically efficacious vaccine that ensures immunity for a stable and significant period of time. In contrast, the term “community spread” captures the promiscuous, relentless virality of infectious disease.[6] We may have a herd (or “herd mentality,” as Trump misstated it[7]), and we certainly have spread. But we have nothing close to immunity.

Moreover, it is far from clear whether infection guarantees immunity, or for how long.[8] As has been obvious from endless spikes among partying college students and professional athletes, the young and the buff are more susceptible than the Great Barrington Declaration allows; and even if they seem to represent a lesser proportion of immediate fatalities, they may suffer disproportionately from long-term cardio-pulmonary syndromes and disabling vascular disorders.[9] Most perniciously, the Declaration ignores altogether the reality that COVID-19 may be spread by those with no outward or visible symptoms; its authors make no mention of the need for widespread, repeated, reliable testing of the asymptomatic.

Herd immunity requires that 60 to 80 percent of a given population not only have been “exposed,” but have recovered sufficiently to have developed antibodies—around 200 million Americans. (As of March 2021, there have been about 30 million American cases since March of 2020, or less than 10 percent of the US population.) Only at those levels will unvaccinated vulnerable people have a hope of being protected. Again, the Great Barrington Declaration did not propose that herd immunity happen through vaccination. Its suggestion that those levels be acquired “naturally” refers to those left standing after untold greater calamity: first, those for whom exposure does not result in death; second, those who sufficiently recover to have developed enduring antibodies; and third, those not left with long-term or permanent disability. To get to that point without a vaccine means tolerating millions more deaths—well above the current toll of nearly 565,000—not to mention violently destabilizing rates of grave and protracted illness. As intentional policy, this ends up not looking like “survival,” even of the fittest; it looks like an intentionally induced avalanche of slaughter. For the Trump administration to have ever pursued such a path as a “goal” constitutes, in my opinion, a crime against humanity.

Confusions of Value

A second major challenge for the Biden administration will be the degree to which the propagation of such deadly confusion was assisted by deeply contested hierarchies of legitimacy and a jabbering bewilderment of competing sources, all laying claim to “truth.” Although the Great Barrington Declaration claimed to be endorsed by tens of thousands of medical professionals, the vetting of signatories lacked rigor (hence, endorsements from such eminent authorities as “Dr. Johnny Bananas” and “Dr. Person Fakename”).[10] In short, it is a crowd-sourced ideological tract sponsored by the American Institute for Economic Research, a libertarian umbrella group located in Great Barrington, Massachusetts, which adheres to Austrian-school economic notions of methodological individualism. Major donors include Charles Koch and the Bradley J. Madden Foundation, which has worked to evade and erode the FDA’s regulatory mechanisms and processes designed to ensure health and safety protections in the approval of new drugs and vaccines. The institute’s other sponsored tracts include titles like “Brazilians Should Keep Slashing Their Rainforest.” Consider a recent post on the institute’s website written by one of its research fellows, John Tamny, entitled “Imagine If the Virus Had Never Been Detected.” He asserts that: “[T]he coronavirus is a rich man’s virus…People live longer today, and they do because major healthcare advances born of wealth creation made living longer possible. We wouldn’t have noticed this virus 100 years ago. We weren’t rich enough. …What is most lethal to older people isn’t much noticed by those who aren’t old. A rapidly spreading virus was seemingly not much of a factor until politicians needlessly made it one. …The virus didn’t suddenly start spreading in March of 2020 just because politicians decided it had. The likelier beginning is 2019. Early 2020 too. Life was pretty normal as a virus made its way around the world then. Politicians made it abnormal. Let’s never forget the sickening carnage they can create when they find reasons to ‘do something.’”

Let me underscore that this is a post dated February 4, 2021.

Unsurprisingly, the glib laissez-faire recommendations of the Great Barrington Declaration were opposed by an overwhelming consensus of public health experts, including organizations like the NIH, the CDC, the World Health Organization, Britain’s National Institutes of Health, the Mayo Clinic, Johns Hopkins Medical School, as well as globally regarded scientists like Drs. Anthony Fauci and Francis Collins.[12]

All that said, the Great Barrington Declaration became dark reality because its strategy of free-market practices was embraced at the highest levels of American governance. This stance was aligned not only with Ayn Randian ultra-libertarianism, but also became entangled with the sovereign-citizen movement—militant anti-mask, anti-vaxxers willing to take up arms to resist stay-at-home guidelines; belligerent anti-government souls whose extremism inspires them to descend upon legislatures in bids to ensure we may all live to die for a free-market economy.

This convergence of anti-regulatory sentiment likely means not only that the pandemic will continue to rip through certain sectors of our polity unabated for the foreseeable future, but also that the tragedy of such massive loss will imprint itself upon us as enduring collective trauma. And at a moment when fact sometimes seems to have been locked behind an inscrutable cosmic paywall, the bipartisan angst emerging from a national sense of siege should not be underestimated as its own governing force. This is an altogether dreadful moment. And dread eludes logic or law or rational discourse; it is a powerfully destabilizing force as well as powerfully directive.

Proliferation of Punitive Eugenic Beliefs 

Among the more troubling undercurrents of the official embrace of community spread is a certain cynical resignation on the one hand (“Gotta die one way or the other”) and something like a gambler’s resolve on the other (“Survival is all about your genetic lottery…”). There is something quite grim in those formulations, a transformation of the libertarian’s credo of “live and let live” into the eugenicist’s commitment to “live and let die.” We may well worry that there is something like a death wish in this limp capitulation to nihilism.

The philosopher Judith Butler writes of the “national melancholia” that proceeds from “disavowed mourning” for unremarked, “ungrievable deaths.”[13] The Great Barrington Declaration reads precisely like a disavowal of mourning. We are trapped in a season of funeral after funeral after funeral—and yet even as we stand with heads bowed at multiple gravesides, there’s a call from the boss telling you to just get over it and haul your butt back to work NOW. Or else You’re Fired! Or you’ll be evicted. Or you’ll lose the car. Or you won’t be able to stay in university. Or you can forget about health insurance. What else is it but disavowal of loss, ungrievability of death, when Dan Patrick, Lt. Governor of Texas, opined on Fox News: “Let’s get back to living…And those of us that are 70-plus, we’ll take care of ourselves.”[14]

These statements are transactional in a blatantly macabre way… It puzzles me deeply, this eager swarm toward euthanasia. This profession of willingness to die for the sake of “living” is structured as sacrifice, as obedience to a higher order. This is an attitude that sees disability—including economic disability—as a social burden and an unaffordable drain. In the economically devastated period following WWI, and leading up to the full-scale grip of Nazi rule in Germany, hospitals became overwhelmed, children with birth defects became an economic burden, and poverty slowly became merged with eugenic and germophobic legal stances on behalf of the body politic. “Mercy killing” of “useless eaters” gradually became labeled as “therapy,” and elimination as “treatment.” Hospitals and mental institutions quietly initiated more systematized bureaucracies of killing: children deemed “unsustainable” were marked for execution by a plus-sign on their paperwork, their ultimate destiny identified as “disinfection,” “cleaning,” “therapy” and “treatment.” This, of course, metastasized into the mechanics of mass murder we know as the Final Solution. But I mention it here only to underscore the slow, hypnotically encroaching cultural violence when the nation’s body is prioritized in competition with or in opposition to the stricken human body.

            I wonder if the immorality of the Great Barrington Declaration would be taken as more urgently alarming if we challenged its entire framing: it gussies up a “cost-benefit” analysis of threats to economic freedom in the sheep’s clothing of human protectionism—and tries to pass that off as public health. Without that cost-benefit frame, we would redesignate any policy of laissez-faire do-nothing-ism as reckless and depraved endangerment of human life. To be clear, I am not, in general, an advocate of laws that criminalize those who spread communicable disease: as we saw during the AIDS crisis, there are unintended public health costs to such an approach, including hesitancy to seek medical attention. It’s not easy to assign intentional fault in the middle of a pandemic: after all, we’re all taking risks by going to the grocery store, we’re all imperfect in our need to reach out to others, and we’re all ignorant to some degree about the protocols of prevention. 

            But as a matter of political decision-making, our leaders make choices of an entirely different dimension. They distribute public benefits that affect the life chances of all persons, and there are standards of professional conduct that must be expected, that ought to be enforced. So, for example, in Massachusetts, two hospital administrators were recently charged with criminal neglect, infliction of bodily harm, and reckless endangerment of human life: they were in charge of nursing homes run by the Veterans Administration. Charged with that care, they knowingly put coronavirus patients in the same units as uninfected patients and then later actively misrepresented the numbers of stricken residents. This outbreak became one of the first major spreading events in Massachusetts. 

            Yet this is malign behavior not so very different from President Trump’s more infamous actions. Even after hosting super-spreading outbreaks that threatened national security by sickening dozens of White House staff, Secret Service personnel, members of Congress and of the Joint Chiefs of Staff—he intentionally and defiantly held subsequent rallies and town halls where thousands of mask-less attendees were packed together, like sardines in a nursing home, like lemmings at Jonestown, all supposedly begging “to kiss me.” 

            For at least ten months of 2020, the degree of federal non-action combined with personal self-dealing was simply mind-boggling. Indeed, breaking with a 208-year tradition of non-partisanship, the editors of the New England Journal of Medicine published a blistering condemnation of the Trump administration’s handling of the crisis: “Anyone else who recklessly squandered lives and money in this way would be suffering legal consequences.”15 

            But if what has happened thus far is indeed a crime against humanity, more worrisome still is the long-term fallout: the accelerated lethality of a sickness that has already begun to mutate into more contagious strains is greatly exacerbated by having encouraged people to go about business “as normal.” This habit of conduct will be its own additional catastrophe, one that will be very hard to turn around quickly. Vaccines surely must be mass-produced as quickly as possible: but hospitals are already strained to the breaking point, people continue to lose jobs and homes, the numbers of homeless continue to skyrocket, children have lost their teachers, parents, grandparents; inmates and staff in prisons and detention centers fall ill at epic rates because they are not deemed “essential”… This purposefully unchecked disease has left us to navigate a treacherous and still-brewing social storm. 

            The American history of state-mandated involuntary confinement isn’t foremost in public discussion or anticipation right now–but we forget at our peril its invocation, during the first half of the twentieth century. Growing from the American Eugenics Movement’s appeals to survival of the fittest, movements to sanitize the collective national body were institutionalized in Supreme Court decisions like Buck v. Bell, which counseled sterilization of “those who already sap the strength of the State.” (In the ultimate irony, of course, Justice Holmes wrote that the benefits of compulsory vaccination were rooted in a principle “broad enough to cover cutting the Fallopian tubes.”) In other words, recent American political and juridical discourse valuing the strong over the weak is not merely grounded in economics, but contains intimations about racial, ethnic and class preference. Therefore, it would serve us well to be attentive to situations where neglectful inaction in the name of free market ideals accomplishes the same disabling end that compulsory action might have done in another era. In his 1927 Buck v. Bell opinion, Holmes enabled structures of thought that distinguished the “best citizens” from the “socially inadequate” and “manifestly unfit” who may be sacrificed “to prevent our being swamped with their incompetence.” The consequence was widespread state action to detain and constrain everyone from epileptics to “imbeciles,” from “incorrigible” youth to wanton women to syphilitics. Today, as we watch more and more people sickening, dying, falling out of the workforce, wandering the streets, being detained in shelters, incarcerated in prisons, orphaned in institutions, camped out in tent cities and buried in potters’ fields, I worry that “laissez-faire” policies have brought us to very much the same divided social end. 

We should worry, too, about what might happen if the tide of public emotion turns on people who move through public space with the illness—as happened to “Typhoid Mary,” who spent the last 23 years of her life involuntarily detained in an asylum on North Brother Island, in the Bronx, her case coalescing a backlash against Irish immigrants after she persistently violated quarantine orders. I don’t know if such animus might come from the right or the left, but I can imagine the appearance of a single demonized or intentional super-spreader becoming the justification for confinements that would draw even deeper and more irrational lines than we’re seeing now. Too much of our public health infrastructure has been transferred to, or is being monitored by, police rather than actual public health agencies or policies informed by good medical practice. Take, for one example, the investment some police departments are making in drones that can aerially take the temperatures of people walking down public streets.16 That data will be part of an overall architecture of surveillance that is already worrisome, but may be particularly susceptible to backlash based on blame, whether based on “bad behavior” or other configurations of biological or political danger. 

            If we were to remain inflected by the Great Barrington Declaration’s emphasis on “personal choice” and survival-of-the-fittest as a viable response to a deadly pandemic, one could foresee militarized health police serving as our new-age public health monitors. Since it will be a very long time before we can hope to see 80% of Americans “naturally immune,” we can predict some competition for the preservation of sub-communities of such perfected bodies through enforced segregation instead. In a culture where many are yearning for, even cultivating, civil war, we might anticipate geolocation-enforced quarantine, physical segregation by algorithmically determined susceptibility based on education level, preexisting medical condition, zip code, gender, race, ethnicity, as well as old age. Our recent presidential election was a distressingly close one: in other words, we came very close to having the wealth of public health entities distributed according to the ideological preferences of a Dr. Scott Atlas rather than the professional ethics of a Dr. Anthony Fauci. As discussed above, some of those preferences have already been embedded in chilling forms of algorithmically-triaged resource allocation. What we have grown to tolerate in the casual demarcation of some people as economic “parasites”—as Trump called immigrants—means that quite a few of us may be left to die as “useless” devourers of costly resources. 

            The Great Barrington Declaration claimed public space only for those who supposedly are brave enough, strong enough, young enough, and most of all economically productive enough to endure, and who could face down the invading, polluting, contaminating, economically corrupting enemy. This aesthetic fusion of viral “enemies” and economically unproductive bodies is dangerous. This cleansing of public space and assignment of inherent value to those who remain standing (particularly without considering how lethally contagious the asymptomatic may be) is foolhardy and a recipe for chaos. 

Imaginary Bodies 

            One of the forces I found most mysterious in discussions of this pandemic has been the almost cult-like reverence for imaginary bodies, false icons and composited fictional entities whose ideations were mythologized, even immortalized, as greater in importance than human biological systems. Of course we humans are metaphor-machines—to one degree or another we all believe in imaginary bodies: As a lawyer, I understand the dignity accorded to “the corpus of law.” As a citizen, I respect the symbolic power of embodied national values for which soldiers in wartime would lay down their lives, a precept for which Gold Star families stand in courageous sorrow. As a consumer advocate, I reject the fiction of “corporate personhood” even as I comprehend the legal creativity of its construction. 

            But here’s what feels so impenetrably other-worldly to me: for the duration of the annus horribilis that was the year 2020, the United States was engaged in a mask-less danse macabre. It was nothing less than a drawn-out, hubristic flirtation with death: a pushing of scientific limits, logical limits, ethical limits. What I mean is neatly summarized by the ever-succinct if nonsensical Glenn Beck: speaking of older Americans who may be statistically and immunologically more vulnerable to contracting COVID-19, he said “Even if we all get sick, I would rather die than kill the country.”17 

            This doesn’t make much sense if one believes “the country” is synonymous with “we, the people” who “all get sick.” As human beings we are united in our vulnerability to COVID-19. This disaggregation of the country from its people signalled an important conceptual shift in American identity. There was enough evidence to suppose that Beck and Trump, like the authors of the Great Barrington Declaration, were immortalizing the economy, or perhaps capitalism, as the eternal lifeblood of our nationhood. This is a perilously fragile dream in which to stick one’s head: if we all die, much more than the economy will be ruined. 

            But my point here is to make visible the ideational bodies we have invented through such relatively common verbal gestures. Beck essentially created a golem of the Economy. He invented a mythic entity with the power to do apocalyptic battle with our fear. It’s certainly understandable. COVID-19 is itself invisible, uncontrollably amorphous—the temptation is irresistible to “see” it as an “enemy” that can be rebuffed in some material form. Our yearning for control tempts us into conjuring various imaginary counter-forces, benevolent specters that will stand up to the virus’s murderous voraciousness. At one point and for some, The Wall became the imagistic cure, as though steely barricades could block the dewy clouds of breath and death. Some of us prayed instead to the Winged Victory of Vaccination. Others bowed down to the Valkyries of Inherited Vitality. (In Norse mythology, Valkyrie translates as “chooser of the slain,” a fierce goddess who rides bareback on her giant wolf, choosing which warriors will rise to Valhalla. The image of the Valkyrie as a ruthless libertarian who grants survival to the fittest seems to have gradually displaced the generous gentility of our other civic goddesses, Blind Justice with her scales, or Lady Liberty as figured in the Statue of Liberty.) 

            Perhaps most powerfully of all, immunity itself has been reconfigured in some quarters as Free Radical Individualism–a brave and muscled man, frequently armed with a bulletproof vest and military-grade weaponry, but, alas, no facemask. In the spring of 2020, Vice President Pence, impersonating this kind of warrior, faced down doctors at the Mayo Clinic, radiating strength as well as his wet breath. It was, unfortunately, a colonial stance as well, whether intentional or not: if one takes a moment to acknowledge that masks are not only about protecting oneself, but also and perhaps primarily others, it ceases to look like fortitude and more like recklessness toward others. 

            Pence later said he didn’t wear masks because he wanted to look at people “eye-to-eye.”18 Given the fact that masks don’t cover the eyes, it is clear that “eye-to-eye” meant something more than just the ocular. It referred to an aesthetic, a gaze of controlled statesmanship, to be read in conjunction with firmly pressed lips and a sculpturally jutting jaw, all signifying stout resolution.19 With a mask obstructing that profile of nose, lips, jaw, the eyes alone become helpless, disengaged from the expressive personality of the rest of the face, so beseeching and vulnerable above the anonymity of an obliterating blue medical patch. “Eye-to-eye” is a fiction of masculinity, in other words, a feeling to be telegraphed, a fantasy of the strong leader who stands bare in the face of battle. Of course it is also magical thinking, this idea of walking into the fray and dodging bullets, and emerging unscathed. It’s myth-making, a way of performing miracles. Begone, coronavirus! 

            If we can control nothing else, we can rein in our wandering imaginations by more carefully curating our profusion of fears and projected golems. We can choose to tell ourselves better stories. What could we come up with if we were imagining broad “social security” not for a few elderly isolates, but rather for all? If, as virologists predict, we will have some reasonably effective and scientifically tested vaccine widely administered within a year or two, why not dream into being even-just-temporary subsidies and housing policies for all until that comes to pass? Classics scholar Paul Kosmin has written that in very ancient times of catastrophe and great death, time was stopped and, most importantly, debts were forgiven.20 I wonder how different would be our sense of imagined survival if we could reset the clock, implement programs of forgiveness for catastrophic debt, and impose amnesties for crushing obligations. We need a time of pause to manage the unprecedented traumas of this time. Why not dream of a plan that would keep more of us fed and housed and truly able to choose to stay sheltered as a way of not overburdening every bit of our infrastructure with grief, with the sick, with the dying, with the dead? 

            In the summer of 2020, essayist Sabrina Orah Mark wrote a piece in The Paris Review entitled “I’m So Tired”: “I tell my mother about North Brother Island. ‘Maybe we should buy it,’ she says. ‘I need somewhere to go.’ What I don’t tell my mother is that we have already gone somewhere. We are already in this place where the world we once knew is rushing out of us.”21 These words have stayed with me. If there is any consistency to what I feel, it is captured by that paragraph: there’s such affecting particularity in that vision of the external world not just changing around us, but of interior worlds “rushing out of us.” 


Patricia Williams, JD, a pioneer of both the law and literature and critical race theory movements in American legal theory, holds a joint appointment between Northeastern University’s School of Law and the Department of Philosophy and Religion in the College of Social Sciences and Humanities. She is also director of Law, Technology and Ethics Initiatives in the School of Law and the College of Social Sciences and Humanities, and an affiliate of the Center for Law, Innovation and Creativity.


Butler, J. (2004). Precarious Life: The Powers of Mourning and Violence. London: Verso.

Centers for Disease Control and Prevention (CDC). (2020). What is community spread? Retrieved February 25, 2021, from

Concha, J. (2020). Glenn Beck: ‘I’d rather die’ from coronavirus ‘than kill the country’ from economic shutdown. The Hill.

DeVega, C. (2020). Trump’s death cult finally says it: Time to kill the “useless eaters” for capitalism. Salon.

Gold, J. A. W., Rossen, L. M., Ahmad, F. B., Sutton, P., Li, Z., Salvatore, P. P., … & Jackson, B. R. (2020). Race, Ethnicity, and Age Trends in Persons who Died from COVID-19 – United States, May-August 2020. MMWR, 69(42), 1517-1521.

The Great Barrington Declaration. (2020). Retrieved February 25, 2021, from https://

Haglage, A. (2020). CDC: Black, Latino people make up nearly 43 percent of COVID-19 deaths. Yahoo Life.

Higgins-Dunn, N. (2020). ‘The death toll would be enormous,’ Fauci says of herd immunity to coronavirus in the U.S. CNBC.

Kelland, K. (2020). COVID-19 infection gives some immunity, but virus can still be spread, study finds. Reuters.

Kirkey, S. (2020). New declaration calls for ‘focused protection’ to achieve COVID-19 herd immunity. Critics say it would be deadly. National Post.

Manthorpe, R. (2020). Coronavirus: ‘Dr Johnny Bananas’ and ‘Dr Person Fakename’ among medical signatories on herd immunity open letter. Sky News.

Mark, S.O. (2020). I’m So Tired. The Paris Review.

Medical Daily Staff. (2020). Dr. Fauci, Other Top Scientists, Condemn Use of Herd Immunity. Medical Daily.

Mostert, M. P. (2002). Useless Eaters: Disability as Genocidal Marker in Nazi Germany. The Journal of Special Education, 36(3), 157-170.

