Royal Flush

By Patricia J. Williams

published in The Nation Magazine, June 27, 2016

Donald Trump’s 70th birthday was June 14. One of its stranger celebrations took place in New Delhi, India, where members of the far-right Hindu Sena party made offerings of cake to a life-sized poster of the billionaire, praising him as the anti-Islamic messiah. “Trump has said Muslims should be banned from entering America. Everyone should support that,” said their organization’s president, Vishnu Gupta. “Trump is about to become the king of the world.”

Most media response rightly focused on Hindu Sena’s application of Trump’s xenophobic rhetoric to India’s boundary disputes with largely Muslim Pakistan. But I was intrigued by how much easier it is to imagine Trump as king than as “presidential.” Trump certainly surrounds himself with more filigreed trappings than Louis XIV. And with followers lauding his purity solely because he’s rich, he has magically succeeded in branding himself as America’s official golden calf. Perhaps it’s not a surprising leap that he would somehow be able to pass as Hindu Sena’s sacred cow as well. But I think that Donald Trump’s peculiarly crude yet imperial appeal is rooted in his personal appropriation of powers that legally belong to the state. He speaks not of government of or by the people, but only of “my” people.

Take just one well-worn example: “The people, my people, are so smart…they say I have the most loyal people, did you ever see that? Where I could stand in the middle of Fifth Avenue and shoot somebody and I wouldn’t lose any voters, OK? It’s like incredible…” Delivered as a brief, apparently careless throwaway, this soliloquy was nevertheless a masterpiece of well-practiced theatrical force, a “shot” taken gesturally as well as verbally. It’s worth having a close look at the video. If you pause it at just the moment when he says the word “shoot,” you’ll see that for a second or two, his entire body language changes—drawing himself up to full height and looking directly into the camera with sudden, bracing dead-seriousness. (Indeed, he adopts the exact posture of Uncle Sam in James Montgomery Flagg’s 1917 US Army recruitment poster. It is almost impossible to see as accidental—right down to the color scheme—impossible to look at without hallucinating the familiar caption “I want you”….)

Trump then cocks his thumb, points his forefinger, and air-fires.

The next instant he’s back to the dance, the boasting jabs and jutting jaw, shoulders slanted, eyes off to the side, head lolling to all four winds. A roar of amusement rolls through the audience like a thunderstorm.

But in that one freeze-framed flash of his taking aim, his cocked body—that moment unleavened by jocular performativity—there is the tyrant’s invitation to join the army of “my people,” or else. His direct eye contact with the camera makes clear that he is talking to you. In that quiver-pulsed moment of dark equivocation, you are invited down one of two paths: go limp and paralytic, the dumb sucker, the easy mark, a deer in the headlights of an unforgiving fate; or leap aside, to his side, the good side, riding Trump’s triumphalist surf. The candidate’s rhetoric invites us into a world of magical realism, where a self-anointed good guy like Donald Trump can ride back in time to perform miracles: He would’ve cut short the carnage in Orlando, prevented 9/11, San Bernardino, and countless trade deals. He abides in a warrior world of redemptive vigilantism, where frightened psyches thrum to the drama of the OK Corral, now positioned squarely in the middle of Fifth Avenue. And Trump will be the savior of his people.

L’état, c’est moi.

In L’Eloge Historique du Roi Louis XIV, the great poet and tragedian Jean Racine wrote of the king’s power: “There is a continuous series of miraculous deeds that he himself initiates, that he completes, deeds as clear, as intelligible when they are carried out as they are impenetrable before they are carried out.”

In an ordered society, the state reserves to itself a monopoly of power over bodies and, in a well-ordered society, is held accountable for the judicious exercise of that power through regulatory constraints like due process, habeas corpus, and civil-rights laws. In traditional models of European monarchy, by contrast, the king was anointed in a way that combined absolute physical power with spiritual—or messianic—power. This is the most important distinction between the power accorded to systems of governance such as ours and the rule of kings. The restrained ideal of civil society is the direct, unpanicked polar opposite of “you’re fired” efficiency or “off with your head” fiat.

Nicholas Mirzoeff, in his very lucid media manual, How to See the World, writes that the corruptible humanity of royal individuals was disguised, shielded, enrobed in “the concept known as the body of the king, which we can call majesty. Majesty does not sleep, get ill, or become old. It is visualized, not seen.” This implied immortality of the embodied state is of course why the accession of monarchs was—still is—hailed with, “The king is dead, long live the king!”

“Long live Donald Trump,” shouted members of Hindu Sena, whose brand of fundamentalist nationalism has been linked for years to lethal riots against Muslims.

Donald Trump invokes “people” with every breath. But that invocation never encompasses the complex humanity of the people he marks as “his” (as in: “Oh, look at my African American over there…”). There is little acknowledgment that it is citizens themselves who hold power to govern. In his reductions of democracy to lone-wolf appropriation, his campaign has been one long pantomime of a corporate takeover: l’état, c’est Trump. Vive le moi. That “efficiency-of-me” bodes poorly for polity, for fairness, for justice, for us.


Donald Trump’s Virtual Reality

published in The Nation Magazine, May 9-16, 

“I don’t know. What do I know? All I know is what’s on the internet,” said Donald Trump on Meet the Press last March. He was attempting to excuse his false assertion that a protester at one of his rallies “had ties to ISIS.” It was certainly a startling assertion, at least to me, bookish woman of writerly profession that I am. Of course everything that man says startles me; this time it made me think about the general status of knowing, of knowledge and its online production.

The internet is hardly the first technology of information transmission to be suspect. In Phaedrus, Plato described the Egyptian king Thamus’ suspicion of the written word. Thamus feared that writing was untrustworthy, because it “will create forgetfulness in the learners’ souls, because they will not use their memories…you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.”

I was thinking about all this because I was sitting on a train not long ago amid a group of exuberant young millennials. They were discussing religion, the election, and the influence of evangelicalism in American politics. There was general consensus that none of them could understand what motivated people to attend a mega-church whose minister owned private jets. Then one young man piped up: “…but it’s the head of the Roman Catholic Church who makes more money than any of them.” Really? “Oh yes—the pope makes $200 million annually as his personal salary.” Really? Not the church? “Nope,” he said confidently. “It goes into his personal bank account and he can do with it exactly as he wants. He’s got all these homes and palaces, and he’s invested in all kinds of real estate….”

Ordinarily I’m very reluctant to intrude upon conversations among strangers, but for once I couldn’t restrain myself. “Ahem?” I offered by way of introduction. “The pope takes a vow of poverty. He arrived at the job with two pairs of shoes. He does not receive a personal salary of $200 million a year.”

The young man’s response was: “Google it. I’m telling you the truth.”

I did not doubt my memory. I do not doubt myself. Yet…I did Google it, and he was right. Still wrong, but also right, in that it was the first thing that came up on Google when I entered a search for “pope’s salary”: “Pope’s personal income: $200 million annually.”

I had to ask myself how it came to pass that the first result was a posting dating from 2011. One has to assume that it has received more hits than any other site when it comes to the personal profit and salary concerns of the papacy. (Google-truth is highly situational and epistemically fluid, however: it came up as the second entry when I looked two weeks later.) Perhaps it’s a reflection of crowd-sourced belief. Or it could be as simple as a bot, or some troll conniving to push it to the top of the list. But whatever the motive or cause, it is an algorithm that ultimately decides placement—and it has been able to erase in some people’s minds the entire history of the Roman Catholic Church.

There has always been that possibility, of course. Tabloids and Fox News do something of the same thing every day. As Neil Postman points out in his wonderful book, Technopoly, King Thamus feared that writing would “change what is meant by the words ‘memory’ and ‘wisdom.’ He fears that memory will be confused with…’recollection,’ and he worries that wisdom will become indistinguishable from mere knowledge. This judgment we must take to heart, for it is a certainty that radical technologies create new definitions of old terms and that this process takes place without our being fully conscious of it.”

All this makes me think of Microsoft’s recent attempt to launch a chatbot on Twitter, Kik and GroupMe that would sound like a teenager. Named “Tay,” it was created to “experiment with and conduct research on conversational understanding,” but quickly “turned from a nerdy attempt at reaching teens, into the racist, Holocaust-denying, Hitler-loving AI of all our nightmares.” As Peter Bright wrote in Ars Technica, Tay doesn’t understand what the Holocaust was: “She just knows that the Holocaust is a proper noun or perhaps even that it refers to a specific event. Knowing what that event was and why people might lie to her about it remain completely outside the capabilities of her programming.” Peter Lee, the corporate vice president of Microsoft Research, apologized, saying, “To do AI right, one needs to iterate with many people and often in public forums.” In other words, if we are going to craft an AI chatbot free of biases, we have to include everyone in the conversation.

Microsoft, oddly, didn’t seem to anticipate that Twitter isn’t about “everyone,” in some happy, kumbaya way. It’s about a technology that’s proved capable of holding up a mirror to our darker realities—a space in which women regularly receive anonymous rape threats and people of color receive racist diatribes from strangers. It’s a problem of having built our prejudices into the machine, so that they take on new life, reproducing, generating, mirroring, magnifying, and ultimately ruling us in the great singularity of our robotically simulated kingdom come.

This brings us back to Donald Trump, who lays claim to knowledge but still doesn’t know. As Neil Postman observes: “technology imperiously commandeers our most important terminology. It redefines ‘freedom,’ ‘truth,’ ‘intelligence,’ ‘fact,’ ‘wisdom,’ ‘memory,’ ‘history’—all the words we live by.”

All the internet knows is what’s a proper noun, after all. “We’re going to build a wall, and Mexico is going to pay for it,” opined Tay, before Microsoft finally killed her.


From Zubik to Zika: When Religious Exemptions Are a Menace to Public Health

On May 16, the Supreme Court issued a terse per curiam opinion deferring resolution in the case of Zubik v. Burwell. The petitioners were nonprofit employers who argued that federal rules requiring employee health benefits for birth control were contrary to their religious beliefs. They maintained that even merely requiring employers to opt out by formally asserting a conscientious objection substantially burdened the free exercise of their beliefs. For now, the justices chose not to decide that issue, instead remanding it to the Court of Appeals, requesting further briefing as to “whether contraceptive coverage could be provided to petitioners’ employees, through petitioners’ insurance companies, without any such notice from petitioners.” This deflection has postponed a contentious showdown pitting reproductive rights against religious rights—a contest that has become all too familiarly fraught in recent years.

What makes Zubik particularly important is the precariousness of the constitutional principles at stake. In an earlier challenge to the Affordable Care Act, Burwell v. Hobby Lobby, the Court held that a family-owned corporation could refuse to pay for insurance under the ACA, based on the owners’ belief that contraception is a sin. Hobby Lobby, in other words, invited the current impasse about the degree to which businesses and other organizations may pick and choose—that is, discriminate—among the laws they will respect.

The ultimate outcome in Zubik will also shape public accommodation as a component of basic public health for years to come. Lack of access to gynecological and obstetrical services hobbles not only women’s personal “choice” but also public policies having to do with familial well-being and overall social welfare. Public health is hardly best served by allowing its administration to rest with the whimsical gods of one’s employer. (Dionysus doesn’t believe in blood-pressure medication. Baal has a thing about vaccines.)

In any event, Zubik’s petitioners represent but one strand of a peculiarly American brand of antigovernmentalism, some strands of which can be traced back to Reconstruction-era theologies of states’ rights. Other sources include antitax revolts, the racialization of welfare benefits, and the rise of Ayn Rand–style libertarianism. Over time, these forces have led to reductions in Medicare, the disappearance of all manner of maternal and child healthcare, attacks on Head Start, the virtual nonexistence of mental-health infrastructure, the reluctance to fund studies of gun violence as a health issue, and the erosion of air- and water-quality controls. In an era when climate change, industrial toxins, and global migration combine to increase exponentially the odds of mass contagion, pollution, and bioterrorism, the purposeful impoverishment of collective response has put us all at grave risk.

Consider one example: With the rise of the Zika virus, some public-health agencies have advised women to delay pregnancy, given its link to microcephaly in fetal development. This has overlapped in predictable if incoherent ways with already vexed debates about sexuality, gendered poverty, and access to contraception, abortion, and healthcare. It remains to be seen whether the “best interests” of microcephalic children born during an epidemic will be left to the private decision-making of families, or if we might expect the intervention of some broader programs of support. Says Tarah Demant, senior director of Amnesty International’s Identity and Discrimination Unit, “It’s putting women in an impossible place, by asking them to put the sole responsibility for public health on their shoulders by not getting pregnant, when over half [in Latin America] don’t have that choice.”

While Zika’s greatest risk is to fetuses, the virus is correlated with other potentially devastating conditions such as Guillain-Barré syndrome. Moreover, what may seem to Americans like its sudden prevalence is related to the slow encroachments of global warming—and we who are not to the tropics born should know that the same mosquitoes which carry Zika also carry dengue hemorrhagic fever. This alone poses the threat of skyrocketing emergency-room costs, well before we get to the question of long-term care for the permanently disabled. Public health’s main mission is prevention; that mission is unrealizable if we understand disease only as the personal responsibility of those individually afflicted.

This is not a narrow question of risk-benefit analysis. If, as a legal matter, we assign the burden of control to the nonscience of individual or religious choice, we create a vacuum in which the politics of fear may overtake us in the event of an actual (or just a threatened) pandemic. When the Ebola crisis was at its peak, there were hyperbolic calls for walled borders and the quarantine of those who posed no medical risk. At the height of the HIV epidemic, moral panic far outstripped the actual risks of transmission, stigmatizing gay men as untouchables. The lead-poisoned water supply in Flint, Michigan, wasn’t merely the result of bureaucratic inattention, but also of a widespread and long-term affective aversion to the ethnically and racially marked humanity of its residents.

Everywhere and always, disaster narratives of invasion by contagious bodies inflect immigration policy, educational access, employment opportunity, rights to movement, and public accommodation. Such narratives draw lines; they reinforce circles of identity; they monsterize and idealize; they underwrite superstition as well as forge truths. We know that resource allocation crossed with sacralized belief can structure better or worse social responses not just to birth control, but also to outbreaks of syphilis, tuberculosis, cholera, malaria, poisonings from toxic spills, putative invasions of “Africanized bees,” and leprosy. The imagination of disaster only goes so far in the preparations for an actual disaster. But the meta-knowledge of how ideologies of calamity, contamination, and cleanliness intersect with law, politics, and public health just might bridge the gap.



Disappearing Act

How Trump Keeps Us Talking Even When He Disappears

published in The Nation Magazine, March 24, 2016

The Republican debate scheduled for the week of March 21 was canceled at the last minute because Donald Trump decided not to participate, invoking a prior engagement: “I’m doing a major speech in front of a very important group of people…that night” (the general electorate being not so important, one is left to suppose). Trump’s condescending engagement with the process has apparently run its course: “I think we’ve had enough debates.”

When Trump canceled, John Kasich did too. That left only Ted Cruz, so Fox News pulled out, citing the impossibility of debate in the form of soliloquy; the network wasn’t going to risk airtime for the sad sound of one hand clapping. But frankly, the Republican debates had become a one-man show long ago. Kasich’s decision was probably right: Who’d bother to watch a show without the main attraction? This lent a certain forlorn desperation to Cruz’s willingness to show up: Instead of winning him a few points for stepping up to the plate, he was framed to look like the unpopular kid who turns out for a party that the mean kids have moved across town without telling him.

It is remarkable how successfully Trump has manipulated the inverted realities of a television industry in which the firewall between news and entertainment, between journalism and profit-making, has collapsed entirely. Fox has certainly never made a pretense of being anything other than beholden to its corporate sponsors; but even so, it’s sad to see the Fourth Estate straying so far from its purported function of sustaining an informed citizenry. This much I lay squarely at the feet of the FCC: The public interest was sacrificed long ago with that agency’s slow corruption, gradually allowing the greater and greater aggregation of corporatized media outlets in fewer and fewer hands—such as Rupert Murdoch or Clear Channel—while simultaneously chipping away at regulatory checks like the Fairness Doctrine and the equal-time rule.

If the FCC allowed Fox to rule the journalistic henhouse, Fox enabled Trump to become its first real political reality star, sweeping the ratings to the point that his presence has become essential. The “Trump bump” is the ultimate validation of the old tabloid dictum that “If it bleeds, it leads”; he’s the network’s lifeline to attracting eyeballs and clicks. If it weren’t so frightening, one might be able to savor the irony: Trump is a monster of Fox’s own making, and when he grows tired of toying with Megyn Kelly, he simply walks off the set and shuts the whole thing down. Indeed, if any media outlet thought they were going to edge Trump out by strategies like canceling Miss Universe… well, he’s certainly shown them.

Philosopher Slavoj Zizek writes that thumbing one’s nose is traditionally a phallic gesture, whose message “would appear to be a simple showing-off in front of an adversary: look how big mine is; mine is bigger than yours.” But Zizek explores the idea that the gesture is actually an imitation of the other’s member, so that the ulterior message becomes: “Yours is so big and powerful but in spite of that, you are impotent. You cannot hurt me with it.” Thus, the sneering flip-off-the-nose by even the stubbiest of thumbs represents a threatened castration. And any attempt by one’s adversary to attest to his own power “is doomed to function as a denial…. The more he reacts…the more his impotence is confirmed.”

This is perhaps part of the reason that Marco Rubio repeatedly lost so much ground when pitted against Trump—Rubio looked impotent, particularly in that cringe-worthy moment when he tried to out-Trump Trump, making his own lame joke about what the size of Trump’s hands might signify. This gestural vocabulary is also something that is not gender-specific in its power to wound. Indeed, Hillary Clinton may be at more risk than Bernie Sanders in this symbolic universe; it doesn’t take much to imagine Trump resurrecting much of the most poisonous imagery of her last run for president, when she was figured as a “nutcracker” and a “ball-buster.” We should gird ourselves for that: Donald Trump never hesitates to indulge in brutal phallic pleasures.

When it comes to presidential campaigns, many of us think of the debate format as having a longer history than it really does. The Lincoln-Douglas debates of 1858 are frequently invoked, but they were part of a senatorial rather than a presidential campaign. In 1948, there was a radio debate during the Republican primary, and in 1956 among the Democratic candidates. But the first real presidential debate wasn’t until 1960, when John F. Kennedy and Richard Nixon made television history. There would not be another until 1976; it has been a consistent, if diminishing, tradition ever since.

In 1858, Abraham Lincoln and Stephen Douglas, the incumbent senator, met for seven debates. The rules allowed each candidate one hour to present an opening argument and an hour and a half for rebuttal; then the first candidate to speak had a half-hour for a closing response. Today, each candidate has about two minutes to answer a question presented by a moderator; opposing candidates have one minute for rebuttal. Then it’s up to the moderator to decide whether to extend discussion by 30 seconds for each candidate. But even as those few moments shrivel into no time at all, we remain glued to the drama, the players exiting the stage accompanied by sound and fury and strobe lights—a magical sleight of hand, an awe-inspiring political disappearing act.

Yet Trump’s power at this point is such that even his absence hogs the spotlight, and his silence speaks volumes. There’s a powerful paradox at the center of this, for it is precisely Trump’s refusal to commit, to cohere, or even to materialize that seems to fill a gap in the field of Republican desire. It’s straight outta Kafka: Trump has power precisely because his vacuity is an empty screen upon which others may project a host of their own brutal pleasures. The void he leaves is generative, his absence signifies.



The New York Times Book Review, March 20, 2016
The Black Calhouns: From Civil War to Civil Rights With One African American Family
By Gail Lumet Buckley
Illustrated. 353 pp. Atlantic Monthly Press. $26.

In The Black Calhouns, Gail Lumet Buckley displays a particularly panoramic view of American society. Daughter of the legendary entertainer Lena Horne, she was raised among show-business royalty. But as the descendant of a privileged and lucky line of well-educated African-American professionals, she also grew up related to or knowing nearly every major figure in the movements for racial, gender and economic equality, from Reconstruction onward.

The name Calhoun is mostly remembered today in association with our ardently secessionist seventh vice president, John C. Calhoun, a fiery orator who fashioned his conviction that slavery was a “positive good” into the ideology of states’ rights. His nephew was Andrew Bonaparte Calhoun, a wealthy doctor who owned the slaves whose descendants include Buckley’s and Horne’s maternal line. This link between history’s white founding fathers and the slave families who carried their names into freedom is a story with which most African-Americans are all too familiar, but one that has remained remarkably suppressed as a matter of general public knowledge. Only in recent years have some stories come to light, such as Annette Gordon-Reed’s excavation of Sally Hemings’s genealogy and Essie Mae Washington-Williams’s revelation that Strom Thurmond fathered her by a black family maid.

To some extent, The Black Calhouns is a revisiting of Buckley’s 1986 biography of her mother’s lineage, The Hornes: An American Family. That earlier work focused on the personal lives of specific family members. This book is more occupied with the historical events and political movements that shaped those lives.

Written in the style of a sweeping historical novel, The Black Calhouns deals with broad themes of property and politics, duty and determination; it follows the family’s profound engagements with the founding of “missionary” schools that educated a few but not nearly enough of the new black citizens recently freed from slavery; the establishment of the Freedmen’s Bureau; the rise of lynching and Jim Crow; battles to vote, work, buy homes and serve in the military; the daily confinements of “blood,” color and phenotype. In the last chapter, Buckley turns a worried eye to the cyclical nature of such struggles and includes a caution about “21st-century Republicans,” whom she casts as “secretly 19th-century Democrats,” citing recent efforts to constrain voting rights, citizenship and the 14th Amendment.

That might sound polemical to some ears, but Buckley meticulously documents how many present-day racial and economic struggles are still framed by habits of thought that have changed little since the Civil War. This is not to say that there hasn’t been progress, but that the battle is so very slow precisely because the terms of debate have deep and often forgotten roots.

Remembering lessons that ought to have been learned long ago is hard and deceptive terrain. One of the enduring costs of racial segregation—either de jure or de facto—is how knowledge itself has been segmented, pieces of the puzzle sealed away within subpopulations, so that privilege and pain might never meet. If there are those who don’t understand the complexities of current student debates about the significance of buildings named for Woodrow Wilson at Princeton and John Calhoun at Yale, the intimate history in this book is unequivocal: President Wilson actively despised black people and counted Thomas Dixon, author of The Clansman, upon which the film Birth of a Nation was based, among his close friends. The federal government had been integrated since Reconstruction, but Wilson, determined to put blacks in their place, resegregated all jobs, freezing thousands out of the public job market. This massive and traumatic expulsion into unemployment not only dashed the aspirations of the author’s ancestors but also signaled a virulent uptick in the spread of Jim Crow laws throughout the land. It is one of those historical turning points that are remembered to this day among many African-Americans but remain nearly invisible to most white people.

Indeed, The Black Calhouns makes for particularly interesting reading against the backdrop of today’s culture wars, from Donald Trump’s disingenuous claim not to know anything about white supremacy to efforts in Texas to cut all mention of Jim Crow and the Klan from social studies textbooks. In the 1930s, as Buckley reminds us, Senator Theodore Bilbo of Mississippi wanted to send all blacks—not full citizens in the eyes of most white Southerners—back to Africa. The rhetoric was remarkably similar to some present-day calls to expel all Mexican or Muslim migrants. And in the 1940s, Lena Horne’s scenes were routinely cut from the movies in which she appeared when shown in the South. (There were only two roles for blacks that Southern states would accept for distribution: servant or jungle “primitive.” Horne refused to be cast as either.)

This is history from the inside. Buckley’s family was cosmopolitan, well educated and well placed. They interacted with a cross-section of American trendsetters, policy makers and cultural icons: W.E.B. Du Bois, Paul Robeson, Hattie McDaniel, the Tuskegee Airmen, Frank Sinatra, James Baldwin, Robert Kennedy, Gene Kelly, Humphrey Bogart, to name but a few. Headlines often unfolded in their living room.

Buckley charts the generational branches of black Calhouns painstakingly, as though making up for the lost stories of so many other African-Americans left on the cutting room floor. There is an insistence in her meticulously detailed recollections: We were here! We were there! Do not forget!

But we have forgotten, over and over. The Black Calhouns is a comprehensive reminder of how, even when not immediately visible, the burden of racial trauma is carried deep within the body politic. With so much of our collective national experience consigned to oblivion, we tread unknowingly on the graves of those whose lack of accorded dignity echoes with us yet.


Half-Time in America

by Patricia J. Williams
The Nation Magazine, February 24, 2016
Midway through February, poised between an unusually controversial Super Bowl and an unusually controversial Academy Awards ceremony, I sit listening to the presidential primary debates in South Carolina. Twitter is a jangled mess: Donald and Beyoncé, Peyton and Hillary, JLaw and Cam. It’s as though you could take any given sentence uttered by any one of them at any time in their lives, jumble it in a Vitamixer, place a spoonful into the mouth of the next, and it would make as much sense: “Anyone who thinks my story is anywhere near over is sadly mistaken.”[1] “I get nervous when I don’t get nervous.”[2] “Pressure is something you feel when you don’t know what the hell you’re doing.”[3] “I have a million ideas. The country can’t afford them all.”[4] “Why would I ever get cocky? I’m not saving anybody’s life.”[5] “You have to live like a winner. You have to think like a winner. You have to eat like a winner.”[6]

Somewhere behind the theatricalities of half-time in America lies the broad playing field of real-life (if not “reality”) America, with its landscape of worsening resource inequalities, unattended infrastructure, and simmering racial resentments. Is there nothing that can connect our national cinematic appetites with the urgent need for serious political address?

Enter a small jewel of a film entitled “Last Day of Freedom.” Available on Netflix, it has been nominated for an Oscar in the poorly-publicized category of Best Documentary Short Subject. Lest it slip under your radar, therefore, let me press its merits. (Let me also reveal that although I served as a consultant for the multi-platform series of which this film is part, I had nothing to do with its making. That larger project, entitled Living Condition, examines aspects of the criminal justice system through the eyes of families with a relative on death row.)

“Last Day of Freedom,” an exquisitely rendered 32-minute animation, presents the life of Manny Babbitt, a homeless, mentally challenged, traumatized Vietnam vet who, in a PTSD panic induced by oncoming headlights, ran off the street and into the home of Leah Schendel, an elderly woman, whom he struck repeatedly. She died of a heart attack at the scene. The film is narrated by Manny’s brother Bill, who turned him in to police—“I had to be responsible”—hoping against hope that he might be hospitalized and get treatment. Instead, he was defended by a drunken lawyer who called no witnesses and excluded “niggers” from the jury. Manny was executed in 1999, on the day before his 50th birthday.

The story is a sad but familiar one from a statistical perspective: Manny was swallowed in a vortex of legal, political, and social failures. Injured in a car accident when he was 12, he had suffered a traumatic brain injury that severely hobbled his intellectual abilities. He joined the Marines despite failing the eligibility test—and became one of the hundreds of thousands of men who were drafted during the 1960s despite not meeting basic mental or physical requirements. Nicknamed “McNamara’s Moron Corps,” they were conscripted as part of President Johnson’s back-door attempt to have sufficient numbers of soldiers on the ground in Vietnam. Manny served several tours of duty in Khe Sanh, enduring some of the bloodiest battles of the war. By the time he returned to the States, with shrapnel embedded in his skull, “he was chasing shadows”—paranoid, homeless, and unable to hold a job. His brother Bill took him in, but in those days the symptoms of PTSD were not widely discussed or recognized. Manny went untreated.

The poet Shaul Tchernikhovsky once wrote that “A person is only the outline of his native landscape.” Manny Babbitt’s outline was etched upon a moral landscape oblivious to the consequences of concussion, ignorant about mental illness, unresponsive to the needs of its war veterans, and gung-ho about the death penalty. Still, what makes the telling of this particular life truly exceptional is its literal as well as figurative perspective. Because the story is recounted by the condemned man’s brother, it foregrounds the generally invisible grief suffered by families of those on death row. “They say he was a monster,” mourns Bill. “I see a little brother.” He describes the death chamber with quiet, heartbroken exhaustion: “…the press was at Manny’s head. I was standing at his feet…I see my brother lying there in his white socks. My family back home was in agony…” At the same time, the film is quite careful not to understate or disrespect the bereft family of Leah Schendel, who were also present at Manny’s execution. Bill, who used to believe in the death penalty “before it came knocking on my door,” observes them with choked sadness: “…They were victims. They suffered a terrible loss. But we’re all partners in this experiment. We’ve all got blood on our hands now.”
In addition to its compelling narrative baseline, the visual artistry of “Last Day of Freedom” is stunning. Co-produced and co-directed by artists and filmmakers Nomi Talisman and Dee Hibbert-Jones, it is all the more remarkable for being the first film Talisman and Hibbert-Jones have made on their own.

The more than thirty thousand stills were drawn by hand. The lines are fluid, spare, unflinchingly intimate. This rendering is somehow infinitely more affecting than if it had been done as a straightforward, on-camera-style interview. (Indeed, I found this documentary much more powerfully convincing on the subject of traumatic brain injury than the movie “Concussion”—though I’m not weighing in on whether “Concussion”’s cast should have been snubbed. Worse acting than its is surely in contention….)

Photography sometimes invites us to forget that we are looking at a re-presentation, a reframing, a re-interpretation.  As Roland Barthes put it, any image is, by definition, “that from which I am excluded…The images from which I am excluded are cruel, yet….I convert my exclusion into an image. This image, in which my absence is reflected as in a mirror, is a sad image.”

Animated film is a genre we too often associate with children’s fare:  upbeat, reductive, unremittingly colorful.  But “Last Day of Freedom” uses simplicity of form to very different dramatic effect. Drawn almost entirely in black ink on white background, it deploys subtle visual cues: a few foggy smudges, a bit of play with font and thickness of lines, the rare dollop of red. This works to filter out distraction somehow.  The economy of line focuses attention, like haiku. It is as though we are inside Bill’s head, quietly re-living a nightmare, the rise and fall of his voice summoning only so much of the dream as he can bear.  This restraint in presenting the mere outline of his memories makes Bill’s story not merely personal, but politically haunting; not just tragic, but evocative of universal human complexities.

Watching this film will remind you of all that remains unsaid—on all sides—in our current political food fight; hopefully it will push us to insist upon debate that addresses the actual consequences of a governance bewildered by, and thus held unaccountable for, its flaws.



Restrain Guns, Not Tongues….

The Nation Magazine, February 22, 2016
“Prior restraint” used to be a fairly well-defined concept, particularly in the area of First Amendment jurisprudence. It was generally accepted that we do not punish ideas—what someone reads or says or thinks—unless those ideas threaten to depart the realm of mere ideas, becoming a “clear and present danger.” Two significant forces are converging to compromise that settled law, both in the U.S. and abroad. The first is the rise of global fear about terrorism. The second is the enormously complex communicative power of the internet.

Many of us in the legal academy have spent the last few decades of the so-called culture wars debating the definition of dangerous speech in traditional media and in new forms of social media. Those debates have been largely focused on books like Mein Kampf, or voices like those of Cliven and Ammon Bundy, or suggestive images like Sarah Palin’s rifle cross-hairs over the faces of her political opponents. We have generally arrived at a sort of free-speech absolutism, and at the sunny cliché that hate speech must be met with more speech. Threats of insurrection have always required more than easy bromides, but nevertheless, it is startling to see how much the contours of the debate have changed recently. Harvard Law Professor and former Obama administration official Cass Sunstein has asked whether it’s time to reject the “clear and present danger” test in favor of one that suppresses “explicit or direct incitement to violence, even if no harm is imminent.” University of Chicago law professor Eric Posner has gone much further, proposing a law that would make it illegal even to read websites that “glorify” the Islamic State or to share links to such sites.[1]

Many of the restrictions now being proposed are directed specifically against the Islamic State—a threat less serious for being a “state” than a state of mind. Posner worries about the persuasive appeal of such sites to the “naïve.” But how do we distinguish the naïve from those who wish to be informed about a major global phenomenon? Is danger less clear and present when extremist ideas are home-grown and packaged as Christian? Most importantly, who are the gatekeepers not just of what’s dangerous to publish, but of who gets punished for reading it?

Many parts of Europe already deploy more stringent regulations on incendiary speech. We may recall that, in the wake of the terrible Charlie Hebdo massacre, the French came together in one of the largest demonstrations in history, dedicated to freedom of expression and resistance to censorship; but France has also long had some of the most restrictive speech regulations in the industrialized world. Moreover, a new French surveillance law allows internet monitoring, phone bugging, and secret home invasion, for vaguely described reasons ranging from “organized delinquency” to “major foreign policy interests.” Administration of the law is overseen by a nine-person advisory committee, but it is the prime minister who has ultimate decision-making power.

That prime minister is Manuel Valls, whose stances—against Muslims, migrants, trash-talking comedians, and Roma children—have been controversial and divisive. Valls used to be the mayor of the town of Evry; while in that role, he was captured by a television news crew striding across the town plaza, through a pleasant-looking throng among whom were a number of black people. Valls, annoyed, complained that their presence detracted from the footage and called, in three languages, for white faces to be more prominent: “some blancs, some whites, some blancos.”[2]

Recently, Valls was scheduled to attend a meeting at the University of Avignon. In response, Bernard Mezzadri, a classics professor there, wrote his colleagues a mocking message in an internal email: “I hope that upon this great occasion…there will be present sufficient numbers of ‘blancos’ (without too many of the tanned persuasion), so as not to project too bad a picture of our institution.” The president of the university reported the message to the local constabulary. The prosecutor then pressed charges against Mezzadri for public incitement of racial “discrimination, hatred or violence.” The case has sparked widespread protest in France.[3]

If this prosecution seems silly to some of us, it is because Mezzadri’s message is so clearly sardonic. The gatekeepers seem to be exhibiting some fundamentalist tendencies of their own: indeed, the journalist Éric Fassin has written that it almost seems like a resurrection of pre-revolutionary law, when charges of “blasphemy” could be brought against those who dared to mock the king.[4]

Mezzadri’s case is an object lesson in why “emergency” restraints in a time of “perpetual” emergency and “endless” war—whether France’s laws or the dark, unexplained operation of our own USA Patriot Act—are rife with translational dangers, whether attributable to carelessness, ignorance, or abuse.

But the question of speech as imminently threatening or incendiary is even more complicated in the American context, where the right to bear arms has been deemed expressive. Consider the situation of Professor Steven Weinberg, a Nobel Prize–winning physicist at the University of Texas at Austin. He recently said he would close his seminars to anyone carrying a firearm, fearing that guns in the classroom chill discussion. For this, he is vulnerable to lawsuit under Texas’s new “campus carry” law, which goes into effect on August 1, 2016.[5]

Meanwhile, there have been demonstrations on the Austin campus pitting “campus carry” against another Texas law, one that forbids individuals from displaying or distributing obscene materials.[6] Thousands of students are coming together to protest guns on campus by attaching “gigantic swinging dildos” to their backpacks. The logic has been summed up thus: “You’re carrying a gun to class? Yeah, well I’m carrying a HUGE DILDO.” Dildos are, as organizer Jessica Jin points out, “just about as effective at protecting us from sociopathic shooters, but much safer for recreational play.” A veritable jouissance of expressive freedom may be found at #CocksNotGlocks. Have a look while it lasts, before it’s a-prior-ly restrained. In the effort to keep ourselves safe, it seems somehow easier to think of tying tongues than taking guns.

