
Detourette’s Syndrome

July 18, 2014

When I am not writing, I often think about my reasons (read: excuses) for not writing. The seriousness of work and the pettiness of life–and vice versa–have displaced my creative habits. The fact that I chose to start this piece with an admitted cliché about writing underlines the point.

Even when I am ostensibly writing, it’s easy to question the vague status and worthiness of creative ambition. How have we spent our time? How does time continue to slip through our fingers even as we put that rapidly disappearing commodity under the microscope of self-reflection and admonish ourselves to do better with that finite resource? Is it better to have spent a life gathering satisfaction and enjoyment or creating something distinct from oneself? To me, these feel like ethical, existential questions, not just idle abstractions.

Distractions both fleeting and life-encompassing appear in the rear-view mirror and as detours beckoning in the distance. One hears the conventional wisdom that you’d be a sucker to leave the highway and take the turn into a tourist trap meant to separate fools from their money. Yet somehow you wonder whether you’re on the right road or if the detour will lead you where you’ve been meant to go all along. There is no map to consult.

The biggest question of all now appears: are the detours keeping you away from the road of life, or are they life itself?

To put what I’m pondering more directly, are all of the games, videos, memes, TV shows, movies, and even books just distractions from a more creative or productive life? Is a domestically satisfying life a distraction from a creatively satisfying one? Is there a point where one needs to say, “Ok, I’ve learned and consumed enough; now, it’s time to create something”?

None of these questions imply any value judgment on what constitutes worthy creative activity. Maybe you get all of your creative rocks off in the process of earning a living. There are certainly those mythic and noble beasts who do, and there’s a reason one of the most quintessential pieces of job-related advice is to “do what you love.” The problem usually lies in either an information gap regarding the kind of work a given career actually entails (e.g., being a lawyer isn’t all dramatic cross-examination, frequent televisual dramatizations notwithstanding) or a wage gap (i.e., doing what you love doesn’t pay for all the other stuff you think you need/want), and most people locate their problems in the latter. But that view assumes you can satisfy all of your basic, human, and existential desires through wages in general. That returns us to the question of whether the “distractions” are really just life itself.

If the human experience is any indication, the most common answer to this conundrum has been that existential needs may be satisfied by a rich family and/or social life, sprinkled with a dash of hobby and infrequent travel. To say that one satisfies one’s genetically ingrained creative impulses by literally creating life is unassailable. But what of the life of the mind, separate from the life of the body? Doesn’t it, too, need to pass itself on to future generations?

So we are caught between two poles: on one side, the life of bodily pleasure, comfort, security, and genetic influence; on the other, the life of creative, productive, and intellectual influence on more than just what we can immediately reach.

By all means, marry. If you get a good wife, you’ll become happy; if you get a bad one, you’ll become a philosopher.
–Socrates

Never have I understood that quotation better.

I’m not certain that existentialism offers an answer to this question other than to say that there isn’t a dichotomy at all if you properly understand yourself and your values. But then, Nietzsche would have said that the free spirit is the one who creates his own values. Maybe the question is further reducible: “know thyself,” “the unexamined life is not worth living,” etc.

But I also think that our modern context bears significantly and differently on the issue of distraction. Maybe it’s just my personal experience, but I wonder whether we are approaching something of a “distraction singularity.” Specifically, we have so much stimulation available at all times, and so much good content being produced all the time, that our in-the-moment decisions about how to maximize our time may be sliding towards consumption rather than creation. If there were a million content producers at the excellence level of William Shakespeare constantly producing YouTube videos that were functionally free to consume and accessible enough to appreciate for what they were, would we have the ability to hold back? The Distraction Singularity, then, is the point at which distraction and fulfillment become one and the same. In economic terms, the supply curve has shifted so far out that demand is perhaps limited only by our need to subsist. But then again, is there a plausible future where simply consuming free or near-free entertainment, combined with an acceptance of the lowest social safety net for subsistence purposes, is preferable to some?

Ok, maybe the Infinite Jest hypothesis is a slight stretch (for now), but doesn’t that description of the direction of history ring true, even if the diminishing returns of the distraction-demand curve keep us from running off the cliff? Am I just an old man (30) complaining about these damn kids with their cell phones and their YouTubes?

It is probably unfair to accuse this generation of behaving any differently from any other one; if anything, millennials manage the flood of stimuli admirably considering the shift in the supply. The only alternative to embracing the profusion of distractions would be to demand artificial ignorance of the levels of supply and demand in favor of some non-market-based determination of how to make oneself happy. Such prescriptivism is directly opposed to self-determination. And all this just reinforces the Founders’ wisdom that freedom and democracy take work. But we have so much TV to catch up on.

It is nonetheless fair to say that these distraction dynamics provide people with more reason to avoid ever having to engage in civic duties or the maintenance of the body politic. That’s the rational ignorance hypothesis, and it has existed since long before the Internet. I am not arguing that there is an evil conspiracy setting these dynamics in motion, but maybe the Distraction Singularity has the unintended consequence of facilitating oligarchic influence by creating competing outlets for the attention of the people. Those people, in turn, neglect their full rights of participation as citizens of a democracy. Of course, maybe the great majority of people have always wanted ignorance and bliss, and now the really good blinders are within reach. Who is to say whether the people are the winners or losers in that fight?

Grand Babushka Hotel

April 16, 2014

In case you haven’t seen it, Wes Anderson’s newest movie is a Wes Anderson movie. Which is to say that it has so many hallmark, camera-winking elements of a Wes Anderson movie that Grand Budapest Hotel is almost a parody of a Wes Anderson movie. You can certainly play a brisk game of Wes Anderson Bingo while watching.

Specifically, Grand Budapest Hotel exhibits many of what are now trademark Andersonian actors, quirks, and nods. Indeed, Anderson’s oeuvre grows like a snowball, picking up bits and pieces (i.e., actors and themes) as the movies themselves accumulate. Of course, Anderson’s filmography is most obviously and notoriously replete with his signature cinematographic style: everything obsessively centered, designed, and stylized.

The camera pans are steady. The set pieces are just that, often more evocative of a staged play than a film. The style is too often described with the word “twee” by folks seeking an appropriate context in which to use the buzzword, like people who assign the label “hipster” to anything created outside of the mainstream. A “dollhouse” aesthetic rings truer to me. And that term also seems to strike closer to the heart of the matter.

Then there’s the cast of familiar faces. Andersonia started with Bottle Rocket at its core, with Owen and/or Luke Wilson recurring in just about every Anderson project. Then, Rushmore added the recurring appearances of Jason Schwartzman and Bill Murray, along with a slew of bit parts (e.g., Brian Cox, Seymour Cassel, Kumar Pallana, Waris Ahluwalia, et al.) that pepper themselves throughout each Anderson outing, practically a part of the crew. The Royal Tenenbaums brought Anjelica Huston into the fold. The Life Aquatic with Steve Zissou added Michael Gambon, Jeff Goldblum, and the consistently creepy Willem Dafoe. The Darjeeling Limited saw Adrien Brody work his way into the Andersoniverse, and Moonrise Kingdom brought Edward Norton and Tilda Swinton (along with Bob “I-know-him-but-I-don’t-know-where-from” Balaban) onto the call sheet. Add all the aforementioned together with a new star lead (as per the custom), and you get The Grand Budapest Hotel.

Anderson’s themes also recur more consistently than cocaine in a Scorsese movie. Every single Anderson film, even including Anderson’s adaptation of Fantastic Mr. Fox, can be construed as an exploration of children behaving like adults and adults behaving like children: from Max Fischer’s rivalry with Herman Blume, to Ari and Uzi’s corrective upbringing by Royal Tenenbaum, to Steve Zissou’s conflict with the real world, to the Darjeeling brothers seeking forced enlightenment (contra their mother absconding from their lives entirely), to the obviously too-adult affair of Sam and Suzy in Moonrise Kingdom. The Grand Budapest Hotel is no exception. In fact, it is perhaps the capstone of Wes Anderson’s previous layers of construction and foundation.

It would be frivolous to claim that The Grand Budapest Hotel is a culmination for Anderson simply because it incorporates these elements of a Wes Anderson movie, even though it does in spades. Even the nonchalance and unsubtlety with which these elements are almost shoe-horned into the movie can fairly be described as Andersonian. However, the nested storytelling/framing devices that bracket the core narrative of The Grand Budapest Hotel suggest that there is something significant to the recursive cacophony.

To break it down into its matryoshka-esque storytelling layers, The Grand Budapest Hotel is a (1) film that opens on (2) a present-day girl reading a book called The Grand Budapest Hotel written by “The Author,” who (3) is portrayed by Tom Wilkinson and explains to the camera how (4) his younger self, played by Jude Law, traveled to the aging Grand Budapest in 1968, only to meet and dine with Zero Moustafa, played by F. Murray Abraham, who in turn relates (5) the story of Gustave H. and the Grand Budapest in the years between World Wars (not to mention (6) the expositional asides by Gustave H. to Zero explaining his relationships with the various guests of the Grand Budapest), which is the heart of this narrative turducken.

But then, the use of a storyteller as a framing device is another Anderson trope. Rushmore opens with curtains drawn to reveal the set of this play masquerading as a movie; The Royal Tenenbaums is portrayed as a children’s book narrated by Alec Baldwin; The Life Aquatic with Steve Zissou is titled, framed, and shot like one of the titular character’s adventure movies, down to the daring rescue and discovery of extraordinary marine life; and Moonrise Kingdom is practically reported by a historian/narrator, as though for a History Channel special on the most interesting story in the small island’s history.

With all of these storytelling layers, the potential for unreliable narrators compounds. The audience considers whether a narrator is lying or embellishing their slice of the story. And the stories are almost always created by the person with the most childlike point of view in Anderson’s films. Arguably, Anderson employs this perspective to reinforce the point that it is the capacity for wonder and enthusiasm that makes a good story. The quality that makes these children sympathetic protagonists is their immunity to irony and ironic detachment. Their enthusiasm makes them fun and engaging.

So even if the unreliable narrators led us to conclude that the plot of The Grand Budapest Hotel was made up (even within the universe of the film), the audience must also ask how else anyone would have access to the glorious and delightful history of a landmark like the Grand Budapest. The audience’s present-day perspective views the hotel as a monument in ruins, without the animated liveliness that serves as complement and counterpart to Gustave’s brilliant efficiency. Anderson seems to say that only the authentic power of storytelling (in its myriad traditions: film, book, radio, dinner conversation, etc.) can preserve such larger-than-life memories with genuineness, even if the audience must accept the risk that some of the stories are simply tall tales.

The cute, childlike, and even Wes Anderson-y set pieces and artifice serve to remind us that “this is a story, don’t take it too seriously.” The fact that the stories come from precocious children seems a clever way to disarm the hip detachment, that hallmark of adulthood, which would normally dismiss an exaggerated fable out of hand before enjoying what it has to offer. Maybe that is for the best. Maybe the defense mechanism has served its purpose. After all, how else do you get a hipster to admit something is just grand?

Rival Video

March 14, 2014

By now, it is probably no news to you that President Obama was on an episode of Between Two Ferns with Zach Galifianakis. Not only did “President Barack Hussein Obamacare” (as the character “Scott Aukerman,” played by Between Two Ferns director Scott Aukerman, calls him on his podcast, Comedy Bang! Bang!) plug the Affordable Care Act and Healthcare.gov, but he also played along and adopted the tone of the show with some impressively casual adeptness. More than just a sense of humor, Obama displayed savvy and an understanding of the way certain segments of society now communicate.

Specifically, this spot is simply the latest example of Obama embracing democratic means of communication. The Internet has empowered otherwise disenfranchised and atomized voices by providing a focal point through which they can find one another and unite. Sometimes these groups are people without enough at stake in the establishment to wade through the “muck” of daily news reporting. Newly minted 18-year-olds and working-class voters have enough daily travails that they rationally disengage from politics because the potential benefit of learning the nuances of deliberately complicated policy questions is minimal. 

However, those same people don’t have to feel like their time is wasted if they are engaged in their own preferred manner, e.g., through a viral video, the new dominant medium of cultural exchange. Or maybe on a late night television show. Or maybe with memes, e.g.,

[Image: Another president who effectively used humor]

What makes Obama’s appearance even more sensible is that this particular policy issue, enrollment in an insurance exchange under the Affordable Care Act, requires participation from the most politically disenfranchised people in America. These younger segments of society needed a reason to get informed, let alone enrolled, and if Obama’s semi-humorous appearance in a five-minute video did that, then that is an effective use of the bully pulpit. He certainly wouldn’t have reached the same audience on Meet the Press. Indeed, Healthcare.gov got a 40% bump in daily traffic after the stunt (with millions of views over the next several days), leaving Obama with the last laugh.

By the same token, this isn’t Funny Or Die’s first foray into the realm of the overtly political. They’ve actually had some pretty good sketch/ad hybrids for a while now. And people seem to forget that President Obama likes being funny in a relatable way (e.g., Correspondents’ Dinners, his late-night appearances with Jimmy Fallon and Jay Leno, etc.). So, while Obama’s appearance on Between Two Ferns was unexpected, it wasn’t exactly a surprise. It was the next logical step for reaching the portion of the electorate that matters for the Affordable Care Act.

The fact that some are decrying this particular advertisement (which is what it was, let’s face it) says less about the video itself and more about the reactionary dynamics that are now well-established in the old media. Fox News would say the sky is green if it scored points with folks predisposed to disliking Obama, and all the more so given that the Affordable Care Act (or, if saying Obama’s name is enough to make your blood curdle, read: Obamacare) is the subject matter. But these shrieks are incoherent in an evolving democracy. The electorate needs to be informed and educated, regardless of the medium. To hold politics itself above hoi polloi is to limit access to the political process to those with vested interests or to those with the luxury of time to spend learning about politics.

And Obama is hardly the first president to appear on a new medium to communicate his position.

Just as Kennedy beat Nixon through his legendarily telegenic debate appearance, Obama has simply been making effective use of new media that aren’t going anywhere.

It’s too soon for us to judge whether Obama’s fondness for non-traditional media is an anomaly; complaining that Obama spends more time on internet video than his predecessors is a little like complaining that Calvin Coolidge spent more time on radio addresses than Warren Harding.

But the one thing that is missing from the national discussion of Obama’s “new” media approach is how safe this move was for Obama. Obama’s true media revolution came during his presidential campaigns, the recent NSA scandal notwithstanding. His campaign used hyper-local and hyper-specific data about voters, collected from the online surveillance apparatus, to make decisions like which celebrities to invite to which events, and when and how often to hit voters up for fundraising dollars without sounding desperate.

So Obama went on a well-established web series hosted by a household celebrity name (one that might be difficult for Fox News commentators to pronounce) who happens to be just slightly edgy and can function like a punching bag for a president on the offensive. Not much of a risk. On the contrary, Between Two Ferns was the right platform for Obama to reach out to the 50% of eligible voters who stay at home every election. The alternative, digging in his heels for political point-scoring, would have resulted in the same outcome: Democrats cheer, Republicans boo, and America is stuck between the two.

A Sheep in Wolf’s Clothing

January 12, 2014

Critics have described The Wolf of Wall Street as Casino meets GoodFellas but without the violence (if you decline to classify visceral drug and sex abuse as self-inflicted violence). In the long tradition of Scorsese’s villain-as-protagonist movies, The Wolf of Wall Street depicts hunger for power, money, drugs, and sex as zeitgeist more than as a character flaw. The movie seems to presume that you desire the same things as Jordan Belfort. Belfort, as narrator, asks who wouldn’t want to be rich and powerful, vacuuming up mounds of cocaine and other drugs with impunity. Never mind that such success requires outright contempt for the countless dopey would-be life-savings investors from middle America whom you victimize in order to make your nut. This is America! No, this is Wall Street! Not only is such behavior rewarded, it is required just to stay afloat.

While the comparisons to Scorsese’s other crimelord movies are apt, the movie reminded me more of Inglourious Basterds (and if you read my review of that film, you know where I’m going with this). Both movies are told from the point of view of monsters that the audience is urged (not “intended”) to sympathize with, and not just because they are the focal point of the film. Audiences like larger-than-life rascals having fun. Especially when those rascals win. In both Inglourious Basterds and The Wolf of Wall Street, the audience attaches itself to the gleeful (there is no other word) abandonment of anything resembling morals. In Inglourious Basterds, the protagonist Americans’ amorality (scalping, torture, refusal to abide by the rules of war) is purportedly justified because they seek to inflict fear and revenge on a greater immorality. In The Wolf of Wall Street, the justification is beating Wall Street at its own game, only with more ridiculously outsized and outlandish excesses. We are reminded that Belfort’s conquests are justified because this is Wall Street, the heart of darkness itself. The degree of competition–in and of itself, we are led to believe–justifies Belfort’s explicitly criminal behavior. The spoils of victory whet our appetite for power even as “what it takes” should make our guts churn.

Curiously, critics are divided over whether the movie succeeds, based on the question of whether Scorsese is too in love with Belfort and the metaphor of excess. They compare Scorsese’s own cocaine addiction to Belfort’s money addiction (yes, money is the chief drug in Wolf), and find a through-line of “excess” in Scorsese’s antihero movies. But with its nonstop depravity, some thought that Wolf celebrated the excess too fully, overwhelming and smothering the audience with lewdness and lude-ness rather than counterbalancing it with some measure of moral distance and critique. They are annoyed that the only clear sign of rebuke comes when Belfort finally puts himself and his family in physical danger, rather than when he is fleecing thousands of amateur investors out of their retirement savings.

These critics may have been frustrated that the film didn’t paint Belfort as enough of a villain to give them their money’s worth of catharsis. Or they may be anticipating the inevitable misinterpretation of the film’s thesis by audiences who idolize antiheroes in the vein of Scarface or the Godfather because they only read the surface level of the narrative. E.g., Wall Street bankers themselves, emitting hoots and cheers as Belfort treats his clients with open contempt or covertly warns his partner that he is wearing a wire.

DiCaprio has compared Belfort’s story to Caligula’s, and that undoubtedly rings true. The difference to a modern audience is that, whereas Caligula’s dalliances and excesses appear patently repugnant and amoral, Belfort’s addiction to money hits a little closer to home. Belfort resembles a few too many people whom society has put on a pedestal. These “masters of the universe” seem to act with a little too much impunity these days, and Belfort’s story isn’t exactly reassuring. But to conclude that Scorsese himself is too in love with Belfort and his excesses to maintain the necessary critical distance seems to miss the point.

If we give Scorsese even a shred of credit (though maybe not for his editing skills), Wolf of Wall Street starts to look more like Conrad’s Heart of Darkness than GoodFellas or anything else; the interesting question lies in whether our morals are coherent in the first place. Like Conrad’s Kurtz, Belfort is a “prodigy” and a “genius” at producing previously untold profits, though something seems rotten to anyone who takes a second to consider his operation. Of course, getting so far ahead with such speed in a purely competitive system should raise a red flag for anyone who’s taken Economics 101. Profits are supposed to approach zero in a market where competitors are able to act on equal footing (i.e., within the same legal confines). Both Wolf of Wall Street and Heart of Darkness (are the titles symmetric?) postulate the thesis that money itself is amoral, and that the first person to apply that amorality to the means of production will be rewarded. The Belgian colonialist investors in Heart of Darkness praise Kurtz for his ingenuity, turning a blind eye to the fact that Kurtz has deified himself in order to obtain unquestioning obedience and the willingness to perpetuate atrocities for his productivity gains. And the money-hungry audience of Wolf of Wall Street is put in a position to turn a blind eye to the poor saps who lost their shirts to the likes of Belfort because look at how rich he got. Returns to criminality are much greater than they are to legal behavior, and the audience fetishizes the end result far more than any consideration of the means. Those who live in late-stage capitalism have seen too many depravities to care about means anymore. We’re down to “get rich or die tryin’.”

In this light, the lack of moral anchors in Wolf of Wall Street is the disease, not a symptom of weak film-making. Affluenza has taken root in America, after all. The fact that Belfort’s victims are out of sight, out of mind is consistent with the capitalist zeitgeist on the subject. The wealth of Wall Street is equated with the health of the economy, never mind whether that health is sometimes derived from cannibalizing and subjugating everyone else. The only important consideration is that America remains on top, and that anyone can vault themselves into the upper echelons (even if they have to do some jail time as penance). What moral anchors remain serve only to note that aspirational working-class citizens have duped themselves into preferring a society that serves the most powerful.

Jonathan Haidt, one of the preeminent analysts of socio-political psychology, attributes this split political personality to appeals to divergent moral considerations.

Despite being in the wake of a financial crisis that – if the duping theorists were correct – should have buried the cultural issues and pulled most voters to the left, we are finding in America and many European nations a stronger shift to the right. When people fear the collapse of their society, they want order and national greatness, not a more nurturing government.

In short, people don’t want to cry for the losers, they want to cheer on the winners. And when the only real “hero” of Wolf of Wall Street is Special Agent Denham, the unimpeachable FBI agent who is rewarded with a subway ride home as Belfort survives the pathetic legal fallout that ensues, it’s no surprise that the audience–and the zeitgeist in general–views Belfort with greater admiration. As one reviewer put it:

“Wolf” starts with a Fellini-like party on the floor of Belfort’s firm, then freeze-frames on Belfort tossing a dwarf at a huge velcro target, literally and figuratively abusing the Little Guy. The traders get away with their abuse because most people don’t see themselves as little guys, but as little guys who might some day become the big guy doing the tossing. “Socialism never took root in America,” John Steinbeck wrote, “because the poor see themselves not as an exploited proletariat but as temporarily embarrassed millionaires.”

In Haidt’s framework, we care more about a convenient victory narrative than the sanctity of the legal system or the attenuated harm to unsophisticated investors because we Americans want to be in the position of victory some day. The same rationale explains why Americans approved of the Bush tax cuts. Indeed, aspirational cognitive dissonance is not only a product of our time. It seems to be inherent in the (Western) psyche. Shakespeare described exactly the same situation in Julius Caesar (and the larger quotation is illuminating):

Why, man, he doth bestride the narrow world
Like a Colossus; and we petty men
Walk under his huge legs, and peep about
To find ourselves dishonourable graves.
Men at some time are masters of their fates:
The fault, dear Brutus, is not in our stars,
But in ourselves, that we are underlings.

Indeed, Scorsese seems to say, the fault is not in the stars of Wall Street. How could they act any other way? The fault lies in ourselves for failing to apprehend that Caesars will run amok if we let ourselves continue to act as subservient underlings. We have only just created the Consumer Financial Protection Bureau, a teething puppy charged with reining in just one facet of Wall Street’s abuses. And if Wall Street gets its way, it will remain a toothless pup for a long time. Even the FBI has rebranded itself to concentrate on national security, rather than law enforcement generally. Special Agent Denham is Scorsese’s Brutus with a butter knife, and we are left to wonder aloud why Wall Street executives have been let off the hook for fraud prosecutions in the wake of the financial crisis.

To my eye, Scorsese seems to have smuggled a morality tale about turning a blind eye into the theater, forcing us to bear witness. If we see Romulus raised by wolves instead of Julius Caesar, the fault, dear Reader, is not in our films, but in ourselves. “The horror! The horror!”

Blockbuster Revelations

November 18, 2013

Re-watching Children of Men, Alfonso Cuarón’s true masterpiece, during the government shutdown may not have been a good idea. Or maybe it was the best idea.

Relative to Children of Men, Gravity is a child’s tale. Gravity is a story about everything that could go wrong in a place that is principally known for its tight structure, protocols, and goals. Gravity’s dramatic tension arises from the juxtaposition of unplanned accidents and random contingencies against a need for careful planning. The stakes of Gravity are small, but potent. We relate to Gravity because we too feel like we would be similarly ill-equipped to deal with the rules of a callously indifferent universe if we were literally severed from a lifeline.

Children of Men, by contrast, explores what happens when one fundamental thing goes wrong and how the effects of that one disaster emanate outward into the rest of society. The stakes are about as large as can be, and the film lands no fewer emotional gut-punches for its sprawling focus. But unlike Gravity, where the audience implicitly understands that Bullock’s goal is to get somewhere safe, Children of Men’s audience (supposedly) does not come into the film with any preconceived end goal for what society should aspire to a few generations into the future. Society and government do not have pre-written teleological ends; the only stated mission of a liberal polity is to allow people to pursue their own ends of happiness, and maybe provide the environment or the means to achieve said happiness.

Through the narrative arc of Children of Men, the audience is put in the position of watching what it would feel like to tug at the loose thread in the corner and see what lies behind the woven veneer of so-called society. That we accept the Britain portrayed in the film’s opening scenes as a plausible hypothetical only to find that it is just the top layer of a much deeper dysfunction is unsettling and frightening, especially in this moment of history. It shows us just how capable we are of self-deception.

For me, Children of Men makes a good case for being the best the apocalypse genre has to offer specifically because of the authentic feel of the unfurling societal and governmental decay. The fact that the world feels intuitively realistic, but then reveals itself to be even more horrifyingly realistic than previously considered, serves to root out and confirm some of our deepest fears and anxieties. The references to riots and revolt in other countries, and to spreading chaos, touch deep-seated nerves. That the chaos inevitably proves to be much closer to home than originally thought makes the film feel apocryphal. That we recognize our own susceptibility to the kinds of thinking that lead the film down its path is what makes Children of Men terrifying.

Of course, it is these kinds of nightmares that Hobbes was considering when he postulated that a governmental leviathan would be necessary to allay all of the fears that humanity is capable of imagining. We know that humans are fallible. And we know that we live in an age where near-infinite harm could result from some human–all too human–mistakes. So when we are presented with a scenario where everything is at stake, we recognize in ourselves a tendency to reach for a security blanket, whether that blanket manifests itself as an information-addicted National Security Agency or a full-on totalitarian/military state. And as with many nightmares, there is a tendency to equate the end of American values with end-times or actual, biblical apocalypse. However, more than anything else, it is probably a reflection of American exceptionalism and self-centeredness that Americans cannot imagine the world going on without America.

It’s tempting to date this kind of eschatological thinking to 9/11. After all, when did the public utterly lose any ability to calculate the probability or impact of worst-case scenarios? When the actors on the world’s stage were the rational, predictable actors that characterized the Cold War, we at least knew where we stood relative to our enemies. Even if the doomsday machines themselves were terrifying, we could calculate and counteract what would set them off. There was a logic to mutually assured destruction because it was tied to concrete, worldly values. Now that the fingers on the innumerable and untraceable buttons have morphed into hands attached to people willing to give their lives to achieve other-worldly ends, we feel as though we can no longer trust in rationality to win the day.

So when presented with a powerfully direct and immediate example of just how wrong things can and will go (i.e., 9/11), the American leviathan rears its head. And consequently, in what might be termed the “Wikileaks Era” of American history, we find ourselves having to deal with a seemingly unending barrage of revelations (they’re always “revelations,” aren’t they?) about the extent of government militarization, surveillance, data mining, overreach, and impunity. Values as fundamental to the American founders as freedom from intrusive searches and seizures have been exchanged for marginal increases in security, and even those improvements are of questionable (if any) utility.

That these news stories are always characterized as dramatic “revelations” rather than “predictable outcomes”–indeed, the exact kind of outcomes that Hobbes anticipated–makes me feel like we’re living in a society at the precipice. America’s self-deception runs so deep that these developments felt like the worst conspiracy theories borne out as true when they were first revealed. But in retrospect, didn’t we know that these consequences we now live with were predestined by the new normal set in the wake of the PATRIOT Act and the rise of the surveillance state? Didn’t we know that we would react this way? Of course, but we didn’t think we’d actually have to muster the political willpower to do something.

Instead, we allow the inertia to continue to propel us along the same track. Instead of repealing the laws authorizing the surveillance state, we double down on them. Instead of welcoming whistleblowers with outstretched arms as a means of marshaling public opinion against our institutions in need of correction, we despair because we think that it would be a futile exercise.

That policy inertia is unlikely to change anytime soon in an environment where political capital is spent dealing with non-real issues that increasingly resemble the hyperbolic and ridiculous puppet shows described in 1984 or Fahrenheit 451 or Brave New World. When the government can be shut down as part of a sideshow tactic by a cult that desires the apocalypse, i.e., the Tea Party, any hope of legitimate discussion about uncomfortable choices is already lost. This millennium’s ostrich approach to politics created this cult of apocalypse out of a false premise that the world might stay the same.

What the understandably beleaguered citizens of this new modern order want is a pristine variety of America that feels like the one they grew up in. They want truths that ring without any timbre of doubt. They want root-and-branch reform – to the days of the American Revolution. And they want all of this as a pre-packaged ideology, preferably aligned with re-written American history, and reiterated as a theater of comfort and nostalgia. They want their presidents white and their budget balanced now. That balancing it now would tip the whole world into a second depression sounds like elite cant to them; that America is, as a matter of fact, a coffee-colored country – and stronger for it – does not remove their desire for it not to be so; indeed it intensifies their futile effort to stop immigration reform. And given the apocalyptic nature of their view of what is going on, it is only natural that they would seek a totalist, radical, revolutionary halt to all of it, even if it creates economic chaos, even if it destroys millions of jobs, even though it keeps millions in immigration limbo, even if it means an unprecedented default on the debt.

Apparently suicidal tactics and theocracy are not exclusive to our enemies.

Like the twist in any good movie, we find in ourselves something we despise in others. A movie’s protagonist usually grows from this realization; now the onus is on us to do the same. There may even be some reaction and growth from a world-wearier and savvier public. But the corrective measures we take will mean very little if we don’t grow a little more accustomed to checking for loose threads to see where they lead. Or is that starting to sound too much like a movie?

Nose Cut, Face Spited

November 7, 2013

Politicians in democracies like to make glossy statements like “the will of the people” or “the people have spoken” or even that they “represent the people.” If any such statements have been true in the last few months, then the shutdown may be read as “the people” declaring themselves unfit to govern.

Claiming to know “the will of the people” is a political tradition that dates back to well before Rousseau’s argument that the will of the people is the only legitimate basis of government. It’s a matter of rhetoric. It’s also a total red herring for anything beyond technical philosophical questions about basic governance, and even then it might depend on who you ask.

While one might think we could all agree that any tenable definition of “the will of the people” includes support for certain things like the rule of law and individual rights of political participation and self-determination, it’s nearly impossible to speak about what “the people” want when it comes to subjects about which there exists even an atom’s worth of room for disagreement. “The people” can and do disagree on fundamental political questions like the size, scope, duties, and worthiness of government in general. Some of “the people” don’t even want there to be a government at all, let alone one that has sufficient resources and wherewithal to do the things that other parts of “the people” want government to do.

Of course, the question of what “the people” want government to do manifests itself at every level of law: social contract, the Constitution, and federal and local statutes. Each step involves a level of filtration, division, and distance. There will always be disagreement, but what does that disagreement result in? A government or no government? Can there be a law that applies to everyone, even if that law is simply that the rule of law is supreme?

One idealistic baseline is that natural rights should not be abridged at all by giving a government the power to enforce laws to which one has not individually consented. This position seems to require that the governed provide explicit consent at least to the baseline constitutional law that enables the rest of the laws. Otherwise, the eventual threat of force that is the backstop of government authority is essentially tyrannical. That’s why this school of thought often sounds like it’s arguing a debate that was settled with Hobbes and Locke. (Note that I did not use the words “libertarian,” “objectivist,” or “anarchist.” This will be relevant later.)

Another school of thought is necessarily more pragmatic. It acknowledges that we can never actually garner 100% participation in any social contract. After all, there are new generations that cannot meaningfully consent until they reach the age of majority, and there are always those who will be holdouts for personal (possibly system-gaming) reasons. Instead, consent of the governed has historically been presumed on behalf of a just system, more or less as a philosophical shortcut.

That consent is presumed via constitutional law. We have a constitution that we histrionically enshrine in order to inculcate the appearance, and the actual willingness, of consent. Children are taught from a young age that the Constitution is basically infallible, that the founders were enlightened demigods amongst men, and that anyone who denigrates the Constitution is probably an anti-patriotic saboteur or worse. From this starting point, Americans begin with respect for the system created by these founders, and differences of opinion subsequently arise in the interpretation of how to carry out that particular vision. Those differences are the subject matter of statutory law and modern politics, whether manifested in questions about what laws are “necessary and proper” to the functioning of the republic or questions about how and whether to enact new laws in light of the Constitution, its premises, and the country’s history.

Of course, as heretical as it may seem, the founders and the Constitution were not, and are not, infallible. The founders turned a blind eye to slavery in the name of political expedience, and the Three-Fifths Compromise wasn’t exactly the Constitution’s finest exposition of democracy, let alone basic human rights. And as any reader who has enjoyed an alcoholic beverage will already know, even 2/3 of both houses of Congress and 3/4 of the state legislatures can get things wrong.

So it is no surprise that America has rather limited institutional mechanisms available to change the laws that underlie all of the other ones. After all, what kind of political system would be sustainable if fundamental revisionism could happen with every election, let alone every occasion for a legislative vote? Apparently, it’s the kind that has to fight a war to end slavery, to suffer a depression to allow the federal government to intervene in the economy, and to witness the rise of Jim Crow and race riots to begin to guarantee some of the most basic civil rights.

But that doesn’t mean that the attempt to form a more perfect union is worthless or foolhardy. It simply means that the project of self-governance is difficult. And universal consensus even more so.

Indeed, sometimes we find ourselves in severe disagreement with the results of the political process. Less than five percent of the United States approves of the job that Congress is doing right now. Obama’s approval rating is at historic lows. And yet, I think most Americans–with the possible exceptions of the Tea Party and anarchists–still want a form of government where the basic rule of (constitutionally enacted) law controls. That’s the social contract we’ve signed onto, impliedly or otherwise.

So when legislation becomes gridlocked, and partisans are unwilling to change their minds or compromise, and holding the government hostage becomes the apparent tactic, that hurts Americans at their core. It does and should strike most Americans as a perverse tactic when the goal is simply to modify existing law. Indeed, shutting down the government impounds the Executive Branch’s ability to “take care that the laws be faithfully executed.” The shutdown gambit seeks to change the rules of the game: “even though we agree by social contract to follow the laws that are duly enacted, we have decided that we don’t like this one, and we aren’t going to let you follow it either.” It would be like defunding the military during wartime or refusing to allow a peaceful transition of power when your party doesn’t win an election. When the motivations of those who would shut down the government seem cynical or bought off, maybe the fault is not in the fight but in the weapons being brought to the table.

Refusing to fund the enforcement of the laws whenever legislative disagreements happen is not consistent with America’s social contract. It is, however, a plausible route to bring to the public’s consciousness the contradictions festering in the American polity, and make them question the validity of the legislature’s role in the social contract in the first place. Maybe it is “We the people” who have some constitutional questions to answer.

Single Player

November 3, 2013

Quick shout-out to Vermont, one of the most politically interesting places in the nation right now. It’s a great reminder that federalism was baked into our political system so that the states could act where the federal government wasn’t explicitly empowered to intervene.

Since the federal government isn’t getting the job done when it comes to providing legitimate public health care, Vermont is now planning to provide its own public option to its citizens.

In such a setting, Vermont’s plan looks more and more like an anomaly. It combines universal coverage with new cost controls in an effort to move away from a system in which the more procedures doctors and hospitals perform, the more they get paid, to one in which providers have a set budget to care for a set number of patients.

The result will be healthcare that’s “a right and not a privilege,” Gov. Peter Shumlin said.

And when people can’t agree on what a basic right looks like, it makes sense to let them vote with their feet.