By Isaac Butler
The Flea Theatre, which relies on the unpaid labor of its resident company, "The Bats," who also do menial jobs around the space and used to have to pay to audition, is moving into a new space. The price tag? Over $18 million. The Flea will be expanding its staff by two and its operating budget by less than 20 percent.
By Isaac Butler
Here's the thing...
When you come out with a book arguing that facts don't really matter in nonfiction-- that it's generally okay to bend or fabricate them without clearly signaling to the reader that you're doing so, as long as it increases the effectiveness of your writing-- it really does have the long-term side effect of decreasing the effectiveness of the future writing you do under the auspices of nonfiction.
So, for example, when you write a lovely little essay about Ansel Adams that purports to in some ways get inside the moment when Adams discovered what his photography was all about, the piece is immediately knee-capped, because it relies on quotes as a form of evidence, as a form of signpost on the reader's journey. And we already know from your book that the quotes you use in your writing have been in the past unreliable and unverifiable. So who knows on what basis the thing rests? And who has the time to independently verify every single quote in a blog post that is asking the reader to trust it in order to experience its loveliness? And if you can't trust the quotes, then what are we left with besides some supposition and lovely sentences?
By Isaac Butler
I had long had an idea for an essay-- one that I likely won't write now-- called The Three Houses of August, one that pivoted off a visit to August Wilson's childhood home (which I've seen, and which is a boarded-up wreck in The Hill District, where his plays take place) to then profile The August Wilson Center for African American Culture in Downtown Pittsburgh just a couple of miles away, before visiting The August Wilson Theater on Broadway, which currently houses Jersey Boys.
At least one leg of that trip is likely to be impossible now, as the August Wilson Center for African American Culture is in very, very serious financial trouble:
The bank has sued to foreclose. The city’s philanthropic groups, with names like Mellon and Heinz, have withdrawn support. The $42 million August Wilson Center for African American Culture, a bow-front building inspired by a Swahili sailing ship, is high and dry.
Yowch! At least this article, unlike most coverage of financially struggling arts institutions, is willing to put the blame squarely where it belongs: with mismanagement. It seems that the August Wilson Center made three very big mistakes at its founding that all but guaranteed its eventual failure. First, they built the center in Downtown Pittsburgh, where "working-class blacks, many of whom feel unwelcome downtown with its skyscrapers and largely white-owned businesses," were unlikely to go. Second, they stuffed the Board with people who didn't give a shit about the Center, its programming, or the playwright it was named after. Third, they took on an $11 million mortgage, which required payments of over fifty thousand dollars a month, with no plan for how to make them.
So to review: upon deciding to build the August Wilson Center for African American Culture, they built a building they couldn't afford, in an area of town where African Americans feel unwelcome, and put people who don't like August Wilson on the board. Bravo. Really. Bra-fucking-vo.
This is all very reminiscent of the financial crises that we saw facing various theaters in the wake of the 2008 stock market crash. A theater would be in trouble, they'd beg the theatrical community to save them, and then we'd find out that they'd, for example, moved their space to a 20-30 minute drive from their aging subscriber base when it used to be five minutes away, or that their artistic director had cannibalized the theater for spare parts in order to launch his Broadway career, or that the managing director was a pathological liar.
And almost always, almost always, at the heart of this was a misguided brick-and-mortar decision that made a crisis all but inevitable, because new spaces bring with them huge new monthly overhead costs, and the expected fundraising and subscriber boost from opening a new space dries up after the first year. Rumor is that there are two major theaters in DC that are facing financial crises as the result of these very factors.
But it's time to just be honest-- as this very fine article is-- about these crises. These are crises of management, and they're crises of the funding community. These nonprofits didn't all get the Edifice Complex out of nowhere. This is what they can get money to do. This is where gravity is pulling them. I know I went after The Public (a theater I adore, just to be clear) for spending $40 million on a lobby while their art starves, but the truth of the matter is, that failure is shared-- and incentivized and built-- by donors and foundations and public funders who will only fund buildings and don't really seem to care about art.
Until we get reform, we'll have more cases like this. The August Wilson Center, named after the man who is perhaps the greatest of American playwrights, a man whose plays told the story of the African American denizens of the Hill District with the musicality of Shakespeare and the quotidian detail of Chekhov, is now being rented to a white megachurch.
By Isaac Butler
Corey Robin has a good roundup of a recent twitter kerfuffle involving the Atlantic's resident Israel-censor Jeffrey Goldberg. Goldberg, for those who don't know, is a columnist for the Atlantic, a former prison guard for the Israel Defense Forces during the First Intifada, and a Zionist concern troll who is faster than just about everyone to label rhetoric that questions Israeli action as anti-Semitic. He even used his perch at the Atlantic to beat up Theater J for daring to stage Caryl Churchill's "Seven Jewish Children."
(He also claimed the invasion of Iraq was "an act of profound morality," but that's less relevant here.)
One thing that really stands out about Goldberg's writing about Israel is how personal it is, in the worst possible sense. Goldberg has appointed himself some kind of arbiter of what is and is not acceptable for people-- especially his fellow American Jews-- to say and do with regard to Israel. And, in true tyrant fashion, the only evidence he needs that something is unacceptable is that it feels unacceptable to him. For example, his piece on "Seven Jewish Children" begins this way:
Against my advice -- and the advice of others -- my friend Ari Roth has decided to stage two readings of Caryl Churchill's "Seven Jewish Children" at his Theater J, in Washington.
Roth's crime, in other words, is not staging the play, but instead not kissing Goldberg's ring with enough enthusiasm.
Robin has a much more detailed account of Goldberg's attempts to dictate the way that Jews talk about Israel. After one of J Street's employees argued with him on twitter, Goldberg threatened to stop "defending" J Street's place in the Jewish Tent-- an act of such monumental hubris that Goldberg appears to believe he gets to say who is and isn't a Jew based on their Zionist bona fides:
Zionists like Goldberg like to style themselves as open, hip, and pluralist. They think what distinguishes them from the Black Hats is their embrace of secular modernity. But as you can see from this incident and the one I discuss below, Zionism has not only made these types intolerant and anti-pluralist; it has turned them into Popes and Inquisitors, enthralled with their imagined power to exile and excommunicate.
Under their watch, one of the most important questions that lies at the heart of the Jewish tradition—What does it mean to be a Jew?—gets taken off the table. Because we already know the answer: support for the State of Israel. If you do, you’re a Jew in good standing; if you don’t, you’re not.
That’s what nationalism—especially nationalism hitched to a state—does to people. It makes the Goldbergs of this world think they can give Jews a passport or take it away. Well, guess what, Rabbi Goldberg: you can’t. I don’t need you defending my right to be in the Jewish tent because that’s not within your, or any other Jew’s, power to decide.
By Karl Miller
(Editor's Note: Things have been a wee bit slow around this ole blog of late due to work commitments, and so I bring to you Karl Miller's review of Jonathan Crary's book "24/7: Late Capitalism and the Ends of Sleep." Karl is no stranger to these parts. A theatre blogger, writer and actor-- you can see his excellent turn in Marie Antoinette, currently running at Soho Rep-- I've long looked forward to the day when Karl would write for Parabasis. I think with this review, you'll see why. -- IB)
SLEEP SO MUCH
24/7: Late Capitalism and the Ends of Sleep
by Jonathan Crary
Perhaps you’ve heard the Internet is ruining our minds. Symptoms include: snark bites, chronic blogorrhea and -- worst of all -- a starchy, purple discharge from the mouths of our media critics. By that last affliction I mean a kind of breathless, scattershot doomsaying essay whose mangled prose is the only cause for doom the author bothers to supply. It is as though someone wanted to warn us of a great dyslexia epidemic, but rather than research, report and argue the point, they just penned their warning entirely in spoonerisms and called it a day. For a ripe specimen of this sad condition, see the title above. To his credit, Crary wishes to alarm his reader and I confess I have been successfully alarmed.
Here is his argument as best I can parse it. Crary believes <Internet> is the sublime fulfillment of a reactionary political movement that began in the 1970s to suppress the flourishing of socialism. (I place Internet in brackets because you may have heard this same charge applied to many another culprit.) You see, <Internet> enables capital to course across the planet “24/7,” unimpeded by the rhythms and cycles of “everyday life.” With these rhythms and cycles bleached away, we have no common space or “temporality” with our fellow citizens and can therefore only transact with them as subjects in the neoliberal global capitalist system. Meaningful socialization and political organization is not just “impossible” now but “unthinkable” because <Internet>. Consequently, sleep is the only sector of human life that has not been appropriated by global capitalism and our only hope is that it cannot be. So if you want to find common cause with your fellow man, you should literally knock yourself out.
If you disagree with this dizzying and contradictory thesis, fear not, for you will find little in Crary’s book to persuade you otherwise. If your reading of history doesn’t happen to pivot on the year 1968, you will find nothing here to prove or advance the point; it is taken as axiomatic. If you recall that socialism, to its credit, had global “24/7” ambitions, too, you will find no mention of that as Crary strenuously binds all things “24/7” to capitalism. And if you believe global warming, green energy and the multi-polar nuclear world order are important issues requiring a global consciousness, you will find no engagement with those notions either.
But those are all sins of glaring omission. The real problem with the book is that it is a series of statements that rarely build atop each other to form what we in the industry call an argument. Which is to say the pages do not unfold in sequence as sections, chapters, paragraphs, sentences, phrases and words that synthesize and analyze examples, data and theories. Instead we get a few nice film and painting recommendations to investigate on our own time and a lot of verbose reiteration of the same general gripe. Here is a representative sample:
“If something as private and seemingly interior as dreaming is now the object of advanced brain scanners and can be imagined in popular culture as downloadable media content, then there are few restraints on the objectification of those parts of individual life that can be more easily relocated to digital formats.”
Why is dreaming “private” but only “seemingly interior”? How does a study of dreaming in the lab or a depiction of dreaming in art remove “restraints on the objectification of [other] parts of individual life”? What were the reigning restraints and who held the reins? There may be good answers to these questions, but in Crary’s treatment we never even get the questions. Try this:
“Sensory impoverishment and the reduction of perception to habit and engineered response is the inevitable result of aligning oneself with the multifarious products, services and “friends” that one consumes, manages, and accumulates during waking life.”
Adding scare-quotes to the word “friends” is about as close as Crary gets to analysis or demonstration. But that’s a sneer, not an argument. The quote above sounds true enough if you read it at the right gallop, but there is no explanation of which products, services, friends, together with what kind of consumption, management and accumulation results in “sensory impoverishment and the reduction of perception to habit and engineered response”. Consider a more specific example:
“We buy products that have been recommended to us through the monitoring of our electronic lives, and then we voluntarily leave feedback for others about what we have purchased.”
Sure, but what makes this different from the target marketing of yore, when advertisements were linked to certain sections of the paper, zip codes of the city, or times in the television broadcast clock? And remind me why voluntary feedback among consumers is bad? Are we doing too much of it at the expense of something else? Are there trends or numbers to prove so? Short of that simple metric, is there a qualitative difference to articulate? Should opinion-having be a credentialed profession again? We never learn. Each sentence is an implication that only triggers a fresh sheaf of implications. And like the manic web-surfer he seems to be criticizing, Crary himself cannot rest to establish any particular thesis statement before gliding face-first into the next one. Here is the following sentence:
“We are the compliant subject who submits to all manner of biometric and surveillance intrusion, and who ingests toxic food and water and lives near nuclear reactors without complaint.”
Here he’s at least talking about familiar threats, but he doesn’t empower the reader to make a complaint. Is there a nuclear reactor in Washington Heights I don’t know about? What toxins? How might I complain? How am I being prevented from doing so? And how is the first half of the sentence connected to the second except by vague exasperation with <Internet>?
This is followed by a real kicker:
“The absolute abdication of responsibility for living is indicated by the title of the many bestselling guides that tell us, with a grim fatality, the 1,000 movies to see before we die, the 100 tourist destinations to visit before we die, the 500 books to read before we die.” [emphasis mine]
Is this Columbia art historian so literal-minded that he doesn’t know that any recommendation on any subject from any source comes with an implicit “before you die” attached? Crary’s academic art history roots might explain this unremitting, declamatory posture to his writing. After all, in a painting, things are always already happening and all linear narrative is imposed by the viewer. So when Crary writes that an index of good books “indicates” an “absolute abdication of the responsibility for living,” he may just be stretching the vernacular of his day job around a more complicated subject, like a plumber describing heart surgery.
But for those in his audience who don’t happen to be Marxist art historians, a simple entry-level problem remains. We need fact, stipulation, sequence and context for any economic, political, historical or psychological argument to persuade us and Crary provides none. If he is trying to say that even these rudimentary footholds for the rational exchange of ideas have been washed away in the slippery blizzard of cyberspace, he has still failed to offer any evidence beyond his own white noise to prove so. He “indicates” much but explains little.
In Amusing Ourselves to Death (1985), Neil Postman deftly articulated how a televisual discourse had supplanted the typographic discourse that preceded it. In written language, sequence and continuity determine what is true and false. By contrast, in televisual language, the film editor determines what is true and false. As Godard said, "Film is truth 24 times a second and every cut is a lie." If you follow this, you can see how the easy discord between audio and video tracks, the jump-cut and the montage all dismantle prior definitions of "sequence" and "continuity". Taken together, these techniques amount to a new governing syntax for all that passes through it. The gravest consequence of this new discourse is that it makes contradiction and hypocrisy almost impossible to hold accountable. So if you watch more news than you read, you will lose the ability to detect contradiction and hypocrisy. Postman then demonstrated how this new discourse changed education, religion and politics, often for the worse, and he closed his book by offering some solutions for mitigating the change and recovering what we had lost along the way.
By contrast, on every page of 24/7, we get the general impression that a similar transition has taken place, but there is no sequence or continuity to Crary’s case as it is written. Consequently, Crary can only reify himself with escalating feats of verbosity. The spectacle is comic, then tedious, then pitiable. By the end, my frustrated marginalia condensed into one solemn wish: Academician, deconstruct thyself. Consider this statement:
“Modernization could not proceed in a world populated with large numbers of individuals who believed in the value or potency of their own internal visions or voices.”
Now try to square that with something the author writes two pages later:
“It is impossible now to conjure up an individual wish or desire so unavowable that it cannot be consciously acknowledged and vicariously gratified.”
Both statements are attempts to articulate a problem. In the first, Modernization affronts our individuality, but in the second, individuality is the problem. Which is it to be? Can I believe in the potency of my internal visions and voices or can’t I? If so, why is my “conscious acknowledgement” and “vicarious gratification” of them a problem two pages later? Why are “voices and visions” good and “wishes and desires” bad? There is no coherent argument to bridge these two unwieldy declarations. Here Crary embodies the heedless discontinuity of thought that Postman patiently dissected.
This is unfortunate. We need lucid analysis of globalization in all its forms. You don’t have to be a Luddite or techno-utopian to appreciate that. Nor do you have to be a Marxist or libertarian (though these seem to be the only categories our media critics care to engage). If, like me, you are already open to the notion that Internet and capitalism have profoundly altered how we enter and exit consciousness, you will instead find yourself profoundly aggravated by Crary’s one-note hysteria on the matter:
“The more one identifies with the insubstantial electronic surrogates for the physical self, the more one seems to conjure an exemption from the biocide underway everywhere on the planet. At the same time, one becomes chillingly oblivious to the fragility and transience of actual living things.”
This sounds serious, until you recall that our globalized 24/7 culture is what made global warming detectable in the first place. Without a planetary consciousness, which would be 24/7 by definition, we cannot mobilize our fellow primates to fight the biocide that Crary thinks we cannot even see. Moreover, if you read that passage again, you will find nothing there that could not be grafted onto a discussion of television decades ago. Or a discussion of the printing press centuries ago. Or, indeed, a discussion of the written word millennia ago. What makes an “electronic surrogate” different from a literary or theatrical one?
This is the operative distinction at play, but with Crary all we get is more gobsmacking about kids these days. Here he is on blogging:
“The phenomenon of blogging is one example – among many – of the triumph of a one-way model of auto-chattering in which the possibility of ever having to wait and listen to someone else has been eliminated. Blogging, no matter what its intentions, is thus one of the many announcements of the end of politics.”
So sorry we at Parabasis cannot match the depth and calm decorum of cable news, talk radio or the newspaper editorial page (where one has no choice but to passively consume) but, respectfully, has Crary ever interacted with a blog? The comment section of any post may be a bloody trench – a core sample of ideological hatred – but it is hardly “one-way auto-chattering.” Even at its most shrill, blogging is a net positive for politics, in this sense. And what was the alternative venue that blogging “eliminated”? Waiting in lines:
“As Sartre showed, the queue is one of the many banal instances in which the conflict between the individual and the organization of society is felt … The suspended, unproductive time of waiting, of taking turns, is inseparable from any form of cooperation or mutuality. All the preceding decades of authoritarian rule had not nullified certain enduring features of community, in part because brutal but crude forms of Stalinist discipline allowed many of the underlying rhythms of social time to persist unchanged.”
True, Stalin did bless his citizens with new and longer lines, but if I’m using my iPhone to sign a petition for marriage equality whilst standing in line at Duane Reade, am I really missing a more ripe political exchange in the fluorescent muzak of the moment? How would a more tedious DMV make my neighborhood a better place? When you sink to calling Stalin “brutal but crude” – as though his crudeness somehow tempered his brutality – it’s safe to say your argument is strained. These are the limits of applying Marxist art history to <Internet>. The brutal but crude ideological frame simply cannot contain this big sloppy canvas.
So 24/7 will not persuade those who disagree with its countless stipulations, and those who start the book with some kindred interest will be frustrated by the absence of developed argument. But you can be wrong in argument and right on the merits (Christopher Hitchens). And you can be wrong on the merits but a delight to read (Christopher Hitchens). So is 24/7 worth reading as a Marxist rallying cry? Sadly, no. Crary's sentences are built from an evasive phraseology that only gets mushier the closer you inspect it. And this is where 24/7 goes from being merely weak to actively harmful. When you're wrong at the word-to-word level, you're going to cause some real damage to the reading public, regardless of its ideological affiliations.
How so? Well, for such a short book, it’s alarmingly redundant. If something goes without saying, Crary will be sure to say it thrice over. In the space of five pages, he warns of “collective normalization” and “generalized sameness” and the “fabrication of pseudo-necessities.” He never bothers to itemize, trace or even name our “necessities,” never mind our “pseudo-necessities”. But if we have reached the sorry state where the “pseudo” has been “fabricated” (don’t ask how it could be otherwise) then what authentic spring of nature or man may provide our “pseudo-necessities” in the raw, as it were, furthermoreover? Is he saying that even our fake delights are fake? I have no idea. Because he never pauses to talk about what, precisely, counts as “normal” or what he means by “collective,” he just fuses the two synonyms into an ungainly mash-up that doubles the word count but halves the meaning. In this fashion, each sentence manages to be elaborate without ever proceeding to elaborate anything. Behold the ravages of Internet! Redundant echoes infect yet ensicken the syntactic verbularities of his mentalized cognitisms.
I mock with great sympathy because I sincerely believe that, absent Internet, Crary would have written a passionate purple tract against some other broad sociological phenomenon, but would have done so with a little more clarity, cohesion and perhaps an ounce of consideration for the motherfucking reader. Something about 24/7 reality really has scrambled an otherwise fine mind and hampered its ability to communicate with other minds. As my Kindle highlighter bled out in the grocery line, I got the dreadful feeling that Crary wasn't trying to speak to me, the reader, at all. (And really, how could he speak to me if he's already concluded that both individuality and mutuality have been annihilated in the "collective normalization" of 24/7 Internet?) Better to speak directly to Internet herself, dodging her snark and algorithmic snares with language that is purposely, perhaps strategically, disjointed and obscure – like the characters in that last digital shibboleth for humanity.
I started by saying Crary’s superobjective was “to alarm” and that he had succeeded. Here is the “grim fatality” of his own treatise. If sleep really is our last and best stand against global capitalism, then we are left to believe that we are most connected to our fellow citizens when we are least conscious, that is, when we are out cold, paralyzed or dreaming. But as anyone who’s ever dreamt or listened to another person’s dream knows, dreaming is the most exacting and peculiar individual experience we can have (pace Jung). In dreams, we accept whole premises without complaint or reflection – such and such a house is now our house. Such and such a lover is now our lover. We are undermined by our dreams, but safely so precisely because the body is paralyzed. In this private theatre, rational control, ideology and language are all delivered over to the imperatives of the subconscious, which, I’m sorry to report, never sits still.
This is a vulnerable condition. Crary is right when he points out that it is predicated on a trust for each other. We must trust each other to broker common rules for public and private life in order to have a safe space for sleeping and dreaming. So maybe our new insomnia comes from a new anxiety about new global threats? This sounds plausible enough until you recall that this has been a founding anxiety for most animal life on the planet. So to ask for the umpteenth time: what makes this age so different?
I’d like to close with a theory for further investigation – one that might answer that last question. And it’s a theory inspired by one of the better passages of 24/7. Crary’s art history vernacular may be inadequate to his larger subject, but it works well to describe a few things:
“In my account, modern sleep includes the interval before sleep – the lying awake in quasi-darkness, waiting indefinitely for the desired loss of consciousness. During this suspended time, there is a recovery of perceptual capacities that are nullified or disregarded during the day … One follows an uneven succession of groundless points of temporary focus and shifting alertness, as well as the wavering onset of hypnagogic events.”
As we drift to sleep, we see our thoughts run before us of their own inertia, no longer driven by the rational intentions and demands of the animate working day. We see our thoughts unbraid into a web of associations. A memory of Duane Reade, the drug store, morphs into a memory of Duane Miller, my father. This free association tends to happen in sleep because the associative is the least taxing form of cognition. Unlike the intentional, the linear, the synthetic or the analytic, the associative relation is built on the simple economy of “generalized sameness.” At some ungraspable point in this gentle regression to the associative level of thinking, we lose consciousness itself. Not just consciousness of ourselves and our fellow citizens, but consciousness of any object. We stop taking in new experiences so we can cement the memories of the old ones. We sleep.
If we are trying to identify an insomnia peculiar to our times, we must investigate how this hypnagogic gateway to sleep has been mirrored or inverted by a new conscious activity like web surfing. Online we likewise careen across a network of associations, of links, often marveling at just how far afield we can run from our original intention. An article about Miley Cyrus may take us to the Wikipedia page for the Island of Nauru, for all we know. Is it possible this new daily labor depletes your ability to lie in bed and let your brain passively range over its own internal, individual network of associations? Is it possible that a person conditioned to surf, organize and share his or her experiences will find it difficult to ease into this hypnagogic state where the surfing cannot be controlled, tagged and validated instantly? Would such a person be more likely to crave the clean, binary on-off switch of narcotic sleep aids if the liminal space between consciousness and unconsciousness only creates fresh anxiety? And is all of this the lone rampage of capitalism or is something else going on here, too?
I think this is what Crary means by the "perceptual capacities that are nullified or disregarded during the day." On that crucial point, his argument finally connects with the promising themes kicked up by his title. But as the preponderance of his book clearly demonstrates, we desperately need something more than Marxist art history to confront it. Globalization may be an outgrowth of late capitalism, but global warming is hardly a boon to the same. The Internet is a marketplace, but it is also a forum. It frees speech, information and political exchange but it redefines privacy and intellectual property along the way. Most importantly: socialism, like the Internet, presumes a planetary, species-wide identity and concern. And we happen to have far more in common with each other than our daily shift as passive bodies.
Put another way, if a ready capitalist like George F. Will sees all civic and public life as one big annoying line at the airport, we may say that George F. Will has a sad and stunted view of civic life. But if Crary’s ideology leads him to crave annoying lines at the airport and to see the Internet as one big annoying commercial … what does that “indicate” about his ideology? Both men have their favorite bedtime stories, I’m sure. But the rest of us could use a better lullaby.
By Isaac Butler
I went on a bit of a twitter rampage this morning about the Donmar Warehouse production of Julius Caesar that's in an extended run at St. Ann's Warehouse right now in Brooklyn. I don't really have time to expand it into a review per se because of other writing projects, but you should read Mildly Bitter's thoughts on it, which I'm simpatico with.
In the meantime, because the critical consensus on the production is so rapturous, because I want to add my voice in supporting those who didn't care for it, because twitter is disposable and a weird place to post anything of length, and because I have no freaking clue how to use Storify, here is a lightly adapted transcript of those seventeen tweets, turned into fourteen thoughts:
By Isaac Butler
The Times has a piece-- 33 paragraphs long-- about that evergreen topic that goes by the name "the crisis in the humanities," in this case, discussing declining interest and enrollment:
The concern that the humanities are being eclipsed by science goes far beyond Stanford.
At some public universities, where funding is eroding, humanities are being pared. In September, for example, Edinboro University of Pennsylvania announced that it was closing its sparsely populated degree programs in German, philosophy, and world languages and culture.
At elite universities, such departments are safe but wary. Harvard had a 20 percent decline in humanities majors over the last decade, a recent report found, and most students who say they intend to major in humanities end up in other fields. So the university is looking to reshape its first-year humanities courses to sustain student interest.
There's just one problem: There likely isn't declining interest and enrollment, and the article itself even admits it:
The future of the humanities has been a hot topic this year, both in academia and the high-culture media. Some commentators sounded the alarm based on federal data showing that nationally, the percentage of humanities majors hovers around 7 percent — half the 14 percent share in 1970. As others quickly pointed out, that decline occurred between 1970, the high point, and 1985, not in recent years.
In other words, there hasn't actually been a decline in nearly thirty years.
But that's not all! The stat in the above paragraph? The one about "the percentage of humanities majors" hovering around 7 percent? It turns out to be wrong too, and easily checkable. Go look at this chart (h/t to Freddie DeBoer for pointing me the way) from the National Center for Education Statistics. The percentage of total Bachelors that went to humanities majors in 1970 is seventeen point one percent. The percentage in 2010 is also seventeen percent.
Like with Social Security, we are talking here about a "crisis" that does not appear to be real. In Social Security's case, the "crisis" myth is perpetuated by people who wish to cut and/or destroy social security under the guise of saving it. I wonder if that's what is going on here.
In my experience-- limited though it may be-- the actual "crisis" affecting the humanities is largely administrative rather than driven by student interest. Simply put, a lot of universities give lip service to how much they value the humanities but do not actually value them when compared with more lucrative STEM programs, and so they tend to undermine and sabotage them. Within actual humanities departments there also tend to be the kind of eccentric battles of will and dysfunctions that academic satires are built on.
UPDATE: The original draft of this post made it seem like the Times used the term "crisis." They never actually used the term, but this article is of a piece with many articles published over the last couple of years that assert there is a crisis of confidence in the humanities at the undergraduate level, which is why I use the term. Thanks to reader Karl Miller for bringing it to my attention. The language has been clarified.
By Isaac Butler
Hey! I have a piece in HuffPo about why the Redskins should change their name and branding:
When I was growing up in the heyday of Washington, D.C.'s crime wave, when our murder rate made national headlines and a serial killer nicknamed the Shotgun Stalker prowled the now-pricey streets of Columbia Heights, the Washington Bullets, our basketball team, abruptly announced they were changing their name. Owner Abe Pollin explained that the name made him uncomfortable, because so many bullets were being used to kill D.C.'s citizenry and had just been used against Israeli Prime Minister Yitzhak Rabin.
Now, a loose campaign is afoot to pressure the Washington Redskins into changing their name. At least on an ethical front, the decision is a no-brainer. The name is an ethnic slur and deeply offensive. Tradition is not a defense if the tradition is wrong. If the Bullets can change their name because of negative associations with violence, the Skins should be able to follow suit.
Later on, it somehow works in Woody Allen. Don't ask me, I just write this stuff. You can read it here.
By Isaac Butler
In the years that I had been away from New York, The Public Theatre finished the renovations to its lobby. There’s a part of me that’s nostalgic for the old space. True, it was dilapidated and drafty and drab. But during the afternoon, when you were between appointments, you could park yourself in a surprisingly comfortable chair and read for a bit at one of its round, formica tables and at the next table over see the shambolic visage of Neil LaBute or the sharp manic angles of Tony Kushner clutching a cup of coffee, talking to someone you didn’t recognize. The old lobby was in bad shape, and figuring out how to pick up your tickets and get to your stage was nearly impossible, but it was comfortable in its way.
The new lobby tells a different story, has different ideas embedded in its freshly painted pillars and elegant balustrades. It sports a snack bar downstairs and an expensive lounge upstairs called The Library. Dangling from the roof is an art installation called the Shakespeare Machine, a luminous chandelier featuring thirty-seven LED screens on which algorithm-chosen words of the Bard form and dissolve. The typography that is an essential part of The Public’s brand has slipped the surly bounds of their posters to run in high impact style up and down the walls of the space. In the lounge, you’ll find hand-scraped walnut tables. You’ll find a menu designed by chef Michael Oliver.
Other changes have been made as well. Staircases have been moved. Ancient bricks replaced and repaired. Even the sidewalk outside has been refashioned. The steps, formerly inside the lobby, are now outside. According to the Times, Artistic Director Oskar Eustis feels this makes the space more “permeable,” while the lead architect calls the steps the theater’s seventh stage. They glow, the steps. Above them, a transparent awning features the name of the Public in reversed custom-fit glass. It looks up to the minute, but is also carefully designed to conform to the building’s landmark status.
The gamble of the new lobby lies in the shifting of its values and its assumptions. The old lobby was approachable in a particular way, homey and homely all at once. The new lobby is not approachable. It is, instead, cool. It is a place for sophisticated people to hang out drinking sophisticated beverages before going to see sophisticated live entertainment. It is not for the public but a public. It feels very similar to the lobbies of various chic, high-end museums and, like much of the rest of the island in which it sits, is going after a youngish, disposable-income laden demographic.
On some level, there’s nothing wrong with this. It’s a break with many of The Public’s peers, who continue chasing the same dwindling crowd of elderly nouveau riche Jews and old-money WASPs who’ve been subscribers since time immemorial. If anything, it feels like The Public is coming to some level of self-awareness of what it actually is after a few years of Artistic Director Oskar Eustis trying (and failing) to reduce their ticket price point to as near zero as possible. And that level of self-knowledge is reflected in a truly excellent season of offerings that includes Lear deBessonet’s take on Good Person of Szechuan, a month of Mike Daisey, and a reimagining of Antony and Cleopatra courtesy of newly minted MacArthur genius Tarell Alvin McCraney. There’s even a musical of one of my favorite books, Alison Bechdel’s Fun Home, running and receiving raves there right now. The season is a stylistically diverse group of shows that still feel all of one piece, one that balances reimagining the canon with premiering important new work.
In other words, sometimes a lobby is more than a lobby. Sometimes your renovation is really a rebranding, or, perhaps, an acceptance of what you are. And if you can get the taxpayers of the city of New York to pick up most of the $40 million price tag for that renovation, why wouldn’t you?
These were some of the thoughts crossing my mind when I went to see that musical of Fun Home. I had plenty of time to think them, standing in that redone space, surrounded by walls so bright and so white and so hard they could be teeth. For on Friday night, Fun Home stopped after its first number, and never resumed. One of the actors had fallen sick, you see. And as a member of house management told us, the show had no understudies. The Public couldn’t afford them.
By Isaac Butler
Note: A year ago, this piece appeared in the Fall print edition of Rain Taxi, a wonderful review of books based in Minneapolis that some of my favorite writers-- including David Foster Wallace-- have written for. Since the copyright recently reverted to me, I thought I'd finally get it up online.
Conversations With Anne
Theatre Communications Group ($22)
Anne Bogart, now sitting comfortably amongst the Old Lions of the American Theater, has had at least three distinct stages to her influential and befuddling career. In the 1980s, she was a postmodern visionary, stalking her favorite directors in New York, absorbing their work and reconfiguring it into grand conceptual gambits, most famously a production of South Pacific at NYU which she reimagined—much to The Rodgers and Hammerstein Organization’s chagrin— as taking place within a hospital for wounded veterans. After serving for one disastrous (her word) year as the Artistic Director of Trinity Rep in Rhode Island and two years as the President of the Theatre Communications Group, she then founded the SITI Company with Japanese director Tadashi Suzuki. Over the years, SITI has trained dozens of artists and toured all over the world with a variety of original works, earning a storage locker's worth of awards and critical recognition.
The SITI Company became most famous within the theater industry for a series of ensemble building and training exercises known as The Viewpoints. Adapted from exercises developed by the choreographer Mary Overlie, The Viewpoints are a system for using movement and gesture to explore various aspects of live performance, such as architecture, spatial relationships, tempo, duration, emotion, and story. The Viewpoints are remarkable training, but also amongst the most troubling aspects of Bogart’s legacy, as many a young director, misunderstanding their point, has attempted to use them as a way to develop staging concepts, with near-universally poor results. As Bogart herself says in her new volume, Conversations With Anne, “The mistake I think people make about our work is that they think that we actually improvise to make a play, which we do not . . . It’s essentially a training technique. It informs how people work together.” (376)
It is the way of the world that the avant-garde becomes the old guard, and SITI has not escaped this fate. Now they tour and teach, but the critical responses to them are often guided by an admixture of respect and indifference; their most enthusiastic fans tend to be other theater artists, and the vast majority of their influence on the field can be traced to their training, not the actual pieces they create. Still, it’s tough not to admire their resolve. They continue to survive as a company, creating the work that they want to create and building a business foundation that allows it to happen. Bogart is also adamant about paying her labor well and dedicated to serving as a mentor for countless theater artists, surprisingly rare traits in an established director.
While SITI has settled into the august Valhalla of the established quasi-experimental troupe, Bogart has gone on to reinvent herself again as a prose writer, essaying the intertwined arts of theater and life in The Viewpoints Book (with Tina Landau, Theatre Communications Group, 2005), A Director Prepares (Routledge, 2001) and And Then You Act (Routledge, 2007). The third of these, written and published in the middle of George W. Bush’s second term, is a particularly wonderful book. A meditation on creating art in seemingly hopeless times, And Then You Act is beautifully written, rigorously researched, and that rarest of things: a book about theatre that has appeal beyond it.
Over the last decade, Bogart has also been conducting a series of live interviews with various groundbreaking directors, writers, and choreographers. Conversations with Anne collects twenty-four of these, including talks with Richard Foreman, Bill T. Jones, Oskar Eustis, “My Dinner With” André Gregory, and Julie Taymor. Sadly, while the book contains many great truffles for the enterprising hound, it is marred by several pervasive problems, many of which can be traced to a lack of editing that renders the book infuriatingly dense and repetitive.
The interviews in the book are presented in a transcript format. Except for a couple of speaker identifications when an important audience member chirps up, the text is left largely alone, save for a prose introduction written after the fact about whomever the subject is. Each subject is also introduced within the transcript of the talk. The two introductions often repeat the same stories and information, sometimes almost word for word. Here are a few sentences from the first two paragraphs of the prose introduction for Richard Foreman:
As an undergraduate at Bard College I was part of a posse of theater students who organized the “I Hate Richard Foreman Club.” . . . When Bard’s film department invited Richard to speak with film students, we rounded up the “I Hate Richard Foreman Club” to face him down. Much to our surprise, no film students showed up, which left us to face and accuse him directly: “You use actors like props!” At the time, we were all highly influenced by Jerzy Grotowski’s brand of actor-centric “poor theater” and Richard’s auteur theater seemed heretical to us. Much to our frustration, Richard . . . agreed that he treated actors as props. (3)
And here is most of the first paragraph of the interview transcript:
Let me start by telling my personal story with Richard. When I was an undergraduate at Bard College in upstate New York, I was part of a company . . . Which was very influenced by Grotowski . . . I hated Richard Foreman. I just thought he was awful. We formed the “I Hate Richard Foreman Club.” This was 1973. We saw a sign at school saying that Richard Foreman was coming to visit… so the “I Hate Richard Foreman Club” all went down to torture him and none of the film students showed up. So, Richard came in, and we said “You treat actors like props!” And Richard said, “Yeah.” (5)
Having had a similar conversation with the Master once (Me: “You took all the humanity out of your work!” Him: “Yes! Exactly!”) I can attest that this wonderful anecdote sums up a lot about him, but the nearly verbatim repetition distracts from and ultimately undoes what delight it contains. The next interview—with Peter Sellars—has the same issue, as do at least three others in the book. If any of her subjects directed in college, Bogart uses the formulation “X studied at Y, where their productions are still legendary” to describe their early work.
All of this would be picking the tiniest of nits if the conversations delivered on their promise, but a similar repetition and lack of thorough weeding dominates there as well. As Janet Malcolm puts it in her landmark exegesis of the subject-writer relationship The Journalist and the Murderer, “When we talk with somebody, we are not aware of the strangeness of the language we are speaking. Our ear takes it in as English, and only if we see it transcribed verbatim do we realize that it is a kind of foreign language.” (154) Malcolm’s own prescription for this problem is to treat the interview transcript
not [as] a finished version, but a kind of rough draft of expression. As everyone who has studied transcripts of tape-recorded speech knows, we all seem to be extremely reluctant to come right out and say what we mean—thus the bizarre syntax, the hesitations, the circumlocutions, the repetitions, the contradictions, the lacunae in almost every non-sentence we speak. (155)
Capturing other people’s words is one of the great challenges of nonfiction today. Striking the balance between massaging and clarifying speech into something coherent, clear and beautiful and capturing a voice in all of its glorious, authentic idiosyncrasy is a task whose shape changes from project to project. An anthology of live interviews will never be able to engage in the kind of reworking that Malcolm was able to do at The New Yorker, but it still has a duty to find its own solutions to the elocutionary problems that she articulates above.
On that front, Conversations with Anne is a failure. To pick a few examples at random: The first answer Foreman gives Bogart takes three and a half pages to get through what should be a two paragraph answer. Peter Sellars delivers frequent verbal mélanges like, “That to me is the project of democracy that is always ahead of you. And one of the reasons we do theater is to hold in front of people where we are still going, that we haven’t been there yet, but we’re still trying to get there.” (31) Tina Landau’s interview comes laden with qualifiers like “I think” or “for me” that torque her sentences into odd shapes. Molly Smith introduces the subject of Wendy Wasserstein and her cancer by saying, “I’m very excited about—well, let me tell you about a person who I am thinking about right now. You were talking about women writers. I don’t know if you all know what’s on with Wendy Wasserstein right now.” (421)
When shaped by the human voice with all its cues of tone and rhythm, inflected via hand gesture and the acrobatics of our facial muscles, projected by microphone and speaker through the trembling air into the attentive ears of the audience, I have no doubt that these interviews made for bracing listening. The people collected for these Conversations are legends, geniuses, and in some cases, bullshitters par excellence. In the book, you’ll find history lessons from Zelda Fichandler on the origins of the Regional Theater Movement, inspiration from André Gregory on how you can do theater in your living room, Mary Overlie and Anne Bogart discussing the intellectual property issues surrounding The Viewpoints in real time, Bill T. Jones talking about his public image after he revealed he was HIV Positive, Joseph V. Melillo explaining how BAM works, and Martha Clarke and Bogart discussing their differing approaches to Act Four of A Midsummer Night’s Dream. You’ll even get a round-table discussion amongst members of the SITI Company in the book’s final chapter, as they explain with as much candor as possible the strengths and weaknesses of the organization. Finding these gems takes far more work than it should, however, as they come buried under so much verbal sludge that they are at times near invisible.
Throughout, Bogart is a generous, effusive ringleader. Her most commonly used adjective to describe something her subject has said is “beautiful,” and she is generous on the subject of her own weaknesses. It’s hard, in fact, not to fall a little in love with her as you read her questions, or as she charmingly self-deprecates about her tenure at Trinity, or retells the fables that guide her own thinking about the world.
All this effortless charm hides how little actual conflict or drama there is within any given interview. Theatre folk are notoriously thin-skinned and conflict-averse in public; the social nature of both the industry and the work it creates has resulted in an environment in which people are afraid to alienate or offend peers whom they may need to count on for work. So while the various artists playing nice on the stage makes a certain amount of sense, it’s a missed opportunity. After all, in A Director Prepares, Bogart bizarrely asserts that America’s love affair with theatrical realism is the result of artistic cowardice after the McCarthy hearings, but when she interviews Zelda Fichandler—one of the primary forces behind America’s love of theatrical realism—this difference goes completely unmentioned. Bogart says in one interview that Robert Woodruff’s work was stuck in a rut until he “came and did Columbia’s directing program… [and] his work with young directors took him out of the eighties,” (105) but avoids discussing this fallow period with Woodruff himself.
Admittedly, the book is called Conversations and not Debates with Anne; its goal is to give its subjects as much room to express themselves rather than to push them on any given thing that they say. When this approach works, it works beautifully. The best interview in the whole book is with the Public Theater’s Oskar Eustis, as he holds forth about his background with Communism, his fascinating life story, and his struggle to balance his ethics with running and sustaining a large theatrical institution.
In place of conflict within the book, however, a kind of odd dramatic irony creeps in. The final interview proper in the text comes from 2006 (a SITI Company roundtable is from 2009)—before the financial crisis, before theaters started going out of business, before the Edifice Complex that afflicts the American theater had become a deadly disease. I cringed reading Ben Cameron’s recounting of Joe Dowling’s promise that the Guthrie Theater’s new building would result in more adventurous work knowing that instead their seasons today feature Broadway try-outs and Neil Simon. As Molly Smith mentions offhandedly that she’s fundraising “over a hundred million dollars” (418) to build Arena’s new building, it’s hard not to wonder how no one could discuss why such a budget would ever be necessary. The book, then, inadvertently serves as a sharp reminder of how deluded we all were in those days. How we thought—as André Gregory says in 2003— that if Bush was reelected, “we have seen the end of Democracy in this country.” (69) How we thought—as Bogart says of Peter Sellars—that someone “is a courageous guy” (29) for stomaching audience walkouts of their work. How we thought as it played and played that the music would never stop, how we didn’t realize there weren’t enough seats left at the table for all of the children running round and round.
Spoiler alert: This is a post about Carrie, both the old and the new, so if there are spoilers here...I feel sorry for you. Get out of your house more.
Towards the end of the new version of Carrie, I experienced a thought I didn't expect to have. As Carrie White hurled the film's villain, the bitchy, slutty, spoiled brat Chris Hargensen into a gas station to her fiery death, I thought, "Wow. This seems kinda like an overreaction. I bet that, if Carrie survived all of this, she might kinda regret this." Which is pretty much the opposite of the point of Carrie. The last thing you're supposed to feel is any sort of pity for her victims. And yet...I did.
When I posted on Facebook that I had seen the new Carrie, a friend of mine asked if the remake accomplished anything that the book or the 1976 classic didn't. It doesn't, but that doesn't exactly matter, does it? For my money, Carrie is one of those timeless stories, about such large, social issues that will never really go out of style, even if the style of filmmaking or the costuming does. Do we ask if a new production of Romeo & Juliet brings anything new to the table? Before you get your "Shakespeare is sacred" knickers in a twist, I know it's not Romeo & Juliet. But there is something...big about Carrie. It's not for nothing that this material has been revisited several times since the book came out. It's also not for nothing that, since the 1976 movie, every single one of those times has met failure. The newest version comes closest to success, though, for my money, it doesn't quite get there.
There's not a lot of story here, not a lot to be mined in terms of character or new discoveries. There is, in comparing the new version to the old, something to talk about, not just about how teenagers have changed, how portrayals of teenagers have changed, but also about how storytelling and moviemaking have changed. Unlike a lot of remakes and reboots, this one is quite faithful to the original, so it's a rare, nearly 1-to-1 comparison, 1976 vs. 2013.
After we went to see Carrie in the theatres, my girlfriend and I watched the Brian De Palma version on Netflix. My girlfriend had never seen it and it's been at least 10 years since I'd seen it. And it's all there in the new version. I can see why Lawrence Cohen maintains his script credit. Roberto Aguirre-Sacasa's work is more in line with an adaptation of an adaptation. Some new 2013-y touches are added: hints of cyber-bullying, a different relationship to teenage sex, slight tweaks on some minor characters. One major character shift was made, but I'll get to that in a second. For comparison's sake, though, it's the same script, the same story: Carrie White is a weird girl, she gets her period in gym class, the mean girls taunt her, she discovers that she has telekinetic powers, she gets invited to prom, pig's blood gets dropped on her, she destroys the town. That's what we've got.
It is effective, in both versions. Kimberly Peirce's direction is less lascivious, less show-off-y than Brian De Palma (because no one can be *more* lascivious or show-off-y than Brian De Palma; that's why we love him), if a little overly slick. It's an interesting irony; this recent spate of horror movie remakes (Nightmare on Elm Street, Friday the 13th, Texas Chainsaw Massacre) all share these deep, dark color palettes, full of details, in the way that all of the originals share the cheap, over-exposed look of the 70s and 80s exploitation flicks. The blood that drenches Carrie looks like chocolate syrup in the new version, while in the original, it's more like Heinz ketchup. Times have changed.
The mean girls in the original were a decidedly ragtag bunch: gawky, weird, distinct. Edie McClurg was a mean girl! Edie McClurg! They didn't look like teenagers, but they looked like real people. The mean girls in 2013 are all model-gorgeous, all barely legal, all total vamps and hotties. Then again, so is Carrie, really. Someone like Sissy Spacek could make a name for herself in indie movies these days. Not in a major motion picture. (I know that's not the most original or stunning insight, but seeing the two movies essentially back-to-back really drives the point home.)
The biggest, intentional distinction between the two comes in the role of Margaret White, Carrie's abusive, religious mother, played by Julianne Moore now (Piper Laurie played her in '76). In both versions, of course, she's a crazy Christian. In '76, though, the emphasis was on "Christian." Now it's firmly on "crazy." This Margaret White is as naive as her daughter; the film opens with a new prologue showing Carrie's birth, which apparently took Margaret by surprise. She thought she was dying, in the same way that Carrie does when her first period hits. Julianne Moore's Margaret is clearly suffering from some form of social anxiety, unable to make eye contact with others or speak above a whisper in public, jabbing her leg with a seam-ripper to get through a brief conversation. Her abuse of Carrie becomes more of an extension of her crazy, rather than outright abuse. Piper Laurie's Margaret seems crueler and her abuse more extensive because, well, she seems smarter, cannier, more prideful. It feels more intentional, more manipulative. And it makes you more sympathetic to Carrie. That's an odd point. Piper Laurie's Margaret is clearly trying to control her daughter out of pride and anger, while Julianne Moore's Margaret is clearly genuinely afraid for her daughter's soul (even if that fear is based in lunacy). Her intentions are better...and yet left me feeling like Carrie could have just endured it for a few more months and gotten out. Piper Laurie was never letting Sissy Spacek out of her control.
It shouldn't be hard to build sympathy for Carrie White. We've all known or been that kid, that out of step kid, the out of place kid, with her weird hair and hunched shoulders, forever waiting for the next blow to fall. As she discovers her power, we see her grow, become stronger. When she finally makes it to the prom on the arm of the studly Tommy Ross, we feel her coming into her own even as we know the hammer is about to fall (or rather, the bucket full of pig blood). We're on her side. We want her to win. We want her to punish those who have wronged her. So why did I feel like she went too far?
Part of it has to do with Chloe Grace Moretz; she's a terrific young actor, but she radiates confidence. She just does. She looks well-fed, well-supported, well-loved, a mature young woman in training. You can't just throw a blonde fright wig on a person, have them look at their toes a bit and make her an outcast. Sissy Spacek looks like a creature from some other world. She wanders through the 1976 movie like a peeled egg, all raw, exposed nerve endings. Her isolation is so great, her need is so great, it's overpowering. There's a classroom scene in each, the scene where Carrie actually connects with poor, doomed Tommy Ross. In the original, the scene featured Tommy reading a poem and Carrie saying she liked it, almost against her will. In the new one, the roles are reversed. Which is great for making Carrie a person, but less great for building a picture of an outsider.
The other part is about the visual storytelling of the movie. In subtle ways, the 1976 movie achieves a couple of things that the new movie undermines repeatedly. In the original, it's clear that Carrie White has ALWAYS been an outcast, a source of ridicule, the target for bullying. When you add that to a more intentional abusive mother, you see a downtrodden, broken girl.
Another shared moment in each revolves around a memorable piece of graffiti: Carrie White Eats Shit. In the original, it's in the background, as though it had been there for some time and no one cared enough to clean it. In the new one, it's massive, and fresh, and immediately being scrubbed away. Maybe more realistic...but it doesn't have the same effect.
In the new one, it seems more like she was simply invisible until the period incident (now broadcast to the world via YouTube). It builds less a pattern of cruelty than one vicious event. Maybe a subtle difference, but a difference that accumulates. By the time the '76 Carrie reaches its climax, you can feel the explosion coming. It builds and builds. In the new one, even with the generous foreshadowing, the explosion feels more like a surprise. The humiliation at the prom is tied explicitly to the shower scene, which makes sense for plot reasons, but also makes it a reaction to that one act, not a lifetime of unanswered abuses. It reduces the scope at the point that it should widen.
The other part where the storytelling undermines the story (and, I think, is reflective of the way we want to connect with our heroes now) comes in the apocalyptic prom. In the original, Carrie's humiliation creates a major psychological break. I would even say that De Palma goes over the top with it, with his split screen, spinning images, directorial geegaws and scare score. But it's clear that we're in Carrie's mind and in Carrie's mind, they're all laughing at her. Even the gym teacher who took her side. In reality, they may not be, but for Carrie, it's the fulfillment of her mother's terrible wish. That's what matters. And it matters that we see that moment, not as an outsider, but from Carrie's point of view. We're right there with her. It's a primal howl of rage and pain. Something uncontrollable has come out of her.
In the new one, Carrie is in control the whole time. The movie takes great pains to show us that not everyone in the gym was laughing and even greater pains to show us that those that didn't laugh are spared. The consciousness of her revenge makes it colder, crueler. When she is finally hurling Chris Hargensen into that gas station, it's presented in slow motion, in great detail as Chris' face smashes through the windshield, glass embedded in her cheeks, before the car bursts into flame. We are clearly meant to see that she deserves this, she has earned this. But...has she? That's the moment I thought, "Well, I mean, yeah, she embarrassed Carrie, and did accidentally kill poor Tommy Ross, but...does she deserve to die? So painfully? For that?"
We are in the midst of a cultural conversation about bullying and teenage cruelty. Carrie exists as part of that conversation. On those terms, though, what does this story tell us? That a violent, extreme response to emotional stimulus is justified. Is that the message we believe? The original, focused less on the cruelty of her peers, doesn't carry that weight. Carrie snaps and snaps totally. But, like I said, it's the snap of an entire life, not just one bad moment. I felt more sympathy, more empathy for her. She was a victim of so many forces. Sure, the original is far more focused on a sort of women's-sex panic, typical of the Golden Age of Horror Movies. But that's another post.
The new script wants us to have it both ways: we're meant to see Carrie as a victim, but also as a powerful female figure. She's standing up for herself, the way we all wish we could. But do we all wish we could brutally kill our bullies? Is that really what lurks in all of our hearts?
There are stories that get re-discovered, re-interpreted for each generation. Carrie is a fine choice for that. But when a story is re-interpreted, it tells you more about the time it's remade in. I'm not sure that this version tells us anything comforting about our times.
By Sally Franson
(Editor's Note: A couple of weeks ago, my dear friend and UMN classmate Sally Franson and I were chatting about our mutual dissatisfaction with the final couple of episodes of Girls. Out of nowhere, she said something quite brilliant about the show, character, and therapy culture. I wasn't surprised. Every conversation with Sally contains at least a couple moments of utter genius. I asked her to turn what she told me into an essay. Below she delivers in spades. Enjoy.)
Once, some time ago, I was drinking green tea with an old friend and talking about my problems. I use old in the sense that he was aged: in his seventies, an artist like me. Unlike me, he had grown up in rural China. I was younger then, and knew less than I do now, yet in the paradox of youth I thought I knew more. I was worrying over the bad things that had happened to me; I was trying to figure out why. I worried and worried until my friend held up his hand, but kindly, the way you halt a toddler before he careens into the wall. “You Americans,” he said. “You always want to manage your suffering.”
Suffering, it seems to me, emerges from two sources. The first is the terrible fear that we are unlovable. The second is the terrible fear that meaning does not exist. The former, for some of us, might never be relieved. The latter we try to relieve through various addictions and obsessions, but it can only really be lifted through art. Yet the purpose of art isn’t to make suffering manageable, or comprehensible; it’s simply to make us feel less alone in it. Suffering in its purest form – as in, its original manifestation in the body; as in, that deep, nameless ache beneath our ribs – is universal, everlasting. In art this deep namelessness can be held in the silence between words, chord progressions, chunks of paint on a canvas. In life, however, we have difficulty staying with it. Our index finger careens in a circle, desperate to assign causality. We want someone, or something, to blame, be it our parents, our genetics, a condition parsable by the DSM-V. So we go to therapy. Or we take our meds. Or (and here I claim my positionality) we do both. Healing neurosis through this kind of work is valuable and necessary, especially for creative types, but what happens when the work we do in life in order to create art starts leaching into the art itself?
It’s happening more and more in literature, this turning to pathology as a narrative gesture. But the problem with the therapeutic narrative is that it’s predestined and stale. Marilynne Robinson calls it a “mean little myth,” a bungled effort to make suffering tidier. Which may work as a survival tactic as well as any other mythology, but it’s a real buzz kill in stories. Chekhov, who implored his fellow writers to “shun descriptions of a character’s spiritual state,” would be alarmed by the trend, for the therapeutic narrative stands in opposition to the Creative Writing 101 admonition to show, not tell. And, more importantly, in opposition to the lateral, non-predestined maneuvers required for the creative process to bear authentic fruit.
In an essay entitled “Narrative Dysfunction,” a brilliant postmodern continuation of Orwell’s “Politics and the English Language,” Charles Baxter argues that the dysfunction in contemporary fiction and drama can be boiled down to this quest for blame, which is the engine of so much contemporary storytelling (including, oh, approximately ninety percent of memoirs). “The perplexed and unhappy,” Baxter writes, “don’t know what their lives are telling them, and they don’t feel they are in charge of their own existence.” Ergo, we create stories that search for the source of our discontent; the unhappiness and confusion must have a locus, and if we can only find that locus, then we’ll be okay.
And of course we Americans want to be okay. We are a chubby, comfort-driven people. We also live in the end times of a consumer culture; buying stuff is supposed to be the thing that makes us okay. And it does, temporarily: a charlatan’s panacea. But then it does not. What are we to do? During a bad time a few years back I spent weeks on the couch watching daytime television, and despite the fact that I felt dead inside I convinced myself that life could be sunny, and orderly, and good, if I could only purchase a Swiffer vacuum. That I could not buy a Swiffer vacuum, or that when I did buy a Swiffer vacuum I felt more guilt than happiness, made me more depressed. Which is different than sorrow. There is nobility to sorrow. It shares an Indo-European base with the Sanskrit sūrkṣ: to care about. Depression lacks ferocity. It is the affliction of the impotent, the put-upon. If the Talking Heads were befuddled in the eighties by the beautiful house and the beautiful wife, these days there’s a lurid, entitled demand for both. “Where is my beautiful house?!” we cry as we watch reality television. “Where is my beautiful wife?!” Which might as well be the battle cry for HBO’s hit show Girls, and the reason the show and its creator, Lena Dunham, bug the shit out of so many people.
One can argue ad nauseam over how self-aware the show is about its characters’ privilege and these characters’ inability to take responsibility for anything, including their privilege (I myself think it’s pretty self-aware), but what I am more interested in than the show’s cultural context is the abrupt left turn it takes toward a therapeutic narrative model in the last third of Season Two, now available in toto on DVD. Before this, Girls careened rather pleasantly between slapsticky Gen-Y humor and wistful, Louis C.K.-inspired profundity. Or, to think about it another way, between entertainment and art. It was content to be taken seriously, but not too seriously. Which is why some critics railed against its narrow lens and others defended its Right to Niche. But the light touch of the series falls away in episodes seven through ten. What’s left is nothing more nor less than textbook, DSM-able angst.
By this point in the season, our fearful protagonist Hannah Horvath (played by Lena Dunham) has crumpled under the stress of an e-book deal and a bad breakup, lapsing into full-blown OCD in a way she hasn’t since high school. Lots of counting, lots of obsessing. She busts an eardrum with a Q-Tip, busts the other to make it even. Prior to this revelation there has been not even a whispered reference to this severe affliction in the show, though in episode ten, the season finale, Adam, Hannah’s ex-boyfriend, asks if the thing from high school is back, which implies, clunkily, she had confided in him at some point. Until now Hannah has been a riot to watch: an impulsive, pleasure-seeking missile. The kind of missile dramatists dream of, for her actions are often mystifying, even to herself. With the relapse into OCD, and the subsequently tedious therapy visits, medication discussions, and symptom management, Hannah’s buoyant energy deflates, and with it the hum of her storyline’s engine. The focus settles not on her actions but on her inaction, her self-sabotage, and in trundles the grim introspection of the overly pathologized. “We don’t know why you have it,” Hannah’s mother tells her over dinner. “We’re still married, we never raised a hand to you…it’s not our fault.” To which Hannah shoots back, “Well, it’s genetic, which is sort of the ultimate ‘your fault.’”
A funny line, and revealing, but where can a therapeutic narrative go after the locus of the character’s not-okayness has been located? The answer is: nowhere, really. Hannah falls apart, and because we know why, because the answer has been given to us as neatly as an Ativan prescription, it is terrifically boring to watch. “I feel like I’m unraveling,” Hannah tells Adam. “I’m really, really scared.” This is a moment that has potential to crack the viewer wide open in the way only art can, a moment in which our mutual, essential brokenness can be shared, but we are cut off from Hannah’s suffering; there is no room for anyone else under the lumbering weight of her diagnosis. I can’t help but wonder, and sadly, how much more authentic this moment might have been if we had no idea why Hannah was unraveling, if her emotional state and subsequent behavior remained reasonless, unwitting. If the deep namelessness might have been maintained.
A similar narrative problem occurs with Jessa’s character at the end of the season, when the recently divorced, troubled free spirit (played by Jemima Kirke) pays a visit to her ne’er-do-well father. In the emotional climax of episode seven, Jessa slumps, infantilized, on a swing set with her father, angry that he has spent so little time with her over the weekend. The conversation is terse until Jessa bursts into an accusation that will sound unhappily familiar to anyone who has tried to translate a revelation that occurred in a therapist’s office to real life. “You have no idea, do you,” she tells her father, her face puffy with the cyclonic swirl of tantrum. “How much time I’ve spent waiting for you. How much shit I’ve taken because you’ve never taught me how to do anything else!” Father and daughter fight over abandonment and mutual unreliability until finally Jessa cries, the camera lingering on her tearstained face, “I’m the child! I’m the child!”
And here the conversation grinds to a halt, and along with it the arc of Jessa’s character for the season. She abandons Hannah at the end of the episode and disappears entirely for the final three episodes. Though ostensibly this is due to Kirke’s burgeoning pregnancy, in the world of Girls the narrative pressure of Jessa’s bad behavior has been released. Jessa abandons people because her father abandoned her. Jessa picks emotionally unavailable men because her father is emotionally unavailable. Check and mate. Let the healing begin.
Except drama doesn’t exist to heal, at least not on such an obvious level. Drama heals in the way truth heals, the way shadow is no match for light. But because the therapeutic narrative is a broken myth, the revelations about Hannah and Jessa ring false. In contrast, consider season three of “Louie.” Specifically for these purposes, season three, episode eight, an episode called “Dad.” Here Louie finds out from his uncle that his father, with whom he hasn’t spoken in two years, is ill. Shortly thereafter, Louie starts vomiting episodically. He also develops a rash, which prompts a visit to the doctor, who after giving Louie a clean bill of health, gently suggests somatization. Is anything else going on in Louie’s life that might be causing stress? “It’s hard sometimes,” Louie tells the doctor. “Boilerplate misery, alone in the world… But nothing new.” Then he brings up his father, and pukes again.
On the way to his father’s house, in Boston, the GPS mocks Louie. “It’s not like he touched your penis,” the Siri voice says. Louie bristles. “I just feel weird around him,” he says. This tentative confession is the closest thing we get to exposition in the entire episode. Indeed, no explanation is offered when Louie approaches the door to his father’s house and runs away as soon as a silhouette emerges in the glass. Nor when he starts running down the street and rips off his t-shirt and steals an orange three-wheeled motorcycle and drives it to Boston Harbor and jumps down onto the docks and steals a speedboat and drives it into the middle of the quiet ocean and starts to laugh. “Whoo!” Louie says. Until he stops saying it. Until he is quiet, and his face droops, pink with sweat and sun, and his body crumples against the boat’s white faux leather seat. He looks around him, and no one is there, and we jump cut to a shot further back, so that all that’s on the horizon is Louie and the boat and the vast empty ocean, and he tips his head back toward the sky.
In the end, to admit we are haunted by life, to admit there is no discernible cause or remedy for this haunting, is to allow ourselves to hold light and dark in the palm of our hand. There is a tragic joy in claiming incomprehensibility. Because it feels truthful. And the truth sets us free. In contrast, by diagnosing characters with particular afflictions, be it OCD, or PTSD, for the purpose of narrative, by stripping a character’s behavior into such linear parts, we fabricate coherence and the art becomes complicit in the culture’s deceit. The best art comes from that inexplicable truth force that moves through the artist, into the work, and then into the receiver of the work, who will hopefully pass it on ad infinitum. “Not I, not I, but the wind that blows through me,” wrote D.H. Lawrence. Such wind redeems the creator and the receivers of his creation; it breathes expansion into our cold contracted hearts. To reveal in art that which we do not and perhaps will never understand in life is not a failure of the art; in fact, it is the only art worth making.
 So says the Buddha, anyway, who knows more about these things than I do.
 My heart breaks to note Alison Bechdel’s latest memoir, Are You My Mother?, as one of the worst perpetrators to date, since Fun Home was so damn good and (until the last few pages, anyway) so resolutely non-therapeutic.
 According to Marilynne Robinson, it goes something like this: “One is born and in passage through childhood suffers some grave harm. Subsequent good fortune is meaningless because of this injury, while subsequent misfortune is highly significant as the consequence of this injury. The work of one’s life is to discover and name the harm one has suffered.”
 And who, speaking of therapeutic narratives, attends an AA meeting in Episode 8 and honks out an expository, feelings-y monologue about their break-up.
 It seems worth noting that Dunham wrote this bit of dialogue for comedic purposes, and to that extent possesses some awareness of Hannah’s self-obsession, but the majority of the OCD-related content in Girls feels earnest to the point of turgidity.
 Or all of “Louie.” Forever.
 He even vomits while playing poker with Sarah Silverman.
by Rob Grace
1) Turn on a lamp.
2) Be born.
Chances are you won’t make it to 35 without being born first. Literally everyone who has ever lived has been born.
3) Swim in a pool while the government of the country in which you live kills an innocent person.
Swimming is a totally great workout, and can also be super fun! Take a cue from the play Aunt Dan and Lemon and just accept that your government does terrible things that you don’t need to worry about in order to enjoy your First World lifestyle. As Colonel Jessup said: “We live in a world that has walls, and those walls have to be guarded by men with guns. Who’s gonna do it? You?” Heck to the no! I’m busy working on my backstroke!
4) Think that you’ve deeply contemplated your own mortality, but then have an experience that makes you realize that, even though you thought you had deeply processed this, you never really let it sink in as deeply as you could have.
You can’t escape death. This notion is likely to occur to you in different, and probably more profound, ways as you get older.
5) Either eat peanuts, never eat peanuts because you are allergic, or never eat peanuts for some other reason.
Everyone needs to have some relationship to peanuts. It’s unavoidable! If you have no relationship to peanuts, then that’s your relationship — having no relationship! It is logically impossible for you to be alive and not fit into one of the three categories mentioned above.
6) Be ashamed of a fundamental part of yourself.
Some have argued that the motion of money in the economy is moral, since then everyone’s standard of living increases. Watch a commercial that makes you feel terrible about yourself, then buy the thing that the commercial tells you will solve this problem.
7) Have trouble with your keys.
By Isaac Butler
Up on Howlround today, you'll find this lengthy piece about innovation in arts business practices in anticipation of the "National Innovation Summit for Arts & Culture."
I want to talk a little bit less about what it says, and a little bit more about what it's missing, and thus what its assumptions are.
The fundamental assumption of the piece is that innovation qua innovation is both necessary and good. And while it's true that there are a great deal of challenges facing the arts sector, I question a conversation about innovation that is divorced from a conversation about values, because inevitably without those two conversations intertwined, the only innovation that will be seriously considered is how to make more money.
I don't want to be a jerk. It's a well-meaning piece and people who work in arts admin are good people who are working hard at a considerable pay cut to try to facilitate other people's creations. That said, not all innovation is good and business practices are fundamentally a reflection of an organization's values, not just their capacity to come up with new ideas.
Within the piece itself there's some mention of creating more "public value," which sounds nice. But what is public value? And how is it measured?
Let's take two examples of innovation in response to declining audience base and ticket revenue for theatre. One possible solution is dynamic pricing, another is Mixed Blood's "radical hospitality." These are, in terms of their values, diametrically opposed solutions to a problem. And these different values lead to different designs. Radical hospitality (in which all seats are free at the door or can be bought in advance for $20) is about broadening the audience base and serving the widest possible constituency with the possible consequence of losing ticket revenue. Dynamic pricing is about maximizing ticket revenue and incentivizing memberships/subscriptions with the possible consequence of narrowing the audience base.
I would argue that one of these innovations (radical hospitality) is a positive development and the other is most decidedly not. I would argue that dynamic pricing lowers the public value of the institutions that practice it. You could argue that it instead improves it. I don't want to refight that argument right now. What I'm saying is that, by leaving terms vague and assumptions unexamined, we set ourselves up for real trouble.
Obviously, on a business level, theater (and many other kinds of arts organizations) are in serious trouble. And I applaud efforts to create new and better business practices. But there's a part of me that recoils, that remembers that building giant buildings that bankrupt your organization was considered an innovative business solution at some point. Or that double edged swords like enhancement-- which makes musical theater in America possible by using nonprofits as quasi-legal tax shelters-- are innovations as well. This part of me worries that, when we leave the hard discussions about values and mission statements and what an arts organization is actually for, that we will only go further down the path that's gotten us here in the first place, a world with a stable administrative class, arts as a bauble for the rich, and artists getting paid peanuts.
The article is the first in a four-part series. My hope is that future installments will talk about values, about why arts organizations exist in the first place, and what in particular nonprofits-- which are meant to in some way serve the public good in exchange for their tax status-- stand for beyond money.
“Most of our friends we find interesting find us boring: the most interesting find us most boring. The few who are somewhere in the middle with whom there is reciprocal interest, we distrust: at any moment, we feel, they may become too interesting for us, or we too interesting for them”
-- Lydia Davis, Boring Friends
At the heart of Enough Said—the latest from writer/director Nicole Holofcener, the last from James Gandolfini, the best from Julia Louis-Dreyfus—sits a sit-com contrivance. You may already know this from the preview: massage therapist Eva (Dreyfus) meets television archivist Albert (Gandolfini) and superstar poet Marianne (Catherine Keener) at a party and begins dating him and massaging her. She swiftly figures out that Marianne is Albert’s ex-wife and, a bit gun-shy from her own divorce, befriends Marianne and pumps her for information. Soon, Eva can’t stop seeing Albert—whom she is falling in love with—through the poisoned eyes of his ex-wife.
Were the genders reversed, this plot device could be the A story of a garden variety Frasier episode, and some critics find the presence of its contrived plotting odious enough that it wrecks the film. What is fascinating and compelling about Enough Said are the ways it seriously asks itself what kind of person would actually engage in a sit-com plot. And the answer, of course, is someone a bit desperate, a bit selfish, and so charming that few people around them can see either.
The wonder of Enough Said is in its gradual revelation of character, and how this revelation slowly upends the viewer’s understanding of Eva, and how much about Eva’s life and world and psychology is subtly crammed into its scant ninety minute running time. For while it’s never overtly discussed, Enough Said is very much a film about class and—to use a term sure to strike nausea into acting students everywhere—status.
Eva, like many of Holofcener’s protagonists, is in a world that is above her own station. Everyone around her is better off than she is, from her best friends (whom we first meet arguing about whether or not to fire their full-time cleaning lady) to her clients, whom we see just enough of to notice the well appointed environments they inhabit, and who offhandedly ignore Eva’s humanity. Marianne might be a poet, but she’s popular and beloved enough to be accosted on the street and has enough money to have a home that Eva believes is perfect. She’s friends with Joni Mitchell, and possessed with a kind of glamor.
Eva’s aspirational desires and all-consuming need blind her to what the audience can clearly see. From the moment we meet Toni Collette’s Sarah and Ben Falcone’s Will, for example, we know their marriage is in trouble, possibly even doomed. Marianne is a horror show. She’s constantly cruel, monstrously selfish, and barely able to fake interest in the people who have been so affected by her work. Eva’s neediness even blinds her to the ways that she is destroying her relationship with her own self-sufficient daughter by chasing after the affections of her daughter’s troubled best friend.
But then there’s Albert. Menschy, slovenly Albert, a bear of a man with “paddles for hands,” who is charming but not particularly clever, decent but not glamorous, stable but not rich. A guy who invites Eva over for a second-date brunch and answers the door wearing sweatpants and sandals. A guy whose job involves watching and cataloguing yesteryear’s television all day long. Albert is resolutely uncool, unrich, unenviable.
Eva never gives voice to her panic that she might be fucking down instead of up, but it adds ballast to the feather-brained plotting and suffuses every pained look on Dreyfus’s face as she debates Albert’s worth. It also underscores the film’s masterful (and excruciating) climactic double date between Albert, Eva, Sarah and Ben.
Both Eva and the film she inhabits are so goddamn charming that the quite dark and desperate dynamics underlying both take their sweet time emerging. It’s telling that Eva spends the first half of the movie serving up one-liners only to confess to Albert after the first time they have sex that she’s tired of being funny. Charm, after all, means not only the ability to give delight, but also an enchanted object that can be worn as a ward against the world. There are multiple terrors lurking in Enough Said. The terror of being uncool, yes, but also the terror of being unloved, and of being alone. What we truly fear in our own lives is more often than not something that has already happened. The two everyday traumas of Eva’s life—her past divorce, her daughter’s impending departure for college—are the twin poles of Enough Said, and while they fuel Eva’s initial generosity and charm, they eventually pull her apart.
Holofcener handles all of this with a light touch, and it’s the tension between the film’s surface lightness and subterranean depths that makes the film so moving, and such a delight to watch. Enough Said is, in many ways—its structure, its not-quite-resolved ending, the low-key redemptions it holds out, its use of dialogue and action to reveal and transform character—like a great short story, the kind you might read on a melancholy Saturday when you’re in the mood.
The film is only strengthened by its performances. Eva is in every single scene; we only hear what she hears and see what she sees. Dreyfus more than carries the film; she is marvelous in it, bringing just the right kind of ingratiating energy while showing us glimpses of the misery underneath. One of the film’s great achievements is that I’m still unsure how I feel about Eva as a human being a week later, even as I find her compelling as a character. While you’re watching the film, it’s also hard not to find yourself ultimately drawn to Gandolfini. Not only is he playing the rare Holofcener male character who isn’t a yutz, but his performance is easy, graceful, casually worn yet thoroughly convincing. A kind of transference occurs within the film. Eva’s fear that she might lose Albert becomes our recognition that we have lost the actor playing him, a recognition underscored by the film’s dedication “For Jim” during the credits.
For all I’ve talked about some of its darker currents, at the end of the day, Enough Said is still a romantic comedy, one of the few artistically successful ones in recent memory. This also means that it follows the structures of a romantic comedy, and is quite predictable. What ultimately makes Enough Said wonderful—and it is, truly, wonderful, one of my favorite movies of the year—is how, through taking the familiar structures and plots of a rom-com seriously, and by investing deeply in character, it manages to reinvigorate the form, if not quite reinvent it. Enough Said demonstrates convincingly why the rom-com’s ossified plot gestures became so hardened in the first place.
by Isaac Butler
For the third time since this blog's creation, Washington, D.C.'s Theater J has in some way backed down from or reconsidered a controversial play:
Plans for the English-language world premiere of a controversial Israeli play at Theater J in March have been scaled back. The action follows a flurry of activity, including a protest group’s campaign against the play that raised concerns that the production would hinder donations to the institution that houses the theater company.
Officials at the D.C. Jewish Community Center (DCJCC), where Theater J performs and gets other cost-defraying support, in tandem with Theater J’s artistic director, Ari Roth, have decided to pull back “The Admission” from a 34-performance, full-production run in March. It will now be presented in what they are describing as a “workshop” run, lasting 16 performances, in proposed repertory with “Golda’s Balcony,” a biographical play about the late Israeli prime minister Golda Meir.
Really, read the whole article. It's astonishing. I'll still be here when you're done.
The first time this happened, Theater J "contextualized" a public reading of Caryl Churchill's "Seven Jewish Children" by including three separate response plays and a discussion with a man who was saved from the Holocaust by fleeing to British Mandate Palestine. Then, amidst protests by Elie Wiesel, a play about Bernie Madoff by Deb Margolin was rewritten and postponed. I've written about both of these incidents in the past (here, here, here and here amongst other places).
If you read those links, you'll see a lot of emotions on my part ranging from disbelief to fury over Theater J's actions in these matters.
But my perspective on all three of these incidents has changed. I feel very differently now than I used to. I no longer feel angry at Theater J, or at Artistic Director Ari Roth (whom I have come to know a bit since those posts linked above). I feel, instead, saddened and disappointed, but not at the people or for the reasons that you might think.
First, I feel disappointed and saddened because I am someone who has argued for years in favor of both culturally specific theaters and theaters being more deeply rooted in their communities as potential solutions for some of the ongoing problems plaguing the American theater. Any solution, of course, will create its own new problems. I know that. That's the way of the world.
But it's still disheartening to realize that one of the side-effects of doing this might be that our theater gets more conservative, and less challenging, because the communities it is rooted in have certain lines they will not permit their theaters to cross. Looking at Theater J, I no longer see a theater that lacks the courage of its convictions. I see instead a theater attempting again and again to push its community on a core and controversial knot of issues (Israel, Palestine and Zionism) and finding itself threatened for it every time.
I actually have come to kind of admire Theater J's stubbornness in trying to find ways to push their community, even as it keeps blowing up in their faces. As much as I want them to never back down, the pragmatist in me recognizes that they are deeply tied in to a very well organized and well led fundraising community and housed by an organization that's explicitly Zionist in its views. I don't think there's enough said about what a difficult situation that is, and how it's only gotten more difficult as Americans have hardened and shifted rightward in their beliefs on Israel.
Thus, it also saddens and disappoints me greatly to see my fellow Jews banding together to stop other Jews from airing a point of view that they disagree with. But this is of a piece with our recent history. Liberal Jews have long pointed out that within Israel a far more robust debate about Zionism, its history, and the actions of the Israeli state is possible than within America. The kind of op-eds that regularly appear in Ha'Aretz would never get published by a mainstream American newspaper. In fact, one third of American rabbis won't truthfully and openly discuss their views of Israel, for fear of "significant professional repercussions," according to a recent poll (full report here).
What's happening to Theater J is another example of this phenomenon. Both "The Admission" and "Return to Haifa," a controversial play Theater J put on two years ago, were written by Israeli Jews. The fact that we in the States are to be "protected from" the art of Israeli Jews because it does not cohere with our Zionist consensus is as hilarious as it is despicable. If hardline Zionists are right (obviously I don't think they are) then there shouldn't be any problem or threat in hearing from a different viewpoint. We have a rich history of argument in our culture. One of our core literary traditions-- the Midrash-- is based on it. The fact that a play with a particular POV on 1948 cannot be seen unless it is not a real production and is paired with a glowing bioplay of the woman who said that there is no such thing as a Palestinian people is a shanda fir de goyim.
UPDATE: My friend Andy Horowitz from Culturebot noted that he found the term "culturally specific" to be offensive, as all theatre is to some extent culturally specific and the term only gets applied to work that isn't White/Western European. I agree with him that the term is a bit clumsy, but I didn't invent it. While I am unsure of its origins, it's in fact a term I heard for the first time at Arena Stage's Diversity Convening, and heard again at Arena's Black Playwrights Convening, as a term meant to enlarge our understanding of (there's really no good word to use here, is there?) uh, "identity based"(?) theater beyond, say, the Negro Ensemble Company. (Here, to take one example, is Ma-Yi Theater's "about" page, which uses the term.)
In other words, it was a term of art I learned from practitioners of what gets called Culturally Specific Theater, which is why I felt okay using it. Language-- and particularly jargon-- constantly evolves. So the term Culturally Specific Theater, which seeks to recognize the commonalities of and intersectionality between groups doing whatever you want to call this kind of work, may in fact be past its sell-by date and need to be replaced with a term that also recognizes that most Durang plays are culturally specific work in their own right.
by Nicole Beth Wallenbrock
As a longtime Bill Callahan fan (I first saw Smog-- Bill Callahan's previous moniker-- in 1998), I was excited to see a documentary of the tour of his last album, Apocalypse. For those who are less fanatical: Bill Callahan writes unassuming melodies and lyrics that sound simple while explaining the infinitely complex relationship of man to nature, musically edging toward a darker side of Americana. Bill Callahan's voice is unique, haunting and relaxing all at once. While his just-released album, Dream River, showcases flutes, and he has previously even used strings, the Apocalypse album and tour were especially captivating because a trio dominated the songs. While the band was small, the sound was layered and full: Matt Kinsey's guitar wove dissonant notes into harmonies and Neal Morgan's drums increased our suspense. Both augmented the fragility and fierceness in Bill Callahan's voice and strumming.
The film by Hanly Banks (a photographer and girlfriend of B.C.) captured the live performance, and I duly appreciated revisiting the concert of 2011. The director also visually interpreted the Americana theme of the music; at times this was magical, as with the fireworks in "Say Valley Maker." At other times, the montage of working class Americans felt a bit forced, circa John Cougar Mellencamp. The general theme, a tour, matched Callahan's songs, which are often about travel (many times by horse), as did the aural montage of interviews with Callahan played over fuzzy sequences in a car. These interviews are of interest for fans who hang on the songwriter's every word, but they did not shed much light on his artistic process, or even his worldview. But for this we have those amazing songs…
Still, those of us who attended the screening at BAM on Monday received the gift of his presence in an A and Q. We raised our hands and Callahan asked us questions-- most of which he had scribbled carelessly on a piece of paper. His questions were flirtatious ("How often do you think about me?"), narcissistic ("Do you have any famous friends? What do they say about me?"), and sardonic ("Do you think the world is getting worse and is your life getting better?"). He spoke slowly, calmly, enjoying the audience's rapt attention. Eventually he allowed us to ask questions about the film. Although his stage persona is consistently guarded and controlled, he shed tears when explaining how happy he was that the tour, which was extremely important to him, was recorded. It was a personal moment that he did not expand on, but which gave us a hint at the sensitivity at the root of such keen observations.
by Isaac Butler
(UPDATE: Hello Dish readers and others who have been sent here from various corners of the internet. Welcome! This is Parabasis, a blog about culture and politics. I'm Isaac Butler, an erstwhile theater director and writer. I write most (but not all) of this site. You all might be particularly interested in The Fandom Issue, a special week-long series we did devoted to issues of fandom in popular culture.)
Every work of fictional narrative art takes place within its own world. That world may resemble our world. But it is never our world. It is always the world summoned into being in the gap between its creators and its audience.
Yet at the same time, the art we experience shapes our view of the world. As Oscar Wilde puts it in The Decay of Lying:
Life imitates Art far more than Art imitates Life. This results not merely from Life's imitative instinct, but from the fact that the self-conscious aim of Life is to find expression, and that Art offers it certain beautiful forms through which it may realise that energy. It is a theory that has never been put forward before, but it is extremely fruitful, and throws an entirely new light upon the history of Art.
Wilde discusses this in terms of appreciating sunsets through the lens of Turner; perhaps our modern day equivalent is juries being incapable of understanding that real world evidence gathering isn't like CSI.
This odd tension-- that narrative art creates its own world yet helps shape our view of ours-- has given birth (or at least popularity) to a new brand of criticism that measures a story against real life to point out all the ways that it is lacking. You've seen it before, right? "Five Things Parks & Rec gets right about small town budgeting bylaws." Now, with Gravity busting box office records, we're getting astronauts and scientists telling us that there are many points where the film departs from real life. Entire critical careers are now founded on churning out "What X Gets Right/Wrong About Y" blog posts, posts that often completely ignore issues of aesthetics, construction, theme or effect to simply focus on whether in "real life" a given circumstance of a story would be possible.
In real life, people don't talk the way they do in movies or television or (especially) books. Real locations aren't styled, lit, or shot the way they are on screen. The basic conceits of point of view in literature actually make no sense and are in no way "realistic." Realism isn't verisimilitude. It's a set of stylistic conventions that evolve over time, are socially agreed upon, and are hotly contested. The presence of these conventions is not a sign of quality. Departure from them is not a sign of quality's absence.
The Realism Canard is the most depressing trend in criticism I have ever encountered. I would rather read thousands of posts of dismissive snark about my favorite books than read one more blog post about how something that happened in a work of fiction wasn't realistic or factually accurate to our world as we know it. Dismissive snark, after all, just reflects badly on whomever wrote it (at best) and (at worst) cheapens the work it is written about. The Realism Canard gradually cheapens art itself over time. It's worse than the reduction of art to plot, or to "content." Those can still form the basis of interesting conversations. Instead, we're talking here not only about the complete misreading of what something is (fiction vs. nonfiction), but the holding of something to a standard it isn't trying to attain and often isn't interested in (absolute verisimilitude). We're talking about the reduction of truth to accuracy. We're talking about reducing the entire project of fiction so that we can, as Grover Norquist said of the Federal Government, get it to the size where it can be drowned in the bathtub.
And I suspect on some level this is part of the point of the Realism Canard. That art in its size and complexity is too much to handle sometimes, and too troubling. That even though we say fiction's job is to take us out of ourselves, we don't really want to be pushed. So we must take it down a peg, to a point where it is beneath us and thus can be put in its place. And the easiest way to do this is to cross-check it against "real life" and find it lacking.
Take this piece about Breaking Bad in The New Inquiry. It has some interesting points to make about the show's racial politics, but before it can get there, it must shrink the show to manageable size by trying to come up with ways that its depiction of the drug trade isn't "realistic," landing on the show's overemphasis on the purity of Walter's meth. Set aside that the author's realism-based critique of the show's purity emphasis is wrong (purity matters because Walt is a wholesaler, and the purer his product is the more it can be stepped on by the people he sells it to), and set aside that the purity matters for character reasons (no one has ever been able to do what Walt figures out). The accuracy question with regard to Breaking Bad is a complete sideshow. Breaking Bad is not a work of realism. Its aesthetic and language are highly stylized, and its plotting is all clockwork determinism, as anyone who has watched the second season can attest. It's not trying to exist in our world. It's trying to exist in its world. You might as well criticize it for having a sky that's yellower than ours.
I don't mean to pick on that TNI piece; it just happened to be the latest one I'd read. At least it has something beyond factchecky questions to ask. Once you get through that bit, it's well written and eye-opening about some racial dynamics I'm ashamed to admit I hadn't fully considered. But still. The Realism Canard is a problem, and it's everywhere (here's another one from Neil deGrasse Tyson about Gravity), and I feel it spreading more than ever over the internet's criticosphere.
Are there exceptions to this? Obviously. There are works where the idea that what you are watching is a fictional representation of things fairly close to our own world is part of the work's value, whether it be "based on a true story" films like Zero Dark Thirty and The Fifth Estate or social issue (and agitprop) works like Won't Back Down. And there are ways of discussing the differences between art and life that illuminate rather than reduce. That ask the question "what does it mean that they changed this thing about our world?" rather than assuming some kind of cheating or bad faith. Or ways that treat these differences not as a form of criticism, but rather as a form of interesting trivia. Or, in the case of Mythbusters, edutainment.
There is also the issue of representational politics, particularly in light of what we know of narrative's deep intertwining with the processes of stereotype formation in the brain. But I do not think it's inconsistent to argue for diverse representations of the underrepresented-- and more characters that are fully rounded-- and the imaginative power of art.
What matters ultimately in a work of narrative is whether the world and characters created feel true and complete enough for the work's purposes. It does not matter, for example, that the social and economic structure of The Hunger Games makes absolutely no sense. What matters is whether or not the world works towards the purposes of the novel rather than undermining them. People praise August Wilson's portrayal of poor and working class African American life in Pittsburgh, but many of his plays feature an offstage character who is over three hundred years old and has magic powers. One of them ends with a cat coming back from the dead.
The Wire's "realism" and "accuracy" are both shouted from the rooftops, but, for all of its deeply known and felt and researched world-building, it abandons both when it needs to. There is no way that Hamsterdam would exist in present day Baltimore. It's a thought experiment, an attempt to game out what drug legalization might be like. No one really cares, because it works within the confines of the show. Season 5's fake serial killer plotline is not actually any more preposterous than Hamsterdam. But it doesn't work, largely because the shortened episode order left Simon et al. without enough time to adequately set it up, and the tonal shift in Season 5 to a more satirical, broadly-painted mode feels abrupt and off-putting. The problem, in other words, has nothing to do with whether it would really happen, or how journalism or policing really work. It's about the world the show has created and its integrity.
That, the integrity of the piece and of the world it creates, of its internal logics and rules, is what matters. My hope was always that as genre gestures got more integrated into mainstream literature and television and film, the overreliance on realism-based critiques would fade. Instead, it's intensified and is becoming a major mode of critical discourse. It's sad, really. There're so many more riches to be discovered in fiction if we could just let ourselves see them and not be so afraid that it might take us somewhere new.
By Isaac Butler
Gravity is a film about the struggle to survive that wants to fill you with the wonder of being alive on this Earth in this Solar System in this Galaxy in this Universe. It's a film that starts with a title card that tells you life is impossible in space before immediately showing you the Earth and a small handful of humans defiantly, inspiringly proving the title card wrong. It's a film where, even in the midst of terror, there is awe. That awe comes courtesy of writer/director Alfonso Cuarón, a wizard with a camera firmly in command of his seemingly bottomless powers.
To judge from the reactions of most critics and nearly all of my friends, it's succeeding on all fronts, filling people with wonder at all the things cinema and life are capable of. While it left me incredibly impressed, I found its hand ultimately too heavy, and thus its survival-movie tension of terror and inspiration-- for aren't we always scared by death and thrilled by life's stubbornness in survival stories?-- never quite landed for me. All narrative art is manipulative, but each audience member also has their breaking point where they're no longer willing to be pushed. And somewhere after its remarkable opening shot, I passed that threshold.
There's a point somewhere in Gravity's second act where Sandra Bullock's Ryan Stone is, as we say, having a moment in a zero gravity environment. As she does, two small tears float off and away from her face in 3D, hanging briefly in the bottom left hand corner of the screen before vanishing. My breath stopped and my jaw literally dropped, awestruck at the simple and subtle compositional beauty of the moment. It was like nothing I'd ever seen on screen. Two seconds later, in case you missed those two little drops, a huge globe of a tear floats from her face near the center of the screen and holds there, almost distracting in its insistence on being seen.
There's a point much later, as Stone goes into a Chinese space station, when we see a ping pong paddle floating by. It feels like a clever little visual joke. Until it shows up in a shot a couple of seconds later, insistent, hovering.
There's a point much earlier, when various VO lines make sure that you know for absolute sure that George Clooney's Matt Kowalski is taking part in that honored trope of the cop who is "one day away from retirement."
There's a point where a dead body is found, and, in case you don't think it's upsetting enough, the camera focuses on the picture of his wife and child taped onto his arm.
There's a point near the end where Ryan Stone tells you the message of the movie as the music gets the loudest and most melodic it has been thus far.
There are also too many points to enumerate where something so wondrous is happening on screen that you're happy to be alive to see it. My personal favorite is a camera shot that hovers outside of Stone's helmet as she struggles to catch her breath and then slowly pushes into her helmet from the outside, only to then turn around and become a POV shot. Or another where Stone futzes with some piece of equipment while, slowly, in the bottom of the screen, the Earth drifts into view.
Gravity, for me, was an incredibly impressive piece of film-making that didn't quite land as a film. I don't really want to harp on this. Many of you reading this adored it, and I can see why, and I don't think you're wrong to have loved it, even if it didn't quite get there for me. What I ultimately wanted from Gravity was something sparser and more rigorous. Something with much, much less dialogue and no backstory, rather than the hackneyed one they came up with. Gravity's real achievement, after a summer featuring collateral damage fests like Man of Steel and Star Trek Into Darkness, is to insert the value of human life back into mainstream blockbuster filmmaking. I wish they had trusted us enough to find human life valuable without inserting a dead child into Stone's past, just as I wish they had trusted us to see the tears at the bottom of the screen, or the ping pong paddle the first time it flew by.