Unverifiable

28 Feb

A lot has happened since I last posted here. I try to keep this blog focused on culture, not on politics outside of cultural issues. As readers know, we’ve been kind of hit over the head these past few months. How this has affected culture is beginning to show, like cracks in a foundation.

Today I was listening to the normally great WNYC radio host Brian Lehrer, whose local show in New York covers all sorts of issues and politics. He did an interview segment with the writer and psychologist Andrew Solomon about how parents can or should introduce their children to political issues in these troubled times. Lehrer is also one of the hosts of National Public Radio’s new call-in program, Indivisible, which promises to get people talking outside of their bubbles and, in the words of WNYC President Laura Walker, to “find common ground.” The way the show is run, and indeed the ultimate direction of today’s interview, underscored for me the failure of American liberalism – and, in particular, of the press – to address what has happened and is happening in this country and the world.

Lehrer asked about striking the balance between educating children on the significance of the current moment and letting them make up their own minds as independent thinkers, without telling them what to think. I was taken aback by the explicit assumption that parents are not allowed to teach their children ways of thinking critically, or that taking them to a demonstration might infringe on their right to think for themselves (full disclosure here: one of the best things my grandmother ever did for me was take me around her apartment building to gather signatures for her anti-Vietnam War petition to her senators). Still, Solomon made the good point that parents can reframe the discussion away from telling kids whom to vote for and toward the implicit moral distinctions between love and hate, respect and bigotry, inclusion and exclusion, racism, and so on – assuming we choose to live more moral lives. Like a good liberal, you don’t tell people, even your children, what to think; you let them make up their own minds while trying to instill in them a sense of your own morality or, as Lehrer interpreted it, the difference between good and evil (even as we’ve seen a wedge driven between different takes on “good and evil” in the new national discourse).

Some parents called in, made good points, and asked good questions, including one woman whose 11-year-old son turned to her and said, “They’re all crooks, Mom, even Bernie Sanders.” (The whole 17-minute interview segment is worth listening to.) And this, to me, is why liberalism is losing. First, on this level: when people get disgusted with the entire political process and turn away from all candidates as equally bad, that benefits autocrats and harms participatory democracy – by definition, really. Low voter turnouts are not politically neutral; they benefit right-wing candidates. Turning people against politicians – a big part of the rhetoric of the last campaign – and encouraging them to sit home does not have symmetrical results for both ends of the political spectrum. It disproportionately harms those who run on more transparent, egalitarian, inclusive platforms that emphasize the sharing of resources, informed citizen participation, and global sustainability.

But at no point during this interview segment did anyone mention that parents, like the press, have a responsibility to teach their audiences, whether children or adults, how to tell fact from fiction, truth from lies. Much in political culture is based on “opinion” (whatever that means – a topic for another day). But reasonable actions, regardless of one’s morality, cannot be taken on the basis of misinformation, deliberate or not, or on a lack of information. Parents, like the press and teachers, have the solemn responsibility to teach the young how (and why!) to become better informed, and how to recognize an absolute falsehood. Truth. Truth matters. The American press, for the most part, did this too little and too late. The New York Times and Washington Post seem to have woken up to it now, and some of the Times’s recent editorials, such as this one on immigration, are model summaries of critical thinking and the application of facts to analyze and undermine lies – deliberate lies by our leaders. You can still be neutral while denouncing lies and misinformation. There may not be one absolute truth, and facts (and their ramifications) may be debatable, but we can’t allow them to be tossed aside as if they don’t matter while we simply believe the fantasies that confirm our prejudices. We may never determine with certainty how facts relate to causes and consequences, but the search for that connection is vital to our survival.

Unfortunately, whenever I try to listen to Indivisible (which I feel like calling Unlistenable or Insufferable), it feels as though there were a directive from on high at NPR management never, ever to correct callers’ statements, no matter how blatantly false or misinformed. Invariably, within the first couple of calls, a listener repeats some idea that is demonstrably, empirically false. And the hosts – seasoned NPR journalists – let these falsehoods not only sit there unchallenged but even gain credibility as they are further distributed over the airwaves. I’m not saying the callers are unintelligent or uneducated, or that my opinion is more valid than theirs. We can disagree when we are all speaking from a position of being informed. But there are times when they express beliefs about social conditions and historical events that are flat-out wrong. It’s not politically correct to say that, and the shorthand way of dismissing it is to call it “elitist.” Yet it would be hard to imagine a patient opining about how to conduct surgery and the doctor having to follow the patient’s instructions because all opinions are equally valid. For example, when people base their opinions about immigration on the belief that immigrants are “streaming” across the Mexico–U.S. border, that crime rates are higher because of immigrants from Mexico or the seven banned countries, that crime is at a 50-year high rather than a low, or that Obama increased the debt more than any other president – all measurably false, to remove any doubt – they are drawing conclusions and promoting remedies based on information and ideas that are completely erroneous. Aside from the moral dimension, it is pointless to debate whether building a wall is the best response if the so-called need for one can’t even be demonstrated in reality. If the press isn’t there to report the truth, and to call out misinformation in an adversarial way, who is?

But there remains this need for liberals – and dare I say it, white liberals – to “find common ground” and be reassured. One problem is that it’s really hard to find common ground when you understand that the policies of those who disagree with you are actually going to cause you harm, if not kill you. This is a loud and clear message coming from Black America right now: in the form of two essential and devastating documentaries this season, I Am Not Your Negro and 13th; in the need for discussions of reparations as voiced by Ta-Nehisi Coates; in the critiques of broadcasters like Tavis Smiley; or, on a less famous or public level, in the lived experience of my students. It’s an uncomfortable truth that the ideal of “common ground” can’t fully be realized while “systemic racism” is a dominant cultural order, let alone one on the rise.

So when Andrew Solomon ends the interview by telling parents that it’s important to avoid heightening their kids’ anxiety – acknowledge that things may get worse for the world under the current presidential administration, but reassure them that “we’re going to be ok” – that seems to me less a prescription for lessening anxiety than a recommendation to teach your children how to practice denial. That may be not only how we got into this mess, but what will keep us from getting beyond it. It may make for nice parenting, but it is neither good journalism nor sound advice for the future of the planet. If you really believe you’re going to be ok in four years, you’re in pretty good shape – comfortable, sociologically speaking. We have to start by admitting that there’s a good chance most of us are not going to be ok – if you know anything about climate change and its consequences, which is to say, science – and that catastrophes like nuclear holocaust, genocide, widespread gun violence, and ruptured oil pipelines that can contaminate the water supply for millions and wipe out entire indigenous communities are preventable. But that’s only the case if we come together to start naming the truth or, short of that, seeking it out, and cease ignoring facts while our press looks the other way rather than confront dominant falsehoods, as is its job.

Our Flawed Political Leaders

6 Nov

I wrote this yesterday – just some idle thoughts on this political campaign season, a throwaway – and posted it on Facebook. Already at least three of my friends have copied and shared it. So I’m putting it here as a record that it’s originally mine (this blog being easier to archive than Facebook).

(Meanwhile I’ve been carrying two, maybe three more in-depth posts around in my mind since August, looking for the time to develop them further here.)

As I prepare to vote on Tuesday, I find myself thinking and talking more and more about Lyndon Johnson (whom my students have never heard of, incidentally). I was not old enough to remember him or the obscenity of our war in Vietnam (my first actual memories begin under Nixon) – the policies for which, history has decided, he was rightly despised. There is no apologizing for this; it is what brought him down and overshadowed his presidency and his legacy. He was attacked first by the progressive forces of Eugene McCarthy and Robert F. Kennedy, then undermined and succeeded by the likes of Nixon, with the assistance of Cold War Democratic hawks and Dixiecrats.

And yet, here’s a list of what this often duplicitous and untrustworthy, wheeling-and-dealing politician oversaw in just five years of his presidency: the Civil Rights Act (actually two of them, the second including the Fair Housing Act), the Voting Rights Act, the Immigration Law of 1965 that ended quotas and preference for white Europeans, the first Endangered Species Act, the acts that created Medicare, Medicaid, the Equal Employment Opportunity Commission, Head Start, the Corporation for Public Broadcasting (NPR and PBS), the National Endowment for the Arts, and the National Endowment for the Humanities, and the act that made food stamps a permanent program. (Not to mention appointing Thurgood Marshall to the Supreme Court.) None of these, which have altered and improved the lives of all Americans, would have existed or passed without him – flawed, dishonest, and hawkish as he was. Fifty years on, it’s hard to imagine American life without them, even though arguably whatever social justice they helped to bring about at home stopped at the border of our empire. It’s also hard for us to conceive of the kind of political imagination that could envision such dramatic improvements reshaping American life in just five years. (Even as there were more progressive voices inside and outside of government, some of whom formed useful alliances and some of whom remained in opposition.)

Little of this happened without considerable popular pressure from progressives – progressives who had been mobilized for years by violent and structural injustices in Southeast Asia and the U.S. South. As I’ve said before, democracy doesn’t end on Election Day; it begins then. No one we elect on Tuesday can or will be a savior. Not only is there no perfection, but it may take generations before we significantly change the course of our foreign policy away from its history of massive spending on war, violence, and weaponry, or address the damage we unleashed during “shock and awe” in 2003. Are we going to address economic inequality, racial injustice, or the steps needed to stop catastrophic climate change in the next four years? That’s up to us, but it will help not having someone in office who would like to forcibly turn the clock back to a time before any of these social advances were part of the fabric of modern America. Nor will it help having someone in office too flagrantly disrespectful of science, sociological evidence, public policy, history, tolerance, and gender equity to recognize the relationship between the folly of willful ignorance, nationalism, hatred, and catastrophe.

Held together by a “Skeleton Crew”

15 Jun

When I had menial summer jobs during college, seeing a good film the night before could upend the tedium of the entire shift the next day, as thoughts, impressions, and analysis of the film and its elements swirled in my mind. One of my strongest memories of this is from working in a library, where I was doomed to change the labels on the front of card-catalogue drawers all day – unscrew, remove the old label and mylar covering, insert the new label and mylar covering, add a paper backing for thickness, screw the assembly back onto the front of the drawer – for minimum wage (then $3.35/hour), when I chanced to see Paul Schrader’s brilliantly written film, already in revival, Blue Collar, with Richard Pryor in perhaps his greatest dramatic performance. That film, about auto workers in Detroit, gave me plenty to think about through all the hours of the very next day, and I still remember how replaying it in my mind not only made the time pass but let me bear down on the themes the film raised and think about the unavoidable conflicts, class-based as well as racial, inherent in industrial capitalism.

Last night I had the chance to see another scripted Detroit auto-worker story, this time the new play Skeleton Crew, by the young playwright Dominique Morisseau, with whom I had been unfamiliar until now. I’ll certainly be keeping an eye out for her other work from now on. This is a play that, in capturing the precarious existence of even skilled, union workers in the contemporary American economy, gives me hope that our theatre can still take on significant economic and social issues with sophistication and empathy – that theatre can do so much more than entertain, by showing us the fragile humanity caught up in our crumbling economy. Our safety net has been ripped to tatters, even among the most strongly protected union jobs. Far from the labor optimism of Clifford Odets, we now feel as if we are watching the sun set on union protection, as individual self-preservation is pitted every day against collective solidarity, because advancement comes at a moral cost. In this sense, Morisseau’s play evokes Arthur Miller’s tragedies of psycho-economic conflict (Death of a Salesman most famously, but even more strongly both The Price and All My Sons). The dialogue is both natural and naturalistic, yet at times carries a tone as precise and ringing as that of The Crucible.

I’ll leave it to Ben Brantley in his rave review to provide more of the play’s plot background. But in brief: each of the four thoroughly drawn characters occupies a tenuous position in the work hierarchy of a Detroit auto plant in danger of shutting down – the union rep, two others who work with her on the assembly line, and the supervisor, now management, who has risen from the union ranks to a teetering position in the middle class. We learn early on – though not all the characters know – that the plant will close, and it falls to the supervisor to make recommendations about who will be fired in advance or laid off, who will be transferred to other plants, and who gets a good severance package, while the union has to scrape and scramble to protect its dwindling and vulnerable workers. One of the workers has just bought a house, one is a year away from full retirement benefits, one is saving up to start a small business, and one is about to go out on maternity leave as a single mom.

The play is not just an indictment of our economic system – our collapse as a country when it comes to providing a decent standard of living to increasing numbers of people (a collapse happening faster than Europe’s) – but also an inquiry into what happens to people morally when they get close to the line that separates management from workers, and those who think they can become secure from those who see themselves sliding into peril. We all have enough personal flaws and financial soft spots (a cancer diagnosis, say) to bring us down. But the question remains whether the moral response is enough to offset the effects of an amoral economic system. Still, nothing in the play is contrived: there are no devices to move the plot forward, no sudden second-act revelation of secrets that forever changes the characters and the way we understand the play. Life plays itself out without, as Brantley observes, melodrama. All of us who have worked in an office setting know the complicated ways that office mates come to know one another with a special kind of intimacy, as friends and sometimes not as friends, even though we can spend as much waking time with them as we do with family. The nature of the work relationship is different from worker and class solidarity – it is more complex, even in union shops (as I now know, working for the first time in my career in a unionized position). Friendship, comradeship, power plays, and conflicts are all there, and we come to care about one another because of our frequent and purposeful contact.

This is highly engaged and perceptive theatre. What it offers over film is the intimacy of getting to know four complex and multidimensional characters by being physically close enough to touch them. In seeing them in the flesh, as opposed to on a two-dimensional screen, we can identify with their pain and anxiety as (if) we come to know them. The actors have to become the people, such that not one sentence can sound written. The repartee, the comebacks, the conflicts must remain spontaneous.

Yet at the same time, there is the paradox – external to the play itself – that people who share the background and social status of the characters could not afford to see this production, even at off-Broadway prices. For that reason (among the demands of real life), I personally have not been able to see as much contemporary theatre as I would like, so I cannot say categorically that this kind of new social realism is rare, but I suspect it is. I hope it’s part of a new wave.

The reason we remember Miller, Odets, and Lorraine Hansberry is that they expose something real yet complex about the relationship of individuals and families to the economic matrix. Perhaps this is what it means to be American in the post-manufacturing age. And even though race hovers over this play and the deep vulnerability of its characters, the racial positioning of the characters themselves is far more ambiguous and complicated than what Hansberry’s Younger family had to deal with: both moving into the middle class and remaining in the working class are fraught with dangers of different kinds.

All of these tensions become that much more heightened as – in every industry, whether manufacturing, healthcare, or higher education – fewer and fewer full-time workers, relative to the growing need, are being asked to do more, work more, give more. We are all becoming the very skeleton crews keeping this nation’s professional engines running, generating products, services, care, and knowledge, while our brother and sister workers, our dads and moms, are severed, cast off, or demoted to precarious, contingent positions – or, as the play points out, moved from skilled labor in auto plants to jobs with no human impact in copy centers. Unemployment may be technically low by quantitative measures, but it is a skeleton crew that is left doing meaningful work, in both the middle class and the working class, leaving bare our open wounds of aspiration.

The Privilege of ‘failure’ in the precarious economy

1 May

I sat tensely with the search committee in a conference room for the final interview for an academic tenure-track position.  I really needed the job, having been on the job market for three years and having only been able to find part-time, contract, or adjunct positions for the past two.  This was the first interview I had gotten as a finalist in all that time. One of the committee members turned to me and asked, “We noticed that ten years ago, you left your position [as executive at a non-profit] after less than one year.  Can you explain to us why you left that job after such a short time?”

I was prepared for the question, so I gave it my best spin. I had had a conflict with the board president; I had made innovations and had measurable successes in that position, successes recognized within the larger community; but the board redefined the position, demoting it from executive to office manager, because they decided to retake control of the organization’s day-to-day operations. Yet any way I spun it, without independent corroboration, the story left open the possibility that I was difficult to work with, uncompromising, a poor communicator, or even incompetent. And for all this to have emerged in less than a year on the job suggests either a disastrously bad tenure or an unforgiving board with no patience for disagreement.

Though one can never know if there is one definitive reason, needless to say I didn’t get the position for which I was interviewing.

I have never used this blog for trolling, settling scores, proving my political correctness, or sour grapes.  But I think it incumbent to point out that I’m willing to bet Johannes Haushofer has never faced such a question in any of his job interviews.

The reason I’m posting this entry is that no fewer than four people I greatly respect have re-posted Princeton University Assistant Professor Haushofer’s so-called “CV of Failures,” or articles about it, on Facebook, garnering the predictable number of ‘likes’ from students and other academics. Even NPR, as it is wont to do, ran a lighthearted report on the topic on Morning Edition. One reporter wrote that the takeaway lesson from this is that “The real tragedy isn’t these failures — it’s when these failures convince people to stop trying.”

Even the professor who wrote the original article on which the idea was based, Melanie Stefan, drew two conclusions: this same one (“we construct a narrative of success that renders our setbacks invisible both to ourselves and to others. Often, other scientists’ careers seem to be a constant, streamlined series of triumphs. Therefore, whenever we experience an individual failure, we feel alone and dejected”), and what I think is a more valuable one: that even the most successful scientists and academics face a ratio of about six failures to every major success. That latter point is a valuable, and encouraging, insight.

But from their lofty positions at Princeton, Harvard, Oxford, Cal Tech, and Edinburgh, both Haushofer and Stefan miss the economic context. And that is what makes this approach, to me, so infuriating. Both are writing from positions that reflect an inevitability of ultimate success and security – the uppermost echelons of academic achievement, especially for such young, promising scholars who got top positions right out of grad school. But the adjunctification of higher ed (not to mention global poverty and the precarious economy) guarantees no such narrative of success for most of the people taking part, even those from top-tier graduate programs. If you fail six out of seven times but still end up with a position at a university in the ranks of Harvard or Princeton, then yes, by all means, teach younger people not to give up or get discouraged. But be careful to avoid an error in logic. There’s a big difference between the lesson “you’ll never get a position (or anything you want) if you give up” – which is logically true – and the lesson that “if you never give up, eventually you will get a position (or whatever you seek).” The latter is a logical fallacy. There is no demonstrable guarantee that refusing to give up will lead to success. Or, put another way:

Giving up → No success

but the inverse is not true:

Not giving up ↛ [does not lead to] Success
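For readers who like their fallacies spelled out, the gap between the two lessons can be written in propositional form (my formalization, not anything from the “CV of Failures” literature):

```latex
% Let $G$ stand for ``giving up'' and $S$ for ``success.''
% The true lesson:
G \implies \lnot S
% The fallacious inverse:
\lnot G \implies S
% The first claim does not entail the second. In particular, the state
% $\lnot G \land \lnot S$ (persisting indefinitely yet never succeeding)
% is fully consistent with $G \implies \lnot S$ --- which is exactly the
% situation of the permanent adjunct who never gives up.
```

Inferring the inverse from a conditional is a classic formal fallacy (denying the antecedent), which is the error being described here.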

The other moral and practical part of the lesson they leave out is that it’s not enough not to give up; learning from one’s failures is a central ingredient in overcoming them. Picture the analogy of the fly trying to get through the glass window pane or, better yet, the sperm trying to fertilize the egg, because in this day and age there aren’t enough eggs to go around. You not only have to figure out a way through or around a pane of glass you can’t fully comprehend, but you have to do so in competition with dozens if not hundreds of others.

The narrative of inevitable success is also based on a fallacy of logic. The 6:1 failure-to-success ratio of the most successful scientists may be accurate, but in an age of declining positions for full-time academics (not to mention other industries) and economic precariousness, that ratio is much, much higher, even for those like myself who have nonetheless ended up with great tenure-track academic positions. And for many, the ratio is infinite, since they may never get the full-time position they seek. Two successful tenured faculty members told me during my job search, “Oh, you’ll be just like [so-and-so], who did great academic work but never had a permanent position.” I could take that to the bank as I was fighting to make a mortgage payment (and went three years in middle age without health insurance). For those who are fighting to find such a position, competing against literally hundreds of other applicants, releasing a “CV of Failures” is unrealistic – they’re too busy trying to conceal their failures from search committees. It may not be a rejected fellowship or grant proposal, but rather not having any successful ones, or having been fired, or poor evaluations, or gaps in employment.

Having been on both sides of this, I can say it’s a lot easier to stomach rejected grant proposals when you have a reliable batting average such that you “know” you will get some.  Many people are not in a position to get any grants.  Or, for example, the last NEA proposal I ever wrote – and, in my opinion, the best – was never considered by the panel because I was fired from my job before I could send in the required artistic samples to complete the application.

In a market economy with a high level of precariousness and underemployment, in which at this point a small minority of qualified people will be getting the academic positions for which they have trained (unlike, say, the medical or legal professions), any real failure, any real negative mark on your CV, is going to be enough to disqualify you permanently – or you may at least have the very real fear that it is so, even if you never give up.  That’s why only those who have reached a certain level of the highest academic success will dare to display their so-called “failures,” because they are in reality failures without consequence. And being able to have failures without consequence is a great privilege.

Irony of ironies, one of Professor Haushofer’s research areas is the psychology of poverty. And it is here that he shows himself to be tone-deaf. In the abstract of one of his articles from 2013, Haushofer notes, “low incomes predict lower intrinsic motivation and trust, less prosocial attitudes, and more feelings of meaninglessness…Income inequality is an additional predictor of psychological outcomes across countries, and is associated with loneliness, short-sightedness, risk-taking, and low trust.” Presumably, adjunct professors would fall into this category of low-income workers. So wouldn’t people with that psychological profile be more vulnerable to failure and frustration? In another study, co-authored with his thesis advisor, he demonstrates that poverty leads to stress and other negative psychological states, creating feedback loops that reinforce behaviors that maintain poverty. The causality is in fact well established. Interestingly, in one study they cite, farmers worried about their economic status performed worse on cognitive tests when reminded of their financial woes; if that finding holds across professions, it could easily produce a feedback loop in which those in low-paying academic positions face greater stress about success and are more likely to underperform throughout all stages of the job search. So while not giving up would be good advice even for them, he should know from his own research that such a pep talk (let alone listing one’s failures publicly) is not going to be enough to overcome the psychological effects of years of low income in underpaid, overworked adjunct teaching positions, in which one also fights against feelings of “meaninglessness,” especially in terms of unsupported or self-supported research.

As for me, I can encourage my students and friends not to give up and to tell them privately that I faced many setbacks in my job searches, but that I always tried to learn from each setback.  As for listing my failures publicly, I’ll wait until I have earned the privilege of tenure, a level of protection and job security that also happens to be under threat from an ever more precarious economy.

Addendum (24 June 2016): Here is another example of the parade of failure as a status symbol by the highly successful and privileged. The idea that “a stressful and potentially embarrassing experience [can be] spun into an opportunity,” or that, as “any TED talker could tell you, failure is so hot right now,” applies only to the most securely successful and well-paid in our workforce. For most people who lose their jobs in this precarious economy – jobs that pay considerably less than $300K a year, a salary at which, with any savings, one never really needs to work again – losing one’s job is a disaster, financial as well as psychological (unless you’re cocky enough to assume someone else in your social class will snap you up). For most people, even most professionals, who lose their jobs, this is no laughing matter.

The Feast Day of Óscar Romero

24 Mar

I have to start by saying I am not a Catholic, so everything that follows comes from someone outside the tradition. And yet, every year when March 24th comes around, I find myself meditating on the life of Archbishop Óscar Romero, who was assassinated on this date in 1980, just before – full disclosure again – I became involved in movements to keep the U.S. military and government from intervening in the wars in Central America.

Then I came across this Vatican Radio article (on Facebook, no less) about the first occurrence of Romero’s Feast Day since his beatification by Pope Francis. I don’t know how such things work, but I find myself moved that his first Feast Day can’t be celebrated as such because this year it falls on Holy Thursday. I’m also deeply moved that there is an effort to beatify Fr. Rutilio Grande, whom Pope Francis himself knew.

Why does this matter, even to nonbelievers? I mean, even if the idea of a Feast Day or a beatification has no meaning for you, what is it that makes this stand out as so important? From my perspective, there are two reasons – two very powerful, even emotional, reasons.

First, especially in light of the illegal and inhumane EU–Turkey refugee deal last week, when all the governments of Europe and Turkey conspired to deny the most vulnerable populations of refugees their human and legal rights, it is clear that we live in a moment of such widespread mediocrity in our leadership worldwide (with the exception of the current Pope – but any others?) that not one of these European leaders will go down in history leaving behind a single memorable legacy or achievement. Every one of them is ultimately forgettable, because they stand for nothing when it comes to culture, morality, building a better and sustainable future, equity, vision, community, or humanity. And at the worst possible moment: the decisions and actions we take over the next ten to twenty years in response to climate change will determine the fate of human civilization on this planet. We can’t wait a couple of generations for better and more moral leadership to come along. Time is running out, and the leaders we have selected are bureaucrats more interested in national security than human security, and in profit and privatization instead of mutually supportive economic practices and goals.

Second, especially in the United States, the idea of morality has been almost completely hijacked by the right, in ways that are sanctimonious in principle and hypocritical in practice.  Moral leadership has become aligned with conservatism (even as the conservatives running for President sink to new lows in rhetoric and the morality they express from the campaign podium).  To read about the moral courage of someone like Romero amidst the background noise of the gutter and trivia of American politics is to throw our true bankruptcy into full relief.  And again, at a moment in geological time when we have to make life-or-death decisions determining the fate of life on the planet, our front rank of leaders (many of them elected) exhibit such moral weakness in the face of multinational corporations and the seductions of quick profits that we have little hope of ever finding our better selves, let alone putting them effectively into action.

It’s not a question of waiting for leaders as if they are going to be anointed and appointed from party apparatuses above to look after us.  It is instead a question of how exceptional and moral leadership bubbles up from the bottom but inserts itself by shifting the channels of power like small rivers that set their own course as they stream.  Think of not only Rutilio Grande, a priest rather than a member of the hierarchy, but of young leaders like John Lewis and Cesar Chavez and Berta Cáceres.  I’m sure I’m not the first person to pursue that mystery of our culture: Why is moral courage despised, to such an extent that we reward and let our world be run by those who instead value expedience, profit, and self-aggrandizement?  How did we develop a social system that from an evolutionary perspective works against our own species’ long-term interests?

“Deceit is in the hearts of those who plot evil,
    but those who promote peace have joy.” – Proverbs 12:20

 

The Decimation of Democracy’s Critical Class

13 Mar

There are two social institutions whose independence and viability are essential for the functioning of a democracy, and which are vulnerable to structural dismantling in a way that takes at least a generation to repair.  They are vital for a democracy precisely because they muster the ability to criticize, question, and push back against Power, against the walls behind which government, business, military, or religious institutions exercise control.  These sectors are the press and higher education.  They operate, outside the walls, not as isolated voices but as collaborations of research and revelation, networks of thousands of individual voices operating as a chorus with a shared commitment to uncovering and approaching the truth and then disseminating their findings to readers, students, and other colleagues.  (And I’m under no illusions about any kind of uniformly enlightened academia or journalism – there can always be reactionaries and hacks in any large tent.  But then again, the complexity of those sectors makes such a spectrum possible.)

It is generally accepted worldwide that a free press is absolutely essential for that reason, though the same is not as widely accepted of universities in that kind of constitutional sense, because there are those who believe universities are just for the teaching and mastery of job skills, not for the independent voice of social critique.  After all, freedom of the press is enshrined in our Constitution, though academic freedom is not.

The press does not exist merely to record and transmit the official story.  Universities do not exist merely to provide job training for future workers who will serve government or business without being called on to make decisions.  The basis of living in a democracy is the right to participate in decisions about the community’s future, and the basis of being a moral and effective worker is the ability to have a say in decisions that affect the corporation as well as the surrounding environment (natural as well as human).  The essence of good decision-making is not just critical thinking but also having a well-developed body of knowledge about the issues before us – knowledge that can be complex, but that weighs valid evidence and perspectives rather than ignoring them.

In this context, it is frightening to read, in a very moving investigative article by Dale Maharidge, that the number of full-time reporters for daily print newspapers in the U.S. has dropped 40% in the past nine years, and that rate may accelerate.  (This article is really worth reading – devastating – and was the inspiration for this post in the first place.)  As Maharidge makes clear, it’s not just that daily print newspapers are being replaced with Internet journalism.  First, older reporters with long and local historical knowledge are being let go while inexperienced younger reporters are stepping in.  Second, web-based news is more likely to be national or global rather than local, and even worse, as Maharidge contends, more likely to be centered on celebrity and what is entertaining, rather than on what has implications in people’s lives.  But perhaps most devastating is that this new generation of freelance journalists is being asked to work or write for little or no pay, or at best is paid only for the stories they can sell.  Certainly only in the rarest cases are Internet reporters well-compensated and receiving benefits, although recent unionization at Gawker and other news websites is an encouraging start.  At the same time, the type of stories being covered is changing, from local and hard news that require interviewing and digging, to pieces that are either unquestioning repeats of political declarations by our leaders, or that are entertaining (including fear-mongering as a form of horror-show entertainment).

A 40% cut in practicing, full-time personnel would be devastating to any industry.  Not just for the lives and families affected, but for the loss of output, historical knowledge and knowledge of the craft by the elders in the field, and for the inevitable rush to the center among the survivors.  Picture a fishing vessel facing waves crashing over the sides and sweeping the crew overboard.  Those who want to survive will run towards the safer center and cower, rather than ever risk standing near the edge or exposing themselves to risk of any kind.

That kind of sizable cut would also imply that, even assuming the nature of news stories were to stay the same, there would be that many fewer stories exposed by the press, because there would be more topics than the remaining writers could accommodate.  Imagine a 40% cut in the number of stories about climate change, for example, or remaining reporters now having to cover, say, the environment as well as another beat.  They won’t be able to produce as much, or investigate as deeply or broadly, and they will also have to master multiple fields with professional sophistication in order to interpret what they are being told.  (I gnash my teeth sometimes when I hear even NPR reporters who can’t get the details right in immigration law reporting.  And we are all still waiting for just one reporter with evidence to confront Ted Cruz on his oft-repeated claim that Obamacare has cost thousands of jobs.)  Put another way, instead of 50 reporters on the ground covering a war, now there would be 30, or there might be 50 but they have to cover more countries and more conflicts, and obviously can’t be in two places at once.  Stringers are constrained by having to write what will sell, rather than having the financial support of a newspaper to pay for their livelihood while they dig.  In every case, depth as well as the inductive and experienced knowledge from being on the ground are all sacrificed, and can’t be easily recovered.

Once the business plan of daily newspapers and the field of journalism in general shifts to such an extent that such a high percentage of practitioners is lost, it’s hard to imagine an equal and opposite reaction on the other end.  In other words, the proverbial pendulum may not in fact exist, and there won’t be a time when suddenly there’s 40% growth in jobs in declining industries like print media.  Newspapers are shutting down much faster than they are starting up.  After all, even if there is a massive rehiring, it will take at least a decade for all the new hires to begin to acquire the kind of experience that presumably makes specialists wiser and more able to develop a network of sources.  (Personal pet peeve: there is nothing I hate more than random “person-on-the-street” sound bites – the impressions of either totally uninformed or prejudiced people, and usually just one at that – put on the air, especially in lieu of interviews with informed parties on multiple sides of an issue.  But I will return to this in another post.)

The same goes for universities, especially researchers and writers.  Much more has been written on the shift over the past twenty years from full-time faculty, engaged in research and writing as well as teaching, to adjuncts hired to teach only, and at such low wages that they are forced to take on extraordinary teaching loads to make ends meet.

Universities are famous worldwide as crucibles of dissent and of research and science (no contradiction there).  And while teaching the young – not just teaching material but teaching the right to question – can be an exercise in freedom, the time and resources to conduct research are at least equally important.  It’s the R&D division of democracies, if you will, and what company can innovate and respond without investment in R&D?  Wipe this sector out and you wipe out an entire intellectual class (like it or not, for millennia every complex society has had its scholar class).  If governments and church denominations can control universities – especially the time and liberty to conduct research, as well as what is taught and what is disseminated to the public – then the critical potential of universities can be circumscribed.  In its most extreme form, this state or military control can lead to the assassination of university leaders, faculty, and students (for example, the murder of the Salvadoran Jesuits at the Universidad Centroamericana in 1989).  But there are more subtle and systemic ways as well: tying research funding to military and business ends, cutting government funding, and most recently, filling boards with figures from business, not academia.  As many have pointed out, this leads to restructuring the faculty so that the majority of classes are taught by underpaid, contingent workers with neither job security nor research portfolios, rather than comfortably-paid professors with lifetime appointments, institutional memory, and the ability to work with students on social and political issues without fear of losing their jobs.  I’m not saying anything new here that hasn’t been said and documented in more detail by others, both the “adjunctification” of universities and the retreat from enlightenment, if you will, described by Jane Jacobs as well as, most recently, Marilynne Robinson, among many others.

In about 25 years, the percentage of college courses in America taught by full-timers has dropped from about two-thirds to 30%.  The number of full-time faculty has not expanded with the increase in the population attending college, meaning that student-faculty ratios have increased, as have faculty teaching loads.  The emphasis is less on the productive work of professional intellectuals as scholars, and more on providing credits for students to obtain their degrees – and in fields in which they are more likely to be able to pay off their debts, because tuition has outpaced inflation and college is actually harder to afford now.

As I said, others have written about this more than I, and even I have written here about some of it.  But here’s the significant point: in one generation, American universities have changed to a business model that favors training, employment, and paying off debt (for alumni), and part-time, contingent work over lifetime investment in faculty to do work including research, writing, and occupying a critical role in our society.  Adjuncts can be outstanding teachers, but their job function does not permit them the time or resources to be researchers or voices of conscience.  And then, will it even be possible for current graduate students and undergraduates to find full-time careers as scholars and professors?  Some will, but how many – and who – will be sacrificed in the name of competition?  (A little bit like the journalists who are getting laid off.)  My heart broke for the young political science major from Florida who told Hillary Clinton in the Miami debate that she wanted to go on to get a Ph.D.  Sure, we need people like that, but will there be enough chairs in the market for her?  Or will she invest 5-10 years of her life only to get jobs that pay, in total, $25,000 a year with no health insurance?

As for the research itself, why wouldn’t you want to be creating positions for more medical researchers, more sociological researchers, more science researchers, to address the most pressing problems of our time?  After all, if you want to find a cure for, say, colon cancers or dementia, why wouldn’t you want to have more researchers working on this and involving more young people in the research and showing them the ropes?  It’s simple common sense that 200 scientists working on a problem or treatment are more likely to come up with useful results than just 120 could.

It’s going to take a lot more national imagination to figure out a way to restore that intellectual class, including a restructuring of education funding so that tuition doesn’t become the main economic lifeblood of every college and university.  That not only makes students feel they are “consumers,” it also means there is less money to invest in projects that may or may not produce significant short-term results.  Such a renaissance of what universities can achieve for democracy and humanity is years away.  Same thing with rebuilding the journalism industry.  It’s not just the local print dailies, but the kinds of stories and reporting – and, as a by-product, the civic involvement – they were able to support.  That means getting readers to be interested in learning what is going on around them, and not just parroting and reinforcing their prejudices or following their favorite celebrities (including news personalities) as news.  Yes, the next generation could take this on, with the help of current (tenured) academics and experienced reporters – if they can find the money to support such work.

Alarmingly, we’re at a historical period when we really don’t have time.  The press and universities cannot be absent at what all evidence suggests is a crossroads in our decisions about how to handle climate change and whether or not to continue extracting fossil fuels.  Unlike past generations, this generation has the unique timing to come along when the decisions we make will affect habitability for the next few centuries, if not the fate of humanity itself.  We don’t have twenty years for universities and the press to come up with a critical agenda of questions and answers that allows us to find solutions, and to grill our elected leaders to do the same.  The disappearance of universities and reporters as significant critical voices is coming at the worst possible time, and we haven’t even found a way – or the political will – to begin to reverse the trend.

 

Chris Rock: Risky Insights or Flat Notes?

29 Feb

Even before last night’s Academy Award ceremony was over, online columnists were congratulating Chris Rock for his “thorny, meaty, and hilarious” and “brilliant and brave” opening monologue. While he certainly made points that took the Oscars in a better direction – and spared us the nearly revanchist embarrassment of Neil Patrick Harris as host – perhaps I am alone in finding it flat and oddly reassuring when it could have been risky and provocative.

I applaud the way Rock explained and exposed “sorority racism,” but it was also an opportunity to introduce America, using biting satire, to structural racism more broadly.  Let’s recognize that Rock is now of a stature where he has little to lose.  So to say that the issues people are facing today are less serious or significant than those of fifty years ago may be, on some level, empirically true, but structural racism, economic and educational inequality, mass incarceration, and police violence are still pretty significant manifestations of oppression that continue today.  It’s not just a question of “opportunity.”  It is a question of inequity when it comes to who produces pictures, who is hired in positions of power and decision-making, and who is actually purchasing most of the movie tickets in this country and this world.  Structural racism is perpetuated by those kinds of imbalances between the producers, the artists getting work, and the people buying the tickets and buying into the dream.

So no, I don’t think his speech would qualify as “meaty” or as “brave.”  The past couple of weeks, I have been showing my students films about the Black Panthers, Nina Simone, and Tommie Smith and John Carlos – people who risked and sometimes lost everything and yet who are unknown to college students (even adult ones) today.  Compare them to Beyoncé, for example.  I found Rock last night to be not his usual edgy self, but safe, even at times reassuring that we will be able to get past today’s issues.  To me his message included a kind of subtext that said: Hollywood, we know once you provide more opportunity things will get better – without really digging in to why it’s more than just a Hollywood problem and more than just an opportunity deficit.  He had the stage and could have had a moment where the satire was as sharp as he has been in the past, but to me his message was blunted.

But one line really bothered me.  The joke, quoted as “When your grandmother’s swinging from a tree it’s really hard to care about best documentary foreign short,” hit a sour note for me for several reasons.  First, lynching is one of those rare topics that to me doesn’t belong in any joke or any line that’s going to end with a laugh.  It’s beyond the pale, even if the satire is in the service of a larger, just point.  I know he wasn’t making light of it in any way, but even as a throwaway in an intro, the image is too horrible to turn around and laugh at, no matter the gallows nature of this particular humor.

But second, and more subtle, is that “best documentary foreign short” actually is very important for all of the same reasons why we struggle for diversity.  Those other categories at the Oscars, especially the documentaries but also foreign films from time to time, are exactly where the issues of racism, violence, injustice and so forth have been openly discussed when Hollywood and mainstream cinema have been way too timid to take risks.  People need to see those documentaries, precisely because they are not trivial, because they do cover lynching.  In fact, this very year’s winner for Short Documentary is about honor killings in Pakistan, which are indeed lynchings – some irony there, no?  (Shout out to director Sharmeen Obaid, who generously agreed to meet with me and my students at last year’s Tribeca Film Festival showing of her radiant documentary Song of Lahore.)  Without documentaries like this one, how would we know or learn about killings like this around the world?  While I haven’t seen it yet, I am also eagerly awaiting 3 1/2 Minutes – 10 Bullets, which was short-listed this year though not nominated.

To be brave you have to risk something.  To be meaty, brilliant, and thorny you have to provide insights that don’t just voice what most of the people in the room would like to say, but that take them to a different level of understanding or provoke them to investigate further.  With great respect for Chris Rock’s career, I don’t think last night he achieved either.  Then again, I grew up in an era of Oscar telecasts with acceptance speeches that included congratulations from the Viet Cong, condemnations of fascism, McCarthyism, and anti-Semitism, parallels between U.S. intervention in Central America and Vietnam, the role of American corporations in the nuclear weapons industry and pollution, and more recently speeches by Michael Moore and Errol Morris two years in a row.  None of those Oscar speeches were so celebrated, and in fact most were derided as inappropriate.  Sad to say, but the Salon and Mother Jones commentators may be too young to remember a time when political protest wasn’t so safe and watered down as it is today.

*   *    *

On a brighter note, lost in the discussion of the absence of people of color in the acting categories was the fact that the winners’ circle of filmmakers was actually quite diverse.  The Best Director award went to Mexican director Alejandro Iñárritu for the second year in a row – and in fact the third consecutive year for a Mexican director.  As mentioned above, the winner for short documentary was Pakistani Sharmeen Obaid, winning her second Academy Award as well.  (She may be only the second woman to win two directing awards, after Barbara Kopple – I’ll have to check.)  The animated short film was directed and produced in Chile, and the director of the feature documentary, Amy, is British of South Asian descent.  Also, the producer of the full-length animated film is a U.S. American of Latino background.

While much can still be written about American cultural and cinematic hegemony – after all, there are thriving and major popular film industries coming out of India, Hong Kong, Mexico and many other countries, yet only the American Oscars are seen worldwide and American films exported with more force behind them than other countries’ films – the Oscars are changing with more foreign films and directors getting some recognition, or even work.  Who would have thought that the directors of Amores Perros or Y Tu Mamá También would come to Hollywood and win Oscars?  There is much here in the hidden diversity of Hollywood to be written about later, such as why, for example, no one ever seems to acknowledge that since 1980, there have been 33 women directors who have won Oscars for documentary films, including Barbara Kopple and Laura Poitras. That doesn’t excuse structural sexism – why women get to direct documentaries, and usually shorts, but not feature films – but it does complicate it.  But more on this another day.

*   *   *

Finally, very happy for Mark Rylance.  (Again, time was when the Academy would have given it to Stallone for sentimental and commercial reasons.)  Though I haven’t yet seen Bridge of Spies, he is one of the great actors of our time.  I’ve been privileged to see him onstage three times: in Cymbeline in New York playing multiple roles, where I first took notice (especially when he played multiple characters in the same scene) and when, as he told Leonard Lopate this week, he was a complete unknown, then in one of his Tony-winning performances in Jerusalem, also a great play, and in London, as Richard III at the Globe.  I missed two or three of his great performances in New York, but the last few years have been tough for me to see a lot of theater.  It’s great that more of his work is being recorded on the big and small screen and he’ll be recognized by a larger public.  Hope he gets some leading roles in film now.

*   *   *

Addendum (six days later): On the night of the Oscars, when I wrote this, I missed the middle third of the program while driving home.  For that reason, I didn’t get to see the appearance of the three young “accountants” – a joke by Chris Rock built on Asian stereotypes that also worked in a Jewish stereotype.  Had I seen that, I would have had lots to say about it too, another example of how Oscar broadcast writers never get it right, even when they have an opportunity for redemption.  I don’t know if Rock wrote that joke himself, or if it came from writers he hired, or if he had the veto power not to deliver it – only that apparently the kids’ parents were not in on how their children would be used as the butts of the joke.  I also don’t know why the Oscar telecasts have lately been so badly written (not to mention inappropriate, unfunny ad libs like Sean Penn’s gratuitous “who gave this guy a green card” before announcing a winner last year).  I am also glad to see that much of the press since last week has been more critical of Rock than the first reports I wrote about, especially this response in Colorlines.  Strangely enough, back in the day when the choices were much more artistically conservative, I don’t remember this kind of controversy ever arising from the monologues of Johnny Carson or Billy Crystal.  So given that we like to think we have made so much progress when it comes to racial justice and equality, why have we become more insecure and more threatened about the topic of race, and why do we seem unable to find humor that doesn’t reinforce old forms of domination?

bell hooks to the rescue

1 Feb

I know that doubt can be one of the hallmarks of good teaching.  We want students to feel encouraged to challenge and reconsider their beliefs, especially those prejudices they have adopted without much thought and certainly, by definition, without considering the evidence.  But there’s another kind of existential doubt, when we’re so hammered by all the problems facing us in the world that we end up questioning the centrality of aspects of human life that we enjoy.  Can we make art, let alone study it, at a time when it is becoming clearer that without concerted action, climate change could kill us all?  And, given that citizens (and voters) are making choices about future leadership at a time when they are woefully uninformed about politics, current events, and science, what is the importance of studying the arts?

I know.  I’m not so doctrinaire that I believe we can have a society without art or education without art. Actually the opposite: I have always had a knee-jerk sense that arts and music and literature education have benefits that go beyond critical thinking and the wonderful list devised by Elliot Eisner.  But one place where I have gotten stuck is on the politics of the arts and arts education.

Doing my class reading for this week, I came across the following in bell hooks’s book of essays, Art on My Mind: Visual Politics:

“There must be a revolution in the way we see, the way we look.  Such a revolution would necessarily begin with diverse programs of critical education that would stimulate collective awareness that the creation and public sharing of art is essential to any practice of freedom.  If black folks are collectively to affirm our subjectivity in resistance, as we struggle against forces of domination and move toward the invention of the decolonized self, we must set our imaginations free.  Acknowledging that we have been and are colonized both in our minds and in our imaginations, we begin to understand the need for promoting and celebrating creative expression.”  (p. 4)

And this:

“Recently, at the end of a lecture on art and aesthetics at the Institute for American Indian Arts in Santa Fe, I was asked whether I thought art mattered, if it really made a difference in our lives.  From my own experience, I could testify to the transformative power of art.  I asked my audience to consider why in so many instances of global imperialist conquest by the West, art has been other [sic?] appropriated or destroyed… It occurred to me then that if one could make a people lose touch with their capacity to create, lose sight of their will and their power to make art, then the work of subjugation, of colonization, is complete.” (p. xv)

I know with these words that I am in the right course and following the right course, and that this conversation is vital, even/especially in the context of sociology.

Believe me, it is so easy not to practice, even when there is such urgency to practice, to create, to push ourselves to make work that transforms, or even just questions, the status quo.  Every school district that cuts the arts, every budget cut that reduces arts and music in schools, is performing that very work of subjugation.  And if you can’t imagine, you can’t be free; you can’t envision anything better or different, and you are simply a prisoner of what the state wants and needs you to be.

Courage

30 Dec

There’s a moment of dialogue at the end of Ron Howard’s film, In the Heart of the Sea, that pricked my conscience when I saw it last week.  The film is otherwise formulaic in its writing, although, this being a Ron Howard film, the technical side (namely the directing-cinematography-editing troika) can always be counted on to deliver, especially in the sailing sequences, which are breathtaking.  The Herman Melville character, played by Ben Whishaw, turns to the narrator/interviewee who is the subject of the film and says something to the effect of: The plot of the story I could come up with, but you have given me the courage I need to write this book.

This was the first in a series of three films in three nights, a kind of paroxysm of release from the semester and other tensions.  The next night I went to see Trumbo, a film that takes on the thought crimes against the American Left in a way that is not only engaging but fun, followed by the new Hungarian Holocaust film, Son of Saul, with its unrelieved claustrophobic tension, no fun and no uplift.  Where Trumbo is a paean to writers who have the audacity to hold on to their morals, stand up for their values, while subverting what they must in order to survive, Son of Saul is a portrayal of Auschwitz as a chamber of horrors so circumscribed that one can hardly come up for air, let alone get enough distance to even begin to consider acts of rebellion.

Speaking of audacity, I know that there is more to these films than a springboard to thinking about my own individual courage (at a time when I happen to be dealing with ongoing writer’s block and a relentless year of personal tribulations that has made any focus on my own creativity a distant possibility).  All three are about acts of moral courage (although, to be sure, the story within the frame story in the Melville film is about physical courage), with Dalton Trumbo having the guts and also the necessity (psychological as well as financial) to keep writing, because that’s what writers must do.  Not that all writing is courageous, but writing when the world wants you to be silent is.  Auschwitz, on the other hand, was a world in which even the moral space needed for a courageous response was so squeezed as to be impossible – and I don’t think Primo Levi would disagree.  That there managed to be rebellion at all makes us consider how we must be wired for this on some level, or at least a few of us are, since moral weakness seems to be more the hallmark of the human condition than moral courage.

Full disclosure: Dalton Trumbo’s National Book Award-winning novel Johnny Got His Gun was one of the most influential books in my life, one that, more than any other, turned me irrevocably against war and violence (and, as the subject of a pivotal application essay that got me into an elite summer program in high school, it had an instrumental effect upon my intellectual career as well).  Years later, when asked to recommend books for the “Suggested Reading” bookshelf at a local library, I had to have the librarian rescue the book from the depths of “young adult storage,” where it most certainly did not belong, for a number of reasons.  I didn’t just want to be a writer after reading Johnny Got His Gun for the sake of being a writer; I really saw what writing could do when it came to ideas and ideals and social change, and how writing itself could break free from the conventions not only of politics (which I would later see again in The Jungle) but of formal written language – language without commas(!), language that represented interior thought and image, which I would come back to in Light in August, penned just a few years earlier.

At one point I wanted to be courageous, but somehow I had lost that over the past few decades.  Although, in fairness to me, not giving up and still trying to stay within the bounds of the kind of career I wanted – which is to say, not working for any entity governed by the profit motive, while still occasionally being able to speak my mind in the form of a lecture or more collaboratively via a community or artistic project – took just about all the courage I had left in the tank. It wasn’t that I had lost hope that prevented me from “bothering” to write; it was that I had so profoundly lost courage, because I’d come to believe that nothing I have to say matters, in any way or on any level.  (And not being, like Jimmy Carter, a person of faith, I don’t have that pillar to lean on either.)

Trumbo reminded me that this is a time of exceptional moral challenge.  I also happened to have seen a production of Incident at Vichy by Arthur Miller (himself a playwright concerned with the theme of moral courage over and over again, from The Crucible to All My Sons) just last week, and my friend remarked that the play seemed relevant “especially now,” and I too felt that urgency.  Why?  We are not occupied, nor at war to any greater degree than we have been since 2001, and yet – with catastrophic climate change, a widening war against and by ISIS, the current crop of presidential candidates, and the failure of our educational system to inform students about enough of the basic issues to make intelligent, i.e. informed, decisions about policy – it does feel that each of us is being asked to take a moral stand, or simply give in to consumerism as opiate.  Excuse me, but it does feel like at least once a day I have to make a decision that reflects my ethical stance in the world, from the class lectures that I give to the food I buy – maybe because I’m a teacher and I see most of my students living lives that are reactive to the economy, while more than a few are standing up to everything they have had to overcome just to get to college.  The courage of writers like Trumbo and Miller is handed down from generation to generation like a baton in a relay towards justice.

So, as luck would have it, I picked up a paperback copy of Rollo May’s book, The Courage to Create, that I found in the trunk of my car, and began reading.  With the break between semesters, I’ve also been able to do some “outside” reading for the first time since at least the summer.  There’s a lot in the slim volume, and to be honest, he does still subscribe to the Western, high-art bias that creativity and imagination require, on some level, novelty – something new, something replacing the old.  While that is an ethnocentric view of creativity, for me there are already at least four takeaways that have added to this week’s ruminations on art, writing, and courage.

I’m deliberately oversimplifying, but here are the main points (for my purposes):

  1. The act of creativity is fundamentally an act of courage.  (Although he contrasts “moral courage” with “creative courage,” and I see them as complementary, if not overlapping.)
  2. Artists have to deal with the existence and synthesis of several conflicting pairs, including chaos vs. form, conviction vs. doubt, and the “solitary” with the “solidary,” meaning, after Camus, the need for solitude as well as the need to connect with others out in the world.
  3. This: “the creative artist and poet and saint must fight the actual (as opposed to the ideal) gods of our society – the god of conformism as well as the god of apathy, material success, and exploitative power” (p. 26).  He wrote that in 1973.
  4. Psychoanalysis historically viewed creativity or the imagination as something negative, even a kind of neurosis, whereas May writes that “The creative process must be explored not as the product of sickness, but as representing the highest degree of emotional health” (p. 38).  (To me the whole diagnosis of ADHD as a kind of “disorder” reflects this residual hostility towards creativity and an institutional desire to destroy it.)
  5. Creative people use threat and anxiety as motivators that push them to creative action in response, while resolving those feelings of anxiety and tension through techniques like meditation can actually dampen the need or desire to resolve them creatively through action; such techniques help us tolerate the anger and imbalance rather than channeling them into something communicative, as art is.  “Bliss” and the need to write can be antithetical.

I don’t know where I’ve misplaced my courage, but I’ve got to take it out of the drawer and start to wear it again.  Without it, in a way, I am and have been nothing.  I don’t have the kind of flow that Trumbo had to keep going, keep going, keep going, but neither have I faced the threats that he, or certainly Primo Levi, faced.  That said, these are times when the gods of conformism and materialism are particularly harsh and destructive.  Maybe that’s why it feels that “especially now” the moral choices we make – whether speaking out against injustice or simply welcoming a refugee – are so immediate, a daily occurrence, even as we go off to work or pay the bills or learn the identity of the latest unarmed shooting victim.  That, and the threat of a climate that we may soon need to renegotiate on a massive scale if we are going to continue living and flourishing – which will take all the courage and creativity we can muster, morally, socially, artistically, scientifically.

Hope is all very well and good.  But I wish you all Courage for 2016.


2015 Film festival experiences in review – Montreal, Tribeca, Margaret Mead, and DOC NYC

17 Dec

The following started out as just kind of notes to myself, to remind me how I felt about films I saw at this year’s Montreal World Film Festival. So I apologize in advance to my (at most) six readers for writing about something of no interest to anyone but myself. But then I realized that, due to family circumstances beyond my control, almost all my moviegoing this year has been to festivals, at least since this spring, when I saw the magnificent Thomas Hardy adaptation, Far from the Madding Crowd.  But, largely thanks to the excuse of class trips, I was able to get to two films each at Tribeca, Margaret Mead, and DOC NYC, which I’ll discuss below.

My first festival has always been Montreal, though.  This August I was able to go to the Montreal Festival for the first time in two years, and just for the first weekend; classes started the day after I returned (and there was the matter of a visit to the emergency room between my return and my first class – but that’s another story), and I haven’t had a moment to breathe or catch up until now.  Montreal always has something like 400 films to choose from, lots of documentaries, from every corner of the world – much less commercial than its Toronto cousin.  While I am always drawn to Asian cinema, this year I seemed to lose my touch when it came to choosing memorable experiences from the catalogue: ten films in four days, but only two were what I would call very good – one from Nepal (which I’ll get to below) and the other from China.  For some reason, though, the festival website has taken down the program, so I have to use Google to find their titles, since they’ll probably never be distributed here in the U.S.

I saw two films from China, actually, both on the same theme of traditional ways of life being displaced by the market forces of modernization.  This must be a source of anxiety in China, because I’ve already seen two other films on the same theme: the outstanding Canadian documentary, Up the Yangtze, which is actually more about the loss of traditional villages to dam-induced flooding, and Postmen in the Mountains, which I saw in Montreal thirteen years ago and which remains one of the best films I have ever seen at that festival.  So as I looked through the festival program, I thought, hey, this topic must be “a thing” in China, and since cultural sustainability is a theme I have already announced I am interested in, I should check these out.  These two films were Drifting Goats, about the end of the river ferries on goatskin floats, and Song of the Phoenix, about the dying tradition of suona players.

Drifting Goats is a father-son tale (like Postmen in the Mountains) in which the father is a ferryman on the old-fashioned river boats; while the ferries are being replaced with more modern motorboats, the growing local tourism industry is basically buying out the ferrymen and turning the picturesque and now exoticized ferries into a tourist attraction.  The stark choice comes down to this: either you join the tourist industry and continue to make your livelihood from the boats by giving rides to tourists, or you give up your ferry and your lifestyle and retire.  With the son imploring the resistant father to recognize the flow of progress, eventually the ferrymen embrace the opportunity to stay on the river, and tourism becomes the only way they can sustain their tradition – a happy ending in which traditions remain economically viable, which is what really matters, even in neoliberal rural China.

In contrast, Song of the Phoenix, while teasing us with the possibility that the folk musicians may give up their instruments and their folk music, stubbornly clings to the idea that traditional music and instruments cannot be replaced.  Nowhere is the criticism of pop commercialism more apparent than in a kind of impromptu battle of the bands that arises when a pop group shows up to woo the village crowd away from the suona band.  The characters are not stuck in an idealized village setting, and in fact the film acknowledges the tremendous pull away from the villages towards the industrialized wage-economy city.  But rather than espousing the liberal compromise of adapting to the new world, the film holds out for the necessary indomitability of folk music, even in a commercial world – and even, for that matter, in a commercial motion picture.  It is worth a second viewing.

Of course I’m going to be more favorable to a film with this point of view – and indeed, if the story had gone the other way, I would have liked it far less and been less inclined to overlook its conventional or predictable elements.  I tend to strongly reject films, no matter how well made, that espouse a reactionary point of view or ask us to smugly accept the status quo (especially around issues of violence).  Rather than being open-minded, as my society tells me I should be, I tend to think of the values expressed in a film and the damage they can cause by encouraging complacency and self-satisfaction.  It’s not that I can’t learn anything from that with which I disagree, but I do think we need to remember that cinema and television inevitably have an impact that exceeds that of other popular art forms.

Anyway, that said, even an art film like Postmen in the Mountains can have an impact, as it did on me – more than just preaching to those naturally inclined to agree.  I have seen it only once so far, in 2002 at the Montreal Film Festival, where, I just learned, it won the Audience Choice Award. Though it was made in China in 1999, it wasn’t released in the U.S. until 2004.  (Eventually I’ll have to buy the DVD to see it again.)  Also a father-son tale, it’s the story of an elderly postman in rural Hunan province who has a three-day mail route to all the mountain villages, a job from which he will be retiring and handing off to his son after this last round.  I still remember not only the great performances, but the spectacular cinematography, reminiscent of Chinese watercolors in its composition.  But more than just nostalgia or a theme of anti-modernity, the film shows more than any other (including Song of the Phoenix) what is lost when the old ways are no more.  It’s not just that those lifestyles and means of subsistence will be gone; it’s also that there is something very deeply embedded in those practices that, when gone, will change the nature of humanity.  Part of the beauty of the film is that this is implicit, and I wish Song of the Phoenix shared more of its depth; after all, if delivering mail can be rendered meaningful on screen, then the role of traditional music for both the villagers and the artists themselves – and why it has survived so long – deserves some exploration.  For me, coming from a society in which success is tied up with achievement and wealth, Postmen was the first film I had seen to make the case that ministry to others is a noble achievement, and necessary to maintaining the glue that keeps communities, and families for that matter, together.  You don’t have to know a lot about China to understand that people must be incredibly anxious about this, and that both modernization and the consumer economy make those moral choices that much more insanely difficult.

Speaking of both modernity and mountains, the other outstanding film I saw in Montreal this year was the Nepalese art film, Serdhak – The Golden Hill, a low-budget, high-altitude film set in the Himalayan villages of northern Nepal.  Shot with a very small crew and using natural light, even for some impressive interior scenes, this independent, naturalistic film avoids clichés while depicting the everyday dramas of village life, in the kinds of villages that outsiders never get to see.  The mountains are not just gorgeous backdrops for exotic effect but are the land on and around which people survive.  (As opposed to the also outstanding documentary, Meru, where the mountains exist to be conquered by humans.  I loved it anyway, largely because of the photography but also the editing.)  The characters and their lives are real, neither sentimentalized nor overly dramatized, and the film has the feeling of conveying a story very much within the framework of real life.  There too, one of the internal battles the characters must deal with is the pull of education in the city against the ties to the home villages, and what will be lost in terms of beauty when educated young people must leave these villages for city – which becomes synonymous with modern – life.

Also breathtaking, in a different way, is the documentary Song of Lahore, which hasn’t received anywhere near the critical attention it deserves, although it was runner-up in the audience award for documentaries at the Tribeca Film Festival.  It tells one story spread out over two continents: the suppression of Pakistani, or Punjabi, traditional music under Islamist regimes, and the invitation of the musicians to join Wynton Marsalis at his Jazz at Lincoln Center program.  The film is about love and made with love: the love that the musicians have for their music, but also the love they have for one another and their families, expressed through music, as well as the love for a city and what it once offered culturally.  The way that the musicians in the film articulate their feelings about music, especially when it is silenced, is incredibly moving, and in a way useful to those of us who think about art and culture as a humanizing influence.  Co-directed by a Pakistani Oscar winner and an American, the film is so timely it’s hard to believe it hasn’t gotten distribution, and yet its themes transcend current events.  It’s one of the great films I have seen about music and musicians, shot and edited with warmth and compassionate sophistication.

Films can take us inside worlds we would never think we could see, and in documentary film this is especially true today.  At the Margaret Mead Festival, I saw the recent history of Kashmir and the city of Srinagar through the eyes of three artists – two of whom are also incredibly insightful verbally – in the extraordinary documentary, Kasheer: Art, Culture, and the Struggle for Azadi, which is rich and thoughtful on at least two levels: it tells the story of human political conflict through the visual arts while reflecting on what visual art can do in a time of upheaval and violence.  It is also one of the rare films I have seen to treat folk art and craft with the same respect as fine and graphic arts, and this is due largely to respectful camera work and editing that is nothing less than brilliant.  Likewise, The Anthropologist, which appeared at DOC NYC, takes us to Siberia, Kiribati, and the glaciers of Peru with anthropologist Susan Crate to show us her work on climate change and its impact on culture.  My adult students who attended with me came out with the firm belief that we need not only more anthropologists, but more jobs to do this kind of necessary work.

Finally, from Tribeca, I want to recall the landmark Romanian film, Aferim!, the first film from that country to acknowledge, let alone criticize, the enslavement and mistreatment of the Roma people (also known as Gypsies) well into the 19th century.  It is worth seeing for that reason alone, but also for the stunning black-and-white photography, the writing, the disturbing depiction of 19th-century Romanian mores – or is it human nature? – and one really unforgettable, incredible period hat.  I see that it’s going to be distributed in the U.S. in late January 2016.

(There.  Glad to have finally gotten all that off my mind and penned down, so to speak, however superficial it all may be.)
