by Dan Tapalagă / HotNews.ro
The former schoolteacher from Nitchidorf stubbornly teaches the coherence of dissidence, with her dramas. The priest from Rosia preaches the courage of honest confession, with his. Radically opposed destinies, Herta Müller and Eginald Schlattner meet somewhere in great literature. The two stand for two public attitudes rarely practised by Romanian writers. The Nobel for Herta Müller should be a moment of shame for us all, not of jingoistic pride.
Herta Müller does not belong to Romanian literature; only her themes and characters are tied to the misery of the Romanian world under communism. We supplied her, both before and after 1989, with authentic reasons for horror, and then we shook her with our indifference toward the totalitarian past. Romanian literature has not managed to produce a Müller or a Schlattner.
A minor culture, written by people with approximate destinies, could not write authentically about a great theme: the annihilation of the individual in the totalitarian world. We are saved, nonetheless, by the essays of Virgil Ierunca and Monica Lovinescu, by the memoirs of former political prisoners, by a handful of great dissidents, by the precious recoveries at Sighet, the Tismaneanu report and a few other historical reparations.
But our literature skipped over the dramas of communism or, where it did meet them, produced no great masterpieces. It failed to place a great theme, the individual crushed by totalitarianism, into the circuit of world literature, as Herta Müller or Eginald Schlattner did.
Perhaps you have wondered, reading all the great Romanian writers who now shower her with praise: why did they not have Herta Müller's courage before 1989? Why do the collaborationists lack the honesty of an Eginald Schlattner, even a belated one?
I saw her in a TVR interview recorded around 2007 and rebroadcast on Thursday evening, when she was awarded the Nobel Prize. Herta Müller put Romania's intellectuals, writers and civil society on the spot by raising fundamental questions: why is the theme of collaboration with the past so absent from the Romanian public space? Why, compared with Germany, are the Securitate files so little discussed? Why are people not troubled by the details of their lives under communism?
Missing, from the crowd of collaborationist Romanian writers, informers by profession or victims of the terror, is an honest Eginald Schlattner, telling, even this late, the story of his anti-hero's life, at peace with his fate. We have, perhaps, great consciences without genius, or barren geniuses without a strong literary conscience.
The Germans Herta Müller and Eginald Schlattner are the acid litmus test of Romanian society, a painful mirror for many writers. Herta Müller chose to leave Romania, traumatised by the inferno inside it, and to denounce it ceaselessly. Schlattner stayed in the places that mutilated his life and confesses calmly, telling everything, in significant detail.
We supplied them with the raw material, Evil itself, and they recounted it to others, in German. We reclaimed them timidly after their work had been recognised abroad and enjoyed full success. We translated them in tiny print runs in Romania after they had become best-selling authors in Germany. Today we remember once more that they exist because, after all, a Nobel has come their way in the meantime. We ignored them at home as long as we could, or hypocritically claimed them as our own while singing the national anthem.
At the end of the TVR interview, Herta Müller sets out, in simple words, a kind of typically German writer's creed: "We have to do the work. That is my motto: we have to do the work, everywhere it is needed, we have to do the work. And then a lot changes. And it isn't even hard." Well, both Herta Müller and Eginald Schlattner have done theirs. The rest of us, to our shame, have not.
Learn, Know yourself, Change yourself... Learn from people, Know them, Change the World Together!
Tuesday, 28 September 2010
by Gilad Atzmon / September 28th, 2010
Haaretz reported this week that a boat carrying Jewish activists from Israel, Germany, the U.S. and Britain set sail on Sunday for Gaza, hoping to breach Israel’s blockade there and deliver aid.
Nine Jews will participate in this brave mission: amongst them is Rami Elhanan, an Israeli peace activist whose daughter Smadar was killed in a suicide bombing in 1997. Elhanan rightly maintained that it was his moral duty to act in support of the Palestinians in Gaza because reconciliation was the surest path to peace. “Those 1.5 million people in Gaza are victims exactly as I am,” he said.
Refusenik Israel Air Force pilot Jonathan Shapira, another passenger aboard the ship, told Haaretz that “we hope that the soldiers and officers of the Israeli navy will think twice before they obey orders to stop us.” Shapira also reflected on recent Jewish history: “Let them remember the history of our people, and those who followed orders and later said we were only following orders.”
Elhanan and Shapira make a lot of sense, for they speak in the spirit of humanism and universalism.
However: when it comes to Jewish political activism, there is always one ‘righteous person’ who insists on providing a glimpse into what is still a deeply Judeo-centric agenda.
Richard Kuper, an organizer with the U.K. group, ‘Jews for Justice for Palestinians’, said “one goal is to show that not all Jews support Israeli policies toward Palestinians.”
Well done, Richard. Let me get this right: out of a worldwide Jewish population of 18 million people, all you have apparently managed to represent, speak for and collate is nine humanist souls who are not happy with Israeli policies.
I suggest that Jews — and humanist Jews in particular — once and for all drop the ‘not in my name’ strategy: it is not going to work, and it doesn’t make any sense either. Implementing such a tactic is as racist as the Zionist project, for it affirms the Zionist racial and collective attribution to Jews. It basically says, ‘look at me, I am nice in spite of being a Jew’. This common Jewish left tactic is, unfortunately, not as forceful as Zionism, for Zionism is supported by the vast majority of world Jewry institutionally and spiritually.
Also, I would like to advise Mr. Kuper that a humanitarian mission to Gaza should aim at helping Gazans rather than at making Jews look better.
I should be clear here: of course I wish the Jewish boat all success in accomplishing its sacred mission. I certainly go along with Shapira and Elhanan’s call. It is very impressive to see heroic Israelis opposing their criminal government. Shapira and Elhanan are the seed of a future reconciliation. It is also important to see Jews around the world standing up against Israel.
However, if these Jewish activists are true humanists, they had better operate as ordinary people within the emerging solidarity movement. If these Jews are humanists, they had better accept the true meaning of universalism and stop buying into, and retaining, aspects of Zionist racist philosophy; perhaps they should also consider not operating solely in Jews-only political cells.
Gilad Atzmon was born in Israel and served in the Israeli military. He lives in London and is the author of two novels: A Guide to the Perplexed and the recently released My One and Only Love. Atzmon is also one of the most accomplished jazz saxophonists in Europe. He can be reached at: atz@onetel.net.uk.
“To the Finland Station”
by Lesley Chamberlain
The American critic’s masterpiece on the roots of communism — To the Finland Station — continues to have great resonance today.
Edmund Wilson's To the Finland Station is a 20th-century classic by a great American critic about the origins of the Soviet Union. First published in 1940, the book only gained proper recognition in the 1960s, when a new generation began to ask why the Russian Revolution had failed. Its power became legendary, though it was always more admired than understood.
Initially, it looks like a series of vignettes dramatising the lives of leading revolutionary players in France, Germany and Russia in the period between 1789 and 1917. But then the few pages about French historians are followed by what is in effect a small book about Marx and Engels. Their lives act as a stage on which Lenin and Trotsky suddenly appear. Marxism became a script for life, spawning Bolshevism in Russia and dividing politics in new ways.
Because Wilson's book is not straight history but rather a writer's impression of figures from history, it continues to resonate today. It is, says one of Wilson's biographers, Jeffrey Meyers, the "biography of an idea". The writing is typically restrained, shapely and individualistic in spirit - for Wilson's subject is moral passion rather than politics. In Edmund Wilson: a Life in Literature (2005), Lewis M Dabney declares the result to be the greatest imaginative work of American literature of the 1940s, rivalled only by the early novels of William Faulkner.
Wilson's story begins with the great French 19th-century historian Jules Michelet. After reading Giambattista Vico's New Science (1725), Michelet realised that, following centuries of subordination to the church, humanity was now free to design society according to its own needs and wishes. Michelet had a stroke in 1871 when he heard how, in the last echo of the revolution, the Paris Commune had plunged France into civil war. And it is the successive defeats of the "human spirit" (but the persistence of hope) that unify Wilson's project.
Marxist commentators used to criticise Wilson for starting with Michelet and for neglecting Hegel. After all, they pointed out, it was from Hegel that Marx had borrowed the dialectical and materialist theory of history. Wilson's critics were unsettled by his simultaneous sympathy for and deviation from the orthodox story. However, his writerly independence made him much more likely to be right in the long term.
Michelet, who rose from humble beginnings to become France's greatest historian, expressed in his very being what Hegel struggled to put into theory about the possibility of living a fulfilled life in modern times. "Life" and "work" are highly charged values in Hegel, but few readers will be willing to wade through pages of abstraction by the philosopher to discover that. In Michelet, however, the masters and the slaves are real people, finding new freedoms in a changing society.
The theme running through the stories that Wilson tells - of Michelet and his academic successor Ernest Renan, the "utopian socialists" Henri de Saint-Simon, Charles Fourier and Robert Owen, and ultimately Marx - concerns the uncertain role of the individual as an instrument of social and political change. The utopians believed in noblesse oblige and the public good. But which individuals should lead? What are their qualifications to set standards? How are they to motivate others to live charitably?
To these philosophical questions, Wilson brings not answers, but tales of experience. The alternative moral communities founded by the followers of Fourier, Owen and Saint-Simon were bickering dystopias that destroyed the spirit of the participants and bankrupted their founders. Wilson offers wonderful portraits of "persons unworldly and persistent" who were already committing many of the dictatorial sins that communism would later practise. "Intransigents" of the kind that succumb to their own neo-religious zeal occur in every generation - Ferdinand Lassalle and Mikhail Bakunin were two such in Marx's day.
For Wilson, Marx's insight was to see that the utopian socialists were wasting their lives. To be effective, change had to come from within society itself, and not be driven by outsiders. But how? Who teaches the teachers? "Where to begin and who [is] to be trusted to do the beginning?" These were problems that Marxist theoreticians tried to solve. In the end, as Lenin understood, there would have to be a party to lay down the law.
Though a revolutionary industry seized his name, Marx was much more cautious than his followers. At the heart of To the Finland Station is an unprecedented portrait of Marx the man. He was the outsider who thought he had spotted the mechanism by means of which, sooner or later, capitalist society would undo itself. However, according to Wilson, Marx was wrong to believe in dialectical materialism.
His celebrated claim to have set Hegel the right way up was a matter of hope, not "science". And his belief that the proletariat can "become self-aware" was itself utopian. The blow to that submerged idealism was what hurt most when the faith of his last ideological followers finally collapsed, along with the Berlin Wall, in 1989. They experienced a disillusionment that their founding hero had avoided.
The entire Marxist revolution, for Wilson, was built on faith. This did not mean that the vision of a fairer and more equal society was misplaced; only that the way of getting there was no more scientifically grounded than any other. Nonetheless, the author admires the antagonistic, oppositional power of Marxism: "Marx brought into play . . . dialectical materialism to blight the shimmering mirages of the utopians and to make the blood of the bourgeois run cold."
In what must be the most idiosyncratic claim in the book, Wilson asserts that Marx was misled by his own outsider status as a Jew. Because Jews had been deprived of an equal place in society, Marx was able to sympathise with the deprivations suffered by workers in a capitalist state. But he failed to recognise how differently these two exclusions related to the individual. The excluded but revolutionary-minded Jew could become self-aware and end his unfair oppression, but the proletarian could not - or, at least, not easily, because he had no suppressed cultural tradition to draw on. Marx, therefore, lacked any credible notion of what the triumph of the proletariat might mean.
The Russian chapters in To the Finland Station are not the best; they must be read against the background of the sudden and intense disillusionment with Trotsky that his followers suffered in the United States in the late 1930s. Trotsky had appealed to New York intellectuals as the protector of Lenin's legacy against the evils of Stalinism - but as the ink dried on his masterpiece, Wilson conceded that "sympathisers with Trotsky may have invested him with qualities he didn't possess".
He put more into his portrait of Lenin, who simplified Marxism to suit his needs (his aim was to get people to act). He learned Russian and travelled down the Volga to Ulyanovsk - the town of Simbirsk, renamed in Soviet times after its best-known son. Wilson wanted to experience for himself the project to build a fairer world. And he understood that the fate of Russia and that of the west were entwined. Later he apologised, when To the Finland Station was reissued in 1971, for continuing to harbour illusions about the Soviet Union.
Wilson's work has recently begun to appear in the prestigious Library of America series, so it is likely that, in the next few years, To the Finland Station will receive a more wide-ranging assessment. But what is already clear is the resonance, at a time when most critical or oppositional activity is so rapidly absorbed into the mainstream, of its account of the struggle to find the right vehicles and mechanisms for social change.
“To the Finland Station” is published by Phoenix (£8.99)
Lesley Chamberlain is the author of "The Philosophy Steamer: Lenin and the Exile of the Intelligentsia" (Atlantic Books, £9.99)
John Ruskin was an enemy of democracy, writes Tim Abrahams.
British Pavilion
Venice Architecture Biennale
Britain's contribution to the Venice Architecture Biennale has been praised by critics, with good reason. The exhibition, housed in a small, neoclassical pavilion in the Giardini in Venice, explores the relationship between the Victorian art critic John Ruskin and Venice in a questioning way. Dominated by a huge scale model of the London 2012 Olympic stadium, made by Venetian gondola builders and converted into a drawing studio, the exhibition poses some important questions about Ruskin's relationship with architecture's role in contemporary society.
Venice is a city that the English art critic, who was born in 1819, catalogued assiduously. (A small collection of his notebooks is included in the exhibition.) With his book The Stones of Venice, published in 1851, Ruskin made an immense contribution towards establishing architecture as an art form. He posits Venice as a text, which he goes on to decipher, with a specific purpose in mind: he uses the city to make an impassioned defence of the Gothic style - at whose heart he places Venice - as the morally superior form of European architecture. Ruskin's argument is that the Gothic is produced by master builders, dedicated in their tasks to a collective sharing of skill (often in directly venerating God, but not exclusively so), and working in semi-autonomous units throughout Europe.
Up until the end of his life, there remained in this favouring of the Gothic - this transmission of God's word through the tactile language of stone - a mistrust of the centralised authority of the papacy. Though Ruskin was raised as a Protestant by evangelising parents in the 19th century, he found the same sense of brotherhood in the work of masons of the late medieval period. To him, the Renaissance was a period in which mankind regressed morally.
Why should this talk of the Gothic and morality engage us in the present day? Because the focus of Ruskin's argument is not ultimately the evils of the Renaissance, nor even Catholicism. The real enemies, for Ruskin, were democracy and industry. Of particular import to his thinking was the essayist Thomas Carlyle's comparison between the Bury St Edmunds of his day and the same town in the 12th century. Through this juxtaposition, Carlyle extrapolates the need for an industrial aristocracy: a noble feudalism that would protect the working man. Ruskin believed in this.
The Stones of Venice, published in the middle of the 19th century, is a late call for a benign feudalism. The Gothic tradition, Ruskin believed, permits the mason to dictate scale and structure, as opposed to the neoclassical approach, which lends itself to political grandstanding and overly ornate detailing.
To Ruskin, the Renaissance was a time of moral turpitude and Venice was more than simply Venice. "Since first the dominion of men was asserted over the ocean, three thrones of mark beyond all others have been set up on its sands: the thrones of Tyre, Venice and England. Of the first of these great powers, only the memory remains; of the second, the ruin; the third, which inherits their greatness if it forgets their example, may be led through prouder eminence to destruction," he wrote.
The Stones of Venice asserts that the British should care for the Italian city or else be destroyed. At the Biennale exhibition, Ruskin's notebooks are contrasted with a photography project by Alvio Gavagnin, a working-class Venetian. There is a tacit proposition here: Britain does not own and define the city of Venice. The drawing studio, which was modelled on the Olympic stadium, was made to be handed to a group of local anarchists called Rebiennale, which recycles art and architecture installations, to be reused by the city.
Liza Fior, who created the pavilion, has suggested that Ruskin was a radical. Fortunately, her installation is more nuanced. Although Ruskin depicted the way in which the industrial age restricted free expression, that does not make him a revolutionary. But what makes the British Pavilion show such a success is that it captures the rigour and brilliance of his observations while setting his more questionable ideas in context.
Tim Abrahams is associate editor of Blueprint
Thursday, 23 September 2010
Stayin’ Alive: The 1970s and the Last Days of the Working Class
How Bruce Springsteen Helped Make Being a Working Class Rebel Cool Again
An excerpt from author Cowie's new book, 'Stayin' Alive,' reveals the tussle between right and left to claim Springsteen as one of their own.
September 23, 2010
Editor's Note: An epic account of how working-class America hit the rocks in the political and economic upheavals of the ’70s, Jefferson Cowie's Stayin’ Alive: The 1970s and the Last Days of the Working Class presents the decade in a new light. Part political intrigue, part labor history, with large doses of American music, film and TV lore, Cowie's book makes new sense of the ’70s as a crucial and poorly understood transition from the optimism of New Deal America to the widening economic inequalities and dampened expectations of the present. From the factory floors of Cleveland, Pittsburgh and Detroit to the Washington of Nixon, Ford and Carter, Cowie connects politics to culture, showing how the big screen and the jukebox can help us understand how America turned away from the radicalism of the ’60s and toward the patriotic promise of Ronald Reagan.
The following is excerpted from Jefferson Cowie's Stayin’ Alive: The 1970s and the Last Days of the Working Class (The New Press, 2010).
In the summer of 1984, Ronald Reagan campaigned toward his landslide victory over liberal Democratic challenger Walter Mondale. That same summer, America's foremost working-class hero appeared on stages across the nation, dwarfed, Patton-like, by an enormous American flag, pounding his fist in the air like it mattered. Tens of thousands of voices united to chant the most popular song of the summer, the year, and the decade: "Born in the U.S.A."
This audience sometimes drowned out the martial tones of the E Street Band itself, heightening the pitch of an event that was already equal parts rock concert, spiritual revival, and nationalist rally. Replacing the skinny greaser poet of his earlier tours, Bruce Springsteen had become a superhero version of himself, his new pumped-up body accentuated by exaggerated layers of denim and leather, his swollen biceps working his guitar like a jackhammer. Fists and flags surged into the air at the first hint of the singsong melody, as thousands of bodies shadowboxed the empty space above the crowd to the rhythm of the song, the deafening refrain filling stadiums around the world. Whether one chose to compare the spectacle to the horror of a Nuremberg Rally or the ecstasy of an Elvis Presley show, rock 'n' roll felt almost powerful again -- more like a cause than an escape.
On the surface, the performance seemed obvious evidence that working-class identity had been swept out into the seas of Reaganite nationalism. The toughness, the whiteness, the chant, the fists, the flags, the costume, all pointed to the degree to which this figure, once hailed as "the new Dylan," had, like so much else in the 1980s, been stripped of even the pretense of authenticity. Instead, Springsteen, dubbed "rock and roll's future" only a decade earlier, had been painted red, white, and blue, and packaged as an affirmation of American power and innocence to an eagerly waiting marketplace.
"Like Reagan and Rambo," writes Bryan Garman, "the apparently working-class Springsteen was for many Americans a white hardbody hero whose masculinity confirmed the values of patriarchy and patriotism, the work ethic and rugged individualism, and who clearly demarcated the boundaries between men and women, black and white, heterosexual and homosexual." The many and complex labor questions of the 1970s seemed to have found easy answers in the 1980s with the narrowing and hardening of white working-class identity into a blind national pride that sounded like belligerence.
Yet these surface elements of "Born in the U.S.A." and its performance belie a profound complexity -- much like political discourse and popular culture in the 1980s masked the intricacies of post-New Deal working-class identity more generally. The song's story line, buried beneath the pounding music and the patriotic hollers of the chorus, explores the muffled tale of a socially isolated working-class man, burning within the despair of de-industrialized, post-Vietnam America: a social history of white working-class identity unmoored from the elements that once defined it. Though Springsteen projects the chorus with all his might, the tale told by the verses barely manages to peek over the wall of sound, like a man caught in a musical cage, overpowered by the anthem of his own country. Like the neo-patriotism of the Reagan era itself, the power of the national chorus, "I was Born in the U.S.A.," dwarfs the pain of the "dead man's town" below it. "You end up like a dog that's been beat too much / Til you spend half your life just coverin' it up."
The juxtaposition of this unemployed worker's dire, muted narrative and a thundering patriotic chorus sparked battles among rock critics, pundits, and fans. Was the song part of a patriotic revival or a tale of working-class betrayal? A symptom of Reagan's America, or the antidote to it? Protest song or nationalist anthem? Both sides assumed that the words and the music could not go together, and in picking one over the other denied the song's unity -- and its subject's -- in favor of its far less compelling individual parts. Conservative columnist George Will famously fired the first shots in the Springsteen wars with a September 1984 opinion column that claimed the singer as a repository of Republican values. Will's assertion of the song's conservatism was a product of his one-night stand with the E Street Band, a concert he admittedly heard through ears packed with cotton. "I have not got a clue about Springsteen's politics, if any, but flags get waved at his concerts when he sings songs about hard times," Will explained. "He is no whiner, and the recitation of closed factories and other problems always seems punctuated by a grand, cheerful, affirmation: 'Born in the U.S.A.!' " Casting this "working class hero" as a paragon of what workers should be -- a little more patriotic, a lot more hardworking, and much more grownup -- he saw Springsteen as "vivid proof that the work ethic is alive and well" in the "hard times" of 1984. A few days later, when Will's informal advisee Ronald Reagan requested the song for his presidential campaign (and was turned down), the president invoked Springsteen anyway during a campaign stop in the singer's home state of New Jersey.
Liberals, leftists, and rock critics responded in kind and, ridiculing conservatives, claimed the song and the singer for their own by shoehorning the rock anthem into the withering protest song tradition. Springsteen's most devoted chroniclers admitted that the song functioned more for the Right in the Reagan years, but with apologies: "Released as it was in a time of chauvinism masquerading as patriotism, it was inevitable that 'Born in the U.S.A.' would be misinterpreted, that the album would be heard as a celebration of 'basic values,' " explained one critic, "no matter how hard Springsteen pushed his side of the tale." Even Walter Mondale presumed (incorrectly) to have Springsteen's endorsement for the presidency.
Lost to listeners on the Right and the Left was the fact that "Born in the U.S.A." was consciously crafted as a conflicted, but ultimately indivisible, whole. Its internal conflicts gave musical form to contradictions that grew from fissures to deep chasms in the heart of working-class life during the '70s and their aftermath. The song was first written and recorded with a single acoustic guitar during the recordings for Nebraska (1982) -- a critically acclaimed collection of some of Springsteen's starkest and most haunting explorations of blue-collar despair, faith, and betrayal during the economic trauma of the early Reagan era. "That whole Nebraska album was just that isolation thing and what it does to you," Springsteen explained. "The record was basically about people being isolated from their jobs, from their friends, from their families, their fathers, their mothers -- just not feeling connected to anything that's going on -- your government. And when that happens, there's just a whole breakdown. When you lose that sense of community, there's some spiritual breakdown that occurs. And when that occurs, you just get shot off somewhere where nothing seems to matter."
Most of the lyrics of the original Nebraska period "Born" remain the same in the popular electric version released two years later, but the first recording lacks the pounding accompaniments, and, with them, any reason for pumping fists. "To me," Springsteen explained of the earlier version, "it was a dead song…. Clearly the words and the music didn't go together." So the first draft was shelved, only to emerge again, in a much stormier, amplified form, as the title track of its own album, Born in the U.S.A., in 1984.
In the intervening time, the song had found its soul. As producer Jon Landau explained, Springsteen had "discovered the key, which is that the words were right but they had to be in the right setting. It needed the turbulence and that scale -- there's the song!" The electrification, projection, and anthemification of the first draft placed the chorus-lyrics tension at the center of the song. For Springsteen's project of giving voice to working-class experience, then, the words of working-class desperation "went together" with the music of nationalism -- the "protest" only worked within the framework of the "anthem." For the song to convey its message, the worker had to be lost in the turbulence of the nation's identity. As Springsteen once explained, the narrator of "Born in the U.S.A." longs "to strip away that mythic America which was Reagan's image of America. He wants to find something real, and connecting. He's looking for a home in his country." Putting the pieces together, as Greil Marcus recognized, the song was about "the refusal of the country to treat Vietnam veterans as something more than nonunion workers in an enterprise conducted off the books." As loud as the final product was, then, "Born in the U.S.A." was actually more about silence -- both existential and political.
"Had a brother at Khe Sanh," Springsteen sings, "Fighting off the Vietcong / They're still there / He's all gone." When Springsteen singles out one of the bloodiest and most closely watched battles of the Vietnam War, he has also selected one of the most pointless. The siege of Khe Sanh forced American combat soldiers to live in their own labyrinth of holes and trenches while waiting in fear of the moment when an estimated twenty thousand enemy soldiers amassed outside of the perimeter would storm their position in the winter of 1968. Two and a half months of constant attack ended with American carpet-bombing around Khe Sanh, turning the area around the fort into a sea of rat-chewed bodies, shrapnel, and twisted ordnance. Despite the heroism of the soldiers' stand, a mere two months after the battle, General Westmoreland ordered the fort destroyed and abandoned. The gruesome defense was for naught. "A great many people," explains Michael Herr, "wanted to know how the Khe Sanh Combat Base could have been the Western Anchor of our Defense one month and a worthless piece of ground the next, and they were simply told that the situation had changed."
Springsteen's song was never a ballad of the foreign and faraway, however, but an anthem of the U.S.A. -- the reality of a war, yes, but also a metaphor for domestic working-class life under assault. Khe Sanh and deindustrialized places like Youngstown or Flint (or Cleveland, Toledo, St. Louis, Buffalo, South Chicago, or any one of the other battle zones across the Rustbelt) were not that different. The site of the song is not "Khe Sanh," but a war-torn land in which, economist Barry Bluestone explains, "entire communities" were forced "to compete for survival" as shuttered factories, abandoned downtowns, and whitewashed windows were physical evidence of continued double-digit unemployment. By 1984, a city like Detroit, once of such strategic national importance as to be known as the "Arsenal of Democracy," had, like Khe Sanh, become an abandoned pile of twisted refuse.
"Came back home to the refinery," he laments, but the "Hirin' man said, 'Son, if it was up to me.' " It is not surprising, for a nation out of gas, that Springsteen chose a refinery as his character's workplace. Yet things were little better in other industries: across the industrial sector, global competition steadily increased as advanced industrial countries recovered from the industrial devastation of World War II, and third world nations turned toward manufacturing as a development strategy. Corporations decentralized, moved to the South, relocated abroad, replaced workers with technology or diversified into non-manufacturing sectors where the return on investment was higher. Communities began a downward spiral in the competition to create a better "business climate" than the next community down the interstate. Meanwhile, U.S. research and development sagged, complacency trumped innovation, growth rates shriveled, profits sagged, foreign competition took market share, plant technology proved grossly antiquated, and federal policy remained incoherent -- even at odds with itself. Unionized manufacturing, stumbling since the mid-fifties, dropped off at a vertiginous pace. But many of the biggest firms that shut down were nowhere near bankruptcy, merely demonstrating a return on investment that was inadequate for the capitalist reformation already under way.
When, for instance, Ford announced the final closure of Dewey Burton's Wixom assembly operation in 2006, the factory had already lost two shifts and several models from its assembly lines -- this despite having been named the most efficient of all of Ford's plants and the third best auto plant in both North and South America by J.D. Power and Associates (a title that included beating all of the Toyota transplants). Odes to efficiency and hard work rang hollow when even the jewel of the system did not survive. Not surprisingly, given the culture such logic engenders, Richard Sennett's follow-up to the 1970s analysis The Hidden Injuries of Class (1972) was called The Corrosion of Character (1998). By the time the next generation of Detroit residents looked for work, there was little hope of finding the kind of security and remuneration that Dewey Burton finally settled into at Ford after the restlessness of a restless decade. When the shutdown finally came, one of Dewey's fellow skilled tradesmen sent him a DVD memorializing the plant and the modest protests to keep it open. It was labeled "Glory Days," and Springsteen's pop hit was its bootlegged soundtrack.
For all of the melodrama of de-industrialization, however, the decline of major industrial manufacturing should not be conflated with the decline of the working class. Making industrial workers synonymous with the working class not only smacks of nostalgia ("Glory Days") but eclipses the possibility of a more expansive notion of working class identity. Those steel mills and their surrounding communities may be gone, but the workers are still out there -- part of the new Wal-Mart working class. Women, immigrants, minorities, and, yes, white guys, all make up the "new working class" that succeeded that of basic industry, but there is no discursive, political place for them comparable to the classic concept of the industrial working class.
Absent a meaningful framework in civic life, fear and anger can quickly take the place of the pride and honor of work. The issues defining working class life, argues Lillian Rubin, are "unnamed, therefore invisible" even to working people themselves. "It is after all, hard to believe in the particularity of the class experience if there's no social category into which it fits." The decline of industry went fist-in-glove with the siege of working-class institutions, an assault that took its most literal form when eleven thousand members of PATCO went on strike in the summer of 1981. In one of the boldest acts of his administration, President Ronald Reagan responded in no uncertain terms by firing the strikers wholesale and banning them from future federal employment. Their union leaders were taken away in chains and jailed. Well before the release of "Born in the U.S.A.," the workers turned to military assault metaphors. "I'm really surprised at how bloodthirsty they've been," exclaimed Frank Massa, a controller from Long Island.
"It's such overkill -- they brought in the howitzers to kill an ant," explained controller Jon Maziel. "It's like, 'Don't sit down and talk to people like human beings, just bring in the howitzers and wipe them out.' There's no reason for this situation to be like this, and I feel scared of a system of government that turns me off as a human being and says, 'O.K., if you don't play the game our way, you're a nonentity.' "
The PATCO disaster revealed the confusion of enemy and ally at the heart of Springsteen's guerrilla combat metonym. During his 1980 campaign, candidate Reagan declared his sympathy with the "deplorable state of our nation's air traffic control system." He claimed that if elected, he would act in a "spirit of cooperation" and "take whatever steps are necessary to provide our air traffic controllers with the most modern equipment available and to adjust staff levels and work days so that they are commensurate with achieving a maximum degree of public safety." Given Carter's failures on the labor and economic fronts, PATCO even endorsed Ronald Reagan in 1980. When the controllers finally walked off the job in the summer of 1981, Reagan, like his hero Calvin Coolidge in the 1919 Boston police strike, attacked them for engaging in an illegal strike "against the public good." Yet it was the size and drama of Reagan's response that shocked even the most jaded labor commentator: the administration's firing the striking workers, smashing the entire organization designed to represent both employees' interest and public safety, and, ultimately, giving the nod to business to declare open season on organized labor. The otherwise bureaucratically calm new AFL-CIO president, Lane Kirkland, recognized war when he saw it, describing the federal response as having the "massive, vindictive, brutal quality of the carpet bombing."
After the PATCO defeat, the national strike rate plummeted, and Eddie Sadlowski's nightmare of an economically disarmed working class became a reality. At the beginning of the seventies, about 2.5 million workers across the country were engaged in large strikes -- strikes of over one thousand workers. By the 1980s, that same statistic was a tiny fraction of the earlier rate, hovering between one and three hundred thousand workers total out in major strikes. The number of large walkouts fell from around four hundred in the early years of the story to only about fifty by the mid-1980s.
The most famous private sector strikes and lockouts that did take place in the 1980s truly smacked of isolated guerrilla battles in hostile economic terrain. These disputes were mostly an attempt to preserve some semblance of the status quo among the copper miners in Clifton-Morenci, Arizona; the meat packers in Austin, Minnesota; and the cannery workers in Watsonville, California. Their heroic stories unfolded along remarkably similar lines. First, various industries, emboldened by Reagan's move against PATCO, demanded concessions from their employees. One of the communities in the pattern bargaining settlement inevitably fought back -- standing up for standards that rose above the pattern settlement. Those communities then found themselves fighting against the company but also against the very uncertain ally of the international union, which was still trying to keep wages and working conditions even across the nation. By the end of their heartbreaking community-based struggles, all three movements ended in more or less the same place: a broken strike, with striking workers facing "permanent replacement" by nonunion workers, a demoralized community, and an inferior (or nonexistent) contract that drained all the gold out of the golden age of collective bargaining. One of the theme songs of the Austin meatpackers' struggle was Springsteen's "No Retreat, No Surrender," though the workers ended up doing both. As Jonathan Rosenblum concludes in his detailed analysis of the 1983 Clifton-Morenci dispute in Arizona, the copper miners' defeat marked "the decline of two vital achievements of the American labor movement: solidarity and right to strike."
What other recourse did working-class Americans have in the face of lost wars, rusting factories, wilting union strength, and embattled hometowns? One answer was to accept the New Right's retooled discourse of what it meant to be born in the U.S.A.: populist nationalism, protection of family, and traditional morality. This retooling often utilized terms first drafted by segregationist George Wallace, then refined by Richard Nixon, and ultimately perfected by Ronald Reagan, a framework designed to provide symbolic sanctuary for a white working class that felt itself embattled. This discourse tapped into the material as well as the social and moral concerns of its targets but actively and strategically reformulated the terms of resentment away from the economics of class and almost solely onto social issues. While "politics and identity" were being pulled "free from the gravity of class," the screaming chant of "Born in the U.S.A." allowed national mythology to drown out the realities of lived working-class experience. As George Lipsitz argues, the " 'new patriotism' often seems strangely defensive, embattled and insecure" based as it was on "powerlessness, humiliation, and social disintegration."
At a time when the working class's traditional political home, the Democratic Party, proved capable of delivering precious little material comfort, the New Right offered a soothing tonic for the injured pride and diminished material hopes of America's workingmen. Yet it was just that: a tonic that promised to soothe cultural queasiness rather than cure collective economic illnesses.
"Born in the U.S.A." ends with a hidden eulogy to an interracial republic, the promise of which drew to a close at the end of the decade along with the potential for an honest, multiracial rendition of working-class identity. As the song draws to a close, the narrator finds himself "ten years burning down the road / Nowhere to run ain't got nowhere to go." The reference to Martha and the Vandellas' Motown hit, "Nowhere to Run" makes explicit the theme of being adrift. He then quickly turns to the other tributary of American pop, by invoking the great country and western chronicler of loneliness and alienation, Hank Williams. As "Born in the U.S.A." trails off , its narrator cites the title of a Williams tune when he declares, "I'm a long gone Daddy." In setting up Motown and Nashville as the poles of working-class identity, Springsteen unites black and white experiences -- not in triumph or social unity, but in their shared but separate experiences of rootlessness within American culture. Springsteen, who never indulged in the white racial victimization common in the seventies, suggests that politics -- just like rock 'n' roll -- work best when integrated.
However, the next line uneasily transforms his lament for the dream of unity. He sings, "I'm a cool rocking daddy in the U.S.A." "Long gone" in social, economic, political, and even human senses, the narrator here clings to the "cool" -- a bit of defensive and elusive cultural flotsam left over from the glory days of postwar triumph. The collapse of meaningful, shared, and vernacular social patriotism is driven home as the narrator wails, seems to take punches, and becomes lost as the relentless rhythm of the song finally breaks down -- only to be reconstituted, oblivious to the narrator's story.
Despite a complex revival of labor issues that resonated from Detroit to Hollywood to Washington, by the end of the decade, workers -- qua workers -- had eerily been shaken out of the national scene. The aging labor intellectual J.B.S. Hardman, reflecting on his involvement in organized labor since the beginning of the century, predicted such a fate when he declared that labor stood "at the Rubicon" at the start of the decade. The crossing, he cautioned, would be fraught with treacherous obstacles, but he believed that, win or lose, the decade would represent a watershed in the fortunes of workers.
It did. The seventies whimpered to a close as the labor movement had failed in its major initiatives; de-industrialization decimated the power of the old industrial heartland; market orthodoxy eclipsed all alternatives; and promising organizing drives proved limited. The redefinition of "the working class" beyond its high modern, New Deal, form failed, leaving out the "new" working class of women and minorities -- as well as almost all of the service sector. Workers occasionally reappeared in public discourse as "Reagan Democrats" -- later as "NASCAR Dads" or the victims of another plant shutdown or as irrational protectionists and protestors of free trade, but rarely did they appear as workers. "The era of the forgotten worker," in the words of one journalist, had begun.
Andrew Levison, who had contributed to the revival of working-class studies in the seventies with The Working Class Majority (1974) and The Full Employment Alternative (1980), asked in 2001, "Who Lost the Working Class?" It was too big and complex a question for a single answer. He cited simply the sociological "perfect storm" of post-sixties working-class politics. Indeed, there are points in history in which the confluence of events suggests a transformation that is beyond a single causal explanation, but that requires a multi-layered narrative to capture the complexity. The American working class, a fragmentary but untamed force before the Great Depression, empowered and contained by the New Deal collective bargaining system, ideologically assimilated to the middle class in the fifties, and objectified as an enemy of social change in the 1960s, had always been a vulnerable and malleable thing in American history. Perhaps one of the primary interpretive problems of working-class history was that the baseline of comparison had too often been the extraordinary postwar period. As Eric Hobsbawm wrote of the decline of the golden age:
It was not until the great boom was over, in the disturbed seventies, waiting for the traumatic eighties, that observers -- mainly, to begin with, economists -- began to realize that the world, particularly the world of developed capitalism, had passed through an altogether exceptional phase of its history; perhaps a unique one…. The gold glowed more brightly against the dull or dark background of the subsequent decades of crisis.
With the failure of union insurgencies and the intransigence of labor leaders of the seventies, the sirens of the Nixon administration, the political divisions and blinders that created the McGovern fiasco, and the dissolution of work in popular culture, the post-New Deal working class never regained its footing. After the seventies, labor's officialdom promised transformations -- through the promises of Solidarity Day, John Sweeney's New Voice slate, and the breakaway coalition known as Change to Win -- but these were largely intra-palace machinations. The promise had already passed by the time labor got serious. Talk of labor law reforms under Clinton and Obama raised further, unfulfilled, hopes. Roseanne Barr, Michael Moore, and Homer Simpson all tried to remind us of the void in popular culture, but the jokes really played off of what we as a society had already agreed to forget. "First we stopped noticing members of the working class," wrote one critic, "and now we're convinced they don't exist."
Copyright © 2010 by Jefferson Cowie. This excerpt originally appeared in Stayin’ Alive: The 1970s and the Last Days of the Working Class, published by The New Press. Reprinted here with permission.
Jefferson Cowie is an associate professor of history at Cornell University. He is the author of Stayin’ Alive: The 1970s and the Last Days of the Working Class, and 'Capital Moves: RCA’s Seventy-Year Quest for Cheap Labor' (The New Press), which received the Philip Taft Prize for the Best Book in Labor History for 2000.
An excerpt from author Cowie's new book, 'Stayin' Alive' reveals the tussle between right and left to claim Springsteen as one of their own.
September 23, 2010
Editor's Note: An epic account of how working-class America hit the rocks in the political and economic upheavals of the ’70s, Jefferson Cowie's Stayin’ Alive: The 1970s and the Last Days of the Working Class presents the decade in a new light. Part political intrigue, part labor history, with large doses of American music, film and TV lore, Cowie's book makes new sense of the ’70s as a crucial and poorly understood transition from the optimism of New Deal America to the widening economic inequalities and dampened expectations of the present. From the factory floors of Cleveland, Pittsburgh and Detroit to the Washington of Nixon, Ford and Carter, Cowie connects politics to culture, showing how the big screen and the jukebox can help us understand how America turned away from the radicalism of the ’60s and toward the patriotic promise of Ronald Reagan.
The following is excerpted from Jefferson Cowie's Stayin’ Alive: The 1970s and the Last Days of the Working Class (The New Press, 2010).
In the summer of 1984, Ronald Reagan campaigned toward his landslide victory over liberal Democratic challenger Walter Mondale. That same summer, America's foremost working-class hero appeared on stages across the nation, dwarfed, Patton-like, by an enormous American flag, pounding his fist in the air like it mattered. Tens of thousands of voices united to chant the most popular song of the summer, the year, and the decade:"Born in the U.S.A."
This audience sometimes drowned out the marshal tones of the E Street Band itself, heightening the pitch of an event that was already equal parts rock concert, spiritual revival, and nationalist rally. Replacing the skinny greaser poet of his earlier tours, Bruce Springsteen had become a superhero version of himself, his new pumped-up body accentuated by exaggerated layers of denim and leather, his swollen biceps working his guitar like a jackhammer. Fists and flags surged into the air at the first hint of the singsong melody, as thousands of bodies shadowboxed the empty space above the crowd to the rhythm of the song, the deafening refrain filling stadiums around the world. Whether one chose to compare the spectacle to the horror of a Nuremburg Rally or the ecstasy of an Elvis Presley show, rock 'n' roll felt almost powerful again -- more like a cause than an escape.
On the surface, the performance seemed obvious evidence that working-class identity had been swept out into the seas of Reaganite nationalism. The toughness, the whiteness, the chant, the fists, the flags, the costume, all pointed to the degree to which this figure, once hailed as "the new Dylan," had, like so much else in the 1980s, been stripped of even the pretense of authenticity. Instead, Springsteen, dubbed "rock and roll's future" only a decade earlier, had been painted red, white, and blue, and packaged as an affirmation of American power and innocence to an eagerly waiting marketplace.
"Like Reagan and Rambo," writes Bryan Garman, "the apparently working-class Springsteen was for many Americans a white hardbody hero whose masculinity confirmed the values of patriarchy and patriotism, the work ethic and rugged individualism, and who clearly demarcated the boundaries between men and women, black and white, heterosexual and homosexual." The many and complex labor questions of the 1970s seemed to have found easy answers in the 1980s with the narrowing and hardening of white working-class identity into a blind national pride that sounded like belligerence.
Yet these surface elements of "Born in the U.S.A." and its performance belie a profound complexity -- much like political discourse and popular culture in the 1980s masked the intricacies of post-New Deal working-class identity more generally. The song's story line, buried beneath the pounding music and the patriotic hollers of the chorus, explores the muffled tale of a socially isolated working-class man, burning within the despair of de-industrialized, post-Vietnam America: a social history of white working-class identity unmoored from the elements that once defined it. Though Springsteen projects the chorus with all his might, the tale told by the verses barely manages to peek over the wall of sound, like a man caught in a musical cage, overpowered by the anthem of his own country. Like the neo-patriotism of the Reagan era itself, the power of the national chorus, "I was Born in the U.S.A.," dwarfs the pain of the "dead man's town" below it. "You end up like a dog that's been beat too much / Til you spend half your life just coverin' it up."
The juxtaposition of this unemployed worker's dire, muted narrative, and a thundering patriotic chorus sparked battles among rock critics, pundits, and fans. Was the song part of a patriotic revival or a tale of working-class betrayal? A symptom of Reagan's America, or the antidote to it? Protest song or nationalist anthem? Both sides assumed that the words and the music could not go together, and in picking one over the other denied the song's unity -- and its subject's -- in favor of its far less compelling individual parts. Conservative columnist George Will famously fired the first shots in the Springsteen wars with a September 1984 opinion column that claimed the singer as a repository of Republican values. Will's assertion of the song's conservatism was a product of his one-night stand with the E Street Band, a concert he admittedly heard through ears packed with cotton. "I have not got a clue about Springsteen's politics, if any, but flags get waved at his concerts when he sings songs about hard times," Will explained. "He is no whiner, and the recitation of closed factories and other problems always seems punctuated by a grand, cheerful, affirmation: 'Born in the U.S.A.!' " Casting this "working class hero" as a paragon of what workers should be -- a little more patriotic, a lot more hardworking, and much more grownup -- he saw Springsteen as "vivid proof that the work ethic is alive and well" in the "hard times" of 1984. A few days later, when Will's informal advisee Ronald Reagan requested the song for his presidential campaign (and was turned down) the president invoked Springsteen anyway during a campaign stop in the singer's home state of New Jersey.
Liberals, leftists, and rock critics responded in kind and, ridiculing conservatives, claimed the song and the singer for their own by shoehorning the rock anthem into the withering protest song tradition. Springsteen's most devoted chroniclers admitted that the song functioned more for the Right in the Reagan years, but with apologies: "Released as it was in a time of chauvinism masquerading as patriotism, it was inevitable that 'Born in the U.S.A.' would be misinterpreted, that the album would be heard as a celebration of 'basic values,' " explained one critic, "no matter how hard Springsteen pushed his side of the tale." Even Walter Mondale presumed (incorrectly) to have Springsteen's endorsement for the presidency.
Lost to listeners on the Right and the Left was the fact that "Born in the U.S.A." was consciously crafted as a conflicted, but ultimately indivisible, whole. Its internal conflicts gave musical form to contradictions that grew from fissures to deep chasms in the heart of working-class life during the '70s and their aftermath. The song was first written and recorded with a single acoustic guitar during the recordings for Nebraska (1982) -- a critically acclaimed collection of some of Springsteen's starkest and most haunting explorations of blue-collar despair, faith, and betrayal during the economic trauma of the early Reagan era. "That whole Nebraska album was just that isolation thing and what it does to you," Springsteen explained. "The record was basically about people being isolated from their jobs, from their friends, from their families, their fathers, their mothers -- just not feeling connected to anything that's going on -- your government. And when that happens, there's just a whole breakdown. When you lose that sense of community, there's some spiritual breakdown that occurs. And when that occurs, you just get shot off somewhere where nothing seems to matter."
Most of the lyrics of the original Nebraska period "Born" remain the same in the popular electric version released two years later, but the first recording lacks the pounding accompaniments, and, with them, any reason for pumping fists. "To me," Springsteen explained of the earlier version, "it was a dead song…. Clearly the words and the music didn't go together." So the first draft was shelved, only to emerge again, in a much stormier, amplified form, as the title track of its own album, Born in the U.S.A., in 1984.
In the intervening time, the song had found its soul. As producer Jon Landau explained, Springsteen had "discovered the key, which is that the words were right but they had to be in the right setting. It needed the turbulence and that scale -- there's the song!" The electrification, projection, and anthemification of the first draft placed the chorus-lyrics tension at the center of the song. For Springsteen's project of giving voice to working-class experience, then, the words of working-class desperation "went together" with the music of nationalism -- the "protest" only worked within the framework of the "anthem." For the song to convey its message, the worker had to be lost in the turbulence of the nation's identity. As Springsteen once explained, the narrator of "Born in the U.S.A." longs "to strip away that mythic America which was Reagan's image of America. He wants to find something real, and connecting. He's looking for a home in his country." Putting the pieces together, as Greil Marcus recognized, the song was about "the refusal of the country to treat Vietnam veterans as something more than nonunion workers in an enterprise conducted off the books." As loud as the final product was, then, "Born in the U.S.A." was actually more about silence -- both existential and political.
"Had a brother at Khe Sanh," Springsteen sings, "Fighting off the Vietcong / They're still there / He's all gone." When Springsteen singles out one of the bloodiest and most closely watched battles of the Vietnam War, he has also selected one of the most pointless. The siege of Khe Sanh forced American combat soldiers to live in their own labyrinth of holes and trenches while waiting in fear of the moment when an estimated twenty thousand enemy soldiers amassed outside of the perimeter would storm their position in the winter of 1968. Two and a half months of constant attack ended with American carpet-bombing around Khe Sanh, turning the area around the fort into a sea of rat-chewed bodies, shrapnel, and twisted ordnance. Despite the heroism of the soldiers' stand, a mere two months after the battle, General Westmoreland ordered the fort destroyed and abandoned. The gruesome defense was for naught. "A great many people," explains Michael Herr, "wanted to know how the Khe Sanh Combat Base could have been the Western Anchor of our Defense one month and a worthless piece of ground the next, and they were simply told that the situation had changed."
Springsteen's song was never a ballad of the foreign and faraway, however, but an anthem of the U.S.A. -- the reality of a war, yes, but also a metaphor for domestic working-class life under assault. Khe Sanh and deindustrialized places like Youngstown or Flint (or Cleveland, Toledo, St. Louis, Buffalo, South Chicago, or any one of the other battle zones across the Rustbelt) were not that different. The site of the song is not "Khe Sanh," but a war-torn land in which, economist Barry Bluestone explains, "entire communities" were forced "to compete for survival" as shuttered factories, abandoned downtowns, and whitewashed windows were physical evidence of continued double-digit unemployment. By 1984, a city like Detroit, once of such strategic national importance as to be known as the "Arsenal of Democracy," had, like Khe Sanh, become an abandoned pile of twisted refuse.
"Came back home to the refinery," he laments, but the "Hirin' man said, 'Son, if it was up to me.' " It is not surprising, for a nation out of gas, that Springsteen chose a refinery as his character's workplace. Yet things were little better in other industries: across the industrial sector, global competition steadily increased as advanced industrial countries recovered from the industrial devastation of World War II, and third world nations turned toward manufacturing as a development strategy. Corporations decentralized, moved to the South, relocated abroad, replaced workers with technology or diversified into non-manufacturing sectors where the return on investment was higher. Communities began a downward spiral in the competition to create a better "business climate" than the next community down the interstate. Meanwhile, U.S. research and development sagged, complacency trumped innovation, growth rates shriveled, profits sagged, foreign competition took market share, plant technology proved grossly antiquated, and federal policy remained incoherent -- even at odds with itself. Unionized manufacturing, stumbling since the mid-fifties, dropped off at a vertiginous pace. But many of the biggest firms that shut down were nowhere near bankruptcy, merely demonstrating a return on investment that was inadequate for the capitalist reformation already under way.
When, for instance, Ford announced the final closure of Dewey Burton's Wixom assembly operation in 2006, the factory had already lost two shifts and several models from its assembly lines -- this despite having been named the most efficient of all of Ford's plants and the third best auto plant in both North and South America by J.D. Power and Associates (a title that included beating all of the Toyota transplants). Odes to efficiency and hard work rang hollow when even the jewel of the system did not survive. Not surprisingly, given the culture such logic engenders, Richard Sennett's follow-up to the 1970s analysis The Hidden Injuries of Class (1972) was called The Corrosion of Character (1998). By the time the next generation of Detroit residents looked for work, there was little hope of finding the kind of security and remuneration that Dewey Burton finally settled into at Ford after the restlessness of a restless decade. When the shutdown finally came, one of Dewey's fellow skilled tradesmen sent him a DVD memorializing the plant and the modest protests to keep it open. It was labeled "Glory Days," and Springsteen's pop hit was its bootlegged soundtrack.
For all of the melodrama of de-industrialization, however, the decline of major industrial manufacturing should not be conflated with the decline of the working class. Making industrial workers synonymous with the working class not only smacks of nostalgia ("Glory Days") but eclipses the possibility of a more expansive notion of working class identity. Those steel mills and their surrounding communities may be gone, but the workers are still out there -- part of the new Wal-Mart working class. Women, immigrants, minorities, and, yes, white guys, all make up the "new working class" that succeeded that of basic industry, but there is no discursive, political place for them comparable to the classic concept of the industrial working class.
Absent a meaningful framework in civic life, fear and anger can quickly take the place of the pride and honor of work. The issues defining working class life, argues Lillian Rubin, are "unnamed, therefore invisible" even to working people themselves. "It is after all, hard to believe in the particularity of the class experience if there's no social category into which it fits." The decline of industry went fist-in-glove with the siege of working-class institutions, an assault that took its most literal form when eleven thousand members of PATCO went on strike in the summer of 1981. In one of the boldest acts of his administration, President Ronald Reagan responded in no uncertain terms by firing the strikers wholesale and banning them from future federal employment. Their union leaders were taken away in chains and jailed. Well before the release of "Born in the U.S.A.," the workers turned to military assault metaphors. "I'm really surprised at how bloodthirsty they've been," exclaimed Frank Massa, a controller from Long Island.
"It's such overkill -- they brought in the howitzers to kill an ant," explained controller Jon Maziel. "It's like, 'Don't sit down and talk to people like human beings, just bring in the howitzers and wipe them out.' There's no reason for this situation to be like this, and I feel scared of a system of government that turns me off as a human being and says, 'O.K., if you don't play the game our way, you're a nonentity.' "
The PATCO disaster revealed the confusion of enemy and ally at the heart of Springsteen's guerrilla combat metonym. During his 1980 campaign, candidate Reagan declared his sympathy with the "deplorable state of our nation's air traffic control system." He claimed that if elected, he would act in a "spirit of cooperation" and "take whatever steps are necessary to provide our air traffic controllers with the most modern equipment available and to adjust staff levels and work days so that they are commensurate with achieving a maximum degree of public safety." Given Carter's failures on the labor and economic fronts, PATCO even endorsed Ronald Reagan in 1980. When the controllers finally walked off the job in the summer of 1981, Reagan, like his hero Calvin Coolidge in the 1919 Boston police strike, attacked them for engaging in an illegal strike "against the public good." Yet it was the size and drama of Reagan's response that shocked even the most jaded labor commentator: the administration's firing the striking workers, smashing the entire organization designed to represent both employees' interest and public safety, and, ultimately, giving the nod to business to declare open season on organized labor. The otherwise bureaucratically calm new AFL-CIO president, Lane Kirkland, recognized war when he saw it, describing the federal response as having the "massive, vindictive, brutal quality of the carpet bombing."
After the PATCO defeat, the national strike rate plummeted, and Eddie Sadlowski's nightmare of an economically disarmed working class became a reality. At the beginning of the seventies, about 2.5 million workers across the country were engaged in large strikes -- strikes of over one thousand workers. By the 1980s, that figure had fallen to a tiny fraction of the earlier rate, hovering between one hundred thousand and three hundred thousand workers out in major strikes. The number of large walkouts fell from around four hundred in the early years of the story to only about fifty by the mid-1980s.
The most famous private sector strikes and lockouts that did take place in the 1980s truly smacked of isolated guerrilla battles in hostile economic terrain. These disputes were mostly an attempt to preserve some semblance of the status quo among the copper miners in Clifton-Morenci, Arizona; the meat packers in Austin, Minnesota; and the cannery workers in Watsonville, California. Their heroic stories unfolded along remarkably similar lines. First, various industries, emboldened by Reagan's move against PATCO, demanded concessions from their employees. One of the communities in the pattern bargaining settlement inevitably fought back -- standing up for standards that rose above the pattern settlement. Those communities then found themselves fighting against the company but also against the very uncertain ally of the international union, which was still trying to keep wages and working conditions even across the nation. By the end of their heartbreaking community-based struggles, all three movements ended in more or less the same place: a broken strike, with striking workers facing "permanent replacement" by nonunion workers, a demoralized community, and an inferior (or nonexistent) contract that drained all the gold out of the golden age of collective bargaining. One of the theme songs of the Austin meatpackers' struggle was Springsteen's "No Retreat, No Surrender," though the workers ended up doing both. As Jonathan Rosenblum concludes in his detailed analysis of the 1983 Clifton-Morenci dispute in Arizona, the copper miners' defeat marked "the decline of two vital achievements of the American labor movement: solidarity and right to strike."
What other recourse did working-class Americans have in the face of lost wars, rusting factories, wilting union strength, and embattled hometowns? One answer was to accept the New Right's retooled discourse of what it meant to be born in the U.S.A.: populist nationalism, protection of family, and traditional morality. This retooling often utilized terms first drafted by segregationist George Wallace, then refined by Richard Nixon, and ultimately perfected by Ronald Reagan, a framework designed to provide symbolic sanctuary for a white working class that felt itself embattled. This discourse tapped into the material as well as the social and moral concerns of its targets but actively and strategically reformulated the terms of resentment away from the economics of class and almost solely onto social issues. While "politics and identity" were being pulled "free from the gravity of class," the screaming chant of "Born in the U.S.A." allowed national mythology to drown out the realities of lived working-class experience. As George Lipsitz argues, the " 'new patriotism' often seems strangely defensive, embattled and insecure," based as it was on "powerlessness, humiliation, and social disintegration."
At a time when the working class's traditional political home, the Democratic Party, proved capable of delivering precious little material comfort, the New Right offered a soothing tonic for the injured pride and diminished material hopes of America's workingmen. Yet it was just that: a tonic that promised to soothe cultural queasiness rather than cure collective economic illnesses.
"Born in the U.S.A." ends with a hidden eulogy to an interracial republic, the promise of which drew to a close at the end of the decade along with the potential for an honest, multiracial rendition of working-class identity. As the song draws to a close, the narrator finds himself "ten years burning down the road / Nowhere to run ain't got nowhere to go." The reference to Martha and the Vandellas' Motown hit, "Nowhere to Run" makes explicit the theme of being adrift. He then quickly turns to the other tributary of American pop, by invoking the great country and western chronicler of loneliness and alienation, Hank Williams. As "Born in the U.S.A." trails off , its narrator cites the title of a Williams tune when he declares, "I'm a long gone Daddy." In setting up Motown and Nashville as the poles of working-class identity, Springsteen unites black and white experiences -- not in triumph or social unity, but in their shared but separate experiences of rootlessness within American culture. Springsteen, who never indulged in the white racial victimization common in the seventies, suggests that politics -- just like rock 'n' roll -- work best when integrated.
However, the next line uneasily transforms his lament for the dream of unity. He sings, "I'm a cool rocking daddy in the U.S.A." "Long gone" in social, economic, political, and even human senses, the narrator here clings to the "cool" -- a bit of defensive and elusive cultural flotsam left over from the glory days of postwar triumph. The collapse of meaningful, shared, and vernacular social patriotism is driven home as the narrator wails, seems to take punches, and becomes lost as the relentless rhythm of the song finally breaks down -- only to be reconstituted, oblivious to the narrator's story.
Despite a complex revival of labor issues that resonated from Detroit to Hollywood to Washington, by the end of the decade, workers -- qua workers -- had eerily been shaken out of the national scene. The aging labor intellectual J.B.S. Hardman, reflecting on his involvement in organized labor since the beginning of the century, predicted such a fate when he declared that labor stood "at the Rubicon" at the start of the decade. The crossing, he cautioned, would be fraught with treacherous obstacles, but he believed that, win or lose, the decade would represent a watershed in the fortunes of workers.
It did. The seventies whimpered to a close as the labor movement had failed in its major initiatives; de-industrialization decimated the power of the old industrial heartland; market orthodoxy eclipsed all alternatives; and promising organizing drives proved limited. The redefinition of "the working class" beyond its high modern, New Deal, form failed, leaving out the "new" working class of women and minorities -- as well as almost all of the service sector. Workers occasionally reappeared in public discourse as "Reagan Democrats" -- later as "NASCAR Dads" or the victims of another plant shutdown or as irrational protectionists and protestors of free trade, but rarely did they appear as workers. "The era of the forgotten worker," in the words of one journalist, had begun.
Andrew Levison, who had contributed to the revival of working-class studies in the seventies with The Working Class Majority (1974) and The Full Employment Alternative (1980), asked in 2001, "Who Lost the Working Class?" It was too big and complex a question for a single answer. He cited simply the sociological "perfect storm" of post-sixties working-class politics. Indeed, there are points in history in which the confluence of events suggests a transformation that is beyond a single causal explanation, but that requires a multi-layered narrative to capture the complexity. The American working class, a fragmentary but untamed force before the Great Depression, empowered and contained by the New Deal collective bargaining system, ideologically assimilated to the middle class in the fifties, and objectified as an enemy of social change in the 1960s, had always been a vulnerable and malleable thing in American history. Perhaps one of the primary interpretive problems of working-class history was that the baseline of comparison had too often been the extraordinary postwar period. As Eric Hobsbawm wrote of the decline of the golden age:
It was not until the great boom was over, in the disturbed seventies, waiting for the traumatic eighties, that observers -- mainly, to begin with, economists -- began to realize that the world, particularly the world of developed capitalism, had passed through an altogether exceptional phase of its history; perhaps a unique one…. The gold glowed more brightly against the dull or dark background of the subsequent decades of crisis.
With the failure of union insurgencies and the intransigence of labor leaders of the seventies, the sirens of the Nixon administration, the political divisions and blinders that created the McGovern fiasco, and the dissolution of work in popular culture, the post-New Deal working class never regained its footing. After the seventies, labor's officialdom promised transformations -- through the promises of Solidarity Day, John Sweeney's New Voice slate, and the breakaway coalition known as Change to Win -- but these were largely intra-palace machinations. The promise had already passed by the time labor got serious. Talk of labor law reforms under Clinton and Obama raised further, unfulfilled, hopes. Roseanne Barr, Michael Moore, and Homer Simpson all tried to remind us of the void in popular culture, but the jokes really played off of what we as a society had already agreed to forget. "First we stopped noticing members of the working class," wrote one critic, "and now we're convinced they don't exist."
Copyright © 2010 by Jefferson Cowie. This excerpt originally appeared in Stayin’ Alive: The 1970s and the Last Days of the Working Class, published by The New Press. Reprinted here with permission.
Jefferson Cowie is an associate professor of history at Cornell University. He is the author of Stayin’ Alive: The 1970s and the Last Days of the Working Class, and 'Capital Moves: RCA’s Seventy-Year Quest for Cheap Labor' (The New Press), which received the Philip Taft Prize for the Best Book in Labor History for 2000.
Tuesday, 14 September 2010
A Tale of Two Raphaels
http://www.vam.ac.uk/channel/happenings/exhibitions_and_galleries/a_tale_of_two_raphaels/
In this special film, Evans visits the V&A and Windsor Castle to reveal how Raphael made the cartoons that were used by specialist weavers in Brussels to make tapestries, and how they came to be at the V&A. Meanwhile, in Rome, Vatican Museum curator Professor Arnold Nesselrath explains how Raphael applied his 'universal genius' to sculpture, architecture and tapestry as well as painting.
At the film's climax, the two curators meet in the Sistine Chapel to witness a hanging of the Raphael tapestries in 'the greatest room in art'. This rare event took place in July as the prelude to this autumn's historic V&A exhibition.
Saturday, 4 September 2010
"The Great Chain of Being" and the infinite chain of objectification
There has been a constant stream of evidence contradicting humanity's most cherished beliefs, including the supposed "presence" of a supreme being in the natural order of things we call "evolution". Yet, despite evolution's apparent snake-like course, Western civilization's pyramid design, used to justify ranking humans at the top of this chain, abandons scientific rigour merely to justify a narrative of an opportunistic master set above meek servants and other exploitable resources...
Whilst even chimpanzees seem to have learnt how to outwit human hunters (http://news.bbc.co.uk/earth/hi/earth_news/newsid_8962000/8962747.stm), there is a lot more to this rather complex matter than meets the eye. For those wishing to voice a more informed opinion, it may be useful to consult the material from ScienceWeek (http://scienceweek.com/2005/sw050624-3.htm) reproduced below.
ScienceWeek
EVOLUTION: ON THE GREAT CHAIN OF BEING
The following points are made by Sean Nee (Nature 2005 435:429):
1) For centuries the "great chain of being" held a central place in Western thought. This view saw the Universe as ordered in a linear sequence starting from the inanimate world of rocks. Plants came next, then animals, men, angels and, finally, God. It was very detailed with, for example, a ranking of human races; humans themselves ranked above apes above reptiles above amphibians above fish. This view even predicted a world of invisible life in between the inanimate and the visible, living world, long before Antonie van Leeuwenhoek's discoveries. Although advocates of evolution may have stripped it of its supernatural summit, this view is with us still.
2) Common presentations of evolution mirror the great chain by viewing the process as progressive. For example, in their book THE MAJOR TRANSITIONS IN EVOLUTION, John Maynard Smith and Eors Szathmáry take us from the origin of life, through to the origin of eukaryotic cells, multicellularity, human societies and, finally, of language. They explicitly point out that evolution does not necessarily lead to progress, and even refer to the great chain by its Latin name, scala naturae. But it is impossible to overlook the fact that the "major" evolutionary transitions lead inexorably, step by step, to us. Similarly, in their recent essay in Nature, "Climbing the co-evolution ladder" (Nature 431, 913:2004), Lenton and colleagues illustrate their summary of life-environment interactions through the ages with a ladder whose rungs progress through microbes, plants, and, at the top, large animals.
3) In his recent book THE ANCESTOR'S TALE, Richard Dawkins reverses the usual temporal perspective and looks progressively further back in time to find our ancestors. Like Maynard Smith and Szathmáry, he cautions us against thinking that evolution is progressive, culminating with us. He emphasizes that with whatever organism we begin the pilgrimage back through time, we all are reunited at the origin of life. But by beginning the journey with us and looking backwards along our ancestry, Dawkins generates a sequence of chapter titles that would read like a typical chain to a medieval theologian, albeit with some novelties and the startling omission of God.
4) By starting with us, Dawkins regenerates the chain because species that are more closely related to us are more similar as well, and such similarity was an important criterion in determining the rankings in the classical chain. But there is nothing about the world that compels us to think about it in this way, suggesting, instead, that we have some deep psychological need to see ourselves as the culmination of creation. Illustrating this, when we represent the relationships between species, including ourselves, in a family tree, we automatically construct it so that the column of species' names forms a chain with us as the top, as in the first of the trees pictured. But the other construction is equally valid.
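Nee's last point is easy to make concrete. In the following minimal sketch (a hypothetical illustration, not taken from the Nature essay), a rooted tree is stored as nested tuples and reduced to its set of clades; drawing the "chain" with humans at the top, or rotating every branch the other way, yields exactly the same clades, so the ladder-like appearance is a choice of presentation, not a fact about nature.

# Minimal sketch (hypothetical example, not from the Nature essay): a rooted
# tree is stored as nested tuples; its biological content is the set of clades
# (groups of leaves under each internal node), which is unchanged when we
# "rotate" branches to put humans at the top or at the bottom of the drawing.

def clades(tree):
    """Return (leaf set, set of clades as frozensets) for a nested-tuple tree."""
    if isinstance(tree, str):                      # a leaf
        return frozenset([tree]), set()
    leaves, found = set(), set()
    for child in tree:
        child_leaves, child_clades = clades(child)
        leaves |= child_leaves
        found |= child_clades
    leaves = frozenset(leaves)
    found.add(leaves)                              # this internal node's clade
    return leaves, found

# The same relationships drawn two ways: "chain with us at the top" ...
ladder_up   = ("human", ("chimp", ("mouse", ("frog", "fish"))))
# ... and with every internal node rotated the other way.
ladder_down = (((("fish", "frog"), "mouse"), "chimp"), "human")

_, clades_up   = clades(ladder_up)
_, clades_down = clades(ladder_down)

print(clades_up == clades_down)   # True: only the drawing differs, not the tree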
References:
1. Lovejoy, A. O. The Great Chain of Being (Harper and Row, New York, 1965)
2. Gee, H. Nature 420, 611 (2002)
3. Maynard Smith, J. and Szathmáry, E. The Major Transitions in Evolution (W. H. Freeman & Co., Oxford, 1995)
4. Dawkins, R. The Ancestor's Tale (Weidenfeld & Nicolson, New York, 2004)
5. Nee, S. Nature 429, 804-805 (2004).
Nature http://www.nature.com/nature
--------------------------------
Related Material:
EVOLUTIONARY BIOLOGY: ON THE SCHEME OF ANIMAL PHYLA
The following points are made by M. Jones and M. Blaxter (Nature 2005 434:1076):
1) Despite the comforting certainty of textbooks and 150 years of argument, the true relationships of the major groups (phyla) of animals remain contentious. In the late 1990s, a series of controversial papers used molecular evidence to propose a radical rearrangement of animal phyla [1-3]. Subsequently, analyses of whole-genome sequences from a few species showed strong, apparently conclusive, support for an older view [4-6]. New work [7] now provides evidence from expanded data sets that supports the newer evolutionary tree, and also shows why whole-genome data sets can lead phylogeneticists seriously astray.
2) Traditional trees group together phyla of bilaterally symmetrical animals that possess a body cavity lined with mesodermal tissue, the "coelom" (for example, the human pleural cavity), as Coelomata. Those without a true coelom are classified as Acoelomata (no coelom) and Pseudocoelomata (a body cavity not lined by mesoderm). We call this tree the A-P-C hypothesis. Under A-P-C, humans are more closely related to the fruitfly Drosophila melanogaster than either is to the nematode roundworm Caenorhabditis elegans [5,6].
3) In contrast, the new trees [1-3,7] suggest that the basic division in animals is between the Protostomia and Deuterostomia (a distinction based on the origin of the mouth during embryo formation). Humans are deuterostomes, but because flies and nematodes are both protostomes they are more closely related to each other than either is to humans. The Protostomia can be divided into two "superphyla": Ecdysozoa (animals that undergo ecdysis or moulting, including flies and nematodes) and Lophotrochozoa (animals with a feeding structure called the lophophore, including snails and earthworms). We call this tree the L-E-D hypothesis. In this new tree, the coelom must have arisen more than once, or have been lost from some phyla.
4) Molecular analyses have been divided in their support for these competing hypotheses. Trees built using single genes from many species tend to support L-E-D, but analyses using many genes from a few complete genomes support A-P-C [5,6]. The number of species represented in a phylogenetic study can have two effects on tree reconstruction. First, without genomes to represent most animal phyla, genome-based trees provide no information on the placement of the missing taxonomic groups. Current genome studies do not include any members of the Lophotrochozoa. More notably, if a species' genome is evolving rapidly, tree reconstruction programs can be misled by a phenomenon known as long-branch attraction.
5) In long-branch attraction, independent but convergent changes (homoplasies) on long branches are misconstrued as "shared derived" changes, causing artefactual clustering of species with long branches. Because these artefacts are systematic, confidence in them grows as more data are included, and thus genome-scale analyses are especially sensitive to long-branch attraction. Long branches can arise in two ways. One is when a distantly related organism is used as an "outgroup" to root the tree of the organisms of interest. The other is when one organism of interest has a very different, accelerated pattern of evolution compared with the rest.
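Long-branch attraction can be demonstrated with a toy simulation (a hypothetical sketch in the spirit of the classic four-taxon example, not code from the paper): binary characters evolve on a true tree ((A,B),(C,D)) in which A and C sit on long, fast-evolving branches; scoring the three possible quartets by Fitch parsimony then tends to favour the wrong grouping ((A,C),(B,D)), because convergent changes on the two long branches masquerade as shared derived ones.

# Hypothetical toy demonstration of long-branch attraction (not from the paper):
# two-state characters evolve on a 4-taxon tree ((A,B),(C,D)) in which A and C
# have long branches; maximum parsimony is misled into grouping A with C.
import random

random.seed(1)

def evolve(state, p):
    """Flip a binary character with probability p (one branch of the tree)."""
    return 1 - state if random.random() < p else state

def simulate(n_sites, p_long=0.40, p_short=0.05, p_internal=0.05):
    """Simulate n_sites characters on the true tree ((A,B),(C,D))."""
    data = {t: [] for t in "ABCD"}
    for _ in range(n_sites):
        root = random.randint(0, 1)
        left = evolve(root, p_internal)             # ancestor of A and B
        right = evolve(root, p_internal)            # ancestor of C and D
        data["A"].append(evolve(left, p_long))      # long branch
        data["B"].append(evolve(left, p_short))
        data["C"].append(evolve(right, p_long))     # long branch
        data["D"].append(evolve(right, p_short))
    return data

def parsimony_score(data, pair1, pair2):
    """Fitch parsimony score of the quartet ((pair1),(pair2)) summed over sites."""
    score = 0
    for site in zip(*(data[t] for t in "ABCD")):
        states = dict(zip("ABCD", site))
        cost = 0
        sets = []
        for x, y in (pair1, pair2):
            if states[x] == states[y]:
                sets.append({states[x]})            # cherry agrees: no change
            else:
                sets.append({0, 1})                 # cherry disagrees: one change
                cost += 1
        if not (sets[0] & sets[1]):                 # joining the two cherries
            cost += 1
        score += cost
    return score

data = simulate(2000)
for quartet in [(("A", "B"), ("C", "D")), (("A", "C"), ("B", "D")), (("A", "D"), ("B", "C"))]:
    print(quartet, parsimony_score(data, *quartet))
# With these branch lengths the wrong grouping (("A","C"),("B","D")) usually gets
# the lowest score, and the artefact only strengthens as n_sites grows -- the
# systematic error described above.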
References (abridged):
1. Aguinaldo, A. M. A. et al. Nature 387, 489-493 (1997)
2. Winnepenninckx, B. et al. Mol. Biol. Evol. 12, 1132-1137 (1995)
3. Adoutte, A., Balavoine, G., Lartillot, N. & de Rosa, R. Trends Genet. 15, 104-108 (1999)
4. Mushegian, A. R., Garey, J. R., Martin, J. & Liu, L. X. Genome Res. 8, 590-598 (1998)
5. Blair, J. E., Ikeo, K., Gojobori, T. & Hedges, S. B. BMC Evol. Biol. 2, 7 (2002)
6. Wolf, Y. I., Rogozin, I. B. & Koonin, E. V. Genome Res. 14, 29-36 (2004)
7. Philippe, H., Lartillot, N. & Brinkmann, H. Mol. Biol. Evol. 22, 1246-1253 (2005)
Nature http://www.nature.com/nature
--------------------------------
Related Material:
EVOLUTION: GENOMES AND THE TREE OF LIFE
The following points are made by K.A. Crandall and J.E. Buhay (Science 2004 306:1144):
1) Although we have not yet counted the total number of species on our planet, biologists in the field of systematics are assembling the "Tree of Life" (1,2). The Tree of Life aims to define the phylogenetic relationships of all organisms on Earth. Driskell et al (3) recently proposed a computational method for assembling this phylogenetic tree. These investigators probed the phylogenetic potential of ~300,000 protein sequences sampled from the GenBank and Swiss-Prot genetic databases. From these data, they generated "supermatrices" and then super-trees.
2) Supermatrices are extremely large data sets of amino acid or nucleotide sequences (columns in the matrix) for many different taxa (rows in the matrix). Driskell et al (3) constructed a supermatrix of 185,000 protein sequences for more than 16,000 green plant taxa and one of 120,000 sequences for nearly 7500 metazoan taxa. This compares with a typical systematics study of, on a good day, four to six partial gene sequences for 100 or so taxa. Thus, the potential data enrichment that comes with carefully mining genetic databases is large. However, this enrichment comes at a cost. Traditional phylogenetic studies sequence the same gene regions for all the taxa of interest while minimizing the overall amount of missing data. With the database supermatrix method, the data overlap is sparse, resulting in many empty cells in the supermatrix, but the total data set is massive.
3) To solve the problem of sparseness, the authors built a "super-tree" (4). The supertree approach estimates phylogenies for subsets of data with good overlap, then combines these subtree estimates into a supertree. Driskell et al (3) took individual gene clusters and assembled them into subtrees, and then looked for sufficient taxonomic overlap to allow construction of a supertree. For example, using 254 genes (2777 sequences and 96,584 sites), the authors reduced the green plant supermatrix to 69 taxa from 16,000 taxa, with an average of 40 genes per taxon and 84% missing sequences! This represents one of the largest data sets for phylogeny estimation in terms of total nucleotide information; but it is the sparsest in terms of the percentage of overlapping data.
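A minimal sketch of the bookkeeping involved (with made-up gene names and taxa, not Driskell et al's actual pipeline) shows where the sparseness comes from: per-gene alignments covering different subsets of taxa are concatenated into a supermatrix, absent taxa are padded with '?', and the shared-taxon overlap between gene clusters is what determines whether subtrees can later be grafted into a supertree.

# Minimal sketch of the supermatrix bookkeeping described above (hypothetical
# toy data, not the Driskell et al. pipeline): each gene cluster covers only
# some taxa, so the merged matrix is mostly empty cells, marked '?'.
gene_alignments = {
    # gene name -> {taxon: aligned sequence}; lengths are per-gene
    "rbcL": {"Arabidopsis": "ATGTCA", "Oryza": "ATGTCC", "Pinus": "ATGACA"},
    "matK": {"Arabidopsis": "TTGA", "Physcomitrella": "TTGC"},
    "18S":  {"Oryza": "GGCTA", "Pinus": "GGCTT", "Physcomitrella": "GGATA"},
}

taxa = sorted({t for seqs in gene_alignments.values() for t in seqs})
genes = sorted(gene_alignments)

# Concatenate gene blocks, padding absent taxa with '?' of the right length.
supermatrix = {t: "" for t in taxa}
for g in genes:
    block_len = len(next(iter(gene_alignments[g].values())))
    for t in taxa:
        supermatrix[t] += gene_alignments[g].get(t, "?" * block_len)

total_cells = sum(len(s) for s in supermatrix.values())
missing_cells = sum(s.count("?") for s in supermatrix.values())
print(f"{len(taxa)} taxa x {total_cells // len(taxa)} sites, "
      f"{100 * missing_cells / total_cells:.0f}% missing")

# Overlap bookkeeping for the supertree step: which pairs of gene clusters
# share enough taxa to be joined through common species?
for i, g1 in enumerate(genes):
    for g2 in genes[i + 1:]:
        shared = set(gene_alignments[g1]) & set(gene_alignments[g2])
        print(g1, g2, "shared taxa:", sorted(shared))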
4) Yet even with such sparseness, the authors are still able to estimate robust phylogenetic relationships that are congruent with those reported using more traditional methods. Computer simulation studies (5) recently showed that, contrary to the prevailing view, phylogenetic accuracy depends more on having sufficient characters (such as amino acids) than on whether data are missing. Clearly, building a super-tree allows for an abundance of characters even though there are many missing entries in the resulting matrix.
References (abridged):
1. M. Pagel, Nature 401, 877 (1999)
2. A new NSF program funds computational approaches for "assembling the Tree of Life" (AToL). Total AToL program funding is $13 million for fiscal year 2004. NSF, Assembling the Tree of Life: Program Solicitation NSF 04-526 (www.nsf.gov/pubs/2004/nsf04526/nsf04526.pdf)
3. A. C. Driskell et al., Science 306, 1172 (2004)
4. M. J. Sanderson et al., Trends Ecol. Evol. 13, 105 (1998)
5. J. Wiens, Syst. Biol. 52, 528 (2003)
Science http://www.sciencemag.org/
ScienceWeek http://scienceweek.com/
Copyright © 2005 ScienceWeek
All Rights Reserved
US Library of Congress ISSN 1529-1472