Boycott the "Greater" Israeli Apartheid Regime!

Learn, Know yourself, Change yourself... Learn from people, Know them, Change the World Together!

Saturday 27 November 2010

Truthdig - Book Review


By Nomi Prins


“A Question of Values” is an alternately sobering and inspiring collection of essays by noted historian and cultural critic Morris Berman. Berman pulls no punches in laying bare the truths about who we are, not just as a nation, but also as individuals wrapped up in the destructive pursuit of material excess. In the unswerving style of his other writings, he rips apart the national illusion of greatness.

The book is divided into four sections: “Lament for America,” “Mind and Body,” “Progress True and False” and “Quo Vadis?” (Where are you going?). Each part examines the American identity from a historical, spiritual, technological and alternative-future perspective, respectively. Together, they ask the imperative questions: How did we get to this point, and how do we get out? Or will we? (“This point” being a country caught in a societal malaise of promoting external accumulation over internal compassion.) In doing so, the sections inspect our inner and outer fabric as a nation.

In Section I, the second essay, “Conspiracy vs. Conspiracy in American History,” Berman dissects America’s profound sense of self-importance, a central theme of the entire collection. He discusses how the “post-election euphoria in the United States over Barack Obama was nothing more than a bubble, an illusion, because the lion’s share of the $750 million he collected in campaign contributions” came from Wall Street. Thus, the fact that Obama proceeded to promise to rein in Wall Street’s excesses lay in stark and rather public contrast to his own connections with the banks.

This political sleight of hand is part of a larger problem for which Berman lists four descriptive conspiracies (or fallacies): First, that we are a chosen people (so we get to do whatever we want); second, that America itself is a kind of religion; third, that we must endlessly expand, whether geographically or financially; and, lastly, that our national character has been built on extreme individualism going back to our colonization. This last he considers the main reason why “American history can be seen as the story of a nation consistently choosing individual solutions over collective ones.”

Berman expertly interweaves narrative and analysis, supported by anecdotes, historical fact and a plethora of quotes from historians, philosophers and authors, spanning Plato to Chris Hedges. With an ardent voice and poignant prose, Berman brings us to his conclusion that the only hope for America is to stop believing its own hype—something he doesn’t consider very likely. Ranging from the Wall Street bailouts and political delusion to 9/11, the Iraq war, the Virginia Tech massacre and the interplay with China, Berman’s lament isn’t for an America that lost its way, but for one that never had a heart, only a colossal ego that raids other nations with self-righteous impunity.

The collection also provides a guide not just on what to think, but how to think. In Section II, Berman subtly balances the more dour aspects of the first section with a chattier discourse, relying on a combination of outside sources and his own entertaining life experiences. The section covers the message in certain modern Greek tragedies, like the movie “Damage,” and the very mortal question we all ask ourselves at different points in our lives: If I had to do it all over again, would I do it differently? And if so, would I wind up in the same place anyway? In his essay “Be Here Now,” Berman examines the need to be present in one’s life, because of the rapidity with which it flashes by. As Zen as this sentiment is, coming on the heels of his historical analysis, it centers our own focus, not selfishly, but with self-awareness. For in the end, as Berman writes, “there is no forcing things to make sense: either they do or they don’t, and there is no guarantee that they will.” It’s the thought about them that counts. Or doesn’t.

Section III ranges from the socially bankrupt practices resulting from endless technological advances to the disastrous global competition for food and water. If we measure progress by consumption, how can it ever stop until there’s nothing left? According to Berman it can’t, which underscores a phenomenon he dubs “catastrophism.” As he puts it, “it is a fair guess that we shall start doing things differently only when there is no other choice; and even then, we shall undoubtedly cast our efforts in the form of a shiny new and improved hula hoop, the belief system that will finally be the true one, after all of those false starts; the one we should have been following all along.”

In Section IV, Berman brings us full circle in our assessment of national identity, taking us to Asia, a target of Ben Bernanke’s snotty finger-wagging this month. Berman notes the irony that “when Mao Zedong called the United States the paper tiger in the 1950s, everybody laughed.” As we know now, this pronouncement wasn’t so far off base. Our Washington finance chiefs, notably Treasury Secretary Tim Geithner and Fed Chairman Bernanke, want to keep pumping, printing and devaluing our money to create the illusion of national economic well-being while demanding that China keep its currency strong. And thus America’s national ego carries on, as Berman illustrates.

In “A Question of Values,” Berman not only warns us that America must change or die, but he calls on each of us to stop and imagine the potential of living in a better way. As such, there is also something uplifting about the book; it makes you want to call your parents to see how they are doing, or check in with your friends or your community. You know, be more in touch. Help someone.

When Berman’s previous book, “Dark Ages America: The Final Phase of Empire,” came out in 2006, New York Times reviewer Michiko Kakutani declared it “the kind of book that gives the Left a bad name.” There is no doubt that Berman’s work hits American ideals where it counts, but Kakutani’s knee-jerk, rather than introspective, response is precisely the kind of reaction that illustrates Berman’s thesis.

If there is anything missing from Berman’s collection, it’s that it offers no pat remedies of the kind that authors tend to stick into the wrap-up chapter. But it’s the lack of a clearly delineated way out of our collective malaise that is the most honest answer of all. It is the basis of our entire value and priority system that is off. So, the only possible strategy for any kind of national redemption is to reassess our core values and original construction. There’s no easy way to achieve that. Still, any hope of resurrecting ourselves as a nation begins with a keener awareness of who we really are and why, and to that end, Berman’s book of essays will inspire much-needed introspection.

Monday 22 November 2010

Sex and Relationships

Women Who Like to Be Dominated in Bed: Talking to BDSM Submissives


Sunday 21 November 2010

Truthdig - Book excerpt

Beyond ‘1984’: New Frontiers of Mass Surveillance


By Elliot D. Cohen

Does the notion of remote-controlled soldiers—the fully human kind—seem only a sci-fi vision or the product of someone’s paranoid imagination? Guess again: There’s a project in the works as the military and big business join forces to make privacy a thing of the past, according to Elliot D. Cohen, whose new book, “Mass Surveillance and State Control: The Total Information Awareness Project,” is excerpted below.
* * *

Elliot D. Cohen, “Mass Surveillance and State Control,” published 2010, Palgrave Macmillan, reproduced with permission of Palgrave Macmillan.

Surveillance cameras have finite ranges within which they can track a person. However, there are currently other technologies that can be used to track people in real time, which are not constrained by location.


Radio Frequency ID Technologies and Government Surveillance

One such technology is Radio Frequency Identification (RFID) microchips, which can be smaller than a grain of sand. These devices have the capacity to store data, which can be read at a distance by an RFID reader. Like our cell phones, the emerging technology also has GPS capacity and can thus be used to locate and track a person or object carrying the device.

Now RFID chips are also being implanted in human beings, not just human artifacts. In 2004, the Food and Drug Administration approved the use of RFID chips for subcutaneous implantation in hospital patients, where they could be used by medical staff to access computerized patient information such as the patient’s medical history. The maker of this chip, VeriChip, has also lobbied the Department of Defense to embed RFID chips in soldiers to replace the standard “dog tags.” Other human applications include implanting them in children, and even in prisoners.

In fact, the London justice department has begun to explore the idea of using a hypodermic needle to inject such devices into the back of the arms of certain inmates, such as sex offenders, then releasing them from prison, thereby freeing up space in overcrowded British prisons. The prisoners would be tracked by satellite and barred from entering certain “safe” zones such as schools, playgrounds, and former victims’ homes.


An Emerging Internet of Humans

One wave of research concerns the creation of “an internet of things,” whereby RFID interfaces are constructed between cyberspace and physical objects, permitting two-way exchanges between online software technologies and databases, on one end, and objects in the material world, on the other. In this way, these objects can be identified, tracked, traced, monitored, and controlled.
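To make the tracking idea concrete, here is a minimal, purely illustrative sketch, not taken from Cohen's book: a network of fixed RFID readers reports tag sightings to a central log, from which the movements of any tagged object (or person) can be reconstructed. The tag IDs, reader locations and function names are invented for the example.

# Illustrative sketch (not from the book): readers report RFID tag sightings,
# and a central log reconstructs where each tagged object or person has been.
# All identifiers below are invented for the example.
from collections import defaultdict
from datetime import datetime

# tag_id -> list of (timestamp, reader_location) sightings
sightings = defaultdict(list)

def record_sighting(tag_id, reader_location, timestamp=None):
    """A reader at a fixed location reports that it saw a tag."""
    sightings[tag_id].append((timestamp or datetime.utcnow(), reader_location))

def movement_history(tag_id):
    """Return the chronological trail of locations for one tag."""
    return sorted(sightings[tag_id])

# Example: one tagged badge passes three readers during a day.
record_sighting("TAG-0001", "building-entrance")
record_sighting("TAG-0001", "pharmacy-aisle")
record_sighting("TAG-0001", "parking-garage")
print(movement_history("TAG-0001"))

Even this toy version shows why watchdog groups worry: nothing in the mechanism limits what gets tagged or who queries the trail.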

The “internet of things” project began as a research project by Massachusetts Institute of Technology’s Auto-ID Labs to help the Department of Defense precisely track and control billions of dollars of military inventory; but there is already concern among prominent technology watchdog organizations, such as the Electronic Frontier Foundation, that the government may also have designs on using such systems to monitor and collect information on people’s interests, habits, and activities through the things that they purchase.

Further, since RFID chips have already begun to be embedded in human beings, the progressive development of such a project may come to embrace human beings along with physical objects. Thus, with the advance of an “internet of things,” human beings, like physical inventory, might be “tagged” with an RFID chip and systematically tracked, traced, monitored, and controlled.

Are such possibilities speculative? Yes, but the potential of RFID technologies to become an incredibly oppressive kind of surveillance is not speculative. As was discussed in the preceding chapters, there is now a trend for government to override privacy for the sake of “winning the war on terror.” Viewed in this light, it would be presumptuous to think that such technology would not be so used—at least if government does not depart from its current tendency to abridge the right to privacy in the name of national security.


The DARPA/IBM Global Brain Surveillance Initiative

Going beyond monitoring such aspects of human life as behavior, electronic messaging, and geographical location is the direct monitoring of people’s mental aspects, such as their thoughts, perceptions, and emotions. In December 2008, IBM and collaborators from several major universities were awarded US$4.9 million from DARPA to launch the first phase of its “Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) initiative.” Under this grant, IBM has launched its “cognitive computing initiative” to develop a (literal) “global brain.”

The enormity of this project is glaring. Nonetheless, its intentions seem clear, and they include, among other things, the global monitoring of human beings’ most intimate and personal space: what is going on inside their minds; and then what is going on inside their organizations, their homes, and even their cars.

In 2004, DARPA funded a US$19 million program led by a Duke University neurobiologist, Miguel Nicolelis, in which a monkey was able to control a remote robotic arm hundreds of miles away, through a two-way wireless interaction between the monkey’s cerebral cortex and the robotic arm. DARPA’s military goals for this project included giving combat soldiers the power to remotely control military equipment and weapons at a distance through such brain machine interfaces (BMI). As was mentioned in Chapter 1, another goal of DARPA is to remotely control the soldiers themselves through the use of peripheral devices wirelessly interfacing with their brains, including remotely controlling natural emotions, such as fear, and feelings, such as that of fatigue, in combat situations.

Here, there are profound implications for DARPA/IBM’s cognitive computing initiative to build a “global brain.” If sensors that monitor and control soldiers’ motor and sensory brain activities were “plugged into” a global brain through BMI interfaces, the possibility would emerge of remotely controlling and coordinating an entire army of soldiers by networking their individual brains. … The stored data and supercomputing capabilities could then … give an army a marked, logistical advantage over a nonnetworked opponent. Of course, this advantage would be purchased at the expense of turning human soldiers into military robots plugged into a literal network of remotely controlled fighting machines. There would be little left that would make them distinctively human.

But why limit BMI technology when it could also be used to improve parenting skills; exponentially expand individual intellects and knowledge bases; eliminate or greatly reduce highway accidents and criminal activity; and, of course, “win the war on terror”? In other words, why not make BMI/global brain technologies mainstream?

Truthdig - Film review


by Richard Schickel

Isabelle Huppert plays the lead in Claire Denis’ “White Material.” (Photo: imdb.com)

In a post-colonial, pre-revolutionary African country, the French peacekeepers are pulling out and government troops are contending for control of the rapidly failing nation with a rebel leader known as The Boxer, whose troops are largely child soldiers. Wounded, The Boxer has taken refuge in a coffee plantation owned by the Vials and managed by Maria (a muscular Isabelle Huppert), who is determined to hang on to the acreage despite the increasingly desperate conditions she confronts.

Director Claire Denis’ “White Material” tells her story in a jumpy jumble of narrative leaps that at first annoys and then absorbs the viewer. Another, perhaps lesser, director would probably have done her best to clarify the confusions of this story; instead, Denis embraces them. The predominant image of this film—repeated in a dozen variants—is of a lone woman walking or driving the empty roads of this beautiful, unnamed country, seeking a salvation that is both practical and spiritual.

It is time to harvest her crop and her workers have fled the plantation. Therefore, her first order of business is to replace them. This she briefly manages to do, although her new crew also decamps almost immediately. For allies she is pretty much reduced to a feckless former husband and a “half-baked” son (as someone describes him)—a lad who may merely be afflicted with adolescent angst, but who is more likely on his way to becoming a full-scale nut job.

We never quite understand why saving the plantation is so important to Maria. Yes, the family has lived there for a couple of generations. And, yes, we understand that she cannot imagine any other life for herself. On the other hand, it has been years since they turned a profit on their crop, and the old colonial lifestyle that once sustained them is long gone. To remain in place is to assure an absurd and anonymous extinction—the machete at midnight, the rape by the roadside.

A couple of times Maria reflects on the beauty of the landscape. But that scarcely seems sufficient reason to wage this hopeless fight. She is, we come to see, stubborn simply because that is her nature. Caught up in the practical details of her struggle, she cannot pause to contemplate the larger meaning—if any—of what she is doing. She is, you might say, morally mute. Racing hither and thither, improvising this or that solution to whatever practical problem presents itself, she has no time for irony, let alone long, long thoughts.

Therein lies the rough beauty of this film. The hills may be alive with menace—child soldiers shooting off their guns, rebel and government troops ready to kill for no good reason—but Huppert’s character, as sinewy in spirit as she is in physical appearance, just keeps plowing on. Never once does she openly acknowledge the peril that surrounds her. She seems to feel that if she just keeps busy she is impervious to threat.

Eventually one comes to think she represents that curious sense of exceptionalism that has led to so many modern tragedies—not only in the colonial world, but elsewhere as well. “White Material” never mentions it, but Maria is, in effect, shouldering the White Man’s Burden. She has the energy, the intelligence—and the blindness—to dominate a vastly larger and infinitely more chaotic civil population. But she cannot recognize that it would require only a few—well, yes, “half-baked”—organizing ideas for the restive natives to casually, heedlessly exterminate her.

Her only real option is to run. This was true in Nazi Germany, in Kenya and Rwanda, and in dozens of other post-colonial contexts as well. But people do not exercise that choice. They think the ugliness must only be temporary. They rely on their pride of place in the pecking order. They are full of an unrecognized hubris. And so they die. The last we see of Maria is her standing alone in open country, at the end of her tether, but perhaps not fully realizing, even then, that she has reached that point.

“White Material” is a difficult, narratively thorny, film. But in Huppert’s uncompromising performance—she never once appears to harbor an abstract or idealistic thought or one that we would feel comfortable sentimentalizing—and in Denis’ refusal to embrace easy, uplifting answers to an insoluble problem, it offers us a portrait of an exemplary, persistent (and, in a certain sense, tragic) figure. For Maria Vial is the victim of a spirit too primitive, too animalistic, to be the avatar of a new, somewhat better society. She will, we are sure, die the victim of blind forces that can only make a bad place even worse—more brutal, more irrational—than it already is.

Monday 15 November 2010

Fake news on Fakebook

Somali Pirates Refuse to Board Carnival Cruise Ships

Fake news by Andy Borowitz


MOGADISHU, Somalia - In yet another public relations setback for the beleaguered cruise ship company, Somali pirates today said they would no longer board Carnival Cruise ships, citing “unsafe working conditions.”

“If Carnival thinks that it’s going to be business as usual between them and the Somali pirates, they need to have their heads examined,” said Somali pirate spokesman Sugule. “We Somali pirates may be bold, but we’re not crazy.”

The pirate said that the recent fire that crippled the giant cruise ship Carnival Splendor “has sent a shiver through the pirate community.”

“We Somali pirates face enough risks without dealing with decks bursting into flames,” he said. “And don’t get me started on the nonfunctioning toilets.”

When asked if the Somali pirates might attempt to board Carnival ships in the future, he responded, “I am telling me hearties that if they were thinking of pillaging a Carnival ship of its booty over the holidays, they should make alternative plans.”

Carol Foyler, a spokesperson for Carnival Cruises, said that the company “would be working overtime to win back the pirates’ trust.” In the meantime, Foyler said, Carnival would be unveiling a new slogan in the weeks to come: “Come for the fun, stay for the raging inferno.”


Bush Publishes ‘I Can Has Prezidensy’

The Borowitz Report has obtained an advance copy of former President George W.[W-aterboarding/W.M.D.] Bush’s memoir, entitled “I Can Has Prezidensy.” Here are some highlights:

—The book contains a “Where’s Waldo?” foldout section with WMDs.

—Bush says the biggest disappointment of his eight years in office was learning there was no Santa Claus.

—The book’s appendix includes a series of connect-the-dot drawings Bush was unable to complete.

—Bush on the unfinished business of his presidency: “I never did learn how that neat story about the goat ended.”

—Bush’s memoir is a quick read, since 95 percent of it has been redacted by Dick Cheney.

—Six months after the book’s publication, there will be an English-language version.


Award-winning humorist, television personality and film actor Andy Borowitz is author of the book “The Republican Playbook.”

© 2010 CREATORS SYNDICATE

Disclaimer: The add-ons to the great American leader's name do not appear in the original article and could be considered prima facie evidence for a defamation case... In case this helps the great American justice system's piranha-lawyers, so be it!

Thursday 11 November 2010


by Dominic Sandbrook

When Penguin Books prevailed in the famed obscenity trial 50 years ago, the result was as much a victory for the free market as for free expression.


One day in November 1960, a man sat down and wrote a letter to the prime minister. "England needs your help," he began, imploring Harold Macmillan to intervene in what he saw as the most flagrant miscarriage of justice in living memory - the acquittal of Penguin Books in the celebrated case of Lady Chatterley's Lover. And contrary to what we might assume today, Macmillan's correspondent was far from alone. Sexual intercourse, Philip Larkin once claimed, began "between the end of the Chatterley ban/ and the Beatles' first LP" - but if it did, many people were dead set against it.

Even as thousands of copies of D H Lawrence's explicit tale of forbidden love were flying off the shelves, there were reports of deep popular dissatisfaction with the verdict. In Edinburgh, one woman bought a copy only to set it on fire on the pavement outside the bookshop; in South Wales, female library assistants demanded their employers' permission to refuse to handle the book. Writing to the Home Office, a "family man and grammar schoolmaster" claimed that his Essex pupils were finding it "impossible to buy 'proper comics' in local shops, their place being taken by sex-filled trash". And from Surrey, a distressed woman wrote to the wife of the home secretary, Rab Butler, explaining that she had a 13-year-old daughter at boarding school and was afraid that "day girls there may introduce this filthy book at only three and sixpence . . . If a mistress protests, girls can reply that a clergyman has said: 'Every Christian should read it.'"

Few accounts of the Chatterley case - which has become a familiar symbol of the social and cultural changes sweeping over Britain half a century ago - find much room for such protests. Even at the time, most opinion-formers saw Penguin's acquittal on charges of breaching the Obscene Publications Act as a victory of freedom over repression. Penguin added a caption on new editions of the novel, proclaiming that the trial at the Old Bailey had been "not just a legal tussle but a conflict of generation and class". Kenneth Tynan claimed that it had been a struggle "between Lawrence's England and Sir Clifford Chatterley's England; between contact and separation; between freedom and control; between love and death". "I feel as if a window has been opened," said Lawrence's stepdaughter Barbara, "and fresh air has blown right through England."

A half-century later, it is hard to recapture the moral climate of a society that still saw fit to ban books and magazines because they were considered likely to "deprave and corrupt" their readers. In fact, the law under which Penguin was prosecuted, the Obscene Publications Act 1959, was itself a symbol of change, introduced by the young Labour MP Roy Jenkins precisely because its crucial loophole, the question of literary merit, would clear the way for books such as Lady Chatterley's Lover. But what seems most obvious, looking back, is how much this was a change driven from the top - or by those on their way there - rather than from the bottom.

Jenkins, though the son of a miner, later became a kind of William Gladstone tribute act. Meanwhile, Gerald Gardiner, who was widely praised for leading Penguin's defence, was the Harrow-educated son of a mining executive and one of the leading lights of what Michael Frayn memorably called the "herbivore" establishment. A pacifist and founder member of CND, Gardiner became Harold Wilson's lord chancellor four years later.

In a sense, therefore, the Chatterley case was less a clash of ideals or classes than one between two different wings of a political and cultural elite: one rooted in old ideas of decency and de­corum, the other in latitude and self-expression. And as in the drugs trial of Mick Jagger and Keith Richards, which followed in 1967, most of the press supported the forces of liberation. Yet, thanks to Jenkins's legislation, the Crown had little choice but to prosecute. As the prosecuting counsel, Mervyn Griffith-Jones, told the director of public prosecutions, Sir Theobald Mathew: "If no action is taken in respect of this publication, it will make proceedings against any other novel very difficult."

Even at the time, Griffith-Jones's handling of the prosecution attracted outright mockery. Although the prosecution drew up a long list of potential witnesses who might condemn the novel as obscene, none of them agreed to testify. At one point, it even considered bringing over an American literary critic who had once described the book as "a dreary, sad performance with some passages of unintentional, hilarious low comedy", although it eventually abandoned the idea. Instead, the prosecution team wasted its time before the trial going through the book line by line, noting down obscenities: on page 204, for example, one "bitch goddess of Success", one "fucking", one "shit", one "best bit o' cunt left on earth" and three mentions of "balls".

But Griffith-Jones was not the man to fight this battle. A war hero who was awarded the Military Cross after serving in North Africa and Italy, he had been left high and dry by the tides of cultural change. When he asked what is perhaps the best-known rhetorical question in legal history - "Is it a book you would even wish your wife or your servants to read?" - he guaranteed his own defeat.

By contrast, Gardiner had lined up 35 dis­tinguished witnesses convinced of the novel's literary merit, including Cecil Day-Lewis, E M Forster, Richard Hoggart and Rebecca West. The most widely reported exchange came when the bishop of Woolwich, John Robinson, appeared in the witness box. "What I think is clear," he told the court, "is that what Lawrence is trying to do is to portray the sex relationship as something essentially sacred . . . as in a real sense an act of holy communion." Asked whether it was "a book which, in your view, Christians ought to read", he replied confidently: "Yes, I think it is."

Fifty years later, I doubt there are many readers who wish the verdict had gone the other way. By and large, the days when the state controlled literary publications and private morality are long gone; few of us, barring the most backward-looking, would want things to be different. And predictions that the Chatterley verdict would open the floodgates to a tide of promiscuity proved, in the short term at least, very wide of the mark. Despite the extraordinary sales of Lawrence's novel after the publishers were acquitted - Foyles bookshop sold its first 300 copies in 15 minutes on 10 November, and took orders for 3,000 more - Britain in the 1960s remained a remarkably conservative society. Whatever Larkin may have thought, it was not Lawrence's prose that transformed sex in Britain. It was rising prosperity, individualism and reliable birth control.

But you do not, I think, have to be an arch-reactionary to see that the legacy of the Chatterley case was more ambiguous than the conventional wisdom often suggests. In one sense, Penguin's acquittal was a victory for the free market - a market not just of books, but of words and ideas - over state regulation. Kicking against their elders, the reformers of the 1960s worshipped freedom and self-expression; as they saw it, the old values of collective morality and individual restraint were inappropriate in an age of affluence and mass education.

Perhaps they were. But only a thin line divides self-expression from self-indulgence, and the freedom to flout social convention can easily curdle into the freedom to ignore the rest of society altogether. Sometimes, reading accounts of the case, I even wonder whether somebody ought to speak up for poor old Griffith-Jones. His remark about wives and servants was absurd, but surely, as a war hero who gave sterling service at the Nuremberg war crimes trials, he deserves a better reputation. In his own way, he made a fine representative of a Britain soon to be swept away. It was a Britain with many evils and injustices, certainly, but one from which we could still learn something about the virtue of self-restraint.


“Lady Chatterley's Lover - 50th Anniversary Edition" is published in the Penguin Classics series (£8.99)

Thursday 4 November 2010

The New Statesman


by Michael Brooks


Every November, the Royal Society and the French Académie des sciences give out a prize to a scientist who has discovered how to do something innovative with computers. Naturally, it has to be a useful innovation. The 45 employees at the Department for Work and Pensions who have been disciplined after using their computers for shopping, pornography and "unauthorised downloading" need not apply.

It's not just government employees who abuse their computers. In May 2008, the head of the Max Planck Institute for Mathematics warned that City traders were doing it, too. He argued that their lazy acceptance of whatever the computers said would lead to financial disaster.

No one understands "garbage in = garbage out" better than the scientists who have learned that their reputation can be ruined by placing unquestioning trust in the printout. To take a slightly ridiculous example, researchers at the UK's National Physical Laboratory have just fixed a 20-year-old problem in the official computer model of the shape of the outer ear. It was created to define a quality standard for the performance of headphones and mobile phones (you are forgiven for not realising how useful scientists can be). The input scan for the computer model was done at too low a resolution, however, and manufacturers have since been filling in the gaps in ways that reflect best on their brand. It ain't necessarily so, just because the computer says it is.
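To make the column's "garbage in = garbage out" point concrete, here is a minimal sketch, not from the article, of how a scan taken at too low a resolution leaves gaps that later interpolation can only paper over, never truly fill. The "ear_profile" shape and all numbers are invented for the illustration.

# Minimal sketch (illustrative only): a detailed shape scanned at too low a
# resolution loses detail that no downstream interpolation can recover.
import numpy as np

def ear_profile(x):
    """Stand-in for a finely detailed physical shape (purely made up)."""
    return np.sin(2 * np.pi * x) + 0.3 * np.sin(2 * np.pi * 9 * x)

fine_x = np.linspace(0.0, 1.0, 1000)   # dense sampling: the "ground truth"
truth = ear_profile(fine_x)

coarse_x = np.linspace(0.0, 1.0, 8)    # the too-coarse input scan
scan = ear_profile(coarse_x)

# Downstream users "fill in the gaps" by interpolating the coarse scan.
reconstruction = np.interp(fine_x, coarse_x, scan)

error = np.max(np.abs(reconstruction - truth))
print(f"Worst-case error of the interpolated model: {error:.2f}")
# The fine ripples are simply absent from the reconstruction: the output
# looks plausible, but it is only as good as the data that went in.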

Innovation in scientific computing has done some wonderful things. It has allowed us to model the heart, giving us insight into how drugs affect cardiac rhythms and how heart attacks develop. It enables us to analyse medical images more accurately, providing earlier diagnosis of cancers. Thanks to computer models, we can see why a misfolded protein gives rise to cystic fibrosis and predict the path of dangerous epidemics.

But in every case, the computer's predictions or declarations have to be checked against what scientists can observe happening.

In science, the only credible guide is real-world experiment. Across the Channel, the French use a slightly different word for "experiment" - expérience. That is no coincidence: it emphasises that scientists learn from trial and realisation. Think, for example, of what the British economist John Maynard Keynes called the "folly and injustice" of the UK government's 1931 plan to beat the recession. The measures, which in effect shut down economic activity, didn't work, and led to the abandonment of the gold standard.

The gold-standard scientific approach - experiment or experience - suggests there is no reason to believe that the same approach to beating recession will work now, either - whatever the printout from the Treasury's computer might say.

Wednesday 3 November 2010

Project Syndicate: A World of Ideas


by Henry I. Miller
PALO ALTO – “It’s alive, it’s moving, it’s alive... IT’S ALIVE!” So said Dr. Victor Frankenstein when his “creation” was complete. Researchers have long been fascinated with trying to create life, but mainly they have had to settle for crafting variations of living organisms via mutation or other techniques of genetic engineering.

In May, researchers at the J. Craig Venter Institute, led by Venter himself, synthesized the genome of a bacterium from scratch using chemical building blocks, and inserted it into the cell of a different variety of bacteria. The new genetic information “rebooted” its host cell and got it to function, replicate, and take on the characteristics of the “donor.” In other words, a sort of synthetic organism had been created.

Reactions in the scientific community ranged from “slight novelty” to “looming apocalypse.” The former is more apt: Venter’s creation is evolutionary, not revolutionary.

The goal of “synthetic biology,” as the field is known, is to move microbiology and cell biology closer to the approach of engineering, so that standardized parts can be mixed, matched, and assembled – just as off-the-shelf chassis, engines, transmissions, and so on can be combined to build a hot-rod.

Achieving this goal could offer scientists unprecedented opportunities for innovation, and better enable them to craft bespoke microorganisms and plants that produce pharmaceuticals, clean up toxic wastes, and obtain (or “fix”) nitrogen from the air (obviating the need for chemical fertilizers).

During the past half-century, genetic engineers, using increasingly powerful and precise tools and resources, have achieved breakthroughs that are opening up new opportunities in a broad array of fields. The Venter lab’s achievement builds on similar work that began decades ago. In 1967, a research group from Stanford Medical School and Caltech demonstrated the infectiousness of the genome of a bacterial virus called ΦX174, whose DNA had been synthesized with an enzyme using the intact viral DNA as a template, or blueprint. That feat was hailed as “life in a test tube.”

In 2002, a research group at the State University of New York, Stony Brook, created a functional, infectious poliovirus solely from basic, off-the-shelf chemical building blocks. Their only blueprint for engineering the genome was the known sequence of RNA (which comprises the viral genome and is chemically very similar to DNA). Similar to the 1967 experiments, the infectious RNA was synthesized enzymatically. It was able to direct the synthesis of viral proteins in the absence of a natural template. Once again, scientists had, in effect, created life in a test tube.

Venter’s group did much the same thing in the recently reported research, except that they used chemical synthesis instead of enzymes to make the DNA. But some of the hype that surrounded the publication of the ensuing article in the journal Nature was disproportionate.

Along with the Venter paper, Nature published eight commentaries on the significance of the work. The “real” scientists were aware of the incremental nature of the work, and questioned whether the Venter group had created a genuine “synthetic cell,” while the social scientists tended to exaggerate the implications of the work.

Mark Bedau, a professor of philosophy at Reed College, wrote that the technology’s “new powers create new responsibilities. Nobody can be sure about the consequences of making new forms of life, and we must expect the unexpected and the unintended. This calls for fundamental innovations in precautionary thinking and risk analysis.”

But, with increasing sophistication, genetic engineers using old and new techniques have been creating organisms with novel or enhanced properties for decades. Regulations and standards of good practice already effectively address organisms that may be pathogenic or that threaten the natural environment. (If anything, these standards are excessively burdensome.)

On the other hand, Swiss bioengineer Martin Fussenegger correctly observed that the Venter achievement “is a technical advance, not a conceptual one.” Other scientists noted that the organism is really only “semi-synthetic,” because the synthetic DNA (which comprises only about 1% of the dry weight of the cell) was introduced into a normal, or non-synthetic, bacterium.

Understanding the history of synthetic biology is important, because recognizing the correct paradigm has critical implications for how governments regulate it, which in turn affects the potential application and diffusion of the technology. Thirty-five years ago, the US National Institutes of Health adopted overly risk-averse guidelines for research using recombinant DNA, or “genetic engineering,” techniques. Those guidelines, based on what has proved to be an idiosyncratic and largely invalid set of assumptions, sent a powerful message that scientists and the federal government were taking seriously speculative, exaggerated risk scenarios – a message that has afflicted the technology’s development worldwide ever since.

Synthetic biology offers the prospect of powerful new tools for research and development in innumerable fields. But its potential can be fulfilled only if regulatory oversight is based on science, sound risk analysis, and an appreciation of the mistakes of history.


Henry I. Miller, a physician and molecular biologist and a fellow at Stanford University's Hoover Institution, was the founding director of the Office of Biotechnology at the US Food and Drug Administration. His most recent book is The Frankenfood Myth.


Copyright: Project Syndicate, 2010.
http://www.project-syndicate.org/