Sunday, October 31, 2010

Central Europe in Washington: Notes from a Cold-War Public Diplomacy Perspective

All of a sudden, I felt back in Central Europe during the Cold War.

But this was Washington, D.C., on October 30, 2010.

At the Rally to Restore Sanity in the imperial capital yesterday, the mood reminded me of my postings as a U.S. Foreign Service public diplomacy officer in Prague (1983-1985) and Krakow (1986-1990).

In Prague, working with the Jazz Section, I used the small garden of my "official" residence near the Vltava river (with its then ever-present swans) as a venue for Jazz concerts. Most of the Czechs attending these events were "dissidents" -- a hard word to define, but meaning persons (mostly young) who looked beyond the narrow, parochial views of a dinosaur communist regime. Humor and irony were an essential part of their politics. Living in an Orwellian society that was in many ways absurd, they used as sanity tools gentle you-know-what-I-mean winks, and, above all, music. The last thing on their minds was violence.

Our last jazz "concert" took place in a tram. The Section somehow got hold of a city tram and off we went -- about thirty of us -- riding around downtown Prague, in the heart of communist-controlled Central Europe, for some two hours, with jazz music blasting from a tape recorder, as we drank Soviet (if I remember its provenance correctly) champagne. A great American jazz group, the Louisiana Repertory Jazz Ensemble, happened to be in zlata Praha ("golden Prague") at the time, and took part in the on-rail festivities. Talk about a magical mystery tour!

In Krakow, home of one of Europe's oldest universities, the Piwnica pod Baranami, a cabaret full of wit and energy, was kind (and courageous) enough to establish contact with American diplomats. Its stellar cellar performances on late-night occasions were highlighted by the singing of Anna SzaƂapak, with whom it was impossible not to fall in love. After the cabaret returned from the United States on a tour, a reception was held in its honor at the American Consulate in Krakow. The leader of the group, the unforgettable Piotr Skrzynecki, brought a goat to the party.

I could imagine Piotr at the rally yesterday. He doubtless would have brought his goat with him.


America and the ‘Fun’ Generation

CAMBRIDGE, MASSACHUSETTS — From the first years of the American republic, a quiet battle has simmered over the words that denote the nation’s soul. And now a count can declare the victors: “achievement” and “fun.”

From the 1810s to the 2000s, the frequency of “achievement” in written American English grew elevenfold, according to a search of the Corpus of Historical American English, a database of 107,000 newspapers, magazines, novels, plays and film scripts. In the same period, the frequency of “fun” multiplied more than eightfold.

Meanwhile, another pair of words met an opposite fate. As talk of “achievement” soared over two centuries, the term “excellence” dropped out of favor, also elevenfold. As “fun” gained influence, mentions of “pleasure” fell by a factor of four.

In the history of language, words rise and fall. We make and remake them; they make and remake us.

The story of a word is as complex as a hurricane. It is difficult to know for sure how it catches on, meets new needs, acquires new valences. It is impossible to blame the decline of one word on the rise of another.

But in the destinies of these two pairs of words is a suggestion of a turning in American culture, and one that has influenced the world. It is a turning away from an arguably aristocratic idea of the intrinsic worth of things: from pleasure, with its sense of an internal condition of mind, to fun, so closely affiliated with outward activities; from excellence, an inner trait whose attainment is its own reward, to achievement, which comes through slogging and recognition.

Merriam-Webster defines “pleasure” as “a state of gratification,” while “fun” is “what provides amusement or enjoyment; specifically, playful, often boisterous, speech or action.” It defines “excellence” as “the quality of being excellent,” which in turn means “very good of its kind: eminently good.” “Achievement,” meanwhile, is “a result gained by effort.”

The arc of American usage from “pleasure” to “fun” can be traced in the corpus’s database. In an 1812 play, John Blake White wrote, “Wherefore wealth, if not to purchase pleasure? Wherefore health, if not to taste, when pleasure holds the cup and bids us drink.” By 2009, this line from the novelist Hyatt Bass was more typical: “Come on. Don’t you think it’s fun to have a bottle of wine that was released the same month you got engaged?”

“Pleasure” carries a hint of the sublime; it speaks of a state of mind that comes organically, that need not be artificially induced.

“Fun,” though almost synonymous with “pleasure” for contemporary speakers, often involves artificial inducement. You don’t feel fun; you do a fun thing. And fun has no hint of elitism, whereas pleasure vaguely does.

Gushing waterfalls provide pleasure; games of paintball, in which friends playfully (and sometimes painfully) shoot one another with pellets of paint, provide fun. A long, gabby dinner party may well be a “pleasure”; a crowded, sweat-laced night at the six-deep bar is more likely to be termed “fun.”

If “pleasure” comes from being and from talking through ideas, “fun” comes from doing and, often, switching off the brain. The change perhaps partly accounts for the American insistence on activities for all occasions, rather than trusting pleasure to develop on its own.

Rare is the American corporate retreat or after-Thanksgiving party that does not involve a skit or contest or session of Nintendo Wii. Where others might eat, drink and talk, Americans often create themes and talent shows.

In “Eat, Pray, Love,” the best-selling memoir by Elizabeth Gilbert, she describes discovering the Italian idea of pleasure as if it were a buried city: “During my first few weeks in Italy, all my Protestant synapses were zinging in distress, looking for a task. I wanted to take on pleasure like a homework assignment.” She concludes that “Americans have an inability to relax into sheer pleasure. Ours is an entertainment-seeking nation, but not necessarily a pleasure-seeking one.”

Italians, on the other hand, have mastered “bel far niente,” the beauty of doing nothing, Ms. Gilbert writes.

Then there is the arc from “excellence” to “achievement.” Consider this sentence, from an 1813 poem: “would she thus a moral teach; / That man should see, but never reach, / The height of excellence, and show / The vanity of works below?”

And this one, from a 2005 biography: “the young man pursues his dream while others scoff, he undertakes a lonely journey from the country to the city in search of fulfillment, overcomes obstacles with a combination of pluck, determination, and talent, and finally rises to heights of achievement and prosperity.”

“Excellence” evokes Aristotle with its overtones of virtue. Anyone can achieve, in garbage collection or neurosurgery, but how many can truly be excellent?

“The ancient Greek definition of happiness was the full use of your powers along lines of excellence,” President John F. Kennedy once said.

“Achievement” is a word more likely to come from American leaders today, and, like “fun,” it is outward in nature. It comes in doing specific things. It is more about checking boxes than fulfilling inner potentialities.

The achievement culture permeates life today. From elementary-school testing to the incessant pressure to overschedule as a university student, educational culture emphasizes the racking up of achievements over intellectual crackle. Wall Street stumbled in part because so many chased achiever bonuses while neglecting the pursuit of excellence in their vocation. An American culture of instantaneous celebrity teaches young people that fame is an end in itself rather than an incidental symptom of excellence in craft.

The world in which “pleasure” and “excellence” roared was less equitable than our world today. It shut out vast categories of humankind. In the intervening years, those exclusions dwindled; the world opened up for so many, not least in the United States. But with that change has come another: what would seem to be a growing intolerance for merely being, and an anguished insistence on doing, doing, doing.


Thursday, October 28, 2010

Guitars, Google, and guns: a new view of Western power

The Christian Science Monitor
As the West gears up for a NATO summit, free nations must consider how to be smarter about their tools of influence.


By Andras Simonyi, Markos Kounalakis
posted October 27, 2010 at 3:53 pm EDT

Budapest, Hungary —

In the run-up to the NATO summit Nov. 19 in Lisbon, the transatlantic community must confront not just the burning issues it faces (from Afghanistan to Russia), but the way free nations can and should wield their power for global progress.

Indeed, it needs to address the biggest questions of all: How is the free world going to lead in an age when its values are increasingly under attack? When it is facing threats and challenges unknown in the past? And when its economic model – the source of our power and freedoms – is being questioned?

Smart power

The buzzword for dealing with these challenges in the corridors of power in Washington and European capitals is "smart power." But a buzzword is no substitute for honest reflection. What the West needs most is a fresh look at the full range of its capabilities and interests. Only then can its power fulfill its purpose.

Seen as a wonder tool, smart power has been embraced as a fresh and benign aspect of power; a definably formulaic mix of soft (cultural) power and hard (military) power. The reality is that the need for hard power has not vanished. And soft power alone will never suffice to win a war, push down threatening dictators, or keep a peace. We still live in a world that requires both swords and plowshares.

Soft power has always had a place. During the cold war, rock songs by The Beatles, The Rolling Stones, and Janis Joplin played an important political role by inspiring a young, disaffected, and rebellious generation in Eastern Europe to help bring down the Iron Curtain.

Today, rock almost seems like a soft-power anachronism, along with most shortwave radio broadcasts; underwritten overseas English-language training; and other pricey, legacy public diplomacy programs paid for by the European Union and the United States.

In the past 20 years, the transatlantic community has expanded its military, political, and economic institutions, but it hasn't come forward with new ways to augment its arsenal of soft and hard power influence. At least not in a big way. America is sorting out where it erred with its extreme embrace of its military power after 9/11. But Europe, too, must reflect on why its global strategic and political influence is not on par with its economic might.

America is not Mars. Europe is not Venus.

A better – and smarter – use of both hard and soft power is necessary. A more efficient use of the softer aspects of power will mitigate the need for actual military intervention. In the transatlantic relationship, this calls for a sober realization that a division of labor between the US and Europe along the lines of hard and soft power is not viable. The US is not Mars and Europe is not Venus. They are both earthbound.

Power – hard or soft, American or European – is still just power and it is spectral. At the two ends of the visible power spectrum are the extremes: strategic nuclear forces at one end and cultural diplomacy on the other. Hot, hard war tactics are on the red end of the spectrum and cool, soft sells to societies are at the opposite, bluer end. There is a lot of space in between: for example, military assisted humanitarian actions or helping fight devastating disease in Africa.

Extending the metaphor of spectral power, we need also to understand that there are parts of the spectrum that are "invisible" until they strike.

Take nonstate actors. They have become a curse word in recent years, and many remain a source of worry and threat. The worst are invisible inhabitants of the hard end of the power spectrum, becoming visible only when they carry out devastating terrorist acts.

At the softer end, free societies have their own nonstate actors, too. Our innovative and maturing technologies – YouTube, Facebook, Google, and others – empower individuals worldwide. Unsung heroes of the Internet community figured out how to circumvent the Iranian regime's Internet control via proxy servers, for example. Technology by itself, however, is not a liberation panacea. It is just a tool. And tools – whether a match or a firewall – can be used for good as well as evil.

Power toolbox

The concept of spectral power is essentially a new way of looking at our power toolbox in a more integrated manner. Free and democratic countries, alliances, and organizations will have to begin to see more clearly the subtle colors, shades, and mixtures of power that a full and wide spectrum view allows. The most important expected result will be a framework that will help define a more efficient and effective use of our human, economic, military, scientific, and cultural assets.

It is a great consolation that, in the end, the full and sophisticated use of spectral power will only be effective in the hands of those who understand that lasting influence is never achieved by military force or economic influence alone, but by sharing values and solutions that simultaneously have benefits for both the global community and the individual.

Andras Simonyi is a former Hungarian ambassador to the United States and NATO. Markos Kounalakis is the former publisher and president of Washington Monthly and currently a senior fellow at the Center for Media and Communication Studies at Central European University in Budapest.

© The Christian Science Monitor. All Rights Reserved.

Tuesday, October 19, 2010

Abstract Expressionism and the State Department

"AbEx was apolitical, 'a type of artistic expression with no polemical axes to grind and no political agendas; moreover, its avatars, aside from a few token females, were 'real men,' two-fisted paint slingers like Jackson Pollock, to whom any taint of sexual nonconformity was anathema.' (Citation from Gary Indiana, Andy Warhol and the Can that Sold the World.)

Thus the US State Department was among its enthusiastic promoters abroad. Although a number of its major artists were privately gay, its public image was macho. 'By 1949', Indiana notes,

The Abstract Expressionists had coalesced into a men's club, headquartered at the Cedar Bar, which disparaged all but a handful of female artists, expressed real hatred of homosexuals, bathed in a sea of booze every night, and considered the only place for blacks in the arts a jazz club.

Although the style was adored and supported by influential gallery owners and critics, it was never popular."

--Elaine Showalter, "Brillo: New Assessments of Andy Warhol: via Wittgenstein and Viva, as master aesthete and ambulatory wig," Times Literary Supplement (October 15, 2010), p. 4

From Wikipedia:

"Abstract expressionism and the Cold War

"Since the mid-1970s, revisionist historians have argued that the style attracted the attention, in the early 1950s, of the CIA, who saw it as representative of the USA as a haven of free thought and free markets, as well as a challenge to both the socialist realist styles prevalent in communist nations and the dominance of the European art markets. Frances Stonor Saunders's book The Cultural Cold War: The CIA and the World of Arts and Letters (published in Britain as Who Paid the Piper?: The CIA and the Cultural Cold War) details how the CIA financed and organized the promotion of American abstract expressionists as part of cultural imperialism via the Congress for Cultural Freedom from 1950 to 1967. Against this revisionist tradition, an essay by Michael Kimmelman, chief art critic of The New York Times, called "Revisiting the Revisionists: The Modern, Its Critics and the Cold War," argues that much of this information (as well as the revisionists' interpretation of it) concerning what was happening on the American art scene during the 1940s and 50s is flatly false, or at best (contrary to the revisionists' avowed historiographic principles) decontextualized. Other books on the subject include Art in the Cold War by Christine Lindey, which also describes the art of the Soviet Union at the same time; and Pollock and After, edited by Francis Frascina, which reprinted the Kimmelman article."

Image: Willem De Kooning, Woman V, 1952–1953. De Kooning's series of Woman paintings in the early 1950s caused a stir in New York City avant-garde circles.

Monday, October 18, 2010

Molodtsi! ("Well done!") Russian spies get Kremlin's highest honors

Report: Russian spies get Kremlin's highest honors
The Associated Press
Monday, October 18, 2010; 11:48 AM

MOSCOW -- President Dmitry Medvedev bestowed the country's highest state honor Monday on the Russian sleeper agents deported from the United States as part of the countries' biggest spy swap since the Cold War, the Interfax news agency reported.

The awards were handed out at a Kremlin ceremony less than four months after the exchange, the agency quoted Medvedev spokeswoman Natalya Timakova as saying. No other details on the ceremony were available and Kremlin spokespeople were not immediately reachable.

In July, 10 Russian agents who infiltrated suburban America were deported in exchange for four people convicted in Russia of spying for the West.

The spies received a hero's welcome in Russia, with Prime Minister Vladimir Putin leading them in a patriotic singalong in July.

The most famous of the agents, Anna Chapman, visited the Baikonur cosmodrome in Kazakhstan this month for the launch of a Russian spaceship, fueling her celebrity in Russia and abroad.

Chapman was in Baikonur ostensibly as the new celebrity face of a Moscow bank.

FondServisBank, which works with Russian companies in the aerospace industry, said it had hired Chapman to bring innovation to its information technologies.

It did not escape Russians' attention that the initials of the bank, FSB, are the same as Russia's main spy agency.

Although Russia has now reportedly given the spies the country's top state honor, the U.S. court complaint against the flame-haired Chapman and her alleged cohorts described their many spying blunders, leading to some embarrassing coverage for Russia's Foreign Intelligence Service in the Western press.

Putin, who served as a spy in East Germany before going into politics, said in July that he had met with the spies to celebrate their return and warned that the "traitors" who exposed them could end up "in a ditch."

Russia and the United States have both said the spy scandal would not interfere with the improving tone in their relations.

Saturday, October 16, 2010

A Portrait of Putin as a Young Man (with apologies to Norman Rockwell)


Putin image; Boy Scout image by Norman Rockwell

Monday, October 11, 2010

Rockwell and North Korean Propaganda

North Korean propaganda leaflet
From Christmas - Home - Happiness, a propaganda leaflet. The text reads: "Frozen Rations eaten on the run. Any moment he may have to run again, to fight or die - and so may you. Those who love you want you back home safe and sound. FIND A WAY OUT! It's no disgrace to quit fighting in this unjust war!" Leaflet 10/3/1

Image: Rockwell's Freedom from Want (1943)


Friday, October 8, 2010

Walter Isaacson: The Declaration of Independence as "a work of propaganda"

A Declaration of Mutual Dependence
The New York Times
July 4, 2004

Amid all the hot dogs and fireworks, it's useful to reflect for a moment on precisely what we are celebrating today. Yes, Americans know that the Fourth of July is about independence and an aversion to colonialism -- but what was that sacred parchment to which the founders affixed their John Hancocks really all about, and why is it relevant today?

By July 1776, the Continental Congress had concluded not only that the American colonies ought to be independent, but also that they needed to explain why to the rest of the world. Thomas Jefferson, who received the honor of writing the first draft of this document, was very direct about the motivation in his first sentence: "a decent respect to the opinions of mankind" required the founders to explain what they were doing.

Thus the Declaration of Independence is, in effect, a work of propaganda -- or, to put it more politely, an exercise in public diplomacy intended to enlist other countries to the cause.

If you are trying to persuade people to join with you, there are three general methods. You can coerce them with threats, convince them by pointing out their own interests, or entice them by appealing to their ideals. Those who run businesses or, for that matter, who have teenage children, know how each of these approaches works.

One can imagine the founders trying the first approach on France and other European countries in 1776. We are breaking away from Britain, they could have said, and you're either for us or against us. If you're against us, your ships are not safe near our shores, your future trade is at risk, and if we win you might as well forget about the fur trade and navigating on the Mississippi River.

Or, they could have used the second tack. The continental Europeans, they could have pointed out, had been fighting England off and on for four centuries or so, and the best way to shift the balance of power would be by driving a wedge between England and its colonies and forging treaties of friendship with America.

Instead, they tried the third method: they appealed to the values and the ideals of potential allies.

Because they were Enlightenment thinkers, the drafters of the declaration, particularly Jefferson and Benjamin Franklin, began by positing basic premises, an analytic approach that reflected the philosophical methods of John Locke and the scientific method of Isaac Newton. People are created equal, they postulated, and they have certain unalienable rights.

Where did these axioms come from? At first, the founders foundered a bit in figuring that out. "We hold these truths to be sacred and undeniable," Jefferson wrote in his initial rough draft. Franklin crossed this out with his heavy printer's pen and changed it to "we hold these truths to be self-evident." Drawing on the concepts of his friend David Hume, Franklin believed that the truths were grounded in rationality and reason, not in the dictates or dogma of any particular religion.

Similarly, Jefferson originally noted that "from that equal creation they derive rights inherent and inalienable." John Adams, a product of Puritan Massachusetts, appears to be the one who suggested that this be amended to, "they are endowed by their Creator with certain inalienable rights." But whatever the provenance of these basic premises, it was clear what this meant for the role and the legitimacy of governments: "To secure these rights, governments are instituted among men, deriving their just powers from the consent of the governed." A nice concept.

In order to make these ideals a reality, however, the Americans had to get France in on their side. Even back then, the French were a bit difficult, so Congress sent Franklin, by then in his 70's, to woo them. He wrote some brilliant balance-of-power memos appealing to France's interests, but then he did something unusual: he began appealing to France's ideals as well. He built a press at his house on the outskirts of Paris and printed the declaration and other inspiring documents from America to show the French that the colonists were fighting for the ideals of liberty and out of an aversion to tyranny, causes that were welling up in their country as well. It worked. France joined our cause in 1778 and helped make sure that we won.

These are the same values -- liberty and aversion to tyranny -- that America still shares with the French and our other natural allies. But unlike the founders, we are not as willing to court the hearts and minds of others. Rather than caring for the opinions of mankind, President Bush jokes, "Call my lawyer," when the concept of international law is raised. Defense Secretary Donald Rumsfeld saw little need to distribute the Geneva Convention rules to American soldiers dealing with prisoners.

Machiavelli famously advised his prince that it was better to be feared than loved. By that standard, the United States is doing rather well. Alas, this is not a formula for winning a war against terrorism and the spread of dangerous weapons. We need allies who will want to help not because we scare them but because they share our values.

This will require leadership that values the role of diplomacy and doesn't scoff at international law. It will take ambassadors who do not cower behind barricades in their embassies but instead engage in the arena of ideas and values. It will require filling the vacant State Department job for public diplomacy with a competent person who actually believes in the mission. It will require leaders who display a decent respect to the opinions of mankind.

It was the appeal of America's values -- and the vision of statesmen like Jefferson and Franklin who were willing to engage in a war of ideas -- that won us independence. Likewise, it was the appeal of America's values -- and the vision of wise leaders who were willing to engage in a war of ideas -- that assured victory over communism in the cold war. Both of those generations realized that ideas had power which would prove stronger than our weapons. Now we are losing the war of ideas and ideals around the world.

This failure would dismay the founders, for they knew the power of those self-evident truths that they proclaimed 228 years ago: that people are entitled to liberty and that their rights should be guaranteed by a government whose legitimacy comes from their consent. These were inspiring ideals then, and they remain so today. The founders had the pride to realize that they could enlist legions to this noble cause. But they also had the humility to realize that this required a decent respect for the opinions of mankind.

Walter Isaacson, president of the Aspen Institute, is the author of "Benjamin Franklin: An American Life."

Thursday, October 7, 2010

Mind Games: A brief history of information warfare

By Philip M. Taylor | Oct 7, 2010 7:00 AM

Israeli information warriors—both government operatives and the media they work to manipulate—fail to understand global media warfare. Their mishandling of the crises in Lebanon and Gaza and the recent flotilla incident has cast Israel as the villain in the global theater of war and conflict. Israel has not understood the difference between how it sees itself and how others perceive its actions, or if it has, it seems not to care whether the audience dislikes its performance on the Middle Eastern stage. Many Western observers are bemused by the actions of a democratic nation that arguably has the best cause in the world but the worst propaganda, especially on a regional and wider global level. Perhaps Israel is simply too preoccupied with domestic opinion, which, in reality, can be relied upon to be largely patriotic, given that the Jewish state is bordered by so many hostile neighbors.

Yet Israel is hardly alone in failing to grasp the forces that have reshaped the nature of armed conflict in our 21st-century global information society, in which perception is almost as important as—some would say more important than—reality. The term “information warfare” first gained currency at the end of the 1980s as the Cold War was drawing to a close. Indeed, the Gulf War of 1991 was labeled by some analysts as the “first information war.” Ever since, the phrase has entered popular media parlance, while academic journals, international conferences, and even scholarly institutes have been created for its analysis. It is sometimes used interchangeably with “media warfare,” but as it also became a military doctrine the relationship of IW with media relations—or military public affairs—can sometimes cause confusion in military parlance, where it has been replaced by the broader term “information operations.”

Of course, the use of information in warfare has always been a vital component of military strategy. The side with the best intelligence about its adversaries’ capabilities, troop sizes, equipment, and disposition, together with an understanding of the terrain, psychology, motivation, and even the weather conditions that were likely to affect the outcome of battles, has always enjoyed a greater likelihood of victory—from Alexander the Great to today’s commanders in Iraq and Afghanistan. Information operations, or IO, have increased in military significance as modern conflicts have shifted from conventional war-fighting to counterinsurgency strategies that require greater attention to so-called “hearts and minds.”

IO as a military doctrine is broadly seen as a toolbox of capabilities consisting of computer network operations, electronic warfare, operational security, psychological operations, and deception. Computer network operations, or CNO, are about defending one’s own computer-based military systems—information assurance—as well as attacking adversaries’ systems. The attempt by NATO to destroy the broadcasts of Radio Television Serbia during the 1999 Kosovo conflict is often cited as an example of the latter, but that was really media warfare; the attempt to disrupt Serbian command-and-control capabilities is a much better example of electronic warfare. (The Stuxnet worm that has attacked the control systems of the Bushehr nuclear reactor in Iran also comes to mind.) Combined CNO and electronic warfare are what most people think of when talking about information warfare, such as the devastating 2007 cyberattacks on Estonia’s financial and other computer services by suspected Russian info-warriors resentful about the move of a memorial to Soviet World War II soldiers.

As the military doctrine of information warfare was emerging throughout the 1990s, there was an obsession with the new technology that was increasingly driving a revolution in military affairs—from cameras on the noses of smart missiles navigated by GPS services coordinated by satellites to the widespread take-up of Internet access, email, and cell telephony. The military began to talk of “asymmetric warfare,” in which a militarily inferior opponent could inflict significant damage through computer-based technologies using viruses, worms, trojans, and other “info-bombs” in cyber or hacker warfare. The threat most feared was an “electronic Pearl Harbor.” What came instead was Sept. 11: an attack that may have been coordinated partly using the Internet, but that was carried out by people who piloted old-fashioned airplanes into the World Trade Center and the Pentagon.

With the Cold War won, the U.S. government had also downgraded its international information programs, culminating in the closure of the U.S. Information Agency in 1999, creating a space that adversaries were eager and able to fill with a new kind of asymmetric warfare. Especially in places like the Middle East, terrorist groups were able to internationalize themselves quickly and at a very low cost by tapping into the global power of the World Wide Web. The Palestinians were among the first to demonstrate how local causes could be internationalized via new media as the so-called “Electronic Intifada” became the most potent force-multiplier in the arsenals of Fatah and Hamas. Terrorist groups like al-Qaida, popularly thought of as struggling for a return to medieval values, embraced new media technologies not just to coordinate their planned violent attacks but also to disseminate their messages and recruit followers from the worldwide Muslim community, the Ummah.

The Sept. 11 attacks also demonstrated how sophisticated terrorists are in exploiting the old media to wage their new kind of warfare. Striking at rush hour, when so many TV stations have traffic helicopters patrolling the skies above the cities, helped ensure that the attacks would be captured live (in real time) on television—and it worked. How many people have described watching those terrible scenes on live television that day as like watching a movie? From the terrorists’ point of view, that was precisely the point—especially given the importance in Islamic thought of bearing witness to so-called acts of martyrdom. Terrorists also understand that their acts of violence are unlikely to achieve a military victory—they are acts of theater designed to strike fear into their opponents and instill pride in their supporters. Their main target is the audience, not the victims of their violence—especially the chattering classes most likely to express horror and disgust.

So, the declaration of a “global war on terror,” fought primarily with kinetic weaponry—guns, bombs, and drones—played into the terrorists’ hands. Acts of violence might repulse most sensible people, and the waging of a kinetic war, first in Afghanistan, then in Iraq, and again in Afghanistan, did precisely that, especially among the Islamic Ummah. Anti-Americanism, even in non-Islamic countries, grew to unprecedented levels as the war on terror dragged on to twice the length of World War II. The Internet became the primary battle space for anti-American and anti-Western propaganda about a renewed crusade against Islam, a clash of civilizations, and a Zionist-Christian plot to subordinate Muslims everywhere.

It is impossible not to conclude that the war on terror completely missed the point of what Sept. 11 was about. Although the ongoing fighting has been re-branded by the Obama Administration as the “struggle against violent extremism”—an improvement over the word “war”—the realization that a war of ideas is what’s really happening has come far too late in a conflict in which words and images matter and the primary battle spaces are Google, YouTube, and Facebook. For good or ill, the citizen information warriors who fight these conflicts are the bloggers, citizen journalists, and digital eyewitnesses who disseminate images from Abu Ghraib or Afghan weddings to a global audience.

It is in this virtual theater that the real war is now being fought. From just a handful of extreme jihadist websites in 2001, there are now thousands. We are in the era of Web 2.0, in which interactivity rather than just the passive receipt of information is the norm: This is a space in which it is impossible, to use military jargon again, to take command and control or achieve full-spectrum dominance. It is also a strategic space in which military doctrines like information operations have real limitations. IO embraces the use of military deception, but whereas terrorist organizations don’t play by the same rules when it comes to information and disinformation, democratic military organizations do have a degree of accountability. If they lie deliberately, they will get found out in an era characterized by the near-impossibility of keeping secrets. And if that happens, the credibility of any truthful messaging they may disseminate will be irreparably damaged. IO has proved useful, with varying degrees of success, in the real war theaters of Iraq and Afghanistan, but only at the tactical and operational levels of command. For al-Qaida, it is the main tool at the strategic level of communication.

Although the United States and its allies are now in the process of developing strategic communications capabilities for conducting “global engagement” (another rebranding by the Obama Administration), the early hopes for a return to non-military information strategies raised by the president’s 2009 Cairo speech are as yet unfulfilled. As long as the war in Afghanistan continues in its current kinetic surge, there is little likelihood of short-term success in the information domain. Many in the audience will continue to think they are watching a tragedy, and no matter how well the military actors perform, war is no laughing matter.

Philip M. Taylor is a professor of international communications at the University of Leeds, U.K.


Article printed from Tablet Magazine.

Tuesday, October 5, 2010

Public Diplomacy: Walter Lippmann and the Committee on Public Information (1917-1919)

"Was Walter Lippmann, that key figure in the study and shaping of US public opinion in the 20th century, a member of the Committee on Public Information (CPI, 1917-1919), arguably the first USG 'propaganda' agency in the US? Some esteemed scholarly studies state that he was. However, other sources -- which I consider more reliable on this issue -- do not. The journalist George Creel -- the head of the CPI -- was described by his boss, Woodrow Wilson, as 'a man with a passion for adjectives.'

Lippmann, by contrast, was analytical and philosophical. This difference reflects the centuries-old debate between the philosopher (Lippmann) and the rhetorician (Creel). The Lippmann/Creel divide -- thought vs. hype, to put it crudely -- is essential to understanding the nature and inner tension of USG overseas propaganda in the 20th century. And it has important implications for US public diplomacy today.

Hence I am working on an article confirming that, in fact, Lippmann was not a member of the CPI, and indeed was critical of it."