Ten Things Never to Say or Do in Russia - By Andrew Kaufman, Ph.D., Serafima Gettys, Ph.D., and Nina Wieda, dummies.com:
Sometimes, knowing what NOT to do is even more important if you want to fit in or at least produce a good impression. Read on to find out about ten Russian social taboos.
Don't come to visit empty-handed
If you're invited over for dinner, or just for a visit, don't come to a Russian house with nothing. What you bring doesn't really matter — a box of chocolates, flowers, or a small toy for a child. Russian hosts prepare for company by cooking their best dishes and buying delicacies that they normally wouldn't for themselves. If, after all this effort, a guest shows up without even a flower, Russians believe he doesn't care.
Don't leave your shoes on in someone's home
Russian apartments are covered in rugs. Often, they're expensive Persian rugs with intricate designs, which aren't cleaned as easily as traditional American carpeting. Besides, Russians walk a lot through dusty streets, instead of just stepping from the car directly into the home. For these reasons, and also because this tradition has gone on for centuries, Russians take off their street shoes when they enter private residences. The host usually offers a pair of tapochki (tah-puhch-kee; slippers); if you go to a party, women usually bring a pair of nice shoes to wear inside. And again, if you fail to take your shoes off, nobody will say anything. But sneak a peek: Are you the only person wearing your snow-covered boots at the dinner table?
Don't joke about the parents
Russians aren't politically correct. Go ahead and tell an anyekdot (uh-neek-doht; joke) based on ethnicity, appearance, or gender stereotypes; just steer clear of jokes about somebody's mother or father. You won't be understood.
Don't toast with "Na Zdorov'ye!"
People who don't speak Russian usually think that they know one Russian phrase: a toast, Na Zdorov'ye! Little do they know that Na Zdorov'ye! (nuh zdah-rohv'-ee; for health) is what Russians say when somebody thanks them for a meal. In Polish, indeed, Na Zdorov'ye! or something close to it, is a traditional toast. Russians, on the other hand, like to make up something long and complex, such as, Za druzhbu myezhdu narodami! (zah droozh-boo myezh-doo nuh-roh-duh-mee; To friendship between nations!) If you want a more generic Russian toast, go with Za Vas! (zuh vahs; To you!)
Don't take the last shirt
A Russian saying, otdat' poslyednyuyu rubashku (aht-daht' pahs-lyed-nyu-yu roo-bahsh-koo; to give away one's last shirt), makes the point that you have to be giving, no matter what the expense for yourself. In Russia, offering guests whatever they want is considered polite. Those wants don't just include food or accommodations; old-school Russians offer you whatever possessions you comment on, like a picture on the wall, a vase, or a sweater.
Now, being offered something doesn't necessarily mean you should take it. Russians aren't offering something because they want to get rid of it; they're offering because they want to do something nice for you. So, unless you feel that plundering their home is a good idea, don't just take things offered to you and leave. Refuse first, and do so a couple of times, because your hosts will insist. And only accept the gift if you really want this special something, but then return the favor and give your hosts something nice, as well.
Don't underdress
Russians dress up on more occasions than Americans do. Even to go for a casual walk, a Russian woman may wear high heels and a nice dress. A hardcore feminist may say women do this because they're victimized and oppressed. But Russian women themselves explain it this way, "We only live once; I want to look and feel my best."
On some occasions, all foreigners, regardless of gender, run the risk of being the most underdressed person in the room. These occasions include dinner parties and trips to the theater. Going to a restaurant is also considered a festive occasion, and you don't want to show up in your jeans and T-shirt, no matter how informal you think the restaurant may be. In any case, checking on the dress code before going out somewhere is a good idea.
Don't go dutch
Here's where Russians differ strikingly from Western Europeans. They don't go Dutch. So, if you ask a lady out, don't expect her to pay for herself, not at a restaurant or anywhere else. You can, of course, suggest that she pay, but that usually rules out the possibility of seeing her again. She may not even have money on her. Unless they expect to run into a maniac and have to escape through the back exit, Russian women wouldn't think of bringing money when going out with a man.
Don't let a woman carry something heavy
This rule may make politically correct people cringe, but Russians believe that a man is physically stronger than a woman. Therefore, they believe a man who watches a woman carry something heavy without helping her is impolite.
Don't overlook the elderly on public transportation
When Russians come to America and ride public transportation, they're very confused to see young people sitting when an elderly person is standing nearby. They don't understand that in America, an elderly person may be offended when offered a seat. In Russia, if you don't offer the elderly and pregnant women a seat on a bus, the entire bus looks at you as if you're a criminal. Women, even (or especially) young ones, are also offered seats on public transportation. But that's optional. Getting up and offering a seat to an elderly person, on the other hand, is a must.
Don't burp in public
Bodily functions are considered extremely impolite in public, even if the sound is especially long and expressive, and the author is proud of it.
Moreover, if the incident happens (we're all human), don't apologize. By apologizing, you acknowledge your authorship, and attract more attention to the fact. Meanwhile, Russians, terrified by what just happened, pretend they didn't notice, or silently blame it on the dog. Obviously, these people are in denial. But if you don't want to be remembered predominantly for this incident, steer clear of natural bodily functions in public.
Via NI on facebook
Monday, February 27, 2012
Monday, February 20, 2012
Strategy
"Strategy is a quintessential American Century word, ostensibly connoting knowingness and sophistication. Whether working in the White House, the State Department, or the Pentagon, strategists promote the notion that they can anticipate the future and manage its course. Yet the actual events of the American Century belie any such claim. Remember when Afghanistan signified victory over the Soviet empire? Today, the genius of empowering the mujahedin seems less than self-evident.
Strategy is actually a fraud perpetrated by those who covet power and are intent on concealing from the plain folk the fact that the people in charge are flying blind. With only occasional exceptions, the craft of strategy was a blight on the American Century. ...
Having learned nothing from the American Century, present-day strategists—the ones keen to bomb Iran, confront China, and seize control of outer space as the 'ultimate high ground'—will continue the practice of doing Mayhem's bidding. As usual, the rest of us will be left to cope with the havoc that results, albeit this time without the vast reserves of wealth and power that once made an American Century appear plausible. Brace yourself."
--Andrew J. Bacevich, professor of history and international relations at Boston University
Culture Vs. Strategy Is A False Choice
By Bob Frisch, fastcompany.com
[JB Comment: "Strategy" may be applicable to certain narrowly-focused business practices, but "culture" (of course, not quite in the sense the author of this article means it, but rather as being a recognition that the world is diverse and complicated and cannot be shaped by imperial commands) is nevertheless important (even for "planned" operations) in that it allows for improvisation and adaptation to immediate needs and circumstances. In the case of on-the-ground American diplomacy, an awareness of the vagaries and complexities of local cultures (and of US diplomats navigating in them) is, in my view, far more important for long-term American national interests than these diplomats blindly following a narrow "strategic plan" from headquarters, pronouncements all-too-often produced by inside-the-beltway operatives who know little about other countries at a concrete, down-to-earth level (not to speak of their ignorance of these countries' languages or history). For anybody who lives in the real world, "strategy," in contrast to thought or experience, is just another word for pedantic idiocy and limitless arrogance. Want to stay honest and do good on our small planet? Pick modest doubt over grandiose "strategy," camouflaged with its phony "options" and buried footnotes to keep criticism at bay].
Strategy seems to have fallen on hard times. In his recent Fast Company piece “Culture Eats Strategy for Lunch [1],” author Shawn Parr joins a long list of commentators, psychologists, authors, and consultants who’ve used that dietary line to argue that company culture is a greater determinant of success than competitive strategy.
A strong culture is important, and for all the reasons Parr mentions: employee engagement, alignment, motivation, focus, and brand burnishing. But is it the most important element of company success, as the more ferocious of the culture warriors assert? Is long-term success, as Parr writes, “dependent on a culture that is nurtured and alive”? If history is any guide, the answer to both questions is no.
Certainly, Southwest Airlines has a great culture and funny flight attendants. Employees seem genuinely enthusiastic about their employer. But Southwest also has a great strategy: no-frills service, a young fleet with a limited number of planes flying mostly short hops from formerly secondary airports, and inexpensive and flexible labor agreements relative to other airlines. And if that strategy ever stops paying off, the jokes will likely go sour and the culture south. Pan Am, too, had a great culture, but was strategically unprepared to deal with oil shocks and a decline in demand for air travel.
Parr attributes the success of Zappos to a culture that is “inclusionary, encouraging, and empowering.” Customer service representatives write zany emails and company leaders have often affirmed their belief that if you get culture right, success follows. But Zappos also has fast delivery, deep inventory, a 365-day return policy, and free shipping both ways. That’s a strategy--not a culture--and if it weren’t competitive with catalog sellers and mall shoe stores, all the culture in the world would make little difference. Pets.com and Webvan, two companies that appeared at about the same time as Zappos, didn’t disappear because they had weak cultures. They simply had unworkable economic models.
Businesses are economic as well as human entities, and need to be built on a solid base of sustainable competitive advantage. Culture can reinforce strategy, as it does Zappos’ strategy of customer convenience. But it can’t prevail if a strategy is poorly conceived or the company faces competitors with superior strategies, resources, and positioning. As Damon Runyon wrote, “the race is not always to the swift nor the victory to the strong, but that's how you bet."
In the business world, it’s easy to take a handful of current winners, give them a backstory about their cultures and conclude, like Parr, that “authenticity and values always win out.” Always? Walmart is the winner in retail. McDonald's serves more meals than anyone else. And yet they're hardly praised as paragons of superior culture. American manufacturing has moved overseas to Asia, and I don't see anyone writing positive articles about the culture at Foxconn.
Parr is correct that the culture of the U.S. Marines has few parallels in terms of its strength, depth, and the commitment to mission it engenders. But ask any Marine commander which he would prefer to go into battle with--superior morale or superior strategic position--and he would tell you he wanted both.
Ultimately, the culture versus strategy question is a false choice. It's like asking whether you would rather back a great poker player with weak cards or an average player with great cards. You’re more likely to win when you have both: a great player and great cards. The same goes for culture and strategy. You don’t have to choose. Culture doesn’t eat strategy, and the company that lets culture do so is likely to starve.
--Author Bob Frisch is the author of Who’s in the Room? How Great Leaders Structure and Manage the Teams Around Them [2] (Jossey-Bass) and managing partner of The Strategic Offsites Group [3]. Previously he was a Managing Partner at Accenture, held leadership roles at Gemini Consulting and The Boston Consulting Group, and headed corporate strategy for Dial and Sears. He earned degrees from Tufts and the Yale School of Management.
Saturday, February 18, 2012
Is there a new geek anti-intellectualism?
Larry Sanger Blog: Is there a new geek anti-intellectualism?
[JB COMMENT: This posting makes an important point: That knowledge is not simply accessing information on a computer, but also enriching our lives by making intellectual discoveries part of ourselves as human beings through memory.]
Is there a new anti-intellectualism? I mean one that is advocated by Internet geeks and some of the digerati. I think so: more and more mavens of the Internet are coming out firmly against academic knowledge in all its forms. This might sound outrageous to say, but it is sadly true.
Let’s review the evidence.
1. The evidence
Programmers have been saying for years that it’s unnecessary to get a college degree in order to be a great coder–and this has always been easy to concede. I never would have accused them of being anti-intellectual, or even of being opposed to education, just for saying that. It is just an interesting feature of programming as a profession–not evidence of anti-intellectualism.
In 2001, along came Wikipedia, which gave everyone equal rights to record knowledge. This was only half of the project’s original vision, as I explain in this memoir. Originally, we were going to have some method of letting experts approve articles. But the Slashdot geeks who came to dominate Wikipedia’s early years, supported by Jimmy Wales, nixed this notion repeatedly. The digerati cheered and said, implausibly, that experts were no longer needed, and that “crowds” were wiser than people who had devoted their lives to knowledge. This ultimately led to a debate, now old hat, about experts versus amateurs in the mid-2000s. There were certainly notes of anti-intellectualism in that debate.
Around the same time, some people began to criticize books as such, as an outmoded medium, and not merely because they are traditionally paper and not digital. The Institute for the Future of the Book has been one locus of this criticism.
But nascent geek anti-intellectualism really began to come into focus around three years ago with the rise of Facebook and Twitter, when Nicholas Carr asked, “Is Google making us stupid?” in The Atlantic. More than by Carr’s essay itself, I was struck by the reaction to it. Altogether too many geeks seemed to assume that if information glut is sapping our ability to focus, this is largely out of our control and not necessarily a bad thing. But of course it is a bad thing, and it is in our control, as I pointed out. Moreover, focus is absolutely necessary if we are to gain knowledge. We will be ignoramuses indeed, if we merely flow along with the digital current and do not take the time to read extended, difficult texts.
Worse still was Clay Shirky’s reaction in the Britannica Blog, where he opined, “no one reads War and Peace. It’s too long, and not so interesting,” and borrows a phrase from Richard Foreman in claiming, “the ‘complex, dense and “cathedral-like” structure of the highly educated and articulate personality’ is at risk.” As I observed at the time, Shirky’s views entailed that Twitter-sized discourse was our historically determined fate, and that, if he were right, the Great Books and civilization itself would be at risk. But he was not right–I hope.
At the end of 2008, Don Tapscott, author of Wikinomics, got into the act, claiming that Google makes memorization passé. “It is enough that they know about the Battle of Hastings,” Tapscott boldly claimed, “without having to memorise that it was in 1066. [Students] can look that up and position it in history with a click on Google.”
In 2010, Edge took up the question, “Is the Internet changing the way you think?” and the answers were very sobering. Here were some extremely prominent scientists, thinkers, and writers, and all too many of them were saying again, more boldly, that the Internet was making it hard to read long pieces of writing, that books were passé, and that the Internet was essentially becoming a mental prosthesis. We were, as one writer put it, uploading our brains to the Internet.
As usual, I did not buy the boosterism. I was opposed to the implicit techno-determinism as well as the notion that the Internet makes learning unnecessary. Anyone who claims that we do not need to read and memorize some facts is saying that we do not need to learn those facts. Reading and indeed memorizing are the first, necessary steps in learning anything.
This brings us to today. Recently, Sir Ken Robinson has got a lot of attention by speaking out–inspiringly to some, outrageously to others–saying that K-12 education needs a sea change away from “boring” academics and toward collaborative methods that foster “creativity.” At the same time, PayPal co-founder Peter Thiel sparked much discussion by claiming that there is a “higher education bubble,” that is, the cost of higher education greatly exceeds its value. This claim by itself is somewhat plausible. But Thiel much less plausibly implies that college per se is now not recommendable for many, because it is “elitist.” With his Thiel Fellowship program he hopes to demonstrate that a college degree is not necessary for success in the field of technology. Leave it to a 19-year-old recipient of one of these fellowships to shout boldly that “College is a waste of time.” Unsurprisingly, I disagree.
2. Geek anti-intellectualism
In the above, I have barely scratched the surface. I haven’t mentioned many other commentators, blogs, and books that have written on such subjects. But this is enough to clarify what I mean by “geek anti-intellectualism.” Let me step back and sum up the views mentioned above:
1. Experts do not deserve any special role in declaring what is known. Knowledge is now democratically determined, as it should be. (Cf. this essay of mine.)
2. Books are an outmoded medium because they involve a single person speaking from authority. In the future, information will be developed and propagated collaboratively, something like what we already do with the combination of Twitter, Facebook, blogs, Wikipedia, and various other websites.
3. The classics, being books, are also outmoded. They are outmoded because they are often long and hard to read, so those of us raised around the distractions of technology can’t be bothered to follow them; and besides, they concern foreign worlds, dominated by dead white guys with totally antiquated ideas and attitudes. In short, they are boring and irrelevant.
4. The digitization of information means that we don’t have to memorize nearly as much. We can upload our memories to our devices and to Internet communities. We can answer most general questions with a quick search.
5. The paragon of success is a popular website or well-used software, and for that, you just have to be a bright, creative geek. You don’t have to go to college, which is overpriced and so reserved to the elite anyway.
If you are the sort of geek who loves all things Internet uncritically, then you’re probably nodding your head to these. If so, I submit this as a new epistemological manifesto that might well sum up your views:
You don’t really care about knowledge; it’s not a priority. For you, the books containing knowledge, the classics and old-fashioned scholarship summing up the best of our knowledge, the people and institutions whose purpose is to pass on knowledge–all are hopelessly antiquated. Even your own knowledge, the contents of your mind, can be outsourced to databases built by collaborative digital communities, and the more the better. After all, academics are boring. A new world is coming, and you are in the vanguard. In this world, the people who have and who value individual knowledge, especially theoretical and factual knowledge, are objects of your derision. You have contempt for the sort of people who read books and talk about them–especially classics, the long and difficult works that were created alone by people who, once upon a time, were hailed as brilliant. You have no special respect for anyone who is supposed to be “brilliant” or even “knowledgeable.” What you respect are those who have created stuff that many people find useful today. Nobody cares about some Luddite scholar’s ability to write a book or get an article past review by one of his peers. This is why no decent school requires reading many classics, or books generally, anymore–books are all tl;dr for today’s students. In our new world, insofar as we individually need to know anything at all, our knowledge is practical, and best gained through projects and experience. Practical knowledge does not come from books or hard study or any traditional school or college. People who spend years of their lives filling up their individual minds with theoretical or factual knowledge are chumps who will probably end up working for those who skipped college to focus on more important things.
Do you find your views misrepresented? I’m being a bit provocative, sure, but haven’t I merely repeated some remarks and made a few simple extrapolations? Of course, most geeks, even most Internet boosters, will not admit to believing all of this manifesto. But I submit that geekdom is on a slippery slope to the anti-intellectualism it represents.
So there is no mistake, let me describe the bottom of this slippery slope more forthrightly. You are opposed to knowledge as such. You contemptuously dismiss experts who have it; you claim that books are outmoded, including classics, which contain the most significant knowledge generated by humankind thus far; you want to memorize as little as possible, and you want to upload what you have memorized to the net as soon as possible; you don’t want schools to make students memorize anything; and you discourage most people from going to college.
In short, at the bottom of the slippery slope, you seem to be opposed to knowledge wherever it occurs, in books, in experts, in institutions, even in your own mind.
But, you might say, what about Internet communities? Isn’t that a significant exception? You might think so. After all, how can people who love Wikipedia so much be “opposed to knowledge as such”? Well, there is an answer to that.
It’s because there is a very big difference between a statement occurring in a database and someone having, or learning, a piece of knowledge. If all human beings died out, there would be no knowledge left even if all libraries and the whole Internet survived. Knowledge exists only inside people’s heads. It is propagated not by being accessed in a database search, but by being learned and mastered. A collection of Wikipedia articles about physics contains text; the mind of a physicist contains knowledge.
[JB COMMENT: This posting makes an important point: That knowledge is not simply accessing information on a computer, but also enriching our lives by making intellectual discoveries part of ourselves as human beings through memory.]
Is there a new anti-intellectualism? I mean one that is advocated by Internet geeks and some of the digerati. I think so: more and more mavens of the Internet are coming out firmly against academic knowledge in all its forms. This might sound outrageous to say, but it is sadly true.
Let’s review the evidence.
1. The evidence
Programmers have been saying for years that it’s unnecessary to get a college degree in order to be a great coder–and this has always been easy to concede. I never would have accused them of being anti-intellectual, or even of being opposed to education, just for saying that. It is just an interesting feature of programming as a profession–not evidence of anti-intellectualism.
In 2001, along came Wikipedia, which gave everyone equal rights to record knowledge. This was only half of the project’s original vision, as I explain in this memoir. Originally, we were going to have some method of letting experts approve articles. But the Slashdot geeks who came to dominate Wikipedia’s early years, supported by Jimmy Wales, nixed this notion repeatedly. The digerati cheered and said, implausibly, that experts were no longer needed, and that “crowds” were wiser than people who had devoted their lives to knowledge. This ultimately led to a debate, now old hat, about experts versus amateurs in the mid-2000s. There were certainly notes of anti-intellectualism in that debate.
Around the same time, some people began to criticize books as such, as an outmoded medium, and not merely because they are traditionally paper and not digital. The Institute for the Future of the Book has been one locus of this criticism.
But nascent geek anti-intellectualism really began to come into focus around three years ago with the rise of Facebook and Twitter, when Nicholas Carr asked, “Is Google making us stupid?” in The Atlantic. More than by Carr’s essay itself, I was struck by the reaction to it. Altogether too many geeks seemed to assume that if information glut is sapping our ability to focus, this is largely out of our control and not necessarily a bad thing. But of course it is a bad thing, and it is in our control, as I pointed out. Moreover, focus is absolutely necessary if we are to gain knowledge. We will be ignoramuses indeed, if we merely flow along with the digital current and do not take the time to read extended, difficult texts.
Worse still was Clay Shirky’s reaction in the Britannica Blog, where he opined, “no one reads War and Peace. It’s too long, and not so interesting,” and borrowed a phrase from Richard Foreman in claiming, “the ‘complex, dense and “cathedral-like” structure of the highly educated and articulate personality’ is at risk.” As I observed at the time, Shirky’s views entailed that Twitter-sized discourse was our historically determined fate, and that, if he were right, the Great Books and civilization itself would be at risk. But he was not right–I hope.
At the end of 2008, Don Tapscott, author of Wikinomics, got into the act, claiming that Google makes memorization passé. “It is enough that they know about the Battle of Hastings,” Tapscott boldly claimed, “without having to memorise that it was in 1066. [Students] can look that up and position it in history with a click on Google.”
In 2010, Edge took up the question, “Is the Internet changing the way you think?” and the answers were very sobering. Here were some extremely prominent scientists, thinkers, and writers, and all too many of them were saying again, more boldly, that the Internet was making it hard to read long pieces of writing, that books were passé, and that the Internet was essentially becoming a mental prosthesis. We were, as one writer put it, uploading our brains to the Internet.
As usual, I did not buy the boosterism. I was opposed to the implicit techno-determinism as well as the notion that the Internet makes learning unnecessary. Anyone who claims that we do not need to read and memorize some facts is saying that we do not need to learn those facts. Reading and indeed memorizing are the first, necessary steps in learning anything.
This brings us to today. Recently, Sir Ken Robinson has got a lot of attention by speaking out–inspiringly to some, outrageously to others–saying that K-12 education needs a sea change away from “boring” academics and toward collaborative methods that foster “creativity.” At the same time, PayPal co-founder Peter Thiel sparked much discussion by claiming that there is a “higher education bubble,” that is, the cost of higher education greatly exceeds its value. This claim by itself is somewhat plausible. But Thiel much less plausibly implies that college per se is now inadvisable for many, because it is “elitist.” With his Thiel Fellowship program he hopes to demonstrate that a college degree is not necessary for success in the field of technology. Leave it to a 19-year-old recipient of one of these fellowships to shout boldly that “College is a waste of time.” Unsurprisingly, I disagree.
2. Geek anti-intellectualism
In the above, I have barely scratched the surface. I haven’t mentioned many other commentators, blogs, and books that have addressed such subjects. But this is enough to clarify what I mean by “geek anti-intellectualism.” Let me step back and sum up the views mentioned above:
1. Experts do not deserve any special role in declaring what is known. Knowledge is now democratically determined, as it should be. (Cf. this essay of mine.)
2. Books are an outmoded medium because they involve a single person speaking from authority. In the future, information will be developed and propagated collaboratively, something like what we already do with the combination of Twitter, Facebook, blogs, Wikipedia, and various other websites.
3. The classics, being books, are also outmoded. They are outmoded because they are often long and hard to read, so those of us raised around the distractions of technology can’t be bothered to follow them; and besides, they concern foreign worlds, dominated by dead white guys with totally antiquated ideas and attitudes. In short, they are boring and irrelevant.
4. The digitization of information means that we don’t have to memorize nearly as much. We can upload our memories to our devices and to Internet communities. We can answer most general questions with a quick search.
5. The paragon of success is a popular website or well-used software, and for that, you just have to be a bright, creative geek. You don’t have to go to college, which is overpriced and so reserved to the elite anyway.
If you are the sort of geek who loves all things Internet uncritically, then you’re probably nodding your head to these. If so, I submit this as a new epistemological manifesto that might well sum up your views:
You don’t really care about knowledge; it’s not a priority. For you, the books containing knowledge, the classics and old-fashioned scholarship summing up the best of our knowledge, the people and institutions whose purpose is to pass on knowledge–all are hopelessly antiquated. Even your own knowledge, the contents of your mind, can be outsourced to databases built by collaborative digital communities, and the more the better. After all, academics are boring. A new world is coming, and you are in the vanguard. In this world, the people who have and who value individual knowledge, especially theoretical and factual knowledge, are objects of your derision. You have contempt for the sort of people who read books and talk about them–especially classics, the long and difficult works that were created alone by people who, once upon a time, were hailed as brilliant. You have no special respect for anyone who is supposed to be “brilliant” or even “knowledgeable.” What you respect are those who have created stuff that many people find useful today. Nobody cares about some Luddite scholar’s ability to write a book or get an article past review by one of his peers. This is why no decent school requires reading many classics, or books generally, anymore–books are all tl;dr for today’s students. In our new world, insofar as we individually need to know anything at all, our knowledge is practical, and best gained through projects and experience. Practical knowledge does not come from books or hard study or any traditional school or college. People who spend years of their lives filling up their individual minds with theoretical or factual knowledge are chumps who will probably end up working for those who skipped college to focus on more important things.
Do you find your views misrepresented? I’m being a bit provocative, sure, but haven’t I merely repeated some remarks and made a few simple extrapolations? Of course, most geeks, even most Internet boosters, will not admit to believing all of this manifesto. But I submit that geekdom is on a slippery slope to the anti-intellectualism it represents.
So there is no mistake, let me describe the bottom of this slippery slope more forthrightly. You are opposed to knowledge as such. You contemptuously dismiss experts who have it; you claim that books are outmoded, including classics, which contain the most significant knowledge generated by humankind thus far; you want to memorize as little as possible, and you want to upload what you have memorized to the net as soon as possible; you don’t want schools to make students memorize anything; and you discourage most people from going to college.
In short, at the bottom of the slippery slope, you seem to be opposed to knowledge wherever it occurs, in books, in experts, in institutions, even in your own mind.
But, you might say, what about Internet communities? Isn’t that a significant exception? You might think so. After all, how can people who love Wikipedia so much be “opposed to knowledge as such”? Well, there is an answer to that.
It’s because there is a very big difference between a statement occurring in a database and someone having, or learning, a piece of knowledge. If all human beings died out, there would be no knowledge left even if all libraries and the whole Internet survived. Knowledge exists only inside people’s heads. It is propagated not by being accessed in a database search, but by being learned and mastered. A collection of Wikipedia articles about physics contains text; the mind of a physicist contains knowledge.
3. How big of a problem is geek anti-intellectualism?
Once upon a time, anti-intellectualism was said to be the mark of knuckle-dragging conservatives, and especially American Protestants. Remarkably, that seems to be changing.
How serious am I in the above analysis? And is this really a problem, or merely a quirk of geek life in the 21st century?
It’s important to bear in mind what I do and do not mean when I say that some Internet geeks are anti-intellectuals. I do not mean that they would admit that they hate knowledge or are somehow opposed to knowledge. Almost no one can admit such a thing to himself, let alone to others. And, of course, I doubt I could find many geeks who would say that students should graduate from high school without learning a significant amount of math, science, and other subjects as well. Moreover, however they might posture when at work on Wikipedia articles, most geeks have significant respect for the knowledge of people like Stephen Hawking or Richard Dawkins. Many geeks, too, are planning on college, are in college, or have been to college. And so forth–for the various claims (1)-(5), while many geeks would endorse them, they could also be found contradicting them regularly as well. So is there really anything to worry about here?
Well, yes, there is. Attitudes are rarely all or nothing. The more that people have these various attitudes, the more bad stuff is going to result, I think. The more that a person really takes seriously that there is no point in reading the classics, the less likely he’ll actually take a class in Greek history or early modern philosophy. Repeat that on a mass scale, and the world becomes–no doubt already has become–a significantly poorer place, as a result of the widespread lack of analytical tools and conceptual understanding. We can imagine a world in which the humanities are studied by only a small handful of people, because we already live in that world; just imagine the number of people getting smaller.
But isn’t this a problem just for geekdom? Does it really matter that much if geeks are anti-intellectuals?
Well, the question is whether the trend will move on to the population at large. One does not speak of “geek chic” these days for nothing. The digital world is now on the cutting edge of societal evolution, and attitudes and behaviors that were once found mostly among geeks back in the 1980s and 1990s are now mainstream. Geek anti-intellectualism can already be seen as another example. Most of the people I’ve mentioned in this essay are not geeks per se, but the digerati, who are frequently non-geeks or ex-geeks who have their finger on the pulse of social movements online. Via these digerati, we can find evidence of geek attitudes making their way into mainstream culture. One now regularly encounters geek-inspired sentiments from business writers like Don Tapscott and education theorists like Ken Robinson–and even from the likes of Barack Obama (but not anti-intellectualism, of course).
Let’s just put it this way. If, in the next five years, some prominent person comes out with a book or high-profile essay openly attacking education or expertise or individual knowledge as such, because the Internet makes such things outmoded, and if it receives a positive reception not just from writers at CNET and Wired and the usual suspects in the blogosphere, but also serious, thoughtful consideration from Establishment sources like The New York Review of Books or Time, I’ll say that geek anti-intellectualism is in full flower and has entered the mainstream.
UPDATE: I’ve posted a very long set of replies.
UPDATE 2: I’ve decided to reply below as well–very belatedly…
About the author
Larry Sanger has written 88 articles for Larry Sanger Blog
I call myself an "Internet Knowledge Organizer." I started Wikipedia.org, Citizendium.org, and WatchKnow.org, among others. Now I am lucky enough to be able to work full-time on creating free materials for early education, which I am using with my two little boys and sharing with you.
Friday, February 17, 2012
In Jeremy Lin, a stereotype that should be celebrated
By Jonathan Zimmerman, Washington Post Published: February 16
I’m an Ivy League graduate and a crazed basketball fan. That gives me two very good reasons to celebrate the meteoric rise of Jeremy Lin, the Harvard-educated point guard who has brought the New York Knicks back to life.
But I’m also a university professor. So I’m troubled by the much-heard refrain that Lin — whose parents are Taiwanese immigrants — has “overcome the Asian stereotype.” In the popular mind, this story goes, Asian Americans are quiet, studious and really good at math. By scoring 20 or more points in each of his first six NBA starts, including 38 against Kobe Bryant and the Los Angeles Lakers, Lin supposedly dealt a decisive blow against an insidious ethnic caricature.
But isn’t that stereotype — especially the part about studying hard — a very good model to follow? Why should anyone want or need to “overcome” it?
Here’s one sad answer: In our college admissions process, especially, we punish Asian Americans who hew too closely to the stereotype. Rather than rewarding students for their individual effort and achievement, we effectively penalize them for doing so well as a group.
In fact, the Education Department is currently investigating a complaint against Harvard — Jeremy Lin’s alma mater — for allegedly discriminating against Asian Americans in admissions. The department is also looking at Princeton, where a faculty member’s own research has shown that Asian Americans need SAT scores about 140 points higher than white students’ — when everything else is equal — to have the same chance of getting into an elite college.
Harvard and Princeton officials deny any overt discrimination, of course. “Our review of every applicant’s file is highly individualized and holistic,” a Harvard spokesman told Bloomberg News this month, “as we give serious consideration to all the information we receive and all the ways in which the candidate might contribute to our vibrant educational environment and community.”
Translated: It’s not sufficient to earn near-perfect grades and test scores, or to excel at a musical instrument, or to win a science-fair contest. Asian American applicants do all those things, in droves. But our elite universities don’t want too much of a good thing, if it all comes from the same racial group.
The Education Department also investigated Harvard’s admission of Asian American applicants in 1990. It found then that university officials routinely described these applicants as “quiet/shy, science/math oriented, and hard workers.” Admissions officials also had a hard time ranking one Asian American over another. One complained that an applicant’s credentials “seem so typical of other Asian applications I’ve read: extraordinarily gifted in math with the opposite extreme in English.” A second Harvard admissions officer sounded equally frustrated. “He’s quiet and, of course, wants to be a doctor,” the officer wrote of one Asian American applicant.
The 1990 probe determined that Harvard admitted 13.2 percent of its Asian American applicants and 17.4 percent of whites, even though the Asians had better grades and scores. But the investigation concluded that the university’s preference for athletes and the children of alumni — not racial discrimination per se — accounted for the gap.
Asian Americans are underrepresented in both of those categories. Like Jeremy Lin, many Asian American applicants are first- or second-generation immigrants. They can’t rely on Daddy’s résumé — or his charitable donations to his college — to get them in.
That’s one reason why their parents encourage hitting the books instead of the courts. In 2009, when Lin was a senior at Harvard, he was one of just 18 Asian Americans playing NCAA Division I men’s basketball. Opposing players hissed slurs at him, asking if he was missing orchestra practice or telling him to “open up” his “slanty” eyes.
Although Lin led his California high school team to a state championship, no big-time NCAA program offered him a basketball scholarship. And as everyone knows now, two NBA teams cut him before the Knicks picked him up.
So we should congratulate Lin for overturning one stereotype: that Asian men can’t excel at sports. But let’s not forget that he was quite a studious fellow, too, earning a 4.2 grade-point average in high school and a perfect score on his SAT subject test in math. He graduated from Harvard with a 3.1 GPA in economics, one of the most demanding majors.
In that sense, Jeremy Lin didn’t overcome an ethnic stereotype; he confirmed it.
Yet the more we glorify Lin for breaking the typical mold, the more we denigrate the hundreds of thousands of Asian Americans who study hard and succeed. If you’re Asian, our society says, excelling at school simply isn’t good enough. And that’s what I call Linsanity.
Wednesday, February 15, 2012
Lin/Xi Jinping: When Public Diplomacy Really Works
It's far too brilliant for any propagandist, of any nationality, to have thought it up: The presumed future leader of the People's Republic of China, Vice President Xi Jinping, visits the United States at a time when relations between the two countries are tense; meanwhile, the Chinese-American, Harvard-educated Jeremy Lin, not long ago a bench warmer for the NY Knicks, becomes an instantaneous professional basketball star.
What great (unexpected?) publicity for "Communist" China!
Chinese kid makes it in the USA when a Mainland China vice-president comes to our shores -- the Land of the Free where, for all we know, "Jinping" could be the name of a panda.
Sure, Lin's Christian family originally came from Taiwan, but "so what" for most of us in the USA, who couldn't find that far-off island on a map.
And what better for the PRC than for Americans to think that all Chinese are just like us: they play basketball and they're Christian!
Not to speak of the Chinese diaspora in North America: Is it not in the PRC's interest that hyphenated Chinese feel proud to be Chinese?
Sometimes coincidence is far more effective than calculation. Which tells us something about "strategic plans" for public diplomacy.
P.S. As a naive American, I am assuming, of course, that the N.B.A. PR apparatus and the PRC's propaganda organs didn't strike a deal to make Jeremy, a laudably talented person in many fields, an overnight media sports sensation.
The New York Times: "The N.B.A. has estimated 300 million people in China play basketball. The retirement last year of Yao Ming, a basketball star from mainland China, deprived the N.B.A. of its main Asian draw. But Lin’s emergence has at least temporarily strengthened the league as a centerpiece of Chinese online chatter."
Imagine, if you were an N.B.A. exec: 300 million B-ball consumers/spectators in China ... and all it takes to grab 'em (and it's cheap) is to get a Harvard-educated Chinese-American who "made it" in the Land of Opportunity to make a few more hoops for NYC on the tube.
Wow! We're talking about real money -- on the cheap!
Tuesday, February 14, 2012
Xanadu
Xanadu -- The word used by Washington Post columnist Richard Cohen (evidently, without irony) to describe the US Embassy in Baghdad ("Obama in denial about American influence," February 13). For a more realistic evaluation, see (1) (2).
Sunday, February 12, 2012
The Certainty of Doubt
February 11, 2012
The Certainty of Doubt
By CULLEN MURPHY, New York Times
THE building at No. 11 Piazza del Sant’Uffizio is an imposing ocher-and-white palazzo that stands just inside the gates of Vatican City, behind the southern arc of Bernini’s colonnade. Above the main entrance is a marble scroll. It once held a Latin inscription, placed there in the 16th century, proclaiming that the palazzo had been built as a bulwark against “heretical depravity.” This was the headquarters of the Roman Inquisition, the arm of the Roman Catholic Church that tried Galileo and created the Index of Forbidden Books. You won’t see the inscription above the entrance now — it was chiseled off by French troops during Napoleon’s occupation. All that’s left is some mottled scarring.
The Roman Inquisition was one of several inquisitions conducted under the auspices of the church. These had in common a deeply rooted sense of fear (of heretics, of Jews, of Protestantism) and a deeply rooted moral certainty, a conviction that the cause was not only just but also so urgent that nothing must stand in the way: not practical considerations (workers were diverted from the unfinished St. Peter’s to complete the Inquisition’s palazzo) and certainly not competing considerations of principle or moderation.
That’s the way it is with moral certainty. It sweeps objections aside and makes anything permissible if pursued with an appeal to a higher justification. That higher justification does not need to be God, though God remains serviceable. The higher justification can also be the forces of history. It can be rationalism and science. It can be some assertion of the common good. It can be national security.
The power of the great “isms” of the 20th century — fascism, communism — has dissipated, but moral certainty arises in other forms. Are certain facts and ideas deemed too dangerous? Then perhaps censorship is the answer. (China’s Great Firewall is one example, but let’s not forget that during the past decade, there have been some 4,600 challenges to books in schools and libraries in the United States.) Are certain religions and beliefs deemed intolerable? Then perhaps a few restrictions are in order. (Bills have been introduced in several states to ban recognition of Islamic Shariah law.) In a variety of guises, a conviction of certainty lurks within debates on marriage, on reproduction, on family values, on biotechnology. It peers from behind the question “Is America a Christian nation?”
An “ism” that retains its vitality — terrorism — is justified unapologetically by moral certainty. In a vastly different way, not always recognized, so have been some of the steps taken to combat it. Necessity overrides principle. The inventory of measures advanced in the name of homeland security during the past decade would fill a book. In the United States, the surveillance of citizens and noncitizens alike has become increasingly pervasive. The legal system has been under pressure to constrict protections for the accused. The National Defense Authorization Act, signed into law in December by President Obama despite his own reservations, gives the government enhanced powers to detain, interrogate and prosecute.
In Britain, a new Green Paper on Justice and Security has laid out changes in the legal system that would extend the circumstances in which evidence may be presented secretly in court without being made known to defendants. It would also allow government ministers to withhold from certain court proceedings information that the ministers deem sensitive. Visitors to Britain for this summer’s Olympics will notice the CCTV cameras — there are reportedly more than four million of them — that monitor ordinary daily activity throughout the country. This effort, the most advanced in the world, is supported by the slogan “If You Have Nothing to Hide, You Have Nothing to Fear.”
Meanwhile, to a degree that Americans of a generation ago would never have thought possible, the argument is made that torture can play a legitimate role in interrogation, the practice justified with reference to a greater good (and with the help of semantic fig leaves). Three of the Republican presidential candidates still in the race, Newt Gingrich, Rick Santorum and Mitt Romney, maintain that waterboarding, which the Inquisition matter-of-factly considered to be torture, really isn’t, and Mr. Gingrich and Mr. Santorum openly support its use. (Mr. Romney hasn’t said what he’d allow.)
The theoretical arguments for torture are slippery and dangerous. The inquisitors of old knew this all too well, and even popes tried to draw the line, to little avail — and in practice torture is more slippery still.
The idea that some single course is right and necessary — and, being right and necessary, must trump everything else, for all our sakes — is a seductive one. Isaiah Berlin knew where this idea of an “ultimate solution” would lead — indeed, had already led in the murderous century he witnessed: “For, if one really believes that such a solution is possible, then surely no cost would be too high to obtain it: to make mankind just and happy and creative and harmonious forever — what could be too high a price to pay for that? To make such an omelet, there is surely no limit to the number of eggs that should be broken. ... If your desire to save mankind is serious, you must harden your heart, and not reckon the cost.”
The French soldiers who erased the inscription from the Inquisition’s palazzo in Rome didn’t know that they were replacing one form of certainty with another — in their case, the certainty of faith with the certainty of reason. The key words here are not “faith” and “reason” but “didn’t know”: the right way forward is always elusive. The drafters of the United States Constitution — fearful of rule by one opinion, whether the tyrant’s or the mob’s — created a governmental structure premised on the idea that human beings are fallible, fickle and unreliable, and in fundamental ways not to be trusted. Triumphalist rhetoric about the Constitution ignores the skeptical view of human nature that underlies it.
A long philosophical tradition in the Roman Catholic Church itself — admittedly, not the one most in evidence today — has long balanced the comfort of certainty against the corrective of doubt. Human beings are fallen creatures. Certitude can be a snare. Doubt can be a helping hand. Consider a list of theologians who have found themselves targets of church discipline — Pierre Teilhard de Chardin, John Courtney Murray, Yves Congar — only to be “surrounded with a bright halo of enthusiasm” at some later point, as the late Cardinal Avery Dulles once put it.
Doubt sometimes comes across as feeble and meek, apologetic and obstructionist. On occasion it is. But it’s also a powerful defensive instrument. Doubt can be a bulwark. We should inscribe that in marble someplace.
Cullen Murphy is an editor at large at Vanity Fair and the author of “God’s Jury: The Inquisition and the Making of the Modern World.”
Saturday, February 11, 2012
Public Diplomacy and USG International Broadcasting
There's been much talk lately about the relationship between public diplomacy and USG non-military international broadcasting. As I see it, as a Foreign Service officer privileged to have served "in the field" for over twenty years (mostly in Eastern Europe, during the past century), the difference is this: FSOs talk to people in the flesh, face-to-face, whereas USIB, even now via the new social media, "communicates" with basically invisible, humanly little-known "target audiences" lacking individuality. I leave it up to you to decide which is the more meaningful dialogue -- and which is best for the US national interest. A mixture of both? But, if so, what is the priority?
Friday, February 10, 2012
Comments on Ted Lipien's "VOA harms Putin opposition in Russia" by the Director of Communications and External Affairs, Broadcasting Board of Governors
Message to "John Brown's Notes and Essays" from: Lynne Weil, Director of Communications and External Affairs, Broadcasting Board of Governors
"[B]elow is the letter sent to the Washington Times from the Broadcasting Board of Governors in response to an op-ed published on Wednesday. We hope that you will consider it for both publication and follow-up.
To the Editor:
The op-ed you published on the Broadcasting Board of Governors ('VOA harms Putin opposition in Russia,' [http://www.washingtontimes.com/news/2012/feb/8/voa-harms-putin-opposition-in-russia/] Commentary, Ted Lipien, Feb. 8) cynically attempts to exploit a real, but quickly addressed, journalistic error by the Voice of America’s Russian Service in order to deliver an inaccurate, exaggerated and distorted attack on the BBG.
The Russian Service published an online interview with someone purported to be Russian opposition leader Alexei Navalny. It then reported Mr. Navalny's message denying having done the interview, removed the interview, and issued a prompt apology.
VOA is taking steps to better vet its sources in today’s changing, fast-paced digital media environment. Publication of the interview was regrettable, but hardly a reasonable basis for a broad challenge to the utility and effectiveness of U.S. international broadcasting and the BBG’s oversight of it.
The commentary overlooks compelling data on our impact. In 2011 the BBG reached record audiences: 187 million people worldwide weekly, 22 million more than the year before. To continue to thrive within federal budgetary constraints, the agency has embarked on an ambitious, well-researched plan to make U.S. international broadcasting more effective and efficient. Our broadcasts are and will continue to be one of the best values for the dollar in U.S. foreign policy.
The suggestion that the Board failed to recognize VOA’s 70th anniversary is false: The Board adopted and published a resolution noting the milestone at its January 13 meeting, and has been involved in plans for a major commemoration in the coming weeks. News of the resignation of BBG Chairman Walter Isaacson was immediately shared with the staff and then posted on the agency’s website. It is true that agency managers at VOA, IBB and BBG received bonuses, but the amounts were below government average.
The Feb. 8 commentary contained similar misstatements concerning the BBG’s restructuring plan, the leadership of its management team, a desire to emulate National Public Radio, the reasoning behind changes in the way the BBG engages with people in Russia and China, and the significance of a review of VOA Russian news.
We recommend that The Washington Times fact-check this commentary and consider issuing a correction.
Sincerely,
Lynne Weil
Director of Communications and External Affairs
Broadcasting Board of Governors"
U.S. Marines posed with Nazi symbol in Afghanistan - Julie Watson, Associated Press, The Washington Times: The U.S. Marine Corps confirmed Thursday that a sniper team in Afghanistan posed for a photograph in front of a flag with a logo resembling that of the notorious Nazi SS.
Use of the SS symbol is not acceptable, and the Marine Corps has addressed the issue, Lt. Col. Stewart Upton said in a statement. He did not specify what action was taken.
Monday, February 6, 2012
Key Document in Understanding modern USG propaganda: How We Advertised America
George Creel,
1920
I. THE "SECOND LINES"
1 As Secretary Baker points out, the war [the Great War] was not fought in France alone. Back of the firing-line, back of armies and navies, back of the great supply-depots, another struggle waged with the same intensity and with almost equal significance attaching to its victories and defeats. It was the fight for the minds of men, for the "conquest of their convictions," and the battle-line ran through every home in every country. 2 It was in this recognition of Public Opinion as a major force that the Great War differed most essentially from all previous conflicts. The trial of strength was not only between massed bodies of armed men, but between opposed ideals, and moral verdicts took on all the value of military decisions. Other wars went no deeper than the physical aspects, but German Kultur raised issues that had to be fought out in the hearts and minds of people as well as on the actual firing-line. The approval of the world meant the steady flow of inspiration into the trenches; it meant the strengthened resolve and the renewed determination of the civilian population that is a nation's second line. The condemnation of the world meant the [4] destruction of morale and the surrender of that conviction of justice which is the very heart of courage. 3 The Committee on Public Information was called into existence to make this fight for the "verdict of mankind," the voice created to plead the justice of America's cause before the jury of Public Opinion. The fantastic legend that associated gags and muzzles with its work may be likened only to those trees which are evolved out of the air by Hindu magicians and which rise, grow, and flourish in gay disregard of such usual necessities as roots, sap, and sustenance. In no degree was the Committee an agency of censorship, a machinery of concealment or repression. Its emphasis throughout was on the open and the positive. 
At no point did it seek or exercise authorities under those war laws that limited the freedom of speech and press. In all things, from first to last, without halt or change, it was a plain publicity proposition, a vast enterprise in salesmanship, the world's greatest adventure in advertising. 4 Under the pressure of tremendous necessities an organization grew that not only reached deep into every American community, but that carried to every corner of the civilized globe the full message of America's idealism, unselfishness, and indomitable purpose. We fought prejudice, indifference, and disaffection at home and we fought ignorance and falsehood abroad. We strove for the maintenance of our own morale and the Allied morale by every process of stimulation; every possible expedient was employed to break through the barrage of lies that kept the people of the Central Powers in darkness and delusion; we sought the friendship and support of the neutral nations by continuous presentation of facts. We did not call it propaganda, for that word, in German hands, had come to be associated with deceit and corruption. Our effort was educational and informative throughout, for we had such confidence in our case as to feel that no other [5] argument was needed than the simple, straightforward presentation of facts. 5 There was no part of the great war machinery that we did not touch, no medium of appeal that we did not employ. The printed word, the spoken word, the motion picture, the telegraph, the cable, the wireless, the poster, the sign-board--all these were used in our campaign to make our own people and all other peoples understand the causes that compelled America to take arms. All that was fine and ardent in the civilian population came at our call until more than one hundred and fifty thousand men and women were devoting highly specialized abilities to the work of the Committee, as faithful and devoted in their service as though they wore the khaki. 
6 While America's summons was answered without question by the citizenship as a whole, it is to be remembered that during the three and a half years of our neutrality the land had been torn by a thousand divisive prejudices, stunned by the voices of anger and confusion, and muddled by the pull and haul of opposed interests. These were conditions that could not be permitted to endure. What we had to have was no mere surface unity, but a passionate belief in the justice of America's cause that should weld the people of the United States into one white-hot mass instinct with fraternity, devotion, courage, and deathless determination. The war-will, the will-to-win, of a democracy depends upon the degree to which each one of all the people of that democracy can concentrate and consecrate body and soul and spirit in the supreme effort of service and sacrifice. What had to be driven home was that all business was the nation's business and every task a common task for a single purpose. 7 Starting with the initial conviction that the war was not the war of an administration, but the war of one hundred million people, and believing that public support [6] was a matter of public understanding, we opened up the activities of government to the inspection of the citizenship. A voluntary censorship agreement safeguarded military information of obvious value to the enemy, but in all else the rights of the press were recognized and furthered. Trained men, at the center of effort in every one of the war-making branches of government, reported on progress and achievement, and in no other belligerent nation was there such absolute frankness with respect to every detail of the national war endeavor. 
8 As swiftly as might be, there were put into pamphlet form America's reasons for entering the war, the meaning of America, the nature of our free institutions, our war aims, likewise analyses of the Prussian system, the purposes of the imperial German government, and full exposure of the enemy's misrepresentations, aggressions, and barbarities. Written by the country's foremost publicists, scholars, and historians, and distinguished for their conciseness, accuracy, and simplicity, these pamphlets blew as a great wind against the clouds of confusion and misrepresentation. Money could not have purchased the volunteer aid that was given freely, the various universities lending their best men and the National Board of Historical Service placing its three thousand members at the complete disposal of the Committee. Some thirty-odd booklets, covering every phase of America's ideals, purposes, and aims, were printed in many languages other than English. Seventy-five millions reached the people of America, and other millions went to every corner of the world, carrying our defense and our attack. 9 The importance of the spoken word was not underestimated. A speaking division toured great groups like the Blue Devils, Pershing's Veterans, and the Belgians, arranged mass-meetings in the communities, conducted forty-five war conferences from coast to coast, co-ordinated [7] the entire speaking activities of the nation, and assured consideration to the crossroads hamlet as well as to the city. 10 The Four Minute Men, an organization that will live in history by reason of its originality and effectiveness, commanded the volunteer services of 75,000 speakers, operating in 5,200 communities, and making a total of 755,190 speeches, every one having the carry of shrapnel. 11 With the aid of a volunteer staff of several hundred translators, the Committee kept in direct touch with the foreign-language press, supplying selected articles designed to combat ignorance and disaffection. 
It organized and directed twenty-three societies and leagues designed to appeal to certain classes and particular foreign-language groups, each body carrying a specific message of unity and enthusiasm to its section of America's adopted peoples. 12 It planned war exhibits for the state fairs of the United States, also a great series of interallied war expositions that brought home to our millions the exact nature of the struggle that was being waged in France. In Chicago alone two million people attended in two weeks, and in nineteen cities the receipts aggregated $1,432,261.36. 13 The Committee mobilized the advertising forces of the country--press, periodical, car, and outdoor--for the patriotic campaign that gave millions of dollars' worth of free space to the national service. 14 It assembled the artists of America on a volunteer basis for the production of posters, window-cards, and similar material of pictorial publicity for the use of various government departments and patriotic societies. A total of 1,438 drawings was used. 15 It issued an official daily newspaper, serving every department of government, with a circulation of one hundred thousand copies a day. For official use only, its value was such that private citizens ignored the supposedly prohibitive [8] subscription price, subscribing to the amount of $77,622.58. 16 It organized a bureau of information for all persons who sought direction in volunteer war-work, in acquiring knowledge of any administrative activities, or in approaching business dealings with the government. In the ten months of its existence it gave answers to eighty-six thousand requests for specific information. 17 It gathered together the leading novelists, essayists, and publicists of the land, and these men and women, without payment, worked faithfully in the production of brilliant, comprehensive articles that went to the press as syndicate features. 
18 One division paid particular attention to the rural press and the plate-matter service. Others looked after the specialized needs of the labor press, the religious press, and the periodical press. The Division of Women's War Work prepared and issued the information of peculiar interest to the women of the United States, also aiding in the task of organizing and directing. 19 Through the medium of the motion picture, America's war progress, as well as the meanings and purposes of democracy, were carried to every community in the United States and to every corner of the world. "Pershing's Crusaders," "America's Answer," and "Under Four Flags" were types of feature films by which we drove home America's resources and determinations, while other pictures, showing our social and industrial life, made our free institutions vivid to foreign peoples. From the domestic showings alone, under a fair plan of distribution, the sum of $878,215 was gained, which went to support the cost of the campaigns in foreign countries where the exhibitions were necessarily free. 20 Another division prepared and distributed still photographs and stereopticon slides to the press and public. [9] Over two hundred thousand of the latter were issued at cost. This division also conceived the idea of the "permit system," that opened up our military and naval activities to civilian camera men, and operated it successfully. It handled, also, the voluntary censorship of still and motion pictures in order that there might be no disclosure of information valuable to the enemy. The number of pictures reviewed averaged seven hundred a day. 21 Turning away from the United States to the world beyond our borders, a triple task confronted us.
First, there were the peoples of the Allied nations that had to be fired by the magnitude of the American effort and the certainty of speedy and effective aid, in order to relieve the war-weariness of the civilian population and also to fan the enthusiasm of the firing-line to new flame. Second, we had to carry the truth to the neutral nations, poisoned by German lies; and third, we had to get the ideals of America, the determination of America, and the invincibility of America into the Central Powers. 22 Unlike other countries, the United States had no subsidized press service with which to meet the emergency. As a matter of bitter fact, we had few direct news contacts of our own with the outside world, owing to a scheme of contracts that turned the foreign distribution of American news over to European agencies. The volume of information that went out from our shores was small, and, what was worse, it was concerned only with the violent and unusual in our national life. It was news of strikes and lynchings, riots, murder cases, graft prosecutions, sensational divorces, the bizarre extravagance of "sudden millionaires." Naturally enough, we were looked upon as a race of dollar-mad materialists, a land of cruel monopolists, our real rulers the corporations and our democracy a "fake."
Source:George Creel, How We Advertised America (New York: Harper & Brothers, 1920), 3-9. Paragraph numbers have been added, and the original pagination appears in brackets.
Friday, February 3, 2012
Intellectual Diplomacy
In recent years, we've had all kinds of new "diplomacies," ranging from "panda diplomacy" to "helium diplomacy."
And now, lo and behold, the Director of the USC Center on Public Diplomacy, Professor/Doctor Philip Seib (who, to the best of my knowledge, has never actually practiced "public diplomacy" on behalf of the USG or any other government), has come up with a new (by no means viral, but mentioned) term: "intellectual diplomacy."
Well, I do not intend to be ill-mannered, but how can one not be somewhat concerned by the apparently anti-intellectual declaration in the above-cited Seib article, which states: "The United States must become more adept at diplomacy grounded in strategic intellectual competitiveness."
OMG!
"Strategic intellectual competitiveness?" What does that jaw-breaker have to do with the pleasures of the mind -- or, indeed, with our new supposedly collaborative, interconnected new world of the 21st century?
I suggest Professor Seib read Montaigne, who had a sense of frivolity and absurdity, rather than feebly attempting to reproduce, at USC (it's in laid-back Los [Lost?] Angeles), yet another oh-so-serious "foreign-policy" "think"-tank report from inside-the-beltway Washington, DC.
Evidently homo ludens is not part of Professor Seib's mythology. Just take a look at his "I'm-so-macho" "A Strategy for Cultural Diplomacy." Such a piece, in my modest opinion, belongs in the KGB archives, given, as it suggests, that culture should propagandize power.
Could the good professor be -- although, perhaps, that is not his intention -- talking about a centralized USG uber-organized realpolitik crude propaganda establishment exploiting the intellect for narrow "national security" interests?
Just askin'.
P.S. Persons interested in this posting might wish to read about "la trahison des clercs."
More evidence of the decline of America
"The Chinese ... drank hardly any beer, until as recently as 1980, consuming 0.2 billion litres a year, as against 24 billion litres in the United States. Since
2003, however, China has overtaken America as the largest market for beer, and now drinks a fifth of all the beer in the world."
--Bee Wilson, "Drink beer, it makes you drunk," The Times Literary Supplement (January 27, 2012), p. 5; image from
"The Chinese ... drank hardly any beer, until as recently as 1980, consuming 0.2 billion litres a year, as against 24 billion litres in the United States. Since
2003, however, China has overtaken America as the largest market for beer, and now drinks a fifth of all the beer in the world."
--Bee Wilson, "Drink beer, it makes you drunk," The Times Literary Supplement (January 27, 2012), p. 5; image from
Thursday, February 2, 2012
What a difference a hyphen makes ...
An English-reader is not necessarily an English reader.
See also Tina Blue, "A Hyphen is not a Dash."