Friday, December 2, 2016

After a mere 25 years, the triumph of the West is over


image (not from article)

By Charles Krauthammer, Opinion writer, December 1 at 7:25 PM, Washington Post; see also Ian Buruma, "The End of the Anglo-American Order," New York Times

Twenty-five years ago — December 1991 — communism died, the Cold War ended and the Soviet Union disappeared. It was the largest breakup of an empire in modern history and not a shot was fired. It was an event of biblical proportions that my generation thought it would never live to see. As Wordsworth famously rhapsodized (about the French Revolution), “Bliss was it in that dawn to be alive/But to be young was very heaven!”

That dawn marked the ultimate triumph of the liberal democratic idea. It promised an era of Western dominance led by a preeminent America, the world’s last remaining superpower.

And so it was for a decade as the community of democracies expanded, first into Eastern Europe and former Soviet colonies. The U.S. was so dominant that when, on Dec. 31, 1999, it gave up one of the most prized geostrategic assets on the globe — the Panama Canal — no one even noticed.

That era is over. The autocracies are back and rising; democracy is on the defensive; the U.S. is in retreat. Look no further than Aleppo. A Western-backed resistance to a local tyrant — he backed by a resurgent Russia, an expanding Iran and an array of proxy Shiite militias — is on the brink of annihilation. Russia drops bombs; America issues statements.

What better symbol for the end of that heady liberal-democratic historical moment. The West is turning inward and going home, leaving the field to the rising authoritarians — Russia, China and Iran. In France, the conservative party’s newly nominated presidential contender is fashionably conservative and populist and soft on Vladimir Putin. As are several of the newer Eastern European democracies — Hungary, Bulgaria, even Poland — themselves showing authoritarian tendencies.

And even as Europe tires of the sanctions imposed on Russia for its rape of Ukraine, President Obama’s much-touted “isolation” of Russia has ignominiously dissolved, as our secretary of state repeatedly goes cap in hand to Russia to beg for mercy in Syria.

The European Union, the largest democratic club on Earth, could itself soon break up as Brexit-like movements spread across the continent. At the same time, its members dash with unseemly haste to reopen economic ties with a tyrannical and aggressive Iran.

As for China, the other great challenger to the post-Cold War order, the administration’s “pivot” has turned into an abject failure. The Philippines openly defected to the Chinese side. Malaysia then followed. And the rest of our Asian allies are beginning to hedge their bets. When the president of China addressed the Pacific Rim countries in Peru last month, he suggested that China was prepared to pick up the pieces of the Trans-Pacific Partnership, now abandoned by both political parties in the United States.

The West’s retreat began with Obama, who reacted to (perceived) post-9/11 overreach by abandoning Iraq, offering appeasement (“reset”) to Russia and accommodating Iran. In 2009, he refused even rhetorical support to the popular revolt against the rule of the ayatollahs.

Donald Trump wants to continue the pullback, though for entirely different reasons. Obama ordered retreat because he’s always felt the U.S. was not good enough for the world, too flawed to have earned the moral right to be the world hegemon. Trump would follow suit, disdaining allies and avoiding conflict, because the world is not good enough for us — undeserving, ungrateful, parasitic foreigners living safely under our protection and off our sacrifices. Time to look after our own American interests.

Trump’s is not a new argument. As the Cold War was ending in 1990, Jeane Kirkpatrick, the quintessential neoconservative, argued that we should now become “a normal country in a normal time.” It was time to give up the 20th-century burden of maintaining world order and of making superhuman exertions on behalf of universal values. Two generations of fighting fascism and communism were quite enough. Had we not earned a restful retirement?

At the time, I argued that we had earned it indeed, but a cruel history would not allow us to enjoy it. Repose presupposes a fantasy world in which stability is self-sustaining without the United States. It is not. We would incur not respite but chaos.

A quarter-century later, we face the same temptation, but this time under more challenging circumstances. Worldwide jihadism has been added to the fight, and we enjoy nothing like the dominance we exercised over conventional adversaries during our 1990s holiday from history.

We may choose repose, but we won’t get it.

Creator of the Big Mac - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Wall Street Journal




Jim Delligatti’s burger innovation spread around the world.


Big Mac creator Michael "Jim" Delligatti, the Pittsburgh-area McDonald's franchisee who created the Big Mac in 1967, has died. He was 98. PHOTO: ASSOCIATED PRESS
Jim Delligatti, the McDonald’s Corp. franchisee who created the Big Mac, died Monday at the age of 98. His story is a reminder that great business innovations often have nothing to do with high technology but always have a lot to do with satisfying the customer.
The U.S. economy benefits greatly from the latest Silicon Valley creations, but creativity also occurs in places like Uniontown, Pennsylvania, where Delligatti served his first Big Mac in 1967.
Executives at McDonald’s headquarters were skeptical of the new sandwich. But Delligatti’s customers instantly liked the combination of “two all-beef patties, special sauce, lettuce, cheese, pickles, onions on a sesame seed bun,” as the company would later celebrate in a memorable jingle. Within a year it would be on the fast-food chain’s menus nationwide.
An Army veteran who served in Europe during World War II, Delligatti freely admitted that the Big Mac was inspired by double-decker burgers offered by local competitors. “This wasn’t like discovering the lightbulb,” he told the Los Angeles Times in 1993. “The bulb was already there. All I did was screw it in the socket.”
Billions of Big Macs later, it’s clear that he met the demand of consumers world-wide. Sales growth has slowed as millennials look for allegedly healthier options, often with more organic ingredients. We’ll leave it to nutritionists to debate whether diners are really better off eating something that was grown in local manure. [JB emphasis] But we note that Jim Delligatti ate a lot of Big Macs in the last half-century of his very long and fruitful life.

Thursday, December 1, 2016

Internet of stings



Image, with caption: "The Brave Little Toaster" by Jon Laing

Jennifer Howard, "Internet of stings," The Times Literary Supplement; see also John Brown, "The End of Cyber-Utopianism?" Huffington Post (2015); "Remember When Social Media Was the Solution to All Our Global Problems?" Huffington Post (2014)


Does anyone surf the internet anymore? The days of free-wheeling it around the web, just to see what’s out there, feel distant. In this post-factual, truth-averse era, many of the destinations that draw us online have become unsafe spaces, hostile and treacherous, where hatefulness and fake news prevail and surveillance is omnipresent. The web has changed, and it has changed us. How big those changes are, and what we can or ought to do about them, form the subject of these four books.

Big Brother might not be our biggest problem. With nationalism surging on both sides of the Atlantic, the online world looks more and more like a retro-futurist megalopolis overrun by trolls, criminals and state-backed hackers with the power not just to leap corporate firewalls but to skew elections. Marc Goodman’s Future Crimes: Inside the digital underground and the battle for our connected world peddles a pessimistic view in line with the Edward Snowden era, in which every week brings another data-hungry gadget along with a fresh round of headlines about governmental or corporate breaches of privacy and trust.

Bad enough was Yahoo!’s admission earlier this year that it leaked 500 million users’ data, followed by allegations that it acquiesced without protest to the US government’s demand to scan millions of emails for phrases indicating threats to national security. Far scarier was the sophisticated DDoS (distributed denial of service) attack in late October against Dyn, a company that helps to keep the internet humming along, which left users unable to reach sites including Twitter, Netflix and the New York Times. In what that paper called “a troubling development”, those responsible for the Dyn attack apparently hijacked the Internet of Things, commandeering “hundreds of thousands of internet-connected devices like cameras, baby monitors and home routers that have been infected – without their owners’ knowledge – with software that allows hackers to command them to flood a target with overwhelming traffic”.

One could be forgiven for feeling paranoid these days. Goodman’s biography lends itself to alarmist tendencies: the author has worked as a police officer and police trainer, an adviser to Interpol, and as a “futurist-in-residence” with the FBI. He is described on his website as the founder of something called “the Future Crimes Institute” – its name reminiscent of something from the film Minority Report. He is also the chair for Policy, Law, and Ethics at Silicon Valley’s Singularity University, an incubator/think-tank that touts its members as a “community of change agents”.

His book reads like notes for a hard-boiled detective novel set in some near-future dystopia. Goodman spends hundreds of pages describing what he sees as the cybergalactic battle taking place all around us, mostly invisibly. He catalogues hundreds of real and hypothetical skirmishes, ambushes and manoeuvres involving criminals, technocrats, corporations and governments. Caught in the middle, we hapless Netizens blithely post to Facebook, check our smartphones every thirty seconds, hand over all sorts of personal information without a second thought, and leave trails of unprotected data everywhere we go.

Goodman’s message is that what we don’t know will hurt us:

As I look toward the future, I am increasingly concerned about the ubiquity of computing in our lives and how our utter dependence on it is leaving us vulnerable in ways that very few of us can even begin to comprehend. The current systemic complexities and interdependencies are great and growing all the time. Yet there are individuals and groups who are rapidly making sense of them and innovating in real time, to the detriment of us all.

The dark side of internet-enabled innovation makes for spooky storytelling, and Goodman plays it for all it’s worth. Who are these sinister figures who have taken Silicon Valley’s promised “techno-utopia”, as Goodman calls it, and perverted it to their own ends? Almost everyone, according to Future Crimes. Better to ask who isn’t out to get us or, more to the point, our data. This is where Goodman’s footing feels firmest. We are profligate with our data, and we don’t even seem to care half the time.

Corporations want it. Google, Facebook and other corporations offer free services in exchange for intimate and monetizable knowledge of our lives, experiences and connections. Governments want it. They surveil us in the name of protection, as Snowden’s revelations made clear. Terrorists want it. One especially chilling episode in Future Crimes details how the masterminds of the 2008 Mumbai attacks used mobile phones and internet access to run their team of assailants remotely and maximize their deadliness. And cyber spies and foreign powers want to get our data, too – or skew it for their own ends. In the US, many Americans are haunted by the idea that Russian hackers had a hand in the WikiLeaks release of hacked Democratic emails, perhaps skewing the course of the election. Hackers, it seems, can get to us at the ballot box too.

Criminals, of course, have always had a field day with data – the code to the safe, the details of where you live, the routines of the security guards at the bank. They just have a lot more data to steal now, and lots of virtual unlocked doors to sneak through to get it. Goodman leaves no potential threat unmentioned. Cyberthieves, petty and professional, pilfer passwords and social-security numbers in order to steal identities and drain bank accounts. They buy and sell almost anything – weapons, trafficked people and wildlife, sensitive information – on the black market, and they commit their crimes remotely, across continents and oceans. He conjures visions of a Spectre-like underworld he calls Crime Inc. that brings a high degree of professionalism to illicit multinational dealings. It relies on the Dark Web – the vast portion of online activity Google won’t help you find – to do business.

All the good things about digital life – connection, information, analysis, activism, discovery – are almost invisible here. Our devices don’t connect us, they betray us. Smartphones, so ready to hand, so hard to put down, become “the snitches in our pockets”. Corporations that design our devices, software and ruling algorithms want it that way. The Dyn attack lends support to Goodman’s warning. It is not impossible that someday soon, as we bring more everyday objects online, they too will sell us out: computer-run cars and “smart” homes with networked appliances create security gaps we don’t see. “Once they’ve compromised the coffeepot, they’ve broken the virtual Maginot Line perimeter of your network”, Goodman writes. “From there it’s just a hop, skip, and jump to infect and attack the more secure and profitable devices in your home”, like the “locked-down encrypted laptop” that uses the same network as that traitorous coffee pot. And science-fiction writers might want to take notes on Goodman’s speculations about how even our bodies might betray us, not in the old physical ways but through hacker-vulnerable implants and prosthetics and ingestibles and whatever else we invent to bring ourselves and our machines into greater alignment. Meanwhile, the drone that delivers pizza to your home, or that Amazon Prime impulse buy to your doorstep, could be repurposed by a terrorist to deliver a bomb to any number of targets. That’s just plausible enough to be unsettling, and something one hopes the authorities will keep an eye on. But – at least in America – fellow citizens wielding firearms or spray-painting swastikas and committing hate crimes feel like a much greater and more immediate threat to personal safety and democratic values.

What, in this booby-trapped digital landscape, does taking care look like? Goodman has less to say about what we ought to do about it, other than to caution that we can’t look to the government for help. “Our federal government has effectively abdicated responsibility for protecting its citizenry in our increasingly connected digital world”, he says. When the National Security Agency has its own hacking tools compromised and published on the website Github, as occurred earlier this year, that is all too apparent.

In an appendix, Goodman shares security advice that, after 600 pages of warnings, feels like treating a severed artery with a plaster. Update frequently to take advantage of security patches and software fixes. Use more complicated passwords. Download only from reputable sites. Encrypt your data. Etc. He is right that many of us live in relative ignorance of what happens as we move around online. That message bears repeating. But to invoke Moore’s Law, as Goodman does again and again to convey an exponential increase in technology-fuelled risk, feels dated and a little cheap. (It also does not take into account the many people worldwide still trying to get online in the first place.) Paranoia has its limits as an operating system, and Goodman doesn’t give enough credit to the many efforts under way to educate and protect us. Other commentators make a better case for the need to equip individuals to keep control of their own data. One of the strongest voices belongs to Audrey Watters, who runs the excellent Hack Education website (www.hackeducation.com) and has a keen eye for corporate incursions into our lives. Watters and other “cybercritics” call attention to the unthinking ways in which we hand control of our data over to third parties, and encourage data-independence projects such as A Domain of One’s Own at the University of Mary Washington (http://umw.domains/).

While Goodman keeps his eye on risk, Scott Malcomson trains his on history. In Splinternet: How geopolitics and commerce are fragmenting the World Wide Web, Malcomson traces cyberspace back to “the world of physical power”, specifically military might and national influence. For him, “the Internet’s alternative world was built by people in love with and in fear of machines, who wanted to see how machines might communicate with each other – forming, in a sense, one big machine – and how humans could communicate within and through this machine”. While we worry about governments tracking us, Malcomson makes the case that we wouldn’t have an internet at all without the state. In a sometimes dry but useful recap, he gives a good deal of credit to the “scientific cooperation and innovation on a large scale” created in response to the industrial scale of the First World War.

It is fascinating to read how the National Research Council (NRC), established by President Woodrow Wilson, became, “in effect, the birthplace of the military-industrial complex”. Malcomson describes it as “a formative place and experience for many of the people who would construct and unite the elements of digital computing and the Internet”, including Vannevar Bush, Frank Jewett and Elmer Sperry. (In Malcomson’s account, building the internet was pretty much entirely a man’s game.) The first big challenge involved how to fire ships’ guns accurately. “Naval gunnery was, in essence, a contest for survival between two machine-enhanced humans at a distance”, Malcomson explains, and to win the contest, the military needed control systems “that could very rapidly calculate the relationships among the shifting variables and aim and fire guns accordingly”. The (non-digital) device invented to accomplish that was called a computer – “the first use of the term to describe a machine”. Next came the rise of “a new political system aimed at constant innovation in the service of defense”, including the development during the Second World War of the supercomputer called ENIAC (Electronic Numerical Integrator and Computer), which helped to hone the missile-guidance systems of the Cold War era. (Incidentally, ENIAC’s programmers were women – the “ENIAC girls”.) The launch of the Soviet satellite Sputnik led President Eisenhower to create ARPA, the Advanced Research Projects Agency, in 1958. (“Defense” was added to its name in 1972, hence DARPA.) ARPANET – “the first Internet” – “was built primarily to enable time-sharing on a network of computers, which meant information had to be passed among them”.

Malcomson describes how the urge to create better machines, and to get them to talk to each other, toggled back and forth between the public and private sectors. In the 1960s, computing was still allied to national defence. By the 70s, it had shifted in what he calls “a countercultural direction” as computers went from room-sized to a more individual scale, a downsizing we’re still witnessing. Free-spirited San Francisco became a hub of activity, and the men – still men, in this telling – behind computers embraced hacker culture. Eventually, at the end of the 1980s, Tim Berners-Lee figured out protocols for content that permitted information “to be smoothly delivered to the individual computers that were requesting it”. Hello, internet.

Splinternet does a good job of describing the push–pull among hackers and engineers and researchers, the US government and private research companies and service providers – including Bell Labs and Fairchild Semiconductor, Apple and Microsoft, Google and Facebook – that created our online environment. By 2000, Malcomson says, the balance had shifted back towards the big players. “The subculture had lost the battle. Governments and large corporations would now shape the Internet.”

Yet, ever creative, the counter-culture finds workarounds. Hackers and the privacy-minded “jailbreak” their devices, removing software restrictions imposed by manufacturers, or use VPNs (virtual private networks), which enable users to bypass official, commercial channels and share data more securely. For Irene S. Wu, a senior analyst at the US Federal Communications Commission, commerce has been an enabler, not a handicap, for the internet and the connective technologies that preceded it, at least as far back as the telegraph in the nineteenth century. In Forging Communities: How technology changes politics, Wu considers case studies that illustrate how innovation has enabled activism, agitation and social change. Technology often feels like the organizing principle of the age; in this and previous eras, it has also shown its virtues as an enabler of organized protest and grass-roots action.

Like Malcomson, Wu takes the long view, but she broadens her focus beyond the West. She mentions the role played by Twitter and Facebook during the Arab Spring, and several of her case studies explore episodes that either took place in or focused on Asia. TsunamiHelp, for instance, brought together volunteers who used a blog and wiki (a collaborative website) to share quick-time information after the Boxing Day tsunami of 2004 that killed more than 225,000 people in South and Southeast Asia. A “trust community” of volunteers accomplished what governments and media could not, demonstrating how sharing data can be a great good as well as a great risk.

According to Wu, “The commercialization of communications technology is the foundation of that technology’s usefulness as a political tool”. It is easy and often justifiable to demonize corporations, but those that specialize in tech development have a pecuniary interest in the delivery of easy-to-use solutions and products. The easier a technology is to use, Wu points out, the more easily adopted it tends to be. Facebook and Twitter encourage us to part with our data, but their reach and practicality make them handy for organizers and activists as well as casual users.

Wu’s embrace of commerce as an engine of “trust communities” comes across as glib and too trusting of market-centred interventions. It’s true that where there’s a market, there’s usually a will to serve and exploit it. But Wu neglects the ways in which commerce undercuts independence online, and how prefab platforms and ready-to-go technologies can create what the anthropologist Amber Case calls the “templated self”. Trust communities have a dark side, too. During the recent American election season, many a Facebook page and Twitter feed became a study in the echo-chamber effect, where like-minded groups reinforce their own certainties and shout down opposing views. If there is a market for civilized debate, it is woefully under-capitalized.

Wu also fails to look over her shoulder at the creeping threat posed by government entities’ continued advance on the internet. Instead, she notes that just because a government controls information channels, such as state-run TV channels or websites, this doesn’t mean citizens are compelled to believe the news they are fed. That is true, but not very reassuring. For many content consumers, reality is mostly virtual anyway. That creates a problem scarier than many of Goodman’s scenarios: the rise of fabricated news, which spreads faster via Facebook and other online platforms than a cold virus in a kindergarten classroom. A BuzzFeed analysis from mid-November found that “top fake election news stories generated more total engagement on Facebook than top election stories from 19 major news outlets combined”. As one master of the fake-news genre told the Washington Post: “Honestly, people are definitely dumber. They just keep passing stuff around. Nobody fact-checks anything anymore”. Separating truth from fiction takes time, information literacy, and an open mind, all of which seem in short supply in a distracted, polarized culture. We love to share instantly – and that makes us easy to manipulate.

Our herd instinct complicates the notion of privacy. As Roberto Simanowski makes clear in his compelling Data Love: The seduction and betrayal of digital technologies, it’s hard to keep private something which on other levels we so fiercely desire to collect and share. “One needs to ask why people as citizens insist on a private sphere that they blithely ignore as consumers”, he writes. “Data love – our love for data and its love for us – is the embrace that hardly anyone in this day and age can avoid.”

Simanowski makes an excellent case that the most essential struggle is not with the NSA or Facebook but with ourselves. We are in the middle of a “cold civil war”, he writes, one taking place “not between the citizens but within the citizenry, that is, between the interest in technological progress, orientation, and being noticed on the one hand and, on the other, the occasional sense of discomfort at being the object of surveillance and control”.

Simanowski, too, takes a historical view. But is it any comfort to know that the late Middle Ages experienced “a paradigm shift from qualitative to quantitative understanding”? Not really, unless you find it soothing to think that humanity has been looking for ways to quantify and surveil itself since long before Steve Jobs brought us the iPhone.

Measuring and surveillance are indeed close cousins, and together they lead the hunt for more data. More data becomes Big Data, and Big Data wants to be used. A recent article in the Times Higher Education raised the question of whether university libraries, for instance, ought to use the data they collect about how often students visit to give evidence that increased library use improves academic performance. “You could argue it is unethical for institutions not to use this type of data if they know they can help students gain higher grades or stop them from dropping out”, said a consultant interviewed for the story. Taken far enough, that reasoning becomes as sinister as anything described in Future Crimes.

How to deal with the compulsion to collect and use ever more data is a matter of real urgency, but cybersecurity should be seen in the context of a much bigger set of concerns. (There is one notable omission in all of these books: none of the authors asks what happens to offline experience – or to the nonhuman world – as more and more of our thought and attention revolves around what we do online.) “The question we need to ask”, Simanowski insists at the end of Data Love, “is the one that addresses the message of the medium”: “What is the inner logic of digital media, and how do these media enforce this logic in the social constellation?” Online networking, born of a desire to share as well as count, increases the pull to quantify, and Simanowski identifies a “contemporary imperative of transparency and disclosure” in which there’s no such thing as too much sharing.

How much privacy are we willing to give up to reap the benefits of a networked world? To live digitally is a more complex and ambivalent process than any of these books captures, and there are risks that the authors do not acknowledge – for instance, how to archive and access the public data and cultural knowledge being created in quantities never seen before. At this moment in our digital evolution, though, what worries me most is whether we can find the collective will and the technological capacity to reclaim the internet from those who use it to exploit, control and abuse, whether they are criminals, governments, or white supremacists. It would be a disaster to let this decade spiral into a tech-enriched replay of the 1930s. Fear technology if you must, but fear the people who control it more.

The Post-Truth President and U.S. Credibility


image from lobelog.com

by Paul R. Pillar; see also.
Just when we may have started to hope that the excesses of Donald Trump’s campaign will give way to a more sober and reasonable mode of behavior once in office, the president-elect has a way of lurching back to the familiar excesses, usually with an outburst on Twitter.  This past weekend it was his return to the Big Lie with the accusation that millions of people voted illegally in this month’s election.  It was an assertion so far removed from truth that the New York Times dispensed with journalistic political correctness and described the assertion correctly and accurately in a headline as “baseless.”
Maybe Trump was calculatedly laying groundwork for the enactment at the state level of additional voter suppression laws.  Maybe it was another instance of his using an attention-getting blurt to attract attention away from other matters, such as disarray in his transition operation or conflict-of-interest issues involving his business interests.  More likely it was a less calculated and less controlled lashing out by a notoriously thin-skinned man who abhors losing and has been seeing his losing popular vote margin grow to well over two million votes—without regard to how such a lashing out assists Russian efforts to discredit the workings of American democracy.
There are many other sad things that could be said about the consequences for those workings of having a leader with so little regard for truth, which encourages further entrenchment of falsehood in politics and public affairs.  In this respect Trump is both a symbol and arch-facilitator of a malevolent trend that led the Oxford English Dictionary to make “post-truth” its word of the year.
But consider for the moment one significant consequence for U.S. foreign relations: the greater disinclination of foreign governments and peoples to believe what the United States says. A significant ingredient of the pursuit of U.S. interests abroad is being weakened.  Daniel Drezner has explained part of the problem, citing John Mearsheimer’s research on lying by leaders and how they usually have good reason not to lie to other governments, and how credible commitment is a key component of deterrence.  But it is not just deterrence, and keeping others from doing what we don’t want them to do, that is involved.  Being able to make credible promises, and getting others to do what we want them to do as part of cooperative arrangements, also requires others to believe that one’s leader speaks truthfully and has every intention of following through on positive commitments.  Here Trump’s record of lying complements in the most deleterious way his business record of repeatedly stiffing vendors and sub-contractors—another habit of his that does not appear to be ending.
A fundamental underlying fact about the exercise of U.S. power overseas is that most of the time it is exercised not by the United States directly, physically doing things.  Most of the time its exercise involves other states perceiving the U.S. ability to do certain things and believing it will do those things under certain conditions.  That in brief is why credibility matters.
At stake is not just the reputation of any one occupant of the White House.  The credibility of the U.S. president affects the credibility of the United States.  And the perceptions that matter are those held not only by foreign governments but also by foreign publics.  A reputation for lying by the person at the top exacerbates what are already widespread and unhelpful tendencies of many people overseas not to believe what the United States says are its reasons for its actions overseas.  This is especially a problem in the Muslim world; in this instance with Trump, the deleterious complementarity is between his lying and his Islamophobia.
The threat to U.S. credibility involved here is far more real than the supposed threat that often is posited: that if the United States does not immerse itself in this or that conflict that is peripheral to its interests, then other governments will not believe that the United States will stand up for its interests elsewhere.  That is not how governments calculate credibility. U.S. credibility depends not on intervening in what is peripheral but instead on U.S. leaders being believed when they say something is vital.
This article was first published by the National Interest and was reprinted here with permission. Copyright The National Interest. Image courtesy of Thomas Guest via Flickr.

Teaching 1984 in 2016


via AA on Facebook

Every year, one high-school educator converts his classroom into a totalitarian state to teach George Orwell’s book. This year, the lesson feels different.

image from article

Andrew Simmons, The Atlantic

My classroom becomes a totalitarian state every school year toward the end of October. In preparation for teaching 1984 to seniors, I announce the launch of a new program aimed at combating senioritis, a real disease with symptoms that include frequent unexplained absences, indifferent reading, and shoddy work. I tell each class that another class is largely to blame for the problem and require, for a substantial participation grade, that students file daily reports on another student’s work habits and conduct; most are assigned to another student in the same class.

We blanket the campus in posters featuring my face and simple slogans that warn against the dangers of senioritis and declare my program the only solution to the school’s woes. Last year, my program was OSIP (Organization for Senior Improvement Project); this year, it’s SAFE (Scholar Alliance For Excellence). We chant a creed at the start of each class, celebrate the revelatory reports of “heroes” with cheers, and boo those who fail to participate enthusiastically. I create a program Instagram that students eagerly follow. I occasionally bestow snacks as rewards.

After a week, new posters (and stickers) speak less to senioritis and more to, well, me. The new slogans are simpler: my name, mostly. My image is everywhere. I change the rules, requiring students to obtain more points in order to pass. I restrict previously granted privileges, like the right to leave the room to use the bathroom. I subtract points for subjectively noted lapses in conviction. I fabricate a resistance movement and vow to stamp out the ignorant opposition to our noble cause.

Occasionally, a kid groans in exasperation and I fix him with a long, nasty, meaningful look. If a student asks about the point of it all, I ask him why no one else seems to have the same concern. I get louder. I get meaner. I give students points for alerting me to the sources of dissent. Eager to shore up their grades, gleeful at the chance to tweak friends and possibly enemies, a few students furtively hand over notes after classes. I collect the reports two weeks after they start the book, pronounce the experiment over (with language paying tribute to Orwell’s telling appendix), and ask them what they learned.

The simulation is my favorite activity of the year. This year, it feels a little different than usual. “Make School SAFE Again,” reads the students’ main slogan for their campaign this year, which launched two weeks ago. Other posters employ a comically primitive vocabulary arranged in brutally simple syntax:

Senioritis is a Disaster.

Senioritis is Disgusting.

Senioritis is Sad.

Senioritis is Shameful.

This year, I plumb the depths of the iffy performance instincts I honed in my high-school theater classes to attempt an increasingly belligerent swagger—a departure from my usual grinning cult leader shtick. Rampant senioritis is a problem, I warn, squinting and jabbing with a finger. I’m gonna stop it, I say. My antics and governing strategy highlight hallmarks of the superstate Oceania in 1984: An effective message should be simple, relentless, and inescapable; lies can become truths when listeners can’t conceive of alternatives; threats against free speech dampen resistance; fear of personal injury inhibits solidarity among citizens; scapegoating divides the populace; political enemies and those offering rational critical responses to tyranny are demonized. Evaluating these tactics is particularly important because my students live in a society in which they can, I believe, work spectacularly well.

Tremendously well.

The 1984 unit always reflects what’s going on in the country and world. The past few years, my classes have studied the NSA, the Patriot Act, and online privacy. Right now, some of my students are afraid that their world may start to feel more like the one they’re reading about in the book and experiencing in my classroom. My school is 65 percent Latino. The white kids tend to be liberal. I teach in Marin County, in the San Francisco Bay Area. Still, Latino students arrived on the morning of November 9, some from Richmond, across the bay, and reported that a handful of white people interrupted their morning commutes to urge them to “go back to Mexico.” I live in Oakland, and at about 12:30 a.m. on that Wednesday, a few men who appeared to be drunk and white staggered down my street, shooting extremely powerful fireworks into the sky above my house, yelling that this was “the greatest day in the history of America.”

I came to school on Wednesday, but a lot of students didn’t. Many of those who came said they were afraid, confused, angry, and anxious. Many Latino students tell me of ICE raids that happened in their neighborhoods a decade ago. Several recall seeing their fathers handcuffed and thrown into vans by armored, helmeted officers. Some did not see them for years. A few kids have done time in private immigration detention centers. Many have family members who came to the United States from countries abused by corrupt regimes. Maybe the kids don’t understand how America’s government works. Don’t they know that there are checks on executive power and that campaign bravado—even the cruelest sort—doesn’t necessarily follow the country’s elected presidents into the White House? Maybe the kids—some of them gay, many of them immigrants, most of them young women—worry in a histrionic, sky-is-falling fashion because they’re less touched by “real-world” concerns than adults, who, of course, know better.

Their sense of the significance of the occasion and their expression of their concerns should not be dampened. Instead, it should be viewed as the ultimate teachable moment.

Every year, when I ask students what they learned from the class simulation—my artificial teachable moment—they say they realize that loyalty isn’t as ironclad as it should be. They didn’t question why they had to spy on fellow seniors; they chuckled about the task but they did it anyway. I had a points-hungry go-getter two years ago who eagerly filed detailed supplemental reports on her boyfriend.

Students say they learned how quickly a mission supposedly for the greater good can take an unpalatable detour. They admit that they did not always immediately grasp swift changes to previously outlined rules. They admit that they followed me on Instagram without considering the risk of letting an authority figure (were he so inclined) glimpse their personal lives. They realize that they didn’t ask for details about my plan to eliminate senioritis—that they formed no serious opposition, that they just grimaced when my back was turned and whined lightly in isolation. They cared about their grades, they admit, and they thought I was funny, so they did as they were told.

For these reasons, they always fail the simulation. From their performance, they learn a lesson about their weaknesses. This is also a key lesson from 1984. Understanding it can inform their response to the direction their country might be headed.

A good teacher does not try to firm up an ideological resistance along partisan lines. Instead, a good teacher shows students how to discern clickbait from reported stories and to read both Breitbart and The New York Times, not to keep a balanced personal perspective so much as to examine how media outlets interpret and spin events. In an age when fact-checkers can provide guidance in real time and the internet swells with more information than a person can actually take in, students need to be able to read more than captions and watch clips longer than 10 seconds.

Analytical, communication, and attention deficits are a problem of education but also a social environment that has steadily required less in the way of written and verbal communication, as well as an entertainment industry that has provided content—shorter, faster, brighter, simpler—to suit that shift. Students need to hack through manipulative language, whether it be a bill’s obtuse legalese concealing bigotry or stark campaign declarations loaded with ugly connotations. They need to see books as rich, perpetual gifts to those in need of solace and inspiration, and to know that their fears have been addressed before, in more dire circumstances, and that thinkers from the past can help them anticipate the new guises of the terrors they faced.

I am ecstatic to be a teacher at this time in American history. I have a responsibility—not to transform every liberal parent’s progeny into a slightly sharper copy or radicalize future voters skeptical of politics, but to shore up their critical faculties, to make them more skilled readers, writers, and thinkers. And to also make them decent, compassionate, alert, engaged truth-seekers, neither callous, fearful Party enablers nor complacent, dead-eyed Proles who poke their iPhones and scoff at memes and chirp their discontent in brief blips of coherence. Bravery is something that people can be taught. Books may be the best teachers for what to do when the fireworks veer too close. They show students how to write their own appendix to a sad chapter that feels final. My 12th-grade classes are reading 1984. And, in an essay for another day, my ninth-grade class is halfway through To Kill a Mockingbird. Former high-schoolers who did their reading probably don’t need that story’s relevance explained.

In December 2015, a student reacted angrily when I wondered if the average social-media-enthralled 17-year-old in 2015 might not possess the reading and writing proficiency of her 1965 counterpart. I was asking students if, as with the Newspeak-besieged citizens of Oceania in 1984, a struggle to unravel and communicate complex ideas could result in the gradual erosion of those ideas themselves. It’s just different now, not worse, the student said. With the bell, 10 minutes later, she breezed toward the door. Over her shoulder, she shouted, sprightly and confident, that classes shouldn’t have to read 1984. It was too long, too confusing, and too full of words no one used anymore. Nothing that has happened in the past 365 days has made me more afraid and emboldened than that.

The Two Americas of 2016 - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."

For many Americans, it feels as if the 2016 election split the country in two.
To visualize this, we took the election results and created two new imaginary nations by slicing the country along the sharp divide between Republican and Democratic Americas.

Trump’s America

[Map: "Trump's America" drawn as a single landmass covering most of the country, with Clinton-voting areas rendered as lakes, bays and seas (e.g. "Chicago Sea," "Acela Channel," "Los Angeles Bay") amid the "Great American Plains." By Tim Wallace/The New York Times]
Geographically, Donald J. Trump won most of the land area of the United States. A country consisting of areas he won retains more than 80 percent of the nation’s counties.
While Trump country is vast, its edges have been eroded by coastal Democrats, and it is riddled with large inland lakes of Clinton voters who were generally concentrated in dense urban areas.

Clinton’s America

[Map: "Clinton's America" drawn as an archipelago of cities, college towns and coastal regions (e.g. "New York," "Midwest Isles," "Carolina Islands") surrounded by the "Great American Ocean" of Trump-voting territory. By Tim Wallace/The New York Times]
Hillary Clinton overwhelmingly won the cities, like Los Angeles, Chicago and New York City, but Mr. Trump won many of the suburbs, isolating the cities in a sea of Republican voters.
Mrs. Clinton’s island nation has large atolls and small island chains with liberal cores, like college towns, Native American reservations and areas with black and Hispanic majorities. While the land area is small, the residents here voted for Mrs. Clinton in large enough numbers to make her the winner of the overall popular vote.

Land Area

Clinton's America: 530,000 square miles (15%)
Trump's America: 3,000,000 square miles (85%)

Population

Clinton's America: 174 million (54%)
Trump's America: 148 million (46%)

Popular Vote (as of Friday, Nov. 18; percentages are for Trump and Clinton votes only and exclude other candidates)

For Clinton: 62.1 million (50.5%)
For Trump: 61.0 million (49.5%)
Note: The illustrations are based on an analysis of county-level voting data to determine where a dividing line between areas that voted Democratic and Republican would fall.
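
To make that note concrete, here is a minimal sketch of how such a county-level split could be tallied. It is an illustration only, not the Times' code or methodology: the file name county_results_2016.csv and its columns (votes_dem, votes_rep, land_area_sq_mi, population) are hypothetical stand-ins for a public dataset of county-level returns.

# Minimal sketch (hypothetical data layout, not the Times' methodology):
# assign each county to whichever candidate won more votes there, then
# total land area and population for each resulting "nation".
import csv

totals = {
    "Clinton's America": {"land": 0.0, "pop": 0},
    "Trump's America": {"land": 0.0, "pop": 0},
}

with open("county_results_2016.csv", newline="") as f:  # hypothetical file
    for row in csv.DictReader(f):
        winner = ("Clinton's America"
                  if int(row["votes_dem"]) > int(row["votes_rep"])
                  else "Trump's America")
        totals[winner]["land"] += float(row["land_area_sq_mi"])
        totals[winner]["pop"] += int(row["population"])

land_total = sum(t["land"] for t in totals.values())
pop_total = sum(t["pop"] for t in totals.values())
for nation, t in totals.items():
    print(f"{nation}: {t['land']:,.0f} sq mi ({t['land'] / land_total:.0%}), "
          f"{t['pop'] / 1e6:.0f} million people ({t['pop'] / pop_total:.0%})")

Run against real returns, that aggregation yields shares close to the figures above; actually drawing the dividing lines on the maps would additionally require the counties' geographic boundaries.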