Sunday, December 31, 2017

Murrow's Cold War


Renee Earle, American Diplomacy


Murrow's Cold War: Public Diplomacy for the Kennedy Administration (2016) by Gregory M. Tomlin. University of Nebraska Press: Potomac Books. ISBN 978-1-61234-771-4. 400 pp., 12 illustrations. Hardcover, $34.95.

For many Public Diplomacy practitioners, the three years that Edward R. Murrow headed the U.S. Information Agency represent the golden years, a time when Public Diplomacy was there at the take-off and not only at the crash landing, as Murrow famously said. [See *** JB note below.] The seat at the policy-decision table was not easily won, however, even for the likes of Murrow. Gregory Tomlin’s engrossing book takes us through this period with thorough research and interesting insights into the policies and events, both domestic and international, of the Kennedy Administration and the role of public information in the evolution of these events. The book is as much about USIA and the conduct of public diplomacy as about Murrow himself, and, in focusing on this period in Edward R. Murrow’s life, Tomlin addresses a fundamental question concerning Public Diplomacy, then as today: is Public Diplomacy a nice add-on or an important element in determining outcomes in the conduct of our foreign policy?


With an excellent introduction tracing the evolution of the USIA and the many debates surrounding the agency’s mission, effectiveness, and the overall value of considering foreign public opinion when determining policy, Tomlin sets the stage for Murrow’s arrival at USIA in 1961. He examines this rarely explored period in Murrow’s life, depicting the man and the professional, and paints an accessible picture of the often enigmatic Murrow. During his term as USIA director, Murrow is shown as interested not only in policy and programs, but also in the workings and people of his agency. At the time, most exchange programs were still with the State Department, and USIA’s core programs were in information (radio, film, and libraries, followed by TV). Tomlin expertly describes Murrow’s struggle to tell USIA’s own story so as to convince Washington of the value and contributions of its information activities, a task the author rightly judges as often more difficult than conducting the effective global propaganda with which the agency was charged.

Tomlin masterfully presents the political backdrop of the momentous years of the Cold War, from Cuba to the Berlin Wall and Vietnam. He retells historical events vividly with enough but not overwhelming detail, facilitating the reader’s entry into the many simultaneous challenges facing the Kennedy Administration and USIA. With numerous examples, Tomlin describes how public opinion can become a potential, even decisive, game changer that leads to policy change, and argues for the importance of getting public diplomacy right. Getting it right, Tomlin posits, includes bringing in PD early and the coordination among USG agencies that Murrow worked so hard to achieve. To demonstrate, Tomlin contrasts two telling examples of when it works and when it doesn’t—the Bay of Pigs and the Cuban Missile Crisis.

The author highlights several elements which Murrow insisted were essential to successful PD. The importance of understanding audiences through consistent research informed Murrow’s strong recommendation that the U.S. not resume nuclear testing. U.S. policy was caught between international public opinion that indicated fear of literal and figurative nuclear fallout and the need to show strength to domestic hardliners as well as to NATO allies who feared, in turn, that U.S. capability might not be strong enough in a stand-off with the USSR. The author describes the accompanying dilemma for the U.S. Information Agency as it worked to address both issues while still heeding one of Murrow’s primary tenets, that U.S. information be truthful.

The book often returns to Murrow’s deep understanding of the value of the human dimension in successful Public Diplomacy. He grasped, perhaps like no other USIA director, that communication is furthered when one understands the interlocutor or audience and crafts a message in that context. He further understood the irreplaceable contribution that direct contact makes to this effort, giving rise to his perhaps apocryphal comment that in effective persuasion, it’s the last three feet that count. Our new “interconnectedness” through the Internet—and, in Murrow’s day, television—is no substitute. The human dimension infused USIA’s exploitation of the U.S. space program. Tomlin offers a captivating account of the origins of this effort and the mostly positive public opinion it garnered. These were also the heady days of the birth of USAID and the Peace Corps. Murrow saw clear links between the missions of these two agencies and the USIA’s ability to project a better image of the U.S.

The author addresses the fundamental question of why the USG needs an information capability at all when we have CNN, a question as pertinent today as in Murrow’s time. Despite the increasingly media-rich environment, which reached new heights first through television and now with the Internet, it was not evident to everyone that the USG needed an energetic source of authoritative and trusted information in this mix of voices. Tomlin rightly highlights the continuous struggle for adequate financing of the USG information effort in this increasingly competitive world of information dissemination. As Tomlin points out, already in Murrow’s time “the days were over when USIA films would draw thousands into commercial theaters.” The author presents illustrative examples of Murrow’s uphill battle with Congress, and his tremendous efforts to supplement USIA’s own meager resources by enlisting the help of the private sector. Murrow was forced to persuade a German printing firm to print pamphlets about the Berlin Wall, since the U.S. Air Force would transport them from the U.S. only if it was reimbursed.

Tomlin devotes a chapter to Murrow’s clear understanding of the lingering importance of race in America in presenting a credible image of the U.S. as the global megaphone for democracy and freedom. This domestic issue, ignored by many foreign-policy makers, dogged the USIA as much as, if not more than, most foreign policy issues. Here again the author sheds new light on an issue largely ignored in Murrow biographies. Realizing that African diplomats considered the U.S. a “hardship post” given their treatment in the U.S. during the segregation years, Murrow understood immediately how much this tarnished the U.S.’s credentials in fighting the Communist system throughout the Cold War. Well exploited by Communist propagandists, images of racial strife in the U.S. quickly made it around the world.

Race in America continues to be an issue for our Public Diplomacy today—and not only in Africa—and USIA’s film about segregation in the U.S., “The March,” still captivates embassy audiences worldwide. Criticized by many at the time, the film is one of many examples of Murrow’s quest for balance and insistence on the truth, as he realized that exclusively “positive” pictures of America would instantly be received as propaganda and thereby be counterproductive. Moreover, we learn that Murrow’s insistence on raising the issue with Kennedy was not simply a matter of good PR, but also a moral one for him. The author credits Murrow’s efforts with affecting the pace at which the Administration addressed civil rights and credits USIA and its information campaigns with Khrushchev’s realization that confrontation with the U.S. would not advance his goals in Latin America.

For many past and present PD practitioners, the challenges Edward R. Murrow faced as director of USIA will read like déjà vu. They will recognize their own experience in Murrow’s struggles, as Public Diplomacy is often still held to a higher standard of proof of effectiveness than other foreign affairs activities, requiring investment of already limited time and resources. And Public Diplomacy too often remains an afterthought. Thanks to Tomlin’s detailed account of PD in Murrow’s day, it becomes apparent that, while tools must change with the times, Murrow’s principles of successful PD—gaining credibility by telling the truth and understanding audiences through direct contact—stand and should be emulated still.


Tomlin offers a highly readable account of Edward R. Murrow’s role in a tumultuous time of U.S. political history and the development of America’s information dissemination capabilities. It will be of interest to professionals and the more casual reader alike. I highly recommend it.


Renee M. Earle is a retired Public Diplomacy Foreign Service Officer with the rank of Minister-Counselor. She served at U.S. Embassies in Turkey, USSR/Russia, Kazakhstan, the Czech Republic, and France, and at the U.S. Mission to the European Union in Brussels. Domestic positions with the Department of State included Diplomat-in-Residence at Duke University in North Carolina, Acting Office Director of Public Diplomacy in the European Bureau, and Chief of the Central Asia Division of the Voice of America, where she directed the Pashto, Dari, Farsi, Uzbek, Azeri, and Turkish language services.
***
JB comment:  What Murrow actually said, according to Google search citing the Tomlin book:
"Frustrated and angry, Murrow complained: 'Dammit, if they want me in on the crash landing, I'd better damned well be in on the take-off!' (67). Regrouping after the Bay of Pigs fiasco, USIA scored mixed successes in promoting economic development in Latin America, but its efforts foundered as poverty and inequality in."
****
From: "The Evolution of American Public Diplomacy: Four Historical Insights By Seth Center, Ph.D. Office of the Historian, U.S. Department of State"
"Throughout the Cold War, the PD apparatus was a regular target of reform studies, and its budgets were under constant scrutiny. Public diplomatists wrestled with the balance between unapologetic messaging and building two-way bridges through intercultural communication. The United States Information Agency (USIA) rarely had a “seat at the table” in policy deliberations. It was after all Edward R. Murrow, USIA’s most famous director, who lamented that if PD was expected to be in on the crash landing it should also be in on the takeoff. When Americans did pay attention to PD, it was often with over-inflated expectations. Many Americans--including Presidents and Congressmen--could not comprehend how information seemed incapable of blunting anti-Americanism abroad and building sympathy for U.S. policies."
"In on the take-offs and not just the crash landings"

Scholars and practitioners of public diplomacy often attribute this phrase to former USIA Director Edward R. Murrow, who indeed used it to make the case for putting USIA at the policymaking table during the Kennedy Administration. The phrase was used earlier by Senator Arthur H. Vandenberg in calling for bipartisanship during the Truman Administration. Vandenberg attributed the phrase to Harold Stassen.

"As Murrow saw it, the important thing was not that the USIA Director, as a member (sic) of the National Security Council, should argue for or against policy on psychological grounds. It was that he should be informed in advance of policies in the making, and take part in their formulation. As he frequently stated it, the USIA should be "in on the take-offs, and not just the crash landings," like that of the U-2 spy plane shot down in Siberia." Alexander Kendrick, Prime Time: The Life of Edward R. Murrow, 1969, p. 456.

". . . I don't care to be involved in the crash-landing unless I can be in on the take-off. Harold Stassen, comment on bipartisanship, attributed to him by Senator Arthur H. Vandenberg." Suzy Platt, ed. Respectfully Quoted: A Dictionary of Quotations Requested from the Congressional Research Service, 1989, pp. 260-261.

"Once again, it was the procedure of a hurried call to senators and a last-minute meeting to inform them of an impending development or of the execution of a policy, and not to consult on the formation of policy. Vandenberg then and thereafter insisted that real bipartisanship meant consultation in advance and not a perfunctory reading to legislators of an impending press announcement or policy statement . . . Stassen's comment, the Senator used to say, was such a good statement of the Republican case that he wished it were his." Arthur H. Vandenberg, Jr., ed. The Private Papers of Senator Vandenberg, 1952, p. 230.

How Google’s Quantum Computer Could Change the World


Jack Nicas, Wall Street Journal

The ultra-powerful machine has the potential to disrupt everything from science and medicine to national security—assuming it works


Excerpts from the article:
"Classical computers, like your laptop or phone, store and process information using bits, which have a value of either 1 or 0. Bits are represented by tiny electrical circuits called transistors that toggle between on (1) and off (0). To your iPhone, every finger tap, selfie and Rihanna hit is simply a long sequence of ones and zeros.

Quantum bits, or qubits, use superposition to exist in both states at once—effectively one and zero at the same time. In a classical computer, bits are like coins that display heads or tails. Qubits, on the other hand, are like coins spinning through the air in a coin toss, showing both sides at once. ...
Documents leaked by former NSA contractor Edward Snowden in 2013 showed that the NSA is building its own quantum computer as part of an $80 million research program called Penetrating Hard Targets, according to the Washington Post. ...
Because particles lose superposition with the slightest interference, quantum computers must be radically isolated from the outside world. ...
Neven’s team in Southern California is racing to finish the 49-qubit chip that they hope will carry them to quantum supremacy and into a new frontier of technology, where computers leverage unthinkably complex natural laws rather than converting the world into ones and zeros."
*** 
Hartmut Neven believes in parallel universes. On a recent morning outside Google’s Los Angeles office, the 53-year-old computer scientist was lecturing me on how quantum mechanics—the physics of atoms and particles—backs the theory of a so-called multiverse. Neven points to the tape recorder between us. What we’re seeing is only one of the device’s “classical configurations,” he says. “But somewhere, not perceived by us right now, there are other versions.” According to Neven, this is true for not just tape recorders but all physical objects. “Even for systems like you and me,” he says. “There is a different configuration of all of us in a parallel universe.”


Neven, who speaks with a thick German accent and favors pink Christian Louboutin sneakers covered in spikes, has led some of Google’s most groundbreaking projects, from image-recognition software to Google Glass, a consumer flop that pioneered the idea of head-worn computers. The task in front of him is the most complex of his career: Build a computer based on the strange laws of quantum mechanics.
There is no quick explanation of quantum mechanics, but the Cliffs Notes version goes something like this: Scientists have proved that atoms can exist in two states at once, a phenomenon called superposition. A single atom, for example, can be in two locations at the same time.
Superposition gets even stranger as it scales. Because everything is made of atoms, some physicists theorize that entire objects can exist in multiple dimensions, allowing—as Neven suggested—for the possibility of parallel universes.
Even Albert Einstein couldn’t get his head around this. The Nobel Prize-winning physicist declared the thinking behind quantum mechanics to be fundamentally flawed. Scientists have since proved the theory repeatedly and conclusively.
Inside Google’s Santa Barbara, Calif. lab, where the company’s delicate quantum chips sit frozen in a cryostat suspended off the floor. PHOTO: SPENCER LOWELL FOR THE WALL STREET JOURNAL
These laws are behind the next revolution in computing. In a small lab outside Santa Barbara, Calif., stocked with surfboards, wetsuits and acoustic guitars, Neven and two dozen Google physicists and engineers are harnessing quantum mechanics to build a computer of potentially astonishing power. A reliable, large-scale quantum computer could transform industries from AI to chemistry, accelerating machine learning and engineering new materials, chemicals and drugs.
“If this works, it will change the world and how things are done,” says physicist Vijay Pande, a partner at Silicon Valley venture firm Andreessen Horowitz, which has funded quantum-computing start-up Rigetti Computing.
Others, especially those in academia, take a more nuanced view.
“It isn’t just a faster computer of the kind that we’re used to. It’s a fundamentally new way of harnessing nature to do computations,” says Scott Aaronson, the head of the Quantum Information Center at the University of Texas at Austin. “People ask, ‘Well, is it a thousand times faster? Is it a million times faster?’ It all depends on the application. It could do things in a minute that we don’t know how to do classically in the age of the universe. For other types of tests, a quantum computer probably helps you only modestly or, in some cases, not at all.”
For nearly three decades, these machines were considered the stuff of science fiction. Just a few years ago, the consensus on a timeline to large-scale, reliable quantum computers was 20 years to never.
“Nobody is saying never anymore,” says Scott Totzke, the chief executive of Isara Corp., a Canadian firm developing encryption resistant to quantum computers, which threaten to crack current methods. “We are in the very, very early days, but we are well past the science-fiction point.”
Companies and universities around the world are racing to build these machines, and Google, a unit of Alphabet Inc., appears to be in the lead. Early next year, Google’s quantum computer will face its acid test in the form of an obscure computational problem that would take a classical computer billions of years to complete. Success would mark “quantum supremacy,” the tipping point where a quantum computer accomplishes something previously impossible. It’s a milestone computer scientists say will mark a new era of computing, and the end of what you might call the classical age.
Google’s 64-square-millimeter chips are currently the most advanced general-purpose quantum computers in the world. PHOTO: SPENCER LOWELL FOR THE WALL STREET JOURNAL
Classical computers, like your laptop or phone, store and process information using bits, which have a value of either 1 or 0. Bits are represented by tiny electrical circuits called transistors that toggle between on (1) and off (0). To your iPhone, every finger tap, selfie and Rihanna hit is simply a long sequence of ones and zeros.

Quantum bits, or qubits, use superposition to exist in both states at once—effectively one and zero at the same time. In a classical computer, bits are like coins that display heads or tails. Qubits, on the other hand, are like coins spinning through the air in a coin toss, showing both sides at once.
That dynamism allows qubits to encode and process more information than bits do. So much more, in fact, that computer scientists say today’s most powerful laptops are closer to abacuses than quantum computers. The computing power of a data center stretching several city blocks could theoretically be achieved by a quantum chip the size of the period at the end of this sentence.
That potential is a result of exponential growth. Adding one bit negligibly increases a classical chip’s computing power, but adding one qubit doubles the power of a quantum chip. A 300-bit classical chip could power (roughly) a basic calculator, but a 300-qubit chip has the computing power of two novemvigintillion bits—a two followed by 90 zeros—a number that exceeds the atoms in the universe.
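[A minimal, purely illustrative Python sketch — not from the article — of the doubling arithmetic described above; the 300-qubit comparison is just 2 raised to the 300th power:]

# Illustrative sketch only (not from the article): the state space that
# describes n qubits in superposition doubles with each additional qubit,
# just as n classical bits can take one of 2**n values at a time.

def state_space(n):
    """Number of amplitudes needed to describe n qubits in superposition."""
    return 2 ** n

for n in (1, 2, 10, 300):
    size = state_space(n)
    print(f"n = {n:>3}: 2**{n} has {len(str(size))} decimal digits")

# 2**300 is about 2.04e90 -- "a two followed by 90 zeros," as the article puts
# it -- versus a commonly cited estimate of roughly 1e80 atoms in the observable universe.
print(f"2**300 is approximately {float(2 ** 300):.2e}")

[As the article cautions further on, this exponential state space does not mean a quantum computer simply tries all 2**300 answers at once; the advantage applies only to problems for which suitable quantum algorithms exist.]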

QUANTUM VS. CLASSICAL COMPUTERS

Classical computers run on bits, units of information that are either one or zero. Quantum computers use qubits, which can be both one and zero at the same time. This allows qubits to process far more information than bits for specific tasks—particularly when combined. Each additional qubit doubles a quantum computer’s power, and this exponential growth creates a dramatically more powerful machine at scale.
But this sort of comparison works only for specific computational tasks. Comparing bits to qubits is facile because quantum and classical computers are fundamentally different machines. Unlike classical computers, quantum computers don’t test all possible solutions to a problem. Instead, they use algorithms to cancel out paths leading to wrong answers, leaving only paths to the right answer—and those algorithms work only for certain problems. This makes quantum computers unsuited for everyday tasks like surfing the web, so don’t expect a quantum iPhone. But what they can do is tackle specific, unthinkably complex problems like simulating new molecules to engineer lighter airplane parts, more effective drugs and better batteries.
Quantum computers are also subject to high error rates, which has led some scientists and mathematicians to question their viability. Google and other companies say the solution is error-correction algorithms, but those algorithms require additional qubits to check the work of the qubits running computations. Some experts estimate that checking the work of a single qubit will require an additional 100.
Confused? You’re in good company. In a recent interview with “WSJ. Magazine,” Microsoft Corp. co-founder Bill Gates said the company’s quantum-computing project is “the one part of Microsoft where they put up slides that I truly don’t understand.”
Richard Feynman, a Nobel Prize-winning theoretical physicist, put it this way: “I think I can safely say that nobody understands quantum mechanics.”
Feynman was one of the first to introduce the idea of a quantum computer. In a 1981 lecture, he said simulating physics would require a computer based on nature or quantum mechanics. “Nature isn’t classical, damn it,” he said. “If you want to make a simulation of nature, you’d better make it quantum mechanical.”
For the next two decades, researchers tried and failed to create the machines Feynman envisioned. Qubits proved extremely fragile and fickle. They could maintain superposition—the state that enables their massive computing power—for just a few nanoseconds, or billionths of a second. And an almost imperceptible temperature change or even a single molecule of air could knock them out of that state.
“It’s a bit like trying to balance an egg at the end of a needle,” IBM’s quantum-computer scientist Jerry Chow said in a speech. “You certainly can do it, but any little disturbance from noise, from heat, from vibrations, and you’ve suddenly got yourself a sunny-side up.”
In the past five years, scientists have made major progress on that balancing act. In response, investment has surged, with projects under way at Google, Microsoft, IBM and Intel Corp., and interest from potential customers has followed.
Volkswagen AG is testing quantum computers made by Canadian firm D-Wave Systems Inc. In March, the companies said that, using GPS data from 10,000 taxis in Beijing, they created an algorithm to calculate the fastest routes to the airport while also minimizing traffic. A classical computer would have taken 45 minutes to complete that task, D-Wave said, but its quantum computer did it in a fraction of a second.
This makes it sound like D-Wave has won the race, but the company’s $15 million 2000Q model is useful only for a narrow category of data analysis, which includes the Volkswagen test. While the 2000Q has 2,000 qubits—a figure scientists warn shouldn’t be compared with general-purpose quantum computers like Google’s—the machine hasn’t achieved quantum supremacy. D-Wave President Bo Ewald says the 2000Q isn’t designed to get the best answer, but rather a “good enough answer in a short period of time.”
Not everyone is eager for large-scale, accurate quantum computers to arrive. Everything from credit-card transactions to text messaging is encrypted using an algorithm that relies on factorization, or reverse multiplication. An enormous number—several hundred digits long—acts as a lock on encrypted data, while the number’s two prime factors are the key. This so-called public-key cryptography is used to protect health records, online transactions and vast amounts of other sensitive data because it would take a classical computer years to find those two prime factors. Quantum computers could, in theory, do this almost instantly.
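[A toy Python sketch — purely illustrative, not from the article, and not real cryptography — of the lock-and-key idea above: the published lock is the product of two primes, the secret key is the primes themselves, and recovering the key classically amounts to factoring, which becomes hopeless as the numbers grow to several hundred digits. Shor’s quantum factoring algorithm, not named in the article, is the reason quantum computers could in principle do this quickly.]

# Toy illustration only -- not real cryptography. The "lock" is the product of
# two primes; the "key" is the primes themselves. Real public-key systems use
# primes hundreds of digits long, far beyond brute-force trial division.

def make_lock(p, q):
    """Publish only the product; keep the prime factors secret."""
    return p * q

def crack_lock(n):
    """Recover the factors by trial division -- feasible for toy numbers,
    utterly impractical for the several-hundred-digit numbers used in practice."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    raise ValueError("no nontrivial factors found")

lock = make_lock(10007, 10009)   # two small primes standing in for real key material
print("public lock:", lock)      # 100160063
print("recovered key:", crack_lock(lock))

[Breaking a real lock this way would take a classical computer years, as the article notes; a large, error-corrected quantum computer could, in theory, do it almost instantly.]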
Companies and governments are scrambling to prepare for what some call Y2Q, the year a large-scale, accurate quantum computer arrives, which some experts peg at roughly 2026. When that happens, our most closely guarded digital secrets could become vulnerable.
Last year the National Security Agency issued an order that U.S. national-security employees and vendors must, “in the not-too-distant future,” begin overhauling their encryption to guard against the threat posed by quantum computers. Because national-security information must be protected for decades, the agency says new encryption needs to be in place before these machines arrive. Otherwise, the NSA warns, code-breaking quantum computers would be “devastating” to national security.
Governments aren’t just playing defense. Documents leaked by former NSA contractor Edward Snowden in 2013 showed that the NSA is building its own quantum computer as part of an $80 million research program called Penetrating Hard Targets, according to the Washington Post. It’s unclear how far the NSA has gotten in its quest. The agency declined to comment.
The primary impetus in the race for quantum computers is the potential to upend industries. Experts believe their biggest near-term promise is to supercharge machine learning and AI, two rapidly growing fields—and businesses. Neven of Google says he expects all machine learning to be running on quantum computers within the decade.
This commercial race heated up considerably earlier this year. In May, IBM unveiled a chip with 16 qubits, a milestone for general-purpose quantum computers. The day before, a trade group published an interview with John Martinis, Google’s head of quantum hardware, in which he let slip that Google had a 22-qubit chip.
A look under the microscope at Google’s 64-square-millimeter chip reveals a sly bit of branding: the company’s name spelled out along the bottom. PHOTO: SPENCER LOWELL FOR THE WALL STREET JOURNAL
Today Google’s chips sit frozen inside elaborate vats called cryostats in the company’s Santa Barbara lab, a laid-back outpost of the quantum project led by Neven in Los Angeles. With its ping-pong table and assorted bongos, the space feels like an extension of the nearby University of California, Santa Barbara campus. Martinis, who runs the office, is a physics professor at UCSB, and many of his hires are graduates. Staff meetings are occasionally interrupted by the resident lab dog, a Papillon-Pomeranian mix named Qubit.
On a recent afternoon, Daniel Sank and Amit Vainsencher, two laid-back engineers with mops of curly hair and recent Ph.D.s from UCSB, led me to a gleaming cryostat in one corner of the lab. Because particles lose superposition with the slightest interference, quantum computers must be radically isolated from the outside world. The cryostat’s mu-metal exterior, a soft magnetic alloy that blocks the Earth’s magnetic field, was adorned with a single bumper sticker: “My other computer is classical.”
Compressed helium and liquid nitrogen, pumped from an adjacent frost-covered tank, cool the inside of the cryostat to minus 459.6 degrees Fahrenheit, a fraction of a degree above the lowest temperature possible, which enables the conductivity necessary for Google’s qubits to run computations. “If you were to vibrate this frame, you can actually see the temperature rise on the thermometer,” Vainsencher says before shaking the structure that suspends the cryostat above the ground to limit interference from vibration. “I probably shouldn’t do that,” he says.
Such a complex and expensive setup means that Google and its peers will likely sell quantum computing via the cloud, possibly charging by the second.
For now, Neven’s team in Southern California is racing to finish the 49-qubit chip that they hope will carry them to quantum supremacy and into a new frontier of technology, where computers leverage unthinkably complex natural laws rather than converting the world into ones and zeros.
“There is no transistor in this computer,” Neven says. “It’s a completely different beast. It’s a native citizen of the multiverse.”

Friday, December 29, 2017

Important article for students of propaganda: The Year the News Accelerated to Trump Speed


By MATT FLEGENHEIMER DEC. 29, 2017, New York Times [Original article contains links.]


[JB comment, inspired by this article: What is "masterful" propaganda today (if not since the beginning of time)? Not the "meaning" of what you say, but how you control (overwhelm?) human minds/feelings with hair styles/physical gestures/tweets/"news/information" -- above all, with meaninglessness? The height of cynicism?]

WASHINGTON — Barack Obama was president earlier this year.

Really, eyewitness accounts from the period confirm this. It lasted nearly three weeks, it seems, or roughly the time elapsed since a Democrat won a Senate seat in Alabama, if memory serves, which it generally does not anymore.

That special election came before President Trump helped usher a once-in-a-generation tax overhaul through Congress, but after he recognized Jerusalem as Israel’s capital, which preceded threats to end American aid to any countries that objected — and they did, en masse, in a remarkable United Nations vote that almost certainly took place somewhere in there, right? Possibly around the time the president accused a female senator of doing “anything” for campaign contributions, touching off tremors in the #MeToo movement he helped inspire, and alleged another wide-scale conspiracy against him in the upper reaches of the F.B.I.

Or was that one over the summer? When were those hurricanes again? Oh, and the Pentagon has been tracking possible alien visitation. That definitely came up.

One year out, this may be Mr. Trump’s greatest trick: His tornado of news-making has scrambled Americans’ grasp of time and memory, producing a sort of sensory overload that can make even seismic events — of his creation or otherwise — disappear from the collective consciousness and public view.

He is the magician who swallows a sword no one thought was part of the act, stuffs a dozen rabbits into a hat before the audience can count them — and then merrily tweets about “Fox & Friends” while the crowd strains to remember what show it had paid to attend in the first place.

Diplomatic crises. Human tragedies. The Mooch.

Poof.

“We crammed six years in,” said Jason Chaffetz, a former Republican congressman from Utah who left the job at the end of June.

“We get to the point where we’re just done dealing with something,” said Matt Negrin, a digital producer at “The Daily Show,” recalling unresolved maelstroms like Mr. Trump’s feud with a Gold Star widow, his baseless claim that Mr. Obama wiretapped him and his defense of white nationalist supporters amid the deadly violence this summer in Charlottesville, Va.

“That’s something, in my opinion, we should be talking about,” Mr. Negrin said. “But then the eclipse happened five days later. Not that Trump created the eclipse. But maybe.”

The disorientation has had far-reaching effects, shaping not only Mr. Trump’s public image but also the ways in which lawmakers, journalists and others in his ecosystem are compelled to operate.

It is not exactly that “nothing matters,” to borrow social media’s favorite nihilistic buzz-phrase of the Trump age. It is that nothing matters long enough to matter.

“Las Vegas and the church in Texas have fallen off the map — two of the most heinous mass murders in recent American history,” said Tom Brokaw, the special correspondent at NBC News, flagging two episodes that would have, under previous circumstances, most likely remained seared in the national conversation. “It’s astonishing. It should be one of the defining stories not just of the year but of our time.”

There are a lot of those. And the president’s apparent triumph over the space-time continuum has created practical concerns across newsrooms and congressional offices, exacerbated by forces that predate Mr. Trump: the rise of Facebook and Twitter, the partisan instincts of cable news and, in the case of mass shootings, what many describe as a growing public imperviousness to horror.

Senator Christopher S. Murphy, Democrat of Connecticut, who became a prominent gun control advocate after the 2012 massacre in Newtown, Conn., described his task under Mr. Trump as a “triage” mission, “newly overwhelming” every day.

“As someone who works on an issue that is unfortunately driven by news cycles,” he said, “it makes it harder to try to focus attention.” The most heinous shootings once dominated TV news for days. “Now,” Mr. Murphy said, “it doesn’t seem that there’s much more than 24 hours’ room for any story.”

Of course, Mr. Trump runs neither the networks nor the newspapers, much as he might prefer it at times, and the news media has come by its share of criticism honestly. Corralling the fire hose of White House doings has become a near-constant exercise in news judgment, with mixed returns.

Not every Twitter tremble requires mass attention. Not all executive skirmishes need a referee on every channel.

“Trump is just so dislocating for everybody that it’s making us all nuts,” said Peter Hamby, the head of news at Snapchat. “There’s so much sexy, salacious, bananas-crazy news happening every single day. But there is a duty, I think, to cover substance.”

No one suggests the mandate is simple. Even Mr. Brokaw, a dean of meat-and-potatoes news delivery, allowed that the daily Trumpian churn is “not unimportant, and it’s got this Shakespearean quality about it.”

And while there have been frenzied, tumultuous years before, present conditions are unique.

The 2016 presidential campaign season delivered rapid-fire insanities without precedent, but still adhered broadly to the rhythms of an election cycle: the primaries, the conventions, the debates, the big day.

In 2017, the chaos tends to be unscheduled.

The most oft-cited parallel is 1968 — a tinderbox of tragedy, protest and political upheaval — though the composition of the news industry then precluded the minute-to-minute ubiquity of 2017.

“It isn’t as though we haven’t seen a year like this,” said Nancy Gibbs, the former editor in chief of Time magazine. “But in the past, we haven’t been mainlining it.”

Mr. Negrin, from “The Daily Show,” has pursued social media performance art to combat the times, hoping to drill down on a single, elusive subject. His Twitter handle includes the number of days since Mr. Trump promised to clarify his position on Hezbollah within 24 hours — a pledge, like many presidential utterances of nontrivial consequence, that went largely ignored in the typical swirl of the moment.

That was July.

Mr. Negrin offered two predictions for the new year: Mr. Trump is unlikely to hold forth on Hezbollah anytime soon.

And: “2018 is going to be 10 times worse.”


Thursday, December 28, 2017

Quotations from the article by scholar Stephen Kotkin, "When Stalin Faced Hitler: Who Fooled Whom?"


foreignaffairs.com


Image from article, with caption: A German infantryman walks toward the body of a Soviet soldier and burning Soviet BT-7 tank in the early days of Operation Barbarossa, 1941.

"History is full of surprises."

***
"[A]s a teenager, [Stalin] got his poems published in well-regarded Georgian periodicals. ('To this day his beautiful, sonorous lyrics echo in my ears,' one reader would later recall.)"

"He liked colored pencils—blue, red, and green."

"His personal library would ultimately grow to more than 20,000 volumes."

"Among Russian authors, Stalin’s favorite was probably Anton Chekhov ..."

"Film of him walking was prohibited."

"Only a few intimates knew that Stalin suffered nearly constant pain in the joints of his legs, which may have been a genetic condition and which movement partly alleviated."

"[A] narrow circle of Russian physicians had acquired detailed knowledge of his illnesses and of his bodily deformities, including his barely usable left arm, the thick, discolored toenails on his right foot, and the two webbed toes on his left foot (an omen, in traditional Russian folklore, of Satanic influence)."

"Stalin’s stomach was a wreck. He suffered from regular bouts of diarrhea."

"Stalin was a Germanophile."

"Stalin had allowed Lavrenti Beria, the feared head of the secret police, to imprison Poskryobyshev’s [Stalin’s top aide, Alexander Poskryobyshev] beloved wife as a “Trotskyite” in 1939. (Beria had sent a large basket of fruit to their two girls; he then executed their mother.)

"He [Stalin] labeled as 'disinformation' whatever he chose not to believe."

"[General] Zhukov would later recall ... 'And to say out loud that Stalin was wrong, that he is mistaken, to say it plainly, could have meant that without leaving the building, you would be taken to have coffee with Beria.' ”
***
"He [Hitler] also frequented the city’s public libraries, where he read ... the fiction of Karl May, set in the cowboys-and-Indians days of the American West or in the exotic Near East."

"Hitler dodged the Austrian draft."

"In April 1919, after Social Democrats and anarchists formed the Bavarian Soviet Republic, the Communists quickly seized power; Hitler, who contemplated joining the Social Democrats, served as a delegate from his battalion’s soviet (council). He had no profession to speak of but appears to have taken part in leftist indoctrination of the troops."

"During his first two weeks in prison [in prison for a failed 1923 attempt to seize power in Munich], Hitler refused to eat, believing he deserved to die, but letters arrived congratulating him as a national hero."

"In his residence in the old Reich Chancellery, Hitler did not sleep for a second straight night. He took a meal in the dining room. He listened to Les Préludes, the symphonic poem by Franz Liszt. He summoned Goebbels, who had just finished watching Gone With the Wind."

***
"A lifelong Germanophile, Stalin appears to have been mesmerized by the might and daring of Germany’s parallel totalitarian regime. For a time, he recovered his personal and political equilibrium in his miraculous pact with Hitler, which deflected the German war machine, delivered a bounty of German industrial tools, enabled the conquest and Sovietization of tsarist borderlands, and reinserted the Soviet Union into the role of arbitrating world affairs. Hitler had whetted and, reluctantly, abetted Stalin’s own appetite. But far earlier than the despot imagined, his ability to extract profit from the immense danger Hitler posed to Europe and the world had run its course. This generated unbearable tension in Stalin’s life and rule, yet he stubbornly refused to come to grips with the new realities, and not solely out of greed for German technology. Despite his insight into the human psyche, demonic shrewdness, and sharp mind, Stalin was blinkered by ideology and fixed ideas."

The United States of America Is Decadent and Depraved - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


James Traub, foreignpolicy.com [Original article contains links.]

Image from article, with caption: "Then-president-elect Donald J. Trump arrives at his inauguration at the United States Capitol on Jan. 20, in Washington, D.C."

In The History of the Decline and Fall of The Roman Empire, Edward Gibbon luridly evokes the Rome of 408 A.D., when the armies of the Goths prepared to descend upon the city. The marks of imperial decadence appeared not only in grotesque displays of public opulence and waste, but also in the collapse of faith in reason and science. The people of Rome, Gibbon writes, fell prey to “a puerile superstition” promoted by astrologers and to soothsayers who claimed “to read in the entrails of victims the signs of future greatness and prosperity.”

Would a latter-day Gibbon describe today’s America as “decadent”? I recently heard a prominent, and pro-American, French thinker (who was speaking off the record) say just that. He was moved to use the word after watching endless news accounts of U.S. President Donald Trump’s tweets alternate with endless revelations of sexual harassment. I flinched, perhaps because a Frenchman accusing Americans of decadence seems contrary to the order of nature. And the reaction to Harvey Weinstein et al. is scarcely a sign of hysterical puritanism, as I suppose he was implying.

And yet, the shoe fit. The sensation of creeping rot evoked by that word seems terribly apt.

Perhaps in a democracy the distinctive feature of decadence is not debauchery but terminal self-absorption — the loss of the capacity for collective action, the belief in common purpose [JB emphasis], even the acceptance of a common form of reasoning. We listen to necromancers who prophesy great things while they lead us into disaster. We sneer at the idea of a “public” and hold our fellow citizens in contempt. We think anyone who doesn’t pursue self-interest is a fool.

We cannot blame everything on Donald Trump, much though we might want to. In the decadent stage of the Roman Empire, or of Louis XVI’s France, or the dying days of the Habsburg Empire so brilliantly captured in Robert Musil’s The Man Without Qualities, decadence seeped downward from the rulers to the ruled. But in a democracy, the process operates reciprocally. A decadent elite licenses degraded behavior, and a debased public chooses its worst leaders. Then our Nero panders to our worst attributes — and we reward him for doing so.

“Decadence,” in short, describes a cultural, moral, and spiritual disorder — the Donald Trump in us. It is the right, of course, that first introduced the language of civilizational decay to American political discourse. A quarter of a century ago, Patrick Buchanan bellowed at the Republican National Convention that the two parties were fighting “a religious war … for the soul of America.” Former Speaker Newt Gingrich (R-Ga.) accused the Democrats of practicing “multicultural nihilistic hedonism,” of despising the values of ordinary Americans, of corruption, and of illegitimacy. That all-accusing voice became the voice of the Republican Party. Today it is not the nihilistic hedonism of imperial Rome that threatens American civilization but the furies unleashed by Gingrich and his kin.

The 2016 Republican primary was a bidding war in which the relatively calm voices — Jeb Bush and Marco Rubio — dropped out in the early rounds, while the consummately nasty Ted Cruz duked it out with the consummately cynical Donald Trump. A year’s worth of Trump’s cynicism, selfishness, and rage has only stoked the appetite of his supporters. The nation dodged a bullet last week when a colossal effort pushed Democratic nominee Doug Jones over the top in Alabama’s Senate special election. Nevertheless, the church-going folk of Alabama were perfectly prepared to choose a racist and a pedophile over a Democrat. Republican nominee Roy Moore almost became a senator by orchestrating a hatred of the other that was practically dehumanizing.

Trump functions as the impudent id of this culture of mass contempt. Of course he has legitimized the language of xenophobia and racial hatred, but he has also legitimized the language of selfishness. During the campaign, Trump barely even made the effort that Mitt Romney did in 2012 to explain his money-making career in terms of public good. He boasted about the gimmicks he had deployed to avoid paying taxes. Yes, he had piled up debt and walked away from the wreckage he had made in Atlantic City. But it was a great deal for him! At the Democratic convention, then-Vice President Joe Biden recalled that the most terrifying words he heard growing up were, “You’re fired.” Biden may have thought he had struck a crushing blow. Then Americans elected the man who had uttered those words with demonic glee. Voters saw cruelty and naked self-aggrandizement as signs of steely determination.

Perhaps we can measure democratic decadence by the diminishing relevance of the word “we.” It is, after all, a premise of democratic politics that, while majorities choose, they do so in the name of collective good. Half a century ago, at the height of the civil rights era and Lyndon B. Johnson’s Great Society, democratic majorities even agreed to spend large sums not on themselves but on excluded minorities. The commitment sounds almost chivalric today. Do any of our leaders have the temerity even to suggest that a tax policy that might hurt one class — at least, one politically potent class — nevertheless benefits the nation?

There is, in fact, no purer example of the politics of decadence than the tax legislation that the president will soon sign. Of course the law favors the rich; Republican supply-side doctrine argues that tax cuts to the investor class promote economic growth. What distinguishes the current round of cuts from those of either Ronald Reagan or George W. Bush is, first, the way in which they blatantly benefit the president himself through the abolition of the alternative minimum tax and the special treatment of real estate income under new “pass-through” rules. We Americans are so numb by now that we hardly even take note of the mockery this implies of the public servant’s dedication to public good.

Second, and no less extraordinary, is the way the tax cuts have been targeted to help Republican voters and hurt Democrats, above all through the abolition or sharp reduction of the deductibility of state and local taxes. I certainly didn’t vote for Ronald Reagan, but I cannot imagine him using tax policy to reward supporters and punish opponents. He would have thought that grossly unpatriotic. The new tax cuts constitute the economic equivalent of gerrymandering. All parties play that game, it’s true; yet today’s Republicans have carried electoral gerrymandering to such an extreme as to jeopardize the constitutionally protected principle of “one man, one vote.” Inside much of the party, no stigma attaches to the conscious disenfranchisement of Democratic voters. Democrats are not “us.”

Finally, the tax cut is an exercise in willful blindness. The same no doubt could be said for the 1981 Reagan tax cuts, which predictably led to unprecedented deficits when Republicans as well as Democrats balked at making offsetting budget cuts. Yet at the time a whole band of officials in the White House and the Congress clamored, in some cases desperately, for such reductions. They accepted a realm of objective reality that existed separately from their own wishes. But in 2017, when the Congressional Budget Office and other neutral arbiters concluded that the tax cuts would not begin to pay for themselves, the White House and congressional leaders simply dismissed the forecasts as too gloomy.

Here is something genuinely new about our era: We lack not only a sense of shared citizenry or collective good, but even a shared body of fact or a collective mode of reasoning toward the truth. A thing that we wish to be true is true; if we wish it not to be true, it isn’t. Global warming is a hoax. Barack Obama was born in Africa. Neutral predictions of the effects of tax cuts on the budget must be wrong, because the effects they foresee are bad ones.

It is, of course, our president who finds in smoking entrails the proof of future greatness and prosperity. The reduction of all disagreeable facts and narratives to “fake news” will stand as one of Donald Trump’s most lasting contributions to American culture, far outliving his own tenure. He has, in effect, pressed gerrymandering into the cognitive realm. Your story fights my story; if I can enlist more people on the side of my story, I own the truth. And yet Trump is as much symptom as cause of our national disorder. The Washington Post recently reported that officials at the Centers for Disease Control were ordered not to use words like “science-based,” apparently now regarded as disablingly left-leaning. But further reporting in the New York Times appears to show that the order came not from White House flunkies but from officials worried that Congress would reject funding proposals marred by the offensive terms. One of our two national political parties — and its supporters — now regards “science” as a fighting word. Where is our Robert Musil, our pitiless satirist and moralist, when we need him (or her)?

A democratic society becomes decadent when its politics, which is to say its fundamental means of adjudication, becomes morally and intellectually corrupt. But the loss of all regard for common ground is hardly limited to the political right, or for that matter to politics. We need only think of the ever-unfolding narrative of Harvey Weinstein, which has introduced us not only to one monstrous individual but also to a whole world of well-educated, well-paid, highly regarded professionals who made a very comfortable living protecting that monster. “When you quickly settle, there is no need to get into all the facts,” as one of his lawyers delicately advised.

This is, of course, what lawyers do, just as accountants are paid to help companies move their profits into tax-free havens. What is new and distinctive, however, is the lack of apology or embarrassment, the sheer blitheness of the contempt for the public good. When Teddy Roosevelt called the monopolists of his day “malefactors of great wealth,” the epithet stung — and stuck. Now the bankers and brokers and private equity barons who helped drive the nation’s economy into a ditch in 2008 react with outrage when they’re singled out for blame. Being a “wealth creator” means never having to say you’re sorry. Enough voters accept this proposition that Donald Trump paid no political price for unapologetic greed.

The worship of the marketplace, and thus the elevation of selfishness to a public virtue, is a doctrine that we associate with the libertarian right. But it has coursed through the culture as a self-justifying ideology for rich people of all political persuasions — perhaps also for people who merely dream of becoming rich.

Decadence is usually understood as an irreversible condition — the last stage before collapse. The court of Muhammad Shah, last of the Mughals to control the entirety of their empire, lost itself in music and dance while the Persian army rode toward the Red Fort. But as American decadence is distinctive, perhaps America’s fate may be, too. Even if it is written in the stars that China will supplant the United States as the world’s greatest power, other empires, Britain being the most obvious example and the one democracy among them, have surrendered the role of global hegemon without sliding into terminal decadence.

Can the United States emulate the stoic example of the country it once surpassed? I wonder. The British have the gift of ironic realism. When the time came to exit the stage, they shuffled off with a slightly embarrassed shrug. That, of course, is not the American way. When the stage manager beckons us into the wings we look for someone to hit — each other, or immigrants or Muslims or any other kind of not-us. Finding the reality of our situation inadmissible, like the deluded courtiers of the Shah of Iran, we slide into a malignant fantasy.

But precisely because we are a democracy, because the values and the mental habits that define us move upward from the people as well as downward from their leaders, that process need not be inexorable. The prospect of sending Roy Moore to the Senate forced a good many conservative Republicans into what may have been painful acts of self-reflection. The revelations of widespread sexual abuse offer an opportunity for a cleansing moment of self-recognition — at least if we stop short of the hysterical overreaction that seems to govern almost everything in our lives.

Our political elite will continue to gratify our worst impulses so long as we continue to be governed by them. The only way back is to reclaim the common ground — political, moral, and even cognitive — that Donald Trump has lit on fire. Losing to China is hardly the worst thing that could happen to us. Losing ourselves is.

ABOUT THE AUTHOR

James Traub is a contributing editor at Foreign Policy, a fellow at the Center on International Cooperation, and author of the book "John Quincy Adams: Militant Spirit."