Sunday, November 5, 2017

Why Arthur Schlesinger’s ‘Disuniting of America’ Lives - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


By MICHAEL LIND NOV. 2, 2017, New York Times
[Original article contains more links]

From the NYT (Pamela Paul, Editor of The New York Times Book Review,
@PamelaPaulNYT) via email: Arthur M. Schlesinger Jr. was arguably the most prominent historian of the 20th century [JB -- I would less "nationalistically" say, "prominent American historian"], between his foundational work “The Age of Jackson,” his still relevant “The Vital Center” and his books documenting the Kennedy years. This week, we review Richard Aldous’s new biography, “Schlesinger: The Imperial Historian,” and Michael Lind reconsiders Schlesinger’s book “The Disuniting of America.”


Image from article, with caption: The historian Arthur M. Schlesinger, Jr., who
tackled the canon wars a generation ago.

In contemporary debates that involve history and historical symbols like the
controversies over the removal of Confederate statues from public parks or the place
of Andrew Jackson and Alexander Hamilton on United States currency, it may seem
impossible to find middle ground. But a generation ago, in the 1990s, the search for
common ground in the history wars was undertaken by the leading liberal historian
of his era, Arthur M. Schlesinger Jr., in “The Disuniting of America: Reflections on a
Multicultural Society,” published in 1991 and in a revised edition in 1998. From
1949, when he published “The Vital Center,” Schlesinger, one of the founders of
Americans for Democratic Action and a confidant of the Kennedys, sought to defend
his conception of centrist liberalism against the radical left as well as the right.

The title of “The Disuniting of America” might mislead contemporary readers
into assuming that the book is about social polarization in general, which is the
subject of more recent publications like Bill Bishop’s “The Big Sort: Why the
Clustering of Like-Minded America Is Tearing Us Apart.” Instead, Schlesinger’s
polemic is an intervention in the “canon wars” of the 1980s and 1990s, when
curriculums in history and literature courses became the source of passionate
national debate. One defining event in that discussion was the publication in 1987 of
“The Closing of the American Mind” by the philosopher Allan Bloom. Another
occurred with the Jan. 18, 1995, vote by the United States Senate (99 to 1)
condemning proposed “national history standards” promulgated by the National
Center for History in the Schools at the University of California, Los Angeles, for not
showing “a decent respect for United States history’s roots in Western civilization,”
in the words of the Senate resolution.

Amid what was becoming a debate among left-leaning academics and populist
tribunes like Rush Limbaugh and Lynne Cheney, Schlesinger sought to define a
liberal alternative to what he described as militant multiculturalism on the left and
bigoted monoculturalism on the right: “The monoculturalists are hyperpatriots,
fundamentalists, evangelicals, laissez-faire doctrinaires, homophobes, antiabortionists,
pro-assault-gun people.” Of the two groups, Schlesinger considered the
monoculturalists a greater threat: “Left-wing political correctness is an irritation and
a nuisance. It becomes a threat to the young only when it invades the public
schools.” In contrast: “Right-wing political correctness catches kids before they are
old enough to take care of themselves and in environments where they are rarely
exposed to clashes of opinion. It is a weapon with which small-town bigots,
conducting pogroms against Darwin, Marx, J.D. Salinger, Judy Blume and other
villains, seize control of school committees and library boards.”

According to Schlesinger, “Monoculturalists abuse history as flagrantly as
multiculturalists. They sanitize the past and install their own set of patriotic heroes
and myths.” In a chapter titled “History the Weapon,” Schlesinger acknowledges
what he sees as the valid complaints of multiculturalists: “American history was long
written in the interests of white Anglo-Saxon Protestant males. My father, growing
up in the 1890s in Xenia, a small Ohio town containing large contingents of
Germans, Irish and blacks, one day asked his father, who had come from Germany
as a child and whose hero was Carl Schurz, the American general, politician and
reformer, why the schoolbooks portrayed England as the one and only mother
country. My grandfather’s wry comment was that apparently the only Germans
worth mentioning were ‘the Hessians who had fought on the wrong side in the War
for Independence.’ Irish and blacks fared even less well in schoolbooks, and the only
good Indians were dead Indians. Non-WASPs were the invisible men (and women)
in the American past.”

Schlesinger notes one predictable response by minorities to their exclusion from
mainstream historical texts and commemorations: “The ethnic enclaves thus
developed a compensatory literature.” To illustrate this, he quotes from the Irish-American
scholar John V. Kelleher about articles claiming “that the Continental
Army was 76 percent Irish, or that many of George Washington’s closest friends
were nuns or priests.” However badly the “white ethnics” suffered from Anglo-Saxon
Protestant condescension, Schlesinger notes, blacks, Latinos and Native Americans
suffered far worse: “The situation is radically different for non-white minorities facing
not snobbism but racism.”

But Schlesinger maintains that what he calls “compensatory history” is bad history,
whether it takes the form of Afrocentrism, or the claim that other regions have
falsely taken credit for inventions that originated in Africa, or what he, following
Kelleher, calls “the there’s-always-an-Irishman-at-the-bottom-of-it-doing-the-real-work
approach to American history.” These views ceased to be harmless folly when
their holders enlisted the support of federal, state or local governments to impose
them as official versions of history, Schlesinger argues: “ ‘Who controls the past
controls the future,’ runs the Party slogan in George Orwell’s ‘1984’; ‘who controls
the present controls the past.’ ”

At worst, Schlesinger writes, the state sanctioning of ethnonational ideologies
could further Balkanize American society. [JB emphasis] He denounces the federal 1974 Ethnic
Heritage Studies Program Act because it “ignored those millions of Americans —
surely a majority — who refused identification with any particular ethnic group.”
Schlesinger may have seen himself in the latter group. His paternal ancestors
included Prussian Jews and Austrian Catholics, while his mother was a descendant
of the Mayflower colonists and supposedly related to the 19th-century American
historian George Bancroft.

“I don’t want to sound too apocalyptic about these developments,” Schlesinger
writes. Indeed, unlike many of his contemporaries who criticized multiculturalism,
he did not see Latino immigration as either a linguistic or a social challenge to
American national unity. Schlesinger noted: “As for Hispanic-Americans, first-generation
Hispanics born in the United States speak English fluently, according to a
Rand Corporation study; more than half of second-generation Hispanics give up
Spanish altogether.” Subsequent social science studies by Stephen Trejo, Richard
Alba and others have confirmed that marriage outside of the group and erosion of
ethnic identity tends to increase with each generation of Latinos, as it did in the case
of European immigrant diasporas in the United States in the past.

Like other memorable tracts for the times, “The Disuniting of America” blends
passages of enduring relevance with much that has become obsolete. Today, what is
most striking about this book and other entries in the late-20th-century battle of the
books is the assumption shared by all sides in the canon wars that the fate of the
nation might depend on the content of the curriculum, as determined by academic
experts.

Since Schlesinger wrote, there has been a collapse in the authority of
establishments of all kinds, not just academic ones. In the age of Twitter and Facebook
and 24-hour cable news, public intellectuals like Schlesinger, based in the academy
or in journalism, have lost influence over public opinion to movie stars, cable
commentators, pop musicians and late-night comedians.

Perhaps the greatest change has involved the declining status of liberal arts
education and the historical studies at its core. In response to decades of slower-than-expected
growth and heightened foreign competition, students deserted the
humanities for more practical degrees like business. Meanwhile, in the 2000s and
2010s the bipartisan elite shared a new consensus that national success depended
not on widespread liberal arts education but on student proficiency in science,
technology, engineering and math. The debate over federal “No Child Left Behind”
standards that aimed to increase the number of Americans who go into engineering
or science eclipsed the debates over the content of the American historical pantheon.
The only academics who seem to find audiences among today’s elite are economists
and social scientists who claim to know how to boost gross domestic product or
manipulate human behavior.

Today the canon wars have given way to the icon wars. Although the focus of
controversy has shifted from the contents of undergraduate education to the
historical figures commemorated by statues and currency, debates over America’s
past continue to mirror debates over America’s present and future. To the challenges
of teaching history in a way that is at once accurate and inclusive, Schlesinger
remains an insightful guide.

Michael Lind, a professor at the Lyndon B. Johnson School of Public Affairs at the
University of Texas, is the author of “Land of Promise: An Economic History of the
United States.”

Saturday, November 4, 2017

An Icy Conquest - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Susan Dunn, New York Review of Books

A Cold Welcome: The Little Ice Age and Europe’s Encounter with North America

by Sam White

Harvard University Press, 361 pp., $29.95

[JB note: A thoughtful article that is not exactly a "celebration" of the mythical (?) narrative of the creation by Europeans of a paradise-like "new world" in the geographical area now known as North America, including the United States ... ]


Sarin Image/Granger
Captain John Smith taken captive by the Powhatan Native Americans; color engraving from Captain Smith’s Generall Historie of Virginia, 1624

“We are starved! We are starved!” the sixty skeletal members of the English colony of Jamestown cried out in desperation as two ships arrived with provisions in June 1610. Of the roughly 240 people who were in Jamestown at the start of the winter of 1609–1610, they were the only ones left alive. They suffered from exhaustion, starvation, and malnutrition as well as from a strange sickness that “caused all our skinns to peele off, from head to foote, as if we had beene flayed.” Zooarchaeological evidence shows that during those pitiless months of “starving time” they turned to eating dogs, cats, rats, mice, venomous snakes, and other famine foods: mushrooms, toadstools, “or what els we founde growing upon the grounde that would fill either mouth or belly.” Some of the settlers reportedly ingested excrement and chewed the leather of their boots. Recent discoveries of human skeletons confirm the revelation of the colony’s president, George Percy, that they also resorted to cannibalism: “Some adventuringe to seeke releife in the woods, dyed as they sought it, and weare eaten by others who found them dead.” When one man confessed under torture to having murdered and eaten his wife, Percy ordered his execution.
That happened a mere three years after the first adventurous group of Englishmen arrived in Jamestown. From the beginning, it was a struggle for subsistence. Most of the settlers fell ill only a few weeks after landfall in May 1607. One colonist recalled that “scarse ten amongst us coulde either goe, or well stand, such extreame weaknes and sicknes oppressed us.” The corn withered in the summer drought, and as the flow of the James River waned in the unrelenting heat, salt water encroached from the sea, depriving the settlers of their main source of fresh water. Nor was divine assistance forthcoming. The Quiyoughcohannock Indians, scarcely better off, beseeched the Englishmen to intercede and ask their powerful God for supernatural intervention. But when the colonists’ prayers seemed to bring only more suffering instead of rain to Jamestown, the natives concluded that the Christian god must be a vindictive one, and their relations with the colonists deteriorated.
By September 1607, half the colony’s members were dead. “Our men were destroyed with cruell diseases as Swellings, Flixes, Burning Fevers, and by warres, and some departed suddenly,” Percy later recalled, “but for the most part they died of meere famine.” The next winter months would prove equally deadly. “It got so very cold and the frost so sharp that I and many others suffered frozen feet,” another witness wrote, adding that the cold was so severe that “the river at our fort froze almost all the way across.”
Fresh groups of colonists arrived in 1608 and 1609, but steady attrition and the “starving time” of 1609–1610 pushed the settlement to the brink. In June 1610, when the two ships arrived with provisions for the emaciated survivors, it seemed too late. Jamestown’s leaders announced to the settlers that they would all return to England by way of Newfoundland. “There was a general acclamation, and shoute of joy,” one person remembered. They set sail on June 17, but the next day, when they reached the small settlement on Mulberry Island along the James River just a few miles away, they sighted another boat, working its way up the river with news that an English relief fleet was on its way with more settlers and enough provisions to last a year. That chance encounter saved the colony of Jamestown. “God would not have it so abandoned,” one settler wrote. The following winter proved less harsh, and by 1614 colonists had begun lucrative exports of tobacco. In 1619 the Virginia House of Burgesses would hold its first assembly in Jamestown.

The brutal story of Jamestown scarcely fits the pageant of success that students are often taught in the condensed version of early American history that starts in 1492 when Columbus sailed the ocean blue and then jumps to the Pilgrims’ safe landing at Plymouth Rock in 1620 and their peaceful celebration of the first Thanksgiving the following year. [JB emphasis] But in his deeply researched and exciting new book, A Cold Welcome, the historian Sam White focuses on the true stories of the English, Spanish, and French colonial expeditions in North America. He tells strange and surprising tales of drought, famine, bitterly cold winters, desperation, and death, while anchoring his research in the methods and results of the science of climate change and historical climatology. In doing so, he erases what C.P. Snow, the British physicist and author of The Two Cultures, considered the damaging cultural barrier and “mutual incomprehension” estranging humanists and scientists from one another.1 “Historians can, and must, embrace this science,” White counsels.
He weaves an intricate, complex tapestry as he examines the effects both of climate—meteorological conditions over relatively long periods of time—and of weather—the conditions of the atmosphere over a short term—on vulnerable colonists in North America in the late sixteenth and early seventeenth centuries. The half-century that led up to the founding of permanent settlements saw, as White notes, “one of the steepest declines in Northern Hemisphere temperatures in perhaps thousands of years.” 
His fresh account of the climatic forces shaping the colonization of North America differs significantly from long-standing interpretations of those early calamities. Edmund S. Morgan’s classic American Slavery, American Freedom: The Ordeal of Colonial Virginia (1975) contains a lengthy assessment of the reasons why the Jamestown colonists experienced their “Lord of the Flies” fate. Morgan faults the poor organization and direction of the colony but most of all points to sociological and psychological factors, especially the indolence of the colonists and the large number of “gentlemen” among them who were averse to descending to ordinary labor. “He that will not worke, shall not eate,” John Smith warned them to little avail.2 A Cold Welcome does not replace these well-grounded interpretations but rather supplements them by shining a spotlight on a wholly different dimension: the timing of these colonial enterprises, which ensnared them in what came to be known as the Little Ice Age.
As climatologists define it, the Little Ice Age was a long-term cooling of the Northern Hemisphere between 1300 and 1850. They locate maximum cooling in the early seventeenth century, just when European settlers were attempting to establish colonies in North America. To reconstruct past climate, scientists use indicators called climate “proxies,” such as ice cores, tree rings, and lake-bottom sediments that they analyze for indications of past temperatures and precipitation. In addition, zooarchaeologists examine animal bones to see what settlers ate, while bioarchaeologists study human skeletons to probe health and nutrition.
Climate proxies also provide important evidence of volcanic activity. Between the 1580s and 1600 large tropical volcanic eruptions spewed dust and sulfates high into the atmosphere, dimming sunlight, cooling Earth’s surface, and causing oscillations in atmospheric and oceanic circulation. Eruptions in Colima, Mexico, in 1586, in Nevado del Ruiz in present-day Colombia in 1595, and especially the huge Huaynaputina eruption in the Peruvian Andes in 1600 helped produce shockingly cold decades.
Even before colonists departed from Europe, their lack of reliable information about the extremes of weather in the Little Ice Age was compounded by fatal misconceptions linking geographical latitudes with climate. Educated in the work of the classical Greek geographer Ptolemy, for whom climate and latitude were synonymous, Europeans assumed that they would find a relatively mild climate in North America, since Britain lies latitudinally north of the continental United States and Paris north of Quebec, while Spain lines up with New Mexico. The confusion sowed by those misleading notions would doom many of their enterprises.
During those harrowing decades, European countries—England and Spain in particular—also suffered from freezing winters, cold, wet summers, intense rain, flooding, ruined crops, famine, outbreaks of disease, plague, and spikes in mortality. In the mid-1590s, William Shakespeare found poetry in the capricious climate of the age:
And thorough this distemperature we see
The seasons alter: hoary-headed frosts
Fall in the fresh lap of the crimson rose,
…The spring, the summer,
The childing autumn, angry winter, change
Their wonted liveries, and the mazèd world
…now knows not which is which.

Bettmann/Getty Images

Economic and demographic factors, worsened by climate-related disasters, White argues, influenced the colonial ambitions of European nations: “The Little Ice Age came at a particular moment and in a particular way that helped to undermine Spain’s commitment to North American colonization but to reinforce England’s.” He suggests that a pervasive sense of overcrowding in England, worsened by an influx of poverty-stricken famine refugees into London, helped the planners and promoters of American colonies secure private investment and gather public support by depicting North America as an opportune overseas outlet for the surplus population. In Spain, meanwhile, a decline in imperial revenue, heavy military expenses, and disillusionment with the nation’s fragile settlements in North America, along with weather-related hardships and a general sense of crisis in the empire, led King Philip III to pull back on Spain’s North American claims, opening the way for the English and the French to establish their own colonies there and ultimately allowing for a decisive shift of power in the North Atlantic world.
Spain’s expeditions in the early sixteenth century to La Florida—today’s southeastern United States—resulted in lost lives and lost investments. Explorers and colonists expected to find a familiar Mediterranean climate in La Florida: hot, dry summers and cool, wet winters. Instead they encountered wet summers, storms, hurricanes, and freezing winters. “We were farming people in Spain,” wrote one bitterly disillusioned settler in Santa Elena, now Parris Island in South Carolina. “Here we are lost, old, weary, and full of sickness.” In 1587, the few remaining colonists in Santa Elena left for St. Augustine. Frustrated, Philip III was anxious to abandon La Florida and focus instead on New Spain—the territory encompassing the Caribbean and what is now Mexico. In 1608, however, he yielded to Franciscan missionaries who urged him to maintain the settlement in St. Augustine and not abandon the Indians who had been converted to Christianity.
The Spanish colony of New Mexico received a reprieve at the same time and for the same reason: the Franciscans convinced the viceroy of the need to minister to the more than seven thousand Indians who had been baptized. Ever since the colonists’ first arrival in 1540, the barren desert landscape had tested their endurance. In 1598 they set up a base about thirty miles north of present-day Santa Fe, built houses and a church, and dug irrigation channels for crops. But neither they nor the Pueblo Indians, born to that climate, were immune to the hazards of New Mexico’s Little Ice Age.
The nadir came in 1601 following the Huaynaputina eruption, when both colonists and natives found themselves unprepared, physically and psychologically, for one of the coldest and driest periods of the past millennium. During the long freezing winter months, fields of cotton and corn were destroyed, livestock perished in the snow, and even the Rio Grande froze over. Summer was no less discouraging. One witness reported that the four months of summer heat were “almost worse than the cold in winter; and so the saying there is, winter for eight months and hell for four.”
The New Mexico colony all but collapsed at the end of 1601. Gradually, though, the drought came to an end, the winters became less unforgiving, and in 1608 the colonists and missionaries were granted land to set up a new town called Santa Fe, making it, White comments, “an almost exact contemporary of Jamestown.”
In 1609, just when Spanish colonists were securing their settlement in Santa Fe and English colonists starved in Jamestown, the French explorer Samuel de Champlain established a settlement on low ground near the edge of the St. Lawrence River; it had good soil, streams, fresh water, and the protective shelter of high cliffs. He called the colony Quebec, a name derived from the Algonquin word kébec, meaning “where the river narrows.”
Champlain was by then painfully familiar with the climate and geography of the region. He and the explorers Pierre Dugua and François Gravé had already experienced the challenges of establishing settlements in Canada. Their first attempt to set up a colony on the island of St. Croix in the Bay of Fundy failed during the devastating winter of 1604–1605. “The cold is harsher and more excessive than in France and much longer,” Champlain discovered. In the summer of 1605, he and Dugua led the St. Croix colonists who hadn’t died of malnutrition and scurvy to a new site, Port Royal in Nova Scotia. Though the first winter in Port Royal was also deadly, the second one, Champlain noted, “was not so long as in preceding years.” The settlers at Port Royal chanced upon more fresh food, including berries, and suffered fewer instances of scurvy; Champlain’s beneficial creation of a social club, the Order of Good Cheer, also boosted morale. But just when the settlement began to thrive, King Henry IV abruptly canceled the fur trade monopoly that made Port Royal economically viable.
In the end, St. Croix and Port Royal contributed to the eventual success of the French in Canada, for Champlain was able to apply to Quebec what he had learned from the mistakes on St. Croix and the accomplishments in Port Royal. He grasped the importance of constructing storehouses with cellars to insulate food and drink from the winter cold and of locating dwellings around a compact central courtyard for defense against storms as well as Indian attacks. White also praises Champlain for having sought out Native Americans for their local knowledge, though the Frenchman could neither abide nor understand their consumption of raw organ meat—pancreas, kidney, tongue—one of the few sources of ascorbic acid that protected them from scurvy during the frigid winter months.
After decades of failed European expeditions and aborted settlements in North America, England, Spain, and France finally had their first enduring colonies in Jamestown, St. Augustine, Santa Fe, and Quebec in the early seventeenth century. At great cost in lives, money, and hopes and expectations, these colonies not only overcame the rigors and ravages of the Little Ice Age but would come to define much of the cultural heritage of the continent.
White remarks that, in undertaking this intriguing study, he was “conscious of the challenges posed by climate change” today. Indeed, he acknowledges that he wrote A Cold Welcome “from the vantage point of global warming” and that he saw in the colonial period “an era that addresses concerns of the present.” It was “another age when America spoke many languages and when its future, its environment, and its place in the world were all uncertain. It was another age when climatic change and extremes threatened lives and settlements.” But while the Europeans who traveled to North America in the sixteenth and seventeenth centuries were not responsible for the Little Ice Age, today the responsibility for the global climate lies largely with humanity.

1. C.P. Snow, The Two Cultures (1959; Cambridge University Press, 1998), p. 4.

2. Edmund S. Morgan, American Slavery, American Freedom: The Ordeal of Colonial Virginia (Norton, 1975), pp. 75, 78.

Thursday, November 2, 2017

Fancy sleeping in Stalin’s room in Sochi? Now you can


Nikolay Shevchenko, rbth.com; via AH on Facebook; see also (2014 RFE article). JB comment: I would title the piece "The perfect place for a honeymoon" :)

image from article

However bizarre this may sound, in 2017 you can actually check into the room previously occupied by Stalin himself.
Reuters

Catch some Z’s in the dictator’s bed and try out his specially designed snooker cue.
Rumors of Joseph Stalin’s ailing health began circulating in Germany before spreading across the Atlantic in September, 1936. Newspapers suggested Stalin was so ill that he wouldn’t be able to continue ruling the Soviet Union, triggering a power struggle in the Kremlin.
Moscow vehemently denied the claims, offering an alternative explanation for the leader’s unusually prolonged absence from public view. Stalin, they said, was on holiday on the Black Sea in Sochi, 1,500 km south of Moscow.
But as they say, there’s no smoke without fire - there was some truth in both accounts. Stalin indeed suffered from poor joints and lungs, and he found Sochi’s Matsesta baths healing. What’s more, his new Sochi residence was finally completed in 1936.
Check-in, Sir?
However bizarre this may sound, in 2017 you can actually check into the room previously occupied by the man himself. Its chief occupant being long gone, holidaymakers are able to rent the rooms of Stalin’s Sochi dacha.
Stalin’s residence is a bright green complex located on a hill right above Sochi’s Zelenaya Roshcha (Green Grove) sanatorium. The environment is quite spartan: Small rooms stuffed with old Soviet furniture and dim lighting may not be what the modern traveler is used to, but at least it’s authentic.
“This is not a museum, we do not issue ads and we do not have a website or a cash register,” says a local guide who takes tourists around the complex.
After Stalin’s death in 1953, some 20 retreat residences were left unused. Nikita Khrushchev, the architect of de-Stalinization, gave the dachas to local authorities to do as they pleased. Since Sochi was regarded as a retreat city, the authorities there built a sanatorium on the site of his residence. In 1968, the 12 rooms - including the one previously reserved for Stalin - welcomed the first visitors.
The residence remains part of the guest complex, which at times creates ridiculous problems for its staff.
“Once we were ordered to take important guests on a tour around the dacha, but its rooms were all booked by residents. So we then urgently took those staying in the dacha on an improvised tour to Sukhumi (a city in the partially recognized state of Abkhazia, 150 kilometers west of Sochi), boarded them on buses and sent them to Abkhazia. In the meantime, the important guests took their tour of Stalin’s residence,” said one employee working at the residence.
Under construction
This year, part of Stalin’s Sochi residence is under renovation. Visitors are welcome to take a tour around the complex, which is largely unaffected by the construction works - although the rooms in this part of the building are not for rent at the moment.
When the renovation is completed next year, everyone will be able to check into all rooms previously occupied by Stalin, members of his family, and close associates. In the meantime, you may freely play Stalin’s chess, test his snooker cue (specifically modified for his dysfunctional hand), and take a picture with the Generalissimo himself… well, his wax statue, to be specific.

Russia Inquiry Fails to Unite a Nation - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Jim Rutenberg, New York Times

image (not from article)
Excerpt:
Between the promotion of alternative narratives and the way the social media
platforms have been so slow in describing their inadvertent hosting of the Russian
effort, there’s a striking lack of national unity over what appears to have been a
foreign incursion in an American election. [JB emphasis] So you have to wonder how the country will ever come together to do something about it.

“There’s this epic lack of consensus, or active dis-census,” said Shawn Powers,
the executive director of the United States Advisory Commission on Public
Diplomacy, who has spent years studying what he calls the “geopolitics of
information.”

Mr. Powers was confident that, as the facts come out, the country will find a way
to fend off similar attacks. For instance, he said, Facebook, Twitter and Google are
adding new policies to combat false information, fake accounts and nefarious foreign
political advertising. ...

Wednesday, November 1, 2017

Why Is the U.S. So Susceptible to Social-Media Distortion? - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Stephen Marche October 31, 2017, The New Yorker

image (not from article)
Excerpt:
Self-determination is the source of America’s oldest political commitments and its deepest clichés—“Life, Liberty and the Pursuit of Happiness,” the cowboy, the astronaut, Thoreau at Walden, Emerson on “Self-Reliance.” In America, everyone is entitled to his or her own vision of the universe. Therefore Mormonism. Therefore Scientology. Therefore the various phases of Bob Dylan’s career. Self-determination is a moral state and not simply an economic one. How else would so many new religions, new art forms, be born out of a single country? The idea that meaning will blossom from individuals rather than be imposed from an outside order is why America, though imperial, has never considered itself an empire. This self-determining instinct attaches to both the left and the right. “The ultimate victory will depend upon the hearts and the minds of the people who actually live out there,” President Lyndon B. Johnson said of Vietnam. “ ‘You’re on your own. Here’s a copy of the Federalist Papers. Good luck,’ ” John Bolton said of Iraq. The idea that meaning is something that comes from within a person is so entrenched in American thinking that even Americans who spend decades abroad cannot quite imagine that people work any other way. ...

NYC terror attack: Sayfullo Saipov was here on diversity visa, Trump says. What is that?


Alan Gomez and Ashley May, USA TODAY

Saipov image from article

The Uzbekistan native arrested for the deadly truck rampage through a New York City bike path entered the United States under a visa program to encourage immigration from underrepresented nations, President Trump said Wednesday. 
Trump called on Congress to immediately end the Diversity Immigrant Visa Program, which was used by Sayfullo Saipov, 29, to enter the U.S. in 2010. The State Department issues up to 50,000 visas a year under the program using a lottery system.
"We need to get rid of the lottery program as soon as possible," Trump said before a Cabinet meeting at the White House.
The purpose of the visa program, created by the 1990 Immigration and Nationality Act, is to diversify the incoming pool of immigrants and provide people with no family or economic ties to the U.S. a small chance to enter. Those who receive the visa are then eligible for a green card and, eventually, U.S. citizenship.

Trump tweeted: "The terrorist came into our country through what is called the 'Diversity Visa Lottery Program,' a Chuck Schumer beauty. I want merit based."
The program is open only to people who live in countries where few residents regularly immigrate to the U.S., mostly in Africa, Asia and eastern European countries that were part of the Soviet bloc, according to data from the State Department. In 2016, people from African nations accounted for 44% of the 46,718 visas granted, those from eastern Europe received 33%, and those from Asia got 19%.
Saipov came to America from Uzbekistan in 2010. Since then, more than 21,000 residents of that nation have been granted diversity visas, according to the State Department.
The program has long been a target of immigration hard-liners wanting either to reduce the number of immigrants entering the U.S. or switch to a more "merit-based" immigration system. To qualify for a diversity visa, applicants need only a high school degree and two years of work experience.
In 2013, when the Senate passed an immigration overhaul, Democrats defended the program as a much-needed lifeline to people hoping for a shot at the American dream. Republicans attacked the program as an outdated system that does not serve the national interest.
The program was eliminated as part of a compromise measure that passed the Senate, but the legislation died when the House of Representatives failed to act.
Now, those same Republicans are renewing their call to end the program.
Sen. Jeff Flake, R-Ariz., one of the co-authors of the 2013 bill along with Sen. Chuck Schumer, D-N.Y., countered Trump's criticism Wednesday of the program and Schumer by pointing out their legislation would have ended the diversity program. "I know, I was there," Flake wrote in a tweet.
Sen. Lindsey Graham, R-S.C., another member of the Gang of 8 that sponsored the 2013 immigration overhaul, said the diversity visa will now become part of ongoing negotiations to protect DREAMers — undocumented immigrants brought to the U.S. as children.
"It makes no sense to hand out visas and green cards this way," Graham said Wednesday on Fox News. "We want merit-based immigration. When it comes times to deal with ... Dream Act kids, that part of the deal should be to do away with the lottery system."
Last month, ending the program was included on a long list of demands that the White House said were needed before Congress could pass a law protecting DREAMers.
Those who apply for a diversity visa must submit biographic information, two passport-style photos, a copy of a birth certificate, a medical examination, vaccination records and an arrival/departure record. Applicants go through a federal background check and an in-person interview with a U.S. consular official in their home country.

Review: The Turn to Tyranny


Joshua Rubenstein, Wall Street Journal

image from article

We may never know what degree of personal obsession, political calculation and ideological zeal drove Stalin to kill and persecute so many. Joshua Rubenstein reviews ‘Stalin: Waiting for Hitler, 1929-1941’ by Stephen Kotkin.


In the aftermath of Lenin’s death in January 1924, Joseph Stalin — already secretary-general of the Communist Party — emerged as the outright leader of the Soviet Union. “Right through 1927,” Stephen Kotkin notes, Stalin “had not appeared to be a sociopath in the eyes of those who worked most closely with him.” But by 1929-30, he “was exhibiting an intense dark side.” Mr. Kotkin’s “Stalin: Waiting for Hitler, 1929-1941,” the second volume of a planned three-volume biography, tracks the Soviet leader’s transformation during these crucial years. “Impatient with dictatorship,” Mr. Kotkin says, Stalin set out to forge “a despotism in mass bloodshed.”
The three central episodes of Mr. Kotkin’s narrative, all from the 1930s, are indeed violent and catastrophic, if in different ways: the forced collectivization of Soviet agriculture; the atrocities of the Great Terror, when Stalin “arrested and murdered immense numbers of loyal people”; and the rise of Adolf Hitler, the man who would become Stalin’s ally and then, as Mr. Kotkin puts it, his “principal nemesis.” In each case, as Mr. Kotkin shows, Stalin’s personal character—a combination of ruthlessness and paranoia—played a key role in the unfolding of events.
Forced collectivization was the linchpin of Stalin’s first Five-Year Plan. With the peasants living mostly on small-scale plots, he compelled millions of households to move onto collective farms and sought to turn many peasants into the industrial workers who would build the factories and electric stations needed for crash industrialization. To enforce his plan, he set draconian quotas for the confiscation of “surplus” food and violently repressed millions of so-called kulaks (supposedly better-off peasants), whom he wanted to exterminate as a class.
The consequent famine killed more than five million people in Ukraine, Kazakhstan and Russia’s North Caucasus region. Scholars continue to debate whether the famine in Ukraine, which killed some 3.5 million, was a deliberate aim of Stalin’s policies—intended to destroy Ukraine’s national spirit and culture—or the unforeseen result of his war on the peasantry. Although Mr. Kotkin argues that the famine was “not intentional,” his book makes it clear that Stalin was well aware of widespread starvation and that he responded with remarkable cruelty, sealing Ukraine’s borders to make escape impossible. The Kremlin allowed the famine to deepen, accepting a high number of victims rather than ameliorate its most calamitous effects.

STALIN: WAITING FOR HITLER, 1929-1941

By Stephen Kotkin
Penguin Press, 1,154 pages, $40
Another crisis erupted after the assassination of the Leningrad party chief Sergei Kirov in December 1934. Although many historians, including Robert Conquest and Amy Knight, have argued that Stalin almost certainly orchestrated the crime, Mr. Kotkin accepts the current scholarly consensus that Stalin was not behind Kirov’s murder and that Leonid Nikolayev, a disaffected young worker, carried it out on his own.
There is no debate, however, over how Stalin exploited the murder. He had always insisted that the country “was honeycombed with wreckers,” as Mr. Kotkin writes, and beset by conspiracies to subvert Bolshevik rule. In the wake of Kirov’s death, Stalin first accused thousands of Communist Party figures of engaging in a conspiracy to kill Kirov and then expanded the purge to encompass tens of thousands of military commanders, state-security personnel and party officials, including leaders of the revolution like Nikolai Bukharin, Lev Kamenev and Grigory Zinoviev. Mr. Kotkin argues that Stalin carried out the purge to “smash his inner circle” and avenge himself on elements within the party that had opposed collectivization, but he doesn’t provide sufficient documentation to buttress the claim. Stalin probably regarded army and state-security officers as the only force that could dislodge him.

With the purges under way, Stalin embarked on the Great Terror, a wave of violence that killed more than 800,000 people in the space of 16 months. Among those targeted were the members of ethnic groups—Poles, Koreans, Germans—whom Stalin regarded as unreliable elements, a fifth column that could threaten the regime in case of war. As with all great crimes, we may never truly know what degree of personal obsession, political calculation and ideological fanaticism drove Stalin to order the execution and imprisonment of so many.

While Mr. Kotkin discusses foreign-policy developments throughout the book, including the establishment of diplomatic relations with the United States in 1933 and Soviet intervention in the Spanish Civil War in 1936, the final chapters of “Waiting for Hitler” focus on the 1939 Non-Aggression Pact with Nazi Germany. Within days of its signing, the Wehrmacht invaded Poland from the west, and the Red Army soon occupied Poland from the east. It was the pact that created a common border between Germany and the Soviet Union, a miscalculation by Stalin that proved to be nearly fatal to his regime.
For the next two years, while cooperating with Germany, Stalin tried in vain to fathom Hitler’s intentions. Mr. Kotkin provides a nearly day-by-day account of diplomatic maneuvers involving the Soviet Union, Germany, England and France, along with urgent intelligence reports sent to the Kremlin beginning in August 1940 arguing that Hitler was planning an attack on the Soviet Union. It is here that Stalin’s paranoia proved momentously damaging. No Moscow intelligence chief, aware that several of his predecessors had been executed, would dare contradict Stalin when he insisted that Hitler could be trusted. The reckoning came on June 22, 1941: Stalin’s disregard of the warnings left his country unprepared for the German attack, the point at which Mr. Kotkin’s third volume will presumably begin.
There have been many other biographies of Stalin, but none matches the range of information and analysis that animates Mr. Kotkin’s ambitious project. “Waiting for Hitler” is biography and history on a grand scale—equal in scope to the enormity of the events it describes.
Mr. Rubenstein is the author of “The Last Days of Stalin.”