Saturday, April 22, 2017

A New Parchment Declaration of Independence Surfaces. Head-Scratching Ensues - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Jennifer Schuessler, New York Times [original article contains links and additional illustrations]


Image from article, with caption: A parchment manuscript of the Declaration of Independence, believed to date from the 1780s and held in the West Sussex Record Office in England

Archival research doesn’t get much more exciting than the 2004 heist movie
“National Treasure.” Nicolas Cage, playing a historian named Benjamin Franklin
Gates, discovers a coded map on the back of the Declaration of Independence.
Globe-spanning intrigue ensues — accompanied, offscreen, by a tsunami of eye-rolling
by actual historians.

But now, in a bit of real-life archival drama, a pair of scholars are announcing a
surprising discovery: a previously unknown early handwritten parchment of the
Declaration, buried in a provincial archive in Britain.

The document is the only other 18th-century handwritten parchment
Declaration known to exist besides the one from 1776 now displayed at the National
Archives in Washington. It isn’t an official government document, like the 1776
parchment, but a display copy created in the mid-1780s, the researchers argue, by
someone who wanted to influence debate over the Constitution.

It may not hold the key to a Masonic conspiracy, as in “National Treasure.” But
its subtle details, the scholars argue, illuminate an enduring puzzle at the heart of
American politics: Was the country founded by a unitary national people, or by a
collection of states? [JB emphasis]

“That is really the key riddle of the American system,” said Danielle Allen, a
professor of government at Harvard, who discovered the document with a colleague,
Emily Sneff.

That riddle has bedeviled American history, from debates over Southern
secession to calls to abolish the Electoral College today. And it was the burning
question in the mid-1780s, when the American experiment was at risk of falling
apart, and the push for a federal constitution, creating a strong national government
(with, crucially, the right to tax), gained steam.

The new parchment will hardly end the argument. But it “really shifts our
understanding of how the nationalist position emerged,” Ms. Allen said.

It remains to be seen what scholars will make of the discovery, which will be
announced on Friday at a conference at Yale. A paper, posted online, runs through a
wealth of textual and material evidence supporting the claim that the document,
while found in Britain, was created in America in the 1780s. Ms. Allen and Ms.
Sneff’s conference presentation will focus on their leading candidate for the person
behind it: James Wilson, a Pennsylvania lawyer and one of the strongest nationalists
at the 1787 Constitutional Convention, who probably commissioned the parchment.

Some historians who have previewed their research are impressed.

“The sleuthing they’ve done is just remarkable,” said Benjamin Irvin, an
associate professor of history at the University of Arizona and the author of “Clothed
in Robes of Sovereignty,” a 2011 study of the Continental Congress. The
identification as American, from the mid-1780s, he added, “looks pretty watertight.”

And the whodunit? William Ewald, a legal historian at the University of
Pennsylvania Law School who is writing a biography of Wilson, said he found the
case for Wilson — one of six men who signed both the Declaration and the
Constitution, and the rare founder to invoke the earlier document in the 1780s —
“very plausible.”

Even if that attribution is wrong, Mr. Ewald added, the parchment is still “the
discovery of a lifetime.”

“Every 20 years or so, someone discovers an unknown copy of one of the
newspaper printings,” he said. “But a new formal parchment — how many people
can say they found that?”

The new discovery grew out of the Declaration Resources Project, which Ms.
Allen, the author of the book “Our Declaration,” created in 2015 as a clearinghouse
for information about the myriad versions — newspaper printings, broadsides,
ornamental engravings — that circulated in the decades after independence.

So far, the project’s database counts some 306 versions made between July 4, 1776, when
Congress commissioned a broadside from the Philadelphia printer John Dunlap,
and 1800. (The parchment “original” at the National Archives was in fact signed in
early August 1776, nearly a month after independence.)

Soon after the effort started, Ms. Sneff, the project manager, noticed an entry in
an online catalog of British archives listing a parchment copy of the Declaration held
by the West Sussex Record Office in Chichester, England, but providing no date or
other detail.

“I was very skeptical but intrigued,” she said. (The document, deposited in West
Sussex in 1956, had come from a law firm connected with the dukes of Richmond.)
She requested an image. Then, last summer, she and Ms. Allen traveled to
Britain to see the original, which had been folded into a small square.

“I was on pins and needles,” Ms. Allen recalled. “I thought we would turn it over
and the back would say, ‘Ha, ha, we fooled you!’”

The parchment — the only known iteration of the Declaration oriented
horizontally — was stylistically similar to 18th-century American legal and
mercantile documents, suggesting it was made by a commercial clerk, probably in
New York or Philadelphia. (A comparison with more than 150 handwriting samples
drew no matches to known individuals.)

Some details of the text suggest that whoever created it had had access to
congressional records, including the 1776 parchment. But it deviated from that
parchment — and from every other known 18th-century version of the Declaration — in
one striking respect: the ordering of the 56 signatures.

All known 18th-century iterations, Ms. Allen said, show the signatures grouped
by state, with some printers even adding state labels. But here they were all jumbled.

“I just kept staring at it,” she said. “There was no discernible order.”

But then she labeled each name with the number of the column it appeared in
on the 1776 parchment, and noticed that they alternated in a clear pattern — a
pattern, she and Ms. Sneff argue, created with help from a well-known 18th-century
cipher.
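
For readers curious about that labeling step, here is a minimal sketch of the procedure as the article describes it, written in Python. The names, ordering and column assignments below are hypothetical placeholders, not the researchers' actual data, and the snippet illustrates only the bookkeeping, not the cipher itself.

# Hypothetical mapping from each signer to the column (1-6) in which
# his name appears on the 1776 parchment. These assignments are made up.
column_on_1776 = {
    "Signer A": 1, "Signer B": 4, "Signer C": 2,
    "Signer D": 5, "Signer E": 3, "Signer F": 6,
}

# Hypothetical signature order as read off the newly found parchment.
new_parchment_order = ["Signer A", "Signer B", "Signer C",
                       "Signer D", "Signer E", "Signer F"]

# Label each name with its 1776 column number, as Ms. Allen did by hand.
sequence = [column_on_1776[name] for name in new_parchment_order]
print(sequence)  # [1, 4, 2, 5, 3, 6] -- a regular alternation through the
                 # columns, rather than state-by-state blocks, is the kind
                 # of pattern the article describes her noticing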

That seemingly random order, Ms. Allen and Ms. Sneff argue, was meant to send a
political message: The signers pledged “to each other our lives, our fortunes and our
sacred honor,” as the last line puts it, as individuals, not as representatives of states.
And that message, they argue, points to Wilson.

Today, he is remembered by the public, if he is remembered at all, as the flip-flopper
in the musical “1776,” who can’t decide whether or not to vote for
independence. But at the Constitutional Convention, he was the leading voice for a
strong national government, undergirded by popular sovereignty.

“Can we forget for whom we are forming a government?” Wilson said. “Is it for
men, or for the imaginary beings called states?”

He was also the rare politician of the 1780s to repeatedly cite the Declaration —
a document whose history he would have had the chance to ponder, Ms. Allen and
Ms. Sneff note, during research he is known to have done in 1785 in the archives of
the Continental Congress.

“Before he does that archival work, he doesn’t reference the Declaration,” Ms.
Allen said. Afterward, “he always cites it” when making the nationalist argument.

There are other riddles to be unwoven, including just how the document
got to England. The researchers’ preliminary hypothesis? It passed into the
possession of the third Duke of Richmond, a supporter of American independence,
possibly through Thomas Paine.

But for now, they point to a broader lesson: Every iteration of the Declaration
has a story to tell.

“This one,” Ms. Sneff said, “just happens to tell a pretty significant one.”

Friday, April 21, 2017

To Stay Married, Embrace Change - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


 Ada Calhoun, New York Times [original article contains links]; see also re change as a suggested "unifying" factor in American life.

image from article

A couple of years ago, it seemed as if everyone I knew was on the verge of divorce.

“He’s not the man I married,” one friend told me.

“She didn’t change, and I did,” said another.

And then there was the no-fault version: “We grew apart.” [JB emphasis]

Emotional and physical abuse are clear-cut grounds for divorce, but they aren’t
the most common causes of failing marriages, at least the ones I hear about. What’s
the more typical villain? Change.

Feeling oppressed by change or lack of change: it’s a tale as old as time [JB emphasis]. Yet at
some point in any long-term relationship, each partner is likely to evolve from the
person we fell in love with into someone new — and not always into someone cuter
or smarter or more fun. Each goes from rock climber to couch potato, from rebel to
middle manager, and from sex crazed to sleep obsessed.

Sometimes people feel betrayed by this change. They fell in love with one person,
and when that person doesn’t seem familiar anymore, they decide he or she violated
the marriage contract. I have begun to wonder if perhaps the problem isn’t change
itself but our susceptibility to what has been called the “end of history” illusion.

“Human beings are works in progress that mistakenly think they’re finished,”
the Harvard professor Daniel Gilbert said in a 2014 TED talk called “The Psychology
of Your Future Self.” He described research that he and his colleagues had done in
2013: Study subjects (ranging from 18 to 68 years old) reported changing much
more over a decade than they expected to.

In 2015, I published a book about where I grew up, St. Marks Place in the East
Village of Manhattan. In doing research, I listened to one person after another claim
that the street was a shadow of its former self, that all the good businesses had
closed and all the good people had left. This sentiment held true even though people
disagreed about which were the good businesses and who were the good people.

Nostalgia, which fuels our resentment toward change, is a natural human
impulse. And yet being forever content with a spouse, or a street, requires finding
ways to be happy with different versions of that person or neighborhood.

Because I like to fix broken things quickly and shoddily (my husband, Neal, calls
my renovation aesthetic “Little Rascals Clubhouse”), I frequently receive the advice:
“Don’t just do something, stand there.”

Such underreacting may also be the best stance when confronted by too much
or too little change. Whether or not we want people to stay the same, time will bring
change in abundance.

A year and a half ago, Neal and I bought a place in the country. We hadn’t been
in the market for a house, but our city apartment is only 500 square feet, and we
kept admiring this lovely blue house we drove by every time we visited my parents. It
turned out to be shockingly affordable.

So now we own a house. We bought furniture, framed pictures and put up a
badminton net. We marveled at the change that had come over us. Who were these
backyard-grilling, property-tax-paying, shuttlecock-batting people we had become?

When we met in our 20s, Neal wasn’t a man who would delight in lawn care,
and I wasn’t a woman who would find such a man appealing. And yet here we were,
avidly refilling our bird feeder and remarking on all the cardinals.

Neal, who hadn’t hammered a nail in all the years I’d known him, now had
opinions on bookshelves and curtains, and loved going to the hardware store. He
whistled while he mowed. He was like an alien. But in this new situation, I was an
alien, too — one who knew when to plant bulbs and how to use a Crock-Pot, and
who, newly armed with CPR and first aid certification, volunteered at a local camp.
Our alien selves were remarkably compatible.

Several long-married people I know have said this exact line: “I’ve had at least
three marriages. They’ve just all been with the same person.” I’d say Neal and I have
had at least three marriages: Our partying 20s, child-centric 30s and home-owning
40s.

Then there’s my abbreviated first marriage. Nick and I met in college and dated
for a few months before dropping out and driving cross-country. Over the next few
years, we worked a series of low-wage jobs. On the rare occasions when we discussed
our future, he said he wasn’t ready to settle down because one day, he claimed, he
would probably need to “sow” his “wild oats” — a saying I found tacky and a concept
I found ridiculous.

When I told Neal about this years later, he said, “Maybe you found it ridiculous
because you’d already done it.”

It’s true that from ages 16 to 19 I had a lot of boyfriends. But with Nick, I
became happily domestic. We adopted cats. I had changed in such a way that I had
no problem being with just one person. I was done changing and thought he should
be, too. Certainly, I thought he should not change into a man who sows oats.

When we got married at the courthouse so he could get his green card (he was
Canadian), I didn’t feel different the next day. We still fell asleep to “Politically
Incorrect” with our cats at our feet as we always had.

We told anyone who asked that the marriage was no big deal, just a formality so
the government wouldn’t break us up. But when pressed, we found it hard to say what
differentiated us from the truly married beyond the absence of a party.

When I grew depressed a few months later, I decided that he and our pseudo-marriage
were part of the problem. After three years of feeling like the more
committed person, I was done and asked him to move out. When he left, I felt sad
but also thrilled by the prospect of dating again. A couple of years later, I met Neal.

Recently, I asked Nick if we could talk. We hadn’t spoken in a decade. He lives
in London now, so we Skyped. I saw that he looked almost exactly as he had at 22,
though he’d grown a long beard. We had a pleasant conversation. Finally, I asked
him if he thought our marriage counted.

“Yeah,” he said. “I think it counts.”

We were married, just not very well. The marriage didn’t mean much to us, and
so when things got rough, we broke up. I had been too immature to know what I was
getting into. I thought passion was the most important thing. When my romantic
feelings left, I followed them out the door. It was just like any breakup, but with
extra paperwork.

Nick now works at a European arts venue. He’s unmarried. I wouldn’t have
predicted his life or his facial hair. I don’t regret our split, but if we had stayed
married, I think I would have liked this version of him.

My hair is long and blond now. When Neal and I met, it was dyed black and cut
to my chin. When I took to bleaching it myself, it was often orange, because I didn’t
know what I was doing.

Now I weigh about 160 pounds. When I left the hospital after being treated for a
burst appendix, I weighed 140. When I was nine months pregnant and starving every
second, I weighed 210. I have been everything from size 4 to 14. I have been the life
of the party and a drag. I have been broke and loaded, clinically depressed and
radiantly happy. Spread out over the years, I’m a harem.

How can we accept that when it comes to our bodies (and everything else, for
that matter), the only inevitability is change? And what is the key to caring less about
change as a marriage evolves — things like how much sex we’re having and whether
or not it’s the best sex possible?

One day in the country, Neal and I heard a chipmunk in distress. It had gotten
inside the house and was hiding under the couch. Every few minutes, the creature let
out a high-pitched squeak. I tried to sweep it out the door to safety with a broom, but
it kept running back at my feet.

“Wow, you’re dumb,” I said to it.

“I got this,” Neal said, mysteriously carrying a plastic cereal bowl. “Shoo it out
from under there.”

I did, and the chipmunk raced through the living room. Neal, like an ancient
discus thrower, tossed the bowl in a beautiful arc, landing it perfectly atop the
scampering creature. He then slid a piece of cardboard under the bowl and carried
the chipmunk out into the bushes, where he set it free.

“That was really impressive,” I said.

“I know,” he said.

To feel awed by a man I thought I knew completely: It’s a shock when that
happens after so many years. And a boon. That one fling of a bowl probably bought
us another five years of marriage.

Ada Calhoun, who lives in New York, is the author of a forthcoming memoir, “Wedding
Toasts I’ll Never Give,” from which this essay is adapted.

Cherokee Nation Sues Walmart, Others Over Illegal Opioids - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Josh Saul, newsweek.com; see also.

Image from article, with caption: "Flag of the Cherokee Nation"

Updated | Walmart, Walgreens and CVS Health are among the targets of a lawsuit filed Thursday by the Cherokee Nation accusing the companies of flooding Indian Country with prescription opioids in order to boost their bottom lines.

“Today in the Cherokee Nation, as elsewhere in the country, prescription opioids are more deadly than heroin,” reads the lawsuit, filed in the District Court of the Cherokee Nation against six companies that also include McKesson Corporation, Cardinal Health Inc. and AmerisourceBergen. “Defendants created conditions in which vast amounts of opioids have flowed freely from manufacturers to abusers and drug dealers.” Lawyers for the Cherokee Nation—which is made up of 14 counties in Northeast Oklahoma—say the lawsuit is the first of its kind.

The suit charges that the six companies have long had the ability to reduce the death toll and financial impact of the opioid epidemic in the Cherokee Nation—adding up to hundreds of deaths and hundreds of millions of dollars—but they instead chose to pursue profits.

The lawsuit seeks unspecified damages, but Cherokee Nation Attorney General Todd Hembree said the total will eventually be hundreds of millions of dollars. “We’ve seen broken families. We’ve seen children born addicted,” Hembree tells Newsweek. “This lawsuit is important because the defendants have used every effort to market these drugs and profit from these drug sales to our people in our jurisdiction.” Hembree accuses the companies of allowing prescription opioids to fall into illegal distribution channels, failing to alert regulators of extreme volume and incentivizing sales of the drugs with financial bonuses.

“I fear the opioid epidemic is emerging as the next great challenge of our modern era,” Cherokee Nation Principal Chief Bill John Baker said in a press release. “As we fight this epidemic in our hospitals, our schools and our Cherokee homes, we will also use our legal system to make sure the companies, who put profits over people while our society is crippled by this epidemic, are held responsible for their actions.”

A CVS spokesman sent a statement that said, in part, "CVS Health is committed to the highest standards of ethics and business practices, including complying with all federal and state laws governing the dispensing of controlled substance prescriptions, and is dedicated to reducing prescription drug abuse and diversion." A Cardinal Health Inc. spokeswoman sent a statement that said the company is committed to helping solve the opioid crisis and, "Cardinal Health is confident that the facts and the law are on our side, and we intend to vigorously defend ourselves against the plaintiff’s mischaracterization of those facts and misunderstanding of the law." A Walgreens spokesman declined to comment on pending litigation. The other companies did not immediately respond to requests for comment.

A fraternity was told it was ‘appropriating culture.’ Administrators won’t say which.


washingtonpost.com



Don’t blame college students for their hostility to free expression. The fault ultimately lies with cowardly school administrations, who so often cave to student demands for censorship. Or as some now prefer to call it, “empowering a culture of controversy prevention.”

Those are the actual, Orwellian words of an official at American University.

Several weeks ago, a fraternity at AU, Sigma Alpha Mu, began planning a fundraiser for a veterans’ organization. Student groups often center fundraisers on athletic tournaments, fraternity president and sophomore Rocco Cimino told me, but all the popular sports had already been claimed. The fraternity members decided to go with . . . badminton.

To jazz things up, they called their event “Bad(minton) and Boujee.” It’s a pun on “Bad and Boujee,” a popular rap song by the group Migos about being newly rich and hanging with materialistic women. Sigma Alpha Mu registered the fundraiser on American’s online scheduling system, required for all campus events.

A few days later Cimino got a strange email from the school.

Colin Gerker, assistant director of fraternity and sorority life, said the word “boujee” might be criticized for “appropriating culture.” He would not approve the event unless the fraternity changed the name.

“I want to continue empowering a culture of controversy prevention among [Greek] groups,” Gerker wrote. He advised them to “stay away from gender, culture, or sexuality for thematic titles.”

The students were perplexed.

A brief etymology, for those not familiar with “boujee”: The word originates with the Latin for castle or fortified town, “burgus.” This evolved into the French “bourgeois,” for people who live in town rather than the countryside. Town dwellers were more likely to engage in commerce and craftsmanship, and so rose over time to achieve middle-class incomes. That’s why Karl Marx later used the term to derisively refer to the class that upheld capitalism. Over time, “bourgeois” morphed into a more generic description of middle-class (and eventually upper-middle-class) materialism and obsession with respectability.

More recently, “bourgeois” was shortened to the colloquial “bourgie,” alternately spelled “bougie” or “boujee,” used disdainfully to describe upper-middle-class or high-end tastes (driving your Prius to Trader Joe’s after yoga class, for example). The “boujee” variation is common when referring to middle-class or upwardly mobile blacks, as in the Migos song. That’s hardly this spelling’s exclusive usage, though, as is evident from its entries in the crowd-sourced slang glossary Urban Dictionary.

So, in a way, “boujee” is indeed an appropriation — or rather an appropriation of an appropriation of an appropriation. That’s how language works. It’s fluid, evolving, constantly taking from other tongues, dialects and usages.

When the fraternity was accused of “appropriating culture,” the obvious question was: Which culture? Latin? French? Marxist? Urban hip-hop? Maybe their own? After all, if you’re wondering who best epitomizes today’s upper-middle class, bear in mind that these are college kids whose parents pay extra money on top of tuition to throw parties.

Figuring the administration misunderstood what “boujee” meant, Cimino challenged the school’s ultimatum. He explained the term, and added that this was just a regular sports tournament with a punny name. Otherwise it had nothing to do with the content of a rap song, in case that was the concern.

But Gerker ceded no ground, reiterating that the fraternity was “appropriating culture” and adding that in the interim he had received “multiple complaints” about the event title.

“I am awaiting a response from some folks on how they want to move forward with their complaints,” he wrote.

Still puzzled, the fraternity asked whether they could see the complaints lodged against them, but they never heard back. With time running short, they canceled the event and posted a GoFundMe page instead.

I reached out to the school to ask for clarification.

A spokeswoman sent a statement about how the “sequence of events did not go according to our normal process for working with student organizations.” She said the administration should not have prohibited the fundraiser and that it usually focuses on “coaching” students about how to proceed when an event “could have a negative impact and unintended consequences on campus.” But I never got an answer to what was so objectionable about the event title in the first place.

Neither did the students. In a meeting Thursday, another administrator apologized to Cimino for not following protocol and pledged to help promote a future event for the veterans’ group. Still no explanation, though, of the “cultural appropriation” accusations beyond something like “we thought it could be controversial.”

Schools were once charged with educating, challenging and setting an example for their wards. Today’s pupils must settle for controversy-prevention empowerment instead.

A Fictional (So Far) History of the Second American Civil War - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


By Justin Cronin, April 20, New York Times

Review of AMERICAN WAR, by Omar El Akkad. 333 pp. Alfred A. Knopf. $26.95.

image from article


Nationalistic movements blooming across Europe, sectarian violence roiling the
Middle East, vast refugee populations on the move, chunks of ice the size of Rhode
Island calving into the sea — and now, to top it all off, you-know-who, padding
around 1600 Pennsylvania Avenue in his pajamas with his thumb on the “tweet”
button: How tempting it is, in these fraught and fractious times, to view any
dystopian novel as a kind of nonfiction in waiting, a pre-journalism of the future.

This urge is nothing new, of course. In the modern era, the granddaddy of such
books is Orwell’s “1984,” a novel sentenced forever by its prognosticatory title to
high school reading lists across the English-speaking world. (I often wish Orwell had
stuck with his original title, “The Last Man in Europe,” so that more readers might
encounter it later in life and appreciate its complexity.) Likewise did the Cold War
produce a bountiful literature of white-knuckle nuclear prediction. From Nevil
Shute’s “On the Beach” to Pat Frank’s “Alas, Babylon” to the apocalyptic kitsch of
“Planet of the Apes” (confession: I sat through all five in a row at an “Apes” film
festival in 1975), the message was the same: Fellow humans, we blew it. Now we’re
doomed.

Our worries may have evolved since then, but not the impulse to enact them on
the page, and Omar El Akkad’s “American War” is a disturbingly plausible case in
point — a tale of a future America torn asunder by its own political and tribal
affiliations [JB emphasis].

El Akkad’s novel, his first, opens in a distant future when the United States as
we know it is barely a memory, permanently knocked off the world stage by climate
change, plague and intrastate conflict. The novel’s nominal narrator (a conceit that is
quickly pushed into the background) is a historical researcher who has devoted his
life to studying “this country’s bloody war with itself.” Part of the Miraculous
Generation “born in the years between the start of the Second American Civil War in
2074 and its end in 2095,” Benjamin Chestnut arrived in New Anchorage, Alaska, as
a young refugee. Now an old man dying of cancer, he tells a story in equal measures
about historical reconstruction and personal atonement. “There are things I know
that nobody else knows,” he says. “I know because she told me. And my knowing
makes me complicit.”

The “she” he speaks of is his aunt, Sara T. Chestnut, known as Sarat, who is the
novel’s true subject. When the story reboots in 2075, Sarat is a young girl living with
her family in a shipping container in a mostly drowned Louisiana. Climate change
has occurred on a massive, unanticipated scale; many coastal cities are gone, as well
as virtually all of peninsular Florida. (The federal government has relocated to
Columbus, Ohio, a nice touch.) When, in the face of environmental catastrophe,
fossil fuel is outlawed, the country goes bonkers. Mississippi, Alabama and Georgia
secede to form the Free Southern State; South Carolina, which led the revolt, is
encased by a massive wall after the federal government unleashes the first of the
novel’s two plagues to tamp down the rebellion.

This seems a bit far-fetched. Southerners do love their Nascar, but going to war to
defend their rights to gas up a muscle car? All of Florida? And the wall around South
Carolina — where have we heard this before? Were the residents of South Carolina
perhaps made to pay for it? There’s a fair amount of authorial winking and seat-of-the-pants
science going on here, but never mind; El Akkad is far less concerned with
the mechanics of his conceit than its psychological underpinnings. When Sarat’s
father is killed in a terrorist blast and rebel militias close in on the family home, the
Chestnuts flee to a filthy tent city for displaced persons on the Tennessee border,
ironically named Camp Patience — the “festering heart of the war-torn South.” Just
beyond the wires lies the front line separating “Reds” from “Blues.” It is here, under
the gaze of Northern snipers ordered to kill any who attempt to cross, that Sarat
commences her education as a would-be freedom fighter or terrorist, take your pick.

By this point, if the novel’s true historical and social analogues aren’t apparent
to the reader, they should be. The novel may be set in the future, and the title may be
“American War,” but there’s nothing especially futuristic or, for that matter,
distinctly American about it. This is precisely the author’s point, and the thing that’s
most unsettling about the book. America is not Iraq or Syria, but it’s not Denmark,
either; it’s a large, messy, diverse country glued together by 250-year-old paperwork
composed by yeoman farmers, and our citizens seem to understand one another less
by the day. Puncture the illusion of a commonwealth, El Akkad asserts, fire a few
shots into the crowd and put people in camps for a decade, and watch what happens.

Sarat is the novel’s test case. As the war grinds pointlessly on, and she and her
family languish in materially deprived boredom, she is singled out by a smooth-talking
figure named Gaines, who hires her to deliver money to rebel militiamen
operating outside the purview of the Army of the Free Southern State. Twelve years
old, she is soon passing her days in his company, being fed a steady diet of pro-Southern
propaganda and oily praise while Gaines grooms her for something more.
Gaines is an American veteran of various Middle Eastern conflicts (the money is
funneled from the Bouazizi Empire, a unified, post-Arab Spring Middle East), and he
has learned well the lessons of his former adversaries. “I seek out special people,”
Gaines tells her, “people who, if given the chance and the necessary tools, would
stand up and face the enemy on behalf of those who can’t … who would do this even
if they knew for certain that it would cost them dearly, maybe even cost them their
lives.” Sarat rises to the bait; when Northern militiamen massacre the residents of
Camp Patience, killing Sarat’s mother and gravely wounding her brother, her fate is
sealed. “Sarat turned her attention to the only thing that still mattered: revenge, the
unsettled score.”

All of this unfolds at an unhurried pace; the novel’s thriller premise notwithstanding,
El Akkad applies a literary writer’s care to his depiction of Sarat’s psychological
unpacking and the sensory details of her life, first in Camp Patience, then on the move
as a freelance insurgent. (The story also pauses at regular intervals for the inclusion of
various wartime documents — committee reports, bureaucratic case files, eyewitness
accounts — to flesh out the background.) Even as the story delves deeper into the
political minutiae of the war — in particular, a power struggle between the government
of the Free Southern State and rebel militias over the question of ending the conflict —
it also makes the case that Sarat’s journey is an entirely personal one, as war itself
becomes personal, a collection of private grievances looking for a public solution.
By the time Sarat is finally captured and sent to a Guantanamo-like prison to be
waterboarded, she’s achieved legendary status, but she hardly cares; she’s a
thoroughly apolitical animal. When the war ends and she’s abruptly released,
there can be little doubt that her program of vengeance has not ended. It’s
merely looking for its terrible, final expression.

“For Sarat Chestnut,” her nephew explains, “the calculus was simple: The
enemy had violated her people, and for that she would violate the enemy. There
could be no other way, she knew it. Blood can never be unspilled.” Whether read as a
cautionary tale of partisanship run amok, an allegory of past conflicts or a study of
the psychology of war, “American War” is a deeply unsettling novel. The only
comfort the story offers is that it’s a work of fiction. For the time being, anyway.

Justin Cronin’s most recent novel is “The City of Mirrors,” Book 3 of the Passage trilogy.

‘In God We Trust,’ Even at Our Most Divided - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


wsj.com; for more on "In God We Trust" as the official motto of the United States (so established by Congress in 1956 during the Cold War, defined by many as a struggle between God-believing capitalism and atheistic communism), see.



The story behind the Civil War-era motto that still appears on America’s coins.



PHOTO (from article): GETTY IMAGES
On April 22, 1864, Congress approved a significant revision to the nation’s coinage: the addition of “In God We Trust” on several U.S. coins. This was more than a small change for small change: Governmental officials believed it would help America through a time of crisis. As the country continues to slog through an era of deep division [JB emphasis], it’s worth studying the ideals that informed this refinement of American currency.

April 1864 was not necessarily an auspicious time for the U.S. The Civil War was raging. Bloody battles took place at Sabine Crossroads and Pleasant Hill, and free African-American soldiers were massacred when they were overrun at Fort Pillow in Tennessee. Southern secession left the nation physically and spiritually fractured.

With political life frayed and the war effort faltering, adding a new motto to American coinage might have looked like desperation or propaganda. It was neither. Abraham Lincoln and Treasury Secretary Salmon P. Chase had known about the idea for years. In an 1861 letter, the Rev. M.R. Watkinson of Pennsylvania asked Chase to consider recognizing “the Almighty God in some form on our coins.”

Chase, an abolitionist Ohio Republican, had liked the idea for years. “No nation can be strong except in the strength of God, or safe except in His defense,” he wrote to the director of the U.S. Mint in 1861. “The trust of our people in God should be declared on our national coins.” Some three years later the motto was approved by Congress and stamped on coinage in Philadelphia.

The change fit the mood of the time. Facing the dissolution of the Union, many Americans looked for divine aid to help heal the national divisions. They recognized that faith could sustain liberty and self-government. This echoed the acts of earlier generations of Americans, who during the Revolutionary War had flown battle flags bearing the motto “An Appeal to Heaven.”

Does using the language of faith on currency constitute another example of “civil religion” perverting traditional religion for secular ends? As historian John D. Wilsey argued in “American Exceptionalism and Civil Religion,” such public religious appeals aren’t necessarily destined to become unhealthy derivatives of serious religious ideals. They can create an open ideal that broadens the circle of citizenship and invites participation—which the “In God We Trust” stamp did.

President Lincoln channeled these religious concerns during his Second Inaugural Address in 1865. His reflections were brief but profound, drawing heavily on biblical language. The president rejected the South’s claims, but he did so with humility. “It may seem strange that any men should dare to ask a just God’s assistance in wringing their bread from the sweat of other men’s faces, but let us judge not, that we be not judged,” he asserted, building off Matthew 7:1-2.

Rather than assume a morally superior position, Lincoln used the moment to call for self-reflection. The North had also been entangled in slavery and the violence of the Civil War, and it was in no position to claim perfect conduct. “The Almighty has His own purposes,” Lincoln said. And, no matter what, “so still it must be said ‘the judgments of the Lord are true and righteous altogether,’ ” evoking Psalm 19:9.

If both North and South stood under divine judgment, then a new attitude was demanded, one of humbly working for the common good. In his peroration, Lincoln called his hearers to steady service: “With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in.”

The most important of these tasks was “to bind up the nation’s wounds, to care for him who shall have borne the battle and for his widow and his orphan.” Lincoln was calling to mind the good Samaritan from the Gospel of Luke, who, finding an injured man, “bound up his wounds, pouring in oil and wine.” Similarly, his injunction to help the widow and the orphan echoed the Book of James, which taught that “pure religion” consisted at least partly of visiting “the fatherless and widows in their affliction.”

Lincoln concluded that this vision could be a global one, as they would “do all which may achieve and cherish a just and lasting peace among ourselves and with all nations.” The 16th president thus demonstrated that the best religious reflection in public life could lead to humility, self-criticism, care for fellow citizens, and renewal of civic ties. And that seems like a beneficial reminder from the random coins jangling in our pockets.

Of course adults sneer at Millennials: Christian Schneider - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


Christian Schneider, USA Today

uncaptioned image from article

Hating younger generations is an American tradition, but the kids always turn out fine.

In 1749, a cantankerous Benjamin Franklin engaged in a tradition older than America itself: Throwing shade at a younger generation. Franklin, referring to the common complaint that the youth of America were not of "equal ability" to their predecessors, said that "The best capacities require cultivation, it being truly with them, as with the best ground, which unless well tilled and sowed with profitable seed, produces only ranker weeds."

Imagine what Franklin would have had to say about Millennials.

One does not have to scan the news too vigorously to find endless condemnations of the latest crop of American youth. Children born from the mid-1980s to the early 2000s are now routinely derided as "snowflakes," as if each thinks he or she is a unique gift to the world. Raised in a culture of "trigger warnings" and "microaggressions," Millennials have been accused of fostering a culture of hypersensitivity, unable to connect with the naked realities of the real world.

Nowhere has this transition been more evident than on college campuses, which have taken racial and sexual balkanization to new extremes.

For instance, who can forget when the University of Minnesota banned the use of female cheerleaders, believing their routines fostered demeaning "sexual stereotypes?" Or when University of Wisconsin-Milwaukee officials distributed a list of 49 "Ways to Experience Diversity," including urging students to "Hold hands publicly with someone of a different race or someone of the same sex as you" and "Go to a toy store and investigate the availability of racially diverse dolls?" Or when the University of Connecticut banned "inappropriately directed laughter?"

You may think you saw these examples fly by on Twitter over the past few months. But they all actually took place in the late 1980s and early 1990s, when campuses were at the pinnacle of the first era of "political correctness." At one point, the University of Arizona instituted a "Diversity Action Plan" that banned discrimination on the grounds of things like age, color, ethnicity, race, religion, sexual orientation and "personal style." A campus "diversity specialist" clarified that "personal style" would include "nerds and people who dress differently." (All these examples can be found in Charlie Sykes' prescient book A Nation of Victims, released way back in 1992.)

Amid the modern tumult, it's easy to forget that we've been here before. And it's probably now safe to say that college students of my generation (I started school in 1991) made it through this era of progressive inculcation okay.
Generation X was literally known as the "slacker generation" — an ironically detached group of kids raised in the 1970s that couldn't be bothered to muster up enthusiasm for anything other than flannel shirts and Winona Ryder. A Washington Post headline from the early 1990s perfectly represented the enmity Baby Boomers felt against Gen Xers: "The Boring Twenties: Grow Up, Crybabies."

And yet Gen Xers — the neglected middle child between the Baby Boomers and the Millennials — have now inhabited the cultural and political positions of power they once disdained. Speaker of the House Paul Ryan is a Gen-Xer, as are United Nations Ambassador Nikki Haley and rising Senators Marco Rubio and Cory Booker. Among their members, Xers also count the founders of tech giants like Twitter and Google — tools which, ironically, can aid interested Millennials in finding out more about Gen X. (Suggested first search: "Who is Pauly Shore?")

Predictably, now it's Gen X's turn to heap disdain upon the younger generation. (As Michael Kinsley once quipped about Gen X, "These kids today. They're soft. They don't know how good they have it. Not only did they never have to fight in a war . . . they never even had to dodge one.")

It is true that Millennials differ from other generations in some important regards. A report released by the U.S. Census Bureau on Wednesday of this week demonstrated that today's young people are far more likely to live at home and delay getting married and having children. But Gen X is also heavily influenced by major demographic and cultural changes that took place during the 1970s — with more women in the workplace and increased access to birth control, Generation X is far smaller than the two generations that sandwich it on either side.

But while their styles are different — Gen X went out of its way to prove it didn't care about anything while Millennials seem to care way too much about everything — the two groups have a lot more in common than the fact that Jennifer Aniston appears not to have aged during the transition.
For instance, both groups experienced a campus climate with an excruciating emphasis on identity politics. Perhaps most notable for the Gen Xers was the "speech code" enacted by the University of Wisconsin-Madison in the late 1980s. "The university is institutionally racist," declared Chancellor Donna Shalala at the time, adding that the campus simply reflected American society, which is "racist and sexist." Not surprisingly, conservative speakers at the state's other campuses were soon pelted with hard objects and shouted down — a scene that would become common once again in 2017.

In fact, the last few years on college campuses have become a virtual "I Love the '80s" of grievance and victimhood. A group of students at Pomona College recently wrote a letter to their school's administration claiming "the idea that there is a single truth . . . is a myth and white supremacy." Earlier this month, Rice University stopped using the term "master" to refer to the heads of its residential colleges, as they feared the term was too closely associated with slavery. And on and on.

Thankfully, the cyclical nature of the political correctness movement offers hope for the Millennial generation. For one, the mere process of growing up, becoming an adult, getting a job, having kids and paying taxes typically has the effect of grounding people in reality. It happens with every generation.

Further, if the PC movement of the late 1980s and early 1990s is any sort of blueprint, there will soon be a backlash to the modern buffoonery happening on campuses. Society typically has a way of finding its water level — the political correctness of decades ago was followed by cultural figures devoted to shattering that oversensitivity.

It was no coincidence that 1990 saw the rise of Andrew Dice Clay and 2 Live Crew — middling artists who reveled in tastelessness and taboo-shattering. Soon, Adam Sandler movies were making hundreds of millions of dollars. Culture eventually corrects itself. (That is not to say Clay or 2 Live Crew made the world a better place, but there's always good money to be made in a well-timed backlash.)

Most importantly, we should have learned by now that dividing up individual people by birthdate is a wildly inaccurate way of judging a generation's relative quality. [JB emphasis] Every age group is going to have its leftist radicals and its religious conservatives. My generation managed to birth both Janeane Garofalo and Ted Cruz. Yet the internet is always going to devote more pixels to the attention-seekers shouting into bullhorns than to the students putting their heads down and gritting their way through their studies.

During recent campus incidents in which conservative speakers were accosted by groups of protesters, the fact has been lost that a good number of students actually showed up at these events to see what the speakers had to say. In fact, there's ample evidence that campus activism might be provoking a silent backlash. While Millennials clearly do support "liberal" positions such as same-sex marriage, young people now are less likely than their parents to support legal abortion and at least as likely as older people in their support for gun rights.

Although new technology may make it seem as if we're in uncharted territory, history tells us that the kids who scare us now will one day be just fine. Swap in Lena Dunham for Kurt Cobain, Lady Gaga for Madonna, and the cycle grinds on. And if we don't do enough to help Millennials succeed, it will be the older generations that will have failed them, not the other way around.