Sunday, May 3, 2015

Robert Reich: America's economy is a nightmare of our own making. Note for a lecture, "E Pluribus Unum? What Keeps the United States United."


The former secretary of labor examines how our country became the most unequal society in the developed world

Robert Reich, Salon

This piece originally appeared in the spring edition of The American Prospect.

For the past quarter-century I’ve offered in articles, books, and lectures an explanation for why average working people in advanced nations like the United States have failed to gain ground and are under increasing economic stress: Put simply, globalization and technological change have made most of us less competitive. The tasks we used to do can now be done more cheaply by lower-paid workers abroad or by computer-driven machines.

My solution—and I’m hardly alone in suggesting this—has been an activist government that raises taxes on the wealthy, invests the proceeds in excellent schools and other means people need to become more productive, and redistributes to the needy. These recommendations have been vigorously opposed by those who believe the economy will function better for everyone if government is smaller and if taxes and redistributions are curtailed.

While the explanation I offered a quarter-century ago for what has happened is still relevant—indeed, it has become the standard, widely accepted explanation—I’ve come to believe it overlooks a critically important phenomenon: the increasing concentration of political power in a corporate and financial elite that has been able to influence the rules by which the economy runs. And the governmental solutions I have propounded, while I believe them still useful, are in some ways beside the point because they take insufficient account of the government’s more basic role in setting the rules of the economic game.

Worse yet, the ensuing debate over the merits of the “free market” versus an activist government has diverted attention from how the market has come to be organized differently from the way it was a half-century ago, why its current organization is failing to deliver the widely shared prosperity it delivered then, and what the basic rules of the market should be. It has allowed America to cling to the meritocratic tautology that individuals are paid what they’re “worth” in the market, without examining the legal and political institutions that define the market. The tautology is easily confused for a moral claim that people deserve what they are paid. Yet this claim has meaning only if the legal and political institutions defining the market are morally justifiable.

II

Most fundamentally, the standard explanation for what has happened ignores power. As such, it lures the unsuspecting into thinking nothing can or should be done to alter what people are paid because the market has decreed it.

The standard explanation has allowed some to argue, for example, that the median wage of the bottom 90 percent—which for the first 30 years after World War II rose in tandem with productivity—has stagnated for the last 30 years, even as productivity has continued to rise, because middle-income workers are worth less than they were before new software technologies and globalization made many of their old jobs redundant. They therefore have to settle for lower wages and less security. If they want better jobs, they need more education and better skills. So hath the market decreed.

Yet this market view cannot be the whole story because it fails to account for much of what we have experienced. For one thing, it doesn’t clarify why the transformation occurred so suddenly. The divergence between productivity gains and the median wage began in the late 1970s and early 1980s, and then took off. Yet globalization and technological change did not suddenly arrive at America’s doorstep in those years. What else began happening then?

Nor can the standard explanation account for why other advanced economies facing similar forces of globalization and technological change did not succumb to them as readily as the United States. By 2011, the median income in Germany, for example, was rising faster than it was in the United States, and Germany’s richest 1 percent took home about 11 percent of total income, before taxes, while America’s richest 1 percent took home more than 17 percent. Why have globalization and technological change widened inequality in the United States to a much greater degree?

Nor can the standard explanation account for why the compensation packages of the top executives of big companies soared from an average of 20 times that of the typical worker 40 years ago to almost 300 times. Or why the denizens of Wall Street, who in the 1950s and 1960s earned comparatively modest sums, are now paid tens or hundreds of millions annually. Are they really “worth” that much more now than they were worth then?

Finally and perhaps most significantly, the market explanation cannot account for the decline in wages of recent college graduates. If the market explanation were accurate, college graduates would command higher wages in line with their greater productivity. After all, a college education was supposed to boost personal incomes and maintain American prosperity.

To be sure, young people with college degrees have continued to do better than people without them. In 2013, Americans with four-year college degrees earned 98 percent more per hour on average than people without a college degree. That was a bigger advantage than the 89 percent premium that college graduates earned relative to non-graduates five years before, and the 64 percent advantage they held in the early 1980s.

But since 2000, the real average hourly wages of young college graduates have dropped. The entry-level wages of female college graduates have dropped by more than 8 percent, and those of male graduates by more than 6.5 percent. To state it another way, while a college education has become a prerequisite for joining the middle class, it is no longer a sure means for gaining ground once admitted to it. That’s largely because the middle class’s share of the total economic pie continues to shrink, while the share going to the top continues to grow.

III

A deeper understanding of what has happened to American incomes over the last 25 years requires an examination of changes in the organization of the market. These changes stem from a dramatic increase in the political power of large corporations and Wall Street to change the rules of the market in ways that have enhanced their profitability, while reducing the share of economic gains going to the majority of Americans.

This transformation has amounted to a redistribution upward, but not as “redistribution” is normally defined. The government did not tax the middle class and poor and transfer a portion of their incomes to the rich. The government undertook the upward redistribution by altering the rules of the game.

Intellectual property rights—patents, trademarks, and copyrights—have been enlarged and extended, for example. This has created windfalls for pharmaceutical, high-tech, biotechnology, and many entertainment companies, which now preserve their monopolies longer than ever. It has also meant high prices for average consumers, including the highest pharmaceutical costs of any advanced nation.

At the same time, antitrust laws have been relaxed for corporations with significant market power. This has meant large profits for Monsanto, which sets the prices for most of the nation’s seed corn; for a handful of companies with significant market power over network portals and platforms (Amazon, Facebook, and Google); for cable companies facing little or no broadband competition (Comcast, Time Warner, AT&T, Verizon); and for the largest Wall Street banks, among others. And as with intellectual property rights, this market power has simultaneously raised prices and reduced services available to average Americans. (Americans have the most expensive and slowest broadband of any industrialized nation, for example.)

Financial laws and regulations instituted in the wake of the Great Crash of 1929 and the consequent Great Depression have been abandoned—restrictions on interstate banking, on the intermingling of investment and commercial banking, and on banks becoming publicly held corporations, for example—thereby allowing the largest Wall Street banks to acquire unprecedented influence over the economy. The growth of the financial sector, in turn, spawned junk-bond financing, unfriendly takeovers, private equity and “activist” investing, and the notion that corporations exist solely to maximize shareholder value.

Bankruptcy laws have been loosened for large corporations—notably airlines and automobile manufacturers—allowing them to abrogate labor contracts, threaten closures unless they receive wage concessions, and leave workers and communities stranded. Notably, bankruptcy has not been extended to homeowners who are burdened by mortgage debt and owe more on their homes than the homes are worth, or to graduates laden with student debt. Meanwhile, the largest banks and auto manufacturers were bailed out in the downturn of 2008–2009. The result has been to shift the risks of economic failure onto the backs of average working people and taxpayers.

Contract laws have been altered to require mandatory arbitration before private judges selected by big corporations. Securities laws have been relaxed to allow insider trading of confidential information. CEOs have used stock buybacks to boost share prices when they cash in their own stock options. Tax laws have created loopholes for the partners of hedge funds and private-equity funds, special favors for the oil and gas industry, lower marginal income-tax rates on the highest incomes, and reduced estate taxes on great wealth.

All these instances represent distributions upward—toward big corporations and financial firms, and their executives and shareholders—and away from average working people.

IV

Meanwhile, corporate executives and Wall Street managers and traders have done everything possible to prevent the wages of most workers from rising in tandem with productivity gains, in order that more of the gains go instead toward corporate profits. Higher corporate profits have meant higher returns for shareholders and, directly and indirectly, for the executives and bankers themselves.

Workers worried about keeping their jobs have been compelled to accept this transformation without fully understanding its political roots. For example, some of their economic insecurity has been the direct consequence of trade agreements that have encouraged American companies to outsource jobs abroad. Since all nations’ markets reflect political decisions about how they are organized, so-called “free trade” agreements entail complex negotiations about how different market systems are to be integrated. The most important aspects of such negotiations concern intellectual property, financial assets, and labor. The first two of these interests have gained stronger protection in such agreements, at the insistence of big U.S. corporations and Wall Street. The third—the interest of average working Americans in protecting the value of their labor—has gained less protection, because the voices of working people have been muted.

Rising job insecurity can also be traced to high levels of unemployment. Here, too, government policies have played a significant role. The Great Recession, whose proximate causes were the bursting of housing and debt bubbles brought on by the deregulation of Wall Street, hurled millions of Americans out of work. Then, starting in 2010, Congress opted for austerity because it was more interested in reducing budget deficits than in stimulating the economy and reducing unemployment. The resulting joblessness undermined the bargaining power of average workers and translated into stagnant or declining wages.

Some insecurity has been the result of shredded safety nets and disappearing labor protections. Public policies that emerged during the New Deal and World War II had placed most economic risks squarely on large corporations through strong employment contracts, along with Social Security, workers’ compensation, 40-hour workweeks with time-and-a-half for overtime, and employer-provided health benefits (wartime price controls encouraged such tax-free benefits as substitutes for wage increases). But in the wake of the junk-bond and takeover mania of the 1980s, economic risks were shifted to workers. Corporate executives did whatever they could to reduce payrolls—outsource abroad, install labor-replacing technologies, and utilize part-time and contract workers. A new set of laws and regulations facilitated this transformation.

As a result, economic insecurity became baked into employment. Full-time workers who had put in decades with a company often found themselves without a job overnight—with no severance pay, no help finding another job, and no health insurance. Even before the crash of 2008, the Panel Study of Income Dynamics at the University of Michigan found that over any given two-year stretch in the two preceding decades, about half of all families experienced some decline in income.

Today, nearly one out of every five working Americans is in a part-time job. Many are consultants, freelancers, and independent contractors. Two-thirds are living paycheck to paycheck. And employment benefits have shriveled. The portion of workers with any pension connected to their job has fallen from just over half in 1979 to under 35 percent today. In MetLife’s 2014 survey of employees, 40 percent anticipated that their employers would reduce benefits even further.

The prevailing insecurity is also a consequence of the demise of labor unions. Fifty years ago, when General Motors was the largest employer in America, the typical GM worker earned $35 an hour in today’s dollars. By 2014, America’s largest employer was Walmart, and the typical entry-level Walmart worker earned about $9 an hour.

This does not mean the typical GM employee a half-century ago was “worth” four times what the typical Walmart employee in 2014 was worth. The GM worker was not better educated or motivated than the Walmart worker. The real difference was that GM workers a half-century ago had a strong union behind them that summoned the collective bargaining power of all autoworkers to get a substantial share of company revenues for its members. And because more than a third of workers across America belonged to a labor union, the bargains those unions struck with employers raised the wages and benefits of non-unionized workers as well. Non-union firms knew they would be unionized if they did not come close to matching the union contracts.

Today’s Walmart workers do not have a union to negotiate a better deal. They are on their own. And because less than 7 percent of today’s private-sector workers are unionized, most employers across America do not have to match union contracts. This puts unionized firms at a competitive disadvantage. Public policies have enabled and encouraged this fundamental change. More states have adopted so-called “right-to-work” laws. The National Labor Relations Board, understaffed and overburdened, has barely enforced collective bargaining. When workers have been harassed or fired for seeking to start a union, the board awards them back pay—a mere slap on the wrist of corporations that have violated the law. The result has been a race to the bottom.

Given these changes in the organization of the market, it is not surprising that corporate profits have increased as a portion of the total economy, while wages have declined. Those whose income derives directly or indirectly from profits—corporate executives, Wall Street traders, and shareholders—have done exceedingly well. Those dependent primarily on wages have not.

V

The underlying problem, then, is not that most Americans are “worth” less in the market than they had been, or that they have been living beyond their means. Nor is it that they lack enough education to be sufficiently productive. The more basic problem is that the market itself has become tilted ever more in the direction of moneyed interests that have exerted disproportionate influence over it, while average workers have steadily lost bargaining power—both economic and political—to receive as large a portion of the economy’s gains as they commanded in the first three decades after World War II. As a result, their means have not kept up with what the economy could otherwise provide them.

To attribute this to the impersonal workings of the “free market” is to disregard the power of large corporations and the financial sector, which have received a steadily larger share of economic gains as a result of that power. As their gains have continued to accumulate, so has their power to accumulate even more.

Under these circumstances, education is no panacea. Reversing the scourge of widening inequality requires reversing the upward distributions within the rules of the market, and giving workers the bargaining leverage they need to get a larger share of the gains from growth. Yet neither will be possible as long as large corporations and Wall Street have the power to prevent such a restructuring. And as they, and the executives and managers who run them, continue to collect the lion’s share of the income and wealth generated by the economy, their influence over the politicians, administrators, and judges who determine the rules of the game may be expected to grow.

The answer to this conundrum is not found in economics. It is found in politics. The changes in the organization of the economy have been reinforcing and cumulative: As more of the nation’s income flows to large corporations and Wall Street and to those whose earnings and wealth derive directly from them, the greater is their political influence over the rules of the market, which in turn enlarges their share of total income.

The more dependent politicians become on their financial favors, the greater is the willingness of such politicians and their appointees to reorganize the market to the benefit of these moneyed interests. The weaker unions and other traditional sources of countervailing power become economically, the less able they are to exert political influence over the rules of the market, which causes the playing field to tilt even further against average workers and the poor.

Ultimately, the trend toward widening inequality in America, as elsewhere, can be reversed only if the vast majority, whose incomes have stagnated and whose wealth has failed to increase, join together to demand fundamental change. The most important political competition over the next decades will not be between the right and left, or between Republicans and Democrats. It will be between a majority of Americans who have been losing ground, and an economic elite that refuses to recognize or respond to its growing distress.



Robert Reich, one of the nation’s leading experts on work and the economy, is Chancellor’s Professor of Public Policy at the Goldman School of Public Policy at the University of California at Berkeley. He has served in three national administrations, most recently as secretary of labor under President Bill Clinton. Time Magazine has named him one of the ten most effective cabinet secretaries of the last century. He has written 13 books, including his latest best-seller, “Aftershock: The Next Economy and America’s Future”; “The Work of Nations,” which has been translated into 22 languages; and his newest, an e-book, “Beyond Outrage.” His syndicated columns, television appearances, and public radio commentaries reach millions of people each week. He is also a founding editor of The American Prospect magazine and chairman of the citizens’ group Common Cause. His new movie, “Inequality for All,” is in theaters. His widely read blog can be found at www.robertreich.org.

Moscow in 2025




Two imaginary cities of the not-so-distant future.

When China’s Ming dynasty first came into contact with a new nation approaching its border from the north — Russia — Chinese officials had to choose the right characters to name it. They called it 俄國, which translates as “Country of Unpredictability.”

One can only admire the foresight of those 17th century bureaucrats. They identified the angry nerve of Russia’s anatomy with the precision of a virtuoso chiropractor. Yes, my dear country is unpredictable. I have lived all my life, until very recently, in its capital. But a simple question about what Moscow might be like in ten years, a short span of time, in truth, leaves me at a loss.

Perhaps Russia’s state symbol — the two-headed eagle — could serve as another explanation for my uncertainty. The bewildered mutant doesn’t know which way to look, to the East or to the West, and keeps trying to fly in opposite directions.

I start with this preamble because I feel I must explain why, when I mentally pair the name “Moscow” with the number “2025,” I see not one town, but two, resembling each other no more than Mr. Hyde resembles Dr. Jekyll.

I’ll call those two imaginary cities of the not-so-distant future “Moscow-1” and “Moscow-2.”

Moscow-1

This is what Moscow will become if things continue the way they have been going since March 2014, when Russia started moving rapidly towards isolation.

I imagine myself arriving in Moscow after a long absence. For the past ten years I couldn’t visit because I’ve been deemed a traitor and an enemy of the people, but recently a slight détente has set in, and I have been invited as a guest of the Committee for Cultural Ties with Compatriots. I decide to accept. I am eager to see my native town again.

It is not difficult for me to visualize this Moscow-1. I’ve been there before. I grew up in the USSR; my memories are vivid.

It feels like a time-warp, a leap into the past.

A severe border guard tries to check my passport through an antiquated computer system, but the autonomous “Rusnet” doesn’t work properly, so I have to wait for several minutes under the benevolent scrutiny of the Great Leader — his huge portrait hangs in the booth. While I’ve gotten older over the past ten years, the Great Leader keeps getting younger. His forehead is smooth, his generalissimo epaulettes glittering.

I am met by my guide, who regards me warily at first, but soon relaxes. Whispering so that our driver can’t hear, he tells me a new joke: our Great Leader sees off the Great Leader of North Korea at the airport, gives him a kiss, and then waves his hand for a long, long time. When asked why, he answers dreamily: “Oh, what a kiss that was!” I smile sourly. I’d heard that one half a century ago, about Brezhnev and Honecker, the then leader of the German Democratic Republic.

The noisy Volga sedan rattles over the holes in the pavement. The shabbiness of the streets is masked by innumerable posters, orange-black and white-blue-red. There are no commercial advertisements, only slogans and portraits of the Great Leader. I see him on a horse, in the cabin of a fighter jet, surrounded by happy children, shaking hands with a grateful pensioner.

In the lobby of the Intourist Hotel, while my guide checks me in, a sneaky young man tugs at my sleeve. He wants to sell me rubles at a rate five times better than the official one.

Then I walk to the street where I lived for so many years.

My old apartment house has not been renovated. Huge vintage air conditioners are still there, covered in rust.

I pass what used to be the Museum of Tolerance, now the Museum of Patriotism. Then I see a long queue in front of a food store. I read the notice on the door: “Chinese canned beef! Two cans max. per buyer. Food coupons not valid.”

Back in my hotel I watch the news. It’s all about the second trial of the infamous “Medvedev Gang.” The former president and his ministers confess to having been CIA agents and to practising outlawed homosexual acts. They beg for clemency. Judging by the tone of the reportage, they won’t be getting any.

I shiver. I open the fridge and drink everything I can find there: the lousy Dagestan brandy and the vodka “Putinka.”

Enough. I’ll stop describing Moscow-1 here, before I have to start drinking.

Moscow-2

If life in my country goes back to normalcy (which I hope it will), I think that by 2025 Moscow will be the most interesting place in the world. Not the most beautiful or the most wealthy or the most comfortable city, oh no, but one that is bustling with energy.

No wishful thinking here, just the plain laws of physics. By “normalcy” I mean freedom. When a big country becomes free after a long period of suppression, it’s like a steel spring let loose. The air vibrates with adrenaline, everything moves, everything changes.

The brain drain of the previous decade has reversed its flow; professionals, entrepreneurs and intellectuals are returning home to Russia. That’s where big money is made now, that’s where things happen.

I won’t be noticing how my town has changed because I have been there for the whole decade. As a writer, I’ve been feasting on all the pent-up energy and freshness. I write differently in this 2025, and I do not like to re-read my old stuff. It’s another Russia, another Moscow, another me.

It’s April 22, 2025 today. I realize it’s Vladimir Ilyich Lenin’s birthday. A sudden idea comes to mind. Why not go visit Lenin’s Mausoleum? I haven’t been there since childhood.

It’s quite a long drive, because the mausoleum was removed from Red Square long ago. Now it is the main attraction of “Sovietland,” a park in a Moscow suburb where all the monuments of the totalitarian era were transplanted, a sort of historical Disneyland of the bygone epoch. I walk along the alleys, among Lenins, Stalins, Dzerzhinskys, Sverdlovs, cheerful miners and busty kolkhoznitsy, keepsakes from the time when I was a young pioneer. There’s also a bronze Putin in a judo outfit, a work by the sculptor Zurab Tsereteli, cast some 20 years ago.

That reminds me. I do not want to miss the press conference, so I hurry back home.

The judoist has just been released from jail, having served only half of his term. I look at his wrinkled face on the TV screen. No, he has no intention of returning to politics, he says. Yes, he wants to spend all his time with his kids, to compensate for the years of absence. No, he cannot disclose the advance he received for his forthcoming memoirs.

That’s not fair, I think enviously. This schmuck’s memoirs are sure to become an international bestseller, while my books do not sell at all. It’s true that I write differently now, I write better than before, but young people think I am a writer from the past. I tell myself that future generations will surely rediscover me again, but I know they won’t.

It’s OK. I am indulgent towards my older self of 2025. As long as that guy feels no nostalgia for the times when life was awful but his books sold well.

I know, I know. In reality things usually end up somewhere between shining white and dismal black, in the gray zone. But not this time. No shades of gray for Moscow of 2025. It will be either this — or that. You’ll see.

Boris Akunin is the pen name of Grigory Shalvovich Chkhartishvili. He is an essayist, literary translator and writer of detective fiction.

America’s Fattest Cities: Note for a Lecture, "E Pluribus Unum? What Keeps the United States United."


Alice G. Walton, Forbes


The Most Obese Metro Areas in America

Obesity rates nationwide are rapidly increasing. According to Gallup, the national rate in 2013 was 27.1% — the highest it’s been since the organization started tracking the numbers. But some communities’ obesity rates exceed even that. These metro areas have the highest rates in the country [entry cites the communities].

American Dream? Or Mirage? - Note for a Lecture, "E Pluribus Unum? What Keeps the United States United"


MAY 1, 2015

MICHAEL W. KRAUS, SHAI DAVIDAI and A. DAVID NUSSBAUM
New York Times

ECONOMIC inequality in the United States is at its highest level since the 1930s, yet most Americans remain relatively unconcerned with the issue. Why?

One theory is that Americans accept such inequality because they overestimate the reality of the “American dream” — the idea that any American, with enough resolve and determination, can climb the economic ladder, regardless of where he starts in life. The American dream implies that the greatest economic rewards rightly go to society’s most hard-working and deserving members.

Recently, studies by two independent research teams (each led by an author of this article) found that Americans across the economic spectrum did indeed severely misjudge the amount of upward mobility in society. The data also confirmed the psychological utility of this mistake: Overestimating upward mobility was self-serving for rich and poor people alike. For those who saw themselves as rich and successful, it helped justify their wealth. For the poor, it provided hope for a brighter economic future.

In studies by one author of this article, Shai Davidai, and the Cornell psychologist Thomas Gilovich, published earlier this year in Perspectives on Psychological Science, more than 3,000 respondents viewed a graph of the five income quintiles in American society and were asked to estimate the likelihood that a randomly selected person born to the bottom quintile would move to each of the other income quintiles in his lifetime. These estimates were compared with actual mobility trends documented by the Pew Research Center. Participants in the survey overshot the likelihood of rising from the poorest quintile to one of the three top quintiles by nearly 15 percentage points. (On average, only 30 percent of individuals make that kind of leap.)

Studies by another author of this article, the University of Illinois psychologist Michael W. Kraus, and his colleague Jacinth J.X. Tan, to be published in next month’s issue of the Journal of Experimental Social Psychology, found a similar pattern: When asked to estimate how many college students came from families in the bottom 20 percent of income, respondents substantially misjudged, estimating that those from the lowest income bracket attended college at a rate five times greater than the actual one documented by the Current Population Survey.

One experiment by Professors Kraus and Tan demonstrated the self-serving nature of these errant upward mobility estimates. As with the studies above, participants were asked to estimate the ease of moving up the economic ladder. This time, however, they were also asked to estimate upward mobility for people who were similar to them “in terms of goals, abilities, talents and motivations.” In this case, respondents were even more likely to overestimate upward mobility. We believe unduly in our own capacity to move up the economic ladder, and these beliefs increase our mobility overestimates more generally.

For those lower in income or educational attainment, lower standing was associated with greater overestimation of upward mobility. Those with the most room to move up were more likely to think that such movement was possible.

However, when people were asked to explicitly state how high up the economic ladder they felt, after accounting for their actual economic standing, the reverse pattern emerged: The higher up people said they were, the more they overestimated the likelihood of upward mobility. Being aware of your position at the top of a low-mobility hierarchy can be uncomfortable, because without mobility, sitting at the top is the result of luck, rather than merit.

Some Americans were better than others when it came to judging economic mobility. Across both sets of studies, political liberals were less likely to overestimate upward mobility relative to conservatives — a finding consistent with other research suggesting that conservatives see our society as more merit-based than do liberals.

In addition, studies by Professor Gilovich and Mr. Davidai found that members of ethnic minority groups tended to overestimate upward mobility more than did European Americans. This result indicated that those with the most to gain from believing in an upwardly mobile society tended to believe so more strongly.

Taken together, these sets of studies suggest that belief in the American dream is woefully misguided when compared with objective reality. Addressing the rising economic gap between rich and poor in society, it seems, will require us to contend not only with economic and political issues, but also with biases of our psychology.

Michael W. Kraus is an assistant professor of psychology at the University of Illinois. Shai Davidai is a Ph.D. candidate in psychology at Cornell University. A. David Nussbaum is an adjunct assistant professor of behavioral science at the Booth School of Business at the University of Chicago.

Saturday, May 2, 2015

China Passes Mexico as the Top Source of New U.S. Immigrants - Note for a lecture, "E Pluribus Unum? What Keeps the United States United"


By NEIL SHAH, Wall Street Journal; via JM and LH on Facebook



Move over, Mexico. When it comes to sending immigrants to the U.S., China and India have taken over.

China was the country of origin for 147,000 recent U.S. immigrants in 2013, while Mexico sent just 125,000 [JB question: legal or "illegal"?], according to a Census Bureau study by researcher Eric Jensen and others presented Friday. India, with 129,000 immigrants, also beat Mexico, though the two countries’ results weren’t statistically different from each other.

For the study, presented at the Population Association of America conference in San Diego, researchers analyzed annual immigration data for 2000 to 2013 from the American Community Survey. The annual survey conducted by the Census Bureau asks where respondents lived the year before. Researchers counted as an “immigrant” any foreign-born person in the U.S. who said they previously lived abroad, without asking about legal status. (So while the data include undocumented immigrants, they may be undercounted.)

A year earlier, in 2012, Mexico and China had been basically tied for top-sending country—with Mexico at 125,000 and China at 124,000.

It’s not just China and India. Several of the other top immigrant-sending countries in 2013 were also in Asia, including South Korea, the Philippines and Japan.

For a decade, immigration to the U.S. from China and India, which boast the world’s biggest populations, has been rising. Meanwhile, immigration from Mexico has been declining due to improvements in the Mexican economy and lower Mexican birth rates. More recently, the Great Recession also reduced illegal immigration from Mexico.

A shift in America’s immigrant community will take far longer. In 2012, five times as many immigrants in the U.S. were from Mexico as from China.

But the shifting nature of the immigrant flows seen in the Census study gives us a peek at what’s likely to happen to the overall racial and ethnic makeup of the U.S. population.

The millennial generation—roughly speaking, people born between 1982 and 2000, though definitions vary and there’s no real endpoint—is already the most diverse generation in U.S. history. As Brookings Institution demographer William Frey details in his recent book, “Diversity Explosion,” the social, economic and cultural implications are just starting to come into view. In time—2044, to be exact, according to Census projections—the entire U.S. population will have no racial majority, and, instead, a melting pot of minorities will shape U.S. society and politics.

Hispanics are still America’s biggest racial or ethnic minority group. But roughly two-thirds of them are now native-born, not recent immigrants. Among the U.S. Asian population, two-thirds (65%) are foreign-born.

Census researchers note that the rise of this latest, Asian wave of immigration seems—and is—dramatic, but past waves have been dramatic, too. The U.S.’s earliest immigrant waves came from Northern and Western Europe, then Southern and Eastern Europe, and finally, from Latin America.

Plenty of recent immigrants don’t come from China, India or Mexico: combined, those three nations accounted for only about a third of the roughly 1.2 million immigrants in 2013, the Census analysis shows.

The question now is just how big and significant this Asian wave is going to be. “Whether these recent trends signal a new and distinct wave of immigration is yet to be seen,” Census researchers say.

This one video shows how racism is real in America. Note for a lecture, "E Pluribus Unum? What Keeps the United States United"

Racism is still very much an issue in this country, one that pervades many aspects of life. But not everyone in America fully appreciates that fact. As many studies show, white Americans are often cut off from the realities of racism, living within homogeneous social networks and communities.

But if you have any doubts about whether racism still exists in America, this 3-minute video from Brave New Films, a California-based company that makes films to spur political activism, might clear them up. The video counts down eight reasons that racism is still very real in America, using research from Yale University, the American Civil Liberties Union and the New England Journal of Medicine, among others.

Those reasons are listed below, with links to the research cited.
1. Black-sounding names are 50 percent less likely to be called back by those reviewing job applications. In a 2002 study, Marianne Bertrand and Sendhil Mullainathan of the University of Chicago mailed out thousands of job applications that were identical except for the applicants' names. They found that applications with white-sounding names like Emily and Brendan were much more likely to receive callbacks than identical resumes with black-sounding names like Lakisha and Jamal.
2. Black people are charged roughly $700 more when buying cars. A study by Ian Ayres and Peter Siegelman of Yale Law School found that dealers quoted lower prices to white men than to black and female buyers, even though all buyers used an identical script for negotiating.
3. Black drivers are twice as likely to get pulled over. Numerous studies—including this 1999 study by the ACLU and an analysis of FBI records by USA Today last year—show a significant racial gap in police stops and arrests.
4. Black clients are shown 17.7 percent fewer houses for sale. A 2012 study of housing discrimination by the U.S. Department of Housing and Urban Development found that black homebuyers who contacted agents about recently advertised homes for sale learned about 17 percent fewer available homes than equally qualified whites and were shown 17.7 percent fewer homes. Asian homebuyers learned about 15.5 percent fewer available homes and were shown 18.8 percent fewer homes.
5. Black people are much more likely to be arrested for marijuana use. A 2013 study by the ACLU showed that, while marijuana use rates are equal among blacks and whites, black people are 3.7 times more likely to be arrested for it.
6. Black people are incarcerated at nearly six times the rate of white people. A 2007 study by Marc Mauer and Ryan King of the Sentencing Project documented the incredible incarceration rates of young black men. "If current trends continue, one in three black males born today can expect to spend time in prison during his lifetime," they wrote.
7. Doctors did not inform black patients as often as white ones about an important heart procedure. In a 1999 study for The New England Journal of Medicine, researchers investigated a long-standing difference in the use of cardiovascular procedures according to the race and sex of the patient. They found that women and blacks were less likely to be referred for cardiac catheterization than men and whites, respectively, and that black women were much less likely to be referred than white men.
8. White legislators did not respond as frequently to constituents with black-sounding names. In a 2011 study, Daniel Butler and David Broockman of Yale University found that state legislators were less likely to respond to requests for help with registering to vote when the sender had a putatively black name than a white one. Legislators of both parties exhibited similar levels of discrimination against constituents with "black names." However, the study also found that minority legislators did the opposite, responding more frequently to those with black names.

America dumbs down. Note for a lecture, "E Pluribus Unum? What Keeps the United States United"


Jonathon Gatehouse, macleans.ca
May 15, 2014

South Carolina’s state beverage is milk. Its insect is the praying mantis. There’s a designated dance—the shag—as well as a sanctioned tartan, game bird, dog, flower, gem and snack food (boiled peanuts). But what Olivia McConnell noticed was missing from her home state’s 50 official symbols was a fossil. So last year, the eight-year-old science enthusiast wrote to the governor and her representatives to nominate the Columbian mammoth. Teeth from the woolly proboscidean, dug up by slaves on a local plantation in 1725, were among the first remains of an ancient species ever discovered in North America. Forty-three other states had already laid claim to various dinosaurs, trilobites, primitive whales and even petrified wood. It seemed like a no-brainer. “Fossils tell us about our past,” the Grade 2 student wrote.

And, as it turns out, the present, too. The bill that Olivia inspired has become the subject of considerable angst at the legislature in the state capital of Columbia. First, an objecting state senator attached three verses from Genesis to the act, outlining God’s creation of all living creatures. Then, after other lawmakers spiked the amendment as out of order for its introduction of the divinity, he took another crack, specifying that the Columbian mammoth “was created on the sixth day with the other beasts of the field.” That version passed in the senate in early April. But now the bill is back in committee as the lower house squabbles over the new language, and it’s seemingly destined for the same fate as its honouree—extinction.

What has doomed Olivia’s dream is a raging battle in South Carolina over the teaching of evolution in schools. Last week, the state’s education oversight committee approved a new set of science standards that, if adopted, would see students learn both the case for and the case against natural selection.

Charles Darwin’s signature discovery—first published 155 years ago and validated a million different ways since—long ago ceased to be a matter for serious debate in most of the world. But in the United States, reconciling science and religious belief remains oddly difficult. A national poll, conducted in March for the Associated Press, found that 42 per cent of Americans are “not too” or “not at all” confident that all life on Earth is the product of evolution. Similarly, 51 per cent of people expressed skepticism that the universe started with a “big bang” 13.8 billion years ago, and 36 per cent doubted the Earth has been around for 4.5 billion years.

The American public’s bias against established science doesn’t stop where the Bible leaves off, however. The same poll found that just 53 per cent of respondents were “extremely” or “very confident” that childhood vaccines are safe and effective. (Worldwide, the measles killed 120,000 people in 2012. In the United States, where a vaccine has been available since 1963, the last recorded measles death was in 2003.) When it comes to global warming, only 33 per cent expressed a high degree of confidence that it is “man made,” something the UN Intergovernmental Panel on Climate Change has declared is all but certain. (The good news, such as it was in the AP poll, was that 69 per cent actually believe in DNA, and 82 per cent now agree that smoking causes cancer.)

If the rise in uninformed opinion were limited to impenetrable subjects, that would be one thing, but the scourge seems to be spreading. Everywhere you look these days, America is in a rush to embrace the stupid, hell-bent on a path that’s not just irrational, but often self-destructive. Common-sense solutions to pressing problems are eschewed in favour of bumper-sticker simplicities and blind faith.

In a country bedevilled by mass shootings—Aurora, Colo.; Fort Hood, Texas; Virginia Tech—efforts at gun control have given way to ever-laxer standards. Georgia recently passed a law allowing people to pack weapons in state and local buildings, airports, churches and bars. Florida is debating legislation that will waive all firearm restrictions during state emergencies like riots or hurricanes. (One opponent has moved to rename it “an Act Relating to the Zombie Apocalypse.”) And since the December 2012 massacre of 20 children and six staff at Sandy Hook Elementary School, in Newtown, Conn., 12 states have passed laws allowing guns to be carried in schools, and 20 more are considering such measures.

The cost of a simple appendectomy in the United States averages $33,000, and it’s not uncommon for such bills to top six figures. More than 15 per cent of the population has no health insurance whatsoever. Yet efforts to fill that gaping hole via the Affordable Care Act—a.k.a. Obamacare—remain distinctly unpopular. Nonsensical myths about the government’s “real” intentions have found so much traction that 30 per cent still believe that there will be official “death panels” to make decisions on end-of-life care.

Since 2001, the U.S. government has been engaged in an ever-widening program of spying on its own—and foreign—citizens, tapping phones, intercepting emails and texts, and monitoring social media to track the movements, activities and connections of millions. Still, many Americans seem less concerned with the massive violations of their privacy in the name of the War on Terror than with imposing Taliban-like standards on the lives of others. Last month, the school board in Meridian, Idaho, voted to remove The Absolutely True Diary of a Part-Time Indian by Sherman Alexie from its Grade 10 supplemental reading list following parental complaints about its uncouth language and depictions of sex and drug use. When 17-year-old student Brady Kissel teamed up with staff from a local store to give away copies at a park as a protest, a concerned citizen called police. It was the evening of April 23, which was also World Book Night, an event dedicated to “spreading the love of reading.”

If ignorance is contagious, it’s high time to put the United States in quarantine.

Americans have long worried that their education system is leaving their children behind. With good reason: national exams consistently reveal how little the kids actually know. In the last set, administered in 2010 (more are scheduled for this spring), most fourth graders were unable to explain why Abraham Lincoln was an important figure, and only half were able to order North America, the U.S., California and Los Angeles by size. Results in civics were similarly dismal. While math and reading scores have improved over the years, economics remains the “best” subject, with 42 per cent of high school seniors deemed “proficient.”

They don’t appear to be getting much smarter as they age. A 2013 survey of 166,000 adults across 20 countries that tested math, reading and technological problem-solving found Americans to be below the international average in every category. (Japan, Finland, Canada, South Korea and Slovakia were among the 11 nations that scored significantly higher.)

The trends are not encouraging. In 1978, 42 per cent of Americans reported that they had read 11 or more books in the past year. In 2014, just 28 per cent can say the same, while 23 per cent proudly admit to not having read even one, up from eight per cent in 1978. Newspaper and magazine circulation continues to decline sharply, as does viewership for cable news. The three big network supper-hour shows drew a combined average audience of 22.6 million in 2013, down from 52 million in 1980. While 82 per cent of Americans now say they seek out news digitally, the quality of the information they’re getting is suspect. Among current affairs websites, Buzzfeed logs almost as many monthly hits as the Washington Post.

The advance of ignorance and irrationalism in the U.S. has hardly gone unnoticed. The late Columbia University historian Richard Hofstadter won the Pulitzer prize back in 1964 for his book Anti-Intellectualism in American Life, which cast the nation’s tendency to embrace stupidity as a periodic by-product of its founding urge to democratize everything. By 2008, journalist Susan Jacoby was warning that the denseness—“a virulent mixture of anti-rationalism and low expectations”—was more of a permanent state. In her book, The Age of American Unreason, she posited that it trickled down from the top, fuelled by faux-populist politicians striving to make themselves sound approachable rather than smart. Their creeping tendency to refer to everyone—voters, experts, government officials—as “folks” is “symptomatic of a debasement of public speech inseparable from a more general erosion of American cultural standards,” she wrote. “Casual, colloquial language also conveys an implicit denial of the seriousness of whatever issue is being debated: talking about folks going off to war is the equivalent of describing rape victims as girls.”

That inarticulate legacy didn’t end with George W. Bush and Sarah Palin. Barack Obama, the most cerebral and eloquent American leader in a generation, regularly plays the same card, droppin’ his Gs and dialling down his vocabulary to Hee Haw standards. His ability to convincingly play a hayseed was instrumental in his 2012 campaign against the patrician Mitt Romney; in one of their televised debates the President referenced “folks” 17 times.

An aversion to complexity—at least when communicating with the public—can also be seen in the types of answers politicians now provide the media. The average length of a sound bite by a presidential candidate in 1968 was 42.3 seconds. Two decades later, it was 9.8 seconds. Today, it’s just a touch over seven seconds and well on its way to being supplanted by 140-character Twitter bursts.

Little wonder then that distrust—of leaders, institutions, experts, and those who report on them—is rampant. A YouGov poll conducted last December found that three-quarters of Americans agreed that science is a force for good in the world. Yet when asked if they truly believe what scientists tell them, only 36 per cent of respondents said yes. Just 12 per cent expressed strong confidence in the press to accurately report scientific findings. (Although according to a 2012 paper by Gordon Gauchat, a University of North Carolina sociologist, the erosion of trust in science over the past 40 years has been almost exclusively confined to two groups: conservatives and regular churchgoers. Counterintuitively, it is the most highly educated among them—with post-secondary education—who harbour the strongest doubts.)

The term “elitist” has become one of the most used, and feared, insults in American life. Even in the country’s halls of higher learning, there is now an ingrained bias that favours the accessible over the exacting.

“There’s a pervasive suspicion of rights, privileges, knowledge and specialization,” says Catherine Liu, the author of American Idyll: Academic Antielitism as Cultural Critique and a film and media studies professor at the University of California at Irvine. Both ends of the political spectrum have come to reject the conspicuously clever, she says, if for very different reasons: the left because of worries about inclusiveness, the right because they equate objections with obstruction. As a result, the very mission of universities has changed, argues Liu. “We don’t educate people anymore. We train them to get jobs.” (Boomers, she says, deserve most of the blame. “They were so triumphalist in promoting pop culture and demoting the canon.”)

The digital revolution, which has brought boundless access to information and entertainment choices, has somehow only enhanced the lowest common denominators—LOL cat videos and the Kardashians. Instead of educating themselves via the Internet, most people simply use it to validate what they already suspect, wish or believe to be true. It creates an online environment where Jenny McCarthy, a former Playboy model with a high school education, can become a worldwide leader of the anti-vaccination movement, naysaying the advice of medical professionals.

Most perplexing, however, is where the stupid is flowing from. As conservative pundit David Frum recently noted, where it was once the least informed who were most vulnerable to inaccuracies, it now seems to be the exact opposite. “More sophisticated news consumers turn out to use this sophistication to do a better job of filtering out what they don’t want to hear,” he blogged.

But are things actually getting worse? There’s a long and not-so-proud history of American electors lashing out irrationally, or voting against their own interests. Political scientists have been tracking, since the early 1950s, just how poorly those who cast ballots seem to comprehend the policies of the parties and people they are endorsing. A wealth of research now suggests that at the most optimistic, only 70 per cent actually select the party that accurately represents their views—and there are only two choices.

Larry Bartels, the co-director of the Center for the Study of Democratic Institutions at Vanderbilt University, says he doubts that the spreading ignorance is a uniquely American phenomenon. Facing complex choices, uncertain about the consequences of the alternatives, and tasked with balancing the demands of jobs, family and the things that truly interest them with boring policy debates, people either cast their ballots reflexively, or not at all. The larger question might be whether engagement really matters. “If your vision of democracy is one in which elections provide solemn opportunities for voters to set the course of public policy and hold leaders accountable, yes,” Bartels wrote in an email to Maclean’s. “If you take the less ambitious view that elections provide a convenient, non-violent way for a society to agree on who is in charge at any given time, perhaps not.”

A study by two Princeton University researchers, Martin Gilens and Benjamin Page, released last month, tracked 1,800 U.S. policy changes between 1981 and 2002, and compared the outcome with the expressed preferences of median-income Americans, the affluent, business interests and powerful lobbies. They concluded that average citizens “have little or no independent influence” on policy in the U.S., while the rich and their hired mouthpieces routinely get their way. “The majority does not rule,” they wrote.

Smart money versus dumb voters is hardly a fair fight. But it does offer compelling evidence that the survival of the fittest remains an unshakable truth even in American life. A sad sort of proof of evolution.