Wednesday, August 30, 2017

When Machines Run Amok


Wall Street Journal


The author was taken aback when he watched an AI program teach itself to play an arcade game—and play it far better than its human designers could. Frank Rose reviews ‘Life 3.0’ by Max Tegmark.


Cosmologists take on the big questions, and in “Life 3.0” Max Tegmark addresses what may be the biggest of them all: What happens when humans are no longer the smartest species on the planet—when intelligence is available to programmable objects that have no experience of mortal existence in a physical body? Science fiction poses such questions frequently, but Mr. Tegmark, a physicist at MIT, asks us to put our “Terminator” fantasies aside and ponder other, presumably more realistic, scenarios. Among them is the possibility that a computer program will become not just intelligent but wildly so—and that we humans will find ourselves unable to do anything about it.
Mr. Tegmark’s previous book, “Our Mathematical Universe” (2014), put a hugely debatable spin on the already counterintuitive notion that there exists not one universe but a multitude. Not all mathematicians were impressed. “Life 3.0” will be no less controversial among computer scientists. Lucid and engaging, it has much to offer the general reader. Mr. Tegmark’s explanation of how electronic circuitry—or a human brain—could produce something so evanescent and immaterial as thought is both elegant and enlightening. But the idea that a machine-based superintelligence could somehow run amok is fiercely resisted by many computer scientists, to the point that people associated with it have been attacked as Luddites.

LIFE 3.0

By Max Tegmark
Knopf, 384 pages, $28
Yet the notion enjoys more credence today than it did a few years ago, partly thanks to Mr. Tegmark. Along with Elon Musk, Stephen Hawking and the Oxford philosopher Nick Bostrom, he has emerged as a leading proponent of “AI safety” research, which focuses on such critical matters as how to switch off intelligent machines before things get out of hand.
In March 2014 he co-founded the Boston-based Future of Life Institute to support work on the subject, and soon after he helped stage a conference at which AI researchers from around the world agreed that they should work not just to advance the field of artificial intelligence but to benefit humankind. This past January, he helped draw up a 23-point statement of principles that has been embraced by some 1,200 people in AI, among them the authors of the leading textbook on the subject and the founders of DeepMind, the Google-owned company whose AlphaGo program defeated one of the world’s top Go players last year in South Korea. 
The issue is certainly timely. After decades in which artificial intelligence promised much and delivered little, recent breakthroughs in such target areas as facial recognition, automatic translation and self-driving cars have brought AI out of the woods. Amazon, Alphabet, Facebook, Tesla and Uber are making huge investments in AI research, as are Baidu and Alibaba in China. Where all this will take us is the broader focus of Mr. Tegmark’s book.
Though he sees widespread benefits in fields ranging from medical diagnosis to power-grid management, Mr. Tegmark devotes the bulk of “Life 3.0” to how things could go wrong. Most immediate is the threat of unemployment, starting perhaps among Uber drivers before eventually spreading to computer scientists whose machines have learned to program themselves. Even more disconcerting is the threat of an arms race involving cheap, mass-produced autonomous weapons. As Mr. Tegmark points out, “there isn’t much difference between a drone that can deliver Amazon packages and one that can deliver bombs.” Actually, bombs are crude compared with what AI could deliver once it has been weaponized: Think drones the size of bumblebees that could be programmed to kill certain people, or certain categories of people, by grabbing their skulls with tiny metal talons and drilling into their heads. [JB emphasis]
As horrific as that possibility may sound, it wouldn’t threaten the existence of the human species. Superintelligence might. No one really knows if a machine will ever develop the general-purpose intelligence that would be required. But in 2014 Mr. Tegmark caught a glimpse of how it might. He was watching a DeepMind program as it learned to play Breakout, a ’70s arcade game. The object of the game is to break through a wall by bouncing a ball off it repeatedly, knocking out a brick with every hit. At first the AI was hopeless. But it quickly got better, and before long it devised a relentlessly effective technique that none of the humans at DeepMind had thought of. It went on to learn 49 different arcade games, including Pong and Space Invaders, beating its human testers on more than half of them. Obviously it’s a very long way from vintage arcade games to general intelligence, let alone consciousness. But if a computer program can teach itself to play games, it might be able to teach itself many other things as well—slowly at first, then faster and faster.
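How does a program teach itself? The method behind DeepMind's arcade results is reinforcement learning: the program sees only the screen and the score, tries actions, and gradually reinforces whichever choices raise the score. DeepMind paired that idea with a deep neural network; the sketch below shows the core update rule in its simplest, tabular form. (The toy paddle actions, parameter values and function names are illustrative assumptions for this sketch, not DeepMind's actual code.)

import random
from collections import defaultdict

ACTIONS = ["left", "stay", "right"]     # toy paddle moves; illustrative only
ALPHA, GAMMA, EPSILON = 0.1, 0.99, 0.1  # learning rate, discount, exploration rate

Q = defaultdict(float)  # Q[(state, action)] -> estimated future score, starts at 0.0

def choose_action(state):
    # Epsilon-greedy: usually play the best-known move, occasionally experiment.
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state):
    # Nudge the estimate toward the reward just seen plus the best predicted follow-up.
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

Repeated over millions of simulated frames, updates like these are how a player that starts out hopeless can end up with tactics no human suggested.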
What would that mean for humans? Nobody knows, including—as he freely admits—Mr. Tegmark. Like horses after the invention of the internal-combustion engine, we might be kept on as show animals—although Mr. Tegmark’s observation that the U.S. horse population fell almost 90% between 1915 and 1960 is not exactly heartening. He presents a dozen or so other scenarios as well. Would an omniscient AI act as a “protector god,” maximizing human happiness while allowing us the illusion that we’re still in control? Would it decide we’re a threat and wipe us out?
It’s impossible to know that either. By failing either to refute or champion the bulk of these possible futures, Mr. Tegmark makes the whole exercise seem divorced from reality. But he means it as a challenge: Rather than our being told what is going to happen, he wants us to decide what we want to happen. This sounds quite noble, if a tad naive—until he invites us to debate the issue on a website that is chockablock with promo material for the book. There’s a place for self-promotion, just as there’s a place for killer-robot movies—but does either really contribute to our understanding of what humanity faces?
Mr. Rose is the author of “The Art of Immersion” and a senior fellow at the Columbia University School of the Arts.

Modern Liberalism’s False Obsession With Civil War Monuments: Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


wsj.com


Black accomplishments in the ’40s and ’50s prove that today’s setbacks are not due to slavery.


A statue of Confederate Gen. Thomas Jonathan "Stonewall" Jackson in Richmond, Va., Aug. 23. PHOTO: CHIP SOMODEVILLA/GETTY IMAGES
Visit the American Museum of Natural History in New York City, and between exhibits of dinosaur skeletons, Asian elephants and Alaskan moose you might notice a bust of Henry Fairfield Osborn and a plaque honoring Madison Grant. Osborn and Grant were two of the country’s leading conservationists in the early 1900s. They also were dedicated white supremacists.
Osborn, a former president of the museum, founded the Eugenics Education Society—now known as the Galton Institute—which sought the improvement of humanity through selective breeding. Grant, a co-founder of the Bronx Zoo, is known today for his influential 1916 best seller, “The Passing of the Great Race,” a pseudoscientific polemic arguing that nonwhite immigrants—a category that, by his definition, included Eastern and Southern Europeans—were tainting America’s superior Nordic stock. Osborn, who was a zoologist by training, wrote the introduction to Grant’s book, which Hitler called “my Bible.” The New Yorker magazine once described Grant as someone who “extended a passion for preserving bison and caribou into a mania for preserving the ‘Nordic race.’ ”
Given their options, why are liberals so focused on monuments to Civil War figures? Politically, it makes some tactical sense. The GOP has spent decades warding off claims of racism, and forcing Republican politicians to defend prominent displays of Confederate statuary keeps them on the defensive. On another level, however, liberals make a fetish of Civil War monuments because it feeds their hallowed slavery narrative, which posits that racial inequality today is mainly a legacy of the country’s slave past.

One problem with these assumptions about slavery’s effects on black outcomes today is that they are undermined by what blacks were able to accomplish in the first hundred years after their emancipation, when white racism was rampant and legal and blacks had bigger concerns than Robert E. Lee’s likeness in a public park. Today, slavery is still being blamed for everything from black broken families to high crime rates in black neighborhoods to racial gaps in education, employment and income. Yet outcomes in all of those areas improved markedly in the immediate aftermath of slavery and continued to improve for decades.
Between 1890 and 1940, for example, black marriage rates in the U.S. were higher than white marriage rates. In the 1940s and ’50s, black labor-participation rates exceeded those of whites; black incomes grew much faster than white incomes; and the black poverty rate fell by 40 percentage points. Between 1940 and 1970—that is, during Jim Crow and prior to the era of affirmative action—the number of blacks in middle-class professions quadrupled. In other words, racial gaps were narrowing. Steady progress was being made. Blacks today hear plenty about what they can’t achieve due to the legacy of slavery and not enough about what they did in fact achieve notwithstanding hundreds of years in bondage followed by decades of legal segregation.
In the post-’60s era, these positive trends would slow, stall, or in some cases even reverse course. The homicide rate for black men fell by 18% in the 1940s and by another 22% in the 1950s. But in the 1960s all of those gains would vanish as the homicide rate for black males rose by nearly 90%. Are today’s black violent-crime rates a legacy of slavery and Jim Crow or of something else? Unfortunately, that’s a question few people on the left will even entertain.
Just ask Amy Wax and Lawrence Alexander, law professors at the University of Pennsylvania and University of San Diego, respectively, who were taken to task for co-authoring an op-ed this month in the Philadelphia Inquirer that lamented the breakdown of “bourgeois” cultural values that prevailed in mid-20th-century America. “That culture laid out the script we all were supposed to follow,” they wrote. “Get married before you have children and strive to stay married for their sake. Get the education you need for gainful employment, work hard, and avoid idleness. . . . Be respectful of authority. Eschew substance abuse and crime.”
The professors noted that disadvantaged groups have been hit hardest by the disintegration of these middle-class mores and that the expansion of the welfare state, which reduced the financial need for two-parent families, hastened social retrogression. “A strong pro-marriage norm might have blunted this effect,” they wrote. “Instead, the number of single parents grew astronomically, producing children more prone to academic failure, addiction, idleness, crime, and poverty.”
For the suggestion that something other than continuing racial bigotry and the legacy of slavery has contributed to racial inequality, a coalition of faculty and students at the University of Pennsylvania promptly accused the professors of advancing a “racist and white supremacist discourse.” The reality is that there was a time when blacks and whites alike shared conventional attitudes toward marriage, parenting, school and work, and those attitudes abetted unprecedented social and economic black advancement.
Appeared in the August 30, 2017, print edition.

Tuesday, August 29, 2017

Trump and the Disunited States of America (ongoing entry)



image from

  • James Fallows, The Atlantic (August 23): "[W]ith every day that passes without their doing something about it, the stain and responsibility for Trump’s ungoverned tone stick more lastingly to the Republican establishment that keeps looking the other way as he debases his office and divides his country."
  • Michael D'Antonio, CNN (August 23): "Although Reckless Trump often sabotages his own agenda, he causes the greatest damage when he indulges himself on an issue that is vital to both public safety and national unity."
  • Philip Rucker, "Trump’s whiplash: Three personas in three speeches," Washington Post (August 23): "In the span of 48 hours this week, President Trump has boomeranged among three roles: the commander in chief, the divider and the uniter. ... Trump jetted to Phoenix, where the immigration inferno he has helped ignite burns nearly as hot as the broiling sun. ... Speaking from the heart, he served up one 'us' vs. 'them' riff after another. ... By Wednesday, with television news commentators devouring his Phoenix free-for-all, Trump swooped into Reno, Nev., with the kind of unity message that you would expect to hear Pope Francis deliver: 'It is time to heal the wounds that divide us and to seek a new unity based on the common values that unite us,' Trump said, again reading from a teleprompter, at the national convention of the American Legion."
  • Mark Landler, "Different Day, Different Audience, and a Completely Different Trump," New York Times (August 23): “We are here to hold you up as an example of strength, courage and resolve that our country will need to overcome the many challenges that we face,” the president said, speaking slowly and gravely as he read from a teleprompter. “We are here to draw inspiration from you as we seek to renew the bonds of loyalty that bind us together as one people and one nation.”
  • Michael Gerson, "Trump’s rhetorical schizophrenia is easy to see through," Washington Post (August 24): "And so, on one day, we had an unhinged and divisive rant by President Trump in Phoenix. Then, the next day in Reno, Nev., a call for national unity and reconciliation. Multiple political personality disorder. Rhetorical schizophrenia. ... The real voice again widening racial divisions by defending Confederate monuments as 'our history and our heritage.' ... The unified control of House, Senate and presidency means little when the president lives in a reality of his own."
  • Dan Balz, "Arpaio fits a pattern: A divider, not a uniter," Washington Post (August 26): "President Trump has set his presidency on an unambiguous course for which there could be no reversal. He has chosen to be a divider, not a uniter, no matter how many words to the contrary he reads off a teleprompter or from a prepared script. That’s one obvious message from Friday’s decision to issue a pardon for controversial former Arizona sheriff Joe Arpaio. Trump has been a divisive force from the very start of his campaign for president, a proud disrupter of the political status quo. ... What was perhaps unexpected was the timing of the pardon. For starters, it came only days after the president had delivered a speech about national unity before the American Legion in Reno, Nev."
  • David Brooks, "How Trump Kills the G.O.P.," New York Times (August 29): "It may someday be possible to reduce the influence of white identity politics, but probably not while Trump is in office. As long as he is in power the G.O.P. is a house viciously divided against itself, and cannot stand."
  • Editorial, New York Times (August 29): "This is Donald Trump’s rule of law — a display of personal dominance disconnected from concerns about law and order, equality or the Constitution. That distorted understanding of justice is cleaving the nation between the majority who support the principles of American democracy and those who support only him."
  • Stephen Dinan and S.A. Miller, "Trump shuns Republican leaders to cut debt deal with Schumer, Pelosi," Washington Times (September 7): "Senate Majority Leader Mitch McConnell, Kentucky Republican, said Mr. Trump can explain the deal for himself. He added that the president appeared to want to find unity at a time of multiple crisis points, including the country’s fiscal situation and natural disasters."

Monday, August 28, 2017

Racial content sweeps away ‘Gone With the Wind’ screening - Note for a discussion, "E Pluribus Unum? What Keeps the United States United."


AP, sfgate.com; thanks for the lead, AF.

image from article

MEMPHIS, Tenn. (AP) — A Tennessee theater has canceled a long-running screening of "Gone With the Wind" because of racially insensitive content in the classic 1939 film.

Officials at Memphis' Orpheum Theatre have announced that the film will not be shown during its summer movie series in 2018. Theater president Brett Batterson says in a statement that "the Orpheum cannot show a film that is insensitive to a large segment of its local population."

The film was shown at the Orpheum on Aug. 11. This is the 34th straight year it has screened at the theater.

"Gone With the Wind" tells the story of the daughter of a Georgia plantation owner during and after the Civil War.

Batterson tells the Memphis Commercial Appeal a "social media storm" played a role in the decision.


***
Seth Abramovitch, "Oscar's First Black Winner Accepted Her Honor in a Segregated 'No Blacks' Hotel in L.A.," hollywoodreporter.com (February 19, 2015)

It's been 75 years since Hattie McDaniel won for 'Gone With the Wind,' accepting her award at the Ambassador's Cocoanut Grove nightclub. Four husbands, a friendship with Clark Gable and 74 maid roles later, she died, her body refused by a segregated cemetery, her statuette now missing, but with her descendants devoted to her memory. 

This story first appeared in the Feb. 27 issue [sic] of The Hollywood Reporter magazine.
On a February afternoon in 1940, Hattie McDaniel — then one of the biggest African-American movie stars in the world — marched into the Culver City offices of producer David O. Selznick and placed a stack of Gone With the Wind reviews on his desk. The Civil War epic, released two months earlier, had become an instant cultural sensation, and McDaniel's portrayal of Mammy — the head slave at Tara, the film's fictional Southern plantation — was being singled out by both white and African-American critics as extraordinary. The Los Angeles Times even praised her work as "worthy of Academy supporting awards." Selznick took the hint and submitted the 44-year-old for a nomination in the best supporting actress category, along with her co-star, Olivia de Havilland, contributing to the film's record-setting 13 noms.

The 12th Academy Awards were held at the famed Cocoanut Grove nightclub in The Ambassador Hotel. McDaniel arrived in a rhinestone-studded turquoise gown with white gardenias in her hair. (Seventy years later in 2010, a blue-gown– and white-gardenia–clad Mo'Nique, one of 11 black actors to win Academy Awards since, was the only one to pay homage to McDaniel while accepting her best supporting actress Oscar for Lee Daniels' Precious.) McDaniel then was escorted, not to the Gone With the Wind table — where Selznick sat with de Havilland and his two Oscar-nominated leads, Vivien Leigh and Clark Gable — but to a small table set against a far wall, where she took a seat with her escort, F.P. Yober, and her white agent, William Meiklejohn. With the hotel's strict no-blacks policy, Selznick had to call in a special favor just to have McDaniel allowed into the building (it was officially integrated by 1959, when the Unruh Civil Rights Act outlawed racial discrimination in California).
“Every picture and every line, it belonged to Hattie. She knew she was supposed to be subservient, but she never delivered a subservient line,” says MaBel Collins (center), 77, partner of Edgar Goff, McDaniel’s grandnephew. McDaniel’s descendants were photographed Feb. 13 at The Culver Studios in Culver City, a few yards from Gone With the Wind producer David O. Selznick’s former offices and where most of the movie was filmed.
A list of winners had leaked before the show, so McDaniel's win came as no shock. Even so, when she was presented with the embossed plaque given to supporting winners at the time, the room was rife with emotion, wrote syndicated gossip columnist Louella Parsons: "You would have had the choke in your voice that all of us had." The daughter of two former slaves gave a gracious speech about her win: "I shall always hold it as a beacon for anything I may be able to do in the future. I sincerely hope that I shall always be a credit to my race and the motion picture industry."
But Hollywood's highest honor couldn't stave off the indignities that greeted McDaniel at every turn. White Hollywood pigeonholed her as the sassy Mammy archetype, with 74 confirmable domestic roles out of the IMDb list of 94 ("I'd rather play a maid than be a maid," was her go-to response). [JB emphasis] The NAACP disowned her for perpetuating negative stereotypes. Even after death, her Oscar, which she left to Howard University, was deemed valueless by appraisers and later went missing from the school — and has remained so for more than 40 years. Her final wish — to be buried in Hollywood Cemetery — was denied because of the color of her skin.
McDaniel's career was defined by contradictions, from performing in "whiteface" early on to accounts that her refusal to utter the N-word meant it never made it onscreen in Gone With the Wind. "We all grew up with this image of her, the Mammy character, kind of cringing," says Jill Watts, author of Hattie McDaniel: Black Ambition, White Hollywood. "But she saw herself in the old-fashioned sense as a 'race woman' — someone advancing the race." Adds Mo'Nique: "That woman had to endure questions from the white community and the black community. But she said, 'I'm an actress — and when you say, "Cut," I'm no longer that.' If anybody knew who this woman really was, they would say, 'Let me shut my mouth.'"
A staging for a 1939 Oscars newsreel had McDaniel standing by a table laden with awards; her best supporting actress plaque is up front.
•••
Said McDaniel in 1944 about her disappointing prospects following her Oscar win, "It was as if I had done something wrong." Selznick's first move had been to dispatch her on a live, movie-palace tour as Mammy, which played to half-filled houses. But he saw less and less use for his typecast star, and Warner Bros. eventually bought out her contract.
Even after World War II, she continued to play underwritten maid parts in such films as 1946's Song of the South, Walt Disney's adaptation of the Uncle Remus stories, now considered a rare racist blot on the studio's legacy. In her final years, McDaniel found success on the radio, taking over in 1947 from Bob Corley — a white voice actor who mimicked an African-American woman — as the title character in Beulah, a hit comedy series about a live-in maid. It was the first time an African-American woman starred in a radio show, earning McDaniel $1,000 a week. She was cast in the TV version of Beulah in 1951 but shot only six episodes before falling ill. She died Oct. 26, 1952, of breast cancer. She was 57.
McDaniel with Leigh as Scarlett O’Hara in a scene from the 1939 film, which won best picture.
Though she had been married four times — losing her first husband to pneumonia, the others to divorce — McDaniel never had children of her own. The McDaniel bloodline lives on through her sister, Etta. Etta's grandson Edgar Goff, who devoted much of his life to keeping Hattie's memory alive, died in 2012. "He was an urban engineer by profession, but his passion was black Hollywood, and the Hattie McDaniel story in particular," says Edgar's daughter Kimberly Goff-Crews, secretary and vice president for student life at Yale University. Edgar would regale his kids with stories of their great-great-aunt Hattie, who had hoped her descendants might choose a different path. "My father said that Hattie was pretty clear that she didn't want the family to be in Hollywood," says Goff-Crews. "She wanted them to have 'good, normal' jobs, so to speak — doctors and lawyers. She was no stage mom."
In her last days, McDaniel threw a deathbed party, coincidentally attended by her grandnephew's future life partner MaBel Collins, then 15, who recalls "people milling around, drinking, laughing. Guests would go in one or two at a time and visit with her. I had no idea who that dying movie star was until a couple years later, I saw Gone With the Wind — and realized that was Hattie in the bed."
In her last will and testament, McDaniel left detailed instructions for her funeral. "I desire a white casket and a white shroud; white gardenias in my hair and in my hands, together with a white gardenia blanket and a pillow of red roses," she wrote. "I also wish to be buried in the Hollywood Cemetery," today known as Hollywood Forever Cemetery. But the resting place of numerous showbiz types — including GWTW director Victor Fleming — had a whites-only policy. Hattie was buried at Angelus-Rosedale Cemetery, the first L.A. cemetery open to all races. In 1999, Edgar successfully lobbied to get a marble memorial to McDaniel placed at Hollywood Forever.
McDaniel also specified what was to become of her Oscar, which an appraiser dismissed as having "no value" in an accounting of her estate. Despite working steadily until her death, McDaniel left the world in debt: Her belongings were valued at $10,336.47 (about $95,000 today), $1,000 less than what she was deemed to owe the IRS. The Oscar, she wrote, was to be left to Howard University, but the award went missing from the Washington, D.C., school during the early 1970s.

The Biggest Misconception About Today’s College Students


Gail O. Mellow, New York Times, AUG. 28, 2017
[original article contains links]

image from article

You might think the typical college student lives in a state of bliss, spending each day moving among classes, parties and extracurricular activities. But the reality is that an increasingly small population of undergraduates enjoys that kind of life.

Of the country’s nearly 18 million undergraduates, more than 40 percent go to community college, and of those, only 62 percent can afford to go to college full-time. By contrast, a mere 0.4 percent of students in the United States attend one of the Ivies.

The typical student is not the one burnishing a fancy résumé with numerous unpaid internships. It’s just the opposite: Over half of all undergraduates live at home to make their degrees more affordable, and a shocking 40 percent of students work at least 30 hours a week. About 25 percent work full-time and go to school full-time.

The typical college student is also not fresh out of high school. A quarter of undergraduates are older than 25, and about the same number are single parents.

These students work extremely hard to make ends meet and simultaneously get the education they need to be more stable: A two-year degree can earn students nearly 20 percent more annually than just a high school diploma.

And yet, these students are often the most shortchanged.

As open-access institutions, community colleges educate the majority of our country’s low-income, first-generation students. But public funding for community colleges is significantly less than for four-year colleges, sometimes because of explicit state policies. This means the amount that community colleges can spend on each student — to pay for faculty, support services, tutoring and facilities — is far less as well.

Tuition for low-income students can be covered by federal financial aid programs, but these students often have significant other costs — including housing, transportation, food and child care — that regularly pose obstacles to their education.

A recent Urban Institute study found that from 2011 to 2015, one in five students attending a two-year college lived in a food-insecure household. A study from the Wisconsin Hope Lab found that in 2016, 14 percent of community college students had been homeless at some point. At LaGuardia Community College in New York, where I am president, 77 percent of students live in households making less than $25,000 per year.

With financial pressures like these, studying full-time is not an option. It is not uncommon for a student to take between three and six years to graduate from a two-year associate degree program.

Even that can be a miraculous feat. At LaGuardia, many of our students start their days by taking their child to day care on the bus. Then they take the subway to college, then ride a different bus to their job, another bus to pick up their child and a final bus to go home. Once home, they still need to cook dinner, help their child with homework, tuck the child in, tidy up and complete their own college coursework.

Many of these students have jobs that are part-time and pay the minimum wage; their schedules can vary wildly, making the fragile balance of each day complex.

Being stretched so thin makes each day an ordeal. It’s no wonder that too many students drop out before graduation.

Community colleges need increased funding, and students need access to more flexible federal and state financial aid, enhanced paid internships and college work-study programs. Improved access to public supports, like food stamps and reduced public transportation fares, would also make a world of difference.

It’s not just that policy must change. Last year, more than $41 billion was given in charity to higher education, but about a quarter of that went to just 20 institutions. Community colleges, with almost half of all undergraduate students, received just a small fraction of this philanthropy. It is imperative that individuals, corporations and foundations spread their wealth and diversify where they donate their dollars.

Correcting society’s perception of who attends college in the United States is the first step toward helping these hard-working and ambitious students, eager to make a better life for themselves and their families.

It will take sustained commitment by our elected officials, business leaders and philanthropists to increase support for routinely underfunded community colleges. It’s time to put public and private money where more and more students are educated, and remove the real, but surmountable, obstacles that stand between them and a degree.

Gail O. Mellow is the president of LaGuardia Community College.