Monday, September 14, 2015

I have one of the best jobs in academia. Here's why I'm walking away.


Via LOS on Facebook

vox.com [original article contains links]


by Oliver Lee on September 8, 2015

My grandmother worked in a school cafeteria. My mother taught second grade. Nearly two decades ago, I resolved to enter public education, too, but with plans to rise even higher. I would become a college professor, advancing the scholarship of my discipline, free from the petty bureaucratic concerns that hamstrung my mother's career.

From 1998 until 2012, I pursued that objective with extraordinary focus. I graduated from college at 19. I went to law school and passed the bar exam. At 24, I was admitted to the history PhD program at the University of Pittsburgh. There, I made connections with brilliant academics, won prestigious fellowships and grants, and, at the age of 29, just five years after starting graduate school, I landed a tenure-track job.

I can't overstate how rare this opportunity is: Tenure-track jobs at large state universities are few and far between. Landing one without serving a postdoctoral appointment or working as a visiting assistant professor is about as likely as landing a spot on an NBA team with a walk-on tryout — minus the seven-figure salary, naturally.

I had read all of the doom-and-gloom think pieces about the status of the American university system, of course, but it felt like none of that applied to me. I had a full-time position, secured early in my career — the possibilities were endless. Although a legal historian by training, I viewed myself as beyond such simple labels: I was a cultural historian, in command of critical theory and immersed in the latest and best work on gender and sexuality. Activism informed my teaching; I exhorted my students to transcend and transform the status quo. I coached my university's legal debate team to a national championship bid and served on nearly a dozen PhD and EdD dissertation committees. I launched several digital humanities initiatives and curated a museum exhibit about professional wrestling, attracting mainstream attention in the process.

I had not just survived the academic Hunger Games — I had emerged triumphant.

Then it all began to fall apart.

First there was sniping, from peers and administrators. Critiques of my teaching and debate team coaching, often made through backchannels and delivered to me secondhand or not at all, centered on my easygoing personal style (He doesn't use the title "doctor!" He teaches in T-shirts!), my effusive student evaluations (If he's pleasing them, he must be doing something wrong!), and my relatively calm demeanor (If a young academic doesn't seem stressed beyond capacity, he's not working hard enough!).

Then there was official pushback and politics. A proposal to create interactive teaching tools from archival materials was derided as bewildering and gimmicky. I learned that the public outreach in which I engaged — that is, publishing in popular magazines — had ruffled certain feathers. I watched administrators and donors who had championed my career be shown the door, or at least quietly sidelined, by an incoming presidential administration — proving that the autonomy I had imagined upon entering academia really was an illusion.

Finally, I realized that not even the students were invested. When my best friend visited my campus to give a talk, he observed one of my lectures. I've got many shortcomings as an academic, but lecturing isn't one of them. I've been on TV, radio, podcasts — you name it. By professor standards, which admittedly aren't that high, I could rock the mic. But while my friend sat there, semi-engrossed in the lecture, he found himself increasingly distracted by the student in front of him. That student, who like all in-state students was paying $50 per lecture to hear me talk, was watching season one of Breaking Bad. In a class with no attendance grade, where the lectures were at least halfway decent, he was watching Breaking Bad.

Later during that same visit, my friend asked me, in total sincerity, "Why aren't you doing something meaningful with your life?"

"This is important," I insisted. But there was no passion behind my words. I was a priest who had lost his faith, performing the sacraments without any sense of their importance.

Op-eds about the failings of higher education are like certain unmentionable body parts: Everybody's got one. Professors are or aren't afraid of their liberal students, adjuncts are underpaid and exploited, grade inflation is rampant, college graduates can't find jobs, student loan debt will doom us all.

But these are just parts of a larger and even more troubling story. After spending four years working in higher education, trying to effect piecemeal improvements, I'm convinced that the picture is more dire than most people realize: There's no single problem to fix or villain to defeat, no buzzword-y panacea that will get things back to normal.

And so now, after devoting nearly 20 years to this life, I've decided to walk away. I'm quitting my tenure-track position; by May of next year, I'll be out of this side of academia forever.

Here are some departing thoughts.

1) Too many people go to college

As recently as a year ago, I remained willing to work inside that fractured system of pay-to-play higher education. If students wanted to take out federal loans to buy degrees, who was I to stop them? Let the chips fall where they may; graduate them all and let the invisible hand sort them out.

But that system is unsustainable. Liberal arts programs, and the humanities in particular, have become a place to warehouse students seeking generic bachelor's degrees not out of any particular interest in the field, but in order to receive raises at work or improve their position in a crowded job market.

Once upon a time, in a postwar America starved for middle managers who could file TPS reports, relying on the BA as an assurance of quality, proof of the ability to follow orders and complete tasks, made perfect sense. But in today's world of service workers and coders and freelancers struggling to brand themselves, wasting four years sitting in classes like mine makes no economic sense for the country or for the students — particularly when they're borrowing money to do so.

Every so often, we're treated to an essay about how liberal arts majors can prepare students to make creative contributions to an employer's bottom line. Do you know how else you can prepare to make these vague creative contributions, much more cheaply and efficiently? By sitting around in your parents' basement and reading great works of literature. Yes, lectures and classroom discussions might help open your mind to new possibilities, but so will skillfully produced videos that are freely available on YouTube. Expert oversight is valuable — but how valuable is it really? I imagine most people wouldn't fork over $50 an hour for the privilege, regardless of their respect for the stellar minds whose contributions to society can rather easily be accessed and understood for free.

2) Online education isn't the solution

Despite my department boasting more than 20 full-time faculty with solid research and teaching credentials, a majority of history students don't come anywhere near their classrooms. Instead, they're remote students, enrolled in online courses.

For some, online degree programs are a solution to the cost and time problem. If there's mass demand for BAs, but the time and expense of real college doesn't make sense for most people, why not provide a similar service digitally? Online classes could unite knowledge seekers from around the world, advocates say, allowing them to get a version of the university experience more compatible with the demands of the modern world.

But in practice, online education isn't a solution — it's a Band-Aid on an infected wound.

In place of thought-provoking video chats and genuinely creative software applications the theory promises, most online students get Blackboard — a cumbersome and inefficient program that only a bureaucracy could love. The "lectures" amount to little more than uploaded PowerPoints that may or may not be accompanied by instructor narration. Usually a single module serves as the university-wide template for an entire mandatory subject, such as US history to the Civil War, allowing professors to be replaced by "graders" capable of administering these courses for even less than the pittance paid to adjuncts. At my university, for example, a grader for one of our online courses supervises approximately 30 to 50 students for an entire course. The grader typically makes $700.

Meanwhile, online classes are — in defiance of all reason — generally longer and more involved than in-person classes. To make up for the lack of in-person instruction, they gorge on assignments, sometimes featuring as many as 60 quizzes in a term. The consequence is as much cheating as education; if you've got a willing partner or three, you could theoretically divide up the coursework and hope the underpaid grader doesn't notice.

Completion rates for online courses are dismal as well, especially at places such as the University of Phoenix Online, which has invested heavily in front-end services like financial aid advising but far less in teachers and student support.

All of this makes perfect sense from an economic standpoint: University administrators are rational actors, and what they're incentivized to maximize are paid student enrollments. There's still no real penalty for failing to graduate students, so why not chase that easy federal money and focus all the effort on upfront enrollment? But what's clear is that this system does not offer a viable, sensible alternative for students; it just allows administrations to exploit the crisis in education to make even more money with even less effort or investment.

3) Tenured professors pity adjuncts. But we can't help them.

We all went into this business with the best of intentions. Those of us who sought PhDs in overpopulated and declining fields knew that the market was not only rough but absolutely brutal; dark humor about the impossible odds facing PhD seekers is part and parcel of the whole grad student experience.

Among the handful of academics who do land tenure-track jobs, one finds little sympathy for the less fortunate. Lip service, to be sure, but academia is a bloodless, endless game of Survivor in which every winner is saying to himself or herself, "There but for the grace of God go I" — or, more likely, "Sucks for them, but what can you do?"

As someone who has sat in department meetings, served on hiring committees, and powwowed with other "real" academics at conferences, I can offer the following statement with confidence: No matter how bad things are for the adjuncts, they're effectively non-people to their ostensible colleagues. We won't save you. It's not that we full-timers don't care; it's that we can't. The rules of the game for tenure are simple and terrible — "do twice as much as you think you need to do" — and there's no time to worry about the fallen when your own pay lags well behind the national average.

Life for the liberal arts adjuncts, who surely deserve better, is only getting worse as enrollments climb. University administrators maximize the bottom line, and the bottom line at most non-elite schools is tuition-paying customers. If you can pay someone to teach five history classes for $15,000 or pay someone else $60,000 to teach those same five classes, why bother with the latter? People complain, but there's no real evidence showing that loss of business from students turned off by less-qualified instructors is even close to competing with the savings.

The incentives are especially destructive in the humanities. When administrators do decide to invest in faculty, they tend to favor STEM professors. Those guys rake in the valuable grant money, and thanks to the miracle of co-authored papers, they produce far longer CVs with far better citation counts, a valuable asset when chasing a higher school ranking and the cash that comes with it.

The situation has become dire enough that I often think the only feasible solution would be to eliminate tenure altogether. Morally, such a plan would be repugnant: Academics deserve the freedom to work at their pace and without the fear of too much administrative interference. But economically, it might be the only thing that allows for real labor market flexibility, forcing out elderly and ineffective professors and driving a rise in the standard of living for those many talented adjuncts who are unable to find work under prevailing conditions.

4) "Alt-academia" isn't a solution — it's surrender

So if not to the wretched life of an adjunct, whither our underpaid, overeducated PhDs? The notion of "alternative academic" careers has become a rallying cry for many, particularly those whose alternative academic position involves finding alt-ac jobs for other PhDs.

Briefly put, "alternative academia" is a catchall term for the process wherein individuals, unsuccessful in their quest to become university professors or disillusioned with that sort of work, seek alternative employment at places like libraries, nonprofits, university presses, and private sector think tanks.

These positions are typically filled by people with master's degrees or other terminal credentials; those with doctorates, goes the reasoning, would be able to use their critical thinking skills to excel in such fields, which lack many of the pressures associated with the tenure track but still offer opportunities to undertake meaningful, exciting work.

The concept is good enough in theory, but in practice it's just another way of phrasing the problem: There's not enough room in academia. Go find a job in a different field.

Some blame scholars themselves for the problem — claiming that today's PhD holders aren't as capable or as qualified as generations past. But after sitting on hiring committees and reading hundreds of CVs and writing samples, I refuse to blame the earnest applicants whose sole crime was being told scholarship was a worthwhile pursuit and believing it. If anything, market pressures have resulted in the production of some of the finest scholarship in generations, with even many adjuncts having a handful of great publications under their belts. The problem is that the system is more than happy to take their money and use their services from undergrad all the way to their doctoral graduation, but when it comes time to pay it off with a real job? Sorry — best look somewhere "alternative."

Recently, an article circulated that urged PhD seekers to view their degrees as a six-year, time-limited job, after which they should expect to move on to something else. That's all well and good, but like my $50-a-pop lectures, is that something you'd want to invest in? When presented with such stark questions, I'd imagine most people would say no. Forcing people to master multiple languages, paleography, archival research, and coding, all the while reminding them they need to be ready to retool as academic advisers or advertising executives, isn't a solution to the academic crisis — it's outright surrender to it.

5) The students and professors aren't the problem; the university system is

All of these issues lead to one difficult-to-escape conclusion. Despite all the finger-pointing directed at students ("They're lazy! They're oversensitive! They're entitled!"), and the blame heaped on professors ("Out of touch and irrelevant to a man"), the real culprit is systemic. Our federally backed approach to subsidizing higher education through low-interest loans has created perverse incentives with disastrous consequences. This system must be reformed.

When I started out, I believed that government regulation could solve every problem with relatively simple intervention. But after four years of wading through this morass, I'm convinced such interventions should be reevaluated constantly. If they're not achieving their objectives, or if they're producing too much waste in the process, they ought to be scrapped. We can start with federal funding for higher education.

The quickest and most painful solution to the crisis would involve greatly reducing the amount of money that students can borrow to attend college. Such reductions could be phased in over a span of years to alleviate their harshness, but the goal would remain the same: to force underperforming private and public universities out of business. For-profit universities — notorious for their lack of anything resembling good academic intention — should be barred altogether from accessing these programs; let them charge only what consumers in a genuinely free market can afford to pay for their questionable services.

Without the carrot of easy access to student loans, enrollments would shrink. Universities would be forced to compete on a cost-per-student basis, and those students still paying to attend college would likely focus their studies on subjects with an immediate return on investment. Lower tuition costs, perhaps dramatically lower at some institutions, would still enable impoverished students eligible for Pell Grant assistance to attend college. Vocational education programs, which would likely expand in the wake of such a massive adjustment, would offer inexpensive skills training for others. The liberal arts wouldn't necessarily die out — they'd remain on the Ivy League prix-fixe menu, to be sure, and curious minds of all sorts would continue to seek them out — but they'd no longer serve as a final destination for unenthusiastic credential seekers.

In the time that's allotted to us in life, we have to make many choices. Pursuing an unmarketable career solely because one loves it is always an option. But that decision has consequences. In a university system like ours, where supply and demand are distorted, many promising young people make rash decisions with an inadequate understanding of their long-term implications. Even for people like me, who succeed despite the odds, it's possible to look back and realize we've worked toward a disappointment, ending up as "winners" of a mess that damages its participants more every day.

Had I known sooner, I would've given up on this shrinking side of academia many years ago, saving myself plenty of grief while conserving the most valuable commodity of all: time. No one should have to wait so long or sacrifice so much of it for a system like this. Time is money, and we must spend it wisely. Until something is done — something that isn't just a quick fix, something that looks long and hard at the structure of the present university system and tears it up from the foundation, if that's what it takes — the academy is no longer an investment of time worth making.

Oliver Lee is an attorney and assistant professor of history. His writing has appeared in the Atlantic, VICE, Salon, Mic, and Al Jazeera America.
