A few years ago, when I was still
teaching at Yale, I was approached by a student who was interested in
going to graduate school. She had her eye on Columbia; did I know
someone there she could talk with? I did, an old professor of mine. But
when I wrote to arrange the introduction, he refused to even meet with
her. “I won’t talk to students about graduate school anymore,” he
explained. “Going to grad school’s a suicide mission.”
The policy may be extreme, but the feeling is universal. Most
professors I know are willing to talk with students about pursuing a
PhD, but their advice comes down to three words: don’t do it. (William
Pannapacker, writing in the Chronicle of Higher Education as
Thomas Benton, has been making this argument for years. See “The Big Lie
About the ‘Life of the Mind,’” among other essays.) My own advice was
never that categorical. Go if you feel that your happiness depends on
it—it can be a great experience in many ways—but be aware of what you’re
in for. You’re going to be in school for at least seven years, probably
more like nine, and there’s a very good chance that you won’t get a job
at the end of it.
At Yale, we were overjoyed if half our graduating students found
positions. That’s right—half. Imagine running a medical school on that
basis. As Christopher Newfield points out in Unmaking the Public University
(2008), that’s the kind of unemployment rate you’d expect to find among
inner-city high school dropouts. And this was before the financial
collapse. In the past three years, the market has been a bloodbath:
often only a handful of jobs in a given field, sometimes fewer, and as
always, hundreds of people competing for each one.
It wasn’t supposed to be like this. When I started graduate school in
1989, we were told that the disastrous job market of the previous two
decades would be coming to an end because the large cohort of people who
had started their careers in the 1960s, when the postwar boom and the
baby boom combined to more than double college enrollments, was going to
start retiring. Well, it did, but things kept getting worse. Instead of
replacing retirees with new tenure-eligible hires, departments
gradually shifted the teaching load to part-timers: adjuncts, postdocs,
graduate students. From 1991 to 2003, the number of full-time faculty
members increased by 18 percent. The number of part-timers increased by
87 percent—to almost half the entire faculty.
But as Jack Schuster and Martin Finkelstein point out in their comprehensive study The American Faculty
(2006), the move to part-time labor is already an old story. Less
visible but equally important has been the advent and rapid expansion of
full-time positions that are not tenure-eligible. No one talks
about this transformation—the creation of yet another academic
underclass—and yet as far back as 1993, such positions already
constituted the majority of new appointees. As of 2003, more than a
third of full-time faculty were working off the tenure track. That
same year, tenure-track professors—the “normal” kind of academic
appointment—represented no more than 35 percent of the American faculty.
The reasons for these trends can be expressed in a single word, or
buzzword: efficiency. Contingent academic labor, the formal term for
non-tenure-track faculty both part-time and full-time, is cheaper to hire
and easier to fire. It saves departments money and gives them greater
flexibility in staffing courses. Over the past twenty years, in other
words—or really, over the past forty—what has happened in academia is
what has happened throughout the American economy. Good, secure,
well-paid positions—tenured appointments in the academy, union jobs on
the factory floor—are being replaced by temporary, low-wage employment.
* * *
You’d think departments would respond to the Somme-like conditions
they’re sending out their newly minted PhDs to face by cutting down the
size of their graduate programs. If demand drops, supply should drop to
meet it. In fact, many departments are doing the opposite, the job
market be damned. More important is maintaining the flow of labor to
their domestic sweatshops, the pipeline of graduate students who staff
discussion sections and teach introductory and service courses like
freshman composition and first-year calculus. (Professors also need
dissertations to direct, or how would they justify their own existence?)
As Louis Menand puts it in The Marketplace of Ideas (2010),
the system is now designed to produce not PhDs so much as ABDs: students
who, having finished their other degree requirements, are “all but
dissertation” (or “already been dicked,” as we used to say)—i.e., people
who have entered the long limbo of low-wage research and teaching that
chews up four, five, six years of a young scholar’s life.
If anything, as Menand notes, the PhD glut works well for departments
at both ends, since it gives them the whip hand when it comes to hiring
new professors. Graduate programs occupy a highly unusual, and
advantageous, market position: they are both the producers and the
consumers of academic labor, but as producers, they have no financial
stake in whether their product “sells”—that is, whether their graduates
get jobs. Yes, a program’s prestige is related, in part, to its
placement rate, but only in relative terms. In a normal industry, if no
firm sells more than half of what it produces, then either everyone goes
out of business or the industry consolidates. But in academia, if no
one does better than 50 percent, then 50 percent is great. Programs have
every incentive to keep prices low by maintaining the oversupply.
Still, there’s a difference between a Roger Smith firing workers at
General Motors and the faculty of an academic department treating its
students like surplus goods. For the CEO of a large corporation, workers
are essentially entries on a balance sheet, separated from the
boardroom by a great gulf of culture and physical distance. If they are
treated without mercy, that is not entirely surprising. But the
relationship between professors and graduate students could hardly be
more intimate. Professors used to be graduate students. They
belong to the same culture and the same community. Your dissertation
director is your mentor, your role model, the person who spends all
those years overseeing your research and often the one you came to
graduate school to study under in the first place. You, in turn, are her
intellectual progeny; if you make good, her professional pride. The
economic violence of the academic system is inflicted at very close
quarters.
How professors square their Jekyll-and-Hyde roles in the
process—devoted teachers of individual students, co-managers of a system
that exploits them as a group—I do not know. Denial, no doubt, along
with the rationale that this is just the way it is, so what can you do?
Teaching is part of the training, you hear a lot, especially when
supposedly liberal academics explain why graduate-student unions are
such a bad idea. They’re students, not workers! But grad students don’t
teach because they have to learn how, even if the experience is indeed
very valuable; they teach because departments need “bodies in the
classroom,” as a professor I know once put it.
I always found it beautifully apt that my old department occupies the
same space where the infamous Milgram obedience experiments were
conducted in the early 1960s. (Yes, really.) Pay no attention to the
screams you hear coming from the next room, the subjects were told as
they administered the electric shocks, it’s for their own good—a perfect
allegory of the relationship between tenured professors and graduate
students (and tenured professors and untenured professors, for that
matter).
Well, but so what? A bunch of spoiled kids are having trouble finding
jobs—so is everybody else. Here’s so what. First of all, they’re not
spoiled. They’re doing exactly what we always complain our brightest
students don’t do: eschewing the easy bucks of Wall Street, consulting
or corporate law to pursue their ideals and be of service to society.
Academia may once have been a cushy gig, but now we’re talking about
highly talented young people who are willing to spend their 20s living
on subsistence wages when they could be getting rich (and their friends are getting
rich), simply because they believe in knowledge, ideas, inquiry; in
teaching, in following their passion. To leave more than half of them
holding the bag at the end of it all, over 30 and having to scrounge for
a new career, is a human tragedy.
Sure, lots of people have it worse. But here’s another reason to
care: it’s also a social tragedy, and not just because it represents a
colossal waste of human capital. If we don’t make things better for the
people entering academia, no one’s going to want to do it anymore. And
then it won’t just be the students who are suffering. Scholarship will
suffer, which means the whole country will. Knowledge, as we’re
constantly told, is a nation’s most important resource, and the great
majority of knowledge is created in the academy—now more than ever, in
fact, since industry is increasingly outsourcing research to
universities where, precisely because graduate students cost less than
someone who gets a real salary, it can be conducted on the cheap. (Bell
Labs, once the flagship of industrial science, is a shell of its former
self, having suffered years of cutbacks before giving up on fundamental
research altogether.)
It isn’t just the sciences that matter; it is also the social
sciences and the humanities. And it isn’t just the humanities that are
suffering. Basic physics in this country is all but dead. From 1971 to
2001, the number of bachelor’s degrees awarded in English declined by 20
percent, but the number awarded in math and statistics declined by 55
percent. The only areas of the liberal arts that saw an increase in BAs
awarded were biology and psychology—and this at a time when aggregate
enrollment expanded by something like 75 percent. On the work that is
done in the academy depends the strength of our economy, our public
policy and our culture. We need our best young minds going into
atmospheric research and international affairs and religious studies,
chemistry and ethnography and art history. By pursuing their individual
interests, narrowly understood, departments are betraying both the
values they are pledged to uphold—the pursuit of knowledge, the spirit
of critical inquiry, the extension of the humanistic tradition—and the
nation they exist to serve.
We’ve been here before. Pay was so low in the nineteenth century,
when academia was still a gentleman’s profession, that in 1902 Andrew
Carnegie created the pension plan that would evolve into TIAA-CREF, the
massive retirement fund. After World War II, when higher education was
seen as an urgent national priority, a consensus emerged that salaries
were too small to attract good people. Compensation soared through the
1950s and ’60s, then hit the skids around 1970 and didn’t recover for
almost thirty years. It’s no surprise that the percentage of college
freshmen expressing an interest in academia was more than three times
higher in 1966 than it was in 2004.
But the answer now is not to raise professors’ salaries. Professors
already make enough. The answer is to hire more professors: real ones,
not academic lettuce-pickers.
Yet that’s the last thing schools are apt to do. What we have seen
instead over the past forty years, in addition to the raising of a
reserve army of contingent labor, is a kind of administrative
elephantiasis, an explosion in the number of people working at colleges
and universities who aren’t faculty, full-time or part-time, of any
kind. From 1976 to 2001, the number of nonfaculty professionals
ballooned nearly 240 percent, growing more than three times as fast as
the faculty. Coaching staffs and salaries have grown without limit;
athletic departments are virtually separate colleges within universities
now, competing (successfully) with academics. The size of presidential
salaries—more than $1 million in several dozen cases—has become
notorious. Nor is it only the presidents; the next six most highly paid
administrative officers at Yale averaged over $430,000 in 2007. As Gaye
Tuchman explains in Wannabe U (2009), a case study in the
sorrows of academic corporatization, deans, provosts and presidents are
no longer professors who cycle through administrative duties and then
return to teaching and research. Instead, they have become a separate
stratum of managerial careerists, jumping from job to job and
organization to organization like any other executive: isolated from the
faculty and its values, loyal to an ethos of short-term expansion, and
trading in the business blather of measurability, revenue streams,
mission statements and the like. They do not have the long-term health
of their institutions at heart. They want to pump up the stock price
(i.e., U.S. News and World Report ranking) and move on to the next fat post.
If you’re tenured, of course, life is still quite good (at least
until the new provost decides to shut down your entire department). In
fact, the revolution in the structure of academic work has come about in
large measure to protect the senior professoriate. The faculty have
steadily grayed in recent decades; by 1998 more than half were 50 or
older. Mandatory retirement was abolished in 1986, exacerbating the
problem. Departments became “tenured in,” with a large bolus of highly
compensated senior professors and room, increasingly squeezed in many
cases, for just a few junior members—another reason jobs have been so
hard to find. Contingent labor is desirable above all because it saves
money for senior salaries (as well as relieving the tenure track of the
disagreeable business of teaching low-level courses). By 2004, while pay
for assistant and associate professors still stood more or less where
it had in 1970, that for full professors was about 10 percent higher.
What we have in academia, in other words, is a microcosm of the
American economy as a whole: a self-enriching aristocracy, a swelling
and increasingly immiserated proletariat, and a shrinking middle class.
The same devil’s bargain stabilizes the system: the middle, or at least
the upper middle, the tenured professoriate, is allowed to retain its
prerogatives—its comfortable compensation packages, its workplace
autonomy and its job security—in return for acquiescing to the
exploitation of the bottom by the top, and indirectly, the betrayal of
the future of the entire enterprise.
* * *
But now those prerogatives are also under threat. I am not joining
the call for the abolition of tenure—a chorus that includes two of last
year’s most widely noticed books on the problems of America’s colleges
and universities, Higher Education?, by Andrew Hacker and Claudia Dreifus, and Crisis on Campus,
by Mark Taylor. Tenure certainly has its problems. It crowds out
opportunities for young scholars and allows academic deadwood to
accumulate on the faculty rolls. But getting rid of it would be like
curing arteriosclerosis by shooting the patient. For one thing, it would
remove the last incentive for any sane person to enter the profession.
People still put up with everything they have to endure as graduate
students and junior professors for the sake of a shot at that golden
prize, and now you’re going to take away the prize? No, it is not good
for so many of academia’s rewards to be backloaded into a single moment
of occupational transfiguration, one that sits like a mirage at the end
of twelve or fifteen years of Sinaitic wandering. Yes, the job market
would eventually rebalance itself if the profession moved, say, to a
system of seven-year contracts, as Taylor suggests. But long before it
did, we would lose a generation of talent.
Besides, how would the job market rebalance itself? If the
people who now have tenure continued to serve under some other
contractual system, the same surplus of labor would be chasing the same
scarcity of employment. Things would get better for new PhDs only if
schools started firing senior people. Which, as the way things work in
other industries reminds us, they would probably be glad to do. Why
retain a 55-year-old when you can replace her with a 30-year-old at half
the price? Now that’s a thought to swell a provost’s revenue stream.
Talk about efficiency.
And what exactly are you supposed to do at that point if you’ve spent
your career becoming an expert in, say, Etruscan history? Academia
exists in part to support research the private sector won’t pay
for, knowledge that can’t be converted into a quick buck or even a slow
one, but that adds value to society in other ways. Who’s going to
pursue that kind of inquiry if they know there’s a good chance they’re
going to get thrown out in the snow when they’re 50 (having only started
to earn a salary when they were 30, to boot)? Doctors and lawyers can
set up their own practice, but a professor can’t start his own
university. This kind of thing is appalling enough when it happens to
blue-collar workers. In an industry that requires a dozen years of
postsecondary education just to gain an entry-level position, it is
unthinkable.
Nor should we pooh-pooh the threat the abolition of tenure would pose
to academic freedom, as Hacker and Dreifus do. “We have scoured all the
sources we could find,” they write, “yet we could not find any academic
research whose findings led to terminating the jobs of college faculty
members.” Yes, because of tenure. If deans and trustees and alumni and
politicians rarely even try to have professors fired, that is precisely
because they know they have so little chance of making it happen. Before
tenure existed, arbitrary dismissals were common. Can you imagine what
the current gang of newly elected state legislators would do if they
could get their hands on the people who teach at public universities?
(Just look at what happened to William Cronon, the University of
Wisconsin historian whose e-mails were demanded by the state Republican
Party after he exposed the role of the American Legislative Exchange
Council in Governor Scott Walker’s attack on public employee unions.)
Hacker and Dreifus, who recognize the importance of academic freedom,
call not for tenure but for presidents and trustees with “backbone” (a
species as wonderful as the unicorn, and almost as numerous). Sure, and
as long as the king is a good man, we don’t need democracy. Academics
play a special role in society: they tell us things we don’t want to
hear—about global warming, or the historical Jesus, or the way we raise
our children. That’s why they need to have special protections.
* * *
But the tenure system, which is already being eroded by the growth of
contingent labor, is not the only thing that is under assault in the
top-down, corporatized academy. As Cary Nelson explains in No University Is an Island
(2010), shared governance—the principle that universities should be
controlled by their faculties, which protects academic values against
the encroachments of the spreadsheet brigade—is also threatened by the
changing structure of academic work. Contingent labor undermines it both
directly—no one asks an adjunct what he thinks of how things run—and
indirectly. More people chasing fewer jobs means that everyone is
squeezed for extra productivity, just like at Wal-Mart. As of 1998,
faculty at four-year schools worked an average of about seven hours more
per week than they had in 1972 (for a total of more than forty-nine
hours a week; the stereotype of the lazy academic is, like that of the
welfare queen, a politically useful myth). Not surprisingly, they also
reported a shrinking sense of influence over campus affairs. Who’s got
the time? Academic labor is becoming like every other part of the
American workforce: cowed, harried, docile, disempowered.
In macropolitical terms, the erosion of tenure and shared governance
undermines the power of a large body of liberal professionals. In this
it resembles the campaign against teachers unions. Tenure, in fact, is a
lot like unionization: imperfect, open to corruption and abuse, but
incomparably better than the alternative. Indeed, tenure is what
professors have instead of unions (at least at private universities,
where they’re banned by law from organizing). As for shared governance,
it is nothing other than one of the longest-standing goals of the left:
employee control of the workplace. Yes, professors have it better than a
lot of other workers, including a lot of others in the academy. But the
answer, for the less advantaged, is to organize against the employers
who’ve created the situation, not drag down the relatively privileged
workers who aren’t yet suffering as badly: to level up, in other words,
not down.
Of course, some sectors of the academy—the ones that educate the
children of the wealthy and the upper middle class—continue to maintain
their privilege. The class gradient is getting steeper, not only between
contingent labor and the tenure track, and junior and senior faculty
within the latter, but between institutions as well. Professors at
doctoral-granting universities not only get paid a lot more than their
colleagues at other four-year schools; the difference is growing, from
17 percent in 1984 to 28 percent in 2003. (Their advantage over
professors at community colleges increased during the same period from
33 percent to 49 percent.) The rich are getting richer. In 1970 (it
seems like an alternative universe now) faculty at public colleges and
universities actually made about 10 percent more than those at
private schools. By 1999 the lines had crossed, and public salaries
stood about 5 percent lower. The aggregate student-faculty ratio at
private colleges and universities is 10.8 to 1; at public schools, it is
15.9 to 1—almost 50 percent higher.
Here we come to the most important issue facing American higher
education. Public institutions enroll about three-quarters of the
nation’s college students, and public institutions are everywhere under
financial attack. As Nancy Folbre explains in Saving State U
(2010), a short, sharp, lucid account, spending on higher education has
been falling as a percentage of state budgets for more than twenty
years, to about two-thirds of what it was in 1980. The average six-year
graduation rate at state schools is now a dismal 60 percent, a function
of class size and availability, faculty accessibility, the use of
contingent instructors and other budget-related issues. Private
universities actually lobby against public funding for state schools,
which they see as competitors. In any case, a large portion of state
scholarship aid goes to students at private colleges (in some cases,
more than half)—a kind of voucher system for higher education.
Meanwhile, public universities have been shifting their financial aid
criteria from need to merit to attract applicants with higher scores
(good old U.S. News again), who tend to come from wealthier
families. Per-family costs at state schools have soared in recent years,
from 18 percent of income for those in the middle of the income
distribution in 1999 to 25 percent in 2007. Estimates are that over the
past decade, between 1.4 million and 2.4 million students have been
prevented from going to college for financial reasons—about 50 percent
more than during the 1990s. And of course, in the present climate of
universal fiscal crisis, it is all about to get a lot worse.
* * *
Our system of public higher education is one of the great
achievements of American civilization. In its breadth and excellence, it
has no peer. It embodies some of our nation’s highest ideals:
democracy, equality, opportunity, self-improvement, useful knowledge and
collective public purpose. The same president who emancipated the
slaves and funded the transcontinental railroad signed the Morrill Land
Grant Act of 1862, which set the system on its feet. Public higher
education is a bulwark against hereditary privilege and an engine of
social mobility. It is altogether to the point that the strongest state
systems are not to be found in the Northeast, the domain of the old WASP
aristocracy and its elite private colleges and universities, but in
places like Michigan, Wisconsin, Illinois, Virginia, North Carolina and,
above all, California.
Now the system is in danger of falling into ruin. Public higher
education was essential to creating the mass middle class of the postwar
decades—and with it, a new birth of political empowerment and human
flourishing. The defunding of public higher education has been essential
to its slow destruction. In Unmaking the Public University,
Newfield argues that the process has been deliberate, a campaign by the
economic elite against the class that threatened to supplant it as the
leading power in society. Social mobility is now lower in the United
States than it is in Northern Europe, Australia, Canada and even France
and Spain, a fact that ought to be tattooed on the foreheads of every
member of Congress, so directly does it strike at America’s identity as
the land of opportunity.
But it was not only the postwar middle class that public higher
education helped create; it was the postwar prosperity altogether.
Knowledge, again, is our most important resource. States that balance
their budgets on the backs of their public universities are not eating
their seed corn; they’re trampling it into the mud. My state of Oregon, a
chronic economic underperformer, has difficulty attracting investment,
not because its corporate taxes are high—they’re among the lowest—but
because its workforce is poorly educated. So it will be for the nation
as a whole. Our college-completion rate has fallen from second to
eighth. And we are not just defunding instruction; we are defunding
research, the creation of knowledge itself. Stipends are so low at the
University of California, Berkeley, the third-ranked research
institution on the planet, that the school is having trouble attracting
graduate students. In fact, the whole California system, the crown jewel
of American public higher education, is being torn apart by budget
cuts. This is not a problem; it is a calamity.
Private institutions are in comparable trouble, for reasons that will
sound familiar: too much spending during the boom years—much of it on
construction, much of it driven by the desire to improve “market
position” relative to competitors by offering amenities like new dorms
and student centers that have nothing to do with teaching or
research—supported by too much borrowing, has led to a debt crisis.
Among the class of academic managers responsible for the trouble in the
first place, an industry of reform has sprung up, along with a
literature of reform to go with it. Books like Taylor’s Crisis on Campus, James Garland’s Saving Alma Mater (2009) and the most measured and well-informed of the ones I’ve come across, Robert Zemsky’s Making Reform Work (2009), propose their variously visionary schemes.
Nearly all involve technology to drive efficiency. Online courses,
distance learning, do-it-yourself instruction: this is the future we’re
being offered. Why teach a required art history course to twenty
students at a time when you can march them through a self-guided online
textbook followed by a multiple-choice exam? Why have professors or even
graduate students grade papers when you can outsource them to BAs
around the country, even the world? Why waste time with office hours
when students can interact with their professors via e-mail?
The other great hope—I know you’ll never see this coming—is the
market. After all, it works so well in healthcare, and we’re already
trying it in primary and secondary education. Garland, a former
president of Miami University of Ohio (a public institution), argues for
a voucher system. Instead of giving money to schools, the state would
give it to students, and the credit would be good at any nonprofit
institution in the state—in other words, at private ones as well. The
student would run the show (as the customer should, of course), scouring
the market like a savvy consumer. Universities, in turn, “would compete
with each other…by tailoring their course offerings, degree programs,
student services, and extracurricular activities” to the needs of our
newly empowered 18-year-olds, and the invisible hand would rain down its
blessings.
But do we really want our higher education system redesigned by the
self-identified needs of high school seniors? This is what the British
are about to try, and in a country with one of Europe’s most
distinguished intellectual traditions, they seem poised to destroy the
liberal arts altogether. How much do 18-year-olds even know about what
they want out of college? About not only what it can get them, but what
it can give them? These are young people who don’t know what college is,
who they are, who they might want to be—things you need a college
education, and specifically a liberal arts education, to help you figure
out.
* * *
Yet the liberal arts, as we know, are dying. All the political and
parental pressure is pushing in the other direction, toward the
“practical,” narrowly conceived: the instrumental, the utilitarian, the
immediately negotiable. Colleges and universities are moving away from
the liberal arts toward professional, technical and vocational training.
Last year, the State University of New York at Albany announced plans
to close its departments of French, Italian, Russian, classics and
theater—a wholesale slaughter of the humanities. When Garland enumerates
the fields a state legislature might want to encourage its young people
to enter, he lists “engineering, agriculture, nursing, math and science
education, or any other area of state importance.” Apparently political
science, philosophy, history and anthropology, among others, are not
areas of state importance. Zemsky wants to consider reducing college to
three years—meaning less time for young people to figure out what to
study, to take courses in a wide range of disciplines, to explore, to
mature, to think.
When politicians, from Barack Obama all the way down, talk about
higher education, they talk almost exclusively about math and science.
Indeed, technology creates the future. But it is not enough to create
the future. We also need to organize it, as the social sciences enable
us to do. We need to make sense of it, as the humanities enable us to
do. A system of higher education that ignores the liberal arts, as
Jonathan Cole points out in The Great American University
(2009), is what they have in China, where they don’t want people to
think about other ways to arrange society or other meanings than the
authorized ones. A scientific education creates technologists. A liberal
arts education creates citizens: people who can think broadly and
critically about themselves and the world.
Yet of course it is precisely China—and Singapore, another great
democracy—that the Obama administration holds up as the model to emulate
in our new Sputnik moment. It’s funny; after the original Sputnik, we
didn’t decide to become more like the Soviet Union. But we don’t possess
that kind of confidence anymore.
There is a large, public debate right now about primary and secondary
education. There is a smaller, less public debate about higher
education. What I fail to understand is why they aren’t the same debate.
We all know that students in elementary and high school learn best in
small classrooms with the individualized attention of motivated
teachers. It is the same in college. Education, it is said, is lighting a
fire, not filling a bucket. The word shares its Latin root with “educe”:
to lead forth. Learning isn’t about downloading a certain quantity of
information into your brain, as the proponents of online instruction
seem to think. It is about the kind of interchange and incitement—the
leading forth of new ideas and powers—that can happen only in a seminar.
(“Seminar” being a fancy name for what every class already is from
K–12.) It is labor-intensive; it is face-to-face; it is one-at-a-time.
The key finding of Richard Arum and Josipa Roksa’s Academically Adrift
(2011), that a lot of kids aren’t learning much in college, comes as no
surprise to me. The system is no longer set up to challenge them. If
we’re going to make college an intellectually rigorous experience for
the students who already go—still more, for all the ones we want to go
if we’re going to reach the oft-repeated goal of universal postsecondary
education, an objective that would double enrollments—we’re going to
need a lot more teachers: well paid, institutionally supported, socially
valued. As of 2003 there were about 400,000 tenure-track professors in
the United States (as compared with about 6 million primary- and
secondary-school teachers). Between reducing class sizes, reversing the
shift to contingent labor and beefing up our college-completion rates,
we’re going to need at least five times as many.
So where’s the money supposed to come from? It’s the same question we
ask about the federal budget, and the answer is the same. We’re still a
very wealthy country. There’s plenty of money, if we spend it on the
right things. Just as we need to wrestle with the $700 billion gorilla
of defense, so do universities need to take on administrative edema and
extracurricular spending. We can start with presidential salaries.
Universities, like corporations, claim they need to pay the going rate
for top talent. The argument is not only dubious—whom exactly are they
competing with for the services of these managerial titans, aside from
one another?—it is beside the point. Academia is not supposed to be a
place to get rich. If your ego can’t survive on less than $200,000 a
year (on top of the prestige of a university presidency), you need to
find another line of work. Once, there were academic leaders who put
themselves forward as champions of social progress: people like Woodrow
Wilson at Princeton in the 1900s; James Conant at Harvard in the 1940s;
and Kingman Brewster at Yale, Clark Kerr at the University of California
and Theodore Hesburgh at Notre Dame in the 1960s. What a statement it
would make if the Ivy League presidents got together and announced that
they were going to take an immediate 75 percent pay cut. What a way to
restore academia’s moral prestige and demonstrate some leadership again.
But leadership will have to come from somewhere else, as well. Just
as in society as a whole, the academic upper middle class needs to
rethink its alliances. Its dignity will not survive forever if it
doesn’t fight for that of everyone below it in the academic hierarchy.
(“First they came for the graduate students, and I didn’t speak out
because I wasn’t a graduate student…”) For all its pretensions to public
importance (every professor secretly thinks he’s a public
intellectual), the professoriate is awfully quiet, essentially
nonexistent as a collective voice. If academia is going to once again
become a decent place to work, if our best young minds are going to be
attracted back to the profession, if higher education is going to be
reclaimed as part of the American promise, if teaching and research are
going to make the country strong again, then professors need to get off
their backsides and organize: department by department, institution to
institution, state by state and across the nation as a whole. Tenured
professors enjoy the strongest speech protections in society. It’s time
they started using them.