“Yeah, proof is the bottom line for everyone…”

In 1994, Norman Cantor was gearing up for his fourth year of besiegement after the release of Inventing the Middle Ages, a mass-market book in which he sought to show how the formative experiences of certain twentieth-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a lively but little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” by holding up the movie Robin Hood: Prince of Thieves as “a highly significant core defeat” the academy hadn’t even known it had suffered:

It shows how little the academic medievalists have made an impact on popular culture and its view of the medieval world. Costner’s Robin Hood signifies social failure for the Ivy League, Oxbridge, and the Medieval Academy of America. But I expect the august personalities in those exalted precincts never gave a moment’s thought to this connection.

I recalled Cantor’s smart, spirited (and, in retrospect, debatable) rant when I read last week’s Chronicle of Higher Education piece by Paul Dicken, a philosopher of science who’s keen to write for popular audiences despite the sneering of colleagues and peers:

Yet as I struggle on with my apparently misguided endeavors, I sometimes think that maybe the search committee had a point. It is difficult pitching academic material in a way that is suitable for a popular audience. I don’t pretend to be an unparalleled communicator of ideas, nor do I kid myself about my ability to produce pithy and engaging prose. After many years of writing for peer review, I have developed a nasty habit of overusing the passive voice — not to mention the usual reliance upon jargon, excessive footnotes, and the death by a thousand qualifications that undermines any attempt to state a clear, precise thesis. It is definitely a learning process. But no matter how dull the final product, I was at least confident that I could express my ideas clearly. That’s what we’re trained for, right?

I’ve known plenty of scholars who write lucid books and blogs; I doubt the academy nurtured the requisite skills.

When I decided to start writing in earnest, I drove wildly around England and Wales collecting material for travel stories. The Washington Post published two of them, but only after an editor nudged me with notes like this one from 1999:

I don’t think this lede works; it’s too slow and diffuse for our reader—imagine a bagel-eating Sunday morning householder, an occasional traveler seeking a weekly fix of travel infotainment—but surrounded by a pile of other sections tugging at his time, and household things about to start tugging too…this is different from someone who settles in for a long night with a New Yorker and a hot toddy.

A good editor knows how to improve and refine our writing without shearing off all of the frills and frippery we vainly adore. Thanks to that guy and a couple others like him, I sloughed off three-and-a-half years of bad grad-school style and (eventually, arguably) learned how to write. Paul Dicken, stick to your plan: keeping readers engrossed in weighty matters without overusing the passive voice or condemning them to “death by a thousand qualifications” doesn’t require “an unparalleled communicator of ideas.” Just know your audience, then decide what you’re doing is, among other things, art.

* * *

We’re overdue for great shifts in our obsolete cultural coalitions; the creaking we hear as they seize up and fail is also the venting of truths. In another Chronicle of Higher Education piece last week, philosopher and science historian Lee McIntyre decries the recent “attack on truth” that he believes has us ambling into “an age of willful ignorance”:

It is sad that the modern attack on truth started in the academy — in the humanities, where the stakes may have initially seemed low in holding that there are multiple ways to read a text or that one cannot understand a book without taking account of the political beliefs of its author.

That disrespect, however, has metastasized into outrageous claims about the natural sciences.

Anyone who has been paying attention to the fault lines of academic debate for the past 20 years already knows that the “science wars” were fought by natural scientists (and their defenders in the philosophy of science) on the one side and literary critics and cultural-studies folks on the other. The latter argued that even in the natural realm, truth is relative, and there is no such thing as objectivity. The skirmishes blew up in the well-known “Sokal affair” in 1996, in which a prominent physicist created a scientifically absurd postmodernist paper and was able to get it published in a leading cultural-studies journal. The ridicule that followed may have seemed to settle the matter once and for all.

But then a funny thing happened: While many natural scientists declared the battle won and headed back to their labs, some left-wing postmodernist criticisms of truth began to be picked up by right-wing ideologues who were looking for respectable cover for their denial of climate change, evolution, and other scientifically accepted conclusions. Alan Sokal said he had hoped to shake up academic progressives, but suddenly one found hard-right conservatives sounding like Continental intellectuals. And that caused discombobulation on the left.

“Was I wrong to participate in the invention of this field known as science studies?” Bruno Latour, one of the founders of the field that contextualizes science, famously asked. “Is it enough to say that we did not really mean what we said? Why does it burn my tongue to say that global warming is a fact whether you like it or not? Why can’t I simply say that the argument is closed for good?”

“But now the climate-change deniers and the young-Earth creationists are coming after the natural scientists,” the literary critic Michael Bérubé noted, “… and they’re using some of the very arguments developed by an academic left that thought it was speaking only to people of like mind.”

Having noticed, as Norman Cantor did, how rare it is for new discoveries about the Middle Ages to prosper off-campus unless they’re being exploited for linkbait, I was startled by this whole line of thought. I’ll have to read McIntyre’s book to see if it’s true that postmodernist humanities scholars influenced “hard-right conservatives” or “climate-change deniers and the young-Earth creationists.” I doubt it, although I suspect the latter have at least heckled the former to live up to the credos implied by their critical approaches. But what a remarkable admission: a fair amount of recent work in the humanities is baloney that was never meant to be consumed, sold, or even sniffed by outsiders.

Humanities theorists have insisted for years that when we set our work loose, it’s no longer our own. They’ll find in the end that intentions still matter: there’s more pleasure and solace in writing and art when you believe what you’re doing is true.

“Unsheathe the blade within the voice…”

Is polysemy now unseemly? Two weeks ago, when historian Steve Muhlberger traveled to that great North American ent-moot, the International Congress on Medieval Studies, he found himself in the midst of “a lot of griping and grouching about the misuse and ambiguity of the word medieval.” In a lucid and laudably concise blog post, he calls out the problem behind the problem:

You would think that a bunch of scholars who by the very nature of their discipline are experts in the evolution of the meaning of words would by now have gotten over the fact that though it doesn’t make a lot of sense to call “the Middle Ages” by that term, and that coming up with a really good, chronological definition of those ages is impossible, we are stuck with the words medieval and Middle Ages anyway. But no . . .

Steve is a scholar of chivalric tournaments and an experienced combat reenactor, so he knows how to land a disarming blow:

This can be intensely irritating for people who know that certain phrases and analyses lost their cogency back in 1927 and want to talk about what their friends are doing in the field now. Nevertheless people whose business is words should really accept the fact that words like “medieval” have a number of popular meanings, and when one of them shows up in current discussion (when, for instance, a Game of Thrones shows up and is widely labelled as medieval, even though the world of Game of Thrones is not our earth at all), the fact can be dealt with in a good-humored way. It certainly would reflect credit on any field where a good-humored approach was the norm.

It would indeed. Off campus, the world blissfully resists more than a century of scholarship—pop culture still depicts Vikings in huge horned helmets, for heaven’s sake—and I respectfully suggest that more scholars contemplate why this is so.

As the rare soul who’s read every volume of Studies in Medievalism, I’ve marveled at the field’s mania for nomenclature. Since at least 2009, contributors to the journal—and its sister publication The Year’s Work in Medievalism, and its annual conference, and a pricey new handbook of critical terms—have kicked around the meaning of “medievalism” and “neo-medievalism” until every syllable simpers for mercy. Because I write about medievalism not as a professional scholar but as a footloose amateur, I miss the many years of meaty articles explaining, say, how boys’ chivalric clubs helped inspire the American scouting movement or why we’re perpetually tempted to make Dante a mouthpiece for generational angst. Forged from an accidental alloy of romanticism, nostalgia, politics, religion, and wishful thinking, medievalism can’t help but have jagged edges. It’s tiring to hone terms of art so finely that they cease to exist in three dimensions; we may as well flaunt the imperfection.

When it comes to the matter of the merely medieval, here’s Steve Muhlberger again:

David Parry made the most sensible remark of the entire week when he pointed out that an imprecise word like medieval has a lot of cultural value for people who make their living interpreting that era. Indeed there is a financial payoff in being associated with it.

What’s the worth of a timeworn coinage? Steve’s full blog post answers that question, with the suggestion that settling on terms can pay other, less measurable dividends too.

“The story is old, I know, but it goes on…”

With its mix of sunshine and harmless bluster, September brings back-to-school nostalgia—ivy-covered professors, that first fall riot, scoldings for being insufficiently euphoric over sports—and perhaps that’s why the past two weeks have swirled with stories about the woes of humanities types in academia. I’ve watched would-be scholars expire en route to the ferne halwe of full professorhood for 20 years, so I’m guessing that many grad students and adjuncts have newly discerned, with the sort of creeping, pitiless dread otherwise confined to Robert E. Howard stories, that they won’t find long-term employment.

First, at the Atlantic, Jordan Weissmann asked why the number of grad students in the humanities is growing. Then, Slate ran a piece about the awkwardness that still hangs about people with doctorates in the humanities who land “alt-ac” careers—that is, jobs where they don’t teach college. Apparently, though, there aren’t enough such lucky people, because a few days later, Salon covered adjunct professors on food stamps.

With all the attention this subject now gets in the press, I can only hope that fewer souls will fling themselves into the hellmouth—but maybe academia shouldn’t have undone quite so many in the first place. While reading about medievalism in recent days, I found two historians who sensed where things were headed long ago.

The first was Karl F. Morrison, who wrote “Fragmentation and Unity in ‘American Medievalism,'” a chapter in The Past Before Us, a 1980 report commissioned by the American Historical Association to explain the work of American historians to their colleagues in other countries. Morrison writes candidly about his field, but he also makes an especially prescient extrapolation, which I’ve bolded:

There was also an expectation in the “guild” that investment in professional training would, in due course, fetch a return in professional opportunity.

By 1970, these benefits could no longer be taken for granted. By 1974, even the president of Harvard University was constrained to deliver a budget of marked austerity, reducing “the number of Assistant Professors substantially while cutting the size of the graduate student body below the minimum desirable levels.” The aggregate result of many such budgets across the country was a sharp reduction in the number of professional openings for medievalists, and an impairment of library acquisitions and other facilities in aid of research. Awareness of this changed climate impelled a large number of advanced students to complete their doctoral dissertations quickly, producing a bulge that is noticeable around 1972-1974 in our tables. For many reasons, including the deliberate reduction or suspension of programs in some universities, it also resulted in a decline in the number of graduate students proceeding to the doctorate.

In effect, the historians who became qualified during this period without being able to secure professional employment constitute a generation of scholars that may be in the process of being lost, casualties of abrupt transition. **There is no reason to expect that the demographic and economic trends that so sharply reversed their professional expectations will alter before the end of the century, and this projection raises certain quite obvious possibilities regarding the diversity and renewal of the profession.**

Fast forward to 1994. Norman Cantor was gearing up for his fourth year of professional besiegement after the release of Inventing the Middle Ages, a book for non-academic readers in which he sought to show how the formative experiences of certain 20th-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” and noted, in passing, the crummy academic job market and the prevalence of certain “alt-ac” career paths even then:

Within academia a fearful conservative conformity prevails. The marginal employment situation has a twofold negative impact. First, it discourages innovative minds and rebellious personalities from entering doctoral programs in the humanities. People in their late twenties and thirties today with the highest potential to be great medievalists and bridge academic medieval studies and popular medievalism are a phantom army, a lost generation. Instead, for the most part, of climbing the ladder at leading universities they are pursuing careers (often regretfully and unhappily if well-paid) in major law firms.

Second, even if imaginative people take Ph.D.’s in medieval disciplines, they face the job market and particularly once they get a prized tenure track post they encounter a chilling intellectual conservatism that frustrates expressions of their best thoughts and deepest feelings.

I like Cantor’s claim that academia is literally conservative. After all, people are still fretting over problems that he and Morrison noticed decades ago. It’s September 2014, yet Rebecca Schuman at Slate can still write: “The academic job market works on a fixed cycle, and according to a set of conventions so rigid that you’d think these people were applying for top-secret security clearances, not to teach Physics 101 to some pimply bros in Sheboygan.”

The early blogosphere was rife with humanities grad students and adjuncts wavering between disgruntlement and despair; the much-praised Invisible Adjunct rose up to unite them in discussions so civil that I can scarcely believe I saw them on the Internet.

As someone who writes about people who use the imagined past to carve out identities, argue from authority, resist mainstream culture, or seek respite from the real world, I think I understand why the number of new students in arts and humanities doctoral programs grew by 7.7 percent in 2012, but I can’t claim a moment’s nostalgia for the geeky excitement they surely must feel. Morrison and Cantor both imagined a lost generation, but their jobless contemporaries were merely wandering. For this next generation, that luxury is long gone—as is the prospect of claiming that nobody warned them.

“Empty-handed on the cold wind to Valhalla…”

For all the violence the Vikings unleashed, their enemies and victims might find cold comfort in the torments Americans now inflict on them. We’ve twisted them into beloved ancestors, corny mascots, symbolic immigrants, religious touchstones, comic relief—and, this week, proponents of gender equity on the battlefield. The medieval past is grotesque, uninviting, and indifferent to our hopes. We wish so badly that it weren’t.

“Shieldmaidens are not a myth!” trumpeted a Tor.com blog post on Tuesday, sharing tidings of endless Éowyns in the EZ-Pass lane to the Bifröst:

By studying osteological signs of gender within the bones themselves, researchers discovered that approximately half of the remains were actually female warriors, given a proper burial with their weapons . . . It’s been so difficult for people to envision women’s historical contributions as solely getting married and dying in childbirth, but you can’t argue with numbers—and fifty/fifty is pretty damn good.

Great Odin’s ophthalmologist! Holy hopping Hávamál! Half of all Viking warriors were women?

Alas, no. “Researchers discovered” nothing of the sort—but that didn’t stop wishful linkers from sharing the “news” hundreds of times via Twitter and countless times on Facebook.

So what’s going on here? Besides conflating “Viking” with “Norse,” the pseudonymous author of the Tor.com blog post misread a two-year-old USA Today summary of a 2011 article by scholar Shane McLeod, who most definitely has not delivered forsaken warrior maidens from their long-neglected graves. No, McLeod simply did the un-newsworthy work of reassessing burial evidence for the settlement of Norse women in eastern England in the late 800s, with nary a Brunhilde or Éowyn in sight.

You can find “Warriors and women: the sex ratio of Norse migrants to eastern England up to 900 AD” in the August 2011 issue of the journal Early Medieval Europe. If you don’t have institutional access to scholarly databases, the article is imprisoned behind a $35 paywall, which is a shame, because although McLeod’s piece requires a slow, patient read, you don’t need expertise in ninth-century English history or modern osteology to understand it—just the ability to follow an argument about a couple dozen skeletons in a tiny corner of England at a very specific time in history, plus an openness to the possibility that McLeod hasn’t brought your “Game of Thrones” fantasies to life.

Here’s the gist of McLeod’s article, as concisely as I can retell it:

Focusing only on the area of eastern England occupied by the Norse in the 800s, he looks at one sample of six or seven burials from five locations dating from 865 to 878 A.D. where scholars had made assumptions about the sex of the dead based on the stuff buried with them. He compares them to a second sample: 14 burials from five sites (dating from 873 to the early 10th century) where osteologists determined the sex of the dead by examining their bones.

In the first group, only one person was tagged as female. In the second group, between four and six of the dead, perhaps half of the sample, were found to be female, even though at least one of those women, buried with a sword, might previously have been assumed from her grave goods to be male. (Ah, but that woman was also interred with a child of indeterminate sex. What if the sword belonged to her young son? And look: someone in the first group who might have been a woman was buried with a sword, too…)

McLeod’s assessment is this: If we scientifically determine the sex of the dead based on their bones rather than assume their sex based on grave goods, we find more evidence (to pile atop existing evidence from jewelry finds) that Norse women came to England with Norse armies, earlier and in greater numbers than previously thought, rather than in a later wave of migration and settlement. Perhaps the men weren’t “a demobbed Norse army seeking Anglo-Saxon wives,” but intermarried with local women in smaller numbers than historians previously believed.

For the lay reader, that’s a disheartening hoard of unsexy conclusions—and a far cry from the Tor.com blogger’s claim, mindlessly brayed across social media, that “Half of the Warriors were Female.” It’s fantasy, not scholarship, and certainly not science, to interpret one woman buried with a sword, maybe two, as evidence for Norse women in combat.

Shane McLeod deserves better. Working with limited data pried out of ninth-century crevices, he recognizes that his sample size is tiny, that it’s tough to identify burials as “Norse” for sure, and that his findings are only “highly suggestive.” He’s precise, tentative, and conscious of counter-arguments, and he seems willing to go wherever the evidence takes him. His biggest accomplishment, however, is highlighting a major scholarly error: experts who made assumptions about male versus female grave goods failed to reassess the biases they project backwards onto the Middle Ages—even though that sort of self-scrutiny is one of the traits even the most pop-minded academic medievalists claim distinguishes them from the duct-tape-sword-wielding masses.

Likewise, science-fiction fans are forever congratulating themselves for holding the right opinions on such subjects as evolution, but this time they lazily succumbed to fannish fantasies, failing to question a claim that deserved to be pummeled by doubt. I’ve done tons of social-media copywriting, so I get why that blogger just wanted to throw something out there after a holiday to beguile weekend-weary eyeballs—but come on.

Science doesn’t always tell us what we want to hear. Truth demands nuanced consideration of evidence, and reason demands skepticism, neither of which flourishes on social media—so if you shared or re-tweeted the Tor article, congratulations! This week, in the name of medievalism, you made the world stupider.

[2019 update: Research into this subject has developed quite a bit since 2014, but I’m keeping this post online because it’s still a good example of how careful academic research gets turned into misleading clickbait. Feel free to leave links to updated scholarly research in the comments for future readers who find this post via Google.]