“Unsheathe the blade within the voice…”

Is polysemy now unseemly? Two weeks ago, when historian Steve Muhlberger traveled to that great North American ent-moot, the International Congress on Medieval Studies, he found himself in the midst of “a lot of griping and grouching about the misuse and ambiguity of the word medieval.” In a lucid and laudably concise blog post, he calls out the problem behind the problem:

You would think that a bunch of scholars who by the very nature of their discipline are experts in the evolution of the meaning of words would by now have gotten over the fact that though it doesn’t make a lot of sense to call “the Middle Ages” by that term, and that coming up with a really good, chronological definition of those ages is impossible, we are stuck with the words medieval and Middle Ages anyway. But no . . .

Steve is a scholar of chivalric tournaments and an experienced combat reenactor, so he knows how to land a disarming blow:

This can be intensely irritating for people who know that certain phrases and analyses lost their cogency back in 1927 and want to talk about what their friends are doing in the field now. Nevertheless people whose business is words should really accept the fact that words like “medieval” have a number of popular meanings, and when one of them shows up in current discussion (when, for instance, a Game of Thrones shows up and is widely labelled as medieval, even though the world of Game of Thrones is not our earth at all), the fact can be dealt with in a good-humored way. It certainly would reflect credit on any field where a good-humored approach was the norm.

It would indeed. Off campus, the world blissfully resists more than a century of scholarship—pop culture still depicts Vikings in huge horned helmets, for heaven’s sake—and I respectfully suggest that more scholars contemplate why this is so.

As the rare soul who’s read every volume of Studies in Medievalism, I’ve marveled at the field’s mania for nomenclature. Since at least 2009, contributors to the journal—and its sister publication The Year’s Work in Medievalism, and its annual conference, and a pricey new handbook of critical terms—have kicked around the meaning of “medievalism” and “neo-medievalism” until every syllable simpers for mercy. Because I write about medievalism not as a professional scholar but as a footloose amateur, I miss the many years of meaty articles explaining, say, how boys’ chivalric clubs helped inspire the American scouting movement or why we’re perpetually tempted to make Dante a mouthpiece for generational angst. Forged from an accidental alloy of romanticism, nostalgia, politics, religion, and wishful thinking, medievalism can’t help but have jagged edges. It’s tiring to hone terms of art so finely that they cease to exist in three dimensions; we may as well flaunt the imperfection.

When it comes to the matter of the merely medieval, here’s Steve Muhlberger again:

David Parry made the most sensible remark of the entire week when he pointed out that an imprecise word like medieval has a lot of cultural value for people who make their living interpreting that era. Indeed there is a financial payoff in being associated with it.

What’s the worth of a timeworn coinage? Steve’s full blog post answers that question, with the suggestion that settling on terms can pay other, less measurable dividends too.

“The story is old, I know, but it goes on…”

With its mix of sunshine and harmless bluster, September brings back-to-school nostalgia—ivy-covered professors, that first fall riot, scoldings for being insufficiently euphoric over sports—and perhaps that’s why the past two weeks have swirled with stories about the woes of humanities types in academia. For 20 years I’ve watched would-be scholars expire en route to the ferne halwe of full professorhood, so I’m guessing that many grad students and adjuncts have newly discerned, with the sort of creeping, pitiless dread otherwise confined to Robert E. Howard stories, that they won’t find long-term employment.

First, at the Atlantic, Jordan Weissmann asked why the number of grad students in the humanities is growing. Then, Slate ran a piece about the awkwardness that still hangs about people with doctorates in the humanities who land “alt-ac” careers—that is, jobs where they don’t teach college. Apparently, though, there aren’t enough such lucky people, because a few days later, Salon covered adjunct professors on food stamps.

With all the attention this subject now gets in the press, I can only hope that fewer souls will fling themselves into the hellmouth—but maybe academia shouldn’t have undone quite so many in the first place. While reading about medievalism in recent days, I found two historians who sensed where things were headed long ago.

The first was Karl F. Morrison, who wrote “Fragmentation and Unity in ‘American Medievalism,’” a chapter in The Past Before Us, a 1980 report commissioned by the American Historical Association to explain the work of American historians to their colleagues in other countries. Morrison writes candidly about his field, but he also makes an especially prescient extrapolation; note the second paragraph below:

There was also an expectation in the “guild” that investment in professional training would, in due course, fetch a return in professional opportunity.

By 1970, these benefits could no longer be taken for granted. By 1974, even the president of Harvard University was constrained to deliver a budget of marked austerity, reducing “the number of Assistant Professors substantially while cutting the size of the graduate student body below the minimum desirable levels.” The aggregate result of many such budgets across the country was a sharp reduction in the number of professional openings for medievalists, and an impairment of library acquisitions and other facilities in aid of research. Awareness of this changed climate impelled a large number of advanced students to complete their doctoral dissertations quickly, producing a bulge that is noticeable around 1972-1974 in our tables. For many reasons, including the deliberate reduction or suspension of programs in some universities, it also resulted in a decline in the number of graduate students proceeding to the doctorate.

In effect, the historians who became qualified during this period without being able to secure professional employment constitute a generation of scholars that may be in the process of being lost, casualties of abrupt transition. There is no reason to expect that the demographic and economic trends that so sharply reversed their professional expectations will alter before the end of the century, and this projection raises certain quite obvious possibilities regarding the diversity and renewal of the profession.

Fast forward to 1994. Norman Cantor was gearing up for his fourth year of professional besiegement after the release of Inventing the Middle Ages, a book for non-academic readers in which he sought to show how the formative experiences of certain 20th-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” and noted, in passing, the crummy academic job market and the prevalence of certain “alt-ac” career paths even then:

Within academia a fearful conservative conformity prevails. The marginal employment situation has a twofold negative impact. First, it discourages innovative minds and rebellious personalities from entering doctoral programs in the humanities. People in their late twenties and thirties today with the highest potential to be great medievalists and bridge academic medieval studies and popular medievalism are a phantom army, a lost generation. Instead, for the most part, of climbing the ladder at leading universities they are pursuing careers (often regretfully and unhappily if well-paid) in major law firms.

Second, even if imaginative people take Ph.D.’s in medieval disciplines, they face the job market and particularly once they get a prized tenure track post they encounter a chilling intellectual conservatism that frustrates expressions of their best thoughts and deepest feelings.

I like Cantor’s claim that academia is literally conservative. After all, people are still fretting over problems that he and Morrison noticed decades ago. It’s September 2014, yet Rebecca Schuman at Slate can still write: “The academic job market works on a fixed cycle, and according to a set of conventions so rigid that you’d think these people were applying for top-secret security clearances, not to teach Physics 101 to some pimply bros in Sheboygan.”

The early blogosphere was rife with humanities grad students and adjuncts wavering between disgruntlement and despair; the much-praised Invisible Adjunct rose up to unite them in discussions so civil that I can scarcely believe I saw them on the Internet.

As someone who writes about people who use the imagined past to carve out identities, argue from authority, resist mainstream culture, or seek respite from the real world, I think I understand why the number of new students in arts and humanities doctoral programs grew by 7.7 percent in 2012, but I can’t claim a moment’s nostalgia for the geeky excitement they surely must feel. Morrison and Cantor both imagined a lost generation, but their jobless contemporaries were merely wandering. For this next generation, that luxury is long gone—as is the prospect of claiming that nobody warned them.

“Empty-handed on the cold wind to Valhalla…”

For all the violence the Vikings unleashed, their enemies and victims might find cold comfort in the torments Americans now inflict on them. We’ve twisted them into beloved ancestors, corny mascots, symbolic immigrants, religious touchstones, comic relief—and, this week, proponents of gender equity on the battlefield. The medieval past is grotesque, uninviting, and indifferent to our hopes. We wish so badly that it weren’t.

“Shieldmaidens are not a myth!” trumpeted a Tor.com blog post on Tuesday, sharing tidings of endless Éowyns in the EZ-Pass lane to the Bifröst:

By studying osteological signs of gender within the bones themselves, researchers discovered that approximately half of the remains were actually female warriors, given a proper burial with their weapons . . . It’s been so easy for people to envision women’s historical contributions as solely getting married and dying in childbirth, but you can’t argue with numbers—and fifty/fifty is pretty damn good.

Great Odin’s ophthalmologist! Holy hopping Hávamál! Half of all Viking warriors were women?

Alas, no. “Researchers discovered” nothing of the sort—but that didn’t stop wishful linkers from sharing the “news” hundreds of times via Twitter and countless times on Facebook.

So what’s going on here? Besides conflating “Viking” with “Norse,” the pseudonymous author of the Tor.com blog post misread a two-year-old USA Today summary of a 2011 article by scholar Shane McLeod, who most definitely has not delivered forsaken warrior maidens from their long-neglected graves. No, McLeod simply did the un-newsworthy work of reassessing burial evidence for the settlement of Norse women in eastern England in the late 800s, with nary a Brunhilde or Éowyn in sight.

You can find “Warriors and women: Norse migrants to eastern England up to 900 AD” in the August 2011 issue of the journal Early Medieval Europe. If you don’t have institutional access to scholarly databases, the article is imprisoned behind a $35 paywall, which is a shame, because although McLeod’s piece requires a slow, patient read, you don’t need expertise in ninth-century English history or modern osteology to understand it—just the ability to follow an argument about a couple dozen skeletons in a tiny corner of England at a very specific time in history, plus an openness to the possibility that McLeod hasn’t brought your “Game of Thrones” fantasies to life.

Here’s the gist of McLeod’s article, as concisely as I can retell it:

Focusing only on the area of eastern England occupied by the Norse in the 800s, he looks at one sample of six or seven burials from five locations dating from 865 to 878 A.D. where scholars had made assumptions about the sex of the dead based on the stuff buried with them. He compares them to a second sample: 14 burials from five sites (dating from 873 to the early 10th century) where osteologists determined the sex of the dead by examining their bones.

In the first group, only one person was tagged as female. In the second group, between four and six of the dead, perhaps half of the sample, were found to be female, even though, based on grave goods, at least one of them might previously have been assumed to be male: one of those women was buried with a sword. (Ah, but that woman was also interred with a child of indeterminate sex. What if the sword belonged to her young son? And look: someone in the first group who might have been a woman was buried with a sword, too…)

McLeod’s assessment is this: If we scientifically determine the sex of the dead based on their bones rather than assuming their sex based on grave goods, we find more evidence (to pile atop existing evidence from jewelry finds) that Norse women came to England with Norse armies, earlier and in greater numbers than previously thought, rather than in a later wave of migration and settlement. Perhaps the men weren’t “a demobbed Norse army seeking Anglo-Saxon wives,” but intermarried with local women in smaller numbers than historians previously believed.

For the lay reader, that’s a disheartening hoard of unsexy conclusions—and a far cry from the Tor.com blogger’s claim, mindlessly brayed across social media, that “Half of the Warriors were Female.” It’s fantasy, not scholarship, and certainly not science, to interpret one woman buried with a sword, maybe two, as evidence for Norse women in combat.
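
To put a number on how little a sample that small can prove, here is a minimal sketch in Python (my own illustration, not anything in McLeod’s article) that runs the rough counts from the summary above, say five females out of 14 osteologically sexed burials, through a standard Wilson confidence interval for a proportion:

```python
import math

def wilson_interval(k, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% when z=1.96)."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    halfwidth = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - halfwidth, center + halfwidth

# Illustrative numbers only: the summary above suggests roughly four to
# six females among 14 osteologically sexed burials; take five as a
# point estimate.
low, high = wilson_interval(5, 14)
print(f"5 of 14 female: 95% interval spans {low:.0%} to {high:.0%}")
# -> roughly 16% to 61%. An interval that wide is compatible with
#    "a sizable minority" and with "half," which is exactly why no
#    honest reading of 14 graves can announce that half of all
#    Viking warriors were women.
```

Fourteen skeletons simply cannot settle a population-level question, which is why McLeod hedges and the headline writers should have, too.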

Shane McLeod deserves better. Working with limited data pried out of ninth-century crevices, he recognizes that his sample size is tiny, that it’s tough to identify burials as “Norse” for sure, and that his findings are only “highly suggestive.” He’s precise, tentative, and conscious of counter-arguments, and he seems willing to go wherever the evidence takes him. His biggest accomplishment, however, is highlighting a major scholarly error: experts who made assumptions about male versus female grave goods failed to reassess the biases they were projecting backwards onto the Middle Ages—even though questioning such biases is precisely what even the most pop-minded academic medievalists claim distinguishes them from the duct-tape-sword-wielding masses.

Likewise, science-fiction fans are forever congratulating themselves for holding the right opinions on such subjects as evolution, but this time they lazily succumbed to fannish fantasies, failing to question a claim that deserved to be pummeled by doubt. I’ve done tons of social-media copywriting, so I get why that blogger just wanted to throw something out there after a holiday to beguile weekend-weary eyeballs—but come on.

Science doesn’t always tell us what we want to hear. Truth demands nuanced consideration of evidence, and reason demands skepticism, neither of which flourishes on social media—so if you shared or re-tweeted the Tor article, congratulations! This week, in the name of medievalism, you made the world stupider.

[2019 update: Research into this subject has developed quite a bit since 2014, but I’m keeping this post online because it’s still a good example of how careful academic research gets turned into misleading clickbait. Feel free to leave links to updated scholarly research in the comments for future readers who find this post via Google.]