“The story is old, I know, but it goes on…”

With its mix of sunshine and harmless bluster, September brings back-to-school nostalgia—ivy-covered professors, that first fall riot, scoldings for being insufficiently euphoric over sports—and perhaps that’s why the past two weeks have swirled with stories about the woes of humanities types in academia. I’ve watched would-be scholars expire en route to the ferne halwe of full professorhood for 20 years, so I’m guessing that many grad students and adjuncts have newly discerned, with the sort of creeping, pitiless dread otherwise confined to Robert E. Howard stories, that they won’t find long-term employment.

First, at the Atlantic, Jordan Weissmann asked why the number of grad students in the humanities is growing. Then, Slate ran a piece about the awkwardness that still hangs about people with doctorates in the humanities who land “alt-ac” careers—that is, jobs where they don’t teach college. Apparently, though, there aren’t enough such lucky people, because a few days later, Salon covered adjunct professors on food stamps.

With all the attention this subject now gets in the press, I can only hope that fewer souls will fling themselves into the hellmouth—but maybe academia shouldn’t have undone quite so many in the first place. While reading about medievalism in recent days, I found two historians who sensed where things were headed long ago.

The first was Karl F. Morrison, who wrote “Fragmentation and Unity in ‘American Medievalism,’” a chapter in The Past Before Us, a 1980 report commissioned by the American Historical Association to explain the work of American historians to their colleagues in other countries. Morrison writes candidly about his field, but he also makes an especially prescient extrapolation, which I’ve bolded:

There was also an expectation in the “guild” that investment in professional training would, in due course, fetch a return in professional opportunity.

By 1970, these benefits could no longer be taken for granted. By 1974, even the president of Harvard University was constrained to deliver a budget of marked austerity, reducing “the number of Assistant Professors substantially while cutting the size of the graduate student body below the minimum desirable levels.” The aggregate result of many such budgets across the country was a sharp reduction in the number of professional openings for medievalists, and an impairment of library acquisitions and other facilities in aid of research. Awareness of this changed climate impelled a large number of advanced students to complete their doctoral dissertations quickly, producing a bulge that is noticeable around 1972-1974 in our tables. For many reasons, including the deliberate reduction or suspension of programs in some universities, it also resulted in a decline in the number of graduate students proceeding to the doctorate.

In effect, the historians who became qualified during this period without being able to secure professional employment constitute a generation of scholars that may be in the process of being lost, casualties of abrupt transition. There is no reason to expect that the demographic and economic trends that so sharply reversed their professional expectations will alter before the end of the century, and this projection raises certain quite obvious possibilities regarding the diversity and renewal of the profession.

Fast forward to 1994. Norman Cantor was gearing up for his fourth year of professional besiegement after the release of Inventing the Middle Ages, a book for non-academic readers in which he sought to show how the formative experiences of certain 20th-century medievalists explained the ways they interpreted history. Fellow historians didn’t like his blunt biographical approach—and so in “Medievalism and the Middle Ages,” a little-read article in The Year’s Work in Medievalism, Cantor hammered back at “establishment dust-grinders” and noted, in passing, the crummy academic job market and the prevalence of certain “alt-ac” career paths even then:

Within academia a fearful conservative conformity prevails. The marginal employment situation has a twofold negative impact. First, it discourages innovative minds and rebellious personalities from entering doctoral programs in the humanities. People in their late twenties and thirties today with the highest potential to be great medievalists and bridge academic medieval studies and popular medievalism are a phantom army, a lost generation. Instead, for the most part, of climbing the ladder at leading universities they are pursuing careers (often regretfully and unhappily if well-paid) in major law firms.

Second, even if imaginative people take Ph.D.’s in medieval disciplines, they face the job market and particularly once they get a prized tenure track post they encounter a chilling intellectual conservatism that frustrates expressions of their best thoughts and deepest feelings.

I like Cantor’s claim that academia is literally conservative. After all, people are still fretting over problems that he and Morrison noticed decades ago. It’s September 2014, yet Rebecca Schuman at Slate can still write: “The academic job market works on a fixed cycle, and according to a set of conventions so rigid that you’d think these people were applying for top-secret security clearances, not to teach Physics 101 to some pimply bros in Sheboygan.”

The early blogosphere was rife with humanities grad students and adjuncts wavering between disgruntlement and despair; the much-praised Invisible Adjunct rose up to unite them in discussions so civil that I can scarcely believe I saw them on the Internet.

As someone who writes about people who use the imagined past to carve out identities, argue from authority, resist mainstream culture, or seek respite from the real world, I think I understand why the number of new students in arts and humanities doctoral programs grew by 7.7 percent in 2012, but I can’t claim a moment’s nostalgia for the geeky excitement they surely must feel. Morrison and Cantor both imagined a lost generation, but their jobless contemporaries were merely wandering. For this next generation, that luxury is long gone—as is the prospect of claiming that nobody warned them.

“A legacy of romance from a twilight world…”

Last night, when the U.S. began walloping ISIS militants in Syria, our jets also hit the Khorasan group, hardcore Al-Qaeda veterans who are reportedly expert bomb-makers. When I first heard the news on the radio in my car, I wondered why Al-Qaeda had a group called “Corazón”—some Spanish-speaking faction, perhaps?—but then I realized I’d already written about the original Khurasanis. They were the muscle behind the Abbasids: the third Islamic caliphate, the dynasty associated with Baghdad’s founding and golden age, and the contemporaries of the Carolingians.

The fourth chapter of Becoming Charlemagne takes readers on a tour of Baghdad around the year 798:

In the ritzy Harbiya suburb of northwest Baghdad, the families of soldiers started each day with expectant prayers. In summer, they awoke in their cool basement apartments, or on their rooftops within sight of the Round City, where they greeted the dome at the hub of their city.

As a boy, the current caliph, Harun, had led their fathers and husbands to the frontiers against the Rum, the so-called Romans of Constantinople, the ones whom poets called al-asfar, “the yellow ones.” More recently, they had been paid to quash local rebellions, commanding armies in the service of the caliph. In a caliphate that stretched from northern Africa to India, there was a constant market for well-armed men. Praised by their contemporaries in story and song, these generals rarely lacked for work.

The comfortable estates of Harbiya had been built on that same military might. Only a few generations earlier, these soldiers had stormed out of the eastern province of Khurasan, bringing to power the descendants of al-Abbas, the uncle of the Prophet Muhammad, in a show of force and a flurry of black banners. The Abbasid caliphs had rewarded the Khurasanis with desirable land and jobs for their children, who now commanded the palace guard and ran the police force.

As this blog has long shown, Europeans and Americans love to dress up in medieval costumes, follow pseudo-medieval soap operas on television, construct medieval-ish buildings, and otherwise evoke or re-create the Middle Ages, sometimes to spurn the modern world, more often to carve out a place in it, whether individually or in groups. With their choice of name, the Khorasan nutjobs are heeding that same inexhaustible impulse. I can respond only by marching out one of my favorite observations from scholar Tom Shippey: “There are . . . many medievalisms in the world, and some of them are as safe as William Morris wallpaper: but not all of them.”

“Holding their own, last orders commanding attention…”

Some of us are so busy spotting medievalism in the modern world that we sometimes need to stop and notice the moments when its absence is literally remarkable.

Three days before Thursday’s referendum on Scottish independence, the Wall Street Journal ran a curious piece by foreign affairs writer Bret Stephens, who harks back to the 1919 Paris Peace Conference. Stephens suggests that the Wilsonian emphasis on national self-determination backfired, leading people around the world to the perilous realization that “nations are almost endlessly divisible into smaller entities.” Wilson and his advisers (some of whom were medieval historians) did get it wrong when they cobbled together a doomed Yugoslavia, but Stephens believes that when smaller countries go it alone, they may become dangerous, poor, corrupt, or insignificant.

The point is interesting and debatable—but Stephens’ conclusion is inarguably weird:

Some Scots may imagine that by voting “Yes” they are redeeming the memory of William Wallace. Maybe. The other way of looking at it is as a vote for medievalism over modernity.

Memo to wannabe Bravehearts: The 13th century wasn’t all that fun.

“Medievalism over modernity”! That might seem like a fair way to talk about a referendum that was almost slated for June 24, the 700th anniversary of the battle of Bannockburn, a key moment in the medieval fight for Scottish independence.

The thing is, I followed the news surrounding the referendum, which was actually held on an otherwise unimportant date in Scottish history. I browsed the “Yes” websites and sat through the videos. Knowing that European nationalists love to dig up and reanimate their shambling medieval ancestors—benignly in countries like Finland, malevolently in places like Germany and Serbia—I kept an eye out for William Wallace, Robert the Bruce, and other heroes hauled from the pages of Sir Walter Scott.

I saw economic arguments, anti-nuke and anti-English rhetoric, sentimental appeals to independence, and other pleas—but outside of news articles reporting on Scotland’s history of pre-1707 independence, I saw nary a trace of sword-wielding medieval warriors. I don’t doubt that in recent weeks, somebody decked out in costume and kit spoke glowingly of Scotland’s medieval glory, and I hope readers will send me examples—but overwhelmingly, the “Yes” side rooted its arguments not in some politicized dreamland of castles and kings, but in the here and now.

The press has been keen to emphasize that separatist movements in Catalonia, the Basque region, Flanders, the Crimea, and even Venice were watching to see what the Scots would decide. I assume there was interest in Wales and Cornwall as well. I suppose it’s possible that these movements will conclude that the “Yes” campaigners failed because the Scots didn’t sufficiently use their medieval heritage to inflame nationalistic pride—but if so, that won’t be Scotland’s fault.

As a distant, disinterested observer, I had no opinion on the outcome of the referendum except to note that the Scots set a worthy and decent precedent: asserting their identity and affirming their independence while keeping their medieval forefathers silent and snug in their graves.

“Mais nous pouvons faire ce que nous voulons…”

Because I’m monstrously busy, I figure it’s time to bring back some of the more literal monsters featured on this blog from 2009 to 2012. Every few weeks, I challenged myself to wander up to the National Cathedral, choose from among its myriad gargoyles and grotesques, and write a poem inspired by what I’d seen.

With the kind permission of the cathedral, I collected the resulting poems, 53 in all, in a 138-page paperback that you can order online, buy at the cathedral gift shop, or purchase from me via email. (You can browse the first drafts of 51 of the poems here.)

Written in March 2011, “Apologia” was certainly one of the weirder poems, inspired as it was by the indifference of a snake to the shock and hopelessness of his prey. I almost put a poem in the mouth of the rabbit, but then I attended an exhibition of medieval reliquaries in Baltimore and jotted down this note: “snake an antiquarian with a fascination for the Anglo-Saxons, attempting to explain to the rabbit the weird, mythologized larger purpose for eating him.”

The resulting poem is full of New Old English, but my hope is that even people who don’t get a word of it will read it aloud and find that it falls familiar on the tongue.

APOLOGIA

Heo cwaeð: “Seo naedre bepaehte me ond ic aett.”
—Gen. 3:13 (British Library MS Cotton Claudius B.iv)

We rede the Saxons sympathised with snakes:
On broach and bract they turve and intertwine
But buckle when modernity awakes;
All laud the wyrm who weaves a wulfish vine.

In retsel-books and wrixled words we find
The Saxons, ever lacertine, bestirred
To grammar-craft, whose duple pronouns bind;
So sundered lives were woven with a word.

(A scene: Some god-forsook Northumbrish monk,
Emboldened by an asp to double think,
Professes wit and unk and unker-unk,
But shrinks from git and ink and inker-ink.)

Now I, who raveled precedent relate,
Propose that we be litchwise intertraced;
The wulf and adder gleam on plink and plait,
Yet no immortal lepus ever graced

The lapidated latch of art divine,
So spurn your sallow scrafe, forget the sun.
For you the relic, I the blessid shrine;
In wit and work alike, we two are one.

“Empty-handed on the cold wind to Valhalla…”

For all the violence the Vikings unleashed, their enemies and victims might find cold comfort in the torments Americans now inflict on them. We’ve twisted them into beloved ancestors, corny mascots, symbolic immigrants, religious touchstones, comic relief—and, this week, proponents of gender equity on the battlefield. The medieval past is grotesque, uninviting, and indifferent to our hopes. We wish so badly that it weren’t.

“Shieldmaidens are not a myth!” trumpeted a Tor.com blog post on Tuesday, sharing tidings of endless Éowyns in the EZ-Pass lane to the Bifröst:

“By studying osteological signs of gender within the bones themselves, researchers discovered that approximately half of the remains were actually female warriors, given a proper burial with their weapons . . . It’s been so difficult for people to envision women’s historical contributions as solely getting married and dying in childbirth, but you can’t argue with numbers—and fifty/fifty is pretty damn good.”

Great Odin’s ophthalmologist! Holy hopping Hávamál! Half of all Viking warriors were women?

Alas, no. “Researchers discovered” nothing of the sort—but that didn’t stop wishful linkers from sharing the “news” hundreds of times via Twitter and countless times on Facebook.

So what’s going on here? Besides conflating “Viking” with “Norse,” the pseudonymous author of the Tor.com blog post misread a two-year-old USA Today summary of a 2011 article by scholar Shane McLeod, who most definitely has not delivered forsaken warrior maidens from their long-neglected graves. No, McLeod simply did the un-newsworthy work of reassessing burial evidence for the settlement of Norse women in eastern England in the late 800s, with nary a Brunhilde or Éowyn in sight.

You can find “Warriors and women: the sex ratio of Norse migrants to eastern England up to 900 AD” in the August 2011 issue of the journal Early Medieval Europe. If you don’t have institutional access to scholarly databases, the article is imprisoned behind a $35 paywall, which is a shame, because although McLeod’s piece requires a slow, patient read, you don’t need expertise in ninth-century English history or modern osteology to understand it—just the ability to follow an argument about a couple dozen skeletons in a tiny corner of England at a very specific time in history, plus an openness to the possibility that McLeod hasn’t brought your “Game of Thrones” fantasies to life.

Here’s the gist of McLeod’s article, as concisely as I can retell it:

Focusing only on the area of eastern England occupied by the Norse in the 800s, he looks at one sample of six or seven burials from five locations dating from 865 to 878 A.D. where scholars had made assumptions about the sex of the dead based on the stuff buried with them. He compares them to a second sample: 14 burials from five sites (dating from 873 to the early 10th century) where osteologists determined the sex of the dead by examining their bones.

In the first group, only one person was tagged as female. In the second group, between four and six of the dead, perhaps half of the sample, were found to be female, even though based on grave goods, at least one of them might previously have been assumed to be male, because one of those women was buried with a sword. (Ah, but that woman was also interred with a child of indeterminate sex. What if the sword belonged to her young son? And look: someone in the first group who might have been a woman was buried with a sword, too…)

McLeod’s assessment is this: If we scientifically determine the sex of the dead based on their bones rather than assume their sex based on grave goods, we find more evidence (to pile atop existing evidence from jewelry finds) that Norse women came to England with Norse armies, earlier and in greater numbers than previously thought, rather than in a later wave of migration and settlement. Perhaps the men weren’t “a demobbed Norse army seeking Anglo-Saxon wives,” but intermarried with local women in smaller numbers than historians previously believed.

For the lay reader, that’s a disheartening hoard of unsexy conclusions—and a far cry from the Tor.com blogger’s claim, mindlessly brayed across social media, that “Half of the Warriors were Female.” It’s fantasy, not scholarship, and certainly not science, to interpret one woman buried with a sword, maybe two, as evidence for Norse women in combat.

Shane McLeod deserves better. Working with limited data pried out of ninth-century crevices, he recognizes that his sample size is tiny, that it’s tough to identify burials as “Norse” for sure, and that his findings are only “highly suggestive.” He’s precise, tentative, and conscious of counter-arguments, and he seems willing to go wherever the evidence takes him. His biggest accomplishment, however, is highlighting a major scholarly error. Experts who made assumptions about male versus female grave goods failed to reassess the biases they projected backwards onto the Middle Ages—even though that sort of self-scrutiny is one of the habits even the most pop-minded academic medievalists claim distinguishes them from the duct-tape-sword-wielding masses.

Likewise, science-fiction fans are forever congratulating themselves for holding the right opinions on such subjects as evolution, but this time they lazily succumbed to fannish fantasies, failing to question a claim that deserved to be pummeled by doubt. I’ve done tons of social-media copywriting, so I get why that blogger just wanted to throw something out there after a holiday to beguile weekend-weary eyeballs—but come on.

Science doesn’t always tell us what we want to hear. Truth demands nuanced consideration of evidence, and reason demands skepticism, neither of which flourishes on social media—so if you shared or re-tweeted the Tor article, congratulations! This week, in the name of medievalism, you made the world stupider.

[2019 update: Research into this subject has developed quite a bit since 2014, but I’m keeping this post online because it’s still a good example of how careful academic research gets turned into misleading clickbait. Feel free to leave links to updated scholarly research in the comments for future readers who find this post via Google.]