I didn't set out to write a western. That's one of the first things most people learn about my novel, Black River, though: it's a western. It says so in my publisher's promotional materials, in the first lines of most reviews, and in many reader comments left on sites like Goodreads. But when I was writing the novel, I didn't think of it as a western; I thought of it as a book about the kinds of places and people I've spent most of my life around. It just turns out that some of those places are in Montana and some of those people wear cowboy boots.
The designation of some American fiction as "regional" has always struck me as a bit misleading. It's a term that seems disproportionately applied to works by authors living in either the South or the West, though surely fiction set in the Midwest or Northeast or Mid-Atlantic could just as easily be considered regional. These places have their own histories, their own traditions, their own landscapes that shape those who live in them. That said, it's true that the West holds a special place in the American imagination. The West is the land of wide-open spaces, frontier spirit, opportunity. Of course, in 2015, Wal-Marts dot the landscape, cell towers watch over grazing cattle, and while there's still plenty of open space, little of it feels as isolated as it once did. So why write western fiction? Why read it?
In recent years, quite a few literary westerns, both historical and contemporary, have been published and widely read. At first glance, historical westerns can seem particularly puzzling -- isn't the genre glutted? After hundreds of cowboys-and-Indians movies and novels, is there anything left to say? -- but the very popularity of "old-fashioned" westerns in the past helps explain the hunger for new takes on the historical West. If you're of a certain age and grew up in the United States, you've almost certainly been inundated with images and stories of the "Wild West," many of them inaccurate at best and damaging at worst. Many writers and readers today want to revisit the settling and evolution of the American West from the perspectives of those who were denied a voice in the first tellings. The chance to break free of stereotypes and explore the history and mythology of the American West in a broader, more nuanced fashion has wide appeal.
So what about more contemporary stories? Is it helpful to designate fiction set in the American West as "western"? Even though I find myself a bit resistant to the "western author" label, I have to concede that life in the West is different from life in other parts of the country. When I visited New York's American Museum of Natural History last year, I realized I'd seen many of the animals in the Hall of North American Mammals alive in the wild, including bears, cougars, elk, and moose. While I am not and do not expect to ever be a gun owner, I attended a college where the dorms had gun lockers, and it wasn't unusual to see students carrying rifles through campus parking lots on their way to a weekend hunting trip. Even in the city of more than 200,000 where I live now, one of the most contentious issues you could raise is the management of wolves. (The city is plastered with billboards from both anti- and pro-wolf groups. The former's billboards strongly imply that wolves are actively plotting to eat your sweet Golden Retriever and adorable young child, while the latter's ignore the concerns of livestock owners in favor of assuring you that lightning, all-terrain vehicles, and elevators will kill you long before wolves will.)
Of course, the realities of the contemporary West are inextricably linked to the mythologies of the historical West. Those wolves? To some, they're emblematic of the wild American West that people eradicated by settling here and changing the landscape; to others, they're a threat to the traditional western lifestyle of raising livestock and earning a living from the land. A frontier mindset still influences many Westerners. Sometimes it's a positive thing, reflected in a strong work ethic and a healthy self-sufficiency, but it also has a darker side and can, in a few cases, shade into separatism and extremism. (The latter is something I'm exploring in the novel I'm writing now.)
In any case, it's in these clashes between the expectations of the West -- those held by Westerners and outsiders alike -- and the realities of the West that stories are found. As long as there are people living in the West, shaped by its landscape, steeped in its history, there will be stories to tell about them. Why write fiction about the West? Because it's part of the American experience. Because it's part of the human experience. Because fiction is at its best when it tells the stories of as many people and places as possible.