His spurs jangle with every step, his holster slung low as the red desert sun. A whistling wind sends tumbleweeds capering past listless horses. The saloon doors complain as they swing open, and a half-tuned piano falls suddenly silent. A glass of frontier whiskey slides down the bar at breakneck speed. An instant before it careens off the edge, the gunslinger’s hand flashes out to catch it – but the other never leaves the grip of his revolver. That’s the essence of the Wild West, or at least what we imagine it to have been. It was about freedom, masculinity, the idea of might making right, and the never-ending struggle for territory and possessions. John Wayne, Bud Spencer, Terence Hill, and Lucky Luke are a few of the dusty luminaries who come to mind. But when and where would one place the heyday of the real Wild West?
It’s tough to nail down precisely, but most folks point to the 19th century, when pioneers were pushing west of the Mississippi before many of today’s states had joined the Union. It was an era defined by a sense of unbridled adventure. European emigrants and former slaves, along with plenty of people soured by their side’s defeat in the Civil War, followed the sunset in search of new beginnings. Some of them dreamt of getting rich quick and soon caught gold fever. Besides prospecting and farming the land, raising cattle was an important part of the day’s economy, which is how the cowboy’s trade became synonymous with the Wild West. Since the law hadn’t yet found its way out to these uncharted territories, some parts proved fertile ground for all kinds of desperados. It’s been heavily romanticised in books and moving pictures, of course, but the West was also won through the brutal expulsion (and in some cases, outright extirpation) of the land’s native peoples.