The dawn of the twentieth century saw denim moving out of factories and work sites and into Americans' personal culture. This marked the end of the Wild West era: the western United States was no longer a rugged frontier but a developed, civilized region. Industrialization made wagons and horseback travel obsolete, but that did not stop Hollywood from glorifying the old West.

The Great Train Robbery, a ten-minute short released in 1903, was the very first cowboy movie. It wasn't until the 1930s that Westerns truly ignited Americans' interest, an appeal personified by John Wayne with a pistol in hand, a Stetson hat, and a pair of dirty denim jeans. In response to these popular motion pictures, denim became a symbol of individualism and a life of independence.

Denim brands caught on to this and focused their advertising on the popular image of the cowboy. At the time, denim was sold only west of the Mississippi River, and tourists flocked to the dude ranches of the West. Not only did they experience the "cowboy" lifestyle there, but they also purchased jeans to bring home and show off to their friends.