
It seems that from the 1930s to the 1960s, Western movies dominated the American film industry, but by the 1980s they were almost non-existent. What happened to cause the decline of the Western film?

by /u/debaser11 in /r/AskHistorians

Upvotes: 58


