It seems that from the 1930s to the 1960s, Western movies dominated the American film industry, but by the 1980s they were almost non-existent. What caused the decline of the Western film?
by /u/debaser11 in /r/AskHistorians
Upvotes: 58