Texas has a prominent Western aesthetic, and that Western culture has been a part of Texas since the beginning. However, there is also a claim that Western aesthetics were promoted in Texas during the 1930s to push out the Confederate/Dixie/slaveholding aspects of the state's identity. Is this true?
Upvotes: 7