"Over the past 200 years, women's clothes in the West have trended toward more revealing, while men's have remained more or less the same in terms of exposed skin." How true or false is this statement? And if true, what has been the impetus behind this trend?
Upvotes: 287