How did teaching, nursing, and secretarial work become the only "proper" professions for women, at least in the US? Didn't teaching used to be a man's job? Was it part of the "women raise children" idea, which then extended to teaching them?
Upvotes: 2