Think of your favorite sitcom or television show. These shows usually present the audience with a "typical" family in which the mom is skinny and beautiful and the husband is not especially intelligent or attractive (The Simpsons, Grounded for Life, The King of Queens, Fresh Prince, Everybody Loves Raymond, The George Lopez Show, etc.). And what about "reality" television like Jersey Shore or Tough Love? There, women are dramatic, fake, and interested only in how they look and in finding a guy to make them happy. In real life, women are intelligent and no different from their male counterparts; they seriously aren't as superficial as they seem on screen. In my opinion, gender is a social construct. The only difference between a man and a woman is their reproductive systems; other than that, both are capable of anything they set their minds to. Women have come a long way toward gaining equal rights and respect from the other gender, but they are still being oppressed, and the media and the arts are not helping the matter.
My question to you is this: women have such a negative image of themselves because of what they see in the media, and even though they have come a long way toward equality, they are still not completely there. What must be done to change how the media and the arts present women in order for true equality between the sexes to take place, and will this ever happen?
[Image missing from post] Does this not scream gang rape?
