The Walking Dead is a fairly new television show that came out back in 2010 and is now in its second season. The show follows police officer Rick Grimes as he leads a group of survivors through a zombie apocalypse. The show is extremely popular: it has been nominated for a Golden Globe, won 6 awards, and received 25 other nominations. The women in this series have little to no power, and what little power they do possess comes from the men giving them permission to carry guns. Besides being able to carry guns for their own protection, they are shown doing laundry, watching children, and making dinner. The series gives off the impression that women cannot survive without the protection of the male survivors in the group. The women are sometimes asked their opinions on certain topics, but at the end of the day it's the men who make the final decisions. Is this show trying to say that in an apocalyptic world women lose all of their rights? While I am entertained by zombies, I cannot agree with the message the show is sending to its viewers. I'd like to think that the feminist movement of the 70's has only grown stronger, and that in an apocalyptic world women would rise to the same level of power as men and be just as capable when it comes to surviving a zombie attack.
Mad Men has been on television a little longer, having come on the air back in 2007, and is now in its fifth season. Set back in the 50's/60's, the show is about a group of caucasian men who work in advertising and are at the top of their game. This show just oozes with sexism. Women are shown as sex objects rather than human beings, while the men have all the glory and power and answer to no one. Wives are stuck watching the children and taking care of the house while the men are at work flirting and having affairs with their secretaries and drinking with their friends. The wives' lives in this show are very sad. Women seem to have no opinion or voice whatsoever, and the men walk all over them with nothing they can do about it. Even though women's rights were very limited during the era the show is set in, I still believe this show is extremely offensive to women. Wives are shown having nightmares about their husbands divorcing them for other women, and they work hard to keep their husbands while ignoring their affairs.
These two shows are set during different time periods, one in the 50's and the other in modern times, yet both are sexist, with men having all the power and women having little to none. Both of these television series also air on the AMC channel. Could it be that this channel is trying to say that times haven't changed, that the feminist movement changed nothing, and that power will always remain in the hands of men? I strongly disagree with the message these two shows are sending to their constantly growing audiences and believe that something needs to change.
I agree that The Walking Dead is very sexist, but I wonder if it isn't intentional. The show may be trying to say that in the event of a zombie attack we may regress to primal times, when men were the protectors and women were the ones to look after the children and cook. (By the way, I love this show! Even if it is sexist :)) I have not seen Mad Men, but just the fact that it is set in the 50s/60s makes me think it would be hard pressed not to be sexist. Since this was a time before women had many rights, I'm not surprised that they are objectified in the show. I think there will always be some sexism on TV, but it's getting better. There are shows that are full of untainted "Grrl Power," and I think this change will continue into the future.
I wonder what would happen during an actual zombie attack? It's interesting to think that we might regress to ancient gender dynamics. I'm not saying that women are not capable of learning how to wield and use a gun. I'm just saying that before we evolved (some of us), warriors and hunters were traditionally men.