I've always had a bit of a suspicion that Disney was kind of bad. I recently read Fast Food Nation, which discusses the origins of the company and presents Walt Disney himself as a somewhat questionable figure. But the last documentary we watched in class really brought a lot to light for me. What stood out most were the questions of racism and sexism. It's true that many of the stories have no minority characters at all, and on top of that, the bad guys, at least among the animal characters, often seem to be coded as racial minorities.
If that seems far-fetched, I think the overt sexism is not. The female characters are always incredibly vulnerable and rely heavily on male characters to get them out of trouble. Many of them don't have mothers, which I think adds to that vulnerability. As a child, I was always disappointed by the reenactments of the Disney Princesses I saw on TV, probably because real women are not proportioned the way the Princesses are! All of the female protagonists in Disney cartoons have large busts and teeny, tiny waists. It's not that any one of these things is particularly outrageous on its own. As Lou said, is it possible for anything to be entirely politically correct? Probably not. Still, it hardly seems like something worth promoting, and who wants to expose their children to such nonsense?
I guess my question is: will it ever stop? Even while I was watching the documentary, I kept thinking, "But I still really like Disney!" As long as Disney appeals to children, will there ever be an end? Will the reign of Disney ever cease, especially considering how wealthy the corporation is and how much power it holds through everything it owns?
A. Gorno