Utopia: the perfect world, where the sun shines daily, people are equal, and no one lacks for anything. While we realize this is probably never going to happen, since each side of every issue has difficulty finding a happy medium, we trudge through our daily lives, going to and from work, schlepping our children from activity to activity, and paying those nasty bills instead of finding Utopia. Okay, so our lives aren't as awful as I just made them seem, but I am getting to my point. We don't live in Utopia, but we don't live in a Dystopia either.
For those unfamiliar with the term, think The Hunger Games, think The Walking Dead. A dystopia is the opposite of a Utopia: a world we wouldn't want to live in, something frightening and unfamiliar, born of some event, biological or otherwise, that leads to a cataclysmic decline in society. These stories are filled with the dehumanization of the individual, totalitarian governments, and lawlessness. So why are these television shows, movies, and books so popular?
We are huge fans of The Hunger Games and The Walking Dead in our household, both classic examples of the dystopian society. In one story line we see the catastrophic spread of a disease that turns the human race into flesh-eating zombies, leaving a society whose only purpose is to run from the zombie monsters and survive.
In The Hunger Games, we're entrenched in a world recovering from a failed revolt by the less fortunate. Citizens living in the outer districts are controlled through The Hunger Games themselves, a world where the government keeps the masses in line by sending children 12 to 18 years of age to the spectacle, where they must fight to the death. What society would allow this to happen? How does it get to this? Again, an impossible world we can only imagine, one that is terrifying and decidedly not where we would want to live.
So again, the question is: why? Why are we so interested in these horrifying societies, unreal and yet able to hold our fascination? I've said it before, but I think it's easier to solve our real-world problems in a world devoid of the rules we know and understand, where we can feel that justice is served because we can make our own rules as needed in impossible situations. Shooting an arrow through the brain of a zombie solves the problem neatly and cleanly.
I think that for us readers and viewers it's a glimpse into something far more fantastic than our own lives, and in a way that's a little scary. We need something so unbelievable, so frightening, so awful to grab our attention and thrill us. Or maybe the questions these stories pose allow us to think about the consequences of our actions, topics and situations that give us a reason to discuss and conclude something about our own lives. Or maybe it's as simple as hope: the belief that things can get better if we work hard, think it through, and fight for what we believe in. Trust in the people closest to us and care for them through the impossible.
In The Hunger Games, Katniss's simple gesture to honor a fallen child is turned into a symbol of hope, and she uses it to protect those she loves as she's propelled to the front of the rebellion. We cling to that hope as we watch the rebellion move forward, and we cheer when she makes the right choice and earns her freedom and the freedom of those she loves.
Hope is different in The Walking Dead, because there isn't a cure for the zombie disease. It's about finding a stable environment in which to make a life, a place to be safe. Ironically, it's the prison. After the group stumbles into the now unused prison, where they can defend their position and stay out of the way of the zombie hordes, it becomes a place worth protecting and fighting for, a chance to escape destruction and the loss of more of their new-found family. It gives them hope.
Or maybe I'm overthinking it, and it's simply a bit of everything wrapped in a warped and wonderful visual experience.