The Nature of Science Fiction
Science fiction has long been about what Man can accomplish when he sets himself, mentally and physically, to overcoming the challenges that Nature has in store for him. Whether it's conquering disease, discovering new worlds, or building devices to make life easier, science fiction should celebrate the triumph of the human will, and thus inspire real people to strive to make real changes in the real world.
In recent decades, much of science fiction has become a dour cautionary tale, dwelling on dystopian visions of the future that neither uplift nor inspire. And I think this has had repercussions in the real world, as fear and superstition have replaced hope and aspiration.
And this is the fault of the writers and directors of science fiction. By focusing on nonsense themes like the zombie apocalypse and a future ruled by evil robot overlords, they are instilling in their audience the idea that the future isn't worth striving towards. Such stories crush hope, and turn people away from wanting to take part in a world where such ideas could become possibilities.
So if there has been a drop-off in the popularity of science fiction, the genre has no one to blame but itself. Flash Gordon urged kids to want to go to the moon and do the other things, and they did; now their eyes have been turned inward, and they don't even want to leave their own rooms, choosing instead to exist in a virtual world that makes The Matrix seem eerily prescient.
Who will come up with the medical breakthroughs and the stunning new technologies, if nobody feels it's worthwhile anymore? The stories of heroes who save people and advance society have been replaced by anti-heroes who struggle in futility against overwhelming forces unleashed by technology run amok.
It may be interesting to note that liberals, who *say* they're forward-looking and up with people, are the ones who produce the majority of the dystopian apocalyptic views of the future, while the stodgy old curmudgeonly conservatives give us films that we can actually feel good about watching. Which, of course, then get dismissed by the overwhelming majority of liberal-minded people in Hollywood.
Why is this? Why is it that those who support Big Government in their private lives turn around and tell us stories about how Big Government will rule the world with an iron fist? I realize that the totalitarian regimes so often found in science fiction are implied to be Republican, or at least right-wing. But who are the proponents of the ideologies which, if taken to their natural conclusion, will result in the breakdown of society?
Can it be that the genre was so desperate to be Taken Seriously that it mutated from something we like to watch into something that goes down like a dose of castor oil? Science fiction stories don't win Oscars. They may make buckets of ducats at the box office, but they usually don't win the sort of ephemeral critical acclaim that the inflated egos of Hollywood types crave. Serious films win Oscars, films about death and disease and dysfunction, films that make you feel like you need a bath after you watch them just to get all the ickiness off. A film about a mixed-race lesbian paraplegic who is raped by her father and then kills her baby because she hears voices coming from her dog would send the film critics into transports of bliss. If the woman is then condemned to death by an all-male jury, it would be a shoo-in for Best Picture.
So is it any wonder, then, that the genre changed over from upbeat films about defeating the bad guys and saving the world, to gloomfests that ramble incoherently for an hour or two before everybody dies? And is it any coincidence that this change-over happened about the same time as the hippies started gaining control of society?
I think they watched way too many Ingmar Bergman films in college.