Is there a TV show or movie out there that ruined the fantasy or horror concept for you?
While watching an episode of HBO's "True Blood", a series I follow because I love vampire stories, I started to feel that the strong element of horror the vampire concept once held had somehow diminished for me.
In the True Blood universe, vampires live among ordinary humans, a metaphor for a minority population seeking acceptance and equal rights. Is there any movie or TV show that ruined a horror or fantasy concept for you? : )