Reality TV is ruining America!
I hate reality television. I don't think anything good comes out of it. The main reason I don't like reality television is that I have no desire to watch people get bitched at by some quote-unquote judge who, to my knowledge, has no real credentials.
Moving on. My main beef is with shows like "The Hills" and "Paris Hilton's New BFF."
These are completely absurd. I was actually exposed to these shows, and after 47 seconds I wanted to gouge my eyes out. No, really, I had a fork in my hand. "Flavor of Love" and "New York... Whatever" are also some serious bullshit. I don't even want to get started on "The Real World." More like the not-so-real world.
The producers intentionally pick the most insane and irrational people they can find, which in turn leads to inevitable drama between them. The American public is powerless against this LETHAL COMBINATION.
Down to the main beef. The American public watches these shows and begins to assume and believe that this is the way people actually act, that this so-called reality is the correct way to behave in our society. Out of all the bullshit I have seen in my life, I have never had an experience like the ones I've seen on reality TV shows. At one time I was a homeless drug addict, so I think it's safe to say I have seen and experienced some serious shit. But I have never had to compete in some dumbass competition to keep my job. I have never had the pleasure of voting people I hate out of my work environment. Reality, my ass. Perhaps reality TV is ruining America.
Personally, I watch cartoons when I can.
Any thoughts?