Do American TV shows show the real America?

Singapore
May 20, 2007 10:13am CST
I've always watched American TV shows, and I've loved most of them. I learned a whole lot of English from them, and they showed me some of America's culture. I was just wondering: do most of these TV shows show the real America, or is it a bit of an exaggeration? Which TV show do you think doesn't really reflect real life, and which do you think actually shows what America is all about?
1 response
@tigerdragon (4297)
• Philippines
20 May 07
That bit of exaggeration mostly portrays the real America: always fighting for justice, but overly done. Most motion pictures show the Machiavellian principle that the end justifies the means. The environment being shown is somewhat reflective of the nation's youth: freedom of expression has become too much, there is too much independence and pride, and there is little sign of a family value system.