Should we have diversity in America?

By Amy
Abernathy, Texas
August 28, 2007 9:20pm CST
Is diversity important in America? Diversity is the idea of accepting different ethnicities, backgrounds, and/or cultures into the daily American lifestyle. Diversity also consists of being equally fair to women and minorities.
3 responses
@Zorrogirl (1503)
• South Africa
29 Aug 07
I thought the USA already did that. I idolise America. If only my country got some sense and operated like them. If I had the opportunity to move there, I would.
@cynddvs (2950)
• United States
29 Aug 07
I think diversity is really important to Americans. I mean, after all, there is no true American race besides the Native Americans. We all come from different cultures, backgrounds, races, and religions. Our ancestors all came from overseas somewhere. I think America, or anywhere for that matter, would be a very dull place if everyone were the same.
• United States
29 Aug 07
We already do. By the fact that we were born from a hodgepodge of different people from different places with different backgrounds, we are inherently diverse. We could, however, be more open to more kinds of diversity. I think all groups being equally respected, if not equally represented, is key to an enriching American culture. American culture is mostly the sum of the cultural differences each person contributes to society.