Improving health care in America...

United States
January 25, 2007 9:38am CST
Based on whatever experience your family may have had dealing with insurance and so on, what do you feel would benefit the nation as a whole when it comes to health care? Some countries don't even do the individual insurance thing: people pay taxes, and medical coverage comes through the government. What are your thoughts?