With all the talk on TV it seems like most people are against healthcare reform. Where I live, most people don't have insurance and would love to. Is it just that most people aren't thinking that far ahead? I know a lot of bad stories: doctors around here won't see you unless you have insurance, and most won't even take cash anymore.

Plus, with what I've seen from businesses lately, it's a sad situation. I've known people who were fired because they got cancer. Most people argue that can't happen, but it does. All a company has to do is say the job has been eliminated and then hire someone new to do the same work under a different title. How would you feel having cancer, no job, and no way to get insurance?

Why can't our government look at healthcare around the world and form a plan based on what works best, instead of some bad idea like handing all the money to insurance companies? Insurance is just a plan in case something happens, so wouldn't it be cheaper to pay for the people who are actually sick instead of everyone who might become sick? I think the state of healthcare in the US is sad and it needs to change. I don't want anyone dying from cancer to be sitting around worried that they can't get care. It's just wrong in my book. I don't see why we can't help people when they really need it.