Maltese Mom
New member
Companies that provide health insurance deduct what they pay as a business expense, and their employees don't pay taxes on that substantial benefit. That makes health care nearly free for those with employer-paid insurance. Why shouldn't your employer pay for your auto insurance, too? Why is this fair to those who have to pay their own medical bills? What kind of socialism is this?
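To put a rough number on that tax advantage, here's a quick back-of-the-envelope calculation. The premium amount and marginal tax rate are made-up illustration figures, not anything official:

```python
# Hypothetical figures: compare a $10,000 health benefit received as
# taxable wages vs. as an untaxed employer-paid premium.
premium = 10_000          # annual employer-paid premium (assumed)
marginal_rate = 0.25      # assumed combined marginal tax rate

# If paid out as wages instead, the employee keeps only the after-tax amount.
after_tax_wages = premium * (1 - marginal_rate)

# As an excluded benefit, the full value arrives tax-free, so the
# difference is the effective subsidy from the exclusion.
tax_advantage = premium - after_tax_wages

print(after_tax_wages)  # 7500.0
print(tax_advantage)    # 2500.0
```

Under those assumed numbers, the exclusion is worth $2,500 a year to that employee, which someone buying their own coverage with after-tax dollars doesn't get.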
Cogent arguments only, please.