Southern Dad
VIP Member
- Feb 15, 2015
Insurance is not free and insurance companies make record profits
In the United States, we can shop for our health insurance and choose which carrier we wish to use. Health insurance is subsidized by the government for many people. Yes, insurance companies make profits.