People are increasingly recognizing that insurance companies are no longer honest, legitimate businesses: they slowly ruined whatever little good reputation they had once they began resorting to all kinds of nefarious practices and excuses to avoid paying claims and pad their profits.
As a consequence, people are opting out of insurance (only the most gullible, socially brainwashed people, those whose employers provide it cheaply, or the rich still carry it), figuring that going without is less painful than getting ripped off when the insurance company refuses to pay a legitimate claim.
Therefore, the government has to get involved (spurred on by lobbying, of course) and start mandating that insurance policies be purchased. First, long ago, it was homeowner's insurance (which, to be fair, banks, not the government, mandated, and for good reason); then it was car insurance; and next it's health insurance.
Why do you think companies increasingly fail to provide insurance to their employees? They recognize it has become a scam business and don't really want to keep participating; or, if they fear their employees won't like that, they require employees to pay part of the premium, a share that has been growing over time.
Employees, in order to keep their insurance, must now buy into the scam; and when they do, human psychology dictates that they adopt a belief consistent with their behavior: they're paying for it, therefore it must be worthwhile. It's classic con-game psychology: once they've got you hooked, they maintain the hold.
Support Corporate Dismantlement