No interest in getting into any kind of argument, but I just want to say the insurance industry just doesn't work with health care. It's their job to deny you; it's their job to make sure they make the most money possible off you. Of course doctors and health care workers need to earn a good wage, but a for-profit health care system goes against the very essence of what health care should be.
__________________
you don't know you're wearing a leash if you sit by the peg all day..