How did this happen?
How did all the well-intentioned doctors out there get suckered into believing that pharmaceutical meds are the "magic bullet" for curing people's ailments?
How did this happen, when all the evidence out there shows that these drugs cause side effects that can often be deadly?
What happened to getting back to basics... proper nutrition, exercise, and rest?
The drugs being fed to people are killing them. How can doctors, who take the Hippocratic Oath, betray the very people they vowed to help?
You know, this makes me wonder... how much are the pharmaceutical companies paying the doctors to push their newest drugs?
Here's an article I found that sums up what I believe - Death by Medicine.