I do not mean my general practitioner or any other real doctor in my life. I mean the show "The Doctors." I take my GP's word with a grain of salt as it is; he is known to hand out free drug samples more often than he writes prescriptions, and at the start of my bronchitis I was told to take Mucinex because I couldn't stay for my appointment to see the doctor (after having waited two hours). I advise every person to ask questions and to research things on their own time. Doctors are not infallible, and there are often ulterior motives involved in the information they give you.
This show, The Doctors, is a panel of four good-looking (in the opinion of some) doctors who spend an hour on TV promoting products and playing devil's advocate on topics that deserve serious attention.
I am not comfortable watching "doctors" who are heavily made up (one of them looks like a less attractive George Hamilton) and who are loud extroverts saying fake-sounding things like "I love this show; I learn something new every day!" I hate you, Doctors.
Another issue I have with the show is how they use the reality-show dramatic cliffhanger, saying things like "a disease that can blind you AND YOU MAY ALREADY HAVE IT" before going to a commercial.
People are stupid. People are led into these traps very easily and are more than willing to be led, and this show takes advantage of that in the worst way. They devote no more than about two minutes to each topic, and a handful of those topics are genuinely important things to know. However, the segment about the A-, U-, and G-spots was about two minutes long and concluded with "but it's most important to get to know your partner, not anatomy." Then what was the point of talking about it at all if you go on to explain that this will not work for every woman?
The show is structured to draw ignorant women in and then leave them knowing little more than when they began. That is not helpful, especially since these women are not likely to seek out further information. That is not in our nature, as has been proven by people's willingness to latch onto conspiracy theories (Obama isn't American, Bill Cosby is dead, etc.). In a nation where people want to be spoon-fed everything, they will take what they're given at face value. People trust doctors a little more than they should, and these people are exploiting that trust. The only way for a person not to be exploited is to have knowledge.
Can we please get this show off the air? Next, Dr. Oz. Stop scaring people with shows like "The Rising Plague" and do something to actually improve the health of Americans.
ETA: This is the guy that America is taking health advice from:
Why is a doctor promoting a diet instead of nutrition adjustments?