Wednesday, June 25, 2008

Can we measure the value (or harm) of CME?

In response to one of my posts on the Medscape CME controversy, a commenter, Supremacy Claus, said this:

This point is missing from the discussion. CME itself is garbage. It does no harm. It has no demonstrable benefit. Demanding an example of a benefit is as valid as demanding an example of harm. I could not name a specific fact of benefit, just as Carroll cannot name an instance of specific harm. CME is an unfunded mandate imposed by the clinician hater lawyer oppressor running the medical licensing boards. There is no evidence of any benefit to any patient from this massive waste of time and paper shuffling. There is no evidence the academic windbags presenting these programs know anything of value to patients. There is no evidence anyone remembers their trite, narrow, useless technical points 5 minutes after walking out. There is certainly no evidence anyone changes any practice after these programs.

I do believe a physician’s lifelong learning produces benefits for patients. These benefits, however, are intangible and cannot be measured in any meaningful way. Moreover, it makes little or no difference whether this learning is “logged in” as accredited CME hours. Learning needs and styles vary from one physician to another. That’s why the responsibility for lifelong learning should lie with the individual physician, not with government bureaucracy, and that’s what the academic windbags, who think one learning formula fits all, don’t get.

I’ve used Up to Date as a point-of-care look-up reference for several years. When Up to Date became a CME provider, I was able to log accredited hours with no extra effort. Was the learning experience suddenly enhanced? No. Did my use of Up to Date change? No. Although I more than satisfy my state’s CME requirement by using Up to Date, for me the most meaningful accredited learning experiences come from meetings such as Bob Wachter’s and Mayo Clinic’s hospital medicine courses, activities which would not exist without industry sponsorship.

The academics are clamoring for metrics to gauge CME’s effects on doctors’ “behavior.” There being no meaningful way to measure such an intangible, the best they’re likely to come up with are perfunctory “core quality measures” of the sort used today for pay-for-performance and public reporting. Unfortunately, these measures are crude, sometimes non-evidence-based, and have often produced unintended consequences that far outweigh their benefits. If applied as measuring sticks for CME, they are sure to have a dumbing-down effect.
