STATS ARTICLES 2006



Staying Skeptical of Addiction Treatment
October 12, 2006
By Maia Szalavitz
How the Los Angeles Times should have covered the extravagant claims made for “Prometa”

Whenever a condition is difficult to treat, heavily stigmatized, and poorly understood, people selling unproven remedies tend to find an excellent market. Addiction treatment is the classic example, and unfortunately, for years, the press has abetted the sale of what ultimately turned out to be ineffective and harmful treatments by hyping them with a few moving anecdotes of success.

The Los Angeles Times dipped a toe in these waters with its recent article on "Prometa," currently being sold by a California-based company called Hythiam for $15,000: a two-to-five-visit outpatient regimen involving three medications.

As it noted, the drugs used in Prometa have not been FDA-approved for addiction treatment. Although it is legal to prescribe medications for indications other than those for which they were initially approved, there have been no controlled trials finding this combination safe or effective in addiction.

While the Times was careful to repeatedly emphasize the lack of peer-reviewed data and to quote numerous skeptical experts, it did cite a non-controlled trial presented at a scientific meeting and included the usual glowing anecdotal reports of stunning success from its promoters.

More problematic, it also mentioned a report "compiled by several doctors" from a Washington state drug court program. A company spokesperson claimed that this report, presented at the June meeting of the National Association of Drug Court Professionals, found that "98% of methamphetamine and cocaine addicts achieved clean urine screening tests for three months after the Prometa treatment provided by Hythiam."

If this is true (and the data presented are not from a selected sample or otherwise distorted), it would be an overwhelming result in the addictions field, and the story should have been on front pages around the world.

Typically, 40–60 percent of addiction treatment attendees relapse within a year, and since most relapses occur within days or weeks of treatment, a 98% abstinence rate for three months seems implausible.

Far more likely is that the sample was somehow selected to exclude the failures. For example, many addiction programs used to claim an "80% success rate" among graduates five years after completing treatment. What they didn't tell reporters was that 80 percent of the people who started the program failed to graduate, meaning that just 16 percent of those who started the program were clean five years later.

Given the high likelihood that the “98%” claim is, at best, misleading, it probably should not have been included in the story.

Covering these issues is tricky: the fact that a company is selling an addiction cure for $15,000 a pop and heavily advertising it is certainly newsworthy. And due to space constraints, reporters have to assume that readers know to give far greater weight to randomized, controlled data, rather than explaining why this matters in every article.

Fairness means that journalists must quote the compelling claims of the company even as they emphasize that anecdote is not evidence; journalists may also not have the space to explain in each article why anecdotes need to be treated with a strong dose of skepticism. Quoting non-peer-reviewed “data” as though it were scientific, however, should probably be avoided.

What can also be done better is giving a sense of the history of these expensive, public experiments in addiction treatment. For example, some groups still sell a $15,000 procedure called “rapid opioid detox” in which heroin or prescription opioid addicts are placed under anesthesia for the first hours of withdrawal. For more than a decade, media from Barbara Walters on 20/20 to Elle and Wired Magazine touted this as a way to avoid the pain of withdrawal, even though there was no evidence that it actually reduced the discomfort of withdrawal compared to other detoxes.

Science eventually caught up with rapid detox, but not until some dozen patients had died from complications of the procedure. A controlled trial published in the Journal of the American Medical Association found that rapid detox does not reduce withdrawal severity, improve abstinence rates, or improve treatment retention; it also poses a significant risk of life-threatening adverse effects, which other detox methods do not.

As Herb Kleber, one of the authors of the JAMA study, noted back in 1982, when it comes to addiction treatment:

“The history… is a long and dishonorable one. The trail is strewn with cures enthusiastically received and then quietly discarded when they turned out to be relatively ineffective or even worse, productive of greater morbidity and mortality ... Any claim for a new method should be put forward modestly and viewed with skepticism until amply documented by careful experimental procedures.”