As a sexual health author, educator, and psychotherapist who works in the field of addiction treatment, Douglas Braun-Harvey, MFT, CGP, CST, is no stranger to controversy. But even he was surprised recently when a comment he posted to SAMHSA's online "Definition of Recovery" forum -- a website created to collect public feedback on the agency's 2005 draft definition of recovery and guiding recovery principles -- was apparently censored.
Braun-Harvey originally visited the website to review the recovery definition and offer comment. On finding that the draft made no mention of “sexual health,” a factor that he and others cite as critical to the successful recovery of some addicted individuals, he typed in and submitted his comment.
He was surprised to find that what he typed:

[screenshot of his original comment]

Appeared as follows:

[screenshot of the comment as posted, with words replaced by asterisks]
“I was really astonished,” he recalls. “Instead of publishing a legitimate comment, the censorship that occurred made my posting look like a collection of dirty words.” When Braun-Harvey blogged at BH about what appeared to him to be censorship of a well-researched and legitimate opinion, I got wind of the story and asked SAMHSA to explain.
Brad Stone, the SAMHSA public information officer serving the behavioral health field, heard my version of the tale and promised a quick reply. He delivered a short while later, explaining that the electronic website “forms” on which public comments were to be composed and submitted to the Recovery site “are operated with a software tool that has a profanity filter. The filter replaces words that might be thought of as explicit with asterisks,” he said.
The purpose of the filter is, of course, to ensure that potentially inappropriate or offensive remarks do not taint open, public debate and commentary about important issues. And it is perhaps the only practical alternative to a more effective but far more expensive option: full-time moderation of comments by a human being before they appear on SAMHSA's websites.
Stone acknowledged that the technology is not perfect, saying that "the default setting for this filter casts a very broad net." Any software that bleeps out words like "sex" or "sexual" would pose obvious difficulties for policy and treatment discussions involving subjects like HIV/AIDS. "But," Stone explains, "as we learn about its limitations with various words over time, we keep pointing them out and our IT people keep getting them fixed."
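The "broad net" Stone describes is a classic failure mode of word-list filtering (sometimes called the Scunthorpe problem): a simple filter has no way to distinguish a clinical use of "sexual" from profanity. A minimal sketch of how such a filter might behave -- the blocklist here is hypothetical, since SAMHSA's actual word list is not public:

```python
import re

# Hypothetical blocklist for illustration; the real filter's list is unknown.
BLOCKLIST = ["sex", "sexual"]

def censor(text: str, blocklist=BLOCKLIST) -> str:
    """Replace each blocklisted word (case-insensitive) with asterisks."""
    # Longest words first, so "sexual" matches before its substring "sex".
    words = sorted(blocklist, key=len, reverse=True)
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, words)) + r")\b",
                         re.IGNORECASE)
    return pattern.sub(lambda m: "*" * len(m.group(0)), text)

print(censor("Recovery should include sexual health."))
# A naive filter turns a legitimate clinical phrase into asterisks:
# Recovery should include ****** health.
```

The filter works exactly as designed, which is precisely the problem: nothing in the word-matching logic considers context, so a comment about sexual health in recovery comes out looking, in Braun-Harvey's words, like "a collection of dirty words."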