Education: Touch and Stop

There's little evidence that classroom programs help prevent sexual abuse.


Almost two-thirds of the nation's elementary-school districts require classroom training in sexual-abuse prevention. Despite the popularity of prevention programs, there is a serious debate among professionals about whether the lessons work and should be expanded, or whether the vast resources they consume might be used in other ways to protect children more effectively. Recent findings from a major study purport to settle the issue with the discovery that comprehensive prevention programs really do protect children.

These findings have gained the kind of authority that is attributed to facts endorsed by The New York Times. On October 6, under the headline ABUSE-PREVENTION EFFORTS AID CHILDREN, the Times devoted almost a quarter of a page to an article extolling the benefits of sexual-abuse prevention training programs. A close look at the study, however, reveals a different story from the one told to the public. It is a story of how journalists are bamboozled by advocacy research advanced under the mantle of social science, how public funding is promoted for programs of dubious value, and how children may be jeopardized in the process.

The opening sentence of the Times article declares, "The first national survey to assess the effectiveness of school-based programs to prevent sexual abuse confirms that they help children in real-life encounters with potential child molesters." Based on results from a representative national sample of 2,000 children who were interviewed on the telephone along with their parents, the study's senior author, University of New Hampshire sociologist David Finkelhor, finds "that the kids who have gotten the better sexual abuse prevention programs do act better in the crunch." By "better programs" he means those that were longest and most comprehensive in coverage of prevention concepts.

In the words of Times reporter Daniel Goleman, "The more comprehensive the training the more likely it was that the children would use strategies like yelling, saying no, or running away when threatened with sexual abuse. They also knew more than other children about the topic, felt more confident about their abilities to protect themselves, and were more likely to report any actual or attempted abuse to adults. On the other hand, children who had received the training were slightly more likely to suffer injury once a sexual assault began because they were more prone to fight back."

This version of the study's findings magnifies small differences that are insignificant and underplays large differences that may be seriously dangerous. Let's look at the facts. Did children who participated in the comprehensive program have more knowledge about sexual-abuse prevention? Yes. On a 16-point scale, they scored an average of half a point higher than children who had never had any training. At the same time, the children who never had any training actually scored a whit higher than those who had participated in the prevention programs that were judged less comprehensive. If these findings are correct, about one-half of the schools that mandate this training are spending millions of dollars on brief, less-comprehensive prevention programs from which students emerge with no more knowledge than those who never participated. And even more funds are being spent on longer programs that increase knowledge by just a fraction of a point.

But what about the other outcomes? Does this slight increase in knowledge make a difference? Are the children who receive comprehensive sexual-abuse prevention training really acting better in the crunch? Finkelhor's rigorous analysis of the data simply does not coincide with his optimistic interpretation of the results. Regarding the positive outcomes cited by Goleman, for all of the children abused and threatened by sexual abuse the data show no statistically significant differences between the behavior of those who had received no or minimal prevention training and those who had participated in the most comprehensive programs, when relevant factors such as age and gender are controlled.

Among the sexually victimized respondents, children who received comprehensive training were not significantly more likely than the others to use the strategies, such as saying no or running away, that Finkelhor considers more effective. Nor did they differ significantly with regard to their sense of confidence in coping with their abusers, their disclosures of actual or attempted abuse, or their success in thwarting attempted abuse. There were small differences among the respondent groups on these outcome variables.

In some cases—for example, success in thwarting attempted abuse—children who received no training were a little more effective than those who participated in minimal or comprehensive prevention programs. This finding was not mentioned in the Times article. But in every case, judging by conventional scientific standards, we must conclude that the small differences found were likely to occur by chance.

On several other outcome variables, however, the differences were more pronounced and statistically significant. While Goleman reports that the children who received comprehensive training were only "slightly more likely to suffer injury once a sexual assault began" (emphasis added), the data show that in fact 15 percent of these children were injured, a rate more than three times that of those who received no training. And when threatened with sexual victimization, children who participated in comprehensive prevention training programs were more likely to employ strategies, such as crying and fighting back, that Finkelhor considers less effective (this may account for the higher injury rate).

Although Finkelhor calls fighting back one of the "less preferred" responses, it is a protective strategy taught to children as young as 5 by one of the most widely used programs in the country, the Child Assault Prevention Program, developed by Women Against Rape in Columbus, Ohio. (This program also teaches children that a father's pat on the behind can be a "bad touch" that should be reported to other adults.) Rather than "acting better in the crunch," the firmest conclusion that can be drawn from these findings is that, when faced with the danger of sexual abuse, children who receive comprehensive training use less preferred strategies and get injured more than three times as often as children who received no training.

Are sexual-abuse prevention programs really that dangerous? These findings should not be taken too seriously. Professionals know that telephone surveys are unreliable as a method for evaluating anything as complex as the effectiveness of prevention programs or the way children respond to actual threats. Journalists should learn something about the limits of social research. The real message from this research for parents, school officials, and policy makers is that although sexual-abuse prevention training programs are well-intentioned efforts to protect children against a loathsome crime, there is no evidence that they work, or even that on the margin these efforts produce more good than harm—newspaper headlines on the subject notwithstanding.

Neil Gilbert is Chernin Professor of Social Welfare and co-chairman of the Berkeley Child Welfare Research Center. His most recent book is With the Best of Intentions: The Child Sexual Abuse Prevention Movement (written with Jill Duerr Berrick).