Provider-centric, face-to-face health intervention programs that help people quit smoking, lose weight and increase activity levels have been shown to work, but they are expensive, hard to scale, and inconvenient. By contrast, Internet-based programs with similar goals can be disseminated widely and inexpensively, and can be accessed by consumers at a convenient time and place.
Although many of the latter programs have been shown in clinical trial settings to be efficacious, attempts to commercialize them have been plagued by attrition. People stop using the programs because they lose motivation, can’t find the time, or become frustrated by clunky interfaces and data entry requirements.
In one study, for example, only 26% of participants in a randomized trial of a free physical activity website dropped out before the study was completed, whereas 67% of registered open-access users dropped out over the same period. The open-access users also spent less time on the site.
The lower attrition rate in the trial was likely driven by the emotional, cognitive and logistical support provided by trial personnel. It follows that the commercial success of online health intervention programs hinges on their ability to support users as trained personnel do in clinical trial settings.
Online communities have been proposed for this purpose. These tools permit users to communicate via the posting and reading of messages on a group message board. Social learning theory suggests they can reduce attrition by favorably impacting motivation to change, helping users learn vicariously and gain inspiration, and providing content that encourages users to return to the site.
Recently, a study by Caroline Richardson and colleagues at the University of Michigan showed that an online community associated with an Internet-mediated walking program did in fact reduce attrition.
Richardson’s group randomized 324 sedentary adults into 2 groups. Both groups were granted access to a Web-based walking program that required them to wear pedometers for 16 weeks and upload step-count data to a server. All participants could also view graphs of their progress and receive individually-tailored motivational messages. Participants who were randomized to the “online community” group had, in addition, access to online community features embedded in their intervention webpage, enabling them to post and view messages left by other participants. Those in the “no online community” group were not granted access to these features.
The scientists found that participants randomized to the online community arm were quite active: 65% of them used it, either as posters or lurkers.
As for attrition, it turned out that 79% of participants in the online community arm completed the program, whereas significantly fewer, 66%, of those in the no online community arm did. In short, the online community cut attrition from 34% to 21%. Participants with access to the online community also engaged with the program longer and uploaded valid pedometer data more frequently than those without access. Interestingly, however, both groups increased their average daily step counts to the same extent during the program.
What Can We Make of This?
These results support the premise that online communities can reduce attrition from Internet-based wellness programs, particularly among people who have low perceived social support for behavior change. Posts that describe how people overcame obstacles, provide empathic support to those grappling with the same obstacles, or celebrate success really do keep folks engaged.
There is a small problem, though. Although online communities are less expensive and more easily scalable than traditional face-to-face support, implementing and maintaining them remains a difficult and, yes, expensive challenge.
Previous studies of online communities had shown them to be plagued by low utilization, a phenomenon that limited their effectiveness as tools to positively impact health outcomes. To combat this problem, research staff affiliated with Richardson’s group posted their own self-introductions and open-ended questions designed to boost member participation. They directed participants to other aspects of the intervention and organized small contests to maintain interest. They endeavored to post responses to all participants’ posts within 24 hours. The scientists stated in their write-up that such interventions ‘were necessary to test the effectiveness of an active online community.’
Size does matter when it comes to online communities. If they're big, dynamic and vibrant, they are an asset. But there is a price tag associated with implementing and maintaining a community with these attributes. Attempts to commercialize Internet-based behavior change programs must plan for this. And let's not forget, this trial lasted only 4 months.