Design, setting and participants
Thirteen elementary schools from a large school district in northeast Iowa completed SWITCH® programming as part of annual district programming, and 10 agreed to participate in the optional SWITCH® evaluation. The school district had been running the SWITCH® program (with funding and logistical support from the local Young Men’s Christian Association [YMCA]) continuously since the original efficacy study in 2006. The program has been valued in the community, but its high cost has been a barrier to sustainability and dissemination, so the schools agreed to participate in a controlled evaluation of the newly developed online version of SWITCH®. Schools were matched on socio-economic status (assessed as the percentage of youth qualifying for free/reduced lunch) and then randomly assigned to either the print (n = 5) or online (n = 5) version of the SWITCH® program. The online version replicated all aspects of the print-based manual, and programming was delivered in a consistent way for both sets of schools. The matched samples and standardized methods make it possible to directly compare the print and online formats.
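The paper does not describe the exact matching procedure; the sketch below illustrates one common approach consistent with the stated design: rank the schools on the free/reduced-lunch percentage, pair adjacent schools, and randomly assign conditions within each pair. The school labels and percentages are hypothetical.

```python
import random

# Hypothetical schools and percent of students on free/reduced lunch (not the study's data)
schools = {"A": 62, "B": 58, "C": 41, "D": 44, "E": 30,
           "F": 28, "G": 55, "H": 47, "I": 33, "J": 60}

# Rank schools on the SES proxy, pair adjacent schools, and flip a coin within each pair
ordered = sorted(schools, key=schools.get)
assignment = {}
for i in range(0, len(ordered), 2):
    pair = ordered[i:i + 2]
    random.shuffle(pair)
    assignment[pair[0]] = "print"
    if len(pair) == 2:
        assignment[pair[1]] = "online"

print(assignment)   # five matched schools per condition
```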
Parents voluntarily enrolled in the SWITCH® program either by returning a signed enrollment form (print) or by completing a similar online registration form (online). A total of 210 child/parent dyads enrolled in the project (print: n = 100; online: n = 110). Schools had two to three 3rd-grade classrooms involved, with enrollment per school ranging from 14 to 38. Parents from the participating schools were asked to complete a post-test survey to evaluate their engagement with the SWITCH® program and their perceptions of its utility. Because the data collected in this study were de-identified, the Iowa State University institutional review board (IRB) deemed the study “exempt”; the requirement for written consent from participants or their parents/guardians was therefore waived.
Intervention procedures
The SWITCH® program operates over a 16-week period, with new materials released each week. Programming is divided into four modules, each following the same sequence of content: week 1 “Switch what you DO”, week 2 “Switch what you VIEW”, week 3 “Switch what you CHEW”, and week 4 “You Rule! Try all 3 goals!”. The programming in each week is enriched by activities carefully designed to promote parent/child interactions about the target behaviors. For example, in a “Do” week parents are provided with a set of “SWITCH Activity Cards”. Each card details the activity name, the equipment and space needed, and a description of activities the child can do alone or with a parent. In this round of formative evaluation, the SWITCH® program was implemented in the same manner for the print and online conditions. Parents in the print group received booklets for each month of the program, while the same resources were released on the website for parents in the online group to browse, print, and use.
The key behavior change strategy in SWITCH® is self-monitoring: children are tasked to work with their parents to complete the weekly SWITCH® Tracker sheets. The Trackers prompt children to choose a daily behavior goal (e.g., 60 minutes of physical activity per day in a “Do” week), record the actual behavior (e.g., actual activity time per day), and then tally the points earned when the goal is attained (e.g., 3 points/day, for a maximum of 21 points/week). Children turn in the Trackers each week to accumulate SWITCH® points, which are redeemable for various incentives throughout the program. District program managers hired by the local YMCA made weekly site visits to gather the Trackers and to provide the weekly and monthly incentives for participation.
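A minimal illustration of the weekly point tally for a “Do” Tracker, assuming a 60-minute daily goal, 3 points per day the goal is met, and the 21-point weekly maximum; the daily minutes below are made up.

```python
# Hypothetical "Do" Tracker for one week: minutes of physical activity recorded each day
daily_minutes = [70, 45, 60, 90, 30, 65, 120]
GOAL_MINUTES = 60          # daily behavior goal chosen on the Tracker
POINTS_PER_DAY = 3         # points earned each day the goal is attained
WEEKLY_MAX = 21            # maximum points per week

points = min(sum(POINTS_PER_DAY for m in daily_minutes if m >= GOAL_MINUTES), WEEKLY_MAX)
print(points)              # -> 15 for this example week (goal met on 5 of 7 days)
```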
The link between the school and the home is an important component of the SWITCH® program since it keeps parents more involved. The previous efficacy studies were able to implement the SWITCH® program systematically and with high fidelity because research assistants handled the coordination and communication. However, anecdotal observations of current SWITCH® schools indicate considerable variability in the degree of school engagement. To enable broader dissemination of the program, it is important to better understand the impact of school engagement on program outcomes. While not a planned part of the study, school engagement was therefore included as a moderating variable in the analyses.
Evaluation framework and measures
The evaluation of the SWITCH® program was guided by the established PRECEDE-PROCEED model [18]. This model offers several advantages for the present study and for subsequent dissemination efforts. One key advantage is that it is consistent with the social-ecological approach that underlies the SWITCH® intervention. The “Epidemiological Assessment” in the PRECEDE phase challenges the planner to separate the behavioral and environmental factors targeted in the intervention. The subsequent “Educational and Ecological Assessment” then splits these influences into Enabling, Reinforcing, and Predisposing Factors. In SWITCH®, parents are viewed as Enabling Factors since they enable, facilitate, and promote behavior change in their children. Schools are viewed as Reinforcing Factors since teachers are positioned to remind children about, and reinforce, systematic efforts with the program. The goal of the programming is to facilitate self-monitoring and behavioral skills in children, and these are considered the key Predisposing Factors. The final stage of the PRECEDE phase (“Administrative and Policy Assessment”) captures variables thought to be potentially important for the implementation and sustainability of the program; examples in the present study include school characteristics and school engagement. A second advantage of the model is that the PROCEED phase incorporates an evaluation of the process, the impact (i.e., the intervention itself), and the final outcome. The structure of this model has been endorsed as appropriate for dissemination and implementation research [19]. Details of the process, impact, and outcome variables examined in the evaluation are summarized below.
Process measure (Child Involvement)
The primary goal of the program was to promote self-monitoring and goal setting for changing diet, activity, and screen-time behaviors. Children (with parental help) were tasked with filling out SWITCH® Trackers each week and bringing them to school for incentives (point redemption). Therefore, the percentage of children completing and returning SWITCH® Trackers was viewed as the key process measure. Another key process measure was overall parent satisfaction with the SWITCH® materials, captured with a single item on a 4-point Likert scale ranging from 1 = “very satisfied” to 4 = “very dissatisfied”.
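As an illustration of how the Tracker-return process measure can be computed, the sketch below derives weekly return percentages by condition; the enrollment counts come from the paper, but the weekly return counts are invented for the example.

```python
# Enrollment by condition (from the paper) and hypothetical weekly Tracker returns
enrolled = {"print": 100, "online": 110}
returned = {"print": [82, 75, 70, 64], "online": [95, 84, 77, 70]}   # first 4 weeks, invented

for condition, counts in returned.items():
    rates = [100 * n / enrolled[condition] for n in counts]
    print(condition, [f"{rate:.0f}%" for rate in rates])
```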
Impact measures (Parent/Child Interactions)
The focus of the SWITCH® programming is to facilitate parent/child interactions about healthy lifestyles. The impact of the programming was assessed with items capturing parents’ reports of the quality of interactions related to the three target behaviors (“Do, Chew, and View”). For example, the item for the “Do” behavior was stated as: “Did SWITCH help you talk to your child about being physically active?” The response options were 1 = “yes, helped a lot”, 2 = “yes, helped somewhat”, 3 = “yes, helped a little”, and 4 = “no, did not help”. The mean of the three items was used to reflect the overall impact of the SWITCH® program on parent/child interactions.
Outcome measures (Child Behaviors)
Children’s weight management behaviors were measured by 8 items: 2 items each for the Do and View behaviors and 4 items for the Chew behavior. Parents were asked to compare their child’s current (post-intervention) behaviors to those before participating in SWITCH®. An example item for the Chew behavior was stated as “Compared to before your family participated in SWITCH®, does your child consume fruits?” Responses ranged from 1 = “a lot less often” to 5 = “a lot more often”. Other Chew behavior items asked about children’s consumption of vegetables, 100% fruit juice, and soft drinks. The mean of the items for each individual behavior was used for the analyses.
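The survey scoring reduces to simple item means; the sketch below illustrates this for a single hypothetical parent record, computing the impact composite from the three interaction items (described above) and the domain mean for the four Chew items. The item names and response values are assumptions for illustration only.

```python
from statistics import mean

# Hypothetical responses for one parent (item names and values are illustrative only)
impact_items = {"do": 2, "view": 1, "chew": 2}   # 1 = "yes, helped a lot" ... 4 = "no, did not help"
chew_items = {"fruit": 4, "vegetables": 4,       # 1 = "a lot less often" ... 5 = "a lot more often"
              "juice": 3, "soft_drinks": 2}

impact_score = mean(impact_items.values())   # overall parent/child interaction impact
chew_score = mean(chew_items.values())       # Chew domain mean (Do and View computed the same way)
print(impact_score, chew_score)
```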
Moderating variables (Teacher/School Engagement)
The degree of teacher/school engagement was assessed as a moderating variable using numerical ratings from two experienced program managers who had worked as SWITCH® staff for over six years. The program managers rated each teacher’s level of engagement from 1 (low) to 3 (high) based on their in-person field observations and interactions with each school. A total of 22 teachers (ranging from 1 to 3 teachers per school; median = 3) were rated by the program managers. The researchers then averaged the scores for the teachers within each school and categorized schools into two levels of teacher engagement for data analysis: 1 = “low” (≤ mean), 2 = “high” (> mean).
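A minimal sketch of this mean-split categorization, assuming the cut point is the mean of the school-level averages (the paper does not state which mean was used); the ratings and school labels below are hypothetical.

```python
from statistics import mean

# Hypothetical engagement ratings (1-3) for the teachers in each school
ratings = {"A": [3, 3], "B": [1, 2, 2], "C": [2], "D": [3, 2, 3], "E": [1, 1]}

school_means = {school: mean(scores) for school, scores in ratings.items()}
cut_point = mean(school_means.values())   # assumed cut point: mean of the school averages

# 1 = "low" (<= mean), 2 = "high" (> mean)
engagement = {school: 2 if m > cut_point else 1 for school, m in school_means.items()}
print(engagement)
```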
Data analysis
The focus of the evaluation was on direct comparisons of process, impact, and outcome measures between the print and online versions of SWITCH®. For the process evaluation, we compared the rate of Tracker completion between the print and online groups using graphic techniques, and between schools with high and low engagement using descriptive statistics (means and standard deviations). It was not possible to examine these data with inferential statistics because the sample sizes for the school-level outcomes were too small. We also descriptively compared the level of parental satisfaction with the SWITCH® program (i.e., average percentage of “very satisfied” and “satisfied” parents) between the print and online groups. Standard analytic techniques described below were used to examine the impact and outcome measures. Frequencies of key measures were first reported to provide an overall sense of parent reactions to the programming. Descriptive statistics (means and standard deviations) were provided for the primary measures, reported for the print and online groups separately as well as combined. Two-way multivariate analyses of variance (MANOVAs) were used to examine differences in impact and outcome measures between the print and online groups and between schools with low and high engagement. Tests of homogeneity of variance were performed prior to the inferential analyses.
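The inferential analysis plan can be sketched in Python with statsmodels, assuming a flat file of parent post-test records; the file name, column names, and the choice to model the impact composite and the three behavior means in a single MANOVA are assumptions for illustration (the study may have run separate models, likely in a standard statistics package).

```python
import pandas as pd
from scipy import stats
from statsmodels.multivariate.manova import MANOVA

# Hypothetical post-test data: one row per parent, with condition ("print"/"online"),
# engagement ("low"/"high"), the impact composite, and the Do/View/Chew behavior means
df = pd.read_csv("switch_posttest.csv")

# Homogeneity-of-variance check (Levene's test) for one dependent variable across conditions
print(stats.levene(df.loc[df.condition == "print", "impact"],
                   df.loc[df.condition == "online", "impact"]))

# Two-way MANOVA: condition x school engagement on the impact and behavior measures
manova = MANOVA.from_formula("impact + do + view + chew ~ condition * engagement", data=df)
print(manova.mv_test())
```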