How effective can we make a summer math program?

From an outsider’s perspective, the summer months (equivalent to ~1/3 of the academic year) seem like a huge opportunity to close educational gaps for at-risk students. The research tends to agree. National assessment data shows that many of the disparities in student learning arise during the summer months, with about 20% of test score changes happening over summer.

We wanted to see if we could flip the script on summer learning loss, and so we ran a summer version of our math intervention program. We made a few changes: we upgraded our tutor training, simplified the curriculum, and added a few new components that we’ll discuss below. A final bit of context: the summer program had ~20 students from grades 2–5. All of the students had been identified as needing additional math support and all were significantly behind grade level.

The takeaway: students in our summer program gained over 0.4 grade levels worth of math skills, while similar students who didn’t participate saw their skill levels decrease. This suggests that well-executed summer math programs can be highly effective at boosting students’ academic growth rates and reducing educational gaps.

With that in mind, let’s take a deeper look at some of the key pieces of the program.

Student Engagement

Can summer math be engaging, rewarding, and fun for students? The short answer appears to be yes. Our summer math program was completely voluntary (and remote!), but nevertheless, our attendance numbers were similar to or higher than what we saw during the school year. After accounting for schedule conflicts (for example, some students had competing summer programs for a portion of the summer), we had an attendance rate of 79%.

In terms of excitement and fun, we learned a lot about how competition can increase student motivation. We incorporated weeklong “sprints” into our program: during a sprint, students competed to see who could practice the most; prizes were awarded as part of the fun. The results were clear: for many students, competition is a big lever to influence their motivation to practice. On average, students practiced ~65% more during competition weeks. The effects can be clearly seen in the chart below.

Average practice time per student over the course of our summer program. It is worth noting that students practiced a lot: over 60 minutes per week on average! We’ve normalized the data from Week 8 to account for the fact that it was a short week (only 4 days).
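The Week 8 adjustment mentioned in the caption can be sketched as scaling the short week’s total up to a full five-day week, assuming practice is spread roughly evenly across days. A minimal illustration (the minute totals below are made-up placeholders, not our actual data):

```python
# Illustrative sketch of the Week 8 normalization described above.
# The weekly minute totals are hypothetical placeholders.
weekly_minutes = {1: 55, 2: 60, 3: 70, 4: 95, 5: 62, 6: 58, 7: 90, 8: 48}

FULL_WEEK_DAYS = 5
WEEK8_DAYS = 4  # Week 8 ran only 4 days

# Scale Week 8 up as if it had been a full week, assuming roughly
# uniform practice across the days of the week.
normalized = dict(weekly_minutes)
normalized[8] = weekly_minutes[8] * FULL_WEEK_DAYS / WEEK8_DAYS

print(normalized[8])  # 60.0
```

This keeps the chart comparable week-to-week without overstating what students actually did during the shortened week.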

This brings us to the next item in our toolbox for more effective learning…

Family Engagement

This summer we tested out a new piece of our program: we sent parents weekly, personalized updates about their students. These updates combined hand-written notes from our tutors with automated data on very specific math skills that each student was working on.

The results were clear: over the course of the program, more than 80% of our emails were read by parents. For anyone familiar with typical open rates for email campaigns, this is an impressive number: our parents are very interested in getting information about their students’ progress!

Email open rate (emails were sent to 1–2 family members per student) over the course of the summer math program.

In terms of connecting this part of our program to improved outcomes for our students, we aren’t yet in a place to make strong quantitative conclusions, but we have some pretty telling individual stories. For example, in addition to a large number of grateful replies from parents, one week we received a note from an upset parent. She was dismayed to see her son was currently working on subtraction when she thought he should be working on division. She didn’t want him doing “remedial” work and was considering removing him from the program. However, we were able to show her exactly which subtraction problems her son had gotten wrong that week, and how those results shaped his personalized lesson plan. This convinced her that we really understood her son’s skill levels. After the following week’s email update, she sent us a video of herself practicing subtraction with him! Even better, several weeks later he had moved on from subtraction and was making clear progress on multiplication and division. This sort of response gives us confidence that strong family engagement offers a path to improving student learning.

Measuring the Impact

To quantify the impact of our summer program, we measured academic growth for both participating students and a cohort of similar students at the same school who didn’t participate in the program. The results are shown below.

Average academic growth for the 110 days between June 11th and September 29th for both participating and non-participating students.

Students in the summer program gained an average of 0.41 grade levels, while students who didn’t participate lost an average of 0.04 grade levels.
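One simple way to put those 110-day figures on a common scale is to annualize them. A quick back-of-the-envelope sketch (the calendar-year scaling is just one illustrative choice, not how we formally report growth):

```python
# Back-of-the-envelope annualization of the growth figures above.
# 110 days is the measurement window; scaling to a calendar year is
# just one illustrative way to compare growth rates.
DAYS_MEASURED = 110
DAYS_PER_YEAR = 365

def annualized(grade_levels: float) -> float:
    """Scale growth over the measurement window to a per-year rate."""
    return grade_levels * DAYS_PER_YEAR / DAYS_MEASURED

program = annualized(0.41)      # participating students
comparison = annualized(-0.04)  # non-participating students

print(round(program, 2), round(comparison, 2))  # 1.36 -0.13
```

Framed this way, the gap between the two groups over the summer window is on the order of one and a half grade levels per year of growth rate.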

Given all we’ve learned, what are the takeaways?

This summer our tutors were high school students; definitely on the “untrained” side of the spectrum! Our summer results show that with the right training and structure it is possible to be very successful with a wide range of tutors.

The initial results with family engagement are also very encouraging. They suggest that families will respond to relevant, personalized information about their students. This opens up an entirely new pathway to improve student motivation and growth.

Finally, and most importantly, we’ve now shown that it is possible to both a) dramatically boost students’ growth rates during the school year and b) keep a high growth rate through the summer, a time when students usually regress. This is all while operating fully remote and online. The data makes it clear that the tools already exist to fully close the math gaps for students at even the most disadvantaged schools, if we are willing to invest in using them.