Doubling Academic Growth During the School Year

We ran our first experiments on improving student learning during the 2020-21 school year. We worked with Lowell Elementary (a high-poverty school near downtown Seattle) to see if we could measure the impact of a technology-enabled, intensive tutoring program in a fully-remote setting.

The results? We showed that with the right tools and processes, coaches could double the academic growth rates of struggling students: students who had historically grown at 0.6 grade levels per year averaged 1.2 grade levels of growth per year in our program.

Let’s take a look at the details.

The Students

Over the course of the school year, we worked with ~60 different students in a few different formats. The students ranged from 2nd-5th grade, and started our program over 2 years behind grade level, on average. Many were English-language learners and some were homeless. However, almost all of them were eager to learn. It isn’t clear at what point most struggling students stop trying, but in our experience, it isn’t before 6th grade.

The Program

Our program combined key elements from several types of interventions. In rough order of importance, they are:

Small-group tutoring. This is one of the best researched math interventions, and many studies have demonstrated that it is possible to have a strong impact on students with this format.

Personalized learning software. While the research here has shown decidedly mixed results in terms of student outcomes, there are a couple of reasons why it was important for us to leverage digital tools. The first is the ability for students to practice anytime, anywhere; independent practice was a big part of our program. The second is that regular monitoring of student progress was essential, and that is strongly facilitated by software-based tools. As we’ll show below, the analytics we built for our system allowed us to really understand how each program component affected outcomes.

Family and teacher engagement. Over the course of the academic year, we began to experiment with tools (texts, emails, and regular reports) to engage families and teachers. The current research here is pretty thin, but the idea seemed “common-sense” enough to try.

The Results

Over the course of the year, we tested out four different tutors and two group sizes: small (1-2 students per tutor) and medium (6 students per tutor). Broadly speaking, our data highlighted two clear results:

  • There was pretty large variability in tutor quality (where we loosely define “quality” as the ability to motivate student practice and growth).

  • Even with similar-quality tutors, the coaching ratio made a big difference: 6:1 coaching wasn’t as effective at driving student practice and growth as 1:1 coaching.

None of these results was particularly surprising, but what makes our program special is how quickly we can quantify these factors. For example, we can easily see how student growth varied by tutor (all running the same small-group format), as in the graph below. While all of the tutors had average growth rates above historical averages, Tutor B clearly had the biggest impact: their students’ skills grew twice as fast as they had in the past.

Average math growth rates for three different tutors running the same small-group program, along with the historical growth rates for their students.
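The per-tutor comparison above boils down to a simple ratio: each tutor’s average in-program growth rate divided by their students’ historical growth rate. A minimal sketch of that calculation is below; the figures for Tutors A and C are illustrative placeholders (only the 0.6 historical baseline and Tutor B’s doubling come from our data).

```python
# Growth multiplier per tutor: in-program growth vs. historical growth,
# both measured in grade levels per year.

historical_growth = {  # students' pre-program growth rates (0.6 from our data)
    "Tutor A": 0.6,
    "Tutor B": 0.6,
    "Tutor C": 0.6,
}

program_growth = {  # average growth during the program
    "Tutor A": 0.9,  # placeholder value
    "Tutor B": 1.2,  # "grew twice as fast as they had in the past"
    "Tutor C": 1.0,  # placeholder value
}

def growth_multipliers(program: dict, historical: dict) -> dict:
    """Ratio of in-program growth to historical growth for each tutor."""
    return {tutor: program[tutor] / historical[tutor] for tutor in program}

multipliers = growth_multipliers(program_growth, historical_growth)
for tutor, m in sorted(multipliers.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{tutor}: {m:.1f}x historical growth")
```

The same ratio works at the student level, which is what lets us compare tutors, group sizes, or any other program variable on a common scale.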

One of the key aspects of the program was active practice, both during school hours and outside of school. When we compare the results of two high-quality, experienced tutors working with different group sizes, the difference is clear. In a smaller group setting it was much easier to both a) encourage attendance and b) motivate independent practice. The students in the 1:1 groups practiced over 2.5x more than their counterparts in the 6:1 groups.

Average student practice time for two experienced tutors with different group sizes.

The Takeaways

Our first pilot program demonstrated a few important things about increasing student learning in math.

The first important lesson was about tutor quality. Having a high-quality, well-trained tutor can be the difference between a huge program impact and none at all. In our subsequent summer program, we demonstrated some of the possibilities of tutor training.

Next, format matters. There are clear tradeoffs between student/tutor ratios, program cost, and effectiveness. This means that schools will need to make strategic decisions around how they want to allocate resources: is it better to help a bunch of students a little bit or a few students a lot?

Finally, and most importantly, this program demonstrated that with the right tutors and format, it is possible to dramatically increase students’ academic growth rates in math, even in the toughest of circumstances: fully remote learning in the middle of a pandemic.