Teachers have theories about when their students learn well and when they don’t: 8 a.m. classes are too groggy to retain anything. Beware the post-lunch slump. On Friday, they are already thinking about the weekend.
Having observed billions of interactions involving millions of students, Knewton can test how timing affects student performance. To design a reliable study, the data science team focused on a single product, guaranteeing a uniform population: similar ages, coursework, and context. So we chose a supplemental online tool for learning elementary school math in the United States. In the 2014–15 school year, 12,695 students used this tool.
When you look at how many math problems they did each day, you can see the rhythms of the week and the school year:
We can also see the rhythms of the school day. The chart below shows, hour by hour, how many questions students answered on an average weekday. The dark line represents the mean number of questions answered, and the lighter areas represent the standard error of the mean, that is, how much that estimate tends to vary.
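The per-hour summary behind a chart like this can be sketched in a few lines. The event log below is entirely made up for illustration (the real dataset is far larger); the calculation is just a mean and a standard error of the mean (sample standard deviation divided by the square root of the sample size) for each hour:

```python
import math
import statistics
from collections import defaultdict

# Hypothetical event log: (hour_of_day, questions_answered), one row per
# student-day. These values are invented purely to illustrate the method.
events = [
    (8, 12), (8, 9), (9, 15), (9, 11), (13, 4), (13, 6), (20, 3), (20, 5),
]

# Group question counts by hour of day.
by_hour = defaultdict(list)
for hour, n_questions in events:
    by_hour[hour].append(n_questions)

# Mean and standard error of the mean (SEM = stdev / sqrt(n)) per hour.
summary = {
    hour: (statistics.mean(counts),
           statistics.stdev(counts) / math.sqrt(len(counts)))
    for hour, counts in sorted(by_hour.items())
}
for hour, (avg, sem) in summary.items():
    print(f"{hour:02d}:00  mean={avg:.1f}  sem={sem:.2f}")
```

With millions of real observations per hour, the SEM band becomes very narrow, which is why the shaded regions in charts like this can be so tight.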
Most of the extra math work happened during the school day, between 8 a.m. and 3 p.m., with a substantial tail of activity until 9 p.m.
That rhythm isn’t the same each day: On Fridays, for instance, almost no one used the math program after 3 p.m.:
Now that you know a little bit about our dataset, let’s come back to our original question. When students did these online math problems, how often were their answers correct, and did their percentage of correct answers vary from day to day? The chart below shows a simple average of students’ scores on each weekday.
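A simple weekday average like the one charted here can be computed as follows. The records are hypothetical stand-ins for the real answer log, and the numbers are made up:

```python
from collections import defaultdict

# Hypothetical records: (student_id, weekday, fraction_correct).
records = [
    ("s1", "Mon", 0.75), ("s2", "Mon", 0.71),
    ("s1", "Fri", 0.66), ("s3", "Fri", 0.64),
]

# Pool all scores by weekday, ignoring which student produced them.
by_day = defaultdict(list)
for _student, weekday, score in records:
    by_day[weekday].append(score)

# Simple (unweighted) average per weekday, expressed as a percentage.
weekday_avg = {day: 100 * sum(s) / len(s) for day, s in by_day.items()}
print(weekday_avg)
```

Note that this pools all answers for a weekday regardless of who gave them, which is exactly why the next question, whether the same students are behind the Monday and Friday numbers, matters.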
These students scored an average of about five percentage points lower on Fridays. (Notice that the y-axis begins at 64 percentage points rather than at zero.)
This chart, however, doesn’t say whether the same students scored better on Mondays and worse on Fridays, or whether one set of students works on Mondays and scores well while a different set works on Fridays and scores poorly. In the following animation, each colored line represents the usage patterns of a randomly selected student:
Our initial assumption was that students would use computers at the same time each week. In fact the data show that students tended to go online once a week, but on a different day each week. Their varied schedules offer a convenient way to explore whether the observed “Friday effect” can be explained by differences in the students, or by the day of the week.
So we looked at the students who, over the course of the year, did these math problems at least once on each weekday:
The results of this analysis are similar to what we saw already: Students didn’t perform as well on Fridays.
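The filtering step itself is straightforward. In this sketch, `activity` is a hypothetical mapping from student id to the set of weekdays on which that student answered at least one question during the year:

```python
# Keep only students who did these problems at least once on each of the
# five weekdays over the course of the year. Data is invented.
WEEKDAYS = {"Mon", "Tue", "Wed", "Thu", "Fri"}

activity = {
    "s1": {"Mon", "Tue", "Wed", "Thu", "Fri"},
    "s2": {"Mon", "Fri"},
    "s3": {"Mon", "Tue", "Wed", "Thu", "Fri"},
}

# Set comparison: days >= WEEKDAYS means the student covered every weekday.
all_weekday_students = {sid for sid, days in activity.items()
                        if days >= WEEKDAYS}
print(sorted(all_weekday_students))  # -> ['s1', 's3']
```

Restricting to this subset lets each student serve as her own control: any remaining weekday differences can’t be explained by different students showing up on different days.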
One final way to verify the measurement is to compute the average score for each student who had activity on all five weekdays and compare that to her average score for a particular weekday. For instance, a student might get an overall average score of 70 percent, but on Wednesdays, her average score might be 72 percent. That means that, on Wednesdays, she scores 2 percentage points higher than her average across all weekdays. Let’s call it her normalized score difference.
Here is the average of normalized score differences for all the students:
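The normalized score difference described above can be sketched directly. Each student's overall average is subtracted from her per-weekday averages, and the resulting differences are then averaged across students. The per-student scores below are invented for illustration:

```python
from statistics import mean

# Hypothetical per-student average scores (percent correct) by weekday,
# for students who had activity on all five weekdays.
scores = {
    "s1": {"Mon": 72, "Tue": 71, "Wed": 72, "Thu": 70, "Fri": 65},
    "s2": {"Mon": 80, "Tue": 78, "Wed": 79, "Thu": 79, "Fri": 74},
}

WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri"]

def normalized_diffs(per_day):
    """Weekday average minus the student's average across all weekdays."""
    overall = mean(per_day[d] for d in WEEKDAYS)
    return {d: per_day[d] - overall for d in WEEKDAYS}

# Average each weekday's normalized difference across all students.
per_student = [normalized_diffs(s) for s in scores.values()]
avg_diff = {d: mean(p[d] for p in per_student) for d in WEEKDAYS}
for d in WEEKDAYS:
    print(f"{d}: {avg_diff[d]:+.1f}")
```

Because each student is compared against her own baseline, a consistently high scorer and a consistently low scorer contribute equally, and only the within-student weekday pattern survives.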
Something is happening on Fridays. On that day, students scored somewhere between 4 and 5 percentage points lower than on other weekdays. We don’t know exactly why, however.
We were startled by the strength of the Friday effect and the clarity of the data. While these results hold true for these particular students, preliminary analysis of several other groups shows that the Friday effect may be more widespread. So doing work on a Thursday instead of a Friday could be the difference between an A- and a B+.