The Knewton Blog


Data Stories: Active Time and Intervention

Posted in Adaptive Learning on May 19, 2014 by

At Knewton, our analytics transform anonymized strings of data into actionable information that makes teachers’ jobs just a bit easier. We generate metrics and predictions that teachers using Knewton-powered learning products can use as they see fit. Rather than overwhelm teachers with endless stats, charts, and graphs, we create specialized inferences that serve a clear purpose.

In creating and refining the models that produce these metrics, our data scientists attempt to answer questions like:

  • What is a student’s understanding of a particular concept?

  • How much productive time have students spent working?

  • What score can a student expect to get on an upcoming quiz?

The answers to these types of questions help teachers adjust lesson plans and target extra help for individual students. For example, imagine a product that lets a teacher group students and email them at once. Using Knewton analytics, the application could automatically generate groups of students based on current or anticipated needs. It could go even further and let a teacher assign students an “adaptive assignment,” which would dynamically guide every student through a unique set of supplementary material.
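As a rough illustration of the grouping idea, here is a minimal Python sketch. The bucket names, score threshold, and data are hypothetical assumptions for illustration only, not Knewton's actual product logic.

```python
# Hypothetical sketch: bucketing students by a predicted quiz score so a
# teacher could email or assign work to each group at once. Thresholds
# and field names are illustrative assumptions.

def group_students(predictions, at_risk_below=70, advanced_above=90):
    """Bucket students by predicted score.

    `predictions` maps student name -> predicted score (0-100).
    """
    groups = {"at_risk": [], "on_track": [], "advanced": []}
    for student, score in predictions.items():
        if score < at_risk_below:
            groups["at_risk"].append(student)
        elif score > advanced_above:
            groups["advanced"].append(student)
        else:
            groups["on_track"].append(student)
    return groups

predictions = {"Phoebe": 95, "Joey": 55, "Monica": 78}
print(group_students(predictions))
# {'at_risk': ['Joey'], 'on_track': ['Monica'], 'advanced': ['Phoebe']}
```

A real application would feed these groups into a messaging or assignment tool; the point is simply that inferred metrics, not raw logs, drive the grouping.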

Active Time

One of the metrics Knewton calculates is Active Time, which aims to help instructors better understand students’ productivity and engagement with learning materials. Rather than just measure total time spent working on homework or in-class assignments, Active Time shows teachers at a glance how much productive time students are spending. To calculate this, Knewton analyzes various facets of each student’s activity patterns, along with total time spent. Knewton takes into account any work related to a learning goal determined by an instructor; for example, if a student does an extra practice test that wasn’t assigned by the teacher, this work would “count” too.
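The blog doesn't publish the actual model, but one common way to estimate productive time from timestamped activity is to split events into sessions wherever the gap between them exceeds an idle threshold. A minimal sketch, assuming a simple gap heuristic (the threshold and data are made up):

```python
# Illustrative sketch, NOT Knewton's actual Active Time model: sum the
# time between consecutive work events, treating any gap longer than
# `idle_gap` as the student having stepped away.
from datetime import datetime, timedelta

def active_minutes(event_times, idle_gap=timedelta(minutes=10)):
    """Estimate productive minutes from a list of event timestamps."""
    events = sorted(event_times)
    total = timedelta()
    for prev, curr in zip(events, events[1:]):
        gap = curr - prev
        if gap <= idle_gap:
            total += gap  # counted as continuous work
        # longer gaps are ignored: assumed idle time
    return total.total_seconds() / 60

events = [
    datetime(2014, 5, 1, 11, 0),
    datetime(2014, 5, 1, 11, 4),
    datetime(2014, 5, 1, 11, 9),
    datetime(2014, 5, 1, 13, 0),  # long break: gap not counted
    datetime(2014, 5, 1, 13, 3),
]
print(active_minutes(events))  # 12.0
```

A production metric would be more nuanced (weighting by the kind of activity, relating work to instructor goals), but the session-gap idea captures the difference between total elapsed time and productive time.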

Almost every teacher out there has probably tried (consciously or not) to make similar estimates of their students’ productive time. But without real data, those estimates will always be rough. And improving them, even marginally, would take a whole lot of time that teachers don’t have.

But what if teachers did have access to this raw data? What kind of inferences could they make based on this information? As a thought exercise, let’s take a look at some real, anonymized data from a college-level math course that uses Knewton-powered digital materials.

First, here’s a visualization of a single student’s work in the math course. Each dot represents a time when a student did some work. The color of the dot indicates the topic covered. (The muddied color of the first dot shows that there were multiple topics covered in that introductory assignment.)


Together, a class of students’ work patterns look something like the graph below. Each row represents a single student’s work. The rows are ordered by performance, with the highest-performing student on top.
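The layout of that class-level chart can be sketched in a few lines: each student becomes one row of dots, with rows sorted so the highest performer sits on top. The student names, scores, and event times below are invented for illustration.

```python
# Minimal sketch of assembling the class-level chart's rows: one row of
# work events per student, ordered best performer first. Data is made up.

def layout_rows(work_events, performance):
    """Return (student, sorted events) rows, highest performer first.

    `work_events` maps student -> list of event times (here, plain
    numbers standing in for timestamps); `performance` maps student ->
    score used for the vertical ordering.
    """
    ranked = sorted(work_events, key=lambda s: performance[s], reverse=True)
    return [(student, sorted(work_events[student])) for student in ranked]

work_events = {"Joey": [3, 7], "Phoebe": [1, 2, 5, 8], "Monica": [2, 6]}
performance = {"Joey": 40, "Phoebe": 98, "Monica": 62}
for student, events in layout_rows(work_events, performance):
    print(student, events)
# Phoebe [1, 2, 5, 8]
# Monica [2, 6]
# Joey [3, 7]
```

In a real chart each row would be a scatter of dots colored by topic; the ordering step is what makes the performance gradient visible at a glance.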


A teacher looking at this information would immediately see the variation in work patterns among students. Of course, nearly all the students are doing work during class time (this concentration of simultaneous activity is what creates the neat color blocks in the visualization). But from there, the patterns vary. A subset of high-performing students does a lot of additional work in between class sessions, unlike most of the low-performing students and a group of very high performers, for whom success seems to come easily.

This may seem pretty obvious: the students who do more work seem to perform better. But while this may be true at a high level, it’s not always quite as simple as it seems. Let’s zoom into five individual students’ work patterns to see how a teacher might put this information into context and use it in a productive way.

Sample Student: Phoebe

Let’s look first at Phoebe, who is at the top of her class. At first glance, she doesn’t seem to be doing that much work — and yet she is still performing exceptionally well. But dialing further into her work patterns, we see this:


Phoebe is actually working a lot — she’s just doing it in concentrated bursts. In addition to working during class time (which falls around 11 am — we can see these consistent dots in nearly every student’s graphs), she is working directly before and after class. Looking at this, a teacher would probably be satisfied that Phoebe is engaged and has established productive study habits. However, her work begins to slow about ⅔ of the way into the course. Part of this might be explained by spring break (most of the students’ work patterns fall off for about a week), but it’s possible that she’s also becoming disengaged. Perhaps the work isn’t challenging enough; seeing this, the teacher might assign her some supplemental assignments at a higher level. Or perhaps the teacher can see that Phoebe, a high achiever, has overloaded herself with other classes — this might be an opportunity for the teacher to step in and help her figure out a way to balance her workload better.

Sample Student: Joey

Any teacher could infer that a student like Joey, who is performing at a very low level, isn’t putting enough time into his work. Looking at his work sessions backs up this conclusion:


Joey is clearly doing just the bare minimum of work required for the course. While the teacher probably assumed as much already, having concrete evidence discounts any arguments on the part of the student (“but I really am working!”). From here, the teacher could talk to Joey to try to ascertain why he’s not working — is he just not motivated, or is he frustrated because the work is above his head? Maybe he’d rather slack off than feel inadequate. If the latter, the teacher might direct him to work on foundational material to help get him up to speed. We can see in this example both the opportunities and limitations of data in education. Data can show patterns and make predictions — but it can’t on its own effect change.

Sample Student: Monica

Now let’s look at Monica, who is performing around the 60th percentile in the class.


Unlike Joey, it’s clear that Monica is working a lot (mostly after class, but also a bit before). But a lot of her work sessions last just a few minutes, as indicated by the lightly shaded dots. This information is interesting on its own. But in the classroom, combined with a teacher’s observations and insight, such information can be immediately impactful. Perhaps the teacher knows that Monica is working at a part- or full-time job in addition to taking classes. It might become clear that she is sneaking in quick work sessions on the job — doing a problem or two while waiting for customers, then getting distracted and coming back to it a few hours later when business is slow again. The teacher might use this information to suggest that Monica do her best to rearrange her study schedule. Perhaps she even experiments with spending less time on her schoolwork, but making sure that any time she does spend is uninterrupted. Rather than study every day in between job requirements, she might try completing all her work in two weekly library sessions.

Sample Students: Ross and Rachel

Finally, let’s compare two students. Ross’ performance is among the best in the class, while Rachel’s is somewhere in the middle. Their work patterns are fairly similar up until the middle of the course — they work consistently both during and after class. But around this time, Ross begins working a lot more. Perhaps this sudden flurry of work was self-motivated. Or perhaps the teacher sat down with both students to discuss work habits, and Ross took the talk seriously while Rachel didn’t. Here, again, we see how data can help guide intervention — but can’t in and of itself effect change or flawlessly predict performance.



Connecting the Dots

Teachers have been performing the types of nuanced analyses described above since the beginning of time. Is student A struggling because he doesn’t understand the material? Or is there something going on at home that’s distracting him? Or is he overwhelmed by his English course and unable to devote enough time to math? But an instructor with hundreds, or even dozens, of students will never have the bandwidth to perform this kind of in-depth analysis on every individual’s work patterns.

But if she did — what kind of information would she want to see? What questions would she use that information to help answer? This is what Knewton’s data scientists think about as they build the data models that produce inferred metrics like Active Time. Data can’t improve education on its own. It’s our hope that these metrics serve as one more tool in teachers’ toolboxes — helping them make more informed diagnoses and judgments, leading to interventions that address each student’s individual needs.