
Earlier this year, our data science team presented an efficacy analysis of alta, Knewton’s adaptive learning courseware for higher education. From our perspective, the results of that internal analysis strongly suggested a causal link between alta and improved student performance.

Because even the best-intentioned researchers can introduce unconscious biases when making analytical choices, we knew that our in-house analysis could only be part of the story of alta’s effectiveness. As a data science team composed largely of academics, we have a deep appreciation for the value of independent reproduction of scientific results.

To get a fresh, unbiased perspective on alta’s impact, we shared our fully anonymized Fall 2017 data with the Center for Research and Reform in Education at Johns Hopkins University (JHU). Specifically, we asked JHU to assess the impact of demonstrating concept proficiency by completing alta assignments — as well as alta usage in general — on student outcomes such as quiz and test scores, future assignment completion, and retention.

Key findings

For a full account of the study’s key findings and conclusions, we invite you to read JHU’s complete analysis of alta’s impact on learning outcomes.

Knewton’s commitment to impact and transparency

These analyses — performed by different teams, in different ways, and across different time periods — are both a fundamental piece of scientific research and a key part of our efforts toward transparency and continuous, data-driven improvement.

Now, the conversation around alta’s impact must expand to include things like direct feedback from instructors, student surveys, user research, and case studies from a variety of classroom settings. There’s a lot of work to be done!

While sustaining a conversation that, by design, never ends is in some ways daunting, it keeps us connected to the experiences and results of our users. And from that perspective alone, this endeavor has been a valuable one.