Knerd Story – David Simon, Baton Rouge Community College

David Simon
Baton Rouge Community College
Mathematics

Background

For a long time, I’d been thinking about making videos of myself teaching, but I never got around to it. The move to remote learning in spring 2020 was the motivation I needed to finally start. By the end of that first week of remote learning, I had a rig set up with my tripod, a selfie stick, and my phone, and I just started making videos and created a YouTube channel. The channel now has about 200 videos.

Why did you decide to use Knewton Alta?

As a college, we had decided to pilot different products in different courses. In the fall of 2019 I was piloting Knewton Alta, along with another competitor product. In the beginning, I really got into Knewton Alta. I just felt that it flowed better, was easier for students to latch on to, and linked up with Canvas much more easily. It was working great. Cut to March 2020 and I was starting my video project in the middle of the semester and working with multiple software platforms in different courses.
I quickly realized I didn’t want my videos to be connected to any particular textbook, so I decided not to use textbook problems in my videos. Instead, I used my own problems, which meant I didn’t have to follow along with textbook chapters and could let things unfold in my own way. I started incorporating Knewton, which I felt fit in with my plan in a more seamless way. I had links to OpenStax textbooks, and Knewton worked better with my videos. So I stuck with it.

What features of Knewton Alta do you like?

The fact that the homework is interactive and can pinpoint for students where they are is so important. Knewton Alta can tell them, “You need to practice this topic some more…” Instead of just being the typical “Do questions one through five and hopefully I can give you feedback.”
Knewton Alta can give students that individualized feedback. That’s what I need. And I know some of the other software does that too, but Knewton was simpler and it just worked out better.
This interactive stuff, this is the future. Especially with all the video conferencing technology and all the remote learning that’s happening. I don’t think education is ever going back to the way it was. Nobody’s going to let that happen.
I like that with Knewton Alta you never see a negative or a downside, because it doesn’t ever take points away. That’s one thing that helps motivate students too. They can lose confidence if they keep getting hammered for getting things wrong.
Knewton Alta’s adaptability is important. I tried a couple of other adaptive programs. A competitor product was also good, but the Knewton interface just seemed to flow better. The problem choices were a bit more varied. I felt like I could do more in it, and I could make some of my own problems in it. And there were just too many blockades with some of the other programs: they didn’t link with Canvas, or students had to log in somewhere else. Knewton Alta and I jibe like puzzle pieces, my tap dancing and its adaptability. And I just kept getting such good feedback from the students about it too.

What has been the impact of using Knewton Alta?

Well, things have been developing since Spring 2020. I kept piloting Knewton Alta at my school until I was able to show everyone how well it was working. I used to have a 40 or 50% pass rate. Now I’m up to a 70% pass rate.
This semester (Fall 2021) we actually went ahead and adopted it for the college algebra and trigonometry classes too. And it seems to be going really well. There are always pitfalls when lots of people are using things, because some people are good at the technology and some people aren’t. I’ve tried to help as much as I can, and then my Customer Success Manager for Knewton Alta fills in most of the gaps. And I’m hoping that it can keep going. I know sometimes people have a tendency to regress toward what they know and want to go back to solutions and partners we’ve used in the past. And I say no, we can’t do that.

Implementation Strategy

I teach a topic and assign a Knewton Alta homework on the topic (or sub-topic). I try to cut the homework up into manageable chunks of about half an hour each. I do this if the Knewton topic is too big. It also allows students to get quick feedback and encouragement. Then even if one weekly Canvas module has seven or eight assignments, students know each one will take about 30 minutes.
Many educators, myself included, would love to be able to help individual students. Back in the “little red schoolhouse” days you had individual attention. It would be great to be able to stand next to a student and let them know I’m helping them today, until they understand the material like the back of their hand. We just can’t do that. Every semester I’ve got upwards of 200 students at a small college.
We’re getting technology like Knewton Alta right when we need it—or maybe a little later than we need it.

And I don’t know if a lot of us realized that until something like Knewton Alta came along. But now we can figure out where the problems are, where students’ difficulties are, and try to smooth them out. And that’s what we’ve been needing.

What’s new in alta for Fall 2018

Ahh, the dawn of a new academic year. When the slate is wiped clean and we are given a new opportunity to help students achieve their goals.

There are new courses to be taught. Fresh faces in your classroom. Great new alta features just waiting to be used. With that in mind, we thought we’d provide a handy round-up of everything that’s new in alta for the Fall 2018 term.

If you have any questions, please reach out to us — we’ll have a knerd get in touch with you ASAP to walk you through what’s new and answer any questions you may have.

Now, on to the enhancements!

In-app support via chat

Ever wanted support to feel a little less like sending a message in a bottle and a little more like a two-way conversation with a friend? Check out alta’s new support chat feature (which we think represents the most advanced support infrastructure in the industry).

What does all of this add up to? More immediate responses, more proactive support, and an elegant support experience that’s on par with what you’ve come to expect from alta.

Courses and sections

Have you ever wished that you could manage multiple sections of your alta course — while keeping all the core elements consistent?

With our Courses and Sections update, you can create sections based on your original alta course. Alta will carry over the original course learning objectives, content, and settings to the new sections. (Instructors teaching the new sections may add coursework or modify due dates for homework, but don’t worry: the important stuff will stay the same.)

Courses and Sections is kind of a big deal, so we’ve dedicated an entire blog post to it. If you have questions or would like to learn more, we recommend checking it out.

Improved adaptivity in non-quantitative courses

After gathering feedback from instructors and analyzing how students performed using alta in their Economics courses, we recognized an opportunity to improve the student learning experience.

Over the summer, we released updates to how our adaptive engine measures student progress toward mastery in non-quantitative courses. We’ve also made the learning objectives in these courses more granular to allow us to help students gain proficiency with greater precision.

Desmos graphing questions for Math, Econ and Stats products

We’re always seeking new ways to present content and assessment in order to help students achieve mastery. That’s why we’re proud to announce that alta products in Math, Economics and Statistics will feature graphing questions from Desmos.

Desmos provides a powerful platform for presenting assessment questions in the form of a graph. The flexibility of Desmos allows us to deliver higher-order comprehension questions and provide better support for graphing questions.

Expect between 25% and 50% of the content within each alta product in these subject areas to feature Desmos.

…and content updates across the board

While we’re continually refining alta’s content to ensure it’s effective in improving learning outcomes, over the summer we completed a front-to-back sweep of our content to make sure everything is in tip-top shape.

Think of it as a little “summer cleaning,” if you will.

There are lots of things to love about alta, and chief among them is the fact that alta is always getting better. You can expect to hear more from us soon about the next set of exciting enhancements to alta.

In the meantime, we wish everyone the best of luck with your alta journey this semester!

The New Chalk: How Machine Learning Can Benefit Higher Education

Machine learning, AI and other algorithmic technologies have long promised to enhance the learning experience for college instructors and students.

But what can these technologies actually deliver? What’s required to implement them effectively? And how can they operate in a way that’s both equitable and transparent?

Andrew Jones, a data scientist here at Knewton, joined a panel discussion hosted by EdSurge this week in NYC that sought to answer some of these questions.

The panel, which included a group of educators and education technologists, covered a range of issues, including how machine learning technologies are perceived by students, specific areas where machine learning can make an impact on learning, and the barriers that must be overcome for this technology to be implemented successfully.

When asked to suggest the tough questions that instructors should ask before implementing machine learning technologies in their classroom, Andrew urged instructors to push for greater transparency into how a company’s algorithms work. “Asking what is being optimized for, and why, can give you a sense of [whether a tool] is focused on student outcomes, or whether it is about getting a prediction that’s right more often,” he said.

Toward the end of the session, the focus shifted to instructors themselves — and the role they will play in courses that increasingly feature machine learning technologies, such as virtual assistants.

Andrew underscored the central role of the instructor, saying: “I’d rather see machine learning reach the level of chalk.”

You can find a full recap of the event over on EdSurge.

Connecting our Knerds: Day of Knerdvocate Collaboration

Earlier this summer, Knewton hosted its first national Knerd Camp, bringing together adopters from across the country to discuss all things alta with our staff knerds.

Knewton’s “knerdvocates” played a key role in making the meeting a success. Knerdvocates are educators who use alta to drive student success in their own courses and help others do the same in theirs through peer coaching, best practices, and thought leadership.

During the 2-day Knerd Camp, instructors experienced a day in the life of a Knewton Knerd in New York City. Sharing Knewton’s vision of “putting achievement in reach for everyone,” attendees discussed their teaching challenges, offered advice, and collaborated with each other and staff Knerds on even better ways to help students using alta. They also got a behind-the-scenes view of the data science behind alta and enjoyed a preview of the newly enhanced interface with Knewton’s product developers, data scientists, and support team.

Instructor Shawn Shields commented, “I really learned a lot from this event in terms of how it works and hearing others’ experiences and best practices. I ended up with quite a few good ideas from others that I can modify and add to my course,” confirming how important it is to give educators opportunities for peer-to-peer coaching in a comfortable, positive setting.

Passion was also a recurring theme, with Knerdvocates inspired by the passion of the Knerd learning community. “I loved the opportunity to talk with the Knerds who were so passionate about what they do… the Knerds genuinely care about what they do. You can’t fake that kind of passion and dedication,” said instructor Melanie Yosko.

Knerd Camp was rounded out with a cruise on the Hudson to visit New York City’s famous landmarks and dinner at the suitably named tapas restaurant, alta.

Building on the success of our first national Knerd Camp, Knewton is planning to expand the program with a series of regional Knerd Camps for instructors who are interested in learning more about alta. Keep your eyes open for one in your area.

 

Knewton’s UX Research Practice Gears Up for Back-to-School

As back-to-school season approaches, Knewton is diligently working on powerful new feature releases and product updates. And Knewton’s User Experience Research (UXR) practice supports this work by incorporating instructor and student feedback through a host of research methods.

We vet draft experiences with users, identify issues, iterate, and validate potential solutions — often before a single line of code is written for a new feature release or product update.

Knewton UXR recently conducted research to inform an upcoming alta feature that allows course coordinators to create and manage multi-section courses. We wanted to first understand educators’ current practice, then swiftly iterate on and validate draft designs in light of user feedback, so that by the end of the process we would arrive at a useful and usable solution.

We approached research through:

  1. Focus Groups
  2. 1:1 Interviews
  3. Persona Development
  4. Rapid Iterative Testing and Evaluation

Focus Groups & 1:1 Interviews

Prior to initiating design work, we took a step back and conducted remote focus groups and 1:1 interviews to understand how coordinators across the country currently create multi-section courses. What does this process look like for them? Where do issues arise? How do Learning Management Systems come into play? This early research provided our cross-functional team with a deeper knowledge of users’ needs and goals.

Persona Development

We used information gleaned from early research sessions to create a course coordinator persona. User goals were defined here, giving teams common language to talk about relevant problems — and how to best solve them.

Rapid Iterative Testing and Evaluation

As the design team started building out a draft experience, UXR hosted 1:1 remote usability testing sessions with course coordinators (users and potential users) across the country. We screen-shared semi-functional draft designs, turned over control of the keyboard and mouse, and asked participants task-oriented and open-ended questions. Because stakeholders (design, product, engineering) were observing each session, we quickly identified design issues, iterated in-between sessions, and validated potential solutions with subsequent users.

What We Learned

What are some things we learned in our multi-section course research? Well… A LOT! But sometimes the most memorable findings are the ‘aha’ moments: the ones where we watch users go through a potential workflow, an imaginary lightbulb goes off for us, and we immediately see an easier way for users to accomplish a task. Designs are then revised and further validated.

One example of an ‘aha’ moment within our research involved ‘auto-save’ during educators’ process of building out coursework. Auto-save seems harmless enough, right? But employing auto-save within the complex task of building out coursework for a multi-section course didn’t seem to give users enough confidence that their work was indeed being saved. Designs were revised and the issue was alleviated.

Another compelling finding involved course initialization links — what instructors would need to click within a workflow to make the course section ‘start.’ Early draft designs did not seem to make enough distinction between this link and additional content on the same screen. Again, designs were revised to more overtly highlight where users should navigate to initialize the course.

Effectively Leveraging UXR for Educators

Using a multi-method research approach provided our cross-functional team with a solid understanding of user needs prior to any design work, and the flexibility to improve designs in-between research sessions.

Upon concluding our design research, we came away with an experience we’re confident is useful and usable, and can be put into development for our customers.


Celebrating National Higher Education Day at Knewton

Today, we’re excited to celebrate National Higher Education Day.

What is #NationalHigherEducationDay, you ask? Let us tell you!

Celebrated annually on June 6, it’s a day to champion the value of higher education — and to acknowledge all of the hard work that must be done to make sure everyone can share in it.

The value of a college education

College not only provides students with a life-changing experience that will broaden their perspective and deepen their understanding of the world around them, it’s also the surest path to a better life. Over the course of their careers, a college degree is worth $2.8 million.1

This is more than just some extra spending money in your bank account. It’s the type of financial security that allows you to pursue a career in a field you’re passionate about or move forward with a Major Life Decision.

Despite all of this, only 61% of college students think that a college offers a good value, according to a recent Student Monitor report.2

To understand why, we’d like to use National Higher Education Day as cause to look at three of higher ed’s biggest challenges — and think about how we can work together to solve them.

Improving student outcomes

A college education provides students with a number of benefits. But sometimes it’s helpful to remember that students will only be able to realize these benefits if they complete their courses and go on to graduate.

Unfortunately, fewer than 6 in 10 students who enter college go on to graduate.3 Fewer still will graduate on time.

Students from lower-income families experience the lowest graduation rates. Among college students born in the 1980s, the graduation rate for those from lower-income families is 11.8%. Among students from middle-income families, it is 32.5%. For students from higher-income families, it jumps to 60.1%.4

There are many reasons for higher education’s low graduation rates, but none is bigger than the fact that too many students arrive on campus unprepared for the academic rigor of college. 52% of 2-year college students require remediation in math;5 however, only around half the students enrolled in remedial math courses go on to complete them.6

As we see it, the biggest opportunity to improve on-time graduation rates is to help students who aren’t prepared for college — particularly in math — get up to speed quickly.

Making learning accessible to all

It’s often said that education is our society’s great equalizer. But what if not everyone has the same access to higher education?

11% of undergraduate students — nearly 2 million in total — have diagnosed disabilities.7 These students face a number of challenges, not the least of which is course materials that aren’t fully ADA compliant. And this figure doesn’t include students whose disabilities go undiagnosed in college, where students often have to self-report that they have a learning disability.

These challenges add up to make a big impact. About one-third of students with disabilities who are enrolled at a 4-year institution graduate within 8 years. At 2-year institutions, that number is slightly higher, at 41%.8

Improving the learning experience for students with disabilities is a complex issue. But if there’s one thing we can all agree on, it’s that course materials that are fully ADA compliant should become the norm.

Affordability

College provides an incredible value to students. But it’s still expensive.

$1.5 trillion in total U.S. student debt.9 An average debt of more than $39K for the Class of 2017.10 These numbers can be so big that they almost seem to lose meaning.

But for students, their impact is very real.

According to Student Monitor, financial strain accounts for 2 of students’ top 5 concerns.11 And, according to a survey released by researchers at Temple University and the Wisconsin HOPE Lab, 36% of students do not get enough to eat, and a similar number lack a secure place to live.12

We have a shared obligation to make college more affordable, without compromising outcomes or accessibility.

Putting achievement within reach with alta

We built alta, Knewton’s fully integrated adaptive learning courseware, with higher education’s biggest challenges in mind.

Alta combines Knewton’s adaptive learning technology with high-quality openly available content to help students achieve mastery. Alta is accessible to all learners: its technology, content and user experience are all WCAG 2.0 AA-level ADA compliant. At $44 for 2-year access, it’s also affordable.

Solving higher education’s biggest challenges won’t happen overnight, but if we are to reaffirm the value of college for all learners, we must never lose sight of these challenges.

What else needs to be done to improve higher education? What more can we be doing to help? Hit us up on social and tag your post with #NationalHigherEducationDay.

References

  1. Georgetown University: The College Payoff
  2. Student Monitor: Lifestyle & Media – Spring 2018
  3. National Student Clearinghouse Research Center: Signature 14 Completing College: A National View of Student Completion Rates – Fall 2011 Cohort
  4. University of Michigan: Growing Wealth Gaps in Education
  5. Complete College America: Data dashboard
  6. National Center for Education Statistics: Remedial Coursetaking at U.S. Public 2- and 4-Year Institutions
  7. National Center for Education Statistics: Fast facts
  8. National Center for Special Education Research: The Post-High School Outcomes of Young Adults With Disabilities up to 6 Years After High School
  9. Federal Reserve Bank of New York: Quarterly Report on Household Debt and Credit
  10. Data provided by Mark Kantrowitz to studentloanhero.com
  11. Student Monitor: Lifestyle & Media – Spring 2018
  12. Wisconsin HOPE Lab: Still Hungry and Homeless in College

How does Knewton’s Proficiency Model estimate student knowledge in alta?

Accurately estimating a student’s knowledge is one of the core challenges of adaptive learning.

By understanding what a student knows and doesn’t know, adaptive learning technology is able to deliver a learning experience that will help the student achieve mastery. Understanding the student knowledge state is also essential for delivering accurate, useful analytics to students and instructors.

We refer to our data-driven mathematical model for estimating a student’s knowledge state as Knewton’s Proficiency Model. This model lies at the core of our ability to deliver lasting learning experiences to students using alta.

How does our Proficiency Model estimate student knowledge? Answering that question begins by looking at its inputs, which include:

  • the student’s history of responses, both correct and incorrect;
  • the difficulty of each question the student answered;
  • how recently each response was given;
  • and the relationships between learning objectives in the Knowledge Graph.

The model’s outputs represent the student’s proficiencies in all of the learning objectives in the Knowledge Graph at a given point in time. So what’s in between the inputs and the outputs?

Knewton’s Proficiency Model

A basis in Item Response Theory

The foundation for our Proficiency Model is a well-known educational testing theory known as Item Response Theory (IRT).

One important aspect of IRT is that it benefits from network effects — that is, we learn more about the content and the students interacting with it as more people use the system. When a student answers a difficult question correctly, the model’s estimated proficiency for that student should be higher than it would be if the student had correctly answered an easy question. But how can we determine each question’s difficulty level? Only by observing how large numbers of diverse students performed when responding to those questions.

With this data in hand, we are able to infer student proficiency — or weakness — better and more efficiently, and deliver content that is targeted and effective.
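Knewton hasn’t published the exact functional form of its model, but as a rough illustration, the standard two-parameter logistic (2PL) IRT model captures the intuition above: the probability of a correct answer depends on the gap between a student’s proficiency and the question’s difficulty. A minimal sketch in Python (the names and parameter values are illustrative, not Knewton’s):

```python
import numpy as np

def p_correct(theta, difficulty, discrimination=1.0):
    """Standard 2PL IRT: probability that a student with proficiency
    `theta` answers a question of the given difficulty correctly."""
    return 1.0 / (1.0 + np.exp(-discrimination * (theta - difficulty)))

# A correct answer to a hard question is stronger evidence of high
# proficiency than a correct answer to an easy one:
print(p_correct(theta=0.0, difficulty=2.0))   # ~0.12: surprising if correct
print(p_correct(theta=0.0, difficulty=-1.0))  # ~0.73: expected if correct
```

In practice, the difficulty and discrimination parameters are themselves estimated from the responses of many students — exactly the network effect described above.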

Moving beyond the limits of IRT

Because IRT was designed for adaptive testing — an environment in which a student’s knowledge remains fixed — it does not meet all of the requirements of adaptive learning, an environment in which the student’s knowledge is continually changing. In a model based on IRT, a student’s older responses have the same impact on the student’s proficiency level as their more recent responses. While this is fine in a testing environment, in which students aren’t typically provided feedback or instruction, it becomes a problem in an adaptive learning environment.

In an adaptive learning environment, we inherently expect that students’ knowledge will change. As a result, we want to give more weight to recent responses than older ones — allowing for the possibility of an “Aha!” moment along the way.

To correct for the limitations of IRT, Knewton has built temporal models that weight a student’s recent responses more heavily than their older ones when determining proficiency, providing a more accurate and dynamic picture of the student’s knowledge state.
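Knewton’s temporal models are not public, but the core idea of down-weighting older evidence can be sketched with a simple exponential decay (the half-life, data format, and function below are assumptions for illustration):

```python
import numpy as np

def recency_weighted_accuracy(responses, half_life=10.0):
    """Toy recency weighting: `responses` is a list of (age, correct)
    pairs, where `age` counts interactions since the response was given.
    Recent responses get exponentially more weight than older ones."""
    ages = np.array([age for age, _ in responses], dtype=float)
    correct = np.array([1.0 if c else 0.0 for _, c in responses])
    weights = 0.5 ** (ages / half_life)  # weight halves every `half_life`
    return float(np.sum(weights * correct) / np.sum(weights))

# A student who struggled early but improved: old misses count for little,
# so the estimate (~0.81) sits well above the raw accuracy of 3/5 = 0.6,
# reflecting an "Aha!" moment along the way.
history = [(20, False), (15, False), (5, True), (1, True), (0, True)]
print(recency_weighted_accuracy(history))
```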

Accounting for relationships between learning objectives

Adaptive learning requires constant, granular assessment on multiple learning objectives embedded in the learning experience. However, traditional IRT also does not account for the relationships between learning objectives. As discussed above, these relationships are an important part of the Knewton Knowledge Graph.

To remedy this shortcoming of IRT, Knewton has developed a novel way to incorporate these relationships in a Bayesian modeling framework, allowing us to combine prior beliefs about proficiency on related topics with the evidence provided by the student’s responses. This leads to so-called proficiency propagation, or the flow of proficiency throughout the Knowledge Graph.

What does this look like in practice? If, in the Knowledge Graph below, a student is making progress toward the learning objective of “Solve word problems by subtracting two-digit numbers,” our Proficiency Model infers a high proficiency on that learning objective. The model also infers a high proficiency on the related learning objectives (“Subtract two-digit numbers” and “Subtract one-digit numbers”), even without direct evidence. The basic idea: If two learning objectives are related and a student masters one of them, there’s a good chance the student has also mastered the others.

A Knewton Knowledge Graph.
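The Bayesian machinery behind proficiency propagation isn’t spelled out here, but a drastically simplified sketch conveys how evidence flows along prerequisite edges (the graph, damping factor, and update rule below are illustrative assumptions, not Knewton’s model):

```python
# Prerequisite edges from the example above: prereq_of[B] lists the
# learning objectives that must be known before B.
prereq_of = {
    "solve word problems by subtracting two-digit numbers": ["subtract two-digit numbers"],
    "subtract two-digit numbers": ["subtract one-digit numbers"],
    "subtract one-digit numbers": [],
}

def propagate(direct_evidence, graph, damping=0.8):
    """Push evidence of mastery on one objective down to its
    prerequisites, attenuating it at each hop. A stand-in for the
    Bayesian updating described above."""
    proficiency = dict(direct_evidence)
    frontier = list(direct_evidence.items())
    while frontier:
        objective, p = frontier.pop()
        for prereq in graph.get(objective, []):
            inferred = damping * p
            if inferred > proficiency.get(prereq, 0.0):
                proficiency[prereq] = inferred
                frontier.append((prereq, inferred))
    return proficiency

# Direct evidence on the word-problem objective yields inferred
# proficiency on both subtraction prerequisites, with no direct evidence.
print(propagate({"solve word problems by subtracting two-digit numbers": 0.95}, prereq_of))
```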

The effectiveness of Knewton’s Proficiency Model

The many facets of the Proficiency Model – IRT-based network effects, temporal effects, and the Knowledge Graph structure – combine to produce a highly accurate picture of a student’s knowledge state. We use this picture to provide content that will increase that student’s level of proficiency. It’s also the basis of the actionable analytics we provide to students and instructors.

How effective is the Proficiency Model in helping students master learning objectives? In his post “Interpreting Knewton’s 2017 Student Mastery Results,” fellow Knerd Andrew D. Jones presents data that shows that Knewton’s Proficiency Model helps students achieve mastery — and that mastery, as determined by the Proficiency Model, makes a positive impact on students’ academic performance.

Interpreting Knewton’s 2017 Student Mastery Results

This post was developed with Illya Bomash, Knewton’s Managing Data Scientist.

Results. Efficacy. Outcomes.

Student success is the ultimate goal of learning technology. Despite this, there exists a startling lack of credible data available to instructors and administrators that speaks to the impact of ed-tech on learning and academic performance.

To provide instructors and administrators with greater transparency into the effectiveness of alta and the Knewton adaptive technology that powers it, we analyzed the platform results of students using alta. These results represent our effort to validate our measure of mastery (more on that to come) and provide instructors and administrators with much-needed transparency regarding the impact of alta on student achievement.

Here, we provide context and explanation that we hope will leave educators and those in the ed-tech community with a clearer picture of how we arrived at these results — and why they matter.

Our data set

The findings in this report are drawn from the results of 11,586 students who cumulatively completed more than 130,000 assignments and 17,000 quizzes in alta in 2017.

This data set includes all of alta’s 2017 spring and summer student interactions. Only cases in which the relevant calculations are impossible have been excluded — such as quiz scores for a course in which the instructor chose not to administer quizzes. So while these results aren’t from randomized, controlled trials, they do paint an accurate portrait of student performance across alta users, making use of as much of our student data as possible.

Why mastery?

Our adaptive technology is based on the premise that if a student masters the concepts tied to the learning objectives of their course, that student will succeed in the course and be prepared to succeed in future courses. It’s also based on the premise that Knewton’s mathematical model of student knowledge states — which we frequently refer to as Knewton’s proficiency model — can determine when a student has reached mastery.

This basis in mastery manifests itself in how students experience alta: Every assignment that a student encounters in alta is tied to learning objectives that have been selected by the instructor for their course. A student “completes” an alta assignment when our proficiency model calculates that the student has mastered all of the learning objectives covered in that assignment.

Our 2017 Mastery Results seek to clarify two things: the frequency with which students achieve mastery in alta, and the later performance of students who have (and have not) achieved mastery, as determined by our proficiency model.

Controlling for students’ initial ability level

In this analysis, we wanted to assess the impact of mastery across the full spectrum of student ability levels. To capture a sense of each student’s initial proficiency, we aggregated the first two questions each student answered across all of the concepts he or she encountered in the course. The percentage of those questions the student answered correctly provides a naive but reasonable estimate of how well the student knew the material entering the course.

We looked at the distribution of this score across all of our students, tagging each student’s history with a label corresponding to where they fell among all users.

Note: Knewton’s proficiency model neither uses this measure nor tags students with any kind of “ability label.” Our adaptive technology calculates a detailed, individualized portrait of each student’s proficiency levels across a wide range of concepts after each student interaction. But for the sake of this comparative impact analysis, we’ve chosen to use these distinctions as a tool to compare students of similar initial abilities.
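As a concrete sketch of the estimate described above (the data format and function name are ours, not Knewton’s), taking each student’s first two answers per concept and computing the fraction correct looks like this:

```python
from collections import defaultdict

def initial_ability(responses):
    """Naive initial-ability estimate: the fraction of correct answers
    among a student's first two responses on each concept.
    `responses` is a chronological list of (concept, correct) pairs."""
    seen = defaultdict(int)
    first_two = []
    for concept, correct in responses:
        if seen[concept] < 2:
            seen[concept] += 1
            first_two.append(correct)
    return sum(first_two) / len(first_two) if first_two else None

log = [("fractions", True), ("fractions", False), ("fractions", True),
       ("decimals", True), ("decimals", True)]
print(initial_ability(log))  # 0.75: the third "fractions" answer is ignored
```

Each student’s history can then be tagged with a label based on where this score falls in the distribution across all users, as described above.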

Our findings

Students of all ability levels achieved mastery with alta at high rates

Analyzing students’ assignment completion revealed that with alta, students achieve mastery at high rates. As seen in Figure 1, across all students, those working on an assignment in alta achieved mastery 87% of the time. Even among students who struggled to complete a particular assignment, 82% eventually reached mastery.

Achieving mastery with alta makes a positive impact on students’ academic performance

We know that with alta, students are highly likely to achieve mastery. But what is the impact of that mastery? When our model indicates that a student has mastered the material, how well does the student perform on future assignments, quizzes, and tests?

For any given level of initial ability, Knewton’s adaptive learning technology is designed to facilitate reaching mastery effectively for any student willing to put in the time and effort. To validate Knewton’s measure of mastery, we compared the performance of students who mastered prerequisite learning objectives (for adaptive assignments) and target learning objectives (for quizzes) through alta with students of similar initial ability who did not master these concepts.

Mastery improves the quiz scores for students of all ability levels

Figure 2 shows average Knewton quiz scores for students who did and did not reach mastery of the quiz learning objectives on prior adaptive assignments. Quiz takers who mastered at least ¾ of the quiz learning objectives through previous adaptive work went on to achieve substantially higher quiz scores than similarly skilled peers who mastered ¼ or fewer of the learning objectives.
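In other words, the comparison amounts to bucketing quiz takers by how much of the quiz’s prerequisite adaptive work they mastered, within groups of similar initial ability. A pandas-style sketch with entirely hypothetical records and column names:

```python
import pandas as pd

# Hypothetical records: one row per quiz attempt.
df = pd.DataFrame({
    "ability_group":     ["low", "low", "mid", "mid", "high", "high"],
    "frac_los_mastered": [0.20,  0.80,  0.10,  1.00,  0.25,   0.75],
    "quiz_score":        [55,    88,    48,    90,    60,     92],
})

# Compare attempts with >= 3/4 of quiz learning objectives mastered
# beforehand against attempts with <= 1/4, within each ability group.
df["mastery_bucket"] = None
df.loc[df["frac_los_mastered"] >= 0.75, "mastery_bucket"] = "mastered >= 3/4"
df.loc[df["frac_los_mastered"] <= 0.25, "mastery_bucket"] = "mastered <= 1/4"
print(df.dropna(subset=["mastery_bucket"])
        .groupby(["ability_group", "mastery_bucket"])["quiz_score"].mean())
```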

Mastery levels the playing field for struggling students

Putting in the work to reach mastery on the relevant adaptive assignments increased initially struggling students’ average quiz scores by 38 percentage points, boosting scores for these students above the scores of otherwise advanced students who skipped the adaptive work.

Mastery improves future platform performance

Students who master the learning objectives on earlier assignments also tend to perform better on later, more advanced assignments.

Assignment completion

As Figure 3 shows, controlling for overall student skill levels, students who mastered ¾ of the learning objectives prerequisite to any given assignment tended to complete the assignment at much higher rates than students who did not. This is the virtuous cycle of mastery: the more students master, the better prepared they are for future learning.

Work to completion

Mastery of an assignment’s learning objectives also saves students time. When students began an assignment after having mastered most of its prerequisites, they tended to require significantly fewer questions to complete it. For students who mastered at least ¾ of the prerequisites to any given adaptive assignment, completing the assignment took 30–45% fewer questions than for students who did not (see Figure 3). Mastery helps students of all abilities learn faster, and struggling students see the biggest gains: for these students, prerequisite mastery shortened the average subsequent assignment by more than 40%.

The road ahead

Any self-reported efficacy results will be met with a certain amount of scrutiny. While we’ve attempted to be as transparent as we can be about our data, we understand that some will question the validity of our data or our approach to presenting it.

It’s our hope that, if nothing else, the reporting of our results will inspire others in the ed-tech community to present their own with the same spirit of transparency. In many ways, these results are intended not as a definitive end-point but as the start of a more productive conversation about the impact of technology on learning outcomes.

Lastly, while our 2017 Student Mastery Results are encouraging, we know that they exist in a world that is constantly changing. The challenges in higher education are becoming greater and more complex. The student population is growing increasingly diverse. Our technology and our approach to learning are evolving.

This year, we plan to update these numbers periodically and provide the results of other analyses with the goal of providing greater transparency into the effectiveness of alta and deeper insights into how students learn.

View full mastery results

What are Knewton’s Knowledge Graphs?

Imagine you are tutoring a new student who is working on a homework assignment about solving word problems that require two-digit subtraction. The first question on the assignment is:

“Jessica wants to buy a book that costs $15 and she has $32 in her wallet. After she buys the book, how much money will she have left?”

Your student looks at the problem and looks back at you, terrified. He has no idea what to do. Clearly he hasn’t yet mastered the ability to solve word problems with two-digit subtraction. But what exactly is the problem? Is he struggling with the word problem itself, or is he missing crucial prerequisite knowledge as well?

Answering this question correctly requires a student to be able to perform the following steps:

  1. Translate a word problem about subtraction into a mathematical expression: Question text → 32–15 = ?
  2. Solve the two-digit subtraction problem: 32–15
  3. Solve one-digit subtraction problems (while performing the two-digit subtraction problem): 12–5, 2–1

From the above breakdown, we can see that this word problem requires a student to know three separate skills, or learning objectives. These learning objectives also happen to build on each other: a student can’t solve a two-digit subtraction problem without first knowing how to solve one-digit subtraction problems, and a student can’t solve word problems that require two-digit subtraction without knowing how to perform two-digit subtraction. We can represent the relationship between these learning objectives in the following way:

Figure 1: A tiny Knewton Knowledge Graph

In the picture above, we are representing each learning objective as an oval. An arrow pointing from oval A to oval B indicates that a student must know A in order to be successful in B — we describe this relationship by saying that “A is a prerequisite to B”. This simple framework forms the foundation of Knewton’s Knowledge Graphs, a key tool that helps Knewton provide adaptive assignments and analytics in alta.
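A Knowledge Graph of this kind can be represented as a simple adjacency list mapping each learning objective to its direct prerequisites. A minimal sketch following Figure 1 (the representation and names are ours, for illustration):

```python
# prereqs[B] lists the objectives A with an arrow A -> B,
# i.e. "A is a prerequisite to B".
prereqs = {
    "solve one-digit subtraction problems": [],
    "solve two-digit subtraction problems": ["solve one-digit subtraction problems"],
    "solve word problems with two-digit subtraction": ["solve two-digit subtraction problems"],
}

def all_prerequisites(objective, graph):
    """Collect every objective, direct or transitive, that a student
    must know in order to be successful in `objective`."""
    found = []
    stack = list(graph.get(objective, []))
    while stack:
        lo = stack.pop()
        if lo not in found:
            found.append(lo)
            stack.extend(graph.get(lo, []))
    return found

print(all_prerequisites("solve word problems with two-digit subtraction", prereqs))
# ['solve two-digit subtraction problems', 'solve one-digit subtraction problems']
```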

Extending the Example

The tiny Knowledge Graph above is a very simple example of how relationships between learning objectives can be represented. In many cases, a given learning objective can have multiple direct prerequisites. A simple example of a learning objective with two prerequisites can be found by looking at the learning objective requiring students to divide fractions. To divide fraction X by fraction Y, you multiply X times the reciprocal of Y. This means that in order to be able to divide fractions, you must already be able to (1) multiply fractions and (2) find reciprocals of fractions.

Figure 2: An example learning objective with more than one prerequisite

By connecting each learning objective to its prerequisites, we can create a large Knowledge Graph full of connections between our content. Learning objectives can be connected to each other even if they appear in different assignments, chapters, or courses. Below, we show an example section of a large Knowledge Graph, where different colored learning objectives can be thought of as different assignments, chapters, or course levels.

Figure 3: An example section of a large Knowledge Graph

The notion of representing content relationships in a graph structure is not new, and Knewton’s Knowledge Graphs build on previous work (Novak 1990; McAleese 1999; Doignon and Falmagne 1999; Hwang 2003). Knewton’s Knowledge Graphs are created by trained subject matter experts, who identify the learning objectives in a course, the relationships between these learning objectives, and the pieces of content that teach and assess each learning objective. This direct alignment of each piece of content to learning objectives allows Knewton to precisely diagnose which learning objectives a student has mastered, in addition to providing targeted instructional content when a student is struggling.

Powering Just-In-Time Remediation

Knewton’s Knowledge Graphs allow us to generate adaptive recommendations based on pedagogical criteria, such as those reviewed by Graesser et al. (2012), including frontier learning, building on prerequisites, and providing remediation. For example, let’s go back to our struggling student. It is possible that our student may not have the prerequisite knowledge necessary to succeed in solving word problems with two-digit subtraction. If he struggles when solving these word problems in an alta adaptive assignment, Knewton’s recommendation engine can diagnose his prerequisite knowledge by using the information contained in the Knowledge Graph and provide just-in-time remediation in the form of targeted instructional content and formative assessment on the prerequisite learning objective(s) that he is struggling with. As the student masters the prerequisites, Knewton can move him forward towards his ultimate goal of learning how to solve word problems with two-digit subtraction.
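The recommendation engine itself isn’t published, but the diagnosis step described above can be sketched as a walk over the Knowledge Graph that surfaces unmastered prerequisites, deepest first (the threshold and proficiency values below are illustrative assumptions):

```python
def remediation_targets(struggling_lo, prereqs, proficiency, threshold=0.7):
    """Return the unmastered prerequisites of `struggling_lo`, deepest
    first, so remediation can start where the gap actually begins."""
    targets = []
    for prereq in prereqs.get(struggling_lo, []):
        targets.extend(remediation_targets(prereq, prereqs, proficiency, threshold))
        if proficiency.get(prereq, 0.0) < threshold and prereq not in targets:
            targets.append(prereq)
    return targets

prereqs = {
    "solve word problems with two-digit subtraction": ["solve two-digit subtraction problems"],
    "solve two-digit subtraction problems": ["solve one-digit subtraction problems"],
    "solve one-digit subtraction problems": [],
}
proficiency = {"solve two-digit subtraction problems": 0.4,
               "solve one-digit subtraction problems": 0.9}
print(remediation_targets("solve word problems with two-digit subtraction",
                          prereqs, proficiency))
# ['solve two-digit subtraction problems']: remediate here, then return
# to the word problems.
```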

Since Knewton’s Knowledge Graphs are an abstract representation of relationships between learning objectives, prerequisite relationships can also be defined between learning objectives existing in different books, subject areas, or school years, enabling cross-domain adaptivity. For example, Knewton can recommend remediation on a math prerequisite that’s required for mastering a chemistry learning objective. As we’ll see in a future blog post, Knewton’s Knowledge Graphs are also a key input in the Knewton predictive analytics engine, enabling Knewton to estimate student mastery on each learning objective in a graph.

Sources

Doignon, J. P. and Falmagne, J. C. (1999). Knowledge Spaces. Springer.

Graesser, A. C., Conley, M. W., and Olney, A. (2012). Intelligent Tutoring Systems. APA Handbook of Educational Psychology. Washington, DC: American Psychological Association.

Hwang, G.-J. (2003). A Conceptual Map Model for Developing Intelligent Tutoring Systems. Computers & Education, 40(3):217–235.

McAleese, R. (1999). Concept Mapping: A Critical Review. Innovations in Education and Training International, 36(4):351–360.

Novak, J. D. (1990). Concept Mapping: A Useful Tool for Science Education. Journal of Research in Science Teaching, 27(10):937–949.

Flipped, tipped or traditional: Adaptive technology can support any blended learning model

Like many people, I funded my graduate school (and early teaching career) by bartending. At the end of any really long or otherwise challenging shift, I looked forward to drowning my sorrows with Waffle House coffee as I contemplated the complexities of their hash brown menu…smothered, covered, and diced went without saying, but then what? Capped? Peppered? Chunked?

If you’re not from the South (or you’re just not a fan of Waffle House, or hash browns), you’re probably feeling a little lost right now. Don’t worry—a quick Google search will clear things right up for you! (Trust me, by the time you get to a Waffle House, you’ll likely know exactly what you want.)

If, on the other hand, it’s the “flipped, tipped, or traditional” that has you wondering, we can dig deeper here for you. What are the differences between these three models of blended learning, and what role can adaptive tools play in each?

According to EDUCAUSE, the flipped classroom is “a pedagogical model in which the typical lecture and homework elements of a course are reversed.”

In other words, concepts or skills are introduced to students ahead of class time through a digital medium, and in-class time is spent working with, practicing, or applying their newly acquired knowledge or skills.

In this model, homework typically functions to:

  • prepare students for productive in-class time by giving them materials that introduce or develop key concepts and skills you’ll be working on in class;
  • provide visibility into students’ current knowledge or skillset ahead of introducing challenging material in class;
  • or both of the above.

Adaptive learning tools like Knewton can have a positive impact in this context because they guide students through material that’s coming up in class, offering lots of practice as well as an opportunity to demonstrate mastery so students feel more comfortable participating in class discussion or group activities.

Instructor dashboards and reports enable you to know ahead of time which concepts or skills your students struggle with as a group, so your instructional plan can be targeted to these learning objectives. Some instructors use the Knewton dashboard to build groups or facilitate other peer-to-peer learning opportunities between partners with complementary strengths and weaknesses, or to inform one-on-one instruction or meetings.

The traditional model, in the context of blended learning, refers to a pedagogy that utilizes homework in pretty much the opposite way. The concepts or skills students work on after class are those that were introduced or developed in the class immediately prior. In this model, homework can:

  • provide additional practice opportunity;
  • enable students to make use of and take greater ownership of new knowledge and skills;
  • serve to demonstrate mastery or understanding;
  • or, all of the above.

Adaptive learning tools can play a role here similar to their role in the flipped class, offering advantages for both teaching (instructor analytics let you know where your class stands as a whole and also see how each student is faring individually) and learning (lots of additional practice, instruction if needed, and opportunity to demonstrate understanding and build confidence). Non-adaptive assignments like quizzes or tests are commonly included to give closure and provide evidence of student learning that can be easily measured and assessed.

This brings us to the tipped model—the newest of the bunch, and not one you’ll have much luck Googling (at least that was true at the time of this draft!). But it’s a term that’s begun surfacing in conference paper titles and abstracts, and it’s floating around with some uncertainty in conversation.

Personally, I love it! In part because of the imagery, but mostly because it captures the interstitial nature of this model: its capacity for tilting between the flipped and the traditional.

In this model, you might assign homework that meets the same purposes for which you would assign homework in a flipped context (prepare students for the material, gain insight into students’ prior/current knowledge, etc.), and then use the instructor dashboard to help you decide on the best topics for a “mini-lecture” at the start of class and provide the focus for the day’s activities. Then, after class, you’re back to the traditional model—students return to the platform to take a quiz; you see the results in real time and can adapt accordingly.

Any way you slice it (or dice, smother, cover, cap, or pepper it), adaptive learning tools can add a lot to the experiences of both teaching and learning. I have no doubt that these tools would have made my early teaching life much easier—leaving the tough decision of the day to hash browns.

Editor’s Note: We’ve got some new tools on the market that fit any of these models. Take a spin for yourself!

Aimee Berger, Ph.D, is a solutions architect for Knewton. She travels around the country helping college instructors implement adaptive learning tools.