8 ways to set up your students for a successful year with Knewton Alta

Make sure you’re ready for whatever this semester throws your way. We’ve outlined our Knerds’ best tips for using Knewton Alta to overcome common challenges and empower you to do what you do best – teach!

Read our Implementation Guide

Whether in-person, online, or anything in between, Knewton Alta brings a personalized learning experience to any type of course design. Alta can be flexibly integrated in a variety of settings to meet the precise needs of instructors and students.

Download our implementation guide!

Keep our Quick Start Guide and Knerd Manual handy

Download our Quick Start Guide or our complete Knerd Manual for the latest comprehensive training on how to use Knewton Alta in your classroom.

Visit our Support Center

Do you have a quick question, or are you trying to troubleshoot an issue? Our Support Center provides dozens of articles on course set-up, registration, how to use our software effectively in your lessons, and other important topics.

Visit our Support Center!

Watch our video tutorials

Does it help to see something demonstrated before trying it yourself? Watch our video tutorials for in-depth explanations of Alta’s features.

Watch our videos!

Join our Knerd Studio

Our Knerd Studio lets you…

Plus, signing up enters you for a chance to win one of Wiley’s Stay the Course Grants, each of which provides $500 to help a student of your choice persist in their educational journey!

Read more about these offers… or join the Knerd Studio now!

Connect with our Peer Advisors

Our Peer Advisors are here to provide one-on-one advice on anything from instructional design and course development to Alta best practices, online or hybrid teaching strategies, and more.

You can connect with someone directly in your field to get discipline-specific ideas to make your course more relevant and intuitive, while establishing a better understanding of tactics and techniques to improve student outcomes.

Meet our advisors!

Refer a colleague, earn rewards

With our referral program in the Knerd Studio, you can earn up to 10,000 reward points, redeemable for gift cards and Knerd swag, just for introducing us to a colleague and starting the conversation!

Refer someone today!

Find new fashion through the Knerd Store

Looking to expand your wardrobe? This is your one-stop shop for Knerd gear, from socks and hats to a variety of our famous T-shirts.

Check out our collection!

Knewton launches altapass, an all-access pricing option, making alta even more affordable for students

With altapass, students can access multiple alta courses within a single subject area for $79.95; Knewton lowers price of a single-course alta subscription to $39.95

New pricing options available to students for Fall 2019

NEW YORK — Jan. 15, 2019 — Knewton, the world’s leader in AI-driven teaching and learning, today launched altapass, an all-access pricing offer for alta, the company’s adaptive learning courseware for U.S. higher education. With altapass, students can access multiple alta products across a single subject area for up to two years, and for unlimited use, for $79.95. Additionally, Knewton has reduced the price of a single-course alta subscription from $44 to $39.95.

By introducing the new pricing options, Knewton is making alta even more affordable and accessible, helping to put achievement within reach for the students who need it most.

Students wishing to purchase altapass for Fall 2019 may do so beginning Aug. 1 at Knewton.com. Students may still purchase access to a single alta course via monthly subscription for $9.95 per month.

At launch, altapass will be available across all 36 alta products in the following subject areas:

Knewton brings alta to scale with altapass

Knewton’s effort to make alta more accessible and affordable comes one year after the product’s successful introduction in the U.S. higher education market. Launched in January 2018, alta was used by instructors at more than 250 colleges and universities during the Fall 2018 term.

“It’s clear that we have something special with alta. Now, we’re making it even more affordable and accessible, so that the students who need alta the most can benefit from its impact on learning outcomes,” said Brian Kibby, CEO of Knewton. “We’ve turned the cost structure of our company into a competitive advantage — not just for Knewton, but for students looking for better results at an affordable price.”

“By keeping alta’s pricing simple and consistent across subject areas, we’re taking a lot of the mystery out of the cost of course materials for students. We’re also making alta more affordable for the high number of students who are using alta in more than one course in a single subject area,” said Heather Shelstad, Knewton’s VP of Marketing. “We’re giving students the power to decide which purchasing plan is right for them, and helping them save money no matter which option they choose.”

Knewton releases new insights into student usage, engagement and performance with alta

To provide fresh insight into how alta makes an impact on learning outcomes, Knewton’s data science team released a series of findings regarding student usage, engagement and performance during the Fall 2018 term. They include:

Knewton recently published the results of an independent study of alta’s effectiveness led by the Center for Research and Reform in Education at Johns Hopkins University. The study’s findings drew a link between alta and improved student performance across student ability levels, classrooms and institutions.

“Knewton is an outcomes company,” added Kibby. “While access and affordability represent a key part of alta’s value proposition, there’s nothing more important than its ability to deliver results for students and instructors. We’re going to keep challenging ourselves to set a new standard for transparency regarding those results.”

Johns Hopkins University analysis draws link between Knewton’s alta and improved student outcomes

Earlier this year, our data science team presented an efficacy analysis of alta, Knewton’s adaptive learning courseware for higher education. From our perspective, the results of our internal analysis strongly suggested a causal link between alta and improved student performance.

Because even the best-intentioned researchers can introduce unconscious biases when making analytical choices, we knew that our in-house analysis could only be part of the story of alta’s effectiveness. As a data science team made up largely of academics, we have a deep appreciation for the value of independent reproduction of scientific results.

To get a fresh, unbiased perspective on alta’s impact, we shared our fully anonymized Fall 2017 data with the Center for Research and Reform in Education at Johns Hopkins University (JHU). More specifically, we asked JHU to assess the impact of demonstrating concept proficiency by completing alta assignments — as well as alta usage in general — on student outcomes like quiz and test scores, future assignment completion and retention.

Key findings

To gain an understanding of the study’s key findings and conclusions, we invite you to read JHU’s complete analysis of alta’s impact on learning outcomes.

Knewton’s commitment to impact and transparency

These analyses — performed by different teams, in different ways, and across different time periods — represent both a fundamental piece of scientific research and a key element of our efforts toward transparency and continuous, data-driven improvement.

Now, the conversation around alta’s impact must expand to include things like direct feedback from instructors, student surveys, user research, and case studies from a variety of classroom settings. There’s a lot of work to be done!

While conducting a conversation that, by design, never ends is in some ways daunting, it keeps us connected to the experiences and results of our users. And from that perspective alone, this endeavor has been a valuable one.

The New Chalk: How Machine Learning Can Benefit Higher Education

Machine learning, AI and other algorithmic technologies have long promised to enhance the learning experience for college instructors and students.

But what can these technologies actually deliver? What’s required to implement them effectively? And how can they operate in a way that’s both equitable and transparent?

Andrew Jones, a data scientist here at Knewton, joined a panel discussion hosted by EdSurge this week in NYC that sought to answer some of these questions.

The panel, which included a group of educators and education technologists, covered a range of issues, including how machine learning technologies are perceived by students, specific areas where machine learning can make an impact on learning, and the barriers that must be overcome for this technology to be implemented successfully.

When asked to suggest the tough questions that instructors should ask before implementing machine learning technologies in their classroom, Andrew urged instructors to push for greater transparency into how a company’s algorithms work. “Asking what is being optimized for, and why, can give you a sense of [whether a tool] is focused on student outcomes, or whether it is about getting a prediction that’s right more often,” he said.

Toward the end of the session, the focus shifted to instructors themselves — and the role they will play in courses that increasingly feature machine learning technologies, such as virtual assistants.

Andrew underscored the central role of the instructor, saying: “I’d rather see machine learning reach the level of chalk.”

You can find a full recap of the event over on EdSurge.

Knewton’s UX Research Practice Gears Up for Back-to-School

As back-to-school season approaches, Knewton is diligently working on powerful new feature releases and product updates. And Knewton’s User Experience Research (UXR) practice supports this work by incorporating instructor and student feedback through a host of research methods.

We vet draft experiences with users, identify issues, iterate, and validate potential solutions, often before a single line of code is written for a new feature release or product update.

Knewton UXR recently conducted research to inform an upcoming alta feature that allows course coordinators to create and manage multi-section courses. We wanted first to understand educators’ current practice, and then to swiftly iterate on and validate draft designs in light of user feedback, so that by the end of the process we could arrive at a useful and usable solution.

We approached research through:

  1. Focus Groups
  2. 1:1 Interviews
  3. Persona Development
  4. Rapid Iterative Testing and Evaluation

Focus Groups & 1:1 Interviews

Prior to initiating design work, we took a step back and conducted remote focus groups and 1:1 interviews to understand how coordinators across the country currently create multi-section courses. What does this process look like for them? Where do issues arise? How do Learning Management Systems come into play? This early research provided our cross-functional team with a deeper knowledge of users’ needs and goals.

Persona Development

We used information gleaned from early research sessions to create a course coordinator persona. User goals were defined here, giving teams common language to talk about relevant problems — and how to best solve them.

Rapid Iterative Testing and Evaluation

As the design team started building out a draft experience, UXR hosted 1:1 remote usability testing sessions with course coordinators (users and potential users) across the country. We screen-shared semi-functional draft designs, turned over control of the keyboard and mouse, and asked participants task-oriented and open-ended questions. Because stakeholders (design, product, engineering) were observing each session, we quickly identified design issues, iterated in-between sessions, and validated potential solutions with subsequent users.

What We Learned

What are some things we learned in our multi-section course research? Well… a LOT! But sometimes the most memorable findings are the ‘aha’ moments — the ones where we watch users go through a potential workflow and an imaginary lightbulb goes off for us.

We immediately consider an easier way for users to accomplish a task. Designs are revised and further validated.

One example of an ‘aha’ moment within our research involved ‘auto-save’ during educators’ process of building out coursework. Auto-save seems harmless enough, right? But employing auto-save within the complex task of building out coursework for a multi-section course didn’t seem to give users enough confidence that their work was indeed being saved. Designs were revised and the issue was alleviated.

Another compelling finding involved course initialization links — what instructors would need to click within a workflow to make the course section ‘start.’ Early draft designs did not make enough of a distinction between this link and other content on the same screen. Again, designs were revised to more overtly highlight where users should navigate to initialize the course.

Effectively Leveraging UXR for Educators

Using a multi-method research approach provided our cross-functional team with a solid understanding of user needs prior to any design work, and the flexibility to improve designs in-between research sessions.

Upon concluding our design research, we came away with an experience we’re confident is useful and usable, and that can now be put into development for our customers.


Celebrating National Higher Education Day at Knewton

Today, we’re excited to celebrate National Higher Education Day.

What is #NationalHigherEducationDay, you ask? Let us tell you!

Celebrated annually on June 6, it’s a day to champion the value of higher education — and to acknowledge all of the hard work that must be done to make sure everyone can share in it.


The value of a college education

College not only provides students with a life-changing experience that broadens their perspective and deepens their understanding of the world around them, it’s also the surest path to a better life. Over the course of a career, a college degree is worth $2.8 million.1

This is more than just some extra spending money in your bank account. It’s the type of financial security that allows you to pursue a career in a field you’re passionate about or move forward with a Major Life Decision.

Despite all of this, only 61% of college students think that college offers a good value, according to a recent Student Monitor report.2

To understand why, we’d like to use National Higher Education Day as cause to look at three of higher ed’s biggest challenges — and think about how we can work together to solve them.

Improving student outcomes

A college education provides students with a number of benefits. But sometimes it’s helpful to remember that students will only be able to realize these benefits if they complete their courses and go on to graduate.

Unfortunately, fewer than 6 in 10 students who enter college go on to graduate.3 Fewer still graduate on time.

Poorer students experience the lowest graduation rates. Among college students born in the 1980s, the graduation rate for those from lower-income families is 11.8%. For students from middle-income families, the graduation rate is 32.5%. For students from higher-income families, it jumps to 60.1%.4

There are many reasons for higher education’s low graduation rates, but none is bigger than the fact that too many students arrive on campus unprepared for the academic rigor of college. 52% of 2-year college students require remediation in math,5 yet only around half of the students enrolled in remedial math courses go on to complete them.6

As we see it, the biggest opportunity to improve on-time graduation rates is to help students who aren’t prepared for college — particularly in math — get up to speed quickly.

Making learning accessible to all

It’s often said that education is our society’s great equalizer. But what if not everyone has the same access to higher education?

11% of undergraduate students — nearly 2 million in total — have diagnosed disabilities.7 These students face a number of challenges, not the least of which is course materials that aren’t fully ADA compliant. And this figure doesn’t include students whose disabilities go undiagnosed in college, where students often have to self-report that they are learning disabled.

These challenges add up to make a big impact. About one-third of students with disabilities who are enrolled at a 4-year institution graduate within 8 years. At 2-year institutions, that number is slightly higher, at 41%.8

Improving the learning experience for students with disabilities is a complex issue. But if there’s one thing we can all agree on, it’s that course materials that are fully ADA compliant should become the norm.

Affordability

College provides an incredible value to students. But it’s still expensive.

$1.5 trillion in total U.S. student debt.9 An average debt of more than $39K for the Class of 2017.10 These numbers can be so big that they almost seem to lose meaning.

But for students, their impact is very real.

According to Student Monitor, financial strain accounts for 2 of students’ top 5 concerns.11 And, according to a survey released by researchers at Temple University and the Wisconsin HOPE Lab, 36% of students do not get enough to eat, and a similar number lack a secure place to live.12

We have a shared obligation to make college more affordable, without compromising outcomes or accessibility.

Putting achievement within reach with alta

We built alta, Knewton’s fully integrated adaptive learning courseware, with higher education’s biggest challenges in mind.

Alta combines Knewton’s adaptive learning technology with high-quality, openly available content to help students achieve mastery. Alta is accessible to all learners: its technology, content and user experience are all WCAG 2.0 AA-level ADA compliant. At $44 for 2-year access, it’s also affordable.

Solving higher education’s biggest challenges won’t happen overnight, but if we are to reaffirm the value of college for all learners, we must never lose sight of them.

What else needs to be done to improve higher education? What more can we be doing to help? Hit us up on social and tag your post with #NationalHigherEducationDay.

References

  1. Georgetown University: The College Payoff
  2. Student Monitor: Lifestyle & Media – Spring 2018
  3. National Student Clearinghouse Research Center: Signature 14 Completing College: A National View of Student Completion Rates – Fall 2011 Cohort
  4. University of Michigan: Growing Wealth Gaps in Education
  5. Complete College America: Data dashboard
  6. National Center for Education Statistics: Remedial Coursetaking at U.S. Public 2- and 4-Year Institutions
  7. National Center for Education Statistics: Fast facts
  8. National Center for Special Education Research: The Post-High School Outcomes of Young Adults With Disabilities up to 6 Years After High School
  9. Federal Reserve Bank of New York: Quarterly Report on Household Debt and Credit
  10. Data provided by Mark Kantrowitz to studentloanhero.com
  11. Student Monitor: Lifestyle & Media – Spring 2018
  12. Wisconsin HOPE Lab: Still Hungry and Homeless in College

How does Knewton’s Proficiency Model estimate student knowledge in alta?

Accurately estimating a student’s knowledge is one of the core challenges of adaptive learning.

By understanding what a student knows and doesn’t know, adaptive learning technology is able to deliver a learning experience that will help the student achieve mastery. Understanding the student knowledge state is also essential for delivering accurate, useful analytics to students and instructors.

We refer to our data-driven mathematical model for estimating a student’s knowledge state as Knewton’s Proficiency Model. This model lies at the core of our ability to deliver lasting learning experiences to students using alta.

How does our Proficiency Model estimate student knowledge? Answering that question begins by looking at its inputs, which include:

The model’s outputs represent the student’s proficiencies in all of the learning objectives in the Knowledge Graph at a given point in time. So what’s in between the inputs and the outputs?

Knewton’s Proficiency Model

A basis in Item Response Theory

The foundation for our Proficiency Model is a well-known educational testing theory known as Item Response Theory (IRT).

One important aspect of IRT is that it benefits from network effects — that is, we learn more about the content and the students interacting with it as more people use the system. When a student answers a difficult question correctly, the model’s estimated proficiency for that student should be higher than it would be if the student had correctly answered an easy question. But how can we determine each question’s difficulty level? Only by observing how large numbers of diverse students performed when responding to those questions.

With this data in hand, we are able to better and more efficiently infer student proficiency — or weakness — and deliver content that is targeted and effective.
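
To make this concrete, here is a minimal sketch of an IRT-style proficiency estimate under the Rasch (1PL) model. It assumes item difficulties have already been calibrated from the pooled responses of many students, and it adds a standard-normal prior to keep the estimate finite; the function names, learning rate, and prior are our illustrative choices, not Knewton’s production model.

    import math

    def p_correct(theta, difficulty):
        """Rasch (1PL) model: probability that a student at proficiency
        `theta` answers an item of the given difficulty correctly."""
        return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

    def estimate_proficiency(responses, lr=0.1, steps=500):
        """MAP estimate of proficiency under a standard-normal prior.
        `responses` is a list of (difficulty, correct) pairs, with item
        difficulties assumed pre-calibrated from many students' data."""
        theta = 0.0
        for _ in range(steps):
            # Gradient of the log-posterior: likelihood term minus the
            # pull of the N(0, 1) prior.
            grad = sum(c - p_correct(theta, d) for d, c in responses) - theta
            theta += lr * grad
        return theta

    # One correct answer on a hard item (difficulty +1.5) raises the
    # estimate more than one correct answer on an easy item (difficulty -1.5).
    print(round(estimate_proficiency([(1.5, 1)]), 2))   # ~0.69
    print(round(estimate_proficiency([(-1.5, 1)]), 2))  # ~0.16

The network effect lives in the difficulty parameters: the more students respond to an item, the better calibrated its difficulty becomes, and the more informative each new response is.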

Moving beyond the limits of IRT

Because IRT was designed for adaptive testing — a learning environment in which a student’s knowledge remains fixed — it does not meet all of the requirements of adaptive learning, an environment in which the student’s knowledge is continually changing. In a model based on IRT, a student’s older responses make the same impact on the student’s proficiency level as their more recent responses. While this is fine in a testing environment, in which students aren’t typically provided feedback or instruction, it becomes a problem in an adaptive learning environment.

In an adaptive learning environment, we inherently expect that students’ knowledge will change. As a result, we want to give more weight to recent responses than older ones — allowing for the possibility of an “Aha!” moment along the way.

To correct for the limitations of IRT, Knewton has built temporal models that weight a student’s recent responses more heavily than their older ones when determining proficiency, providing a more accurate and dynamic picture of the student’s knowledge state.
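
As a sketch of that idea, the estimate below weights each response by an exponential decay in recency. The half-life value, the Rasch form, and the prior are carried over from the previous sketch as illustrative assumptions; Knewton’s production temporal models are richer than this.

    import math

    def p_correct(theta, difficulty):
        """Rasch (1PL) probability of a correct response."""
        return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

    def estimate_proficiency_temporal(responses, half_life=5.0, lr=0.1, steps=500):
        """Recency-weighted MAP estimate of proficiency. `responses` is
        ordered oldest to newest as (difficulty, correct) pairs; a response
        k interactions old gets weight 0.5 ** (k / half_life)."""
        n = len(responses)
        weights = [0.5 ** ((n - 1 - i) / half_life) for i in range(n)]
        theta = 0.0
        for _ in range(steps):
            grad = sum(w * (c - p_correct(theta, d))
                       for w, (d, c) in zip(weights, responses)) - theta
            theta += lr * grad
        return theta

    # Ten early misses followed by five recent correct answers on items of
    # difficulty 0: the recency-weighted estimate comes out positive
    # (about +0.2), capturing the "Aha!" moment, whereas an unweighted
    # estimate of the same history would still be negative.
    history = [(0.0, 0)] * 10 + [(0.0, 1)] * 5
    print(round(estimate_proficiency_temporal(history), 2))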

Accounting for relationships between learning objectives

Adaptive learning requires constant, granular assessment on multiple learning objectives embedded in the learning experience. However, traditional IRT also does not account for the relationships between learning objectives. As discussed above, these relationships are an important part of the Knewton Knowledge Graph.

To remedy this shortcoming of IRT, Knewton has developed a novel way to incorporate these relationships in a Bayesian modeling framework, allowing us to combine prior beliefs about proficiency on related topics with the evidence provided by the student’s responses. This leads to so-called proficiency propagation, or the flow of proficiency throughout the Knowledge Graph.

What does this look like in practice? If, in the Knowledge Graph below, a student is making progress toward the learning objective of “Solve word problems by subtracting two-digit numbers,” our Proficiency Model infers a high proficiency on that learning objective. The model also infers a high proficiency on the related learning objectives (“Subtract two-digit numbers” and “Subtract one-digit numbers”), even without direct evidence. The basic idea: If two learning objectives are related and a student masters one of them, there’s a good chance the student has also mastered the others.

A Knewton Knowledge Graph.
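
As a toy illustration of proficiency propagation on a graph like the one above, the sketch below nudges each related learning objective’s estimate toward the one with direct evidence, with the effect decaying at each hop. The mixing weight and decay are invented for illustration; Knewton’s actual update is the Bayesian model described above.

    # Edges point from a prerequisite to the learning objective built on it.
    GRAPH = {
        "subtract one-digit numbers": ["subtract two-digit numbers"],
        "subtract two-digit numbers":
            ["solve word problems by subtracting two-digit numbers"],
    }

    def neighbors(node):
        """Related learning objectives: direct prerequisites and dependents."""
        related = set(GRAPH.get(node, []))
        related |= {pre for pre, deps in GRAPH.items() if node in deps}
        return related

    def propagate(prof, source, strength=0.5, visited=None):
        """Flow belief outward from `source`, moving each related
        objective's estimate partway toward the source's and halving
        the effect at each hop."""
        visited = visited or {source}
        for lo in neighbors(source) - visited:
            prof[lo] += strength * (prof[source] - prof[lo])
            visited.add(lo)
            propagate(prof, lo, strength * 0.5, visited)
        return prof

    prof = {
        "subtract one-digit numbers": 0.2,
        "subtract two-digit numbers": 0.2,
        "solve word problems by subtracting two-digit numbers": 0.9,  # evidence
    }
    propagate(prof, "solve word problems by subtracting two-digit numbers")
    print(prof)  # both subtraction objectives rise above 0.2 without direct evidence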

The effectiveness of Knewton’s Proficiency Model

The many facets of the Proficiency Model – IRT-based network effects, temporal effects, and the Knowledge Graph structure – combine to produce a highly accurate picture of a student’s knowledge state. We use this picture to provide content that will increase that student’s level of proficiency. It’s also the basis of the actionable analytics we provide to students and instructors.

How effective is the Proficiency Model in helping students master learning objectives? In his post “Interpreting Knewton’s 2017 Student Mastery Results,” fellow Knerd Andrew D. Jones presents data showing that Knewton’s Proficiency Model helps students achieve mastery — and that mastery, as determined by the Proficiency Model, makes a positive impact on students’ academic performance.

Interpreting Knewton’s 2017 Student Mastery Results

This post was developed with Illya Bomash, Knewton’s Managing Data Scientist.

Results. Efficacy. Outcomes.

Student success is the ultimate goal of learning technology. Despite this, there exists a startling lack of credible data available to instructors and administrators that speaks to the impact of ed-tech on learning and academic performance.

To provide instructors and administrators with greater transparency into the effectiveness of alta and the Knewton adaptive technology that powers it, we analyzed the platform results of students using alta. These results represent our effort to validate our measure of mastery (more on that to come) and provide instructors and administrators with much-needed transparency regarding the impact of alta on student achievement.

Here, we provide context and explanation that we hope will leave educators and those in the ed-tech community with a clearer picture of how we arrived at these results — and why they matter.

Our data set

The findings in this report are drawn from the results of 11,586 students who cumulatively completed more than 130,000 assignments and 17,000 quizzes in alta in 2017.

This data set includes all of alta’s 2017 spring and summer student interactions. Only cases in which the relevant calculations are impossible have been excluded — such as quiz scores for a course in which the instructor chose not to administer quizzes. So while these results aren’t from randomized, controlled trials, they do paint an accurate portrait of student performance across alta users, making use of as much of our student data as possible.

Why mastery?

Our adaptive technology is based on the premise that if a student masters the concepts tied to the learning objectives of their course, that student will succeed in the course and be prepared to succeed in future courses. It’s also based on the premise that Knewton’s mathematical model of student knowledge states — which we frequently refer to as Knewton’s proficiency model — can determine when a student has reached mastery.

This basis in mastery manifests itself in how students experience alta: Every assignment that a student encounters in alta is tied to learning objectives that have been selected by the instructor for their course. A student “completes” an alta assignment when our proficiency model calculates that a student has mastered all of the learning objectives covered in that assignment.
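
In code terms, the completion rule is a mastery check over the assignment’s learning objectives. A minimal sketch, with an invented threshold (Knewton doesn’t publish its mastery cutoff):

    def assignment_complete(proficiency, assignment_los, threshold=0.95):
        """An assignment is complete once the proficiency model judges
        every learning objective it covers to be mastered. The 0.95
        threshold is a hypothetical placeholder."""
        return all(proficiency.get(lo, 0.0) >= threshold
                   for lo in assignment_los)

    prof = {"multiply fractions": 0.97, "divide fractions": 0.88}
    print(assignment_complete(prof, ["multiply fractions",
                                     "divide fractions"]))  # False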

Our 2017 Mastery Results seek to clarify two things: the frequency with which students achieve mastery in alta, and the later performance of students who have (and have not) achieved mastery, as determined by our proficiency model.

Controlling for students’ initial ability level

In this analysis, we wanted to assess the impact of mastery across the full spectrum of student ability levels. To capture a sense of each student’s initial proficiency, we aggregated the first two questions each student answered across all of the concepts he or she encountered in the course. The percentage of those questions the student answered correctly provides a naive but reasonable estimate of how well the student knew the material entering the course.

We looked at the distribution of this score across all of our students, tagging each student’s history with a label corresponding to where they fell among all users.

Note: Knewton’s proficiency model neither uses this measure nor tags students with any kind of “ability label.” Our adaptive technology calculates a detailed, individualized portrait of each student’s proficiency levels across a wide range of concepts after each student interaction. But for the sake of this comparative impact analysis, we’ve chosen to use these distinctions as a tool to compare students of similar initial abilities.
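
For readers who want the measure spelled out, here is a minimal sketch of the initial-ability computation, assuming interaction logs stored as chronological (student, concept, correct) tuples; the data layout and names are ours, not Knewton’s.

    from collections import defaultdict

    def initial_ability_scores(interactions):
        """Fraction of each student's first two questions per concept
        answered correctly. `interactions` is a chronological list of
        (student, concept, correct) tuples."""
        early = defaultdict(list)             # (student, concept) -> first answers
        for student, concept, correct in interactions:
            answers = early[(student, concept)]
            if len(answers) < 2:              # keep only the first two per concept
                answers.append(correct)
        totals = defaultdict(lambda: [0, 0])  # student -> [correct, answered]
        for (student, _), answers in early.items():
            totals[student][0] += sum(answers)
            totals[student][1] += len(answers)
        return {s: c / n for s, (c, n) in totals.items()}

    log = [("ana", "fractions", 1), ("ana", "fractions", 0),
           ("ana", "fractions", 1), ("ana", "decimals", 1)]
    print(initial_ability_scores(log))  # {'ana': 0.666...}: 2 of 3 early answers correct

Each student’s score would then be bucketed by where it falls in the distribution across all users, as described above.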

Our findings

Students of all ability levels achieved mastery with alta at high rates

Analyzing students’ assignment completion revealed that with alta, students achieve mastery at high rates. As seen in Figure 1, students working on an assignment in alta achieved mastery 87% of the time. Even among students who struggled to complete a particular assignment, 82% eventually reached mastery.

Achieving mastery with alta makes a positive impact on students’ academic performance

We know that with alta, students are highly likely to achieve mastery. But what is the impact of that mastery? When our model indicates that a student has mastered the material, how well does the student perform on future assignments, quizzes, and tests?

For any given level of initial ability, Knewton’s adaptive learning technology is designed to facilitate reaching mastery effectively for any student willing to put in the time and effort. To validate Knewton’s measure of mastery, we compared the performance of students who mastered prerequisite learning objectives (for adaptive assignments) and target learning objectives (for quizzes) through alta with students of similar initial ability who did not master these concepts.

Mastery improves the quiz scores for students of all ability levels

Figure 2 shows average Knewton quiz scores for students who did/did not reach mastery of the quiz learning objectives on prior adaptive assignments. Quiz takers who mastered at least ¾ of the quiz learning objectives through previous adaptive work went on to achieve substantially higher quiz scores than similarly-skilled peers mastering ¼ or fewer of the learning objectives.

Mastery levels the playing field for struggling students

Putting in the work to reach mastery on the relevant adaptive assignments increased initially struggling students’ average quiz scores by 38 percentage points, boosting scores for these students above the scores of otherwise advanced students who skipped the adaptive work.

Mastery improves future platform performance

Students who master the learning objectives on earlier assignments also tend to perform better on later, more advanced assignments.

Assignment completion

As Figure 3 shows, controlling for overall student skill levels, students who mastered ¾ of the learning objectives prerequisite to any given assignment tended to complete the assignment at much higher rates than students who did not. This is the virtuous cycle of mastery: the more students master, the better prepared they are for future learning.

Work to completion

Mastery of an assignment’s learning objectives also saves students time. When students began an assignment after having mastered most of its prerequisites, they tended to require significantly fewer questions to complete it. For students who mastered at least ¾ of the prerequisites to any given adaptive assignment, completing the assignment took 30-45% fewer questions than for students who did not (see Figure 3). Mastery helps students of all abilities learn faster, and struggling students see the biggest gains: for these students, prerequisite mastery shortened the average subsequent assignment by more than 40%.

The road ahead

Any self-reported efficacy results will be met with a certain amount of scrutiny. While we’ve attempted to be as transparent as we can be about our data, we understand that some will question the validity of our data or our approach to presenting it.

It’s our hope that, if nothing else, the reporting of our results will inspire others in the ed-tech community to present their own with the same spirit of transparency. In many ways, these results are intended not as a definitive end-point but as the start of a more productive conversation about the impact of technology on learning outcomes.

Lastly, while our 2017 Student Mastery Results are encouraging, we know that they exist in a world that is constantly changing. The challenges in higher education are becoming greater and more complex. The student population is growing increasingly diverse. Our technology and our approach to learning are evolving.

This year, we plan to update these numbers periodically and provide the results of other analyses with the goal of providing greater transparency into the effectiveness of alta and deeper insights into how students learn.

View full mastery results

What are Knewton’s Knowledge Graphs?

Imagine you are tutoring a new student who is working on a homework assignment about solving word problems that require two-digit subtraction. The first question on the assignment is:

“Jessica wants to buy a book that costs $15 and she has $32 in her wallet. After she buys the book, how much money will she have left?”

Your student looks at the problem and looks back at you, terrified. He has no idea what to do. Clearly he hasn’t yet mastered the ability to solve word problems with two-digit subtraction. But what exactly is the problem? Is he struggling with the word problem itself, or is he missing crucial prerequisite knowledge as well?

Answering this question correctly requires a student to be able to perform the following steps:

  1. Translate a word problem about subtraction into a mathematical expression: Question text → 32–15 = ?
  2. Solve the two-digit subtraction problem: 32–15
  3. Solve one-digit subtraction problems (while performing the two-digit subtraction problem): 12–5, 2–1

From the above breakdown, we can see that this word problem requires a student to know three separate skills, or learning objectives. These learning objectives also happen to build on each other: a student can’t solve a two-digit subtraction problem without first knowing how to solve one-digit subtraction problems, and a student can’t solve word problems that require two-digit subtraction without knowing how to perform two-digit subtraction. We can represent the relationship between these learning objectives in the following way:

Figure 1: A tiny Knewton Knowledge Graph

In the picture above, we are representing each learning objective as an oval. An arrow pointing from oval A to oval B indicates that a student must know A in order to be successful in B — we describe this relationship by saying that “A is a prerequisite to B”. This simple framework forms the foundation of Knewton’s Knowledge Graphs, a key tool that helps Knewton provide adaptive assignments and analytics in alta.

Extending the Example

The tiny Knowledge Graph above is a very simple example of how relationships between learning objectives can be represented. In many cases, a given learning objective can have multiple direct prerequisites. A simple example of a learning objective with two prerequisites is the learning objective requiring students to divide fractions. To divide fraction X by fraction Y, you multiply X times the reciprocal of Y. This means that in order to be able to divide fractions, you must already be able to (1) multiply fractions and (2) find reciprocals of fractions.

Figure 2: An example learning objective with more than one prerequisite
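
One natural encoding of these examples is a mapping from each learning objective to the list of its direct prerequisites, so an objective like dividing fractions simply lists two entries. This dict-of-lists sketch in Python is our illustration, not Knewton’s internal format:

    # Each learning objective maps to its direct prerequisites.
    KNOWLEDGE_GRAPH = {
        "subtract one-digit numbers": [],
        "subtract two-digit numbers": ["subtract one-digit numbers"],
        "solve word problems with two-digit subtraction":
            ["subtract two-digit numbers"],
        "multiply fractions": [],
        "find reciprocals of fractions": [],
        "divide fractions": ["multiply fractions",
                             "find reciprocals of fractions"],
    }

    print(KNOWLEDGE_GRAPH["divide fractions"])  # two direct prerequisites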

By connecting each learning objective to its prerequisites, we can create a large Knowledge Graph full of connections between our content. Learning objectives can be connected to each other even if they appear in different assignments, chapters, or courses. Below, we show an example section of a large Knowledge Graph, where different colored learning objectives can be thought of as different assignments, chapters, or course levels.

Figure 3: An example section of a large knowledge graph

The notion of representing content relationships in a graph structure is not new, and Knewton’s Knowledge Graphs build on previous work (Novak 1990; McAleese 1999; Doignon and Falmagne 1999; Hwang 2003). Knewton’s Knowledge Graphs are created by trained subject matter experts, who identify the learning objectives in a course, the relationships between these learning objectives, and the pieces of content that teach and assess each learning objective. This direct alignment of each piece of content to learning objectives allows Knewton to precisely diagnose which learning objectives a student has mastered, in addition to providing targeted instructional content when a student is struggling.

Powering Just-In-Time Remediation

Knewton’s Knowledge Graphs allow us to generate adaptive recommendations based on pedagogical criteria, such as those reviewed by Graesser et al. (2012), including frontier learning, building on prerequisites, and providing remediation. For example, let’s go back to our struggling student, who may not have the prerequisite knowledge necessary to succeed in solving word problems with two-digit subtraction. If he struggles when solving these word problems in an alta adaptive assignment, Knewton’s recommendation engine can diagnose his prerequisite knowledge by using the information contained in the Knowledge Graph and provide just-in-time remediation in the form of targeted instructional content and formative assessment on the prerequisite learning objective(s) that he is struggling with. As the student masters the prerequisites, Knewton can move him forward toward his ultimate goal of learning how to solve word problems with two-digit subtraction.
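
A sketch of such a prerequisite diagnosis, reusing the dict-of-lists encoding from the earlier sketch (the recursive walk and function name are our illustration, not Knewton’s recommendation engine):

    GRAPH = {
        "subtract one-digit numbers": [],
        "subtract two-digit numbers": ["subtract one-digit numbers"],
        "solve word problems with two-digit subtraction":
            ["subtract two-digit numbers"],
    }

    def remediation_targets(graph, goal, mastered):
        """Walk prerequisite edges backward from a struggling learning
        objective and return the deepest unmastered prerequisites, i.e.
        where just-in-time remediation should begin."""
        missing = [p for p in graph.get(goal, []) if p not in mastered]
        if not missing:
            return [goal]  # no gaps below: work on the goal itself
        targets = []
        for p in missing:
            targets.extend(remediation_targets(graph, p, mastered))
        return targets

    # A student stuck on the word problems with no demonstrated subtraction
    # mastery is routed all the way back to one-digit subtraction first.
    print(remediation_targets(GRAPH,
                              "solve word problems with two-digit subtraction",
                              mastered=set()))  # ['subtract one-digit numbers']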

Since Knewton’s Knowledge Graphs are an abstract representation of relationships between learning objectives, prerequisite relationships can also be defined between learning objectives existing in different books, subject areas, or school years, enabling cross-domain adaptivity. For example, Knewton can recommend remediation on a math prerequisite that’s required for mastering a chemistry learning objective. As we’ll see in a future blog post, Knewton’s Knowledge Graphs are also a key input in the Knewton predictive analytics engine, enabling Knewton to estimate student mastery on each learning objective in a graph.

Sources

Doignon, J. P. and Falmagne, J. C. (1999). Knowledge Spaces. Springer.

Graesser, A. C., Conley, M. W., and Olney, A. (2012). Intelligent Tutoring Systems. APA Handbook of Educational Psychology. Washington, DC: American Psychological Association.

Hwang, G.-J. (2003). A Conceptual Map Model for Developing Intelligent Tutoring Systems. Computers & Education, 40(3):217–235.

McAleese, R. (1999). Concept Mapping: A Critical Review. Innovations in Education and Training International, 36(4):351–360.

Novak, J. D. (1990). Concept Mapping: A Useful Tool for Science Education. Journal of Research in Science Teaching, 27(10):937–949.

Putting achievement within reach through the corequisite model for redesign

For decades, educators and policymakers have been looking for ways to remedy the epidemic of incoming freshmen who require extra preparation for college-level coursework yet end up languishing in courses that don’t earn them college credit.

The number of students placed into non-credit bearing “prerequisite” courses who fail to ever enter — let alone pass — a credit-bearing course is staggering. Ninety-six percent of colleges enrolled students who required remediation during the 2014-2015 academic year, and more than 200 schools placed more than half of their incoming students into at least one remedial course.

But fewer than one in four students in remediation at 2-year colleges ever make it to a credit-bearing course. This comes at a cost to taxpayers of $7 billion annually.

To address this challenge, colleges and universities have increasingly turned to “redesign,” a shorthand term for the process by which they reconceive instruction for an entire course area to improve student outcomes and cut down on costs for students.

There are many configurations of redesign. While all have led to some level of student gains, the corequisite model has produced results that merit closer attention.

Corequisites: An overview

The corequisite model dispenses with prerequisite courses and replaces them with college-level courses that include “just-in-time” support for students who require it. By providing extra support within the framework of a college-level course, the corequisite model promises to accelerate students’ progress and increase their chances of success.

In Georgia, a state that had used a traditional, prerequisite model, only 21% of developmental-level students went on to complete the related, college-level math course. After transitioning to corequisites, that number leapt to 64%. The results were even more dramatic in Tennessee, where the number of students requiring remediation in math who went on to complete a credit-bearing course exploded, going from 12% to 63%.

More states are dipping their toes into corequisite waters. This past June, Texas Governor Greg Abbott mandated that the state’s public colleges and universities must enroll 75% of their developmental-level students in a corequisite course by 2020. In the parlance of the tech community, that’s a big lift.

Personalizing corequisite instruction with Knewton’s adaptive technology

For states and institutions seeking to give students who require extra support the skills they need while keeping them on pace to earn their degree on time, the corequisite model of redesign shows promise. But still, corequisites present a challenge: providing personalized instruction to students whose skill levels may vary greatly at the start of the course.

Looking to power their corequisite redesign efforts with adaptive technology, colleges are making Knewton a key part of their corequisite courses.

Knewton, which provides all students with an adaptive, personalized path to mastery and offers “just in time” support when needed, is a perfect fit for corequisite courses, which must achieve dual goals: providing developmental-level students with prerequisite skills while helping all students achieve the learning objectives of a college-level course.

And at $44 for two years of access, Knewton’s alta is in line with one of the goals of corequisites: making college more affordable.

While Knewton fits perfectly within any college-level course that includes both well-prepared and underprepared students, we’ve created corequisite versions of some courses to reflect their unique structure, which varies based on approach. Our corequisite courses support a blended approach with “just in time” support, a targeted approach with developmental math review available to be assigned at the beginning of every chapter, and a compressed approach that includes four weeks of developmental math and eight weeks of college-level material.

Looking ahead

New approaches to redesign and corequisites are constantly emerging. Because of how our content and technology are built, we’re ready to help institutions and instructors seize opportunities to help students succeed by quickly designing solutions that meet their needs.

And because corequisites are rapidly expanding into new course areas, we’re constantly adding to our roster of courses that include developmental support. By 2018, we will offer corequisite versions of the following courses:

We’re excited by Knewton’s ability to support the corequisite model of redesign and to bring you the results of Knewton implementations in corequisite courses. In the meantime, we’ll be working hard to put achievement in reach for all learners by making more corequisite courses available.