In most classrooms, every student gets the same assignments, which a teacher has usually planned out months or weeks in advance. Students in classrooms that use Knewton-powered adaptive products have a totally different experience: we are able to figure out what each student in a class knows, and what she’s struggling with. Given this information, and the goals she’s working towards in a given class, Knewton recommends the best activities for that individual student to work on next, in real time. So in a class of 30 students, every student might be concentrating on different types of questions at various levels of difficulty while working toward the same goal.

It’s understandably difficult for most teachers, students, and parents to picture what that might look like in practice.

To show what Knewton is actually doing, let’s look at a few interactive animations that demonstrate the power of Knewton adaptive learning. These visualizations represent real students using a Knewton-powered course.

For the first example, let’s take a look at three real elementary-school students in the same class. Student privacy is important to us, so we don’t know these students’ names; however, to make it easier to talk about them, let’s call the student on the left Amy, the student in the middle Bill, and the student on the right, Chad.

All three students are working toward the same goals: interpreting multiplication equations (green), multiplying by 1-digit numbers (yellow), multiplying by 2-digit numbers (blue), and solving multiplication word problems (pink).

In order to understand these four concepts, it’s also relevant to know some prerequisite topics (shown in grey).

For instance, students need to understand “multiplying by 1-digit numbers” in order to learn how to “multiply by 2-digit numbers.”

The video below shows how each student progresses through the course material.

When you press the play button, boxes will appear under each student’s graph. Each box represents a different question that the student answered. The color of the box tells you what concept that question was from. Green check marks represent questions that were answered correctly on the first try, while red x’s represent questions that were answered incorrectly on the first try. Use the back and forward buttons to skip through the students’ histories.

Notice how Knewton recommends the same first three questions to all three students? That’s because this is their first time using a Knewton-powered product and we’re trying to learn a little bit about their current level of knowledge.

Let’s dig a bit into why Knewton makes the recommendations it does for each individual student. On the third question, for example, we see that Bill is struggling with multiplying 2-digit numbers. While Amy and Chad move on to a new topic (interpreting multiplication equations), Bill keeps working on multiplying 2-digit numbers. That’s differentiation in action!

On the fourth question, Knewton notices that Chad seems to be struggling with interpreting multiplication equations, so he’ll continue to work on questions related to that concept, while Amy continues through the assignment.

Notice how Knewton guides Bill and Chad to keep working on the concept that they’re struggling with until they start to understand and get questions right, eventually mastering that concept.

Knewton’s differentiated instruction lets students focus on the concepts that they are struggling with, rather than bogging them down with busywork on topics on which they have already demonstrated understanding. In addition to supporting students who are falling behind with targeted remediation, Knewton also gives advanced students the opportunity to move forward at their own pace — Amy is a great example of this.

In the second example, we illustrate how Knewton takes a student’s past performance into account when providing recommendations. Again, for privacy reasons we don’t know these students’ names or where they’re from, but for this example, we’ll call the student on the left Mary and the student on the right Joe.

Like the students in the first example, Mary and Joe are also working to master multiplication word problems. However, whereas all three of the students from Example 1 were in the same class, Mary and Joe are in different classes. On a previous day, Mary was assigned lessons that covered the three prerequisite concepts. Her responses to those questions are shown as grey boxes. This is Joe’s first interaction with Knewton, so he has no prerequisite work.

When both students start working on this goal, Mary immediately begins with the goal concepts. Joe, meanwhile, begins with the prerequisites, because Knewton first verifies that he is proficient in them before moving on to the goal concepts. Based on what Knewton already knows about Mary and Joe, it recommends prerequisites to Joe but not to Mary — the system knows she has already mastered those earlier concepts!

Even though Mary misses more questions than Joe on this assignment, she is able to finish it faster because of her earlier prerequisite work. Joe gets extra help because he hasn’t learned the prerequisites yet.

In this way, Knewton again helps students who have demonstrated proficiency to finish their work quickly — so they have more time to learn new things. We also make sure that students who are new to the material feel comfortable with prerequisite concepts, ensuring that they are primed to excel at their goal.

These two examples are just a snapshot of the way Knewton continuously targets and differentiates instruction to meet every student’s needs. With the help of Knewton-powered products, teachers can ensure that every student works on material that supports in-class instruction and helps them move forward toward learning goals.

Further analysis shows that Knewton-powered adaptive assignments for struggling students narrow the gap between them and high-performing students on subsequent assignments. Closing this gap is one of the biggest challenges instructors can face in the classroom.

In “Reducing the Gap: How Adaptive Follow-Ups Help Struggling Students,” Hillary Green-Lerman and Kevin Wilson of Knewton looked at 48,202 students who used an online homework tool for college-level science textbooks in the spring of 2014. Beyond ordinary homework assignments, students who didn’t show mastery of the concepts they were learning received adaptive follow-up assignments powered by Knewton. These adaptive assignments present a personalized sequence of questions designed to address each student’s individual strengths and weaknesses.

Our research team found that students who were assigned an adaptive follow-up after struggling on a first assignment showed improvements of between four and 12 percentage points on subsequent assignments relative to their classmates who did not have adaptive follow-up assignments.

Students with lower scores have more room to improve than high-performing ones, so Green-Lerman and Wilson corrected for initial differences in grade distributions between the higher- and lower-performing students. With this correction in place, they still saw an average improvement of three percentage points, and as much as eight points:
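One common way to make this kind of correction (a sketch of the general technique, not necessarily the method Green-Lerman and Wilson used) is to stratify: compare students with and without adaptive follow-ups only against classmates in the same initial-score band. A minimal Python sketch with entirely hypothetical data:

```python
from collections import defaultdict

# Hypothetical records: (initial_score_band, had_adaptive_followup, next_assignment_score)
records = [
    ("50-60", True, 68), ("50-60", False, 60),
    ("50-60", True, 66), ("50-60", False, 62),
    ("60-70", True, 74), ("60-70", False, 70),
]

# Accumulate (total score, count) per (band, group).
sums = defaultdict(lambda: [0, 0])
for band, adaptive, score in records:
    sums[(band, adaptive)][0] += score
    sums[(band, adaptive)][1] += 1

def avg(band, adaptive):
    total, n = sums[(band, adaptive)]
    return total / n

# Within each band, the adaptive-vs-standard gap on the next assignment.
gaps = {band: avg(band, True) - avg(band, False)
        for band in sorted({b for b, _, _ in records})}
```

Because each comparison stays inside one band, a group that simply started with lower scores can no longer inflate the apparent improvement.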

The online homework tool discussed in this study makes use of only a small portion of what Knewton can do to improve learning outcomes. Knewton’s research team continues to validate the efficacy of adaptive learning, and plans to continue to share its findings.

To read the full study, sign up below to download “Reducing the Gap.”

Santillana has named the product A2O, a play on the concept of *aprendizaje líquido*, or liquid learning: the idea that students can flow through their lessons in their own way. A2O adapts in real time to address each student’s needs and helps teachers see exactly where students need support. Analytics give teachers unprecedented visibility into the learning process.

Teachers participating in the pilot over the next six months will be incorporating A2O into their algebra curriculum. Students will use A2O in the classroom and at home. The pilot covers four to six weeks of secondary-level algebra, and Santillana hopes to expand to other topics in math and other subjects.

Throughout the product development process, the Santillana team has received invaluable insights and feedback from teachers in A2O’s target market. We’re incredibly proud of the work Santillana has done and eager to see the product in the hands of as many students and teachers as possible. Interested teachers can still sign up for the pilot.

With over 28 million students in 22 countries, Santillana has an unparalleled understanding of Spanish and Latin American education. Still, teacher and student feedback will be invaluable when Santillana expands its adaptive learning offering.

In October 2014, Knewton announced a partnership with Santillana, the leading educational publisher in Spain and Latin America, to develop adaptive learning products for the Spanish-speaking world. Santillana is currently launching a pilot of a secondary-level algebra product aimed at 12- and 13-year-old students and their teachers.

The Knewton data science team went back to look at those nearly 13,000 elementary school students to examine how the runup to a holiday affects student performance.

Because of the Friday effect, the best comparison for Thanksgiving week scores is to how the same students did on all other Mondays, Tuesdays, and Wednesdays that year. For non-holiday weeks, their average score during the beginning of the week was 74 percent. Before Thanksgiving, it was 67 percent.

That’s an even larger drop in performance than the Friday effect.

Now let’s see whether students show a similar slump before a shorter holiday. Is there a comparable drop on the Fridays before a long weekend?

The chart below shows average student scores on Fridays before a two-day weekend as well as on the Fridays before Martin Luther King Jr. Day and Memorial Day.

There’s almost no difference between a normal Friday and the one before Martin Luther King Jr. Day. However, there is a significant drop in average score before Memorial Day.

The final hypothesis we tested was whether students’ scores changed before Winter Break. For each weekday, the average score for the rest of the school year is in blue and the average score for the week before Winter Break is in green.

The fall-off gets worse each day. By the Friday before Winter Break, students are scoring more than 15 percentage points lower than they would on an ordinary Friday — or about 20 points lower than on the average weekday.

Having observed billions of interactions involving millions of students, Knewton can test how timing affects student performance. In order to design a reliable study, the data science team focused on a single product to guarantee a uniform population: similar ages, coursework, and context. So we chose a supplemental online tool for learning elementary school math in the United States. In the 2014–15 school year, 12,695 students used this tool for learning math.

When you look at how many math problems they did each day, you can see the rhythms of the week and the school year:

We can also see the rhythms of the school day. The next chart shows, hour by hour, how many questions students answered on an average weekday. The dark line represents the mean number of questions answered, and the lighter areas represent the standard error of the mean — i.e., how much that number tends to vary.

Most of the extra math work happened during the school day, between 8 a.m. and 3 p.m., with a substantial tail of activity until 9 p.m.
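The mean and the standard-error band in a chart like this are simple to compute. A minimal sketch using Python’s standard library, with made-up question counts for a single hour across five weekdays:

```python
import math
import statistics

# Hypothetical counts of questions answered during the 10 a.m. hour
# on five different weekdays.
counts_10am = [120, 135, 128, 110, 142]

# Center of the dark line for this hour.
mean = statistics.mean(counts_10am)

# Standard error of the mean: sample standard deviation / sqrt(n).
# This is the half-width of the lighter band around the line.
sem = statistics.stdev(counts_10am) / math.sqrt(len(counts_10am))
```

Repeating this for every hour of the day yields the full curve and its uncertainty band.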

That rhythm isn’t the same each day: On Fridays, for instance, almost no one used the math program after 3 p.m.:

Now that you know a little bit about our dataset, let’s come back to our original question. When students did these online math problems, how often were their answers correct, and did their percentage of correct answers vary from day to day? The chart below shows a simple average of students’ scores on each weekday.

These students scored an average of about five percentage points lower on Fridays. (Notice that the y-axis begins at 64 percentage points rather than at zero.)

This chart, however, doesn’t say whether the *same* students scored better on Mondays and worse on Fridays. Or is one set of students working on Mondays and scoring well, while a different set of students works on Fridays and scores poorly? In the following animation, each colored line represents the usage patterns of a randomly selected student:

Our initial assumption was that students would use computers at the same time each week. In fact the data show that students tended to go online once a week, but on a different day each week. Their varied schedules offer a convenient way to explore whether the observed “Friday effect” can be explained by differences in the students, or by the day of the week.

So we looked at the students who, over the course of the year, did these math problems at least once on each weekday:

The results of this analysis are similar to what we saw already: Students didn’t perform as well on Fridays.

One final way to verify the measurement is to compute the average score for each student who had activity on all five weekdays and compare that to her average score for a particular weekday. For instance, a student might get an overall average score of 70 percent, but on Wednesdays, her average score might be 72 percent. That means that, on Wednesdays, she scores 2 percentage points higher than her average across all weekdays. Let’s call it her *normalized score difference*.
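The normalized score difference is straightforward to compute from a response log. Here is a minimal Python sketch with a hypothetical two-student dataset (the record layout and values are illustrative only):

```python
from collections import defaultdict

# Hypothetical response log: (student_id, weekday, correct) — one row per question.
responses = [
    (1, "Mon", 1), (1, "Tue", 1), (1, "Wed", 1), (1, "Thu", 0), (1, "Fri", 0),
    (2, "Mon", 1), (2, "Tue", 0), (2, "Wed", 1), (2, "Thu", 1), (2, "Fri", 0),
]

# Keep only students with activity on all five weekdays.
days_seen = defaultdict(set)
for student, day, _ in responses:
    days_seen[student].add(day)
eligible = {s for s, days in days_seen.items() if len(days) == 5}

# Collect answers per student and per (student, weekday).
by_student = defaultdict(list)
by_student_day = defaultdict(list)
for student, day, correct in responses:
    if student in eligible:
        by_student[student].append(correct)
        by_student_day[(student, day)].append(correct)

def mean(xs):
    return sum(xs) / len(xs)

# Normalized score difference: weekday average minus the student's overall
# average, then averaged across students for each weekday.
weekday_diffs = defaultdict(list)
for (student, day), answers in by_student_day.items():
    weekday_diffs[day].append(mean(answers) - mean(by_student[student]))

friday_effect = {day: mean(diffs) for day, diffs in weekday_diffs.items()}
```

Because each student is compared against her own average, differences between strong and weak students cancel out, isolating the effect of the weekday itself.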

Here is the average of normalized score differences for all the students:

Something is happening on Fridays. Students scored somewhere between 4 and 5 percentage points lower than on other weekdays. We don’t know exactly why, however.

We were startled by the strength of the Friday effect and the clarity of the data. While these results hold true for these particular students, preliminary analysis of several other groups shows that the Friday effect may be more widespread. So doing work on a Thursday instead of a Friday could be the difference between an A- and a B+.

It’s never easy to show the efficacy of a given set of instructional materials, since many other factors contribute to how students learn, from how the teacher runs the classroom to what they had for breakfast. Beyond that, the impact of Knewton can be difficult to isolate, since Knewton adaptive learning technology is only one component of any of the digital learning products we power.

As Knewton powers more products and serves more personalized recommendations, however, our data scientists can identify more examples that show how Knewton improves student outcomes.

In “The Improvement Index: Evaluating Academic Gains in College Students Using Adaptive Lessons,” Illya Bomash and Chris Kish of Knewton looked at interactions involving approximately 288,000 students. They found that students performed better on average in college-level science courses with Knewton-powered adaptive assignments than in those without.

During the summer and fall semesters of 2013, these students studied college-level physics, biology, chemistry, and anatomy and physiology. Their textbooks came with a Knewton-powered online homework tool.

Knewton didn’t power the entire tool. Instructors assigned homework as usual for the students to complete online. On top of that, however, instructors had the option of enabling Knewton-powered adaptive follow-up assignments for students whose homework didn’t show mastery of the scientific concepts.

During these semesters, about 6,400 courses used the online homework tool. About a quarter of those courses offered Knewton-powered adaptive assignments. The rest didn’t.

Bomash and Kish compared these two groups. Their analysis found that students performed better in courses with Knewton-powered adaptive assignments:

The improvement increased with greater use of adaptive assignments, peaking at an average score difference of four percentage points.

They explored several other possible explanations for the difference — perhaps the better performing group were better prepared, or more motivated because they didn’t want extra homework — and ruled those out in their analysis. What’s more, results from courses in 2014 show consistent improvement in scores:

In courses that relied most on Knewton-powered assignments, students spent less than 25 minutes per week on average doing this additional coursework.

To read the full study, sign up below to download “The Improvement Index.”

A child’s education is composed of many short, singular experiences, which can sometimes be hard to connect. This is a reality with which educators (and learners) are very familiar — there’s so much to cover, and only so much student attention to go around.

A middle school science syllabus may call for only two to three lessons per week. Add in some homework assignments, and you’ve got just a few bursts of student focus over seven days. Here’s a visualization of a typical student’s work patterns — this student has science lessons on Tuesdays and Thursdays, and does a bit of homework each weekend:

And that two-week activity pattern is typical throughout the semester:

Teachers know that a certain portion of each lesson must be focused on refreshing last week’s material, to bridge the gaps between work sessions.

In addition to bridging temporal gaps, educators also need to bridge conceptual gaps. For a few weeks, concepts may generally build on each other; in other lessons, brand new topics will be introduced; and in still other lessons, new ideas will draw on the material from two months ago. That’s a lot to keep track of.

Bridging all those gaps is what good teachers do, at the micro scale in the classroom in the moment, and at the macro scale through lesson plans and curriculum design.

Ideally, the tools students and teachers use would also help bridge those gaps. Educators face a dual challenge: they must 1) make the most of those brief spikes in student focus, and 2) help students make sense of all those disparate experiences.

Knewton has partnered with Muzzy Lane, a leading educational game producer, to explore how the combination of adaptivity and games can help address this challenge. By creating games that engage and inspire students, Muzzy Lane helps teachers make the most of those bursts of student focus. Meanwhile, Knewton provides tools that enable students and teachers to make sense of all those disparate learning experiences over time and across concepts. Knewton brings each student’s learning tendencies, as well as past and current performance, together to provide a unified picture of that student’s current mental state.

In the course of playing educational games, students reveal a lot of information about their learning tendencies and knowledge. But typically, this information is trapped within the game — instructors may have access to summaries of student performance, but that’s it. By combining rich game content with Knewton technology, Muzzy Lane and Knewton give instructors access to more detailed information about student performance. A digital product might include a few educational games, as well as “traditional” instructional and assessment material (think multiple-choice quizzes, essays graded by instructors, video lessons, text instructional content). Knewton can integrate information from a student’s experience in the game along with all the results from other course assessments and activities — meaning teachers get a much fuller and more accurate picture of student performance.

In Feed the Fox, a game prototype developed by Muzzy Lane and Knewton, students construct food webs in various biomes to learn about species classification, the environmental pressures different biomes present, and how organisms are related in an ecosystem. Each student action in the game gives Knewton a hint about one, or perhaps several, of her conceptual understandings (or misunderstandings).

Feed the Fox, and games like it, are intended to be part of a larger digital curriculum, also drawing on many different kinds of content, ranging from traditional instructional text and assessment questions to video and interactive activities. Knewton has the same access to the data generated inside the game as it does to students’ responses to assessment items and interactions with instructional content in other parts of the course.

As a student plays the game, her activity is communicated to Knewton. When she moves a species card onto the board, for example, she demonstrates her current understanding of species classification (consumer vs. producer) and the boreal forest biome (i.e., what types of species thrive in the conditions of this particular biome).

That’s really cool — with a single action, Knewton gets multiple pieces of assessment information.
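Here’s a rough sketch of what such a multi-concept event might look like as data. The structure and every field name below are hypothetical, invented for illustration — not Knewton’s actual API:

```python
# Illustrative event payload for a single in-game action. All field names
# are hypothetical — a sketch of the idea, not Knewton's actual schema.
event = {
    "student_id": "abc123",
    "action": "place_species_card",
    "details": {
        "species": "snowshoe hare",
        "placed_as": "consumer",
        "biome": "boreal forest",
    },
    # One action yields evidence about several concepts at once.
    "evidence": [
        {"concept": "species_classification", "correct": True},
        {"concept": "boreal_forest_biome", "correct": True},
    ],
}

# Each evidence entry can update the student's proficiency on its concept.
concepts_touched = [e["concept"] for e in event["evidence"]]
```

The key idea is the one-to-many mapping: a single game action fans out into separate pieces of evidence, each attached to a different concept.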

And similarly, as the student draws connections between species, she demonstrates her understanding of the overall food web and the relationships between species:

Feed the Fox provides Knewton with dense, concentrated information about student proficiency. Students tend to be more motivated and focused in a game environment than within traditional assignments, and therefore maintain their focus for a longer period of time. The rich assessment information from the game provides direct evidence of the student’s proficiency on the relevant concepts in the game, as well as indirect evidence of her proficiency on related concepts not in the game.

Games also provide a unique opportunity to evaluate student proficiency unclouded by test anxiety or the incentive to cheat. Since traditional assessment can cause anxiety for many students, the “stealth assessment” in games may allow for a more comprehensive understanding of student ability. When games are commingled with traditional assessments, students produce more, and more varied, data related to their proficiency. For example, a student who excels in the game but misses the mark with other formats might not be struggling with the underlying concepts but instead might have issues with motivation, confidence, or boredom. Information about a student’s performance across modalities can position a teacher to more thoughtfully intervene while also facilitating richer recommendations by the Knewton engine.

Different types of content play different roles in a course. A whole year’s course will probably never be composed solely of games, nor should it be. But well designed games can provide uniquely rich, dense assessment material that serves to engage students more effectively than traditional content. Incorporating these games into an adaptive course can benefit instructors and students alike — helping teachers support students more effectively, providing students increased motivation to tackle course material, and improving both student and teacher insight into learning.

One thing we’ve heard repeatedly in our conversations with publishers is a desire for a more quantitative means of evaluating how well their particular textbook content is performing for students. Publishers can and do solicit qualitative feedback from students and teachers, but this takes a lot of time and can be difficult to gather at scale. While they can also track observed metrics like student logins or usage times, this surface-level information only goes so far.

We’ve found that publishers really want to understand how well content actually contributes to student learning and helps improve achievement. They want to use this information to build better products, and improve existing ones. Knewton has built the world’s first and only web-based engine to automatically extract statistics comparing the relative quality of content items — enabling us to infer more information about student proficiency and content performance than ever before possible.

With Knewton Content Insights, this information is now available at a glance for publishers and other content creators.

Content Insights help publishers improve their content’s quality, quantity, and organization; make informed investments in their development cycle; and better support students using learning products. Here are a few examples of the information a Knewton Content Insights dashboard provides.

How well do individual questions assess a student’s understanding of a topic?

Questions with high assessment quality do a good job predicting students’ answers on other related questions. For these questions, knowing whether a student got the question right reveals a lot about how well she understands the given concept overall.
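Knewton doesn’t publish the formula behind this metric, but a classic proxy from psychometrics is item discrimination: how strongly correctness on one question correlates with students’ scores on the other questions covering the same concept. A sketch with made-up numbers (this is the general technique, not Knewton’s actual calculation):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: five students' correctness on one question (0/1) and
# their average scores on the *other* questions for the same concept.
item_correct = [1, 1, 0, 1, 0]
rest_scores = [0.9, 0.8, 0.4, 0.7, 0.3]

# High discrimination → getting this item right strongly predicts overall
# understanding of the concept, i.e., high assessment quality.
quality = pearson(item_correct, rest_scores)
```

An item whose correctness tracks the rest-score this closely is a good assessor; one near zero tells you little about the student.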

This information helps publishers figure out which questions to keep, and which to revise or replace — and ultimately improve learning experiences for students.

Where do students run out of content?

Exhaustion looks at how much content students are using in a particular area. For example, if students tend to use all the material in a chapter on cell division, publishers can focus on creating additional questions or instructional content for this topic. This helps publishers use their content development resources more efficiently and make sure students have the materials they need to succeed.
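As a sketch of how such a metric might be computed (the definition below is an illustrative assumption, not Knewton’s actual formula):

```python
# Hypothetical usage data for one topic with 20 items: how many of the
# topic's items each active student attempted.
TOTAL_ITEMS = 20
items_attempted = {"s1": 20, "s2": 18, "s3": 20, "s4": 20, "s5": 12}

# One plausible exhaustion measure: the share of active students who
# worked through every available item in the topic.
exhaustion = sum(n >= TOTAL_ITEMS for n in items_attempted.values()) / len(items_attempted)
```

A topic where most students hit 100% of the content is a strong candidate for new material.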

What is the relative difficulty of a given question?

Content creators can see how hard each question is for students, relative to other questions intended to test similar material. Difficulty is calculated by analyzing how often a question is answered incorrectly and how many tries it takes each student to answer correctly, along with other factors. The difficulty metric helps content creators better target their content’s level to students’ needs.
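As a rough illustration of how those factors might combine into a single score (the blend and its equal weights below are assumptions for the sketch, not Knewton’s actual calculation):

```python
def difficulty(first_try_wrong, attempts_to_correct):
    """Illustrative difficulty score in [0, 1). Blends the first-try error
    rate with how many attempts a correct answer takes. The formula and
    its equal weights are assumptions, not Knewton's actual metric."""
    error_rate = sum(first_try_wrong) / len(first_try_wrong)
    avg_attempts = sum(attempts_to_correct) / len(attempts_to_correct)
    attempt_score = 1 - 1 / avg_attempts  # 1 attempt -> 0.0; more -> closer to 1
    return 0.5 * error_rate + 0.5 * attempt_score

# Hypothetical responses to two questions: did each of four students miss
# on the first try, and how many tries until a correct answer?
hard_q = difficulty([1, 1, 0, 1], [3, 4, 1, 2])
easy_q = difficulty([0, 0, 1, 0], [1, 1, 2, 1])
```

Comparing scores like these across questions tagged to the same concept surfaces outliers worth rewriting.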

Publishers can also look at difficulty as it relates to exhaustion. If they see that students are using all the content in a certain area, *and* the topic is particularly hard, they will probably want to prioritize adding more material in that area.

With digital learning comes a huge improvement in product transparency. Digital learning empowers students, teachers, and schools to see exactly how much a given product impacts learning outcomes. Content Insights provide publishers with actionable data to build more effective products, and enable them to get ahead in the market by basing their sales efforts on proven outcomes.

Knewton Content Insights are currently live with several beta partners. Soon, they will be available to all publishers in the course of a normal Knewton integration. Feedback from current and prospective customers has been overwhelmingly positive.

Partners are using different metrics depending on their needs. One beta partner building K-12 products is particularly focused on content exhaustion — they’re using the metric to drive their content creation efforts, prioritizing topic areas where students tend to run out of content most quickly.

A partner in the ELT space is particularly excited about seeing data about the relative difficulty of their assessment items and groups of items — e.g., do students generally find the present simple tense easier to understand than the present perfect tense? They plan to use Content Insights to rewrite items that prove much harder or easier than other items that assess the same concepts, to provide a more consistent experience for all students.

We’re excited to see how future partners apply Content Insights to improve their editorial processes. Knewton will continue to expand the education insights we offer to partners. Ultimately, our goal is to help the world build better products that drive bigger learning gains for students.

*Interested in using Content Insights for your digital product? If you’re a current Knewton partner, talk to your partnership manager. Otherwise, **get in touch here**.*


Knewton’s mission is to bring personalized learning materials to all students wherever they are. But until now, it’s been impossible for students who lack consistent access to computers at school or home to benefit from adaptive learning.

It’s for this reason that we’re so excited about our new collaboration with HP. Together, we’re developing Personalized Print Learning Solutions to make Knewton-powered adaptive learning materials seamlessly available across print and digital platforms. Publishers and teachers can make print materials like textbooks and worksheets personalized for each student.

**How It Works**

Leveraging Knewton’s existing adaptive learning platform, publishers can use Personalized Print Learning Solutions to make digital and/or print content adaptive. Teachers can assign uniquely tailored lessons for students with only a smartphone and a printer. Imagine a teacher with a few dozen students, some of whom have computers at home and some of whom don’t. With Personalized Print, the teacher can assign a print worksheet, tagged by HP Link Creation Studio technology, that corresponds to the material she covered in class that day.

Once a student completes his work, the student or teacher can scan it with a smartphone. HP technology allows Knewton to receive the student’s responses with the simple press of a button. Just as in a Knewton-powered digital solution, Knewton adaptive learning technology analyzes the student’s answers, plus his past work, and calculates his proficiencies across all related learning concepts. Then, Knewton recommends what content the student should work on next, and suggests a new individualized content packet or textbook chapter for that student to complete for his next assignment.

Knewton’s collaboration with HP and its 5,000-strong global network of local print shops will enable publishers to create and deliver personalized chapters. The new “mini-textbook” can be delivered each week or as frequently as the school would like. Teachers can also print these personalized packets locally in the classroom. Students with computers can work on them online. Students who are struggling receive targeted help to catch up with the rest of the class, while those who are ahead stay engaged with more challenging material.

Personalized Print opens up a whole new world of possibilities for publishers, teachers, and students. Publishers can offer personalized, made-to-order materials that are more effective and more economical than a standard one-size-fits-all textbook. Schools with tight budgets can provide the power of Knewton-powered adaptive learning to their students. Teachers can better support the diverse needs of every student in their classes. And students in all schools can access the exact learning materials they need to succeed, exactly when they need them.

For more on Personalized Print Learning Solutions, read the press release.


We’re always eager to learn more about how instructors are using Knewton-powered products in their classrooms. In this study, we looked at how college professors used a suite of science products that allow them to create their own homework assignments. Some assignments are identical for all students (“standard assignments”). Other assignments are personalized to address each student’s demonstrated needs (“adaptive assignments”).

In many cases, professors chose to make most or all of their assignments adaptive. In other cases, they used a combination of adaptive and standard assignments to create a unique workflow that suited their classes.

One pattern we observed was professors assigning frequent standard assignments, combined with less-frequent adaptive assignments.

Why might a professor do this?

When we looked at the content of each assignment, we found that the standard assignments often covered a small set of concepts, assessing the material learned in class that week. The adaptive assignments, on the other hand, reviewed the concepts covered in the previous month — possibly preparing students for an upcoming test.

By taking this approach, professors provided each student with a personalized exam review focusing on the concepts with which she was struggling.

Let’s examine what this strategy looks like in a real college chemistry class from spring 2014. The charts below show student activity on each day of the course. The top plot shows student work on adaptive assignments. The bottom plot shows those same students working on standard assignments. Each peak corresponds to a different assignment. (Note the lull in assignments towards the end of March. That presumably corresponds to students not doing homework over spring break!)

Below is an illustration of how one of these adaptive review assignments drew from material covered in three standard assignments. Each standard assignment is shown in a different color. The concepts from the standard assignments that appear on this particular adaptive review assignment are outlined in orange. Because the review assignment is adaptive, students are given a personalized set of questions that focus only on the areas on which they are struggling.

Below is a visualization showing all the concepts covered in the standard assignments.

Let’s look at one student’s adaptive review assignment. This student interacted with all of the concepts below during the standard assignments, but only the bolded concepts during the adaptive review. Note how this student’s review focused on material from Assignment 2 and Assignment 3.

In contrast, here is a second student’s adaptive review from the same class. This student received review material from all three assignments, with extra focus on Assignment 1 and Assignment 3.

By using an adaptive review assignment, this professor created differentiated learning experiences, allowing each student to focus on the review material that best suited his needs.
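The selection behavior described above — pooling the concepts from several standard assignments, then keeping only those where a given student is still struggling — can be sketched in a few lines of Python. This is a simplified illustration under assumed names and an assumed mastery threshold, not Knewton's actual recommendation logic.

```python
# Hypothetical sketch of adaptive-review concept selection.
# Function name, inputs, and the 0.7 threshold are illustrative assumptions.

def build_adaptive_review(standard_assignments, proficiency, threshold=0.7):
    """Return one student's personalized set of review concepts.

    standard_assignments: list of concept lists, one per standard assignment.
    proficiency: concept -> estimated mastery in [0, 1] for this student.
    """
    # Pool every concept covered across the standard assignments.
    pool = {c for assignment in standard_assignments for c in assignment}
    # Keep only the concepts this student has not yet mastered.
    return sorted(c for c in pool if proficiency.get(c, 0.0) < threshold)
```

Run with two different proficiency maps, the same function yields two different review sets — just as the two students above received reviews weighted toward different assignments.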

We also saw professors use standard assignments to help review concepts that were previously addressed in adaptive assignments. For example, in a college biology course, a professor chose to use short adaptive assignments for most of the homework. She also assigned three long standard assignments, each with over 100 questions, which took about a month to complete. Perhaps the professor assigned these as a take-home exam for students to complete as they learned each concept.

These are just two examples of how professors are using Knewton-powered products to help meet the needs of different students and classes. We’re excited to continue to see how instructors working with different populations of students (across grades, subject areas, and levels of knowledge) use Knewton to support instruction and improve learning outcomes.
