Knewton technology will guide students to the right piece of Electric Company content at each point in time, customizing content to maximize both engagement and mastery. For example, Knewton can show each student more content from the characters they respond to best, or present material that teaches using the most statistically effective pedagogical method for that particular child. Knewton analytics will empower both parents and teachers with critical information about each student’s progress and engagement, highlighting key areas where the student needs additional remediation.
As we talked with Sesame Workshop about their goals, we realized how well aligned our companies are. We both have a strong commitment to improving student outcomes, an interest in leveling gaps between advantaged and disadvantaged students, and a data-driven, research-focused approach to assessing the effectiveness of what we do.
The partnership has special meaning to me. Almost 40 years ago, as a recent college graduate, my mother joined the research division at Children’s Television Workshop (now Sesame Workshop), where she spent three years studying the impact of its educational programming. It is exciting to once again become part of the Sesame family.
Joan Ganz Cooney founded Sesame Workshop to answer a simple question: “Can television be used to teach young children?” 45 years, millions of students, and dozens of beloved characters later, the answer is a resounding “Yes!” Today, digital devices provide an even greater opportunity to differentiate and personalize instruction, engage children, and improve mastery of fundamental skills.
Early childhood is when differences in student ability first take root, growing into much larger problems later. The “Swiss cheese effect” leaves holes in student learning that only grow over time as students are pushed, lock-step, into more and more advanced material that assumes mastery of foundational skills in reading and math.
As Sesame Workshop’s commissioned research shows, students arrive in kindergarten and other early grades with wildly different ability levels. These differences correlate strongly with risk factors like being below the federal poverty line, coming from a single-parent household, having a mother without a high school diploma, or not speaking English in the household. Among students identified as high-risk, “the difference [in reading scores] at kindergarten entry is nearly as large as the gains an average child might make over his or her kindergarten year.” Similar issues develop for high-risk students in math and other topics. Sesame Workshop’s digital products aim to help reduce these gaps early on before they widen.
We’re also excited about the research potential of our partnership. Together, Knewton and Sesame Workshop will pioneer new methods for measuring student knowledge gains through digital products, games, and other engaging activities. We can quantitatively measure student outcomes, working together to continually refine and improve Sesame Workshop digital products over time.
For more details about the partnership, check out the full press release.
The HMH Player enables more effective planning, instruction, and assessment — helping educators seamlessly and productively introduce technology into the classroom. The app features smart design, an enhanced user experience, and convenient implementation. As a student progresses through the HMH curriculum, Knewton analyzes data to figure out what the student knows and recommend what to study next — helping more students master material and get ahead. The HMH Player, which makes lessons accessible to students both online and offline, marks the first time Knewton has worked with students in an offline mode.
Instructors can share lessons with colleagues, provide progress reports for parents, and predict grades. In today’s classroom — filled with students with different needs, backgrounds, and interests — it can be tough for one lesson to meet every student’s unique needs. Knewton-powered adaptive learning functionality helps teachers tailor instruction to each student’s specific areas of need.
The release of the HMH Player marks an expansion of Knewton’s partnership with Houghton Mifflin Harcourt. Knewton technology also currently enhances HMH’s Personal Math Trainer Powered by Knewton.
One of the most exciting parts of my job as an implementation architect at Knewton is seeing Knewton-powered products come to life. After lots of research and hard work, Triumph Learning — one of Knewton’s first API partners — is launching Waggle, an educational product designed to help students practice skills and prepare for assessment in a supportive, engaging environment.
Powered by Knewton adaptive learning and aligned with the Common Core State Standards, Waggle is built for grades 3-8 and available in both Math and English Language Arts (ELA). Teachers can use Waggle in the classroom or assign it as homework. Waggle uses games, review sections, progress indicators, and other fun features to engage students in the learning process.
Today’s classrooms are filled with students who have different needs, come from different backgrounds, and have different interests. Within Waggle, Knewton technology figures out what each student knows and recommends which concepts to work on next — helping all students meet course goals, master material, and get ahead. Teachers can assign specific assignments to their students, and Knewton will recommend the right practice items to help students complete the assignment successfully. Productive struggle is encouraged within Waggle: students can explore lessons and accelerate learning when appropriate.
For instructors, Waggle features a robust, easy-to-use interface that provides actionable information about their students, along with suggestions of ways to use that data to inform instruction and intervene when necessary. Educators can identify students who are struggling or excelling in certain content areas, and take immediate steps to remediate or accelerate as necessary.
Triumph built Waggle from the ground up, integrating top-notch content, cutting-edge instructional design, and powerful adaptive learning. We’ve learned a lot from working with the team at Triumph Learning, and Waggle is truly a next-generation product of which Knewton is proud to be a part.
In creating and refining the models that produce these metrics, our data scientists attempt to answer questions like:
What is a student’s understanding of a particular concept?
How much productive time have students spent working?
What score can a student expect to get on an upcoming quiz?
The answers to these types of questions help teachers adjust lesson plans and target extra help for individual students. For example, imagine a product that lets a teacher group students and email them at once. Using Knewton analytics, the application could automatically generate groups of students based on current or anticipated needs. It could go even further and let a teacher assign students an “adaptive assignment,” which would dynamically guide every student through a unique set of supplementary material.
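To make that hypothetical grouping feature concrete, here is a minimal sketch of the underlying logic. The field names, thresholds, and the predicted-score metric itself are illustrative assumptions, not part of any real Knewton API:

```python
# Hypothetical sketch: partition a roster into intervention groups by a
# predicted-score metric. Field names and cutoffs are illustrative.

def group_students(students, at_risk_below=70, advanced_above=90):
    """Partition students into groups by predicted score."""
    groups = {"needs_help": [], "on_track": [], "ready_for_more": []}
    for s in students:
        if s["predicted_score"] < at_risk_below:
            groups["needs_help"].append(s["name"])
        elif s["predicted_score"] > advanced_above:
            groups["ready_for_more"].append(s["name"])
        else:
            groups["on_track"].append(s["name"])
    return groups

roster = [
    {"name": "Phoebe", "predicted_score": 95},
    {"name": "Joey", "predicted_score": 55},
    {"name": "Monica", "predicted_score": 78},
]
print(group_students(roster))
```

An application built on analytics like these could generate such groups automatically and refresh them as predictions change.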
One of the metrics Knewton calculates is Active Time, which aims to help instructors better understand students’ productivity and engagement with learning materials. Rather than just measure total time spent working on homework or in-class assignments, Active Time shows teachers at-a-glance how much productive time students are spending. To calculate this, Knewton analyzes various facets of each student’s activity patterns, along with total time spent. Knewton takes into account any work done that is related to a learning goal determined by an instructor; for example, if a student does an extra practice test that wasn’t assigned by the teacher, this work would “count” too.
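As a rough illustration of the idea (not Knewton’s actual model, which weighs many more signals), an active-time style metric can be sketched by summing only the gaps between consecutive activity events that fall under an inactivity threshold, so idle stretches don’t count as work. The threshold below is an arbitrary assumption:

```python
# Illustrative sketch of an "active time" metric: count only the short
# gaps between consecutive events as productive time; long gaps are
# treated as idle and discarded.

def active_time(event_times, max_gap=300):
    """Estimate productive seconds from sorted event timestamps."""
    total = 0
    for prev, curr in zip(event_times, event_times[1:]):
        gap = curr - prev
        if gap <= max_gap:  # short gap: student was plausibly working
            total += gap
    return total

# Events at 0s, 60s, 150s, then a two-hour break, then two more events.
events = [0, 60, 150, 7350, 7400]
print(active_time(events))  # 60 + 90 + 50 = 200 seconds
```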
Almost every teacher has probably tried (consciously or not) to make similar estimates of their students’ productive time. But without real data, those estimates will always be rough. And trying to improve them, even marginally, would take time that teachers don’t have.
But what if teachers did have access to this raw data? What kind of inferences could they make based on this information? As a thought exercise, let’s take a look at some real, anonymized data from a college-level math course that uses Knewton-powered digital materials.
First, here’s a visualization of a single student’s work in the math course. Each dot represents a time when a student did some work. The color of the dot indicates the topic covered. (The muddied color of the first dot shows that there were multiple topics covered in that introductory assignment.)
Together, a class of students’ work patterns look something like the graph below. Each row represents a single student’s work. The rows are ordered by performance, with the highest-performing student on top.
A teacher looking at this information would immediately see the variances in work patterns among students. Of course, nearly all the students are doing work during class time (this concentration of simultaneous activity is what creates the neat color blocks in the visualization). But from there, the patterns vary. A subset of high-performing students do a lot of additional work in between class sessions, unlike most of the low-performing students and a group of very high performers, for whom success seems to come easily.
This may seem pretty obvious: the students who do more work seem to perform better. But while this may be true at a high level, it’s not always quite as simple as it seems. Let’s zoom into five individual students’ work patterns to see how a teacher might put this information into context and use it in a productive way.
Let’s look first at Phoebe, who is at the top of her class. At first glance, she doesn’t seem to be doing that much work — and yet she is still performing exceptionally well. But dialing further into her work patterns, we see this:
Phoebe is actually working a lot — she’s just doing it in concentrated bursts. In addition to working during class time (which falls around 11 am — we can see these consistent dots in nearly every student’s graphs), she is working directly before and after class. Looking at this, a teacher would probably be satisfied that Phoebe is engaged and has established productive study habits. However, her work begins to slow about ⅔ of the way into the course. Part of this might be explained by spring break (most of the students’ work patterns fall off for about a week), but it’s possible that she’s also becoming disengaged. Perhaps the work isn’t challenging enough; seeing this, the teacher might assign her some supplemental assignments at a higher level. Or perhaps the teacher can see that Phoebe, a high achiever, has overloaded herself with other classes — this might be an opportunity for the teacher to step in and help her figure out a way to balance her workload better.
Any teacher could infer that a student like Joey, who is performing at a very low level, isn’t putting enough time into his work. Looking at his work sessions backs up this conclusion:
Joey is clearly doing just the bare minimum of work required for the course. While the teacher probably assumed as much already, having concrete evidence discounts any arguments on the part of the student (“but I really am working!”). From here, the teacher could talk to Joey to try to ascertain why he’s not working — is he just not motivated, or is he frustrated because the work is above his head? Maybe he’d rather slack off than feel inadequate. If the latter, the teacher might direct him to work on foundational material to help get him up to speed. We can see in this example both the opportunities and limitations of data in education. Data can show patterns and make predictions — but it can’t on its own effect change.
Now let’s look at Monica, who is performing around the 60th percentile in the class.
Unlike Joey, it’s clear that Monica is working a lot (mostly after class, but also a bit before). But a lot of her work sessions last just a few minutes, as indicated by the lightly shaded dots. This information is interesting on its own. But in the classroom, combined with a teacher’s observations and insight, such information can be immediately impactful. Perhaps the teacher knows that Monica is working at a part- or full-time job in addition to taking classes. It might become clear that she is sneaking in quick work sessions on the job — doing a problem or two while waiting for customers, then getting distracted and coming back to it a few hours later when business is slow again. The teacher might use this information to suggest that Monica do her best to rearrange her study schedule. Perhaps she even experiments with spending less time on her schoolwork, but making sure that any time she does spend is uninterrupted. Rather than study every day in between job requirements, she might try completing all her work in two weekly library sessions.
Finally, let’s compare two students. Ross’ performance is among the best in the class, while Rachel’s is somewhere in the middle. Their work patterns are fairly similar up until the middle of the course — they work consistently both during and after class. But around this time, Ross begins working a lot more. Perhaps this sudden flurry of work was self-motivated. Or perhaps the teacher sat down with both students to discuss work habits, and Ross took the talk seriously while Rachel didn’t. Here, again, we see how data can help guide intervention — but can’t in and of itself effect change or flawlessly predict performance.
Teachers have been performing the types of nuanced analyses described above since the beginning of time. Is student A struggling because he doesn’t understand the material? Or is there something going on at home that’s distracting him? Or is he overwhelmed by his English course and unable to devote enough time to math? But an instructor with hundreds, or even dozens, of students will never have the bandwidth to perform this kind of in-depth analysis on every individual’s work patterns.
But if she did — what kind of information would she want to see? What questions would she use that information to help answer? This is what Knewton’s data scientists think about as they build the data models that produce inferred metrics like Active Time. Data can’t improve education on its own. It’s our hope that these metrics serve as one more tool in teachers’ toolboxes — helping them make more informed diagnoses and judgments, leading to interventions that address each student’s individual needs.
Testing today can be anything from a mere bureaucratic hurdle to a nerve-wracking, future-determining experience for students. As computerized adaptive systems enable us to deliver truly continuous assessment, how will testing change? How will we use it to improve educational outcomes?
In Disrupting Class, Harvard Business School professor and expert on innovation Clayton Christensen provides an answer: “When students learn through student-centric online technology, testing doesn’t have to be postponed until the end of an instructional module and then administered in a batch mode. Rather, we can verify mastery continually to create tight, closed feedback loops. Misunderstandings do not have to persist for weeks until the exam has been administered and the instructor has had time to grade every student’s test.”
1. Clarified purpose of assessment for students
What exactly is the point of testing in the first place? To compare students? To facilitate their passing to another level? To assess their proficiency and award them a corresponding grade? Too often, the social context and negative emotions involved in assessment can get muddled with its purpose — generally to assess student proficiency and document it in some way. How often, for instance, are students allowed to pass, regardless of whether they truly demonstrate proficiency in a subject? And how often are they bored waiting for the next opportunity to level up and tackle new challenges? What percentage of school time is spent around academic anxieties and insecurities — and what percentage is spent actually doing the cognitive work of mastering new concepts, demonstrating mastery of that material, and developing a love of learning?
When assessment becomes continuous, students are given a constant stream of opportunities to prove their mastery. Assessment becomes embedded into the “flow” of learning, and students can demonstrate mastery as quickly as they choose to. The path to actual progress becomes clear and students’ lives become increasingly oriented toward real learning.
2. Mastery-based learning in action
Continuous assessment makes it possible to implement mastery-based learning — a teaching methodology premised on the idea that progress should be based on mastery rather than seat time.
Students who are struggling will not automatically advance before they have a chance to master the material at hand. On the other hand, advanced students can progress through material at their own pace and remain engaged by pursuing more challenging work as they pass out of the basics. In this sense, students cannot be satisfied with their achievements relative to others; they are encouraged to seek their own course and take responsibility for their learning.
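The gating logic behind mastery-based progression can be sketched in a few lines. This assumes a per-unit mastery estimate between 0 and 1; the threshold and unit names are illustrative, and a real system would update the estimates continuously from assessment results:

```python
# Minimal sketch of mastery-based progression: a student advances past
# a unit only when estimated mastery clears a threshold, regardless of
# time spent. Threshold and unit names are illustrative assumptions.

MASTERY_THRESHOLD = 0.8

def next_unit(units, mastery):
    """Return the first unit whose mastery is still below threshold."""
    for unit in units:
        if mastery.get(unit, 0.0) < MASTERY_THRESHOLD:
            return unit
    return None  # everything mastered: ready for enrichment work

course = ["fractions", "decimals", "percents"]
estimates = {"fractions": 0.95, "decimals": 0.6}
print(next_unit(course, estimates))  # -> decimals
```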
3. Increased self-awareness for students
Time and again, we encounter evidence that self-awareness — understanding of how one feels, thinks, and learns (also known as metacognition) — is one of the most significant factors in professional and personal success. The renowned psychologist Howard Gardner has argued that self-knowledge — “intrapersonal skill” — is one of the eight defining types of intelligence (the others being “linguistic,” “logical-mathematical,” “naturalist,” “bodily-kinesthetic,” “spatial,” “musical,” and “interpersonal”). The more continuously we assess students and provide feedback, the more knowledge they can gain about themselves — what it takes for them to master something, how they can approach problems differently, what their blind spots are, and how to eliminate them.
4. Greater insight into student needs for teachers
One of the biggest challenges facing schools and administrators today is the growing diversity of their student populations — and a correspondingly diverse set of needs to consider. Some students struggle because English is not their first language; others have difficulty with focus or organization. Others may be particularly weak in some area but possess unusual strengths in another (which the existing curriculum may not take into account).
As every teacher knows, classroom management is a consummate juggling act. To remain attentive to the needs of all students, teachers must engage the more advanced students while helping the struggling ones catch up. At any given point in a lesson, a teacher must decide whether to move through the material aggressively and add more challenges and twists to the problems presented, or build in more of a cushion for those who are confused. Any one of these strategies, including “teaching to the middle,” is bound to leave some students bored or confused.
Blended learning solutions that offer an analytics dashboard supported by continuous assessment give both students and teachers more freedom in this respect: students move through coursework at their own pace and teachers retain control over the classroom while gaining insight into the learning process. A teacher might discover through analytics that a student who is weak with math word problems is struggling because he has difficulty with reading comprehension; that teacher can then coach him through material that improves his grasp of syntax and vocabulary. Another student who understands mathematical concepts but has trouble with carelessness in arithmetic can receive feedback about how to develop stronger estimation abilities or check work once completed.
5. Discovery of the interrelatedness of concepts and subject domains for content creators
Continuous assessment generates a good deal of data around the efficacy of learning content and methodology. When learning patterns around various concepts can be analyzed in a granular way, teachers, publishers, and administrators can uncover interdisciplinary relationships between subject domains and concepts. We might discover that effective remediation in a subject requires attention to another subject, or that the root of common misunderstandings within a subject is something altogether unexpected.
For instance, we might uncover a relationship between quantitative/logical skill and English composition. We might discover that a specific order of teaching subjects (or even concepts) is remarkably effective — that logic and foreign language or fractions and musical harmony should be taught side by side, for example.
Partners can implement recommendations and predictive analytics in a way that best serves their students and instructors. It’s exciting to watch their visions come to life as they develop products using the Knewton API.
Some partners are taking advantage of Knewton’s infrastructure to integrate adaptive learning into brand-new digital learning products.
Triumph Learning, for example, is building Get Waggle — a competency-based, CCSS-aligned product for grades 3-8 with some fun engagement elements, like a “lift meter” which shows students their progress around skills, standards, and goals, while also rewarding grit and performance.
Foreign language publisher Lelivrescolaire is working with Knewton to build an adaptive mobile French grammar app for middle school students, featuring a personalized study center. Lelivrescolaire is leveraging Knewton’s partnership with Gutenberg Technology, an end-to-end digital publishing platform, to seamlessly integrate Knewton recommendations.
Publishers like Cambridge Learning and Macmillan Education are also using Knewton technology to build new digital products; stay tuned for updates!
Other partners are using Knewton to enhance proven solutions. In Pearson MyLab with Knewton Adaptive Learning, students in reading, economics, math, and writing courses use an “Adaptive Study Plan” to suggest what to study next, in addition to an existing syllabus. See how it works.
In Pearson Mastering with Knewton Adaptive Learning, students follow a manually assigned assessment with an “adaptive follow-up” assignment — a dynamic set of activities to help in specific areas. This is an innovative new strategy for science education; Mastering includes courses in chemistry, biology, physics, anatomy & physiology, and other subjects. Watch how it works.
Another existing integration is Houghton Mifflin Harcourt’s Personal Math Trainer Powered by Knewton, a Pre-K-8 automatic intervention and acceleration tool that uses Knewton to create a progressive, personalized learning experience. Personal Math Trainer helps teachers emphasize depth of understanding and problem-solving skills in the classroom.
Students and teachers are using some of these products today. Others are still being built. Many others haven’t been publicly announced yet. For those that have been released, we’re paying close attention to feedback from students and teachers. Pedagogy, user experience, content — these are our partners’ areas of expertise, and the decisions made in these arenas are theirs and theirs alone. But we’re eager to learn more about how personalized recommendations and predictive analytics work best for different populations of students and teachers, and to apply these insights in ways that benefit the whole education ecosystem.
I often get asked about Knewton’s choice to be a technology infrastructure provider rather than an app developer. Why build a platform to help others make adaptive apps? Why not just make them ourselves?
Focus
In Knewton’s early days, we actually did build apps (for test prep and college math readiness). This was never our ultimate goal, but we needed proofs of concept for what was: our adaptive infrastructure platform.
Knewton’s infrastructure enables proficiency-based adaptive learning. We can measure not only what students did, but also what they know, how prepared they are, and how their abilities evolve. We never thought of ourselves as experts in how to design apps. We want to support those experts.
Supporting Innovation
There’s a remarkable amount of innovation in the edtech space. Other organizations will experiment with new types of digital materials and experiences. Knewton’s infrastructure helps support this creativity — allowing others to focus on their core competencies (content development, user experience, pedagogy, distribution) while still providing proficiency-data-driven personalization to students.
“Warm Starts”
As a platform, we can provide students using Knewton-powered apps with features that wouldn’t be possible in the closed-app environment that has heretofore dominated the education industry.
Today, students walk into classrooms each September as if they were just born. Teachers must learn everything about them from scratch. Knewton-powered apps change this, allowing each student to start courses “warm” by connecting his or her learning history to every app. Instructors see students’ proficiency in individual concepts, how they study, and how well different strategies work for them. Students get learning products that provide the exact material they need, when they need it. This learning history stretches across grades and subjects, for every Knewton-powered app — helping all students maximize their potential.
Network Effects
The vast number of students across Knewton-powered products produces potent network effects. Knewton uses the combined power of every student’s anonymized data to help every other student learn more deeply and effectively. Our API partners’ products share the benefits of these network effects.
The Knewton team comprises many former teachers, content developers, and instructional designers, in addition to software engineers and data scientists. We recognize the fundamental importance of pedagogy, content, teachers, and user interfaces. Our platform is just one part of this larger ecosystem.
Early on, Knewton decided to focus on the science of digital education — to devote the time and technical expertise to building an adaptive learning infrastructure — to make it easier for our users to perfect their art of producing world-class education apps.
In today’s age of big data, words and phrases like “adaptive learning,” “personalization,” and “differentiation” are getting tossed around with increasing frequency. What exactly do these terms mean, and to what extent do they overlap?
To learn more about Knewton Adaptive Learning, download the whitepaper here.
Here’s an example of how this works. Suppose that a student at the beginning of her first Knewton-enhanced course is struggling with a word problem that involves calculating the area of a triangle. Assume we know nothing about the student aside from this fact. (This is an uncommon scenario — students who have made any progress in a Knewton-enhanced course or who have taken previous Knewton-enhanced courses will have already generated proficiency data that can help inform recommendations.) Knewton must determine why the student is getting this exercise wrong, so that we can recommend content that helps her learn the skills and concepts required to solve this problem.
Specifically, we must ask the following question: what is preventing the student from solving the triangle problem? There are several possibilities. It may be the case that she doesn’t know how to calculate the area of a triangle. Perhaps she struggles to read and interpret word problems. Maybe the base and the height of the triangle are given as decimals and she doesn’t know how to multiply decimals. There is even the possibility that she doesn’t know how to multiply integers!
It might also be the case that she can find the area of a hundred triangles with her eyes closed while she taps her head, rubs her belly and hops on one foot, and that she’s simply distracted by a computer game that she’s toggling to and from as her teacher wanders in and out of eyeshot of her computer screen. This last possibility is an important one. However, for the purposes of this example, let’s assume that the student is engaged and that her difficulty stems from the fact that she doesn’t understand one or more of the skills or concepts I described above. As you’ll recall if you’ve read one of our posts on knowledge graphs or checked out our white paper, we refer to these concepts as prerequisites.
Let’s list the prerequisites for the triangle word problem again:

1. Multiply integers
2. Multiply decimals
3. Calculate the area of a triangle
4. Read and interpret word problems
Using circles to represent the concepts and arrows to represent the prerequisite relationships, we can draw a diagram:
After we’ve identified the prerequisites, we must then assess the student’s proficiency in these areas so that we can recommend content that helps her learn the necessary concepts to solve the triangle word problem. For example, if she does poorly in assessments on calculating the area of a triangle, we can recommend additional content that helps her master this concept. But where do we start? Should we start by giving her an assessment on prerequisite 3 (Calculate the area of a triangle), or should we start with something more basic, like prerequisite 1 (Multiply integers)?
As you ponder this question, you might notice that the prerequisites in this example are not necessarily independent of one another. For example, it is unlikely that the student can multiply decimals if she cannot multiply integers. In fact, the content in this course that is associated with multiplying decimals expects and assumes that the student is able to multiply integers. In other words, 1 is a prerequisite for 2! Furthermore, prerequisite 1 (Multiply integers) is only important to the triangle problem as it relates to prerequisite 2 (Multiply decimals). In this case, we say that prerequisite 2 subsumes prerequisite 1 because it transmits the knowledge from 1 that is required to solve the triangle problem.
We can adjust our diagram to reflect this as follows:
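One simple way to express the adjusted diagram in code is a mapping from each concept to its direct prerequisites, with the subsumption captured as an edge from multiplying decimals to multiplying integers. The concept names below are illustrative, not Knewton’s internal representation:

```python
# Illustrative concept graph: each concept maps to its direct
# prerequisites. "multiply_decimals" subsumes "multiply_integers", so
# the integer skill hangs off the decimal skill rather than off the
# word problem directly.

prerequisites = {
    "triangle_word_problem": [
        "calculate_triangle_area",
        "interpret_word_problems",
        "multiply_decimals",
    ],
    "multiply_decimals": ["multiply_integers"],
    "calculate_triangle_area": [],
    "interpret_word_problems": [],
    "multiply_integers": [],
}

def all_prerequisites(concept, graph):
    """Collect every direct and indirect prerequisite of a concept."""
    found, stack = set(), list(graph.get(concept, []))
    while stack:
        c = stack.pop()
        if c not in found:
            found.add(c)
            stack.extend(graph.get(c, []))
    return found

print(sorted(all_prerequisites("triangle_word_problem", prerequisites)))
```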
How does this observation about the relationship between prerequisites 1 and 2 help us determine what the student does and does not know? Let’s imagine that we give the student an assessment on prerequisite 1 (Multiply integers) and she aces it. All we can say is that she’s proficient in prerequisite 1. However, what if we give her an assessment on prerequisite 2 (Multiply decimals) and she aces that? Since we know that the assessments for prerequisite 2 expect and assume that the student is proficient in prerequisite 1, then based on her performance in prerequisite 2, we can be fairly confident that she is proficient in both 1 and 2. Conversely, if she fails prerequisite 1 (Multiply integers), it’s probably safe to say that she is not proficient in prerequisite 2 (Multiply decimals) either. In other words, we can estimate her proficiency in certain concepts without having to directly assess her on them.
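That inference rule can be sketched as follows: passing a concept implies proficiency in the concepts it subsumes, and failing a concept implies non-proficiency in any concept that subsumes it. This single-level sketch is an illustration only; Knewton’s actual model is probabilistic rather than a hard yes/no:

```python
# Sketch of proficiency inference over subsumption edges.
# "multiply_decimals" subsumes "multiply_integers" (concept 2 subsumes
# concept 1 in the example). Illustrative, not Knewton's actual model.

subsumes = {"multiply_decimals": ["multiply_integers"]}

def infer(assessed, passed, subsumes):
    """Return {concept: bool} proficiency inferences from one result."""
    inferred = {assessed: passed}
    if passed:
        # Proficiency flows down: the student likely also knows what
        # the passed concept subsumes.
        for child in subsumes.get(assessed, []):
            inferred[child] = True
    else:
        # Failure flows up: concepts that subsume the failed one are
        # likely not mastered either.
        for parent, children in subsumes.items():
            if assessed in children:
                inferred[parent] = False
    return inferred

print(infer("multiply_decimals", True, subsumes))
# {'multiply_decimals': True, 'multiply_integers': True}
print(infer("multiply_integers", False, subsumes))
# {'multiply_integers': False, 'multiply_decimals': False}
```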
There is also the case that the student knows how to multiply integers but does not know how to multiply decimals. We can only determine this by assessing her on both concepts, and therefore, the subsumption relationship does not help us in this scenario. We can, however, use information about how similar students performed in the past to help us identify this scenario. (In a future blog post, we’ll expand on network effects, or how we utilize data about other students’ activity to inform the recommendations we generate for each individual student.)
The example above involves just a few concepts, but for a typical Knewton-enhanced course, we map out the relationships between hundreds of concepts. Knowing which concepts subsume others allows us to eliminate concepts that we think students are already proficient in and more quickly home in on what they should study to meet their goals. It’s like when the blocks disappear in Tetris!
To summarize: rather than assessing a student on every single prerequisite concept, we can use our understanding of the content — specifically, the relationships that exist between the concepts — to make intelligent inferences about what the student does and does not know. As a result, we can generate recommendations that lead to a more efficient use of the student’s time and energy.
In this post, I’ve described one way that we use our understanding of content to make learning more efficient and effective for our students. In a future blog post, I’ll talk about how we use goals that are defined by students and instructors to help us generate better recommendations.
For more from the Knewton Adaptive Instruction team, check out Jesse Sternberg’s post on the cross-disciplinary approach of the Knewton knowledge graph and Matt Busick’s post on the power of a knowledge graph.
Mrs. T is also aware that the problems on this quiz call upon and use previous skills in her course, skills she refers to as the section’s prerequisites. For example, a section from Chapter 9 introduced word problems involving perimeter, a Chapter 11 section covered how to interpret histograms and charts, a Chapter 12 topic provided a basic understanding of functions, and an earlier section in Chapter 13 covered combining like terms.
Mrs. T wants to help Stu review these prerequisite skills, so she examines her gradebook to see what scores he received in each topic. He seemed to do fairly well (over 80%) on the quiz from Chapter 9 on perimeter word problems and the quiz from Chapter 12 on functions, but he struggled (lower than 60%) with the section from Chapter 11 on interpreting histograms and charts and the earlier section in Chapter 13 on combining like terms.
Mrs. T creates a packet of review materials and puts together a remedial quiz containing questions from those sections in Chapters 11 and 13. Her hope is that after Stu has gained proficiency in the prerequisite skills, he will better understand the Chapter 13 material and be ready to re-tackle the quiz on adding and subtracting polynomials.
This sort of personalized remediation is indeed carried out by the best teachers in our schools, but it can be especially difficult if the student continues to fail in upcoming topics, and it becomes a Herculean, if not an impossible, task if the teacher has to carry it out for all the students in all her classes.
In a Knewton-powered adaptive course, the intricate scenario just outlined happens at the click of a button — thanks in part to the Knewton knowledge graph. The knowledge graph is a cross-disciplinary graph of academic concepts; within the graph, concepts have prerequisite relationships that help define a student’s path through the course.
When a student fails a topic in a Knewton-powered course, he or she is instantly remediated with prerequisite skills, prioritized on the strength of their relationships to the topic at hand and on the student’s demonstrated strengths and weaknesses. This frees Mrs. T from the administrative work of locating all prerequisite skills and correlating them with each student’s past performance. She now has more time to orchestrate classroom activities, introduce creative group work, or sit down with each student to address their misconceptions and encourage them through their frustrations.
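One simple way to picture that prioritization is to score each prerequisite by combining the strength of its link to the failed topic with the student’s estimated weakness in it. The weighting scheme below is an assumption for illustration only (not Knewton’s actual ranking), using hypothetical link strengths and mastery estimates for Mrs. T’s four prerequisites:

```python
def rank_remediation(link_strength, mastery):
    """link_strength: prerequisite -> strength of its link to the
       failed topic, in [0, 1].
       mastery: prerequisite -> student's estimated mastery, in [0, 1].
       Returns prerequisites sorted most urgent first: strong links
       to weak skills come out on top."""
    scores = {c: s * (1.0 - mastery.get(c, 0.5))
              for c, s in link_strength.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical numbers for Stu's situation.
links = {"perimeter_word_problems": 0.4, "histograms_and_charts": 0.7,
         "functions": 0.3, "combining_like_terms": 0.9}
mastery = {"perimeter_word_problems": 0.85, "histograms_and_charts": 0.55,
           "functions": 0.82, "combining_like_terms": 0.50}

order = rank_remediation(links, mastery)
```

With these numbers, combining like terms (a strong link, a weak skill) tops the remediation list, while functions (a weak link, a strong skill) lands at the bottom, which matches what Mrs. T worked out by hand.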
Knewton’s knowledge graphs, carefully constructed by subject matter experts, incorporate the connections identified by experienced teachers into the course itself. Learning is by nature an extremely interrelated activity, and with a knowledge graph an adaptive platform can take full advantage of those connections when scaffolding students and guiding them toward mastery.
As more and more students progress through a Knewton course, the strength of these connections is refined over time. We may find that some prerequisite skills are rarely helpful, or only helpful to certain types of students with identified weaknesses, while others are extremely effective as skills to review before students return to a failed topic. The goal of such data-driven analytics is to mimic in real time, on a large scale, the sort of intuition a great teacher develops over his or her career. (For more on how Knewton uses student performance data to improve its recommendations over time, download the Knewton adaptive learning white paper.)
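As a toy illustration of that refinement, imagine tracking, for each prerequisite link, the running fraction of students who passed the failed topic after reviewing that prerequisite. This is a deliberately simple stand-in for whatever estimator Knewton actually uses:

```python
def update_strength(strength, n_obs, remediation_helped):
    """Incremental running mean: fold one new observation (did
    reviewing this prerequisite precede passing the failed topic?)
    into the link's estimated strength."""
    n = n_obs + 1
    return strength + (float(remediation_helped) - strength) / n, n

# Start from an uninformative prior; the first observation replaces it.
s, n = 0.5, 0
for helped in [True, True, False, True]:
    s, n = update_strength(s, n, helped)
```

After four observations the estimate settles at 0.75; links that rarely help drift toward zero and get deprioritized, mimicking the intuition a teacher builds over many students.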
Creating unique study plans for each student in a class would be incredibly time-consuming for teachers. By mapping each course to a continuously refined knowledge graph, Knewton does this automatically, ensuring that no student is left behind.