2015-04-13



What is adaptive learning? How about predictive learning? Where do learning analytics fit into the big picture? Like other fields, the education technology community is not immune to jargon and catch phrases. Understanding some of the basic terminology used today can help all of us:

Describe current challenges and future goals

Ask questions about how specific technologies and approaches affect learning outcomes

Connect with the tools and strategies that have the potential to improve the online learning environment

Navigate the ever-growing number of technology options on the market

Below are just a few of the more commonly used terms you’ll see referenced here on the Acrobatiq blog, and around the Web. To help advance the conversation, I’ve included a broad overview of each term, and then some specifics about how we apply it here at Acrobatiq. While there’s no single source for definitions, these descriptions offer a good starting point as we think about innovations in online teaching and learning.

Adaptive Learning: The application of technological tools that provide students with a customized experience based on their progress and previous accomplishments with the materials, practice activities, and assessments. Students are presented with more challenging or less challenging items as they interact with the materials and respond to questions.

Our approach to adapting the learning experience is conservative by most standards, in part because of our roots in evidence-based learning. Rather than simply depending on a statistical algorithm to force students down different pathways based on what we think they know or don’t know, or down-sampling students to easier questions after they answer incorrectly, we take a much more nuanced approach.

First, it’s contextually useful to know that our curriculum development process begins with identifying a Skill Graph (also sometimes referred to as a Knowledge Graph) that defines a set of learning objectives and skills describing what a student should know and be able to do by the end of a learning module. As students work through interactive learning activities, we collect activity data that’s then combined with other statistical parameters to generate a robust learning estimate for each student against each learning outcome. This point is important because we often get questions about how and what we adapt. Having an underlying Skill Graph forms the “what” by providing a clear view into the relationships between learning objectives and sub-skills.

For students with low learning estimates, we adapt the learning experience by enabling more practice opportunities on one or more specific skills. It’s really that simple. And it is just that simplicity that makes the approach effective. For students with high learning estimates, we enable less practice so they can confidently move faster through the material.
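
To make the mechanics concrete, here is a minimal sketch of how a skill graph and per-skill learning estimates can drive that kind of adaptation. The data structures, threshold, and function names below are illustrative assumptions only, not our actual implementation.

```python
# Illustrative sketch: a toy skill graph and a threshold-based adaptation rule.
# All names, values, and the 0.60 cutoff are assumptions for illustration.

# A module's skill graph maps each learning objective to its component sub-skills.
skill_graph = {
    "LO1: Interpret a scatterplot": ["identify_variables", "describe_correlation"],
    "LO2: Compute a correlation coefficient": ["standardize_scores", "apply_formula"],
}

# Learning estimates (0.0 to 1.0) per student per sub-skill, derived from
# activity data such as responses to embedded practice questions.
learning_estimates = {
    "student_42": {
        "identify_variables": 0.91,
        "describe_correlation": 0.48,
        "standardize_scores": 0.85,
        "apply_formula": 0.79,
    }
}

PRACTICE_THRESHOLD = 0.60  # hypothetical cutoff for "needs more practice"

def skills_needing_practice(student_id):
    """Return the sub-skills where a student's learning estimate is low."""
    estimates = learning_estimates[student_id]
    return [skill for skill, estimate in estimates.items()
            if estimate < PRACTICE_THRESHOLD]

# A student with low estimates gets extra practice on just those skills;
# a student with uniformly high estimates moves on with no extra practice.
print(skills_needing_practice("student_42"))  # ['describe_correlation']
```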

The benefit of this approach is that for faculty in blended or online learning environments, there is no “guessing” about how students progressed through the curriculum. Because of the underlying Skill Graph, it’s easy to see which skills students are mastering, and where additional help is needed. By adapting the learning experience with more practice at the end of each module for only those students who really need it, we can give students exactly the kind of personalized learning experience that helps get them “unstuck.”

Competency-Based Learning: Focused on mastery of specific knowledge and skills, competency-based models allow students to demonstrate their understanding and abilities via multiple strategies, e.g., exams, portfolios, and projects. This alternative to traditional academic classes is usually self-paced and provides each learner with the opportunity to earn credit or advance within a course or program according to his or her previous experiences and prior knowledge.

One critical component of effectively delivering CBL is the ability to understand and assess students’ mastery of key competencies. As with adaptive learning, understanding the component skills of a competency makes assessment that much easier. Because our curricula have an underlying Skill Graph, CBL programs can increasingly benefit from the data derived from measuring learning at this deep level. Our ability to include human-graded rubrics in our statistical modeling of student learning estimates enables a wide range of project- and portfolio-based learning possibilities.
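
As a rough illustration of how machine-scored items and human-graded rubric scores can both count as evidence toward a single competency, here is a minimal sketch. The blending scheme, weights, and names are assumptions for illustration, not our statistical model.

```python
# Illustrative sketch: blending machine-scored items and human-graded rubric
# scores into one mastery estimate for a competency. The 50/50 weighting and
# all names are assumptions for illustration only.

def competency_estimate(auto_scores, rubric_scores, rubric_weight=0.5):
    """Blend normalized machine-scored and rubric-scored evidence (0.0-1.0)."""
    auto = sum(auto_scores) / len(auto_scores) if auto_scores else None
    rubric = sum(rubric_scores) / len(rubric_scores) if rubric_scores else None
    if auto is None:
        return rubric
    if rubric is None:
        return auto
    return (1 - rubric_weight) * auto + rubric_weight * rubric

# Evidence for a hypothetical competency: five auto-graded questions plus two
# instructor-scored rubric dimensions (already normalized to 0.0-1.0).
print(competency_estimate([1, 1, 0, 1, 1], [0.75, 0.9]))  # 0.8125
```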

Courseware: Learner materials, activities, and assessments organized within a system that not only delivers instructional content and assessments, but also tracks student progress, allowing instructors to review that progress, evaluate challenges, and make decisions about possible interventions.

For many faculty, the notion of using “courseware” is not all that attractive, in large part because most courseware that’s available today can’t really be customized to fit specific course or program goals. At present, our curriculum (or courseware) can be customized only in that modules can be hidden or reorganized. Later this summer, however, we will be releasing a new authoring tool to support more granular levels of customization, including the ability to add locally developed content. For more on the faculty benefits of courseware, see Courseware: The Next Big Thing for Faculty.

Learning Analytics: This term is used to describe a wide range of data collection, analysis, and reporting techniques that inform decisions about instructional strategies and interventions. In the context of online education, the software that runs learning analytics can include everything from tracking student performance to identifying complex learning trends and problems, and is often integrated into courseware and learning management systems.

In our context, the analytics that we generate focus on measuring deep learning – or learning happening at the skill level. Formative and summative assessments are embedded in the courseware and analyzed using a statistical model developed by learning scientists at Carnegie Mellon University.

The theory of learning that forms the framework of this computational model takes into account a number of factors, including both observations of learning (number of right/wrong answers, number of hints revealed, number of attempts per question, etc.) and cognitive factors such as cognitive load, rate of learning, and learning decay. The result is an accurate, real-time assessment of each student’s knowledge state against a defined outcome (for example, a learning objective).
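
This post doesn’t spell out the model’s internals, but a Bayesian Knowledge Tracing (BKT) style update is one well-known way, from the learning-science literature, to turn right/wrong observations into a running knowledge estimate. The sketch below is that textbook simplification only; it omits hints, attempts, cognitive load, and decay, and it is not the CMU-developed model described above.

```python
# Illustrative sketch: a Bayesian Knowledge Tracing style update that revises
# P(skill known) after each right/wrong observation. Parameters are arbitrary
# example values, not values from Acrobatiq's model.

def bkt_update(p_known, correct, p_guess=0.2, p_slip=0.1, p_learn=0.15):
    """Update P(skill known) after observing one correct/incorrect response."""
    if correct:
        posterior = (p_known * (1 - p_slip)) / (
            p_known * (1 - p_slip) + (1 - p_known) * p_guess)
    else:
        posterior = (p_known * p_slip) / (
            p_known * p_slip + (1 - p_known) * (1 - p_guess))
    # Account for the chance the student learned the skill from this attempt.
    return posterior + (1 - posterior) * p_learn

# One student's estimate for one skill, updated after each embedded question.
p = 0.3  # prior: 30% chance the skill is already known
for answer in [True, False, True, True]:
    p = bkt_update(p, answer)
    print(round(p, 3))
```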

Learning Optimization: The process of data collection and analysis informs a wide range of decisions about overall course design, including instructional strategy selection and implementation. Continued study of the benefits and challenges of various course features, functions, materials, and interactions allows for continued review and revision with a goal of offering the most effective learning environment for each student.

Again, in the context of Acrobatiq, we optimize learning by using the learning analytics we collect from within our courseware as the basis for powerful feedback loops. This data makes possible effective instructional interventions, course corrections, and detailed feedback based on each student’s learning performance. Student activity data also informs course designers about how students are performing on learning activities, so the curriculum can be continuously refined to produce the best learning outcomes.
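
A minimal sketch of that designer-facing feedback loop might look like the following: aggregate activity data across students and flag content whose success rate is unusually low. The record format, field names, and 40% cutoff are illustrative assumptions, not how our courseware actually stores or analyzes data.

```python
# Illustrative sketch: closing the feedback loop by aggregating activity data
# across students to flag questions that may need revision. All fields and the
# 40% cutoff are assumptions for illustration.

from collections import defaultdict

# One record per student attempt at an embedded question, tagged by skill.
attempts = [
    {"skill": "describe_correlation", "question": "q7", "correct": False},
    {"skill": "describe_correlation", "question": "q7", "correct": False},
    {"skill": "describe_correlation", "question": "q7", "correct": True},
    {"skill": "apply_formula", "question": "q12", "correct": True},
    {"skill": "apply_formula", "question": "q12", "correct": True},
]

def flag_for_review(records, min_success_rate=0.4):
    """Flag questions whose cross-student success rate falls below the cutoff."""
    totals = defaultdict(lambda: [0, 0])  # question -> [correct, attempted]
    for r in records:
        totals[r["question"]][0] += r["correct"]
        totals[r["question"]][1] += 1
    return [q for q, (correct, n) in totals.items() if correct / n < min_success_rate]

# q7 has a 33% success rate, so it is surfaced to course designers for review.
print(flag_for_review(attempts))  # ['q7']
```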

Personalized Learning: Similar to the term adaptive learning, personalized learning has been broadly used to describe a flexible approach to educational activities that can be tailored to meet the needs of individual students. This term is also used to describe learning environments that allow students to create their own paths to achieving learning outcomes. This can include choosing from among multiple types of interactions, activities, modes of delivery (e.g., online, blended, in person; video, audio, text), and even assignments.

We think about personalizing learning first by understanding the desired outcomes of the learning experience, and second, by being able to accurately assess a student’s specific learning estimate against each defined outcome. Only then can we begin to ascertain what to personalize, for whom, and how.

Predictive Learning: Through the application of systems that apply mathematical modeling techniques, educators can more readily identify potential challenges a student might face in a course or program, based on his or her characteristics and past performance. Being able to anticipate the challenges allows time for instructors to intervene with guidance and support, before the issue becomes a problem that prevents learning success.

The Acrobatiq predictive learning model can be a powerful tool to help educators and others develop deep insight into student learning performance. By first understanding desired outcomes, and then developing opportunities for students to both learn new skills and demonstrate evidence of learning as they are learning, we can begin to use statistical predictive techniques. The benefit of this approach is that we can get out in front of students who might be at risk and, conversely, accelerate students who might otherwise be slowed down in a one-size-fits-all learning environment.
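
To make the idea concrete, here is a minimal sketch of flagging potentially at-risk students from early-course signals. The features, synthetic training data, and choice of logistic regression are assumptions for illustration only, not our predictive model.

```python
# Illustrative sketch: predicting which students may be at risk from early
# learning signals, so instructors can intervene before problems compound.
# Features, data, and model choice are assumptions for illustration.

from sklearn.linear_model import LogisticRegression

# Features: [average learning estimate after module 1, share of practice completed]
X_train = [[0.35, 0.2], [0.50, 0.6], [0.85, 0.9], [0.70, 0.8], [0.40, 0.3], [0.90, 1.0]]
y_train = [1, 1, 0, 0, 1, 0]  # 1 = struggled in the course, 0 = succeeded (synthetic labels)

model = LogisticRegression().fit(X_train, y_train)

# Early in the term, score a current student and surface the risk to the instructor.
current_student = [[0.45, 0.4]]
risk = model.predict_proba(current_student)[0][1]
print(f"Estimated risk of struggling: {risk:.0%}")
```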

As the evolution of online learning spaces continues, keeping up with the latest information can seem daunting. Connect with your institution’s instructional design professionals, and bookmark sites like The Glossary of Education Reform and the Association for Talent Development’s Learning Circuits Glossary for future reference. Follow the Acrobatiq blog for more information about innovations in data-driven curriculum design and personalized, adaptive learning.
