Learning Analytics are the Ticket to Good Remote Instruction

Norman Bier, Executive Director, The Simon Initiative of Carnegie Mellon University

Like many schools across the country, Carnegie Mellon University’s first response to the COVID-19 pandemic was a flurry of action as we worked to move our 4,900 course sections to remote instruction. In time, we realized the opportunity the crisis presented. Several months in, among other lessons, I learned that thoughtful use of learning analytics can make remote instruction more effective.

As Executive Director of CMU’s Simon Initiative, I have been working to expand access to and improve the quality of education for many years. The pandemic highlighted the importance of something we already knew from research: active, asynchronous learning activities can be a crucial tool for instructors to make their teaching more effective and equitable. Providing asynchronous opportunities offers mechanisms for students to personalize and review their learning while also bridging some of the equity-related digital divide issues that synchronous online instruction can exacerbate.

Yet these types of activities can often be a challenge for educators to integrate into their instructional practice, in part because their asynchronous nature means that students’ learning and misconceptions aren’t readily apparent to educators. How can we make this work more visible, so that educators can better target their synchronous efforts to students’ demonstrated needs?

At CMU, we know that data can tremendously benefit both students and teachers, so we capture learners’ interactions to drive powerful feedback loops. Learning analytics take a wide variety of forms. The learning dashboard in our Open Learning Initiative courseware provides educators with an estimate of how well their class is meeting learning objectives and allows them to drill down to see specific skills and misconceptions for driving class discussion or examples; it can also provide insights on individual learners for targeted support and guidance. Other tools can improve the design of learning activities, from pedagogical audits that can help make activities more robust to learning curve analysis, which helps designers understand how well their learning model aligns with actual student performance and adjust accordingly. Still other approaches can help both students and educators gain a new, shared perspective on activities; DocuScope, for example, provides a visualization of the rhetorical moves made in student writing.
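To make the learning curve analysis mentioned above concrete, here is a minimal sketch of the general idea, not OLI's actual implementation: if an activity really exercises a single skill, class-wide error rates should decline smoothly as a power law of practice opportunities, and a flat or jagged curve signals that the learning model needs adjustment. The data below is hypothetical, and the fit is an ordinary least-squares regression in log-log space.

```python
import math

# Hypothetical class-wide error rates for one skill, indexed by
# practice opportunity (opportunity 1 is the first attempt).
error_rates = [0.62, 0.48, 0.40, 0.34, 0.30, 0.27]

# Fit a power-law learning curve, error ≈ a * opportunity**(-b),
# by least-squares linear regression in log-log space.
xs = [math.log(i + 1) for i in range(len(error_rates))]
ys = [math.log(e) for e in error_rates]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
b = -slope                       # decay rate: higher means faster learning
a = math.exp(mean_y + b * mean_x)  # estimated error rate at opportunity 1

print(f"error ≈ {a:.2f} * opportunity^(-{b:.2f})")
```

A designer would run a fit like this per skill in the learning model; skills whose curves refuse to decline are candidates for being split, merged, or re-taught.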

Automated approaches should not be the goal of learning analytics. A good system should augment and support the people engaged in teaching and learning; these human-in-the-loop systems are a hallmark of the Simon approach. Too often, automation is paired with opaque systems of limited usefulness, frequently focused on (or at least used for) solving problems that are at arm’s length from the core work of our educational institutions—identifying potential plagiarism or proctoring exams rather than providing feedback or enacting learning. Such approaches are frequently bereft of a real evidence base, substituting marketing literature for real science. And beyond the concern that these tools may work in opposition to learning rather than enacting it, educators are increasingly worried about the ways these applications gather and share data and the ways they can surveil educators and learners alike.
