Matthew L. Bernacki, Ph.D. – a leading scholar in learning sciences and learning analytics, and associate professor and the Kinnard White Faculty Scholar of Education at the UNC School of Education – served as guest editor of a January 2025 special issue of the Journal of Educational Psychology titled “Leveraging Learning Theory and Analytics to Produce Grounded, Innovative, Data-Driven, Equitable Improvements to Teaching and Learning.”
The issue’s introduction, written by Bernacki, frames the importance of learning analytics and how they can be used to test and improve educational theories and to represent hard-to-observe learning processes. Bernacki also discusses how learning analytics researchers can benefit from the sturdy guardrails that learning theory can place on learning analytics tools, especially as those tools increasingly incorporate artificial intelligence.
“For some time now, Dr. Bernacki has driven a growing field of learning analytics research,” said Jill V. Hamm, interim dean of the School. “He and his collaborators, who include UNC faculty members, doctoral students, and more, are leveraging data generated by technologies K-12 and university students use to learn, along with additional data provided by students, to bring new insights to learning that will change how we teach and learn around the world. Matt’s leadership of this special issue is a nod to the importance of his scholarship and the discipline.”
Bernacki’s article goes on to highlight “the potential advantages of integrating learning theories common to educational psychology scholarship with learning analytics as a methodological approach that can improve the representation of individuals and their actions in the environments where they learn.”
Bernacki notes the methodological approaches “involved in learning analytics include the study of event-based data produced by individuals in learning environments where they use technology.” In his own research, Bernacki examines event-based data produced by college students using learning management systems in introductory science and mathematics courses. Data produced by this work have yielded timely information for teachers, specifically helping them to identify students who might be at risk of course failure and then provide student-specific interventions.
Bernacki calls the studies published within the issue “exemplars” that show “how the tremendous potential of learning analytics can be more fully realized when practices are informed by insights and guidance from psychological theories of learning” and “that when analytics are aligned to reflect theorized phenomena, learning analytics can be plied to test novel research questions that include variables that heretofore were challenging to observe and that have stymied the study of learning.”
One article in the special issue – “Using theory-informed learning analytics to understand how homework behavior predicts achievement” – includes research by Bernacki; Jeffrey A. Greene, Ph.D., McMichael Professor and interim associate dean for research and faculty development at the School; and Robert D. Plumley, a doctoral student and the School’s research associate in adaptive learning. The article was led by collaborators at the University of Tübingen in Germany.
“This special issue is meant to invite those thinking very hard about the design of technologies and tools for learning to partner with those who think hard about how people learn, so theorists, designers, and learners can benefit from better research and development on tools for learning,” Bernacki said. “Learning technologies will continue to shape teaching and learning, and as more and more diverse students use them in their day-to-day learning activities, the data they produce when they learn have immense value.
“If we can study learning at the scale and precision that learning analytics affords, we can better represent learners and use better, more representative data to test and refine the theories we use to describe the learning process.
“I hope this special issue convinces the learning, information, and computer science communities on campuses to gather and consider how working together can improve the theories they study and the products they provide to people who wish to learn about their world.”
In addition to editing the recent special issue, Bernacki served as lead author of an article – “Using Multimodal Learning Analytics to Validate Digital Traces of Self-Regulated Learning in a Laboratory Study and Predict Performance in Undergraduate Courses” – that appears in the February issue of the Journal of Educational Psychology. Co-authors include doctoral student Linyu Yu; recent post-doctoral researcher Shelbi L. Kuhlmann, Ph.D., who is now an assistant professor at the University of Memphis; Plumley; Greene; doctoral student Rebekah F. Duke; Rebekah Freed (’22 Ph.D.), who is now an academic skills counselor and learning specialist at the University of Washington School of Pharmacy; doctoral student Christina Hollander-Blackmon; and Kelly A. Hogan, a former UNC biology professor now at Duke University.
The article details methodological advancements in learning analytics that help researchers better understand student learning in digital learning management systems. The article reveals that what students said aloud as they learned aligned with the actions they took in those systems – actions that provide “digital traces” – and that this alignment was consistent across students. Using these validated traces, the team found that, in subsequent semesters, students who completed tasks in the system were more likely to perform better on their exams.
Following is a Q&A with Bernacki about the special issue and his team’s recent scholarship.
What is the need for a special issue focused on learning analytics right now?
More and more, learning technologies are being incorporated into instruction, and the data that students produce when they use them are being made more easily accessible to teachers and researchers. The special issue is meant to seed the imagination on ways that these data can answer important questions about learning so we can update theory and improve practice. Learning analytics is a technique that can help with that.
Of the articles featured in the issue and the considerations they present, is there one that stands out to you as one that’s particularly important now?
I really enjoyed the editorial process and am thrilled with the studies we were able to include. There were also dozens of exciting articles that were proposed, but the timeline for the special issue was tight, and they couldn’t be developed in time to make the issue. I hope they’ll appear in the Journal of Educational Psychology when they’re ready.
One article that did make the special issue is a piece by Jason Reitmann and colleagues, which shows how eye-tracking software and learning analytics can be used to make sense of a really relevant challenge in education: mind wandering. We’ve all found ourselves staring at a page we intended to read, and either wandered off in thought unknowingly, or made the conscious choice to let our thinking wander.
This group tracked dimensions of eye movements and submitted them to models that were able to not only detect mind wandering, but also the type of mind wandering. That’s an exciting finding because it addresses key questions about attentional processes that are essential to learning and are relevant to typical learners and those with learning differences like attention deficit hyperactivity disorder, among others.
The articles presented in this special issue are primarily written for researchers, but are there takeaways for educators? Professionals working in EdTech?
The authors who submitted articles to the special issue embraced several challenges: They were responsible for writing to two research communities – educational psychology and learning analytics. They were also asked to provide professional development to educational psychology researchers who might wish to adopt the learning analytics approaches they describe, to clearly summarize the practical implications for instructional designers and educators (in the discussion section), and to highlight these in an educational impact statement that accompanies the abstract.
For example, the Reitmann paper on detecting when students’ minds have wandered from a reading task has direct implications for practice. A detector could be built into an educational technology and applied in real time as students read. When mind wandering is detected, different kinds of instructional decisions could be made to support the learner.
What should education leaders and policymakers be considering when it comes to learning analytics?
The biggest takeaways are that data are everywhere and can be accessed, but that it takes someone with an appropriate lens to be able to see what the data reflect about learning and instruction. The papers show how lots of raw data were accessed, but value emerged only when someone took a theoretical lens to the data and used it to carve out, or even create, key metrics that described key processes or predicted important outcomes. It’s worth investing in data systems, and in analytic teams with complementary expertise to derive value from them.
What should we expect of learning analytics research in the next five years?
As in most sectors, the growth of artificial intelligence is capturing the attention of the learning analytics community. These researchers have made quick strides investigating the capacities and affordances of generative artificial intelligence for learning. I expect they will continue to explore the newest technologies to probe the possibilities they offer. I also expect a second wave of slower work to follow right behind each new wave of AI, in which teams of researchers with theoretical and analytical expertise will work more systematically to experiment with these technologies and collect richer data about how the moves humans and AIs make contribute to learning and achievement.
Your most recent publication validates “digital traces” in self-regulated learning by enabling students to describe their learning aloud. This sounds time intensive. Why is it so important to understanding digital learning?
Learning analytics begins with raw data, and what those raw data represent about learning is anyone’s guess. We can make informed guesses when we co-design with instructors who have reasons for giving digital learning resources to students. Validating digital traces is the essential step to test those hypotheses and confirm that a click on an object, a navigation move, or some other behavioral trace actually reflects a learning behavior.
Without that validation, we’re making informed guesses and then scaling them to hundreds or thousands of learners. It seems important that we validate those assumptions so that we know what students are doing, and can then leverage what we know about that learning process to help them execute it well and benefit from doing so.