Dave Blanchard, EdD, Director, Online and Distance Learning, St. Cloud State University

Davidson Blanchard, EdD, is the Director of Online and Distance Learning at St. Cloud State University, where he leads institutional efforts related to instructional design, digital learning and academic technology strategy. His work spans higher education, P-12 systems and cross-sector consulting, supporting leaders and educators as they navigate instructional change, accessibility and the evolving role of artificial intelligence in learning environments. Through practitioner-focused collaboration, he helps institutions design learning experiences that are sustainable, equitable and aligned with how students learn today.
Artificial intelligence is often described as a disruptive force in education. New tools appear daily, faculty express concern about academic integrity and institutions rush to draft policies that promise control in an increasingly uncontrollable environment.
From my work across instructional design, academic leadership and cross-sector conversations, a consistent pattern has emerged: the real disruption is not AI itself, but the growing realization that many of our longstanding instructional assumptions no longer hold.
For decades, higher education has been built on a quiet set of agreements. We assume that learning is demonstrated primarily through individual production, that instructors can reliably distinguish original work from assisted work and that assessment design is stable enough to persist across generations of technology. AI has not shattered these assumptions; it has simply made them visible.
In conversations across P-12, higher education, workforce development and public-sector organizations, leaders are increasingly focused less on whether AI should be used and more on what learning expectations must change as a result. These discussions suggest a common realization that our systems were already under strain.
When students use AI to summarize readings, generate practice questions or reorganize their notes, they are not behaving radically differently from students who have always sought support through tutors, study groups or online resources. What has changed is the scale and immediacy of that support. The friction is gone. And when friction disappears, systems reveal where they were relying on it to function.
Across sectors, a shared tension continues to surface: AI has made visible how dependent many learning models were on opacity rather than intentional design.
Much of the anxiety around AI in education stems from a misplaced focus on detection and prohibition. Institutions ask whether AI use can be identified, regulated or banned, rather than asking whether current learning designs still make sense in a world where generative tools are ubiquitous. The uncomfortable truth is that many assessments were already fragile before AI arrived. AI simply exposed how heavily we depended on compliance rather than engagement.
This moment invites a more productive question: what would the learning process look like if we assumed AI was always present?
In practice, this shifts the conversation away from policing tools and toward designing learning experiences that emphasize process, context and application. When students are asked to reflect on decision-making, connect theory to lived experience or iteratively revise their work with feedback, AI becomes less of a shortcut and more of a collaborator, one whose influence must be interpreted, not hidden.
This reframing also has implications for faculty workload and trust. Asking instructors to constantly chase new forms of misconduct is unsustainable and corrosive. Designing courses that make learning visible, scaffolded and meaningful is more durable. It also aligns with how professionals actually work beyond the classroom, where AI is increasingly part of everyday problem-solving.
Leadership matters here. Institutions that treat AI as a temporary crisis to be managed will exhaust their faculty and confuse their students. Institutions that treat AI as a facet of design, much like accessibility, modality or scale, can integrate it thoughtfully into pedagogy, policy and support structures.
Importantly, this is not a call to abandon academic integrity. It is a call to redefine it. Integrity in an AI-enabled environment is less about the absence of tools and more about transparency, attribution and purposeful use. Students benefit when expectations are clear, consistent and aligned with authentic learning goals rather than reactive rules.
AI is not going away. The real choice facing education is whether we continue defending instructional assumptions that no longer serve us, or whether we redesign learning environments that reflect how knowledge is created, evaluated and applied today. The disruption, it turns out, was overdue.