Embracing Gen AI Responsibly: The Challenges and Opportunities for HE Institutions

Richard Walker, Associate Director (Digital Education), University of York

Dr Richard Walker is the Associate Director (Digital Education) at the University of York, where he leads the Digital Education Team. His research focuses on blended learning course design, staff development for online instruction, and tracking technology-enhanced learning developments across the UK higher education sector. Dr Walker has contributed to national reports on technology-enhanced learning and has a strong interest in sustainable e-learning practices.

In this article, Walker highlights the transformative impact of Generative AI on higher education, emphasising the need for institutions to shift from prohibition to responsible integration - ensuring fair access, safeguarding data security, and developing the digital literacy skills students will need for the evolving workforce.

Educational technology service leaders surveyed by UCISA in 2024 highlighted Generative AI as the leading technology challenge facing UK higher education institutions. Unlike other technology developments, Gen AI is seen as a mould breaker and a disruptor to established learning, teaching and assessment practices. This view is backed up by more recent research. In a 2025 poll of UK undergraduates, the UK's Higher Education Policy Institute reported that the proportion of students using Gen AI tools in some form had increased markedly, from 66 per cent in 2024 to 92 per cent in 2025. The proportion using tools such as ChatGPT for assessments jumped from 53 per cent to 88 per cent over the same period. These findings suggest that AI tools are now a staple part of the student experience and potential game-changers for the design of teaching and assessment activities.

While some HE institutions have attempted to prohibit the use of AI, this is not a sustainable long-term strategy. As recent UK Jisc guidance confirms, the pace of AI development outstrips the capabilities of AI detection tools - no AI detection software can conclusively prove that AI has written a text. Indeed, detection software is easily defeated by relatively simple edits to the generated output. We must also be wary of the risk of false positives that detection tools carry, and the distress this can cause students accused of academic misconduct.

“Ultimately, we should be placing a premium on higher-order thinking, with the onus on data synthesis and analysis over information recall, and on more creative tasks, such as inviting students to produce artefacts rather than write essays”

But to think along these lines is to miss an opportunity. Arguably, a key mission of HE institutions is to educate students on the responsible usage of AI, providing graduates with the digital literacy skills and knowledge they will need for the world of work. This is consistent with the UK government’s ambitions, as highlighted in its AI Opportunities Action Plan (2025), which references the need to increase the number of AI graduates from HE institutions - and for those institutions to develop new courses co-designed with industry to train up the ‘tens of thousands of AI professionals needed by 2030’.

But this is easier said than done - realising the vision comes with a number of challenges. First and foremost, any implementation strategy should place inclusion and equity considerations at the heart of institutional planning for AI services. This starts with the principle of fair access. There is a huge disparity between paid-for and free-to-use AI tools in the features they offer and the quality of their output. The ability to pay should not affect educational outcomes, so we must ensure a level playing field. In practical terms, this means giving all students equal access to the same technology through centrally managed services.

Another key institutional responsibility concerns data security. The proliferation of AI tools and the incorporation of AI functionality within existing tools and digital services make data flows harder to monitor. However, protecting personal data, academic IP, and copyright remains a key responsibility. This means establishing institutional AI services that are protected through educational licensing agreements. Educational licences ensure that staff and student personal data are not used to train the underlying AI models or shared with third parties, and that the services are available to all in a secure way.

The level of training institutions offer can also play an important role in helping students and educators make the most effective use of AI tools. Training should highlight the efficiencies these tools can bring to research and study tasks whilst acknowledging the in-built biases and limitations of the training models that AI tools draw on. A critical appreciation of the output of Gen AI tools is needed, with attention to the risks of misinformation, hallucinations and deepfakes, and to the environmental costs associated with their use. But in a positive sense, we should also be encouraging educators to recognise the strengths of the technology so that they can design AI usage into the curriculum responsibly - demonstrating how AI tools may be used to address learning outcomes in a complementary way, where there is a clear rationale to justify their integration. This could involve using tools such as ChatGPT as personalised study aids to help explain complex concepts, or as a stimulus for brainstorming and idea generation.

The mainstreaming of Gen AI should be seen as an opportunity to pivot to more open and authentic assessment practices. Students can be encouraged to use these tools, but to do so in an approved way - acknowledging, referencing and critiquing the outputs they have generated. This can be achieved through a staged assessment process: inviting students to generate text using AI applications and then critically review the quality of the output, with commentary on relevance, authenticity, bias, cohesion and consistency.

Ultimately, we should be placing a premium on higher-order thinking, with the onus on data synthesis and analysis over information recall, and on more creative tasks, such as inviting students to produce artefacts rather than write essays. This is also an opportunity for HE institutions to embrace more flexible and inclusive assessment formats, including multimodal assessment involving elements such as infographics and video presentations, allowing students to demonstrate their understanding of key concepts in other ways. By seizing this opportunity, we can move a step closer towards addressing student expectations on ‘learner flexibility’ - namely, choice over learning and assessment methods and scope for personalised learning pathways within the curriculum.
