
Challenges of Generative AI in Higher Education

The integration of artificial intelligence (AI) into our daily lives is rapidly becoming more commonplace. With generative AI tools such as Midjourney, ChatGPT, DALL-E and Grok providing an impressive array of functions (including research gathering, code creation, and custom digital artwork), the potential for their use and misuse in academia is clear.

Through the skilful use of well-considered prompts, AI systems can produce convincingly written essays, articles, and research that most individuals would be hard-pressed to identify as AI-authored. In this era of technological advancement, universities in particular face the challenge of recognising the growing presence of AI in coursework submissions and research projects.

In this article, we will discuss some of the challenges higher education (HE) institutions need to consider, as well as some of the positive contributions AI could make in a research environment.

Positive uses of AI in academic settings

While AI poses significant challenges in preserving academic integrity, it simultaneously empowers universities and students in the domain of research. AI has the potential to revolutionise research by streamlining data analysis, aiding in pattern recognition, and enhancing hypothesis generation.

For subject areas where researchers grapple with large volumes of data, such as physics, bioinformatics, genetics, climate science, and the social sciences (to name but a few), AI tools can quickly process and analyse these datasets, offering insights that might take researchers years to uncover. AI algorithms also excel at recognising patterns within data. This capability is a boon for fields like medical diagnostics, where identifying subtle correlations in medical records can lead to earlier disease detection and improved patient outcomes.

AI can also be used as a tool to help generate hypotheses based on existing data, suggesting avenues of research that human researchers might overlook. This collaboration between human intellect and AI’s computational power can drive innovation and has the potential to break down silos between academic disciplines.

On a broader scale, AI can be a very useful tool to help students structure their essays and gather information, and act as a prompt for carrying out further research or considering a broader array of topics.

But AI should be used with several caveats in mind.

The threat to academic integrity

One of the most obvious and pressing issues universities face with the rise of AI is the persistent threat to academic integrity. While AI has enabled more robust plagiarism detection tools, making it increasingly difficult for students to submit someone else’s work as their own, it has also opened the door to more sophisticated and harder-to-detect forms of academic dishonesty.

Students can employ AI to generate essays, reports, or answers that are difficult to trace back to their sources. This form of ‘smart cheating’ challenges universities to adapt their anti-cheating strategies continuously, which is time-consuming and not always possible.

Universities need to confront the ethical quandaries posed by AI. Where is the line between legitimate assistance and unethical academic support? How do they strike a balance between leveraging AI for educational benefits and safeguarding the integrity of the learning process?

Proposed pathways for universities

Policy updates

The rise of AI presents a double-edged sword for universities, placing new pressures on academic integrity. To thrive in this digital landscape, universities must carefully review their existing policies on academic standards and expectations of students. They must establish clear ethical guidelines for the use of AI in coursework and research, so that the boundaries between acceptable assistance and academic dishonesty are transparent and well-defined.

Student and staff training

Another essential aspect is staff and student training. As AI continues to evolve, universities should invest in faculty development to ensure that course leaders and students are well-versed in how to use AI tools effectively and ethically, whilst preserving academic integrity. Students must learn that research gathered by an AI must always be validated through rigorous scholarship and critical thinking, not accepted at face value. Just as humans are susceptible to bias and stereotypes, so too is AI. While AI tools can generate information that may seem credible at first glance, they are only as good as the sources they have been trained on, leaving room for misinformation and fabricated content, often dubbed ‘hallucinations’.

Guidance on AI markers

Alongside updating policies on AI use in academia and providing staff and student training, universities should ideally offer additional guidance on certain ‘red flags’ that strongly indicate AI usage. Despite the existence of anti-plagiarism solutions like Turnitin, Copyscape, and GPT-3 Detector, there is as yet no system that can definitively detect the use of AI. There are, however, certain markers that course leaders can look for, signalling the need for closer scrutiny of the submitted coursework.

In this digital era, where AI and academia intersect, universities must navigate a path that preserves the core principles of learning and knowledge dissemination while harnessing the power of AI for the advancement of education and research. By doing so, higher education institutions can reap the benefits of AI whilst avoiding the worst pitfalls.


About the author: Tamar Elderton-Welch is Head of E-Learning at Marshall E-Learning. Tamar’s key areas of interest are learning theory, gamification and emergent technologies. With a background in Ancient History, Tamar also likes to keep in touch with the latest journals and publications that relate to religiosity in Roman Egypt.

Marshalls is creating guidance for universities on this topic as well as other AI-related topics. If you are interested in learning more about our upcoming generative AI eLearning courses, or would like to learn more about AI’s impact on the HE landscape, contact us.