At a recent AI workshop I attended, the word ‘trust’ kept coming up. Over the course of 30 minutes, several HE leaders described different contexts in which ‘trust’ was stifling engagement with AI within their institutions.
These included:
- Redundancy: an obvious and well-rehearsed area, with some staff fearing that AI will replace their role.
- Governance: a lack of clarity on expectations and responsibilities when it comes to AI usage.
- Content control: a major concern among academics about how internal AI systems would use their content.
- Lack of knowledge: staff don’t trust their own capabilities when using AI due to a lack of training or support.
- Anxiety: the fear of doing it wrong and getting into trouble, a key concern amongst students in terms of academic misconduct.
Upon reading the newly published Stanford HAI 2026 AI Index report, I realised
that this is not just an HE problem, but clearly a national and global one. The report
states that in Great Britain, only 39% of people said they trust AI to be regulated
responsibly.
For me, the trust issue goes hand-in-hand with the AI engagement challenges that institutions are facing. If we don’t trust something, then why would we get on board with it and use it?
So how do we build trust with our staff and students?
I would suggest:
- Have informed, honest, two-way communications about what the institution’s expectations are from AI; address people’s fears about workforce implications, highlight identified challenges that may impact people (and what’s being done to mitigate these), as well as the benefits that are hoped for.
- Ensure that time and resources are allocated to make everyone aware of the appropriate policies associated with AI, so that people understand what is expected of them when it comes to their usage of it and who is responsible for what.
- Bring transparency into communications and conversations around how staff and students’ content will be used by AI, allowing people to raise concerns but also to understand the decision-making process.
- Commit to all staff and students having access to quality, personalised training, recognising their different roles and needs when it comes to using AI, as well as the time required to carry out upskilling.
- Create an environment that supports the use of AI: champion the innovators, promote peer learning and good practice, and develop ‘safe spaces’ where people can ask questions or just give it a go!
If your team or institution is facing challenges around AI engagement, please get in touch.
About the author: Katie Steen is the co-founder of WorkSmart-AI, which specialises in supporting universities to adopt AI through senior leader consultancy and task-based workforce training.
Katie and her co-founder Dave Weller have both worked within educational L&D and communications for over twenty years, most recently as Digital Skills Leads at the University of Exeter.
If you’re keen to find out more about what AI training and consulting packages they offer, please visit their website: WorkSmart-AI.co.uk
