LSI Insights - Future of Higher Education
The student journey and AI: what should stay human?
Across higher education, AI could move us from simply digitising services to automating judgements across recruitment, admissions, learning support and progression. The strategic risk isn’t automation in itself. It’s that we start to define the student journey mainly by what’s easiest to measure, rather than what matters most. In an AI-rich environment, the real question becomes: which moments should stay human because they shape trust, identity and opportunity, and because someone must be able to stand behind the reasons for what happened?
Executive summary
AI could increasingly be used to link what used to be separate parts of the student journey into end-to-end systems that can predict, recommend and intervene. However, that integration changes what institutions implicitly assume about judgement, fairness, evidence and accountability. Done well, it’s an opportunity to redesign the journey so support is timelier and outcomes improve. Done poorly, it can quietly erode trust or widen gaps.
The practical question isn’t “AI: yes or no?” It’s which decisions are genuinely safe to automate, and which must still require human judgement, with clear records of why a decision was made and who is accountable for it.
In what follows, I look at the student journey as an integrated system shaped by AI, where risks and unintended consequences tend to appear, which roles and moments should remain explicitly human, and what governance and decision tests can help balance innovation with trust.