AI in Higher Education: 3 practical steps universities can take to forge ahead with generative AI

08 May, 2024 | by Veronica Moran, Director of Innovation

Universities around the world are embarking on their own strategic generative AI journey. What processes can they adopt to unlock the power of this emerging technology for the benefit of their students?

OES’s Director of Innovation, Veronica Moran, shares her tips for institutions just beginning to explore AI.

Defining the AI ‘why’ for higher education 

There’s no question that university staff and students are already using Gen AI. So, it makes sense for universities to ensure they are doing so safely, and with appropriate enterprise-grade tools.

According to Deloitte Access Economics, students are using AI at three times the rate of the general workforce, highlighting the need for universities to get on the front foot. The rate of academic integrity breaches has increased significantly, catalysing the need to re-think assessment types and policy change.

A further imperative is borne out in research, which shows that companies using AI tools are much more efficient. A study conducted by Harvard Business School and Boston Consulting Group revealed that using generative AI leads to major gains in productivity, efficiency, and performance, helping staff to work faster and produce higher quality outputs.

In a university context, determining the right approach for Gen AI adoption is critical to address disruption, harness efficiencies and innovate both the teaching and learning experiences.

1. Benchmark your university’s current AI capability and maturity

Universities should start by benchmarking their current AI capability across three main areas: people, processes, and technology.

Key areas to review include:

  • The level of AI literacy staff currently have, and how it could be uplifted.
  • The current strategy for innovation, including how the organisation evaluates and experiments with new technologies.
  • Data strategies, the types of data being stored, and data governance procedures. Data is the fuel for AI and a critical part of the AI ecosystem.
  • AI offerings within the platforms currently in use at the university.

Thankfully, there are a range of quality assessment tools and frameworks to make this process easier:

  • The newly released EDUCAUSE Higher Education Generative AI Readiness Assessment can provide a sense of institutional preparedness for strategic AI initiatives and help universities understand the current state and the potential of generative AI at their institution.
  • The Accenture AI-maturity framework features 17 key capabilities universities can use to establish whether they are AI Experimenters, Innovators, Builders or Achievers. As large, complex institutions, most universities are likely to be at the Experimenter phase. However, this could vary across the institution. While some departments might be dipping their toe in the AI pond, others are using it more widely for learning analytics or chatbots.
  • It is also worth consulting HolonIQ’s open-source digital capability framework to identify AI maturity gaps in the context of higher education capabilities across the entire gamut of the student journey.

2. Appoint an AI team to ‘test and learn’

Ownership is essential for pushing innovation forward. Institutions need a skilled and qualified AI team with executive level support and overarching responsibility for institution-wide AI assessment, strategy development, governance and implementation.

Having a dedicated team in place allows the organisation to more readily identify, explore and pilot AI technologies.

At OES we take a ‘test and learn’ approach, testing new AI technologies with small cohorts of students before running a full evaluation of their effectiveness and value. Universities can adopt a similar process to inform their AI direction and to select AI tools that will deliver the most value.

The ‘test and learn’ approach informed OES’s development of its AI-powered learning analytics tools that help universities support students to succeed with the right resources at the most appropriate time.

As early adopters of generative AI, we have followed the same process to develop our latest offering: an AI-powered teaching and learning platform that complements current learning management systems and associated tools.

We’ve already piloted one of the products in the platform – a Gen AI virtual learning assistant named ALVIE – with a key university partner. The results from the pilot were exceptional, delivering improvements in student experience and performance across every metric.

We are also using Gen AI internally to drive efficiencies and productivity within our own business, providing the advanced GPT capability in ChatGPT Team to those who are showing a keen interest in using AI to make their work more efficient. Be cautious with externally built custom GPTs, however: they can call external APIs and should be vetted by security teams before use.

Having a dedicated innovation process, a dedicated AI team and advanced GPT capability for select staff has enabled us to move fast to discover the potential of these technologies and start leveraging them to produce real, positive outcomes.

3. Tackle risks and establish AI governance

Of course, there are wide-ranging risks associated with AI tools, including unfortunate examples from around the world where proprietary information and confidential data were fed unwittingly into ChatGPT. Uplifting AI literacy and establishing clear rules for staff usage of AI tools is important to mitigate this risk.

It is also important that universities understand how to conduct an AI risk assessment when developing or adopting AI tools. The newly ratified EU AI Act’s regulatory framework defines four levels of risk for AI systems: unacceptable, high, limited, and minimal or no risk. An example of high risk in an educational context would be relying solely on AI to score exams.

Universities also need to understand, and comply with, ethical AI standards such as the Australian Government’s AI Ethics Principles, and know how to develop responsible AI systems. To enable this, establishing a robust AI governance framework is vital.

University policies relating to AI should bake in principles such as transparency, fairness, privacy and accountability. Universities already effectively govern the collection and use of student data, and should approach AI governance with equal rigour.

Generative AI is here to stay

As generative AI continues to influence the way humans learn, work and play, higher education must take a proactive, process-led and rigorous approach.

As more and more organisations jump on the bandwagon, and students harness AI tools in their own lives, developing AI capability maturity should be a key goal for universities.

Next steps

Get in touch to find out more about the work our Gen AI team is doing in this space.

Read our round-up of top generative AI considerations for universities.