Like almost every industry and sector globally, higher education has been ideating, exploring, questioning, researching, and strategising about the potential and very real impact of generative AI. And for good reason. With its capacity to support students in real time, enhance learning design, provide rapid feedback, accelerate personalised learning and reduce the burden of administration, Gen AI will significantly re-shape higher education.
Indeed, tools like ChatGPT, Midjourney, DALL·E 2 and Google’s Bard have set the scene for widespread sector disruption, as generative AI explodes in its use, application, and influence in higher education.
Here at OES, our AI experts suggest seven strategies for universities to consider right now, as they navigate the mind-boggling new world of generative AI.
1. Identify and explore the best AI opportunities
OES’s Generative AI Lead, David Paroissien, says Gen AI presents profound opportunities for higher education innovation and advancement, from virtual tutors and chatbots to increased student support and more efficient teaching and administration.
“There are so many ways universities can use AI, from creating or re-designing high-quality learning materials, to customising student learning tasks, to delivering automated, personalised feedback, to achieving time savings for academics,” he says.
“The challenge is to prioritise and identify which AI tools can best enhance and support your institution, people and programs.”
David sees the enormous potential for AI to boost equity and accessibility for regional and remote students, as well as those from culturally and linguistically diverse (CALD) backgrounds or those with different needs or disabilities.
Shifting the equity and opportunity paradigm is a high priority for most universities and also for the Federal Government, with the Australian Universities Accord Discussion Paper flagging the importance of ‘supporting students from under-represented groups to overcome barriers to participation and enjoy success.’
“With the right groundwork in place, this is something AI tools can help address. Every university is different. Consulting AI experts or convening working groups with broad representation can progress decisions about which AI opportunities to investigate and adopt,” David adds.
2. Maintain integrity with clear and accessible guidelines
Universities understand the importance of proactively preserving integrity in teaching, learning and assessment when embracing AI.
“While the potential of AI is huge, academic misconduct remains a critical issue. Universities will need to implement robust mechanisms to prevent misuse of AI tools and ensure academic integrity and trust remain at the heart of the degree,” he says.
With this in mind, TEQSA plans to issue a request for information (RFI) to all higher education providers in 2024, to understand the development and progress of each institution’s strategy around the impact of generative AI on the integrity of their awards.
David also recommends universities urgently develop AI guidelines, regulations and policies that can be easily understood, adopted, applied and updated.
There’s an inherent challenge for universities, as AI tools are evolving so quickly. But having agreed guidelines provides a critical framework for decision-making.
Universities are not acting in isolation on this front, as institutions, corporations, governments and jurisdictions around the world grapple with AI regulation. The European Union’s AI Act and White House executive order with new standards for AI safety and security are two examples. Meanwhile, the Bletchley Declaration has been signed by countries committed to seeing AI used in a safe, human-centric, trustworthy and responsible way. Locally, the Australian Government is currently conducting an inquiry into the use of generative artificial intelligence in the Australian education system.
3. Rethink professional learning to quickly upskill
To keep up with the rapid pace of AI development, universities can provide staff with frequent opportunities to learn about and understand AI – even if they are quite informal.
“Rethinking professional development can help educators and administrators stay on top of rapid developments in AI,” explains OES Director of Innovation, Veronica Moran.
“Regular AI PD sessions could be as simple as teams workshopping ChatGPT prompts or sharing tips and tricks on how to use Midjourney. As everyone is learning how to use these new tools, frequent sharing of experiences can help collectively upskill and identify opportunities for agile, practical implementation.”
More formal brainstorming sessions and hackathons can also help teams innovate and work out which tools they can use to achieve the results they’re after.
With the potential of AI only beginning to emerge, David says “creating a system that allows for innovation” is critical.
4. Be on high alert for biases, and don’t remove the human element
There are myriad ethical, copyright, intellectual property and bias concerns about existing and emerging AI tools. According to Veronica, mitigating these concerns means keeping humans firmly in the driver’s seat, using AI to enable teaching and learning. She says universities should never allow AI to decide a student’s future.
“Ethically, we should not enable AI tools to determine whether students pass or fail. Human decision-making is critical. More broadly, AI should enhance, not replace, the support provided to students throughout their learning journey. If institutions approach the technology as a facilitator, they can amplify the level of support and ultimately enhance learning outcomes and KPIs,” she says.
Veronica says integrating generative AI-powered virtual learning assistants and automated feedback tools provides a unique opportunity to personalise learning, catering to diverse learning styles and individual paces, while teachers and academics remain instrumental in guiding, mentoring, and inspiring students.
5. Reimagine assessment
“This is a truly huge area for universities, but there’s no doubt AI is prompting a re-think of how we assess the attainment of skills to find, analyse, interpret and apply knowledge,” says Veronica.
“AI will be the catalyst for widespread changes in higher education assessment, which many institutions have been considering for some time anyway. It’s a real chance to make assessment much more authentic.”
Veronica adds universities could look at re-structuring existing assessments to incorporate AI tools. She also recommends creating new assessment tasks that can’t be completed using AI.
6. Protect data privacy and security
AI tools may present cybersecurity risks such as hacking, threats to data privacy and data breaches, particularly if institutions are using student details or interactions to train large language models.
Veronica suggests universities and individual faculties carefully assess the data privacy policies and settings of their chosen tools and platforms.
“Leading software platforms document how they are safely organising and protecting institutional data, but these documents should be regularly and carefully reviewed in light of the shifting AI landscape,” she says.
7. Communicate with your students
Students are key stakeholders as higher education is disrupted by AI. Veronica says bringing them on the journey will make for a smoother ride.
“Students want modes of learning and assessment that are fair and transparent, and they want to graduate with the skills employers are seeking. Communicating regularly and openly about how AI is being used in your institution can help students understand and adhere to the new learning paradigm.”
The generative AI story is just beginning
We’re living through the Fourth Industrial Revolution and, as David explains, “we don’t yet know the impact of AI on today’s students. How will students who have used AI throughout their school and university education be impacted in the way they learn and work in the future? We simply don’t know.”
Yet, it’s vital that higher education institutions embrace and upskill staff and students in applying AI to their work and learning experiences. Employers will expect graduates to be literate in how to use AI tools and also able to leverage them to achieve productivity gains.
“At OES, we’re grounded in our value of supporting student success. We are developing ways to use AI to help students to develop skills, to learn and understand, not just get the right answer.”
“We see AI as a massive opportunity to re-imagine higher education for the better. To do that, we’ll need careful, deliberate thinking, and guardrails to guide the technology in the best possible direction.”
Get in touch to find out more about the work our Gen AI team is doing in this space.