The National Centre for AI in Tertiary Education has been established to help members unlock the power of artificial intelligence in order to deliver a fantastic educational experience to every learner. Amongst other things, the Centre will run pilots to evaluate AI solutions, provide advice and guidance on using AI, and support the ethical development and deployment of AI.
The Centre has also set up a community group, which will be a place where members can share best practice and lessons learned, upskill and learn more about artificial intelligence, ask and answer questions around AI and its uses in education, and connect with each other to find common solutions to shared problems.
Members of the Community got together virtually on 27th July 2021; here is an overview of what we discussed.
Uses of AI
A number of Community members discussed how and why they were using AI in their settings. We learned, for instance, about the use of virtual assistants to improve administrative processes for disabled students. Students can use the service (which responds to both text and speech) to register their disabilities and get relevant information and support, which can make processes more satisfying and less burdensome.
We also learned how AI can be used to help students visualise a corpus of literature on a particular topic. This application was found to be particularly effective for students who were just starting to learn about a particular domain, as it enabled them to make sense of the rich tapestry of ideas, research papers and nomenclature.
A number of members also discussed the potential for using AI to support the assessment process. One member noted that AI could improve the consistency of marking, whilst also saving educators’ time and enabling students to receive more detailed feedback on their work. They did note, however, that the use of AI for marking might be limited to shorter answers and less creative pieces of writing. Another member mooted that it might be more effective and appropriate to focus the use of AI in assessment on generating questions and answers, for instance for multiple choice questions. With this model, human assessors would still be expected to verify that questions were suitable. This would maintain human responsibility for assessing and making judgements on students’ work but would also allow educators to use their time more efficiently. More broadly, the value of using AI to save time which could then be spent more effectively was raised at numerous points throughout the discussion.
We also heard about the potential use of AI in analysing and evaluating student data, so that colleges and universities could better identify their needs, and hence deliver more timely and focused interventions. On this topic, an appetite for expanding on the successes of learning analytics was also expressed.
Concerns and barriers to adoption
Several members raised potential barriers to successfully adopting AI in the tertiary education sector. One such issue was that the needs of different institutions – and even of different departments within the same institution – may vary so widely that it could be difficult to find effective solutions that worked across multiple contexts. That said, it was suggested that there might be more shared ground than institutions recognised, and that there may well be common problems across the sector that shared common solutions.
The problems arising from integrating new AI systems with existing systems were also mentioned on a number of occasions. It was suggested that the National Centre for AI could work with suppliers and institutions to streamline integration processes and hence minimise this barrier.
Cultural barriers to the adoption of AI were also raised. One member observed that trust in AI, and in the motivations for using it, was essential for sustainable adoption. For instance, if stakeholders perceived that AI was only being used to cut costs rather than to enhance the quality of teaching, learning and research, then the resulting scepticism could undermine adoption. Similarly, if stakeholders were excluded from conversations around the use of AI, this could further erode trust. Another member stressed that internal politics created its own set of barriers around the use of AI.
A further concern was that the increasing maturity of AI could potentially facilitate student cheating, which would significantly undermine the integrity of assessments and qualifications.
Suggestions for the National Centre for AI
As well as discussing instances of good practice and barriers to adoption, community members also presented ideas on how the National Centre for AI in Tertiary Education could support the accelerated adoption of AI. One member suggested that the Centre should aim to map out a holistic student journey, identify problems faced by students and institutions, and thus explore the solutions that AI could offer. Another member suggested that AI in tertiary education might be too broad a remit, and that it might be more effective to focus on either the uses of AI in a particular domain (e.g. assessment) or the uses of a particular sub-category of AI (e.g. natural language processing).
The track record of AI in education to date was also questioned, with one member querying whether there were actually that many examples of the effective adoption of AI, and suggesting that the Centre should focus its attention on establishing a canon of mature, effective practice. Conversely, it was also suggested that the sector could learn effectively from what hasn’t worked in the past, so that pitfalls are avoided and mistakes are not repeated.
A further suggestion was that the Centre could spur innovation by initiating AI challenges in which PhD students would compete to design solutions to well-defined and pressing problems faced by colleges and universities.