The Scottish Parliament's think-tank

Seminar Report: AI in Scottish Schools – What do we know so far?

Tuesday 19 March 2024


What challenges and opportunities lie ahead, and how can we ensure AI benefits all students? This event continued our critical conversation on the policies, strategies and practices shaping artificial intelligence’s future in Scottish schools. It brought together educators, academics and other experts and provoked discussions on key themes like ethical considerations, equitable access, and collaborative development.


This is the third in a series on Learning through Life in the 21st Century, which the Goodison Group in Scotland is running with support from the Futures Forum.


Report


Summary

The seminar, held on 19 March 2024, continued our exploration of Artificial Intelligence (AI). Hosted by the Moray House School of Education and Sport, University of Edinburgh, the event was opened by Sir Andrew Cubie, Chair of the Goodison Group in Scotland (GGiS), who welcomed attendees and emphasised the significance of AI in education. He highlighted the rapid pace of AI development, driven by global powers, and stressed the importance of embracing this change. He underscored the role of GGiS as a convening organisation, providing a safe, collaborative space for discussions on AI in Scottish education.

Attendees heard insights from Professor Judy Robertson and Dr Jen Ross. They highlighted early findings from a project being run in partnership with GGiS, “Towards Embedding Responsible AI in the School System”. This is part of the Bridging Responsible AI Divides (BRAID) programme, a UK-wide programme dedicated to integrating Arts and Humanities research more fully into the Responsible AI ecosystem, and is funded by the Arts and Humanities Research Council. The aim is to understand young people’s perspectives on AI in education through arts-led workshops, develop creative forms of AI literacy learning, and shape the future of AI use in education.

Early findings from initial stakeholder interviews revealed diverse opinions on AI’s role, ranging from a transformative tool to an extension of existing digital tools. The project highlighted the need for a cohesive national policy on AI use in Scottish schools.

A second project, funded by the Economic and Social Research Council, will be starting shortly. It aims to develop teaching resources for AI futures thinking in Scottish secondary schools. It will support teachers in piloting these resources and share insights with education stakeholders.

The breakout discussions identified several key themes:

  1. Educational Focus: Emphasis on aligning AI tools with the purpose of education, learning, teaching needs and the principles of the Curriculum for Excellence.
  2. Ethical Issues and Commercial Interests: Concerns about the profit-driven nature of AI companies and unethical content scraping.
  3. Legislation and Approval: National principles and a regulatory framework are needed as Scotland lacks tailored policies. Practical tools and strategies should be developed with local authorities and educators. Could vetting and approval of tools help Scottish education bring some order to the current anarchic nature of the AI tools landscape?
  4. Inclusion and Inequality: Potential for AI to enhance inclusion but also exacerbate inequalities. How can we ensure equitable access to AI resources?
  5. Data Management and Privacy: Importance of ethical data gathering and managing the large volume of data generated about young people.
  6. Role of Schools and Educators: AI has potential to enhance efficiency, allowing teachers to focus more on direct engagement with students. Schools should focus on building capacities of learners, and educators must support safe and effective AI use.
  7. Assessment and Skills: AI necessitates reevaluating assessment methods and the skills required for the future.
  8. Collaborative Development: AI tools should be developed with input from educators and learners.
  9. Need for Collective Action: Addressing AI in education requires collaboration beyond individual schools or local authorities.

The seminar called for collective action among educators, policymakers, and technology developers. The discussions highlighted the complexity of AI in education, emphasising the need for clear educational focus, addressing inequality, redefining roles, and fostering collaboration. The potential of AI to enhance education is significant, but it requires thoughtful and inclusive approaches to ensure equitable benefits for all students.

Introduction

On 19 March 2024 we were due to continue our evening seminar series, in partnership with the Goodison Group in Scotland (GGiS), on Learning throughout Life in the 21st Century. Regrettably, late business in the Chamber meant that parliamentarians were unable to participate, and the Moray House School of Education and Sport, University of Edinburgh, kindly stepped in at the last minute to host the event.

We wanted to set out what we know about AI in Scottish schools so far, and we did this by hearing from Professor Judy Robertson, Chair in Digital Learning, and Dr Jen Ross, Senior Lecturer in Digital Education, co-director of the Centre for Research in Digital Education and Education Futures Fellow at the Edinburgh Futures Institute. The academics presented their early findings and future plans from two projects currently being undertaken in partnership with GGiS.

The event provided valuable insights into AI’s impact on learning processes, policy considerations, and ongoing projects aimed at integrating AI into education. This report captures the key messages and discussions from the seminar, highlighting the challenges, opportunities, and future directions for AI in education in Scotland.

Chair’s Remarks

Sir Andrew Cubie, Chair of the Board of the Goodison Group in Scotland (GGiS), opened the seminar with a warm welcome to those joining us both in the room and online.

He emphasised the significance of the seminar’s focus on AI, noting the considerable interest in the topic. He outlined GGiS’s commitment to learning throughout life and its focus on forward-looking discussions rather than tactical policy debates, noting the importance of addressing AI’s impact on education from a broad, long-term perspective. He then described GGiS’s involvement in two AI projects with Moray House, the School of Informatics, and the Centre for Research in Digital Education at the University of Edinburgh. He praised these collaborations for their potential to make significant contributions to AI development in education.

He also reflected on AI’s rapid pace of development and noted how this pace can be bewildering, particularly for older generations. He stressed that the acceleration of AI is driven by global powers and that attempting to halt its progress with an artificial pause is unrealistic. Instead, he argued for embracing the vigorous pace of change to ensure society, especially younger generations, can adapt and benefit from AI advancements.

In concluding his remarks, Sir Andrew highlighted the role of GGiS as a safe, convening space for collaborative discussions on AI. He expressed optimism about continuing contributions to the dialogue on AI in education and looked forward to the seminar’s outcomes influencing future initiatives.


Project 1: Towards Embedding Responsible AI in the school system

Prof. Judy Robertson began by explaining that she would be providing an update on one of the AI projects with partners at GGiS, including initial findings from an early project work package. She would then hand over to Dr Jen Ross, who would share more about the project’s plans to work with young people.

Prof. Robertson started by outlining that the first project, ‘Towards Embedding Responsible AI in the School System’, is a scoping project under the Bridging Responsible AI Divides (BRAID) call, funded by the Arts and Humanities Research Council (AHRC). The project is one of 10 scoping projects funded and the only one focused on education. Prof. Robertson explained that it is a large project: in addition to Dr Ross, 15 other colleagues are involved.

Recognising the importance of the learner voice in Scottish education, and that the UN Convention on the Rights of the Child (UNCRC) extends to the digital world, the objective of the project is to understand young people’s (aged 13 and above) perspectives on the future of AI in their education and to ensure these perspectives can influence decision-making. Funding from the AHRC opens up ways for the project to use creative, imaginative, arts-based approaches; through participatory workshops, young people in three different schools, two in Scotland and one in England, will be able to share their views and aspirations for AI in school.

To help underpin the work with young people, Prof. Robertson said the early work of the project sought to map the current landscape of AI integration in Scottish schools, because if you want to ask someone what the future could look like, it helps if they know what the present looks like.

Prof. Robertson went on to say that her colleague, Dr Laura Meagher, interviewed key stakeholders, including representatives from the Scottish Government, Education Scotland, the SQA, Skills Development Scotland, local authorities and the regional collaboratives, educational technology companies or consultants, and allied groups such as the Scottish AI Alliance, the Children’s Parliament AI Group and other academics. This exercise revealed diverse opinions on AI’s role in education. Some view AI as a transformative tool requiring national guidance, while others see it as an extension of existing digital learning tools.

The following section provides an overview of the themes emerging from this work.

Current AI Use in Schools

Stakeholders reported varied levels of AI use, with some schools experimenting with AI-driven tools for personalised learning and administrative tasks, while others remained cautious or uninformed about AI’s potential benefits.

Interviewees highlighted the lack of a unified approach to AI adoption, resulting in a patchwork of practices and experiences. “There’s a lot of variability in how AI is being used across schools, and it often depends on the enthusiasm and knowledge of individual teachers or school leaders,” noted one interviewee.

One interviewee characterised people working on AI in education in Scotland as moles, “industriously burrowing through the blackness of the soil,” whereas another described the landscape as an “unsolvable maze but the solution should be everybody working together to find the way through the maze.”

Stakeholder Perspectives

Some stakeholders believe AI has the potential to revolutionise education by offering personalised learning experiences, automating administrative tasks, and providing real-time feedback to students and teachers. “AI can offer insights that were previously unimaginable, tailoring education to each student’s needs,” said one interviewee.

Others view AI as an extension of existing digital tools, enhancing but not fundamentally altering the educational landscape. “We already have digital learning tools, and AI is just the next step in that evolution. It doesn’t require a complete overhaul of our educational practices,” argued one interviewee.

Policy Gaps

There is seemingly no Scotland-specific policy or advice, and the Scottish Government “currently advocates the use of the UK Government guidance on the use of generative AI.” The Department for Education takes a very pro-AI stance, which could, and probably should, apply in Scotland.

This reveals a significant gap in cohesive, national-level guidance on AI use in education. However, there are some initiatives underway, such as the Scottish Government digital strategy, due to be published later this year, which is expected to include AI. Education Scotland will not be issuing a policy statement or guidance; its remit is to offer teachers professional learning. The SQA’s position is that “learners cannot submit AI outputs as their own work and AI cannot be referenced as a source”, although it is actively looking at this.

Stakeholders expressed a need for clear policies to ensure safe, ethical, and effective AI integration. “We need a national framework to guide AI adoption in schools, ensuring consistency and addressing concerns about equity and bias,” emphasised one interviewee.

Another interviewee highlighted that “no one quite has the confidence to take ownership of the issue and take a lead on it; nobody’s quite sure whose remit it is under.”

The absence of Scotland-specific AI policies has led to reliance on broader UK guidelines, which may not fully address the unique context of Scottish education. “Scottish schools are looking for tailored guidance that considers our specific educational goals and challenges,” noted one interviewee.

Concerns and Ethical Considerations

The issues that stakeholders raised included the potential for the exacerbation of existing inequalities, with uneven access to technology potentially widening the gap between advantaged and disadvantaged students. “If we don’t address the digital divide, AI could end up reinforcing the inequalities we’re trying to eliminate,” warned one interviewee.

The biases within AI systems were a major concern, with stakeholders emphasising the need for transparency and accountability in AI algorithms. “AI systems are only as good as the data they’re trained on, and if that data is biased, the AI will be too,” said one interviewee. Professor Robertson pointed out that all the large language models were found to be racist.

The evolving role of teachers in an AI-driven educational landscape was another focal point. Stakeholders stressed the importance of maintaining the human elements of teaching, with AI serving as a tool to support rather than replace educators. “AI should free up teachers to focus on what they do best—building relationships with students and fostering a love of learning,” argued one interviewee.

Concerns were also raised over what constitutes plagiarism, the lack of reliability and errors in generative AI results, the potential for the accelerated spread of misinformation, and the potential loss of the social dimension of learning.

Next steps

Dr Jen Ross stepped up to the lectern to talk about the foundational ideas behind their AI projects, highlighting the significant role of the arts and humanities in understanding AI’s impact on various aspects of social life. She emphasised that creative and arts-based approaches have been invaluable to their team’s work and outlined their plan to run co-creation sessions with young people, noting that every vision of the future has underlying values. Understanding these values, particularly those of young people who will live with the consequences of today’s AI decisions, is crucial. The project will involve artist-led workshops in three different schools (two in Scotland, one in England) to explore the future of generative AI. These workshops aim to encourage imagination, ideas, and the articulation of values, using playful and speculative approaches.

Participants will include students from mainstream and additional needs education. The workshops will use arts-based methods to develop and communicate young people’s messages and ideas about AI. These sessions will help identify current experiences with AI and integrate insights from earlier project phases to provide a solid foundation for exploration.

She highlighted the importance of the creative outputs of these workshops, known as zines: DIY, self-created short books expressing creative responses to significant issues. These zines will be shared with policymakers, education leaders, teachers, and schools to generate new ideas and perspectives on AI in education.

However, she also acknowledged challenges, such as ensuring the ethical, safe, and supported use of generative AI tools in workshops. Balancing the amount of information given to young people to foster AI literacy while encouraging critical imaginations about AI’s future is another challenge.

She also mentioned the tendency of future predictions to fall into utopian or dystopian extremes and emphasised the need to find a grounded approach. This involves considering the everyday experiences of students in a future AI-integrated school environment.

Project 2:

Dr Ross introduced us to the second project, funded by the Economic and Social Research Council’s Impact Acceleration Account, which, in partnership with the Goodison Group in Scotland, aims to build on the first project’s work. This project will develop, pilot, evaluate, and publish a teaching resource for Scottish secondary schools. The resource will guide teachers in leading AI futures thinking and creative participatory methods with their students. The project will support a group of teachers in piloting the resource and developing case studies to share insights with school leaders, policymakers and other educators.

Dr Ross outlined two questions that the project team was seeking input on from the group.

1. What aspects of leading AI futures work with young people will be the most exciting for teachers and what might be challenging?

2. How should we balance developing young people’s AI knowledge and literacy and fostering critical imagination about the future?


Discussion

Seminar attendees had the opportunity to participate in breakout group discussions to explore two key questions:

1. What questions and thoughts do these key messages and themes raise for you?

2. What would you add, change, or challenge in these key messages and themes?

The discussions were wide-ranging, covering various aspects of AI in education. Here is a summary of the key points and questions raised:

1. Ethical issues, approval and commercial interest

One participant commented that some countries vet and approve AI tools, while Scotland feels more like the “Wild West” in this respect. Some questioned whether the predominantly large companies developing AI cared about the unintended consequences of introducing it if those consequences did not affect the bottom line. Others wondered whether private companies were influencing education through the ‘back door’ with AI tools.

Could vetting and approval of tools help Scottish education navigate, and bring some order to, the current anarchic nature of the AI tools landscape?

One group highlighted that the speed of AI development and its availability mean that children, young people and parents/carers can potentially access instructional learning directly, bypassing schools, leading some to question the role of schools now and in the future. In addition, one group questioned whether AI can be truly experiential and replicate the ‘heart of pedagogy’.

2. The purpose of education, the role of the school and educator

For some, the discussion should start with the purpose of education: what do children and young people need to learn, and what do educators need to teach? AI tools should be developed collaboratively with educators and learners to support these needs, ensuring they are not just consumers of AI tools but also contributors to their development.

It was also suggested that in a context where AI could provide children and young people with access to the instructional side of learning at the ‘press of a button,’ there was a need for a genuine debate about the purpose of school and the role of educators. This could mean that the thinking behind the Curriculum for Excellence is timely, emphasising the importance of schools and educators focusing on building the capacity of learners.

There may be challenges, and there is a fear among teachers about the implications of AI. However, there should also be optimism about the use of AI: it should be considered a tool with the potential to enhance efficiency, allowing educators to focus more on direct engagement with learners.

3. Inclusion and Inequality

AI has the potential to enhance inclusion and equalise learning and opportunity for all learners in educational settings, but it could also exacerbate inequalities if not managed properly. Students with high social capital may benefit more from AI, increasing inequities, and schools have a crucial role in addressing this disparity. Care must be taken to ensure that the use of AI in learning does not create an environment where children and young people feel isolated, unable to socialise or at arm’s length from supportive staff in learning settings. This scenario could potentially lead to an increase in mental health and wellbeing issues.

It was noted that high-quality AI resources are often behind paywalls. How do we ensure accessibility for all? And who should subsidise these resources: local authorities?

4. Legislation and policy making

It was suggested that we need to implement a set of overarching principles for the use of AI, agreed as part of a national conversation, with local authorities working with educators to develop practical tools and strategies to implement AI locally.

Views on whether or not to pause the further development of AI varied, and for some this depended on context. For some, a pause to catch up and think would be helpful; others believed an artificial pause is unrealistic and that there is a need to adapt.

5. Data Management and Privacy

One view was that it is crucial to gather data ethically to maintain high-quality AI tools while avoiding the casual erosion of data privacy. Proactive involvement of the education sector in the development of AI tools for education would also help challenge unethical practices, such as scraping content from the internet, especially images.

A phenomenal amount of data is being generated about young people, and how that data is being managed is a concern.

6. Assessment and Skills

The increased use of AI necessitates a re-evaluation of assessment methods and the skills young people need for the future.


Conclusion

The seminar concluded with a call for collective action and collaboration among educators, policymakers, and technology developers. Dr Ross emphasised the need for ongoing dialogue and adaptive strategies to ensure AI benefits all students equitably. She also highlighted plans to develop and pilot teaching resources that incorporate AI futures thinking, fostering a deeper understanding of AI among both educators and students.

The breakout group discussions highlighted the complexity and breadth of issues surrounding AI in education. Key themes included aligning the development of AI and AI tools with the purpose of education, addressing inequality and ethical concerns, redefining the role of schools and educators, and the importance of collective action and collaboration in developing and implementing AI tools. The discussions underscored the potential of AI to enhance education while also posing significant challenges that require thoughtful and inclusive approaches.


FURTHER READING

https://www.sqa.org.uk/sqa/110688.html