The Department for Education sent out a ‘Call for Evidence’ on the use of GenAI in education. It was open for 10 weeks, from 14 June to 23 August 2023, and went out to practitioners across all sectors of education, the educational technology sector and AI experts.
‘GenAI uses foundation models, including large language models (LLMs), trained on large volumes of data. It can be used to produce artificially generated content such as text, audio, code, images and videos. Examples of GenAI tools include ChatGPT, Google Bard, Claude and Midjourney. This technology is also being integrated within other tools. From 14 June to 23 August 2023, the department held a Call for Evidence on GenAI in education. The purpose was to understand the uses of GenAI across education in England and the sector’s views towards the opportunities and risks it presents’.
The call for evidence received 567 responses from education, EdTech and AI organisations both within and outside the UK.
The Department for Education begins by acknowledging an increase in interest in, and use of, GenAI tools by the public and the education sector.
The responses indicated that teachers are seeing the benefits of using GenAI, and identified both the support required and the concerns around using GenAI tools.
The report added that the department is ‘committed to ensuring the department does all it can to maximise these opportunities whilst addressing the risks and challenges’.
The opportunities identified were around giving teachers more time and providing support for students, especially those with SEND, students for whom English is an additional language, and subject-specific support.
This is interesting, as other papers discuss similar possibilities and potential for AIED.
Based on the call for evidence, the department intended to hold a hackathon which would allow teachers to experiment with GenAI in order to discover ‘its capabilities in an educational context’.
The department committed to investing up to £2 million in Oak National Academy to improve and expand their AI tools for teachers, and to providing ‘£137 million to the Education Endowment Foundation to encourage innovative and effective evidence-based teaching, including using technology such as computer adaptive learning and AI’.
The responses indicated a need to improve access to technology: the department is therefore investing a further £200 million to upgrade schools with low Wi-Fi connectivity, and working with providers to ensure all schools have a high-speed connection by 2025.
‘The department is also setting standards so that school, college and trust leaders know what they need to do to ensure their technology is up to date, maintain security and support online safety’. This is a good response to many of the questions raised in the Council of Europe’s 2019 report, which decried a lack of regulation and oversight in the sector, leaving it vulnerable to a free-for-all for commercial operators.
The Risk Protection Arrangement (RPA) is an alternative to commercial insurance for academies and local authority-maintained schools; the department has included cybercrime in the cover since April 2022. They state it ‘has over 10,000 members (47% of all eligible schools)’.
This is useful to know, as it would give schools and academies confidence as they use AIED tools.
The department is collaborating with Ofqual, Ofsted and the Office for Students; the government has produced a White Paper ‘which sets out the government’s first steps towards establishing a regulatory framework for AI’. This is late in coming, as AI tools are already available and being used. The government must create this guidance policy with urgency.
The call for evidence was held so that, as the department begins developing policy for the sector, it can respond to sector changes and build from a strong evidence base.
They state that they will continue to monitor and engage with the sector as the technology changes, updating their policy paper accordingly and sharing these updates through papers, blogs, reports and webinars.
This overview indicates that the Department for Education understands the concerns and shortcomings around the use of GenAI in education, and is committed to supporting education practitioners whilst enabling the opportunities and strengths of GenAI in education.
The call identified areas of risk and concern, and reiterated the importance of the teacher in the classroom despite the use of AIED, something they stated will not change. This is an interesting conclusion given the forecast of AIED developers and writers that this would change, an ambition they have held for the past thirty years according to Ido Roll and Ruth Wylie. (See my blog post: https://www.nanaoguntola.me/post/a-review-of-a-a-review-of-the-evolution-and-revolution-in-artificial-intelligence-in-education-by)
Some respondents called for changes to the curriculum in response to the challenges of GenAI in the classroom. The department feels the ‘teaching of a broad, knowledge-rich curriculum is fundamental’ and important for learners to be ready for the future of employment, and will reform the curriculum accordingly in order to ensure high standards in A Level and GCSE qualifications.
‘The department’s statutory safeguarding guidance’ provides information to schools on how they can protect learners online, including when using GenAI, limiting risks to students as much as possible. This is key, as the Council of Europe’s report decried the lack of safety and protection for learners. This responsibility has to lie with policy makers and learning institutions, and this paper indicates the department understands this and has taken the necessary steps to put the guidance and protection in place.
The report calls for schools to protect children from harmful practices online, but most schools and learning institutions already have IT policies in place which effectively cover all use of technology in educational settings; these should now be extended to include the use of AI.
The respondents felt that institutions must understand the data privacy implications of using GenAI tools and must protect the personal and special category data of learners. They should also be transparent and ensure students understand the implications as well. Pupils and students own the intellectual property rights to original content they create.
The report added that the work of students must not be used to train GenAI models unless permission is expressly provided, a point the Council of Europe is adamant about. In fact, the Council of Europe goes further to challenge the very act of permission and its legality in the context of the power relationship between the school and the child.
Exam boards have set out strict rules about students cheating with GenAI tools, and with some bodies this could lead to disqualification. They suggest teachers would be best placed to identify students’ own work, as they would know them and their capabilities.
‘The Joint Council for Qualifications published guidance earlier this year which reminds teachers and assessors of best practice in preventing and identifying potential malpractice, applying it in the context of AI use’.
Ofqual is in constant discussion with qualifying bodies to ensure their accreditation is robust with regards to the use of GenAI tools, and to make adjustments whenever required.
They note that ‘GenAI tools can produce unreliable or biased information’, which must therefore be checked by users, and that the accuracy of the information is the responsibility of the individuals and institutions using them. This is right. There is a clear understanding here that GenAI tools are just tools, utilised and manipulated by the user. Thus, GenAI tools may produce inadequate content, but it remains the responsibility of the human to decide what is appropriate or correct to utilise. GenAI tools cannot produce content by themselves without human agency.
Additionally, the use of GenAI tools does not eliminate the requirement for knowledge of the field. Human judgement is imperative to determine the accuracy of content provided by GenAI tools. This is right, and once again reiterates the value of human agency in the use of the tools.
It does also raise questions of the possibility of this changing in the future as GenAI tools become better trained.
They found that responses to their call for evidence came from teachers who were early adopters, already using GenAI tools in their classrooms. They used the tools to create educational resources, for lesson and curriculum planning, and to streamline tasks. Some teachers were experimenting with using them for automatic marking and student feedback.
Teachers listed benefits such as freeing up their time, enhanced teaching effectiveness, student engagement and ‘improved accessibility and inclusion’ for learners.
The responses also included concerns about GenAI use, including learners’ dependence on the tools, misappropriation of the tools, and data and privacy risks.
Some expressed concern over the possibility of AI tools replacing the human tutor, a position the Council of Europe is not keen on, but which developers are aiming to accomplish.
They also expressed concern about the digital divide created by socio-economic factors, another area of concern for the Council of Europe, which sees inequality being amplified by ownership of, and access to, the tools.
Most of the respondents were optimistic about the future use of GenAI tools, especially their capacity to free up teacher time, whilst a minority expressed concern over the risks, which for them ‘outweighed the benefits’.
Respondents called for increased support from policy makers and governments to ensure the safety of AI tools. This is an important issue: the Council of Europe calls on designers to ensure this is the case, but in reality governments and policy makers must set the standard to which designers build.
The respondents would like to see training in the use of GenAI tools, improvements to AI infrastructure (which the department has already committed to), regulation on issues of privacy and data protection, and reforms to curricula and assessments in line with the use of GenAI.
The report concluded by recognising that the sample size was limited, but noted that most respondents were positive about GenAI use in the classroom whilst feeling they needed more guidance and an understanding of how to mitigate the risks. The department called for more research to understand the impact, but committed to engaging with and supporting the education sector in its interaction with and adoption of AI.
Respondents encouraged the department to play a prominent role in shaping GenAI use in education. There was a broad acknowledgement of a need to balance risk and reward. Most respondents wanted the UK to become a proactive, influential player in this emerging field. At the same time, respondents expressed a desire to proceed with caution, due to the concerns and risks identified.
This report was an excellent read because it provided responses from actual users of GenAI tools. It clearly shows the difference between people who actually use the tools and people who write about the tools without experience of them.
It indicates that the use of GenAI tools in education has positive benefits, with many opportunities for growth.
It dispels the fear that teachers will become redundant, showing them rather as a key part of the classroom: monitoring, allocating and directing the tools while still providing empathetic teaching and learning for students. The tools also give teachers a more effective system for monitoring students’ work.
Additionally, the report demonstrates the need for the Department for Education to be responsible for oversight and for developing policy in collaboration with stakeholders, which would guide the use of GenAI tools in education and give teachers the confidence to use them.
Unlike the Council of Europe’s report, which urges developers to take on this responsibility, the department is the right place for it to sit. Designers and developers who are eager to make a profit will develop tools to meet these guidelines, as otherwise they would lose out on sales; this curtails the risk identified by the Council of Europe that these commercial bodies could operate without control or oversight.
The report also indicates that there are other guideline documents available, such as ‘the department’s statutory guidance on Keeping children safe in education and the Filtering and Monitoring standards, reviewing the Data Protection and Digital Information Bill and their impacts on individuals’, which can be used in the interim as they are updated. This should provide confidence for users.
In conclusion, GenAI tools are useful, and the Department for Education needs to provide oversight and guidance on their use, creating the confidence that enables educational institutions to use them to prepare students for the workplace of the future.
You can read my review of the Council of Europe's report here: https://www.nanaoguntola.me/post/a-review-of-the-council-of-europe-s-report-artificial-intelligence-and-education-a-critical-view-th
My course on 'How to use AI in your Creative practice' is available at www.famk.co.uk
Department for Education (November 2023), ‘Generative AI in education: Call for Evidence summary of responses’. https://assets.publishing.service.gov.uk/media/65609be50c7ec8000d95bddd/Generative_AI_call_for_evidence_summary_of_responses.pdf