Young people see value in greater use of artificial intelligence in schools and their lives, but want tougher rules to protect privacy, ensure equitable access, and mitigate its effect on climate change, according to a consultation conducted in partnership with the FT.

Youth Talks, run by the Swiss-based Higher Education for Good Foundation, used AI to solicit, translate, and summarise the views of people aged 15-29 around the world in April and early May this year. In total, it gathered more than 1,000 responses.

The highest volume of comments related to data protection and privacy concerns. One participant, Floribert, said: “Private information can be compromised through an attack against an AI system . . . [and] AI, itself, can be used as a tool to collect private data about individuals.”

Others raised worries over the risks of surveillance and profiling with the technology. Geo said: “AI might track attendance and classroom engagement, but that same data can be used to monitor students’ social interactions.”

Noamrech added: “I don’t want my private data to reach the wrong hands, especially in a world where people always sell data.”

There was demand for strict limits on the types of data that AI can collect and robust mechanisms to ensure transparency in how it was used. Bangel wrote: “No matter what privacy options you choose in your settings, we all know they are listening. This is the sad reality, and there is nothing we can do about it. In the future, I believe privacy will be a joke.”

While 23 per cent said they were ready to use AI in connection with their health and wellbeing, 77 per cent preferred to restrict its use, in order to protect personal and sensitive data. Younger people and those based in the Americas were more wary of potential misuse than those in Asia.

Many respondents also commented on the trade-offs between the value of AI and the large carbon footprint that it imposes, because of data server power consumption. Of those expressing a view, four-fifths preferred to limit AI’s use as a result. There was less concern expressed by those living in the richer, net energy producing countries of North America and the Middle East.

71% — proportion of 15-29 year old survey respondents opposed to robot teachers

For schools and their staff, the good news was that 71 per cent — and even more among those who are currently studying — opposed the idea of replacing teachers with robots for classroom teaching. Many cautioned that AI lacked the emotional intelligence, or the level of understanding and guidance, that human teachers provide.

Only 13 per cent expressed a readiness to embrace a school-free world and use AI as their personal teacher instead. Mafiken wrote: “Schools play a crucial role in fostering social interaction, emotional development and learning experience — which AI cannot offer.”

More than four-fifths also said they would want only limited help from AI with their homework — in order to nurture critical thinking and to avoid becoming too dependent on technology. Others highlighted its potential threat to creativity, with a concern about the “laziness” it could instil.

They stressed the need for training in how to use AI technology effectively, and understand its pitfalls. A respondent using the name Curious said: “In the current world, which is infested with fake news every day, we should teach everyone critical thinking and fact-checking. AI will greatly increase the scale of this problem of fakes.”

Many called for a stronger combination of people and technology in learning, noting that AI’s useful applications can include helping with note-taking, suggesting suitable content, and aiding assessment. Tatiana said generative AI need not be viewed as cheating: “I view AI as a support tool, similar to a dictionary, rather than something that should do all my work.”

Chikondi agreed, saying: “If we push for coexistence, it’ll make learning a fun and lucrative experience. The teachers will be able to research more, and they’ll make up for what AI lacks most: the emotional perspective and support.”

Grettel said that AI “enables personalised learning experiences . . . facilitates instant feedback on assignments and assessments . . . insights into student performance and learning trends . . . [and] enhances the learning experience, making it more immersive and enjoyable.”

Young teachers who participated in the survey stressed AI’s value in simplifying their administrative tasks, freeing up students’ time, and deepening research work. They saw the opportunity for making learning more fun, as well — in the form of personalised tests or quizzes.

Teacher Rochelle said: “It has saved me so much time when lesson planning, leaving me to fill in the blanks with my knowledge — and allowing me to be more present with my students in the classroom.”

However, a number of respondents cautioned that AI could increase the “digital divide” and inequality for young people, adding to social tensions. “Without intentional design and oversight, AI systems can perpetuate biases and inequalities present in society, leading to discriminatory outcomes,” warned Rasheed Jr.

Blanchard pointed out: “The majority of the [global] population has no access to the internet. Less than 50 per cent of the population own smartphones. This is especially true in under-developed countries, and even more so in countries that are rich in the mineral resources used to manufacture smartphones — eg, the Democratic Republic of Congo.”

They called for clear supervision and scrutiny, seeing a need for global regulation by international bodies to promote equity in access to the technology, and to avoid its misuse — for example, in propagating fakes.

As Geo said: “The line between helpful personalisation and intrusive surveillance can be thin. We must ensure that AI’s reach and power are balanced with our autonomy and self-determination.”

Copyright The Financial Times Limited 2024. All rights reserved.