Artificial Intelligence

10 m

Develop effective learning environments and approaches to student support and guidance

The use and value of appropriate learning technologies

What can I do?

Impact: 3 | Quality: 3
  • Use GenAI for structured learning activities where students practise, receive feedback, generate explanations, compare responses, revise work, or solve problems — not as a general-purpose shortcut.
  • Design tasks so students must show their thinking, such as explaining why they accepted, rejected or improved an AI response; this helps distinguish AI-assisted output from student learning.
  • Avoid or limit GenAI when students need to independently demonstrate core knowledge, reasoning, writing, calculation, professional judgement, metacognition or self-regulation, unless the use is carefully scaffolded and evaluated.

What is this about?

Artificial Intelligence in education (AIEd) includes tools like chatbots, tutoring systems, feedback tools, and generative AI systems such as ChatGPT. These tools can provide explanations, feedback, practice opportunities, and support for writing, reasoning, and problem-solving.

What does the evidence say?

Overall, meta-analyses suggest that AI can have positive effects on student learning, but the size and reliability of these effects vary. A broad review of AIEd found a very large effect on learning achievement, but also showed that outcomes depend on education level, discipline, learning mode, intervention duration, and geography (Tlili et al., 2025). Chatbot studies also report positive effects, particularly in higher education and short-term interventions (Wu & Yu, 2024), although newer evidence suggests these effects may be smaller after adjusting for publication bias and may depend on targeted implementation (Laun & Wolff, 2025).

Generative AI studies are promising but still emerging. Meta-analyses report positive effects for academic achievement, language skills, affective-motivational outcomes, and higher-order thinking (Sun & Zhou, 2024; Deng et al., 2025; Han et al., 2025; Chen & Cheung, 2025). However, effects are not consistent across all outcomes. For example, one meta-analysis found positive effects for achievement and higher-order thinking, but no significant effect on metacognition (Chen & Cheung, 2025). Other reviews also note methodological limitations, publication bias, and the need to distinguish improved AI-assisted outputs from genuine learning (Deng et al., 2025; Laun & Wolff, 2025).

When might I use GenAI?

  • When it is designed as a structured learning scaffold, rather than simply made available for general use; AI effects vary by learning mode, discipline, duration and implementation context (Tlili et al., 2025; Han et al., 2025; Zhu et al., 2025).
  • For activities involving practice, explanation, tutoring, feedback, revision, language learning or problem-solving, where the AI has a clear role in the learning design (Tlili et al., 2025; Wu & Yu, 2024; Deng et al., 2025; Chen & Cheung, 2025).
  • When students are required to question, compare, refine or explain AI outputs, rather than simply accept them; this aligns with evidence that benefits are more likely when AI use is purposeful and embedded in structured interventions (Deng et al., 2025; Chen & Cheung, 2025; Laun & Wolff, 2025).
  • Where the aim is to support academic achievement, language learning, affective-motivational outcomes or some higher-order thinking outcomes, noting that effects vary across outcome types and study contexts (Sun & Zhou, 2024; Deng et al., 2025; Han et al., 2025; Chen & Cheung, 2025).

When might I avoid GenAI?

  • When students need to independently demonstrate core knowledge, reasoning, writing, calculation or professional judgement, especially where AI use would make it difficult to determine what the student knows or can do; this is a cautious implication from evidence that AI-assisted performance is not always the same as learning (Deng et al., 2025).
  • Where GenAI could allow students to bypass the cognitive work the task is intended to develop; reviews caution that improved outputs need to be distinguished from genuine learning gains (Deng et al., 2025; Chen & Cheung, 2025).
  • For outcomes such as metacognition or self-efficacy, unless use is carefully scaffolded and evaluated; some reviews found weaker or non-significant effects for these outcomes compared with achievement or performance outcomes (Deng et al., 2025; Chen & Cheung, 2025).
  • As an unstructured add-on with no clear learning purpose, guidance or evaluation, because AI effects vary substantially by discipline, education level, learning mode, duration and implementation conditions (Tlili et al., 2025; Han et al., 2025; Zhu et al., 2025).
  • When interpreting short-term gains, because some chatbot effects may reflect novelty effects, and publication bias may inflate estimates of impact, particularly in shorter interventions (Wu & Yu, 2024; Laun & Wolff, 2025).

What's the underlying theory?

AI in education draws on cognitive load theory, self-regulated learning, feedback theory, personalisation, and activity theory. These perspectives suggest AI can support learning when it reduces unnecessary effort, provides timely feedback, helps students practise, and supports reflection. However, the same tools can weaken learning if they replace cognitive effort or allow students to bypass the thinking needed to develop expertise. When well designed, AI can act as a scaffold or tutor; when poorly designed, it becomes a shortcut.

Where does the evidence come from?

This summary is based on eight recent meta-analyses of AI, chatbots, ChatGPT, and generative AI in education. Tlili et al. (2025) analysed 85 studies of AIEd and found large positive effects, but with important contextual moderators. Wu and Yu (2024) reviewed 24 chatbot studies and found stronger effects in higher education and short-term use. Laun and Wolff (2025) analysed 62 chatbot studies and found that effects were smaller after adjusting for publication bias. Sun and Zhou (2024) reviewed GenAI use with college students and found a medium positive effect on academic achievement. Zhu et al. (2025) found a smaller but significant effect of GenAI on student learning outcomes. Deng et al. (2025), Han et al. (2025), and Chen and Cheung (2025) provide newer evidence that GenAI and ChatGPT can support learning, but that effects vary by outcome, context, implementation, and study quality.

References

Chen, S., & Cheung, A. C. K. (2025). Effect of generative artificial intelligence on university students’ learning outcomes: A systematic review and meta-analysis. Educational Research Review, 49, 100737. https://doi.org/10.1016/j.edurev.2025.100737

Deng, R., Jiang, M., Yu, X., Lu, Y., & Liu, S. (2025). Does ChatGPT enhance student learning? A systematic review and meta-analysis of experimental studies. Computers & Education, 227, 105224. https://doi.org/10.1016/j.compedu.2024.105224

Han, X., Peng, H., & Liu, M. (2025). The impact of GenAI on learning outcomes: A systematic review and meta-analysis of experimental studies. Educational Research Review, 48, 100714. https://doi.org/10.1016/j.edurev.2025.100714

Laun, M., & Wolff, F. (2025). Chatbots in education: Hype or help? A meta-analysis. Learning and Individual Differences, 119, 102646. https://doi.org/10.1016/j.lindif.2025.102646

Sun, L., & Zhou, L. (2024). Does generative artificial intelligence improve the academic achievement of college students? A meta-analysis. Journal of Educational Computing Research, 62(7), 1676–1713. https://doi.org/10.1177/07356331241277937

Tlili, A., Saqer, K., Salha, S., & Huang, R. (2025). Investigating the effect of artificial intelligence in education (AIEd) on learning achievement: A meta-analysis and research synthesis. Information Development, 41(3), 825–842. https://doi.org/10.1177/02666669241304407

Wu, R., & Yu, Z. (2024). Do AI chatbots improve students’ learning outcomes? Evidence from a meta-analysis. British Journal of Educational Technology, 55(1), 10–33. https://doi.org/10.1111/bjet.13334

Zhu, Y., Liu, Q., & Zhao, L. (2025). Exploring the impact of generative artificial intelligence on students’ learning outcomes: A meta-analysis. Education and Information Technologies. https://doi.org/10.1007/s10639-025-13420-z

Additional Resources