Introduction
Artificial intelligence (AI) in education has enabled the development of e-learning systems that model students’ knowledge and experience in order to provide personalized support (Nwana, 1990; Self, 1974; Wenger, 2014). AI-supported e-learning refers to the use of AI techniques (e.g., fuzzy logic, decision trees, Bayesian networks, neural networks, genetic algorithms, and hidden Markov models) in e-learning, that is, learning or training delivered through computer and network technologies (Colchester et al., 2017). A recent meta-review by Kaudi et al. (2021) reported that adaptive learning systems were the most frequently identified kind of AI-supported e-learning system, followed by intelligent tutoring systems, with recommendation systems being the least reported. Some recommender systems for personalized learning have adopted collaborative filtering (Chen & Cui, 2020; Wind et al., 2018), content-based filtering (Kandakatla & Bandi, n.d.; Lops et al., 2011), and knowledge-based filtering (Haddad & Naser, 2017; Samin & Azim, 2019). Although these methods are commonly used in recommendation systems, it is difficult to describe them as AI-supported learning systems, since they do not employ techniques such as Bayesian networks or neural networks. Therefore, AI-supported recommendation systems in the educational field have not been well studied.
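To make the distinction above concrete, the following is a minimal sketch of user-based collaborative filtering, one of the methods cited in the paragraph: items unseen by a target learner are scored by the similarity-weighted choices of other learners. The learner names, resources, and ratings are entirely hypothetical, and real systems use far larger matrices and more robust similarity measures.

```python
from math import sqrt

# Hypothetical interaction matrix: 1 = the learner used the resource, 0 = not.
ratings = {
    "alice": {"quiz_algebra": 1, "video_fractions": 1, "quiz_geometry": 0},
    "bob":   {"quiz_algebra": 1, "video_fractions": 1, "quiz_geometry": 1},
    "carol": {"quiz_algebra": 0, "video_fractions": 1, "quiz_geometry": 1},
}

def cosine(u, v):
    """Cosine similarity between two learners' rating vectors."""
    dot = sum(u[i] * v[i] for i in set(u) & set(v))
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(target):
    """Rank the target's unseen items by similarity-weighted peer ratings."""
    scores = {}
    for other, r in ratings.items():
        if other == target:
            continue
        sim = cosine(ratings[target], r)
        for item, val in r.items():
            if ratings[target].get(item, 0) == 0 and val:
                scores[item] = scores.get(item, 0.0) + sim * val
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # → ['quiz_geometry']
```

Note that nothing in this pipeline is a learned model in the Bayesian-network or neural-network sense, which is precisely why such systems are hard to classify as AI-supported.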
On the other hand, when we shift our focus to daily life, it is clear that recommender systems are everywhere and that AI is used in them as well. For example, Amazon recommends products with collaborative filtering (Smith & Linden, 2017), and Netflix recommends movies using deep learning (Amatriain & Basilico, 2015). In e-commerce recommendation research, explainable recommendations, which explain why an item is recommended, have received much attention for improving transparency, persuasiveness, and trustworthiness (Zhang & Chen, 2020). Based on these studies, it can be supposed that explanations from a learning system could provide additional benefits for students in education as well. Previous research on intelligent tutoring systems has shown that prompting and feedback mechanisms can improve student motivation in system-based self-regulated learning, leading to higher achievement (Duffy & Azevedo, 2015). Furthermore, eXplainable AI (XAI) has begun to attract attention in the field of education owing to emerging concerns about Fairness, Accountability, Transparency, and Ethics (Khosravi et al., 2022). Explanations that interpret the decision-making process of AI are particularly important for teachers, who must be accountable to students, parents, and governments. Teachers need to know why the AI gave particular feedback, and interpreting that reasoning may also help them improve their teaching skills.
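The explainable-recommendation idea discussed above can be sketched as pairing each suggestion with a template explanation grounded in the evidence behind it, in the spirit described by Zhang and Chen (2020). Everything below, including the peer log and the wording of the explanation, is a hypothetical illustration rather than any system's actual output.

```python
# Hypothetical activity log: which peer learners completed which resources.
peer_history = {
    "bob":   ["quiz_geometry", "video_fractions"],
    "carol": ["quiz_geometry"],
}

def explain(item, target="alice"):
    """Build a 'why this was recommended' message from peer behaviour."""
    supporters = sorted(u for u, items in peer_history.items() if item in items)
    return (f"'{item}' is recommended to {target} because "
            f"{len(supporters)} learner(s) with similar activity "
            f"({', '.join(supporters)}) completed it.")

print(explain("quiz_geometry"))
```

An explanation of this form exposes the evidence behind a recommendation, which is the kind of interpretability a teacher would need in order to be accountable for system feedback.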