Introduction
Learning analytics have been applied to study and visualize the relationship between student activity and performance in online university-level courses during the last decade (Dyckhoff, Zielke, Bültmann, Chatti & Schroeder, 2012; Ferguson, 2012; Gunn, 2014; Nieto Acevedo & Montenegro Marín, 2015; Retalis, Papasalouros, Psaromiligkos, Siscos & Kargidis, 2006; Scanlon, McAndrew & O'Shea, 2015). The authors of 11 relevant studies published in peer-reviewed scholarly journals all reported some benefits, but they also cited many problems when trying to assess student learning through combinations of learning analytics, learning management system (LMS) activity data logs, and graded performance results (Agudo-Peregrina, Iglesias-Pradas, Conde-González & Hernández-García, 2014; Fidalgo-Blanco, Sein-Echaluce, García-Peñalvo & Conde, 2015; Gómez-Aguilar, Hernández-García, García-Peñalvo & Therón, 2015; Iglesias-Pradas, Ruiz-de-Azcárate & Agudo-Peregrina, 2015; Nieto Acevedo & Montenegro Marín, 2015; Reyes, 2015; Ruipérez-Valiente, Muñoz-Merino, Leony & Delgado Kloos, 2015; Scheffel, Drachsler, Stoyanov & Specht, 2014; Xing, Guo, Petakovic & Goggins, 2015; Yahya, Messoussi & Touahni, 2015).
Each of the scholarly manuscripts cited above made a unique contribution to the literature beyond visualizing the inferential deduction that previous GPA predicts future GPA. In addition to explaining how learning analytics may be used to measure and visualize student learning activity in online courses, all of those researchers raised the concern that deep learning may not be reliably predicted by learning analytics. In fact, only two groups of researchers (Agudo-Peregrina et al., 2014; Gómez-Aguilar et al., 2015) found any statistically significant correlations between student online activity reported in learning analytics and academic performance, while other researchers found no statistically significant relationship between learning analytics, LMS activity data and student learning outcomes (Iglesias-Pradas et al., 2015). Therefore, more studies are needed to test whether learning analytics data can relate to or predict student performance.
It is difficult to generalize the positive, negative or null findings of the above studies to any business school population because of differences in research design, unit of analysis, LMS context and the subject matter disciplines from which the samples were drawn. For example, only a few of those studies tested for and used objective measures of student learning performance as the dependent variable. Nonetheless, all of those researchers called for further studies to explore how learning analytics and LMS data could be utilized to assess student performance.
There are numerous empirical studies published in scholarly peer-reviewed journals beyond the scope of learning analytics in which researchers have found statistically significant links between student performance in online courses and activity-related factors (Blumenthal et al., 2014; Chang, Wu, Kuo & You, 2012; Farrington, 2014; Gibson & Dunning, 2012; Hu, Lo & Shih, 2014; Kaufman & Schunn, 2011; Lu & Law, 2012; Lu & Zhang, 2012; Mirriahi & Alonzo, 2015; Pombo & Talaia, 2012; Russell, 2015; Shih, 2011; Strang, 2010, 2011, 2013a; Thomas, Reyes & Blumling, 2014; Wichadee, 2014; Zacharis, 2015). Thus, other domains may offer relevant concepts showing how student activity in online courses relates to estimating performance, and how that knowledge could be applied to improve teaching.