Educational data mining is widely used to extract valuable information and patterns from academic data. This research explores new features that can help predict the future performance of undergraduate students and identify at-risk students early. It addresses several crucial and intuitive questions not covered by previous studies. Most existing research is conducted on 2-3 years of data under an absolute grading scheme. We examined the effects of 15 years of historical academic data on predictive modeling. Additionally, we explored the performance of undergraduate students under a relative grading scheme and examined the effects of grades in core courses and initial semesters on future performance. As a pilot study, we analyzed the academic performance of Computer Science university students. Several notable findings emerged: the duration and size of the historical data play a significant role in predicting future performance, largely due to changes in curriculum, faculty, society, and evolving trends. Furthermore, predicting grades in advanced courses from initial prerequisite courses is challenging under a relative grading scheme, as students' performance depends not only on their own effort but also on that of their peers. In short, educational data mining can uncover valuable insights from academic data to predict future performance and identify the critical areas that need significant improvement.
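To make the prediction task concrete, a minimal sketch follows. It is not the study's actual pipeline: the dataset file, the course column names, and the choice of a random-forest regressor are illustrative assumptions for predicting an advanced-course grade from grades in initial prerequisite courses.

```python
# Minimal sketch (not the authors' pipeline): predict an advanced-course
# grade from grades in initial prerequisite courses. The CSV file and all
# column names below are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("student_records.csv")  # hypothetical historical records
features = ["intro_programming", "data_structures", "discrete_math"]
X, y = df[features], df["advanced_algorithms"]  # grade points, e.g. 0-4 scale

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Held-out error gives a rough sense of how predictable the advanced grade is
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```

Under a relative grading scheme, the target grade also reflects the cohort's performance, which is one reason such a model may generalize poorly across years.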
This paper presents a systematic literature review of the coordinated use of Learning Analytics and Computational Ontologies to support educators in evaluating students' academic performance. The aim is to give researchers a general overview of the current state of the relationship between Learning Analytics and Ontologies and of how the two have been applied in a coordinated way. We selected 31 of 1230 candidate studies related to the research questions. The retrieved studies were analyzed from two perspectives: first, we analyzed approaches in which researchers used Learning Analytics and Ontologies in a coordinated way to describe some Taxonomy of Educational Objectives; second, we sought to identify which models or methods have been used as analytical tools for educational data. The results of this review suggest that: 1) few studies consider that student interactions in the Learning Management System can represent students' learning experiences; 2) most studies use ontologies in the context of learning object assessment to enable learning sequencing; 3) we did not identify methods of evaluating academic performance guided by Taxonomies of Educational Objectives; and 4) no studies were identified that report the coordinated use of Learning Analytics and Computational Ontologies in the context of academic performance monitoring. We therefore identify future research directions, such as a new model for evaluating academic performance.
This paper considers the use of log data provided by learning management systems to study whether students follow the problem-based learning (PBL) method. Log analysis turns out to be a valuable tool for measuring the use of the learning material of interest: it gives reliable figures not only on the number of use sessions but also on the interlocking of various course activities. The longitudinal study based on log analysis makes use of a new software tool, SPY US. Our study concentrates on using log data analysis to improve the PBL method used in learning diagnostic skills with the help of Virtual Patients.
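As an illustration of this kind of log analysis (not the SPY US tool itself), the sketch below counts use sessions from LMS event logs, starting a new session whenever a student's activity pauses for more than 30 minutes. The input file, column names, and gap threshold are all assumptions.

```python
# Illustrative sketch only: derive per-student session counts from raw
# LMS event logs. File, columns, and the 30-minute gap are hypothetical.
import pandas as pd

logs = pd.read_csv("lms_events.csv", parse_dates=["timestamp"])
logs = logs.sort_values(["student_id", "timestamp"])

# Time since each student's previous event; NaT marks their first event
gap = logs.groupby("student_id")["timestamp"].diff()

# A session starts at a student's first event or after a long pause
logs["new_session"] = gap.isna() | (gap > pd.Timedelta(minutes=30))

sessions_per_student = logs.groupby("student_id")["new_session"].sum()
print(sessions_per_student.describe())
```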
Student evaluations measuring the teaching effectiveness of instructors have been widely used in higher education for many years. This study investigates the factors associated with the assessment of instructors' teaching performance using two data mining techniques: stepwise regression and decision trees. The data were collected anonymously from students' evaluations in the Management Information Systems department at Bogazici University; variables describing other instructor and course characteristics were also included in the study. The results show that a factor summarizing the instructor-related questions in the evaluation form, the employment status of the instructor, the workload of the course, the attendance of the students, and the percentage of students filling in the form are significant dimensions of instructors' teaching performance.
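The decision-tree side of such an analysis can be sketched briefly. The dataset and column names below are hypothetical stand-ins for the factors the study reports, not the Bogazici University data, and the depth limit is an illustrative choice.

```python
# Sketch of a decision-tree analysis of evaluation scores; the CSV and
# all column names are hypothetical. Categorical factors such as
# employment status are assumed to be pre-encoded as 0/1.
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

df = pd.read_csv("course_evaluations.csv")
features = ["instructor_factor_score", "is_full_time",
            "course_workload", "attendance_rate", "response_rate"]

tree = DecisionTreeRegressor(max_depth=3, random_state=0)
tree.fit(df[features], df["teaching_performance_score"])

# The printed rules expose which factors drive the splits, the kind of
# interpretable output that motivates using trees alongside regression
print(export_text(tree, feature_names=features))
```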
A high-quality review of the distance learning literature from 1992 to 1999 concluded that most research on distance learning had serious methodological flaws. This paper presents the results of a small-scale replication of that review. A sample of 66 articles was drawn from three leading distance education journals. Those articles were categorized by study type, and the experimental or quasi-experimental articles were analyzed in terms of their research methodologies. The results indicated that the sample of post-1999 articles exhibited the same methodological flaws as the pre-1999 sample: most participants were not randomly selected, extraneous variables and reactive effects were not controlled for, and the validity and reliability of measures were not reported.