Do Performance Trends Suggest Wide-spread Collaborative Cheating on Asynchronous Exams?
Surprisingly, the authors found no evidence of collaborative cheating on these asynchronous exams. A closer look at the study suggests that the main reason for this is that the assessments followed best practices, including proctoring and question randomization: each exam drew from a fixed question pool, with randomized parameter values in each question and a randomized question order.
"These exams used a fixed pool of questions for all students, with each student getting different parameterized versions of the same questions, with the question order randomized."
Research Summary
Based on a study of 29,492 asynchronous exams at the University of Illinois, researchers examined students' performance across a multi-day testing period to determine whether students were cheating. They used two methods: (1) tracking standardized scores from different courses across the exam period, and (2) comparing students to their own previous performance on in-person written exams. Performance did not improve significantly over the multi-day exam period, indicating that students were not engaging in widespread collaborative cheating. The likely reason for this finding is that the assessments were administered using best practices (e.g., question randomization). These results held when controlling for student ability.
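The first analysis method can be sketched as follows: standardize (z-score) exam results within a course, then average by exam day to see whether later test-takers score systematically higher. The day labels and raw scores below are made-up assumptions, and the actual study additionally controlled for student ability and compared against prior in-person exam performance.

```python
from statistics import mean, pstdev

# Illustrative, synthetic scores grouped by the day of the multi-day exam
# period on which each student tested (not data from the study).
scores_by_day = {
    1: [72, 80, 65, 90, 77],
    2: [70, 83, 68, 88, 75],
    3: [74, 79, 66, 91, 73],
}

# Standardize against the whole course, then average by day.
all_scores = [s for day in scores_by_day.values() for s in day]
mu, sigma = mean(all_scores), pstdev(all_scores)

for day, scores in sorted(scores_by_day.items()):
    day_mean_z = mean((s - mu) / sigma for s in scores)
    print(f"day {day}: mean z-score = {day_mean_z:+.3f}")

# A flat, near-zero trend across days is consistent with the paper's finding
# of no evidence for wide-spread collaborative cheating.
```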
Key Takeaways
1. Online Assessment Best Practices Can Eliminate Collaborative Cheating
Best practices developed for online assessments, such as question pools, parameterized question variants, and randomized question order, should also be applied to proctored, face-to-face asynchronous exams.
Read the full article:
Chen, B., West, M., & Zilles, C. (2017). Do performance trends suggest wide-spread collaborative cheating on asynchronous exams?. In Proceedings of the Fourth (2017) ACM Conference on Learning@ Scale (pp. 111-120).