Lockdown Browsers and Monitoring Software

Roughly one in four college learners takes at least one class online (Allen & Seaman, 2016), and many on-site classes now use learning management systems for assessment. Moving testing to an online format can heighten concerns about academic integrity, much as the earlier “proctored testing center” movement responded to concerns about lecture-hall testing. Software “solutions” are available, but software is only one tool for shaping behavior; methodological adjustments to your assessment instruments and protocols are also worth considering.

What is a lockdown browser? 

A lockdown browser (LDB) is software that encourages academic integrity in online assessment. It “locks down” a learner’s computer so the device can only access a specific website, typically a learning management system. While the LDB is open, other computer and browser features such as copy/paste, multiple tabs, right-click options, and minimizing the browser window are disabled to discourage cheating. Note that using an LDB can increase the need for technical support due to browser versions, security settings, wireless connectivity, and other factors.

What is a monitor? 

A monitor is software that records learners and their environment during online assessments to simulate a proctored exam. An LDB is often paired with monitor software as a dual deterrent against academic dishonesty. Though this software may flag certain activity as “suspicious,” educators must still review the video footage themselves to verify flags and make judgments about academic integrity.

How effective are LDBs & monitors? 

These tools do offer a more secure way to administer objective assessments online (Alessio, Malay, Maurer, Bailer, & Rubin, 2017). Alessio et al. (2017) found that learners taking an online exam with LDB and monitor software scored an average of 17% lower than non-recorded learners and took 30% less time to complete the test. At first glance, these results might seem to argue against monitoring software, but they may instead suggest that unmonitored students take longer and score higher in part because of academically dishonest behaviors, such as looking up answers or asking peers for assistance. Only 17% of students in the recorded group earned a final ‘A’ grade, while 63% received that same grade in the unmonitored sections of the class (Alessio et al., 2017).

However, while it is essential to maintain integrity and rigor, it is equally important not to overlook the learning value inherent in assessment activities. In other words, it is not always a negative thing when students take longer on an exam, use reference materials to research course-related content, or dialogue with peers to gain understanding. Read our Apply section for other ideas to encourage academic integrity.

What else can I do to deter cheating? 

There are countless other ways to assess learners online while preserving academic integrity. Several methods outlined by Casey, Casey, and Griffin (2018) and Paullet, Chawdhry, Douglas, and Pinchot (2016) include:

  • Assigning authentic or applied assessments instead of traditional exams. These could include term papers, research activities, presentations, group projects, portfolios, etc. 
  • Considering a mix of objective and subjective questions to test not only facts and definitions, but also deeper understanding of the material through short answer and essay questions. 
  • Creating a culture of academic integrity by explaining concepts like cheating and plagiarism to learners, discussing their importance to people of high integrity and future professionals, indicating potential consequences associated with such behaviors, and having learners sign a “pledge” to abstain from academic dishonesty as part of the assessment or academic program requirements. 

If you must use more traditional online assessments, consider these tips from Casey et al. (2018) and Paullet et al. (2016):

  • Use a large test bank and shuffle questions and answers so each learner gets a unique exam. A common guideline is a 3-to-1 or 4-to-1 ratio: three to four times as many questions in the bank as appear on any single exam.
  • Rewrite exams every year/semester to ensure answers do not get passed from class to class. 
  • Consider use of a timer for exams, knowing that it is more likely for learners with a strong understanding of the content to be successful in the allotted time (but also understand this can be anxiety-provoking for some learners). 
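The test-bank tip above can be sketched in code. This is a minimal illustration of sampling and shuffling, not any particular LMS’s implementation; the question-bank structure and function name here are hypothetical.

```python
import random

def build_exam(bank, num_questions, seed=None):
    """Draw a unique exam from a larger question bank.

    With a 3-to-1 or 4-to-1 bank-to-exam ratio, two learners are
    unlikely to see the same questions in the same order.
    """
    rng = random.Random(seed)
    # Sample without replacement so no question repeats on one exam;
    # sample order is already randomized.
    questions = rng.sample(bank, num_questions)
    exam = []
    for q in questions:
        answers = q["answers"][:]      # copy so the bank is untouched
        rng.shuffle(answers)           # shuffle answer order per learner
        exam.append({"prompt": q["prompt"], "answers": answers})
    return exam

# A 20-question bank supports a 5-question exam at a 4-to-1 ratio.
bank = [
    {"prompt": f"Question {i}", "answers": ["A", "B", "C", "D"]}
    for i in range(20)
]
exam = build_exam(bank, num_questions=5, seed=42)
print(len(exam))  # 5
```

Seeding per learner (e.g., from a student ID) would make each exam reproducible for later review while still varying across the class.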

What lockdown browser and monitor software is available at CMU? 

CMU’s learning management system, Blackboard, has built-in integration for Respondus Lockdown Browser and Monitor. This program is free for students to download to their personal computers and links directly to Blackboard where assessments can be configured to require the LDB. To set up an assessment with Respondus, simply go to the “Course Tools” menu of a Blackboard shell and select “Respondus LockDown Browser” to set up the parameters for the exam. If Respondus Monitor is also selected, students will be instructed to follow a series of instructor-chosen prompts such as taking a picture of themselves, showing photo ID, and rotating their webcam for an environment check.  


References

Alessio, H. M., Malay, N., Maurer, K., Bailer, A. J., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1), 146-161.

Allen, I. E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. Babson Survey Research Group.

Berkey, D., & Halfond, J. (2015, July 20). Cheating, student authentication and proctoring in online programs. New England Journal of Higher Education.

Casey, K., Casey, M., & Griffin, K. (2018). Academic integrity in the online environment: Teaching strategies and software that encourage ethical behavior. Institute for Global Business Research, Nashville, TN, 58.

King, C. G., Guyette, R. W., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students’ views. The Journal of Educators Online, 6(1), 1-11.

Paullet, K., Chawdhry, A., Douglas, D., & Pinchot, J. (2016). Assessing faculty perceptions and techniques to combat academic dishonesty in online courses. Information Systems Education Journal, 14(4), 45.

Watson, G., & Sottile, J. (2010). Cheating in the digital age: Do students cheat more in online courses? Online Journal of Distance Learning Administration, 13(1).