
Online exam monitoring can invade privacy and erode trust at universities

By Bonnie Stewart, on December 4, 2020


Tools that use eye tracking can flag students who fail to keep their eyes on the webcam or screen, even if the reason is autism or disability rather than cheating. (Pexels photo)

The health risks posed by COVID-19 mean most Canadian university classes are online this year. As a result, some students will write exams online via remote proctoring platforms that surveil their activities.

These tools go by names like ProctorU, Examity, Respondus and Proctorio, among others. Designed by for-profit tech startups, they monitor students’ laptops, tablets or phones during an exam. Proctoring tools can monitor eye movements, capture students’ keystrokes, record their screens and track their searches, as well as their home environments and physical behaviours.

As an education technology scholar, I see institutions turning to online proctoring in the name of academic integrity, to prevent cheating. But the risks of exchanging the four walls of the classroom for surveillance platforms may be higher than many institutions bargained for.

Testing and proctoring methods that invade privacy and erode trust end up undermining the very integrity that institutions demand students uphold.

Consequences of being flagged by the tool

Institutions pay proctoring services to address a core paradox of online learning: the internet puts a world of knowledge at learners’ fingertips, but schools tend to count using that knowledge as cheating.

Some proctoring vendors use algorithms and artificial intelligence (AI) to flag suspicious behaviour.

Some also offer human proctors as an option in combination with AI.

Tools that use eye tracking can flag students who fail to keep their eyes on the webcam or screen, even if the reason is autism or disability rather than cheating.

In the all-seeing eye of the remote proctor, all students become potential cheaters.

Nor is equity a consideration in online proctoring. Some platforms use discriminatory facial recognition technologies that work poorly with darker skin, forcing students to sit for exams with bright lights shining in their faces in order to be recognized by AI.

Others force students to verify their identity via government-issued ID, potentially outing trans or undocumented learners.

Control over students’ environments

Even prior to the pandemic, some higher education voices had raised concerns about proctoring platforms. When COVID-19 forced institutions online, protests and stories began to emerge about the extremes these technologies can impose.

Some students were told to ensure no one else in their home used the internet during an exam, despite the fact that students may live together and take the same classes, or live at home with family members also working and learning online.

In the United Kingdom, some students reported resorting to wearing adult diapers or urinating in bottles in order to avoid having their assessments flagged or terminated.

In October, a new lawyer in New York went into labour in the middle of her bar exam, but did not leave her chair for fear of being disqualified as a potential cheater.

And while robo-proctors’ algorithmic decision-making is at the core of most critiques, even the platforms that use human proctors can create anxiety and harm for students sitting exams.

In August, a Muslim lawyer in the U.K. deferred her high-stakes bar exam until December. She was told she’d have to partially remove her headscarf in order to validate her identity, but the platform refused to guarantee her a female proctor.

Some students in Canada and elsewhere have protested proctoring with petitions.

Data privacy

Platforms that collect biometric data — including students’ unique facial and voice data as well as behavioural data — as a condition of course completion put students at risk of data breaches.

These risks are an extension of higher education’s data ethics gap, and teach students that violation of their data privacy is normal.

In a pandemic, the trade-off of data for access can feel like safety. When it comes to proctoring, though, students have no say in the trade-off they’re subjected to. One University of British Columbia student who tried to have his say about Proctorio found himself publicly addressed by its CEO.

Remote proctoring gives corporate third parties a controlling hand in the academic integrity conversation between students and their institutions.

This is a breach of the duty of care that universities owe students, and an abdication of higher education’s societal role to create opportunity, not harm.

True, the contemporary web demands we click “Yes” to data collection as the price of admission to everything from recipe sites to banking apps. But higher education is where people go to learn to think critically about emergent challenges in society.

If universities demand students ignore data privacy concerns just to take tests, then which societal institution will teach us to value the new commodity that is our data?

Academia doesn’t need proctoring

The truth is, higher education doesn’t need proctoring.

For over 25 years, the field of online learning has been about enabling students to contribute to the abundance of knowledge on the web. The internet can be a way to support learners to connect meaningfully while protecting communities from COVID-19.

But proctoring tools don’t serve those ends. Timed, proctored tests value what students remember. Proctoring tools reinforce an approach to teaching and learning that is all about memorization.

Is memorization really a valid educational reason for risking students’ privacy and well-being, and straining tight university budgets, in a world where students will spend most of their lives with Google in their pockets?

Students might be better served by alternative approaches to assessment that focus on how they synthesize, apply and interpret information at their fingertips.

Rather than spending on remote proctoring, institutional funds could be re-allocated to hire more educators and reduce class sizes. Large classes make it challenging to evaluate learning without resorting to multiple choice testing. Some faculty might also benefit from added invigilation support, or grading support. All of this can be accomplished without automation or invasions of data privacy.

Academic integrity matters. But integrity works both ways.

As 2020 draws to a close, institutions need to be careful not to toss out the metaphorical baby of higher learning in order to hang on to the bathwater of high-stakes testing.

Bonnie Stewart, Assistant Professor, Online Pedagogy & Workplace Learning, Faculty of Education, University of Windsor

This article is republished from The Conversation under a Creative Commons license. Read the original article.
