Digital assessments on campus at scale – insights from the survey
The issues discussed in our previous blog post on Digital assessment face-to-face were used to create a survey that was sent to a small list of institutions. The response rate was low (12 responses in total), but all of the institutions that responded were undertaking some level of face-to-face digital exams.
The survey was undertaken by Marcus Elliot, who consulted several assessment experts and the University of Warwick on the questions. Lawrie Phipps and Sarah Dunne have contributed to the insights from the survey.
Below are some insights from the survey and characteristics of the different approaches being used by institutions.
Some insights from the survey
Delivery and tech
Most institutions are still using a mix of in-person digital, in-person analogue, and remote digital exams. The majority are doing some digital exams face-to-face, but the scale varies from hundreds of exams to over 20,000. For face-to-face digital exams, the spaces being used are a combination of the following:
- The same spaces as used for handwritten exams. However, the infrastructure for digital exams needs to be in place (see below). The usual layout of desks in rows won't work, as everyone can see the screens of the people in front of them, and traditional exam desks may not be suitable for laptop use.
- Existing computer labs. Many institutions successfully use computer labs for assessments. However, this removes much-needed computer access for students at a potentially busy period when many coursework assignments may also be due, and capacity is limited by maximum room sizes.
- An external venue that is hired for the exam period.
Institutions have also developed policies on bring-your-own-device (BYOD) for exams, with minimum technical specifications set.
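To make the idea of a minimum specification concrete, below is a minimal sketch of an automated pre-exam device check. The thresholds, the supported-OS list, and the use of the `psutil` package are illustrative assumptions, not any institution's actual policy.

```python
"""Minimal sketch of a BYOD pre-exam device check.

The thresholds below are illustrative assumptions, not any
institution's published minimum specification.
Requires the third-party `psutil` package (pip install psutil).
"""
import platform
import psutil

MIN_RAM_GB = 4          # assumed minimum memory
MIN_FREE_DISK_GB = 2    # assumed free space for the exam client
MIN_BATTERY_PCT = 90    # students are asked to arrive fully charged
SUPPORTED_OS = {"Windows", "Darwin"}  # some institutions exclude Chromebooks

def check_device() -> list[str]:
    """Return a list of human-readable failures; empty means the device passes."""
    failures = []
    if platform.system() not in SUPPORTED_OS:
        failures.append(f"Unsupported operating system: {platform.system()}")
    ram_gb = psutil.virtual_memory().total / 1024**3
    if ram_gb < MIN_RAM_GB:
        failures.append(f"Only {ram_gb:.1f} GB RAM (minimum {MIN_RAM_GB} GB)")
    free_gb = psutil.disk_usage("/").free / 1024**3
    if free_gb < MIN_FREE_DISK_GB:
        failures.append(f"Only {free_gb:.1f} GB free disk (minimum {MIN_FREE_DISK_GB} GB)")
    battery = psutil.sensors_battery()  # None on desktops/VMs
    if battery and not battery.power_plugged and battery.percent < MIN_BATTERY_PCT:
        failures.append(f"Battery at {battery.percent:.0f}% and not plugged in")
    return failures

if __name__ == "__main__":
    problems = check_device()
    print("Device OK for exam." if not problems else "\n".join(problems))
```

A check like this could be run as part of a practice assessment so that unsuitable devices are flagged well before exam day.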
Several institutions are using third-party digital assessment solutions like Uniwise, ExamSoft, Inspera, and ProctorTrack, while others rely on university-managed IT labs or software.
Invigilators, training, and support
Invigilators are mostly internal staff, supplemented at most institutions by external invigilators, and are trained to handle both technical and exam-related issues.
Training for staff and students is required, e.g. familiarisation activities, practice assessments, and guidance on invigilation. Technical support during exams is critical, with IT staff on call or present at exam venues.
Infrastructure
The most common exam venues are repurposed spaces such as IT labs, seminar rooms, or gyms. Institutions manage the provision of power supply, network connectivity, and so on, except where a third-party venue is used. Wi-Fi capacity needs testing in advance. Power supply for BYOD can be a challenge, so students are asked to bring a fully charged device and a power pack; institutions also provide spare devices and power packs. A different room configuration is needed, preferably with better seating and larger desks. Other challenges include managing multiple venues, providing technical support across locations, and ensuring consistent exam experiences.
Several institutions using BYOD also said they had a bank of university-managed devices, either as spares in case a student's own device stopped working or for students who needed them (for accessibility reasons or because they did not own a suitable device).
Security / Integrity
Lockdown browsers, proctoring services, and password-protected exam access are all in use. Proctoring services seem to be used mostly for remote digital exams; the decision not to use proctoring is often made on the grounds of institutional values or resource constraints rather than technical issues. Several institutions raised concerns about the security of online assessments, in particular that someone might access the exam papers before the exam.
Scheduling
Most institutions run in-person exams in more than one venue, often concurrently, and frequently have exams of different lengths in the same venue to manage. It would be possible to schedule mixed exams in the same venue or across venues, but none reported trying that approach (a toy sketch of the packing problem involved follows below).
Students accessing information at the same time can lead to bottlenecks at exam venues (though this is not unique to digital exams). Digital exams can also require additional time for students to find the right venue and seat, and to get set up.
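As a toy illustration of that packing problem, the sketch below greedily assigns exams (largest cohort first) to venue sessions with fixed seat capacities. All exam and venue data here are invented for illustration; a real timetabling system must also handle student clashes, exam lengths, access arrangements, and staffing.

```python
"""Toy sketch: first-fit-decreasing assignment of exams to venue sessions.

All exam and venue data are illustrative assumptions; real timetabling
must also handle clashes, access arrangements and staffing.
"""
from dataclasses import dataclass, field

@dataclass
class Session:
    venue: str
    capacity: int
    exams: list[str] = field(default_factory=list)
    seats_left: int = 0

    def __post_init__(self):
        self.seats_left = self.capacity

def schedule(exams: list[tuple[str, int]], sessions: list[Session]) -> list[str]:
    """Place the largest exams first; return any that could not be seated."""
    unplaced = []
    for name, cohort in sorted(exams, key=lambda e: -e[1]):
        home = next((s for s in sessions if s.seats_left >= cohort), None)
        if home is None:
            unplaced.append(name)
            continue
        home.exams.append(name)
        home.seats_left -= cohort
    return unplaced

if __name__ == "__main__":
    sessions = [Session("Sports hall AM", 400), Session("IT lab 1 AM", 120),
                Session("IT lab 2 PM", 120)]
    exams = [("Law (3h digital)", 350), ("Maths (2h paper)", 100),
             ("Nursing (1h digital)", 90), ("History (2h digital)", 150)]
    for leftover in schedule(exams, sessions):
        print("Could not seat:", leftover)
    for s in sessions:
        print(f"{s.venue}: {', '.join(s.exams) or '(empty)'} "
              f"({s.capacity - s.seats_left}/{s.capacity} seats)")
```

In this invented data set the 150-seat exam cannot be placed once the largest venue is mostly taken, which is exactly the kind of capacity constraint respondents described.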
Student Experience
Institutions are concerned about the student experience, e.g. providing earplugs for noise, ensuring comfortable seating arrangements, and addressing heat and noise disruptions. Student feedback is actively sought and used to inform improvements.
Going forward…
Questions are raised about the suitability of digital exams for all types of assessments and the need for reviews in light of evolving technologies like AI.
There is a push towards more efficient services, better resource allocation, and consideration of the post-exam process, including marking and feedback.
Characteristics by type of implementation
This analysis was limited by the structure of the survey, and the intention was that follow-up interviews would explore the challenges in more detail. However, the survey did yield several insights specific to BYOD and third-party exams, as well as a number of common insights.
General
There were a number of insights that applied to most digital exams.
- Allow extra time for everyone to get logged into the exam at the start.
- Technicians need to be available to support students, alongside an assessment product specialist and the e-Learning/TEL team.
- Provide a Teams help channel (or similar) for invigilators that can be monitored by IT support and the TEL team so they can respond to issues.
- Spreading assessment across smaller rooms increases demand on invigilators and support.
- Wi-Fi audits before assessments to check the range and strength of signal; test and test again (see the sketch after this list).
- Students encouraged to bring fully charged laptops and spare power packs.
- Ensure no planned work on the network or servers during exams.
- Check server capabilities and security of the delivery platform/software.
- Students allowed to wear ear plugs (or noise-cancelling headsets) to reduce noise and disruptions.
- When there is a problem, consider taking students out of the room to fix it, so as not to disrupt others. Have a plan to mitigate lost time or technical failure for individual students.
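To make the Wi-Fi audit point concrete, here is a minimal sketch that logs link quality over time from a seat in the venue. It assumes a Linux laptop (it reads /proc/net/wireless, which is Linux-specific) and is a rough survey aid only, not a substitute for a professional site survey.

```python
"""Minimal Wi-Fi audit sketch: sample link quality from a venue seat.

Assumes a Linux laptop: /proc/net/wireless is Linux-specific.
Walk the venue, run this at different seats, and compare the logs.
"""
import time
from datetime import datetime

def read_link_quality() -> dict[str, float]:
    """Parse /proc/net/wireless and return link quality per wireless interface."""
    readings = {}
    with open("/proc/net/wireless") as f:
        for line in f.readlines()[2:]:  # first two lines are headers
            fields = line.split()
            iface = fields[0].rstrip(":")
            readings[iface] = float(fields[2].rstrip("."))  # link quality column
    return readings

def audit(samples: int = 30, interval_s: float = 2.0) -> None:
    """Log link quality at fixed intervals so dead spots show up in the record."""
    for _ in range(samples):
        stamp = datetime.now().strftime("%H:%M:%S")
        for iface, quality in read_link_quality().items():
            print(f"{stamp} {iface}: link quality {quality:.0f}")
        time.sleep(interval_s)

if __name__ == "__main__":
    audit()
```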
BYOD specific
The following were specific to the use of BYOD for exams.
- There were no specific institutional characteristics associated with use of BYOD. Seven institutions reported using some level of BYOD for assessment.
- The process for exams is very similar to paper-based exams.
- Student seating layouts and spacing may differ from paper-based exams.
- There is a need to set a minimum specification for laptops; some institutions do not support Chromebooks. The laptop specification is driven by the software being used, so it needs to be tested on multiple devices.
- Laptops with operating systems set to a language other than English were a challenge for IT support.
- Many institutions used lockdown browsers and could access user logs if they suspected any cheating.
- Where it could not be enforced through the software, there was a rule that no additional browser tabs could be opened.
- One institution recruited special invigilators for BYOD exams.
- Spare laptops and power packs were made available to students.
- If exams required audio, spare headsets were also needed.
Outsourced specific comments
Many of the comments about BYOD apply to outsourced digital exams as well. However, the following points were made specifically in relation to outsourced digital exams.
- Student layouts and spacing differ from paper-based exams.
- Responsibilities were split: the third party supported the hardware, institutions used their own IT support for login and general technical support, and the venue was responsible for the Wi-Fi (as they provided it).
- Unlike BYOD, where Microsoft Windows is easier to support, for outsourced exams Chromebooks can be easier to support and lock down than Windows.
- Outsourcing can be an expensive option.
Follow-up questions
The following questions need to be explored further and we’d welcome any suggestions or answers.
- Examples of different room layouts being used for digital exams.
- What desks and chairs are suitable.
- How best to provide power and Wi-Fi.
- When is it best to use an existing, repurposed, or third-party space, and how to decide.
- Which exams need to be digital, or simply work better digitally? One respondent noted, “Our in-person BYOD are for qualitative course types only” – how are quantitative digital exams being implemented?
Conclusion
At the start of this research, our ambition was to discover useful insights to help institutions implement face-to-face digital assessment and deliver it at scale. Several challenges still need to be addressed.
The research has also highlighted some areas where Jisc innovation work could be useful.
- Digital tools to support assessment and feedback: the 2023 landscape – a report due in 2024.
- Extending eduroam connectivity could provide Wi-Fi access where needed during exam periods.
- Providing secure access to cloud-based services within the security of the Jisc network. As the number of high-stakes exams increases, they will become more attractive targets for cyber attacks.
We would welcome your thoughts or comments; please contact us at innovation@jisc.ac.uk.