Student digital experience tracker case study (2017)
01 August 2017

Context

The University of Westminster has three campuses in central London that are home to the Faculty of Architecture and the Built Environment, the Westminster Business School, the Faculty of Social Sciences, Humanities and Languages (including the School of Law), and the Faculty of Science and Technology. A fourth campus at Harrow is where the Westminster School of Media, Art and Design can be found. The University has around 20,500 students, 79% of whom are undergraduates. The latest Times Higher Education rankings place Westminster among the top 100 most international universities in the world.

Why the tracker was used

In 2016/17 the University began delivering a new 'Learning Futures' curriculum to all undergraduates. The Tracker offered an ideal opportunity to benchmark current digital provision and to 'establish more clearly what it is that students want from the digital university environment', according to Professor Gunter Saunders (Associate Director, Internet and Education Technology Services). Results from current students are being used to schedule improvements in time for the new cohort. There are hopes to follow that cohort's experience from the very start by using the Tracker in July/August 2016 as part of their pre-arrival experience.

Uniquely among our University pilot sites, Westminster invited all current undergraduates (around 17,500) to participate. This was an ambitious proposal, but it has paid off in terms of the robustness and quality of the data collected.

Engaging key staff and learners

The Tracker pilot was driven by Internet and Education Technology Services with support from the DVC Learning and Teaching/Student Experience, the Registrar (responsible for all student services and Information Services), the Students' Union, senior library staff and Registry staff. Academic staff were asked to promote the survey to their students and even to set aside a short time for it in scheduled sessions.

Typically the University would expect no more than 1,000 respondents from an online survey of all undergraduates. For the Tracker it was decided to recruit student helpers who would collect input face-to-face on dedicated tablets, alongside the usual methods for attracting online respondents. This involved regular contact with all undergraduates by email and text alongside a web-based, social media and poster campaign. After the first couple of weeks, other channels of communication were used including the student news portal and course reps network, and a marketing campaign based around 'emerging trends' from the responses to date. In practice most further responses from students came in reaction to the all-student emails and texts, suggesting that persistence pays off (though perhaps at the cost of irritating students who had already engaged, or who had made up their minds not to).

What the tracker found

A total of 3,593 students completed the Tracker out of a population of 17,585. This is the highest number of responses recorded in the pilot, and means that Westminster can analyse its data by faculty as well as across the University with an extremely high level of confidence in the results.
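As a rough illustration of why a sample of this size supports confident analysis, a standard margin-of-error calculation can be applied to the figures above. This is a sketch only: it assumes simple random sampling, which a volunteer survey can only approximate, and uses the conservative worst-case proportion of 50%.

```python
import math

# Margin of error at 95% confidence for the Westminster sample,
# with a finite population correction. Figures from the case study.
N = 17585   # undergraduate population invited
n = 3593    # completed responses
p = 0.5     # worst-case (most conservative) proportion
z = 1.96    # z-score for a 95% confidence level

se = math.sqrt(p * (1 - p) / n)        # standard error under simple random sampling
fpc = math.sqrt((N - n) / (N - 1))     # finite population correction
moe = z * se * fpc                     # margin of error as a proportion

print(f"margin of error: +/-{moe * 100:.1f}%")  # prints +/-1.5%
```

A margin of error of roughly ±1.5% across the whole University is why the data can also be broken down by faculty while keeping usable precision within each subgroup.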

Despite some technical problems when large numbers of users accessed the survey at the same time, the combination of methods (including paper-based and face-to-face in some cases) meant students engaged who would not have done otherwise. Another key factor was persistence in communicating about the survey, keeping messages fresh with up-to-date news about response rates and emerging findings.

The guidance provided by Jisc was pronounced 'very useful', and the team had no problems customising or implementing the survey, or analysing the data. Embedded sector benchmarking was found to be a highly useful feature of the Tracker.

The data has now been analysed both for the University overall and for each faculty. There are plans to add further detail through use of some Blackboard analytics data and student focus groups convened to address specific issues. Key findings include the following:

  1. Higher personal use of tablets has no significant impact on laptop ownership – so students who own one digital device are not less likely to own another.
  2. Higher tablet use seems to have no impact on use of university laptops or desktop computers – so there continues to be a demand for institutional provision even as the use of other devices becomes more commonplace.
  3. Students get the most useful support on digital matters by searching the internet, followed by library staff, teachers and friends.
  4. On the theme of 'computers' (coded free-text responses) students mostly mention laptops and are keen to see more laptops available for loan, for longer periods.
  5. On the theme of 'teaching' students tended to want more interaction and use of technology in their classes.
  6. On the theme of 'training' there was strong support for more opportunities to learn about software, both specifically linked to the course of study and more broadly in digital skills for development and employment.
  7. Students wanted to see lecture recording done routinely for reasons such as use in revision, for catching up when not able to attend, and to help when English is not their first language. They wanted learning resources (as distinct from library resources) provided by their tutors and they wanted access to them before lectures or immediately after. A desire to see more e-readings/e-books/e-journals was also evident.
  8. Whilst there was feedback about improving Blackboard, especially in terms of simplified navigation and communication, there were many comments that highlighted its value and importance: overall 95.8% of undergraduates found Blackboard 'very useful' or 'quite useful'.

There is significant variation between faculties, but further work is needed to understand whether this is related to the different subject areas taught, or represents differences in provision that need to be addressed. Equivalence in the digital learning experience is an issue that was highlighted in a recent internal audit. The Tracker results have added more detail to these findings.

Responses and reflections

The study team have already made recommendations to key players at the University. Some of these are to:

  1. Involve students in developing a coherent, user-facing 'Exploit your Own Device' policy, building on the University's better-than-average wifi coverage
  2. Explore why scores for the development of digital skills on courses are lower for certain Faculties, bearing in mind that the reason for this may be legitimately related to the different subjects on offer
  3. Sponsor a review of policies, procedures and processes for online assessment
  4. Consider expanding the current laptop loan scheme
  5. Share effective practice in the use of live technologies to support engagement in classroom teaching and learning
  6. Consider how best to enhance training opportunities for both subject-specialist and more general ICT skills
  7. Simplify the Blackboard student landing page and educate students on how to manage notifications from Blackboard
  8. Move towards routine recording of lectures and ensure timely provision of online materials via module Blackboard sites

Overall the Tracker pilot was found to have been 'very useful'. Student focus groups are now being convened to discuss and add detail to the findings about the quality of students' digital experience.

The Tracker has engaged people here in a discussion about digital skills development. The data collected provides much needed baseline data that should help the university measure the impact of its new Learning Futures curriculum.

Professor Gunter Saunders, Associate Director, Internet and Education Technology Services

Key lessons learned

  • A high response rate can be achieved using a combination of methods, persistent reminders, and varying the message to students e.g. posting findings as they emerge and asking 'do you agree?' or 'what do you think?'
  • A large sample size is likely to reveal significant variations between subject areas, but more work will be needed to understand whether this is related to differences in the student cohorts, and/or in subject requirements, or whether it represents inequalities in provision that need to be addressed
  • Tracker data can be used as a credible source of evidence in formulating strategies for digital learning and the digital environment