University of Bradford
Gathering meaningful and powerful data to inform developing practice
The insights service has been the first opportunity to have any sort of measure of the student digital experience. To have some concrete data to support investment that is relevant to our institution is really powerful.
Project lead: Carol Higgison, senior lecturer in technology enhanced learning, Centre for Educational Development, University of Bradford
Project aims: establishing baseline data and measuring impact
The University of Bradford took part in both the 2016-17 and the 2017-18 pilots of the digital experience insights service. On both occasions, several initiatives designed to improve the digital environment and infrastructure were in progress. These included:
- Enhancing wireless access by investing in high-capacity, campus-wide wifi to provide an enabling infrastructure (2016-17)
- Increasing the amount of online assessment taking place (2016-17)
- A focus on improving the blended learning experience for learners (2016-17)
- A move to 100 per cent online submission of coursework accompanied by online marking and feedback (2017-18)
- Plans to introduce a new virtual learning environment (VLE) in September 2018
The insights service has allowed the university to gather baseline data which is being used to measure impact and inform implementation, practice and future investment.
685 students (ten per cent of the student body), both undergraduates and postgraduates, completed the survey.
The university also participated in the pilot of the staff digital experience survey in 2017-18, securing 57 responses (29 per cent response rate).
Strategies for engaging students
The university was proactive in engaging students using a variety of strategies and raised the profile of the survey further by attending events and having a presence in student spaces. The student union academic affairs officer for 2017-18 actively supported the project and the disabilities office used their communication channels to send out alerts to registered students.
The first challenge in engaging students was that of finding a suitable slot to run the survey. In 2016-17 the student survey was run before Christmas but the holiday period and pressures of the new term after Christmas did not yield the hoped-for response rate. For 2017-18, the survey was opened after Christmas and ran for a shorter period of three weeks, timed to finish before the university’s own student satisfaction survey began.
Individual email invitations to participate in the survey were sent to all students who were not participating in the National Student Survey (NSS). These were sent via the online survey portal using the unique identifier option. Weekly reminders to encourage completion were sent out, together with a final reminder that this was their last opportunity to contribute. The survey was also promoted via the university’s internal home page, on the VLE, on plasma screens and student information systems.
Project lead Carol Higgison also identified and targeted specific events, and had a presence in student areas such as the coffee bar, student central, the atrium, the restaurant and outside the student union office.
A prize of a tablet was offered to incentivise students to participate.
Data analysis
The customised questions were particularly helpful in securing feedback on specific initiatives. For example, 81 per cent of students like to submit assignments electronically but only 51 per cent enjoy receiving feedback this way. This is a significant dissonance and an important consideration for curriculum design.
The university really appreciated the templates provided by the insights team and found that the ideas on how to present the information visually, along with guidance on what others were comparing, provided a useful framework. The templates were helpful in disseminating the findings and enabled the project lead to get the information out to colleagues across the university quickly.
The data from the insights service will be reviewed alongside other data sets: Bradford’s own internal version of the NSS, which surveys all students not eligible for the NSS (ie foundation and years 1 and 2); the Advance taught postgraduate survey; and other internal surveys. The findings from the staff insights survey are also being reviewed alongside this data to see to what extent, if at all, staff perceptions and skills mirror the student experience.
Carol is also project lead for the university’s participation in the Jisc discovery tool pilot, a tool designed to encourage self-reflection and development of digital capabilities. She is interested in whether the top-level data from the discovery tool pilot corresponds in any way with the staff insights survey data.
Key findings
The University of Bradford is a comparatively small university with a very diverse student population: over 50 per cent of students are from black, Asian or minority ethnic (BAME) backgrounds and over 50 per cent are drawn from the Bradford metropolitan region. There are many students from low socio-economic groups and some culturally specific aspects to take into consideration. Despite the distinctive nature of the university, it was reassuring to note that the data were not dissimilar to the national benchmarking data.
- One area where the University of Bradford’s data were different from the national data is that a high number of their students access their learning via mobile devices
- Student perceptions of the quality of digital provision at the University of Bradford are high at 84 per cent. This indicates that students appreciate the infrastructure.
- 70 per cent of students at the University of Bradford feel that using digital technologies enables them to understand things better, 71 per cent say that digital technologies mean they enjoy learning more and 77 per cent feel that using digital technologies makes them more independent
- Electronic management of assessment (EMA) may seem a procedural matter, but it has a big impact on practicalities for students. It is cheaper than printing, and although many students are nominally full time, family culture may mean they are brought to and collected from university for scheduled sessions only. Many have family commitments or part-time jobs, so the flexibility of being able to submit electronically is truly valuable
Acting on the findings
It is really powerful and valuable to see data on areas of developing practice, like EMA and plans to make the VLE a more active learning space. The findings will feed into support for staff as they develop their practices, strategies and tactics to increase interaction.
Project lead: Carol Higgison, senior lecturer in technology enhanced learning, Centre for Educational Development, University of Bradford
Initial findings have been presented at the university’s internal learning and teaching conference. The university’s learning and teaching strategies include a focus and targets for work placement activities as part of the curriculum. The insights survey data gives the university the ‘student view’ and their perception of the value of these activities. This data is useful to the centre for improving teaching and learning and feeds into curriculum design sessions; it helps to put more flesh on abstract principles and underpins and informs practice. The data is meaningful to academic staff.
The high level of mobile use is something the university is taking on board when considering learning and teaching. While the university knew it had a high number of unique users accessing university systems via mobile devices, the insights survey has provided more information and helped to put this in context. The majority of programmes don’t yet design for high mobile use, so this is something to work on in terms of curriculum development and staff training. The fact that this information has surfaced while the university is implementing a new VLE is very timely.
Tips for others and lessons learned
- It is valuable to have support from the insights service team, and the extensive guidance was ‘spot on’, very accessible and easy to follow. You won’t necessarily need it all – pick and choose the guidance that you feel will work best in your context
- Ensure you have sufficient resource – while there is extensive support, it is still quite a major undertaking, possibly a bigger commitment than you may anticipate, but strategies such as forming a stakeholder group can help manage this
- A larger target group means a lower response rate is needed to achieve your targets and provide statistically meaningful data
Next steps
Getting baseline data for key initiatives such as EMA, and for student engagement with the new VLE after 17 years on the previous platform, is a significant achievement. The data, and perhaps a repeat of the student insights survey in about 18 months’ time, will enable the university to monitor progress and inform ongoing strategies.
The university is currently analysing the data from the staff insights survey and has identified some interesting differences in perception between the student and staff data sets. Staff were generally a little more conservative in their responses than students on aspects such as the digital infrastructure, but this may be because they engage with a larger number of administrative systems. The Centre for Educational Development will be exploring these differences in perception and analysing the data further.
In times of financial constraint, and given the levels of investment that infrastructure and technology-based learning services require, it is useful to build an evidence base to assess the impact and benefits and to shape your educational development offering and guidance for staff. The data will contribute towards the achievement of our strategic aims.
Project lead: Carol Higgison, senior lecturer in technology enhanced learning, Centre for Educational Development, University of Bradford