The University of Queensland
Data from nearly 10,000 students provides a solid evidence base
‘Overall, using the Insights survey has been fabulous. There are so many wins to using this survey: it has a good history of use and provides a solid evidence base to support what we are doing at the university.’
Project lead: Dr Sam McKenzie, project manager, Institute for Teaching and Learning Innovation, University of Queensland
Project aims: Informing innovation and checking assumptions
The University of Queensland used the digital experience insights service for the first time in 2017-2018. The student survey ran alongside key development initiatives that were already underway, and the aim was to gather reliable, up-to-date information to support:
- The implementation of a bring your own device (BYOD) policy. There was an expectation that students would have portable wifi-enabled devices to bring to campus and that they would use these both in and out of class to support their learning. The university wanted to test this expectation and better understand the issues around BYOD
- A move from a centrally invigilated, paper-based examination system towards e-assessment opportunities and more authentic assessment instruments
- Refurbishment of learning spaces through a campus-wide strategy that includes student use of their own devices and the development of multi-functional study areas
With the Australian Government potentially planning to align university funding policy with metrics such as student satisfaction, the survey was also seen as an opportunity to assess satisfaction in a key area of provision using an internationally recognised survey instrument.
Strategies for engaging students
The university decided to invite all higher education students to participate in the survey to capture the full gamut of student experiences and reflections. With a student population of approximately 55,000, the university was delighted to receive 9,987 responses – an 18% response rate.
Early discussions with colleagues led to the survey being launched as the 2018 UQ Student Technology Survey. Naming it by year signals the intent to run the survey in future years. Some basic customisations were made, for example, referring to the learning management system by its local name, Learn.UQ (Blackboard), as virtual learning environment (VLE) is not a term the students would recognise.
The survey was also customised to collect data about the study area of each respondent, allowing the university to cross-reference the results with data from the Quality Indicators for Learning and Teaching (QILT) scheme. This national scheme provides prospective students with relevant, transparent and up-to-date information on the study experience and employment outcomes of Australian higher education institutions, and allows them to compare statistics across fields of education, universities and programs.
Two final open-ended questions were added to the survey: ‘Have we missed something?’ and ‘What else would you like to tell us?’
Students received targeted communications via several means:
- A student email was sent to all 55,000 students using the customer relationship management (CRM) system. The emails were individually addressed and sent from the academic registrar. Interestingly, the survey was launched at 12:30am and the first response was received just nine minutes later; 4,316 responses were received on day one. A week later, as the response rate naturally started to dwindle, follow-up emails were sent, resulting in a second spike of 2,369 responses on day eight.
- The project team of seven staff included a social media expert, so platforms such as Facebook, Twitter and Instagram were used to promote the survey, maintain its profile and send out reminders to participate. The Office of Marketing and Communication (OMC), along with team members from the Library and the Institute for Teaching and Learning Innovation, retweeted and reposted the social media postings. Getting the OMC on board was instrumental in achieving a high response rate and meant the project team could leverage further support.
- Digital signage channels, such as announcement screens across the university and at entrance points, were used. While the screens had many different owners, a significant success was securing approval to use 70 centrally-controlled screens broadcasting announcements and information in teaching and learning spaces.
- Communication materials produced by the insights team were also used.
A prize draw incentivised students to complete the survey with the opportunity to win donated coffee vouchers, gift cards and the main prize of an iPad.
Data analysis process
The high response rate surprised the project team and presented challenges for data analysis. The initial expectation was that approximately 600 to 1,000 responses would be received, so nearly 10,000 responses gave the small, grass-roots project team a bigger than expected challenge.
Standard data outputs from the Jisc online survey system gave a clear picture of the findings to share with key stakeholders. For more detailed analysis, the university evaluations unit took on the task of cleaning, partitioning and further analysing the data, producing reports for each subject area and level of study.
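The cleaning-and-partitioning workflow described above can be pictured with a minimal pandas sketch. The column names, rating scale and sample values here are entirely hypothetical, invented for illustration; a real export from the Jisc survey system would look different and would be processed by the evaluations unit's own tooling.

```python
import pandas as pd

# Hypothetical sample of survey responses (invented columns and values;
# not the actual Jisc survey export format).
responses = pd.DataFrame({
    "subject_area": ["Science", "Science", "Arts", "Arts", "Arts"],
    "level": ["UG", "PG", "UG", "UG", "PG"],
    "wifi_rating": [4, 5, 3, None, 4],  # assumed 1-5 satisfaction scale
})

# Cleaning: drop responses with no rating, analogous to removing
# nonsensical answers before analysis.
clean = responses.dropna(subset=["wifi_rating"])

# Partition by subject area and level of study, then summarise:
# one row per report the evaluations unit might produce.
summary = (
    clean.groupby(["subject_area", "level"])["wifi_rating"]
    .agg(["count", "mean"])
    .reset_index()
)
print(summary)
```

The same groupby pattern scales to thousands of rows, which is why handing the full dataset to a dedicated evaluations unit, rather than inspecting it by hand, pays off at this response volume.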
‘The involvement of our evaluations unit in promptly analysing our data is enormously beneficial and has ensured credibility. This seems to be a measure of the importance the data is being afforded.’
Project lead: Dr Sam McKenzie, project manager, Institute for Teaching and Learning Innovation, University of Queensland
Data from previous IT services surveys about wifi provision and library services provides historical knowledge and context. Various surveys conducted in 2008, 2010, 2012 and 2015 showed that a high number of students have laptops but don’t bring them to campus because they don’t feel they have to, they are heavy, or the batteries don’t last and they can’t find a secure power source to charge them safely.
‘The global questions started to tell our students’ story but the follow-up questions that asked about specific things such as wifi brought the story out. The qualitative data are what explained those nuances.’
Project lead: Dr Sam McKenzie, project manager, Institute for Teaching and Learning Innovation, University of Queensland
- Students rated digital provision very highly at 92%, but the survey provided more nuanced data and identified areas that required further investigation such as wifi (satisfaction rate of 84%), how well the university is supporting use of personal devices and the timeliness of access to support.
- 75% of students said that they know they need digital skills in the workplace but only 44% felt prepared. This disparity provides a useful discussion point when engaging others across the university. With potential government plans to align funding policy with metrics such as student satisfaction ratings, retention and employability, this perception requires further investigation.
- The project team were surprised by the wifi satisfaction rate of 84%. It was thought that deadspots and patchy wifi provision had been identified and remedied after an internal survey conducted in 2012. The qualitative comments provided by survey respondents are highlighting wifi issues that the university can address. As in the UK, Australian universities have eduroam, so students should be able to access wifi whichever campus or university they are on.
- The university now knows that 95% of students have laptops which they use for learning, and 98.8% have a smartphone, tablet or laptop. This is a key question not always found in other surveys and the emphasis is important: it moves the focus from mere ownership of personal devices to their use for learning. It also helps the university to address equity issues and has supported the case for a scholarship scheme, funding learners without a laptop to buy their own to use off campus as well as in class.
Acting on the findings
‘A key benefit of the insights survey is that it has eliminated some hurdles for the projects going on at the university. We can now say that we have reliable and current data from 10,000 students that will enable us to focus on improving things across the university.’
Project lead: Dr Sam McKenzie, project manager, Institute for Teaching and Learning Innovation, University of Queensland
Data from the digital experience insights survey is being used to inform other projects and initiatives.
The university evaluations unit have cleaned the data (for example, to remove nonsensical answers) and are producing the first draft of reports for each curriculum area and level of study. These draft reports have been presented to a strategic group whose members are connected with a wide range of the university’s faculties and service departments. The strategic group, in collaboration with the project team, has developed short and mid-term actions and will help guide the dissemination of the results that are relevant to their areas.
The reports will then be formalised and disseminated via each of the 30 teaching and learning committees that support the individual schools at the university. Dissemination activities will include opportunities for the committees to discuss the findings and to see how they compare within the university and against international benchmarking data, although the large sample size from the University of Queensland and different sampling methods make direct comparison with other institutions difficult.
The university’s learning designers’ forum is using new knowledge about learners' course experiences to promote changes in course design and to work with individual academic practitioners on their digital approaches.
The data has also informed the work of a newly formed digital literacy task force, which has developed Digital Essentials, a series of online modules that help students quickly build the digital skills they need to succeed in study and work.
The university is considering using the Jisc discovery tool to encourage students and staff to self-reflect on and develop their digital capabilities.
The OMC would like to publish the findings too and share these through their media channels.
Tips for others and lessons learned
- Getting senior management approval to run the survey was important. The approval of the academic registrar to send an email to all students was instrumental, and the support from the OMC with communication was valuable
- It is relatively easy to implement the survey but harder to manage the data once the survey closes. The higher than expected response rate caused capacity issues for the project team, who recommend considering the data analysis process, and how it fits into work schedules, early in the implementation. If you have data analysis expertise within the university, their help is likely to be invaluable.
Project team and contact details
The project team included:
- Dr Simon Collyer, manager, eLearning Systems and Support, University of Queensland
- Mr Nick Fitt, manager UX services, Library, University of Queensland
- Associate Professor Pedro Isaias, associate professor of higher education innovation, Institute for Teaching and Learning Innovation, University of Queensland
- Dr Sam McKenzie, project manager, Institute for Teaching and Learning Innovation, University of Queensland
- Dr Christine Slade, lecturer of higher education, Institute for Teaching and Learning Innovation, University of Queensland
- Mr Jacob Tilse, online content coordinator, Library, University of Queensland
- Mrs Noela Yates, acting manager information and digital literacy team, Library, University of Queensland
Dr Sam McKenzie (email: firstname.lastname@example.org), project manager, Institute for Teaching and Learning Innovation, University of Queensland.