The SHEILA project team ran a workshop on “Developing an evidence-based institutional learning analytics policy” at the 13th European Conference on Technology Enhanced Learning on 3 September at the University of Leeds. The workshop started with a presentation of the project findings and of how the development of the SHEILA framework took into account different stakeholders’ expectations and concerns about learning analytics. The workshop served the project goal of engaging “other stakeholders” in the process of policy & strategy development, in addition to the project’s direct engagement with senior managers, teaching staff, students, and learning analytics experts. Four representatives of the academic partners of the SHEILA project delivered the workshop.
One interesting question raised by the audience was “Is it possible to resolve the conflicts among different stakeholders’ expectations of learning analytics?” We thought the answer was “unlikely”. However, a key purpose of policy development is to minimise such conflicts.
After the presentation, we invited all participants to join the hands-on session – drafting an institutional policy for the use of learning analytics using the SHEILA framework. Before diving into the six key dimensions of the framework, we discussed the purposes of developing institutional policies. One key piece of feedback was that a policy serves political purposes; that is, it signals institutional priorities and thereby helps scale up adoption in a systematic way. It also ensures quality standards in the educational services offered across the university.
Dimension 1 – Map political context
One participant raised the point that involving high-level managers would be a key action for this dimension, as institutional leaders have a better overview of the institution’s internal and external drivers for learning analytics. This is a good example of how the six dimensions of the SHEILA framework do not need to happen in a fixed order. In this case, actions taken in Dimension 1 rely on actions taken in Dimension 2 – Identify key stakeholders.
Dimension 2 – Identify key stakeholders
External stakeholders, such as industries that offer apprenticeships to university students, were identified as key stakeholders, and a related challenge was to strike a balance between the values held by universities and those held by industries. As a result, it is particularly important to establish a diverse working group to manage these interests and share data ethically.
Dimension 3 – Identify desired behaviour changes
One of the expected changes is that data will be used to inform decisions in the daily practices of higher education. However, it is crucial to provide examples showing what is meant by “data-driven decisions” and to clarify that learning analytics is not meant to replace human contact but to enhance existing communication with students and the feedback support they receive.
Dimension 4 – Develop engagement strategy
As the discussions of the previous dimensions highlighted, communicating the purposes of learning analytics is key to buy-in, and the hierarchical structure of an institution can be used to spread ideas efficiently to stakeholders at lower levels, e.g., by raising awareness among learning & teaching committees in each Department/School and among Student Unions. It was also pointed out that contrasting the benefits of adopting learning analytics with those of not adopting it could be an effective way to gain buy-in (or to identify ways to gain buy-in).
Dimension 5 – Analyse internal capacity to effect change
Although the participants’ institutions generally have good resources to draw on in terms of analytics expertise and infrastructure, these resources were not necessarily shared in a way that would support wide adoption of learning analytics. In particular, some Departments/Schools tend to have less support than others when it comes to learning analytics. This shows the importance of key leadership in orchestrating existing resources and facilitating first-, second-, and third-line support for all members of the institution.
Dimension 6 – Establish monitoring and learning frameworks
As always, our discussions of the previous dimensions were so enthusiastic that we ran out of time for the last dimension. However, we emphasised the importance of reviewing the overarching goal for learning analytics and the objectives identified in the third dimension of the framework. Expectation management is particularly crucial for this dimension, as it will have a determining impact on how useful different stakeholders perceive learning analytics to be.
To conclude this blog post: we had a good time and good discussions with all the participants. Thank you all for your contributions to the discussion. Until next time!
Yi-Shan Tsai, 03 September 2018