On 14th and 15th March I attended Jisc’s annual Digifest event, held at the International Convention Centre in Birmingham. After a conference, I always feel like I need a few days to rest and think. You attend so many sessions and spend so much time listening, discussing, tweeting and – at this particular event – playing with Virtual Reality and robots that it takes a while for all the ideas and discussions to sink in. After looking back over some of the tweets, I’m definitely suffering from Conference Session Envy. The second day of the programme in particular had a lot of good sessions that clashed with each other. Thankfully, Jisc live-streamed some of the sessions so I – and anybody else who’s interested – can catch up on the Digifest website. In this post I’m going to reflect on one session in particular. My Part 2 post will focus on some of the other sessions I attended and discuss some of the tech I got to play with.
Humans vs Machines
Learning analytics interventions should always be mediated by a human being
This session faced stiff competition from Eric Stoller’s session on ‘Why educators can’t live without social media’ but I decided that I’m already converted to Eric’s cause, having seen him speak before, and I’ve got much more to learn about learning analytics.
This session was a debate between Richard Palmer from Tribal, who was arguing the case for machines, and Sheila MacNeill from Glasgow Caledonian University, who was arguing in favour of humans. Leanne Etheridge from Cardiff Metropolitan University was originally billed to represent ‘the humans’ but was unfortunately unable to attend. The structure of the debate was as follows:
The machine side would argue that learning analytics can provide timely and effective interventions, improving students’ chances of achieving better qualifications: machines don’t forget or get sick; learning analytics is more accurate and free of prejudice; and there is evidence that automated interventions work.
The human side would argue that although machines can make predictions, they will never be 100% accurate; that only a person can factor in personal circumstances; that automated interventions could be demotivating; and that automated interventions are not ethical.
Both Sheila and Richard blogged about their respective sides of the debate in advance of Digifest; their posts are linked below so you can get a quick summary of their main points.
- Time for analytics of the oppressed? [Sheila MacNeill’s blog]
- Learning analytics: ditch the humans, leave it to the machine [Richard Palmer writing on the Jisc blog]
I thought both sides did a fine job of defending their positions, but I confess that I started out and remain biased towards humans rather than machines. While I am aware of the positive potential of learning analytics, I don’t think things should ever be left to computer-mediated interventions alone. I thought Sheila raised some fantastic points on the need to think critically and open up dialogues around why we are collecting such vast amounts of data from students, and that we need to be transparent about what we intend to do with it. On the surface, it may seem that computers are ‘neutral’ and don’t come with the baggage and bias of humans, but who programmed the machines? Humans, with the same imperfections and biases. Education can be messy and complicated, and higher education can prove an emotional and stressful time for many. Without support and human empathy from tutors and other support staff, drop-out figures could be even higher. Should we ditch human interventions completely? Can machines really be ‘better’ than humans?
Automation of certain interventions and services may prove effective and be able to signpost students to services within a university or college that they may not know exist. Perhaps a timely notification or automated email, based on a student’s activity on campus or with systems such as Moodle and online library resources, could be the intervention that spurs them on to ask for help? Teaching staff could also identify when a student may be struggling, due to low attendance or lack of engagement with university systems and services, and could use the information to decide whether to contact the student. Nottingham Trent University have successfully implemented their NTU Student Dashboard, which has had a positive impact on student engagement, and I do think there is potential for such a system at YSJ.
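To make the idea above a bit more concrete, here is a minimal sketch of the kind of rule-based ‘nudge’ logic a dashboard like NTU’s might run behind the scenes. Everything here – the `EngagementRecord` structure, the field names and the thresholds – is an illustrative assumption, not the API of any real system, and note that the code only ever *suggests* an action for a member of staff to review, echoing the debate’s human-mediated position.

```python
# Hypothetical sketch of a rule-based intervention trigger.
# All names and thresholds are illustrative assumptions, not a real system's API.
from dataclasses import dataclass
from typing import Optional


@dataclass
class EngagementRecord:
    student_id: str
    attendance_rate: float        # fraction of timetabled sessions attended (0.0-1.0)
    vle_logins_last_14d: int      # e.g. Moodle logins in the past fortnight
    library_visits_last_14d: int  # visits to online library resources


def suggest_intervention(record: EngagementRecord,
                         attendance_threshold: float = 0.6,
                         login_threshold: int = 2) -> Optional[str]:
    """Return a suggested action for a human tutor to review, or None.

    The machine flags; the decision to actually contact the student
    stays with a member of staff.
    """
    low_attendance = record.attendance_rate < attendance_threshold
    low_vle_use = record.vle_logins_last_14d < login_threshold

    if low_attendance and low_vle_use:
        # Both signals are low: suggest a personal check-in, not an automated email.
        return f"Flag {record.student_id} to personal tutor for a check-in"
    if low_vle_use:
        # Only VLE use is low: a gentle automated signpost may be enough.
        return f"Send {record.student_id} a signpost email about VLE and library support"
    return None  # engagement looks healthy; no nudge suggested
```

The design choice worth noting is that the most serious flag routes to a human rather than triggering an automated message – exactly the ‘always mediated by a human being’ stance the debate motion describes.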
At the end of the debate, we did a straw poll on which side we thought had won. I was surprised to see an even spread for both sides, as I personally thought that Sheila had made more thoughtful and carefully considered points that addressed wider issues around data-driven approaches and their inherent biases. It seems that the word ‘always’ in the debate title was the grey area for many people, and indeed Richard Palmer did concede that there is absolutely a need for human intervention in certain cases. I’m interested in following developments in learning analytics over the next few years and will be keeping an eye on both ‘sides’, despite my pesky human biases! I’ve put together a Storify below of some of the live tweets from the debate, and underneath it there is a video of Sheila and Richard debriefing after the debate. In the meantime, I’ll leave you with a quote from Sheila’s blog which I think is especially important to consider in the current climate:
In these days of alternative facts, distrust of expert knowledge, human intervention is more crucial than ever. Human intervention is not just an ethical issue, it’s a moral imperative. We need to care, our students need to care, our society needs to care.
View the Storify on storify.com
What do you think? Are you ‘Team Humans’ or ‘Team Machines’?! I’d love to hear your thoughts and carry on the debate in the comments below. Look out for my Part 2 post before the end of the week!