Critiques of learning analytics?

Re: Critiques of learning analytics?

by Dianne Rees -
Number of replies: 2
I come from a science background and I'm an ex-genetics researcher, so I accept the idea of analytics being useful and important to decision making. However, there's a saying in genetics that you get what you screen for. In other words, your view of the world is limited by the initial question you set up and the actual outcomes you are measuring (e.g., does the fly die, vs. does the fly have disturbed cell signaling and then die, or does the fly die because its food was prepared wrong, which may or may not be on your radar). The parameter you measure affects your ability to draw conclusions, and yet there's a seductive tendency, because numbers are attached, to treat conclusions as particularly authoritative. The study is then summarized to stand for those conclusions, and people stop looking into the underlying assumptions and data because, well, that takes time, and we may feel uncomfortable questioning people who have a reputation for knowing things. Plus everyone on Twitter is retweeting the numbers, so doesn't that make it so? :)

An issue I have with LAK, as currently implemented in higher education, is that instructors appear to be letting the systems drive the questions (e.g., what the LMS is programmed to measure drives the questions being asked). The result is that the questions aren't particularly connected to the important issues that get at the heart of why instruction is or isn't working. I do think (or hope) that most researchers realize you bring multiple tools to bear on problems, both quantitative and qualitative, and your qualitative tools can recast the way you assess the numbers. I disagree with one article's assessment that the "numbers speak for themselves", except insofar as this means you can sometimes find answers (or at least suggestions) in numbers that you weren't looking for.

I do think privacy is a concern, but it can be tackled with proper levels of trust, safeguards, and, most importantly, transparency. The same issues affect the field of health informatics and raise the need for constant scrutiny and care, but they aren't a complete deal breaker.
In reply to Dianne Rees

Re: Critiques of learning analytics?

by Gillian Palmer -
Dianne, I think you are quite right that the teaching staff are letting others drive the analytics agenda, and more needs to be done to balance out the questions asked. I would add that timeliness of analysis and response is key to utility. I've seen so many earnestly dissected stats that have about as much use as a penny-farthing cycle, because students, employment prospects, government policies, technologies and syllabi have all moved on by the time anyone even hears of the results, never mind does anything with them.
In reply to Gillian Palmer

Re: Critiques of learning analytics?

by Jenni Hayman -
My comment is not a critique as such, but an observation that many of the criticisms and fears expressed in this thread support the need for a personal and deep understanding of how analytics work and how they might be manipulated to suit the agenda of the analyst(s). Knowledge is power in this regard. I take the same rigorous approach to statistics and research findings. As an Instructional Designer and Instructor who will always seek feedback to improve my work, I do not wish to simply be fed information without fully understanding its provenance.

I view it as an exceptionally serious responsibility. You can't play a game well unless you know both the rules and the unspoken subtleties.

Also, I try to keep the chill off cold, hard facts by warming them with a hopeful interpretation of what they truly mean to humans.