Critiques of learning analytics?

by George Siemens -
Number of replies: 28
What are your concerns with analytics when applied to learning and knowledge? What types of critiques and concepts should we explore/consider?

I've started with a few quick thoughts on the topic here: http://www.learninganalytics.net/?p=101
In reply to George Siemens

Re: Critiques of learning analytics?

by Viplav Baxi -

I think Scott is being vastly unjust with his comments. Be that as it may...

At the very basic level, there are many arguments for or against statistical analyses and other forms of analytics (such as those generated by "intelligent" systems). The arguments address:

* generalizability – do the analytics imply that we can take general actions and predict outcomes?
* appropriateness – are the analytics appropriate to generate for the domain under consideration?
* accuracy – did we have enough information, did we choose the right sample?
* interpretation – can we rely on automated analytics, or do we need manual intervention, or both?
* bias – are the analytics used to support an underlying set of beliefs?
* method – were the methods and assumptions correct?
* predictive power – can the analytics give us sufficient predictive power?
* substantiation – are the analytics supported by other empirical evidence?

I remember throwing this up in the Edfutures discussion as well.

But why LAK11 should be a smashing success, and why it should spawn a new field, is surely obvious: for years we have been saddled with inefficient (sometimes barbaric) analytics prone to the same reductionism that Scott rails against. This is an opportunity to see a different approach. Not only that: this is the forum that could potentially witness an emergent dialogue based on a theory of knowledge and learning that embraces chaos, complexity, self-organization and emergence.

Viplav

In reply to George Siemens

Re: Critiques of learning analytics?

by Hans de Zwart -
One area that might be worth exploring is the perverse effects of using measurement to change policy. By chance I encountered Goodhart's law (http://en.wikipedia.org/wiki/Goodhart's_law) this week; in short, when a measure becomes a target, it ceases to be a good measure. There are some related "laws" mentioned in the article that I think are interesting too.
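To make the dynamic concrete, here is a minimal toy simulation (my own construction, not from the article; the "hours in the LMS" proxy and all the numbers are invented): a metric correlates with learning while nobody is rewarded on it, and the correlation collapses once people optimize the metric itself.

```python
import random

random.seed(11)

def correlation(xs, ys):
    """Pearson correlation, computed by hand to keep the sketch dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# Before "hours logged in the LMS" becomes a target: hours loosely track learning.
learning = [random.gauss(50, 10) for _ in range(200)]
hours = [0.1 * l + random.gauss(0, 1) for l in learning]
print("correlation before targeting:", round(correlation(hours, learning), 2))

# After hours are rewarded: everyone pads toward a quota, regardless of how
# much they actually learned, and the proxy stops measuring anything.
gamed = [10 + random.gauss(0, 0.5) for _ in learning]
print("correlation after targeting: ", round(correlation(gamed, learning), 2))
```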
In reply to George Siemens

Re: Critiques of learning analytics?

by Shahrinaz Ismail -
I think I've mentioned before, in one of my replies or reviews, that learning analytics is becoming more and more of an 'education management' area. Yes, we can still use it to measure performance and understand the situation in classroom scenes, but what is the point if only a few classes (or facilitators) use it, and the results for the overall system cannot be tabulated?

I guess I'm talking on behalf of 'small lecturers' at a newly establishing university that has yet to understand and realise the power of learning analytics.

My concerns are more about measurement - whether the results meet the initial objectives we set (because we tend to 'drown' in the pool of overflowing data) - and what comes next...

I mean, we can analyse all we want from the stats and figures we retrieve, but do we really know what to do with them, or what difference we can make with them? Because again, this is at the education management level, and the response to the results may be "this is for the registrar's dept to change, not me" or "this is my faculty board's responsibility, not me".

Honest opinion,
Shazz
Kuala Lumpur 6:13AM 17Jan2011
In reply to George Siemens

Re: Critiques of learning analytics?

by Tony Searl -

Tending to think LAK is here, ready (correct?) or not. If one agency doesn't employ it, the next one will.

Whoever manages their bias agenda better will have first use of the attendant moral rectitude. Statistically speaking, of course.

Currently I'm entirely biased towards Peggy Holman's emergence stance of 'hang on and confidently hope'. As dangerous as that may be, I'd rather saddle up than be left dejected, not knowing, at the glue factory.

@PeggyHolman We can’t control emergence. We can engage it confident that unexpected & valuable breakthroughs can occur http://peggyholman.com/papers/engaging-emergence/engaging-emergence-table-of-contents/

I also hope Scott Leslie can find the energy to engage. His yip yip comment ("I mean REALLY hard, because that ... doesn't even start to capture the amount of bullshit this (LAK) smells like to me.") is just outrageously reductivist. *big grin*

http://nogoodreason.typepad.co.uk/no_good_reason/2011/01/learning-analytics-2011-is-the-place-to-be.html?cid=6a00d8341c0c0e53ef0147e195fa00970b#comment-6a00d8341c0c0e53ef0147e195fa00970b

 

In reply to Tony Searl

Re: Critiques of learning analytics?

by George Siemens -
Hi Tony - I'm less interested in *who* offers a solid critique...and more interested in getting some discussion going on potential drawbacks to analytics. I'm not content with an emotional reaction or vague dismissal of analytics. Simply reacting negatively to numbers and quantification, without some basis on which to explore the critiques, is on par with superstition.

I see several concerns arising in relation to analytics:

1. It reduces complexity down to numbers, thereby changing what we're trying to understand
2. It sets the stage for the measurement becoming the target (standardized testing is a great example)
3. The uniqueness of being human (qualia, art, emotions) will be ignored as the focus turns to numbers. As Gombrich states in "The Story of Art": "The trouble about beauty is that tastes and standards of what is beautiful vary so much." Even here, we can't get away from this notion of weighting/valuing/defining/setting standards.
4. We'll misjudge the balance between what computers do best...and what people do best (I've been harping for several years about this distinction as well as for understanding sensemaking through social and technological means).
5. Analytics can be gamed. And they will be.
6. Analytics favour concreteness over accepting ambiguity. Some questions don't have answers yet.
7. The number/quantitative bias is not capable of anticipating all events (black swans) or even accurately mapping to reality (Long Term Capital Management is a good example of "when quants fail": http://en.wikipedia.org/wiki/Long-Term_Capital_Management )
8. Analytics serve administrators in organizations well and will influence the type of work that is done by faculty/employees (see this rather disturbing article of the KPI influence in universities in UK: http://www.nybooks.com/articles/archives/2011/jan/13/grim-threat-british-universities/?page=1 )
9. Analytics risk commoditizing learners and faculty - see the discussion of Texas A & M's use of analytics to quantify faculty economic contributions to the institution: http://www.nybooks.com/articles/archives/2011/jan/13/grim-threat-british-universities/?page=2
10. Ethics and privacy are significant issues. How can we address the value of analytics for individuals and organizations...and the inevitability that some uses of analytics will be borderline unethical?

As Viplav stated, analytics/statistics/research/data collection have methods and criteria that can be employed to ensure that we're dealing with accurate and representative data.
In reply to George Siemens

Re: Critiques of learning analytics?

by Chris Lott -
Well, you've hit most of the points that make me nervous about learning analytics becoming the buzz-term du jour (particularly points 1-3, 5, and 6), and which I'd bet reflect some of what makes Scott similarly angsty. I've seen too many examples where what is at root a common-sense proposition (that the context of teaching and learning produces a variety of data) in support of an obvious idea (some of that data could be useful) becomes co-opted and commoditized, and ultimately the basis of a cottage industry of repetitive pundits, restatements of the same points, and packaging of derived principles that are outright damaging to education.

I suppose there's something noble in the idea of trying to grab hold of the weapon that is being used to kill us and turn it into something good, but it takes a faith I don't have that it's even a possible outcome, much less a likely one. Look at the state of education and technology and educational technology and enlighten me as to where that faith should come from?

One way to think about this would be to consider one of your keynote speakers, (all hail) Tony Hirst. Does the new sexiness of LA seem to be a place where the kind of work Tony has done will be recognized as it should be (but has not been), or will it ultimately promote the kind of shallow, reflexive use of data as a cudgel that has come to epitomize what makes up the de facto field of LA so far?
In reply to Chris Lott

Re: Critiques of learning analytics?

by George Siemens -
Hi Chris, I appreciate your reflections.

"I suppose there's something noble in the idea of trying to grab hold of the weapon that is being used to kill us and turn it into something good, but it takes a faith I don't have that it's even a possible outcome, much less a likely one."

I'm an idealist, I guess, in this regard. As I stated on Bill Fitzgerald's blog, there is a degree of inevitability to the development and use of learning analytics in education. I am uneasy about many aspects of it, but if I attempt to be part of the conversation, then I a) assuage my unease by thinking I'm doing the right thing, or b) actually make a meaningful contribution that is in line with my views of education/learning. But, as you've noted, there are good reasons to not be too optimistic!

In terms of Tony Hirst's work - I hope that the creativity he exhibits with data will be the spirit that influences what we do with analytics in learning - i.e. analysis that provides new insights and gives learners greater control of their learning.
In reply to George Siemens

Re: Critiques of learning analytics?

by Bill Fitzgerald -

George Siemens wrote,

there is a degree of inevitability to the development and use of learning analytics in education.

I've been thinking about the inevitability question with regard to the increased use of analytics in learning and training scenarios - and it's worth asking: just because something might be happening regardless, why not provide people the tools to discuss the shortcomings of this (potential) new reality instead of looking to hasten it?

George's list of concerns are all great concerns; the things I discussed on my blog tie into some of these notions as well.

But my single biggest hesitation with how analytics are being discussed here is the near complete exclusion of the rights of the learner in the equation. Analytics purports to make the processes of learning better *as measured by the needs of the organization* providing the learning. There is no feedback mechanism to actually assess the validity/usefulness of what is being analyzed. In other words, using a strictly analytical approach, a person could be judged ineffective at doing something that is inane and mindless - and the person would be "wrong."

Of secondary note - returning to the line that the increased use of analytics is inevitable - is the false dichotomy between two mutually exclusive extremes: embrace analytics because we're powerless to stop it, versus refuse analytics and get left behind.

Other options exist. There is a lot of great data work that can and should be done that will help inform how people learn more effectively, and this can be gleaned without compromising the privacy of people who are learning.

Additionally, there is a lot that can be done that does not require any data crunching. The human element of successful learning is still poorly/incompletely understood. In contrast, some of the optimism around analytics feels comparable to the optimism we heard about computers in the classroom, then the internet in every school, then 1:1 initiatives, then virtual worlds, then the transformative power of game-based learning, etc. I would hope that we could outgrow our pursuit of silver bullets.

We need to resist the urge to paint people who have valid hesitations about compromising personal privacy in the interest of organizational efficiency as luddites, or as people who will be left behind.

Also, re Chris Lott's comment, above:

One way to think about this would be to consider one of your keynote speakers, (all hail) Tony Hirst. Does the new sexiness of LA seem to be a place where the kind of work Tony has done will be recognized as it should be (but has not been), or will it ultimately promote the kind of shallow, reflexive use of data as a cudgel that has come to epitomize what makes up the de facto field of LA so far?

Amen. Tony's work is a model for what we can strive for, but I'm skeptical that people will actually know quality when they see it. We don't need to look very far to see people who should know better making craptastic use of data.

In reply to George Siemens

Re: Critiques of learning analytics?

by Tony Searl -

George

I've only been considering LAK11 for a week or so, and your comprehensive list of concerns disturbs me. Not so much for who said it, or not, but that a preconceived bias may have been brought to the table. I love analytics unreservedly. *evil grin*

I'd be curious what a combined worst case scenario (WCS) based on all your concerns would look like.

I suspect if it was the least expensive model (maybe also the cheapest?) it would be an education system only ROI agendas could abide.

Can someone build the following in-world? Then hawk it around for comment and publish the reactions. I know - too predictable, eventual opponents will say. That is the point; scenarios can be built because the machine side of LA can be accurately modelled.

So fix me up this WCS: Narrow outcomes, for profit, concrete, numbered control via computers, employee/er gaming, commoditised and standardised learners with productivity based on value added. Black swans ignored or culled, depending on cost.

A collective voice for the future certainties professional educators don't want is one place to start with LA. Use it on itself.

In reply to George Siemens

Re: Critiques of learning analytics?

by Nicola Avery -
Regarding 4-7, there was an interesting lecture at MIT that Robert Merton gave a couple of years ago regarding the use of quants and failure in terms of technology engineering, including:

“Things are not conceptually out of control…. There are plenty of bad and incompetent people…but if there are well meaning, ethical people, there is still a structural problem…If we do 100 innovations…lucky if 2 of them are successful, so not feasible to produce a complete infrastructure for every innovation in advance or whilst simultaneously creating them…It's a trade-off, judgement…Models are always incomplete descriptions of complex reality. The need for financial engineering is going up, not down. 600 trillion of derivatives are not going away – it's like saying we're going to get rid of cars.” [1]

Also, 80 years ago, Knight on risk:

“If risk taking were exclusively of the nature of a known chance or mathematical probability, there could be no reward of risk-taking, the fact of risk could exert no considerable influence on the distribution of income in any way…the existence of a problem of knowledge depends on the future being different from the past, while the possibility of the solution of the problem depends on the future being like the past.” [2]

And Paul Wilmott & Emanuel Derman, writing the Financial Modelers' Manifesto, also attempt to outline the ethical issues relating to modelling and prediction:

"...Whenever we make a model of something involving human beings, we are trying to force the ugly stepsister’s foot into Cinderella’s pretty glass slipper. It doesn’t fit without cutting off some essential parts. And in cutting off parts for the sake of beauty and precision, models inevitably mask the true risk rather than exposing it. The most important question about any financial model is how wrong it is likely to be, and how useful it is despite its assumptions. You must start with models and then overlay them with common sense and experience."3

I agree with some of Robert Merton's points, and they are similar to why I think there is value in producing learning analytics, but I don't wish to be in a position where a learning 'market' is dominated by a big few in the same way financial markets are - and I don't see how that can be avoided. I am unfamiliar with prediction in learning analytics specifically, but would assume it would be similar, whether the model is developed for a person's learning, a module of learning, or an institution's practice.

I don't think there is a lot of legal precedent in relation to user consent, although it is growing.

e.g. having done a quick look today
Privacy - “intrusion into seclusion” is a form of invasion of privacy in which one intentionally intrudes into the “seclusion” or private affairs of another in a manner that would be considered offensive or objectionable to a reasonable person.

(with the underlying complication of defining offensive or objectionable, reasonable and where the legal line is drawn..)

and the historical background in Privacy Considerations for Internet Protocols.

Whilst still wrestling with all this, I don't wish to just be a learner in a kind of Pink Floyd 'Comfortably Numb' state, so this forum is helpful - thank you.


1. Merton, R. (2009) Observations on the Science of Finance in the Practice of Finance, MIT World, available at http://mitworld.mit.edu/video/659
2. Knight, F. (1921) Risk, Uncertainty and Profit, Library of Economics and Liberty, available at http://econlib.org/library/Knight/knRUPCover.html
3. Wilmott, P. & Derman, E. (2009) Financial Modelers' Manifesto, Wilmott.com, available at http://www.wilmott.com/blogs/paul/index.cfm/2009/1/8/Financial-Modelers-Manifesto
& Derman, E. (2009) Scientists, Sciensters, Anti-Scientists and Economists, Economic Manhattan Project, available at http://streamer.perimeterinstitute.ca/Flash/afa2290d-2a19-46b8-8dcf-c2fbc86a0a17/viewer.html



In reply to Nicola Avery

Re: Critiques of learning analytics?

by Vanessa Vaile -
visions of "learning derivatives" dance through my head
In reply to George Siemens

Re: Critiques of learning analytics?

by Murray Richmond -
I would add another concern, which is implied in your list, about the metric or the basic units being measured - e.g., what does the number of "interactions" tell us about the "quality or value" of a discussion?

How do we compare data from a Moodle course that is set up as an instructor's resource dump versus one that emphasizes learner participation in discussion forums, wikis, and in discovering and posting relevant resources for the learning community of that course?
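To illustrate why raw counts mislead here, a small hypothetical sketch (the logs, names, and fields are all invented): two courses produce the identical interaction total, yet even crude structural measures expose how different they are.

```python
from collections import Counter

# Two invented forum logs as (author, parent_index) pairs; parent None = top-level.
# A "resource dump" course and a discussion-centred course, same interaction count.
resource_dump = [("instructor", None)] * 8
discussion = [("instructor", None), ("ann", 0), ("bob", 0), ("ann", 2),
              ("cho", 1), ("bob", 4), ("cho", 3), ("ann", 5)]

for name, log in (("resource dump", resource_dump), ("discussion", discussion)):
    authors = Counter(author for author, _ in log)        # who participated
    replies = sum(1 for _, parent in log if parent is not None)  # actual back-and-forth
    print(f"{name}: {len(log)} interactions, "
          f"{len(authors)} participants, {replies} replies")
```

Both logs report "8 interactions", but one has a single participant and zero replies; a dashboard that stops at the count cannot tell them apart.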
In reply to George Siemens

Re: Critiques of learning analytics?

by Marielle Lange -
It may help to define terms before engaging in critique.

In the business field, "Analytics often involves studying past historical data to research potential trends, to analyze the effects of certain decisions or events, or to evaluate the performance of a given tool or scenario. The goal of analytics is to improve the business by gaining knowledge which can be used to make improvements or changes." (source: http://www.businessdictionary.com/definition/analytics.html)

In the web-design field, "Measurement tool that analyzes user behavior based on logs of activity on a website. Includes information such as entry and exit pages, most popular pages, paths through the site, links from other sites, and search terms." (source: http://nform.com/tradingcards/web-analytics)

What is not too helpful is that you mix two different notions in your post: evaluation and analytics. Your list of concerns has to do with "analytics" used as the ONLY and exclusive means of evaluation. Which it rarely is.

It is important to be aware that analytics should be part of a strategy that should be carefully managed. Project management is a well established practice in many fields. It is typically characterized by these different stages:
* analysis-phase – discovery
* design-phase – problem definition
* development-phase – write solution
* implementation-phase - deliver and manage → delivery
* evaluation-phase – evaluate how well you did
* revision – made as necessary.

Instructional designers may have come across the ADDIE acronym, which captures these different steps, though in a simplified manner.

Analytics are typically used during the evaluation stage. Evaluation can be defined as "The process of gathering information in order to make good decisions. It is broader than testing, and includes both subjective (opinion) input and objective (fact) input. Evaluation can take many forms including memorization tests, portfolio assessment, and self-reflection." (source: http://www.nwlink.com/%7Edonclark/hrd/glossary/e.html)

A huge range of tools is available to perform evaluations. One of them is analytics. It is just one tool in the toolkit. You pick up the analytics tool whenever you think it is more suitable than the other tools you have in the kit. Whether you pick it or not depends on your goals and strategy.

However, and most importantly, you don't select that tool very late in the process. Both your evaluation and assessment strategy should be defined early, at the design stage. Evaluation will be useless unless you have clearly defined & tangible objectives.

This is unambiguously recognized in the field of web-design, where analytics is extensively used:

"Site Strategy: Defining your own goals for the site can be surprisingly tricky. Arriving at a common understanding of the site's purpose for your organization, how you'll prioritize the site's various goals, and the means by which you'll measure the site's success are all matters of site strategy."
(source: http://www.jjg.net/ia/files/pillars.pdf)

In reply to George Siemens

Re: Critiques of learning analytics?

by Marielle Lange -
What you described with your list of concerns is this:
http://en.wikipedia.org/wiki/Modern_Times_(film)
A 20th century system.

The question, then, is whether we need to reject analytics because these concerns may be relevant to a 20th-century system, or instead have education start to move into the 21st century so that analytics can be used effectively.
In reply to Tony Searl

Re: Critiques of learning analytics?

by Mary Rearick -
Tony,
I agree with your observation: "Whoever manages their bias agenda better will have first use of the attendant moral rectitude. Statistically speaking, of course."

Thanks for the reference to Peggy Holman's book. Enjoyed reading it. Holman talks about patterns emerging from chaos... and she remains optimistic...

Nevertheless, I am reminded of a recent quote by Margaret Wheatley
http://www.brainyquote.com/quotes/authors/m/margaret_j_wheatley.html

I'm sad to report that in the past few years, ever since uncertainty became our insistent 21st century companion, leadership has taken a great leap backwards to the familiar territory of command and control.
Margaret J. Wheatley

Mary


In reply to George Siemens

Re: Critiques of learning analytics?

by Bert De Coutere -
Many points were raised here, and I'd like to emphasize two:

1- (close to Hans' critique via Goodhart's law) I worry about the self-fulfilling prophecy effect of using analytics.
Such an effect can arise either because you 'get what you measure': once people know how the algorithms work, they will take that into account (close to the Search-Engine-Optimization business that grew out of Google's secret algorithm).
Or a self-fulfilling prophecy can occur through actions that reinforce it: e.g. if you detect certain behaviors or content that contribute to success, you will promote them and as such make them even more important in the model (see the sketch at the end of this post).

2- I see a tendency to apply analytics just to the 'internal', 'operational' side of corporate training or public education. Those are sanitation statistics on our internal workings. While helpful, my critique is that we might be focussing on where analytics have the least value/impact. In business, learning analytics have the potential to finally deliver on the promises to better align with business needs and emerging trends, with workforce predictions and with work performance. In education, analytics might equally not just optimise our own black box, but reach out to parents, provide actionable dashboards for policy makers, etc.
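Here is a minimal sketch of the reinforcement loop in point 1 (my own toy construction; the resource names and numbers are invented): three resources of identical quality, where the model promotes whatever currently looks important and then credits it for ordinary success, so one arbitrary resource's apparent importance runs away from the others.

```python
import random

random.seed(42)

# Three resources with identical true quality; the model's learned
# "importance" weights start out equal too.
weights = {"video": 1.0, "quiz": 1.0, "reading": 1.0}
TRUE_QUALITY = 0.5  # every resource helps students succeed equally often

for _ in range(1000):
    # Promotion: show resources in proportion to their current weight.
    total = sum(weights.values())
    r = random.uniform(0, total)
    for resource, w in weights.items():
        r -= w
        if r <= 0:
            break
    # The student succeeds (or not) at the same rate for every resource...
    if random.random() < TRUE_QUALITY:
        # ...but only the promoted resource gets the credit, closing the loop.
        weights[resource] += 0.1

print(weights)  # one arbitrary resource typically ends up far "more important"
```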
In reply to Bert De Coutere

Re: Critiques of learning analytics?

by Adam Weisblatt -
My concern with any kind of measure is that you have to think about the costs of measuring. Too many senior leaders ask for numbers without giving a thought to the consequences. It takes time and resources to capture and analyze data.

In the same way, people need to build measurement into the design process, not as an afterthought.

It drives me crazy when people ask for numbers to justify the ROI of learning. People scramble to pull the numbers, which takes them away from providing value. What is the ROI of measuring ROI?

The best would be if you could analyze data that occurs naturally in the process of accessing information. I think that aggregate data can be useful and doesn't affect privacy or individual performance because it doesn't focus on any one person. For this reason, I am looking forward to the discussion on Big Data.
In reply to Adam Weisblatt

Re: Critiques of learning analytics?

by Anna Dyckhoff -
My concern is that we close our eyes too soon if we get numbers that are satisfactory. For example: if somebody asks me about the e-learning at our university, I can show that the number of courses supported by the LMS is constantly increasing. I have impressive diagrams :) (We even have more courses registered in the LMS than actual courses.) To my surprise, my audience has never asked how the LMS is used!
In reply to Anna Dyckhoff

Re: Critiques of learning analytics?

by Ken Masters -
Anna, when I've asked that question (how the LMS is used), the most common responses were:

  1. That is information about individual courses and teachers, and we're not permitted to give that information (i.e. it is considered spying, running counter to privacy rules, academic freedom, etc.).
  2. The 'system' does not permit us to extract that information.
  3. We do not have the resources (i.e. people and time) to extract that information (alternately phrased as interfering with people's work of supporting the LMS).
  4. A reference to one or two 'successful' (cherry-picked) courses to give the impression that it is common across campus.
  5. A broad generalisation, or a listing of tools and what 'can' be done.
  6. Any combination of the above (#1 is very effective when used in conjunction with #4 and #5).
Perhaps people have just become cynical.

The best thing to do is to volunteer that information.

In reply to George Siemens

Re: Critiques of learning analytics?

by David Jones -
This idea connects with some of what has already been said, but it does provide a useful label, I think.

McNamara's fallacy http://en.wikipedia.org/wiki/McNamara_fallacy

i.e. ignore what can't be measured and assume it isn't important.

Some discussion from last year of this fallacy as it applies to analytics and the idea of engagement

http://davidtjones.wordpress.com/2010/08/09/the-mcnamara-fallacy-and-pass-rates-academic-analytics-and-engagement/
In reply to David Jones

Re: Critiques of learning analytics?

by Mary Rearick -
Or in a similar vein:

Learning and Learning Analytics

"Not everything that counts can be counted, and not everything that can be counted counts." (Sign hanging in Einstein's office at Princeton)
In reply to George Siemens

Re: Critiques of learning analytics?

by Dianne Rees -
I come from a science background - I'm an ex-genetics researcher - so I accept the idea of analytics being useful and important to decision making. However, there's a saying in genetics that you get what you screen for. In other words, your view of the world is limited by the initial question you set up and the actual outcomes you are measuring (e.g., does the fly die, vs. does the fly have disturbed cell signaling and die, or does the fly die because its food was prepared wrong - which may or may not be on your radar). The parameter you measure affects your ability to draw conclusions, and yet there's a seductive tendency, because numbers are attached, to treat conclusions as particularly authoritative. The study is then summarized to stand for those conclusions, and people stop looking into the underlying assumptions and data because, well, that takes time, and we may feel uncomfortable questioning people who have a reputation for knowing things. Plus, everyone on Twitter is retweeting the numbers, so doesn't that make it so? :)

An issue I have with LAK, as currently implemented in higher education, is that instructors appear to be letting the systems drive the questions (e.g., what the LMS is programmed to measure drives the questions being asked). The result is that the questions aren't particularly connected to the important issues that get at the heart of why instruction is or isn't working. I do think (or hope) that most researchers realize you bring multiple tools to bear on problems, both quantitative and qualitative, and your qualitative tools can recast the way you assess the numbers. I disagree with one article's assessment that the "numbers speak for themselves", except insofar as it means you can sometimes find answers (or at least suggestions) in numbers that you might not have been looking for.

I do think privacy is a concern, but this can be tackled with proper levels of trust, safeguards, and most importantly transparency. The same issues impact the field of health informatics and raise the need for constant scrutiny and care, but they aren't a complete deal breaker.
In reply to Dianne Rees

Re: Critiques of learning analytics?

by Gillian Palmer -
Dianne, I think you are quite right that teaching staff are letting others drive the analytics agenda, and more needs to be done to balance up the questions asked. I would add that timeliness of analysis and response is key to utility. I've seen so many earnestly dissected stats that have about as much use as a penny-farthing cycle, because students, employment prospects, government policies, technologies and syllabi have all moved on by the time anyone gets to even hear of the results, never mind do anything with them.
In reply to Gillian Palmer

Re: Critiques of learning analytics?

by Jenni Hayman -
My comment is not a critique as such, but an observation that many of the criticisms and fears expressed in this thread support the need for a personal and deep understanding of how analytics work and how they might be manipulated to suit the agenda of the analyst(s). Knowledge is power in this regard. I have the same strong approach to statistics and research findings. As an Instructional Designer and Instructor who will always seek feedback to improve my work, I do not wish to simply be fed information without fully understanding its provenance.

I view it as an exceptionally serious responsibility. You can't play a game well unless you know both the rules and the unspoken subtleties.

Also this: I try to keep the chill off cold hard facts by warming them with a hopeful interpretation of what they truly mean for humans.
In reply to George Siemens

Re: Critiques of learning analytics?

by Bert De Coutere -
I'm just reading a newspaper article on the findings of the US commission on the financial crisis and what went wrong.
One of the reasons/excuses that came up during the hearings is the utter trust bankers had in their mathematical models to calculate risk - to the point that risk was not managed, but justified by reports.

Learning analytics, and all other analytics, face a similar danger. All the algorithms in the world will not spare us from having to think really hard. Even harder, in fact, because all the algorithms and their findings will give us more to think about... Are we up for that?
In reply to Bert De Coutere

Re: Critiques of learning analytics?

by Paul Bond -
We have to keep in mind that they're tools and not rules. And if we come up with any automated systems based on analytics and algorithms they all have to have manual overrides.
In reply to George Siemens

Re: Critiques of learning analytics?

by Linda Burns -
We have to keep in mind that not all data is used for the good of students, even though I am sure all administrators will say so. All universities, especially for-profits, need to make a profit. I have worked at two, and both used data to figure out how long a student needs to stay enrolled for them to make a profit. The last school, where I taught in a Master of Education program, figured out it was between 3 and 4 months for a one-year program. Students were encouraged to stay enrolled even if they had not completed assignments and had even flunked classes. It was very obvious that these students would not be able to complete the program. But money rules... not education.