Week 1: Introduction to Learning and Knowledge Analytics

Questions about the questions


by Irwin DeVries
Number of replies: 5
There's a fairly classic consideration in the development of systems for collecting and analyzing data (beyond routine transactional functions), and it is this: what are the questions we want to put to the system? Who will be asking, and how will the answers be used? These questions are core to the discussion of analytics and get us focused on what we are doing, as well as why and how. Many issues flow from that simple set of questions.

There are comments related to this sprinkled throughout the discussion, but I would propose we share our ideas in one place for consideration and discussion.
In reply to Irwin DeVries

Re: Questions about the questions

by Anna Dyckhoff
Hello Irwin,

> Who will be asking, and how will the answers be used?
In my experience, teachers who are actively engaged in creating online learning scenarios (in addition to traditional teaching) are also motivated to evaluate the effectiveness of the online learning activities. Action research might reveal new information worth knowing, but teachers face a difficulty: the increase in workload is too high. They therefore wish for computer-based assistance, such as visualizations that are easy to interpret.

> What are the questions we want to put to the system?
Last year I collected some questions that teachers are interested in and have already asked. Here are some examples (maybe you can add more...):

  • Are students learning in groups or all by themselves?
  • Do students like/value a specific learning activity?
  • Are students still motivated to use a learning offering, after having used it?
  • Are students using specific learning materials at home or mobile?
  • Are students printing learning materials to learn offline?
  • Are there learning offerings that haven't been used at all?
  • How intensely and when is the learning offering used for preparation of exams?
  • How much effort does a learning activity take compared to other activities?
  • Which teacher activities facilitate continuous learning/ increase learning?
  • How do learning offerings have to be combined with support to increase usage?
  • By which properties can students be grouped?
  • Do students of all learning styles profit in equal measure?
  • Do native speakers have fewer problems with the learning offerings than others?
  • Is the participation/performance in preparatory online tests somehow related to exam grades?
  • Are students using specific learning materials (e.g. lecture recordings) in addition to, or as an alternative to, attending class?
  • Will the access of specific learning offerings increase if lectures and exercises on the same topics are scheduled during the same week?
  • How does the usage of specific learning offerings differ according to user properties?
  • etc.
All these questions have been asked for different purposes, e.g. exploration, getting to know the audience, improving learning materials, deciding what to keep and what to change, finding indicators for "good learning behavior", etc.
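As a toy illustration (not from the original post), one of the simpler questions above, "Are there learning offerings that haven't been used at all?", could in principle be answered directly from LMS access logs. The field names and sample data below are invented for the sketch:

```python
# Hypothetical sketch: find learning offerings with no recorded access.
# Record structure (offering_id, user_id) is an assumption, not a real LMS schema.

access_log = [
    {"offering_id": "quiz-1", "user_id": "s01"},
    {"offering_id": "lecture-recording-3", "user_id": "s02"},
    {"offering_id": "quiz-1", "user_id": "s03"},
]

# The full catalogue of offerings the teacher has published.
all_offerings = {"quiz-1", "quiz-2", "lecture-recording-3", "forum-week-1"}

# Offerings that appear at least once in the log.
used = {entry["offering_id"] for entry in access_log}

# Set difference gives the offerings nobody has opened.
unused = sorted(all_offerings - used)

print(unused)
```

Most of the other questions in the list (motivation, learning styles, value to students) are of course much harder, since they need survey or outcome data rather than raw access logs.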

In reply to Irwin DeVries

Re: Questions about the questions

by Apostolos Koutropoulos
A number of people in other threads have also brought up the issue of bias. I think this would be a good place to restate this because questions that we ask are questions that come out of our world experiences and as such have some bias. If the questions are multiple choice, what we ascribe as valid values for these choices are also based on our personal biases.
In reply to Irwin DeVries

Re: Questions about the questions

by Tanya Elias
Hi Irwin,

I especially like the question "How will the answers be used?" I read a good quote somewhere (maybe in the work on action analytics??) that if what is gathered and learned is not applied to improving the system, we are not engaging in analytics at all, but rather simply mining data.

At my work, someone came up with the idea of inserting a short evaluation tool at the end of each online training session. It is both an excellent idea and doable. The questions that have not yet been answered (or asked, actually, as so far they've only lived in my head) are "who is going to look at that information?" and "how will they use it to ensure the development of better courses?"

(And I'm very open to any potential answers out there!)
In reply to Irwin DeVries

Re: Questions about the questions

by Sylvia Currie
As I was re-reading this thread I was reminded of an earlier (2001) open seminar discussion on Data Mining* facilitated by Osmar Zaiane from the University of Alberta. It was organized through the Global Educators Network, where I was moderator, which is one reason it stands out in my mind. I mined my own brain to retrieve it!

Thanks to Osmar, 10 years ago many of us were introduced to the ideas of data mining in e-learning, and also struggling with these "what comes first" questions.

During that seminar Irwin asked this question, not unlike the concerns expressed around learning analytics:

...it's interesting that we started by talking about the solution (data mining) and then looked for questions it could address. Sounds to me like, "Here's a new drug--let's see if it can cure anything." Why aren't we asking what our needs as educators/administrators/learners are with regard to data--and then exploring what tools (e.g. data mining) could be used to accomplish this.

Osmar responded:

...I have noticed in previous attempts that it is difficult to exactly express these needs if we do not know the capabilities of the existing tools. It seems that users (educators in this case) become inhibited by what they foresee being possible. My attempt in this forum is to explain the tools in the hope that they broaden the vision and help express more sophisticated needs for educators using e-learning.

It's an interesting dilemma for any type of tool / design scenario. What we need is influenced by what is possible. And often what is possible becomes so overwhelming that we lose sight of what we thought we needed. (er, something like that!)

* I uploaded a .pdf of the seminar transcript. It's also available online, but requires guest login: Guest / Guest
In reply to Sylvia Currie

Re: Questions about the questions

by Murray Richmond
I think this is an almost universal dilemma: do we have a tool in search of a problem, or a need to answer questions that we haven't even thought of yet?

It's a key weakness of needs analysis. In the early days of e-learning or distance education there was a lot of asking potential users what mode they preferred. Of course, the answer was always overwhelmingly in favour of classroom/face-to-face because that's all most of them had experienced. I used to say it was like asking someone if they liked sky-diving. Their answer is meaningless unless they have actually experienced it.

I think the Gartner Hype Cycle best captures the phenomenon of introducing a new technology or strategy into an existing area. If the new technology ultimately reaches the Plateau of Productivity, many things will have changed and there will be new ways of doing whatever it was that was impacted and new questions and issues emerge requiring yet undiscovered solutions.

Note that in the 2010 cycle Predictive Analytics is almost there while Social Analytics is heading towards the Peak of Inflated Expectations.