Thank you Sten and Erik. I'm thinking even I can make some sense of this. I like that.
I may be missing the point completely, but is this data more a reflection of seat time than of active learning?
I sense some anxiety about time spent at the course hub, especially amongst those who favour distributed participation.
I'm certainly surprised at what the data suggests for my own 'participation'. Maybe it reflects the logins where I leave Moodle open without doing anything active towards course learning? Or maybe randomly clicking docs while genuinely searching for learning, which would look the same in the data as gaming the system? I'm not sure. The user's intent is uncertain.
Whatever this data is really telling me, I guess it makes me think about how cautious humans should be: informing themselves rather than driving themselves, or student and self learning, with data they don't really understand yet.
I wonder what other variables are not being taken into account.
Tracking what happens after the students leave the online learning management system is of course very hard to do automatically (unless we could mount some kind of smart device on their heads that detects what they are doing and looking at ;-). We did an experiment for an evaluation of the tool, where we asked students to tweet what they were doing for the course and for how long. We used Twitter because students might like it more, but most of them stopped tweeting very soon.
Besides that, I use the Moodle logs to drive the visualizations. The Moodle logs contain timestamps of the user interactions. The problem is that there are only events for when something starts, not for when it finishes. So I know, for example, when the user opened the forum, but I do not know exactly when the user closed it. Most of the time this is not a problem: when the user performs another action shortly after the last one, the gap between the two events serves as the duration. For longer gaps between two actions, I use a threshold and fall back to a default duration.
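To make the heuristic concrete, here is a minimal Python sketch of how durations could be estimated from a sorted list of event timestamps. The 30-minute threshold and 5-minute fallback are made-up numbers for illustration, not the tool's actual settings:

```python
THRESHOLD = 30 * 60         # assumed cutoff in seconds; longer gaps are not trusted
DEFAULT_DURATION = 5 * 60   # assumed fallback duration for untrusted gaps

def estimate_durations(timestamps):
    """Estimate how long each event lasted, given sorted event timestamps (seconds).

    The duration of an event is the gap until the next event, unless that gap
    exceeds the threshold (the user probably walked away), in which case a
    default duration is assigned. The last event has no follow-up, so it also
    gets the default.
    """
    durations = []
    for current, nxt in zip(timestamps, timestamps[1:]):
        gap = nxt - current
        durations.append(gap if gap <= THRESHOLD else DEFAULT_DURATION)
    durations.append(DEFAULT_DURATION)  # no next event to measure against
    return durations

# Example: events at t=0s, t=60s, and t=5000s.
# The 60s gap is kept; the 4940s gap exceeds the threshold and is replaced.
print(estimate_durations([0, 60, 5000]))  # → [60, 300, 300]
```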
I hope this makes things a bit more clear...
It's been interesting exploring the range of analytical tools and reading the key blog entries George shared lately. Like some others, I am still learning about this movement and don't feel equipped to share insights or recap my understanding at this point. Being the analytical type, I hold back until I completely understand and have read everything, which might take a few years! Smile. As a result, I am grateful for those who are sharing their thoughts and resources.
One thing that is concerning me about analysis of this magnitude is the validity and authenticity of the data. I am impressed with the volume of data the tools can manage but wonder about the compatibility, origin and validity of it.
I don't have the time to review the data presented so far, but I do plan to attend the LAK11 conference (I live in Calgary), where maybe I can explore this concern more deeply.
Anyone else feel the same?
Gillian, thanks for your comments. I think we need to determine what learning is. Yes, an old question, but if we don't know what we are looking for we can't measure it. To me learning is not about the number of posts but about what students do with them. I think we need to start with the end in mind - assessments - and rich assessments such as reflections, creations, arguments, etc. Working backwards, we can research the tools used by students to arrive at their understanding and work. If situated in the cloud and accessible, we can review the types of interactions, materials explored, and tools used.
It seems to me most people in here are saying the same thing. Student action in an LMS is only a part of their learning. In a diverse and widespread networked world, we need to reconfigure how we capture and analyze data about student learning.