Visualizing LAK11

Re: Visualizing LAK11

by Kelly Edmonds -

It's been interesting exploring the range of analytical tools and reading the key blog entries George shared lately. Like some of you, I am still learning about this movement and don't feel equipped to share insights or recap my understanding at this point. Being the analytical type, I hold back until I completely understand and have read everything, which might take a few years! :) As a result, I am grateful to those who are sharing their thoughts and resources.

One thing that concerns me about analysis at this scale is the validity and authenticity of the data. I am impressed with the volume of data these tools can manage, but I wonder about its compatibility, origin, and validity.

I don't have the time to review the data presented so far, but I do plan to attend the LAK11 conference (I live in Calgary), where perhaps I can explore this concern more deeply.

Anyone else feel the same?

In reply to Kelly Edmonds

Re: Visualizing LAK11

by Gillian Palmer -
Hi Kelly. I'm nowhere near Calgary but would be delighted if you could mine a bit more on the validity question. LMSs tend to give huge amounts of data - like number of posts viewed by a particular student - but when you know full well that there aren't that many posts in the whole course and they view that number (or thereabouts) every week, something isn't right. Time on task is another big problem. Staying in a course to read material is just not me (or many others) so I download 'stuff' as fast as I can and then sort it out later. That distorts not just 'time on task' but also the through-clicks and work done there. I have no answers to getting it right - just observations :(
In reply to Gillian Palmer

Re: Visualizing LAK11

by Kelly Edmonds -

Gillian, thanks for your comments. I think we need to determine what learning is. Yes, an old question, but if we don't know what we are looking for, we can't measure it. To me, learning is not about the number of posts but what students do with them. I think we need to start with the end in mind - assessments, and rich assessments such as reflections, creations, arguments, etc. Working backwards from there, we can examine the tools students used to arrive at their understanding and their work. If these are situated in the cloud and accessible, we can review the types of interactions, the materials explored, and the tools used.

It seems to me most people here are saying the same thing: student activity in an LMS is only a part of their learning. In a diverse and widely networked world, we need to reconfigure how we capture and analyze data about student learning.