After John Fritz's session on how his institution is using what it can glean from Blackboard, I'd really like to ask around and share: what LMS is your institution using?
Do you have access to the data that the lms is collecting?
Are you planning on developing a student success intervention based on this?
Or do you already have one?
However, most of our courses use many outside platforms, such as Google Groups (because our forums don't yet have email integration), blogs, and especially synchronous tools like Skype, Tokbox, etc. Capturing user interactions across a single course that uses many tools, and across many courses that each use different tools, is a headache.
This was for corporate training with a regulatory compliance obligation so almost all of our analytics were about percentage of completions of required training across the enterprise. Students and their managers had access to their compliance status.
We also used the analytics to report on usage trends: volume of transactions (1 million to 5 million in one year) and help desk trends (1% of transactions, which was very different from the anecdotal perception).
The problem with reporting on usage is that it can backfire: senior management begins to ask why their people are spending so much time training and what they got out of it. That is why we need better analytics.
We did a hybrid class where a certain amount of what was normally F2F instruction was replaced by online instruction. But the state agency required documentation of time spent online; did you have a time requirement with your training? While there was an assessment of the knowledge, the online time requirement did set the stage for students cheating or walking away from the keyboard. Did your analytics have the capacity to ascertain whether people were multitasking on their computers while training?
In Moodle - are you listed as an instructor or do you have administrator access to take a look at usage?
And if you're not an administrator, how well do you know your LMS administrator? Could you ask them for more information or access?
What type of student usage reporting do you currently have access to in the courses you teach?
Do you know first and last login? What they clicked on? How long they spent there?
Depending on how your Moodle has been set up, there’s quite a bit you can find out from the tools already built in. ‘Reports’ are available within each course’s Administration block, so you’ll be able to use the type of information shown at http://docs.moodle.org/en/Reports, which can be downloaded into Excel.
If your Administrators have enabled ‘statistics’ then you have access to even more.
‘Statistics’ isn’t turned on for our University, but our programmers are in the process of writing something that copies the previous day’s site logs onto another server and presents the activity in a more easily digested form.
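As a rough illustration of what such a nightly digest might do (a minimal sketch with made-up data; the column names and CSV export format here are assumptions, not the actual Moodle log schema), a script could simply count hits per course and user so a day's activity is easier to scan than the raw site logs:

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical CSV export of one day's site logs (invented columns).
SAMPLE_LOG = """\
time,user,course,action
2011-01-10 09:02,alice,ACCT101,view forum
2011-01-10 09:05,alice,ACCT101,post reply
2011-01-10 10:30,bob,ACCT101,view resource
2011-01-10 11:00,bob,PHIL200,view forum
"""

def digest(log_file):
    """Count hits per (course, user) pair from a log export."""
    hits = Counter()
    for row in csv.DictReader(log_file):
        hits[(row["course"], row["user"])] += 1
    return hits

summary = digest(StringIO(SAMPLE_LOG))
for (course, user), n in sorted(summary.items()):
    print(f"{course}\t{user}\t{n} hits")
```

In practice the real work is in what you count and how you present it, but even a simple per-user, per-course tally like this is far more digestible than paging through raw log entries.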
I don’t think they’ve considered incorporating student grade data the way John Fritz was describing – but they may soon.
I was wondering whether there were any more advanced log analysis tools like the ones you refer to. A quick search indicated that there are some projects (like the one at your school), but nothing anyone is sharing yet.
We have a long history of using analytics: our "ATK" (Analytic ToolKit) has been used for over a decade and provides a variety of reports. Please see http://ikit.org/atk/ for an overview.
We have also developed a series of "assessment applets" for Knowledge Forum that use those stored data and present them to the users. See http://analysis.ikit.org/w/index.php/Assessment_Applets for details about those.
The upshot of all this is that we find that feeding the results from the analytics back into the course really changes the dynamics. We've called this "concurrent, embedded and transformative assessment" in the literature about Knowledge Building.
Thank you for posting your applets. I don't think I actually have the vocabulary to describe how useful the vocabulary and writing applets seem. For anyone teaching a career- or trade-related course, this could show whether and how individual students, and the class as a whole, are learning the vocabulary needed in their community of practice. (Sorry, this is coming from someone who has worked for the last four years with WebCT, Blackboard, and now Desire2Learn and is happy just to see student tracking of content module access.)
When we were looking at the Next Generation Learning grant at my college, one of our philosophy instructors said they would love a tool in the LMS that checked whether students were doing the reading prior to submitting their essays. They referred to a practice they remembered in which students had to turn in their reading notes to their professors. I could see discussions that used the vocabulary applet allowing the instructor to ascertain this.
I also think the vocabulary applet is going onto my wish list of learner analytics for this class.
But does tracking time logged in really give us an accurate picture of what that student is doing? I can click through forums and not really be engaged in what I am doing, or even sitting in front of my screen for the 15 minutes the LMS thinks I am.
I equate this to our f-2-f classes where the student is physically present, but not necessarily mentally present.
No, it doesn't. But in the case I brought up, the state agency requires students to have 60 hours of in-class training. A student sitting in the physical classroom can be totally disengaged but still meet that state requirement. It does seem like we are taking ineffective measures from F2F teaching and applying them in online settings.
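One way analytics can at least discount the "walked away from the keyboard" problem is to estimate time on task from click timestamps rather than raw login duration, capping long idle gaps. This is a minimal sketch with invented timestamps and an assumed idle cutoff, not any LMS's actual method:

```python
from datetime import datetime

# Hypothetical click timestamps for one student's session.
clicks = ["09:00", "09:03", "09:04", "09:40", "09:42"]
times = [datetime.strptime(t, "%H:%M") for t in clicks]

IDLE_CAP_MIN = 5  # gaps longer than this are treated as idle time

def time_on_task(times, cap=IDLE_CAP_MIN):
    """Sum the gaps between consecutive clicks, capping each at `cap` minutes."""
    total = 0.0
    for a, b in zip(times, times[1:]):
        gap = (b - a).total_seconds() / 60
        total += min(gap, cap)
    return total

# Gaps are 3, 1, 36, and 2 minutes; the 36-minute gap counts as only 5,
# so the estimate is 11 minutes rather than the 42 the login clock shows.
print(time_on_task(times))
```

It's still only a proxy, of course: it can't tell focused reading from staring out the window, which is exactly the F2F seat-time problem carried online.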
So how do we design for engagement? And do we want to design for reportable engagement?
Is it a game design where a learner would always either have to be reacting to material or performing tasks to advance?
I've had classes where people were asked (in a rubric) to reply to at least 5 classmates' posts, in addition to writing something original, and that is what they did, but the work (judging as a fellow student in the class) was mediocre. A better rubric would have left the number of posts out (something more than 1 is implied) and instead specified the content more, with what counts as appropriate or good content for a response.
I've also had classmates, by the way, who went way above and beyond the call and really provided thoughtful insight on most posts (very demanding, I think, but I learned from them).
I have no problem with bringing in game design. But I do agree grinding (doing menial tasks in order to level) is not engaging and does not necessarily promote learning.
The open-ended approach works with motivated students, so for open-ended work we do have to look at the learner. In an online introductory college accounting course, students may need more encouragement to participate; they may never have taken an online course, or a college course, before. Some guidelines in this type of introductory course can help students learn how to participate as online students. We use a rubric in our basic accounting courses and also what we call evolving scenarios: the basic setup at the beginning of the week, a plot twist in the middle, a second turn of events, and then a wrap-up at the end of the week. Students are required to make original posts and then respond to classmates' or the instructors' questions. So what John Fritz presented on students being able to compare themselves to others in the class may well be another way, besides rubrics, to let students learn the habits and culture of being a better student.
Before leaving my previous institution, I participated in a project (http://indicatorsproject.wordpress.com) that was starting to look at using LMS data.
Initially, we were looking at Blackboard v6.3 and a local system. Then the institution adopted Moodle, and that's where the focus went.
Getting access to the data from the LMS was very simple technically, but very difficult politically. The technical task was made simpler because the project included a couple of folk from a technical background.
Most of the initial work was simply looking at the data and figuring out what useful patterns could be identified.
I've moved on from the institution, but the other participants are continuing, and that includes a "student success intervention" based on an application that attempts early identification of students at risk of failing.
Colin Beer (http://beerc.wordpress.com/publications-and-writings/) is the driving force behind the project.