We've been talking about students, but these analytics could also be used to look at instructor performance. For those of you who are administrators or who train instructors: what do you want to be able to see? How can analytics help us gauge instructor presence, engagement, and performance? How can it inform instructional design, as well as training and professional development for instructors?
Does anyone here supervise instructors or do quality assurance of online courses?
It will be interesting to explore both the potential uses and the potential abuses.
Closing the loop is what we need. In light of Ryan Baker's Elluminate session, which presented data on how students game cognitive tutors, it would be interesting to look at how instructors might game the analytics or intelligent agents.
If analytics look at total time spent - log in and take a shower (off-task behavior)
If it is time and location - click through while watching TV or on the phone (off-task behavior)
If it is the number of responses - have cut-and-paste comments ready (systematic cut and paste, or gaming the grading?)
If specific keywords are needed - have those ready in your cut-and-paste comments (systematic cut and paste, or gaming the grading?)
There may be other strategies that take advantage of the system to complete the task rather than teach the class.
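To make the point concrete, here is a minimal sketch of the kind of naive engagement score those strategies would defeat. The metric names, weights, and keyword list are my own assumptions for illustration, not any real LMS's formula:

```python
# Hypothetical naive instructor-engagement score built only from the
# measures discussed above: total time, response count, and keyword hits.
# Weights and inputs are illustrative assumptions.

def engagement_score(session_minutes, posts, keywords):
    """Score an instructor from logged time, post count, and keyword matches."""
    total_time = sum(session_minutes)  # gameable: log in and walk away
    num_posts = len(posts)             # gameable: canned cut-and-paste replies
    keyword_hits = sum(                # gameable: seed replies with keywords
        any(k in p.lower() for k in keywords) for p in posts
    )
    return total_time * 0.1 + num_posts * 2 + keyword_hits * 3

# A single prepared reply satisfies the post count and keywords at once.
canned = "Great work! Remember to cite evidence and reflect on feedback."
score = engagement_score([120, 45], [canned] * 10, ["evidence", "reflect"])
print(score)  # a high score, with zero genuine teaching behind it
```

Every term in the score is satisfiable by off-task or scripted behavior, which is exactly why measures like these verify that a task was done rather than that teaching happened.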
These measures make sure that a task is done and a certain amount of time is spent, but we need more than that. How do we build best practices into this? Personally, I wouldn't want to teach in a system that evaluated my performance on the above criteria. Making sure I have the right number of logins and the right number of responses per student isn't why I love education and teaching. I would want to teach in a system that could evaluate my performance and advise me on how to better reach my students.
Great thoughts, and I agree with you that the reason most of us teach or train is to get people more skilled or knowledgeable, or to increase their performance (so they can have more spare time; yes, that would be why I would increase performance :-).
But to achieve this, I think learning analytics should be followed over the long term. I do not think short-term analytics can direct instructors or trainers toward an improved scaffolding or learning-support position that really makes a difference. There are too many factors involved in learning and teaching, which is why I think only long-term analysis can yield 'best practices' for learners and teachers. Short-term analytics can help identify learners who need extra support, but not show how the whole group could benefit.
We have the most potential control/influence on the design of courses and strategies, so it seems like Analytics would be a great tool to help us evaluate which designs work, and which don't, for selected purposes.
Is anyone aware of literature on the use of Analytics to evaluate courses and programs? I haven't found any myself.
We're planning this kind of work in the California State University system as an outgrowth of auditing/reporting metrics (such as those previously mentioned, around time on task, last login, etc.).