Analytics on Instructor Engagement and Performance

by Kae Novak -
Number of replies: 8

We've been talking about students, but these analytics could also be used to look at instructor performance. For those who are administrators or who train instructors, what do you want to be able to see? How can analytics help us gauge instructor presence, engagement, and performance? How can they help inform instructional design, as well as training and professional development for instructors?

Anyone supervise instructors or do quality assurance of online courses?

In reply to Kae Novak

Re: Analytics on Instructor Engagement and Performance

by Murray Richmond -
I don't have any ready answers, but I think this is a really important issue. As with students, the process would need to be transparent and include some types of benchmarks for various levels of "performance".
In reply to Kae Novak

Re: Analytics on Instructor Engagement and Performance

by Rurik Nackerud -
Analytics could potentially help the overall quality of content and instructional interaction if used in this way. Too many times I have joined an online course where an instructor felt they could simply receive student data inputs without providing content or feedback themselves. As an instructor and student I would find this valuable simply in allowing myself to compare my work with other instructors and perhaps discovering areas in which I could improve. As an administrator I would hope to ascertain information that would help me target specific instructors as potential mentors and mentees in teaching online.

It will be interesting to explore both the potential uses and the potential abuses.
In reply to Kae Novak

Re: Analytics on Instructor Engagement and Performance

by Gillian Palmer -
Kae, a number of (I think most) online universities already have instructor performance analytics, and acceptable minima are already written into their contracts. Data can be used to assure accrediting authorities that instructors are indeed available and responding. Response times are analysed. Instructor availability, number of inputs, assessment turnaround times, interactions with each individual student, and more are collected into assorted charts and used for various QA and HR purposes.

The one thing that I have never seen, however, is a serious analysis of the value added. By that, I mean how much the instructor's efforts increase the student's performance. I have seen plenty of raw scores of instructor performance and student Likert-scale perceptions of instructor performance, but nothing that closes the loop where the instructor can say, 'Well, if their own effort to use what they had been given had been increased a bit, they might have got full marks.' Without that loop closure, feeding the instructor data back into instructional/learning design requirements - especially requirements from people who do not teach the course in that specific target market - will always have a large element of guesswork attached.
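One way the missing "value added" loop could in principle be closed is a residual analysis: fit expected final scores from prior scores across the whole cohort, then score an instructor's section by how far its students land above or below that expectation. A minimal sketch - all scores are invented, and the single-predictor model is far cruder than a real value-added model, which would control for many more factors:

```python
# Hypothetical sketch of a "value added" analysis.  fit_line is a bare-bones
# ordinary-least-squares helper; the score data below is invented.

def fit_line(xs, ys):
    """Fit y = a + b*x by ordinary least squares; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Step 1: learn what final scores to expect from prior scores, cohort-wide.
cohort_prior = [55, 60, 62, 70, 75, 80, 85, 90]
cohort_final = [58, 64, 60, 75, 78, 85, 88, 93]
a, b = fit_line(cohort_prior, cohort_final)

# Step 2: one instructor's section is scored by its mean residual --
# how far its students finished above (or below) the cohort expectation.
section_prior = [60, 70, 80]
section_final = [70, 80, 88]
va = sum(f - (a + b * p) for p, f in zip(section_prior, section_final)) / 3
print(f"estimated value added: {va:+.1f} points")  # positive = above expectation
```

The point of the sketch is the loop closure itself: the number feeds back to the instructor as "your students did better (or worse) than their own priors predicted", rather than as a raw activity count.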
In reply to Gillian Palmer

Re: Analytics on Instructor Engagement and Performance

by Kae Novak -

Closing the loop is what we need. In light of Ryan Baker's Elluminate session that presented data on how students game cognitive tutors, it would be interesting to look at how instructors may game the analytics or intelligent agents.

If analytics look at total time spent - login and take a shower (off-task behavior)

If it is time and location - click through while watching TV or on the phone (off-task behavior)

If it is number of responses - have cut-and-paste comments ready (systematic cut and paste, or gaming the grading?)

If specific keywords are needed - then have those ready in your cut-and-paste comments (systematic cut and paste, or gaming the grading?)

There may be other strategies that take advantage of the system to complete the task rather than teach the class.

These measures confirm that a task was done and a certain amount of time was spent, but we need more than that. How do we get best practices into this? Personally, I wouldn't want to teach in a system that evaluated my performance on the above criteria. Making sure I have the correct number of logins and the correct number of responses per student isn't why I love education and teaching. I would want to teach in a system that could evaluate my performance and advise me on how to better reach my students.
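The gameability of these metrics falls straight out of how crude they are. A minimal sketch of computing the three surface metrics - the event-log format, instructor names, and keyword list are all invented for illustration - where every metric can be maximized without any actual teaching:

```python
from collections import Counter

# Invented event log: (instructor, action, minutes, text).  "pat" games the
# metrics (long idle login, canned replies); "lee" actually teaches.
events = [
    ("pat", "login", 90, ""),                     # log in, take a shower
    ("pat", "post",   2, "Good point!"),          # canned cut-and-paste reply
    ("pat", "post",   2, "Good point!"),
    ("lee", "login", 30, ""),
    ("lee", "post",  15, "Your thesis needs supporting evidence here."),
]

KEYWORDS = {"thesis", "evidence", "revise"}  # hypothetical required terms

def surface_metrics(events):
    """Total minutes, reply count, and keyword hits per instructor."""
    time_on, replies, hits = Counter(), Counter(), Counter()
    for who, action, minutes, text in events:
        time_on[who] += minutes
        if action == "post":
            replies[who] += 1
            hits[who] += sum(w in text.lower() for w in KEYWORDS)
    return time_on, replies, hits

time_on, replies, hits = surface_metrics(events)
```

On this sample, "pat" wins on total time (94 vs 45 minutes) and reply count (2 vs 1) despite teaching nothing; only the keyword metric favours "lee", and a canned reply stuffed with the required terms would game that one too.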

In reply to Kae Novak

Re: Analytics on Instructor Engagement and Performance

by Inge Ignatia de Waard -
hi Gillian, Kae and all,
Great thoughts, and I agree with you that the reason most of us teach or train is to get people more skilled or knowledgeable, or to increase their performance (so they can have more spare time - yes, that would be why I would increase performance :-)
But in order to achieve this, I think learning analytics should be tracked over the long term. I do not think short-term analytics are capable of directing instructors or trainers towards an improved scaffolding or learning-support position that really makes a difference. There are too many factors involved in learning/teaching, which is why I think only a long-term analysis can yield 'best practices' for learners and teachers. Short-term analytics can help identify learners who need extra support, but not show how the whole group could benefit.
Inge
In reply to Kae Novak

Re: Analytics on Instructor Engagement and Performance

by Adam Weisblatt -
This would be great in the corporate environment. I would love to get rid of "Happy Sheets," as they are called (Was the room temperature comfortable? Was your instructor prepared?). They were an example of useless busywork that took on a life of its own. The time and infrastructure expended to collect meaningless surveys about students' classroom experiences was immensely frustrating. I would like to see a more robust, data-driven methodology for analyzing (and therefore developing) instructors.
In reply to Kae Novak

Re: Analytics on Instructor Engagement and Performance

by Su-Tuan Lulee -
Good questions! I wonder if there are any tools for analyzing these very helpful points. SNAPP seems to work only for analyzing interactions between participants.

Su-Tuan Lulee
In reply to Kae Novak

Re: Analytics on Instructor Engagement and Performance

by John Whitmer -
This is a great thread, and exactly what I'm interested in as well. While "Learner Analytics" are a great idea for helping students modify their behavior (or at least understand their predicted outcome), I'm interested in how we can use Analytics to design environments that improve learning outcomes for all students - or for targeted groups, such as those at risk.

We have the most potential control/influence on the design of courses and strategies, so it seems like Analytics would be a great tool to help us evaluate which designs work, and which don't, for selected purposes.

Is anyone aware of literature on the use of Analytics to evaluate courses and programs? I haven't found any myself.

We're planning this kind of work in the California State University system as an outgrowth of auditing/reporting metrics (such as those previously mentioned, around time on task, last login, etc.).