I know there's never time to answer everyone's questions or discuss everyone's comments. So I'm just going to put up some of mine.
Listening to Linda, I was of course envious of the community colleges that do have access to the data that was discussed. And I did want to know how the "boots on the ground" (department chairs, instructors, and academic advisors) were using it.
Or, even more learner-focused: in the mandatory orientations, was analytics-based data being disseminated to the incoming students?
As discussed in the Elluminate session, I was looking for "action" rather than just measuring.
Based on the speaker's discussion of how analytics were already driving action at her institution, the questions I asked in chat during the session were:
What can we as educators do with analytics right now?
And what do we ask for (from our institutions)?
Or if I wanted to plan for "action" or intervention for next semester what could I or someone else reasonably do based on current common analytics?
In the short term, what intervention for student success could take place based on this data?
What data is commonly available that most instructors could ask their school, college or university for?
Anyone have a thought or an answer, or want to discuss?
My questions were:
- When considering benchmarks, which are more important: national or peer-institution benchmarks?
- What are some of the tools that can be used to collect Virtual World metrics (Second Life, Open Sim, spoton3D, etc.)?
- How do we combat student and faculty cheating / exploiting the metrics once they know how they are being assessed?
- How do you determine key metrics and then develop the analytics to actually measure them?
I've just had a response from some of the PREVIEW project team, who used an open source tool for two types of scenarios in Second Life; the tool also has the capability to be adapted for other platforms as needed (they used the open source OGRE engine as well).
As a follow-up to the discussion in Week 1, I have received the following:
"the system which is used for the PREVIEW scenarios does offer the capability to store information about the decisions taken in the virtual world and stores them outside the virtual world. Every time someone starts a scenario it gives that session an id value, and then stores all the nodes visited during that session, along with time stamps, in text files named as the session ID values. It is also possible to retrieve chat logs from SL. In more recent versions you can access some of the session information via the manager web interface.
However, we haven't really made use of this information or collated any of these statistics from our PREVIEW scenarios, but the capability is certainly there within the system to do so. We have instead gathered statistics on the user experience of students using observation, questionnaires and focus groups."1
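The logging scheme described in the quote - a per-session id, with every node visited appended to a text file named after that session id, along with a timestamp - is simple enough to sketch. This is only an illustration of the general idea, not the PREVIEW code itself; the node names, file layout, and class name below are all made up for the example.

```python
import time
import uuid

class SessionLogger:
    """Toy version of the logging the PREVIEW team describes:
    each scenario run gets a session id, and each node visited is
    appended, with a timestamp, to a file named after that id."""

    def __init__(self, log_dir="."):
        self.session_id = uuid.uuid4().hex
        self.path = f"{log_dir}/{self.session_id}.txt"

    def visit(self, node):
        # Append "timestamp<TAB>node" for each decision point reached.
        with open(self.path, "a") as f:
            f.write(f"{time.time():.0f}\t{node}\n")

# Replay a (hypothetical) student's path through a scenario.
log = SessionLogger()
for node in ["start", "triage", "call-for-help"]:
    log.visit(node)
```

Analysing the choices students made then amounts to reading these files back and counting which paths were taken.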
The videos, or visiting the locations in Second Life, will give an idea of how questions are answered and how actions follow from the choices made in the scenarios.
1. More information available via the PREVIEW blog
Thanks for the info. Do you know if PREVIEW is still running? It seems their last post was Dec 2009 on their blog. I'll continue reading what is there and see what I can find.
Will find out,
Currently on John Lester's (Pathfinder) Google group (email@example.com) there is a side discussion on metrics for Second Life and also on the number of concurrent users on the Teen Grid. It's in the "FW from the SLED to HGAC" discussion.
Okay, so I asked to see what data I could get. It won't be the same for every institution and local policies may dictate how much information you can have access to or can request.
Here's what I found:
Registrar - there are a number of reports on enrollment, no-shows, etc. that are run on a regular basis. There was a bit of surprise when I asked for them, but with an explanation of why I needed them, and with input from my Dean, I was able to be sent these.
Admin Assistant, Advisor, or other person who has daily access to your school's student information system or ERP - can usually run basic reports. Have the conversation to see what data they have access to and whether they have time to pull it for you.
Institutional Research - already runs a number of reports. At my school these reports are based on ERP and census data.
LMS - most LMSs have a student progress section where you (as an instructor) can look up basics on individual students and then statistics on the class as a whole. These can include:
last time logged in
module or content viewed
time on quizzes or tests
discussions posted or read
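Tying this back to the "action" question upthread: once you can export even these few basics from the LMS progress section, a simple rule can flag students for follow-up before the semester is lost. The column layout and thresholds below are invented for the sketch; any real cutoffs would need local tuning.

```python
from datetime import date

# Hypothetical rows as an instructor might export from an LMS
# progress report: (student, last_login, discussion_posts).
progress = [
    ("alice", date(2011, 1, 20), 14),
    ("bob",   date(2011, 1, 4),  1),
    ("cara",  date(2011, 1, 19), 6),
]

def flag_at_risk(rows, today, max_gap_days=10, min_posts=2):
    """Flag students who haven't logged in recently or barely post."""
    flagged = []
    for student, last_login, posts in rows:
        if (today - last_login).days > max_gap_days or posts < min_posts:
            flagged.append(student)
    return flagged

print(flag_at_risk(progress, today=date(2011, 1, 21)))  # ['bob']
```

The point is less the code than the workflow: a weekly export plus a rule like this is an intervention an individual instructor can run without waiting for an institutional analytics platform.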
LMS Coordinator - usually has access to more detailed data by class and may be able to run a number of useful reports. But you have to ask and you may very well be the first person who has ever asked.
Getting a sense of what kind of data is available is an important start. Thanks for listing the outcome of your exploration.
Most universities/colleges focus on institutional statistics (ages of learners, where they live, previous academic experience, drop-out rates, etc.).
In an EDUCAUSE presentation, I suggested that analytics need to be considered at five levels:
1. Course-level: how are students doing, attendance rates, are they doing the readings, frequency of login, use of clickers in the classroom.
2. Aggregate-level: this is where we draw most of our patterns of performance. What are the actions of students that succeed? What are early warning signs that we need to intervene? Which readings and learning resources are effective? Which aren't? These results need to be fed back into the course design and teaching processes.
3. Institutional: Three things are important at this level: 1. knowing our students - where they are from, why they are here, what their needs are, etc. 2. The performance of faculty is also a consideration at this level (research impact, publications, productivity in contrast to other universities). 3. Finally, how does information flow within an organization? What are the social and value networks that contribute to effective collaboration and information sharing? This is a focus on the administration of the university - understanding how work gets done and ensuring that barriers to creativity and innovation are removed.
4. Regional: state/provincial/institutional comparisons. Analytics at this level assist funders/politicians/decision makers/bureaucrats to understand how different universities are performing in relation to others. Where are costs high? Productivity low? Why do some universities produce significant research while others languish? Who is using public funds "well" and who is not making an impact, etc.
5. National and international: league tables and aggregate comparisons of universities/colleges occur at this level. Obviously ranking highly is a huge benefit to universities for attracting students...but these comparisons are also important for determining which countries and regions are transitioning to a knowledge economy based on how universities perform internationally by research, patents, etc. These comparisons are controversial...but are increasingly being used by decision makers in setting goals for "world class universities".
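At the aggregate level above, even a crude version of "which readings and learning resources are effective?" reduces to comparing outcomes between students who did and did not use a resource. A toy sketch with made-up records (real analysis would of course need far more care about confounds - students who read may differ in other ways too):

```python
# Hypothetical aggregate records: (viewed_reading, passed_course).
records = [
    (True, True), (True, True), (True, False),
    (False, False), (False, True), (False, False),
]

def pass_rate(rows, viewed):
    """Pass rate among students who did / did not view a resource."""
    group = [passed for v, passed in rows if v == viewed]
    return sum(group) / len(group)

print(round(pass_rate(records, viewed=True), 2))   # viewers: 2 of 3 passed
print(round(pass_rate(records, viewed=False), 2))  # non-viewers: 1 of 3 passed
```

A gap between the two rates is the kind of pattern that, as noted above, should be fed back into course design rather than just reported.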
I've thrown out many subjective terms (productivity, teaching well, creativity, and so on). Each of these is worth debate. For example, what does it mean to be a good teacher? Or a productive researcher? These are not easy questions. But answering them isn't my main concern in this post. Instead, I'm trying to emphasize the different levels of analytics use in higher education.
What have I missed?