SEMINAR EVALUATION

by Bill Thimmesch -
Number of replies: 6

I found this tool from Professor Dawn Wright (who owns and gets all the credit).  dawn@dusk.geo.orst.edu

Please complete this form by June 24th and post.  Best,

Bill Thimmesch

In reply to Bill Thimmesch

Re: SEMINAR EVALUATION

by Nicholas Bowskill -

Hi Bill,

First of all, thanks for doing this. I may have made some grumpy remarks, but I actually found the session useful because it made me think about the why, what, and how of things in general. I couldn't really see much I'd want to automate, but even so I probably didn't try hard enough. In getting me to think about this, it did prompt me to ask myself why not automate different things. That prompted some internal questioning and dialogue, so it was a productive process.

Certainly top marks for hosting this and for your effort and resources. Thinking about the discussion of OER, course design patterns, and so on, maybe there might be some scope for being able to run through some menus to put together design options, as an aid to thinking through the issues.

Either way, it was interesting and different. Thanks again and good luck with your work.

Nick, Glasgow

In reply to Nicholas Bowskill

Re: SEMINAR EVALUATION

by Nicholas Bowskill -

Bill, one other point. I was perhaps being in self-centred mode in my last comment. I should have asked how would YOU evaluate the session? What has happened to confirm or change your initial view of automation? What is your view of running a seminar on it? And slightly tongue-in-cheek perhaps, could you have automated this seminar with hindsight? If so what and how? ;-)

Nick

In reply to Nicholas Bowskill

Re: SEMINAR EVALUATION- Nick

by Bill Thimmesch -

These are very thoughtful questions, so I'll be happy to respond to most of them. :)

I have learned something from this seminar. I have learned that the instructional process is more complicated than I expected, and that the worlds of education and formal training (instruction) have merged with the help of social media tools (wikis, blogs, sites like this). My main takeaway is that an automated tool needs to be tried out in a much narrower universe: skill-based training. When your training objectives can be precisely stated, I think you can work backwards and ask the training design questions that will get you to the right training tools and instructional strategies. But even then you have to take the learner into account (do visual learners need different instructional techniques in order to learn how to fly an airplane, put out a fire, or cross-examine a witness?).

Anyway, I'm thinking smaller these days, but I'm still determined to find a way to automate the simple process that trainers go through every day: asking questions about the purpose of training to determine whether training is even needed, which training tools would be optimal, and what the best ways would be to evaluate skill transfer. After all, there are a million books, articles, and research papers out there already prescribing the same training analysis process (ASTD.ORG). If the steps are so redundant and predictable, then why not automate this part of the thinking process so the trainer can get to the core of what is needed to improve employee performance?

My thoughts on "running" this seminar (I'd say facilitating from afar) are that you never know where you'll end up when you're talking to instructional designers and experts from around the world.  One day I'll get a doctoral degree and put all of that into a dissertation...but not now, when I'm broke and still paying college tuition for my own children...

Automating this process? No, not as a facilitated result.  Perhaps just indexing the discussion threads so that future users would not have to read all the posts, but could search by tag to get to an area or URL of interest.
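The indexing idea above amounts to a simple inverted index from tags to post URLs. A minimal sketch in Python, using made-up post records (the field layout and URLs are illustrative assumptions, not any real forum's export format):

```python
from collections import defaultdict

# Hypothetical post records: (post_id, url, tags).
posts = [
    (1, "https://example.org/thread#p1", {"automation", "evaluation"}),
    (2, "https://example.org/thread#p2", {"oer", "course-design"}),
    (3, "https://example.org/thread#p3", {"automation", "oer"}),
]

def build_tag_index(posts):
    """Map each tag to the URLs of the posts that carry it, in post order."""
    index = defaultdict(list)
    for post_id, url, tags in posts:
        for tag in tags:
            index[tag].append(url)
    return dict(index)

index = build_tag_index(posts)
# Looking up "automation" returns the URLs of posts 1 and 3,
# so a future reader jumps straight to the relevant threads.
automation_posts = index["automation"]
```

The index could be built once when a seminar closes and published as a static page, so no ongoing facilitation is needed.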

This has been a great experience. Honored to have been a small part of it.

Bill (ROBOT)

In reply to Bill Thimmesch

Re: SEMINAR EVALUATION- Nick

by Nicholas Bowskill -

Bill, what struck me most was when you said that this is unpredictable. I think that's the nub of the thing in some ways. Although the seminar provides a framework and a broad topic, the start and finishing points are completely open. This is not one of those courses or activities where the outcomes are already known by someone and it's the job of the learners to guess what's in the tutor's mind. It's much more of a social enquiry.

If it were to continue and develop, it *could* allow us to shape the thing further by structuring the environment to address our interests and respond to the needs of the group.

I dare say the shell of the thing, an initial forum plus or minus some resources, could be reproduced automatically. And it wouldn't make a lot of difference to the rest of what could happen. However, that would really be a fairly trivial thing, and maybe the enrollment stuff could be automated too.

Even so, the real deal is the openness: in disposition, in the process, and in the dialogue. It's that open start and end that really marks out these seminars for me, and the content is our participation, which is also open in its nature. This is rather than having people 'do this and then do that' and do it 'until you are able to understand the other', which is sometimes useful but often very dull.

When we think about OER and openness, I think these seminars show the depth of the concept that could be brought into play, compared with simply producing materials for free and that kind of thing.

In reply to Bill Thimmesch

Re: SEMINAR EVALUATION

by Sylvia Currie -

I've found it interesting how the title/focus of this seminar has opened up such interesting dialogue about curriculum and design in general. If we had started off with a seminar called "exploring instructional design" or something similar, it might not have sparked the passion we saw in some of the contributions here!

I had a peek at Dawn Wright's quick evaluation form and saw that the point of the exercise is to contrast online with "traditional" seminars. I realized that I could not honestly respond to any of the questions about how much I learned or where it happened. I'm not on a campus, but nevertheless I read and engage in discussions through so many different channels. And what I find with SCoPE seminars is that the current topic sort of sticks with me as I do other things; for example, my interactions at the STLHE conference were influenced by what was being discussed online in this seminar. Now I'm puzzled by the use of the Wright evaluation form in this context. But I'm also noticing that "virtual" is not as much of a distinguishing factor as it used to be when talking about learning environments.

It also struck me that our evaluation process for SCoPE seminars is always very free-form and informal, just like the seminars themselves. We rarely use the term evaluation at the end of a seminar, and sometimes the effort to gather feedback is barely noticeable. Often we become more interested in the process/design of the seminar rather than commenting directly on the content.

Just as the fact that the seminar is "virtual" does not seem too important, I wonder how thinking of a seminar as a scheduled event influences the evaluation. I'm not sure this is making any sense, because it feels like I'm typing before processing, but perhaps it's impossible to ever treat learning as something that happens through an event -- a course or a seminar or whatever.

Anyway, here it is, the final day of the seminar, and I'm getting this sudden urge to talk about evaluation! Maybe a topic for a future seminar? Note that we had a great discussion about evaluation practices for informal learning a couple of years ago. It would be interesting to build on that, and include formal learning as well.

Is that of interest?

I'm copying the questions from the evaluation form below for those who haven't downloaded the .doc file:

===snip===

How much did you learn overall?
( ) more than a traditional seminar
( ) about the same
( ) less than a traditional seminar

How much did you participate?
( ) more than a traditional seminar
( ) about the same
( ) less than a traditional seminar

How much of your learning occurred locally on your campus?
( ) almost all
( ) the majority
( ) about half
( ) less than half
( ) almost none

Would you take a "virtual seminar" again?
( ) yes
( ) no


In reply to Sylvia Currie

Re: SEMINAR EVALUATION

by Nicholas Bowskill -

Hi Sylvia/Everyone,

I think evaluation would be a very interesting topic for a seminar, yes. This makes me think about collaborative and democratic evaluation: everyone gets to have a view and to hear the views presented. It's also distinct from someone harvesting the group and taking those views away like stolen goods or something. There's lots more I think we could all say about that and other aspects of evaluation.

I've also been thinking about the whole idea of informal learning. It really troubles me in these times of networked people, and also in the context of events or sessions like this, which might be described broadly as halfway between formal and informal. I wonder if these terms have both become problematic and whether we might instead talk about moments of focus. By that I mean we can be present at any old event, meeting, or course, and we may be mentally attending or not. It's not determined by how 'posh' it is. I've come to think that it's really more about what I've so far called 'moments of focus.' That's pretty much as far as I've got, or can say, in a brief message like this. It's really an expression of dissatisfaction with the notion of informal learning in a networked world. If you like it, I claim copyright. ;-)

Have a great weekend.

Nick