Posts made by Sylvia Riessner

It makes logical sense to find a way to restructure the criteria so they reflect a developing level of understanding and application of a skill. But how do you assess the relative weight of that skill/understanding across a program?

I used to facilitate DACUM analysis workshops for groups trying to define the important skills and knowledge within job-defined topics. That was a herculean task but worthwhile, as it contributed to a much deeper understanding and appreciation of the range of skills and knowledge required to perform effectively in a job AND the need to assess emerging, developing levels of ability as each person gained experience and expertise.

Could you involve instructors in your program in outlining your "rubric" and then teasing out what they would expect to see as students come to them from earlier courses in the program?

We used a series of booklets by Ruth Stiehl and Les Lewchuk while I was at the College - they had one on Assessments that many of us found worthwhile - time-consuming of course.

Good luck!

Sylvia

I think you've already received some excellent suggestions on how to develop distinct rubrics for the 4 speaking classes. I particularly liked Sue's suggestion to do some reliability testing. The way an audience perceives a presentation can vary a great deal in my experience, and I realize that you have course standards to assess, but collecting student feedback would help you identify any criteria that might be improved?

I'm curious as to whether you provide feedback to coincide with the rubric evaluation? Some criteria seem to require more explanation (e.g., "no unnatural fillers" or "appropriate language"). Perhaps those terms are supported by other content addressed during the course?

Thanks for sharing - it's a very detailed rubric but seems to identify progressions that are logical and assessable.

Cheers, Sylvia

Thanks for sharing your Peer Evaluation Form - both the Word doc and the Gform. I thought your approach was very straightforward. Putting on my participant/learner hat, I'd be more inclined to complete the online form - and maybe it doesn't have to be anonymous to make me inclined to be honest?

As Bettina mentioned, creating a supportive learning environment/community could help, as well as incorporating an ongoing, formative use of the tool?

A couple of minor suggestions/questions:

1.  Move the meaning of the "1234" to the top of the table in the Word doc - sometimes students don't take the time to scan an entire document and might just guess.

2. Make clear when you want students to share their own perspective (expectation) and when there is a clear course standard and they should be assessing whether their peer has achieved that standard.

Note: I noticed a couple of comments you've received have addressed the vague nature of expectations and assigning a point based on expectations. However, I'd suggest you NOT try to be more specific - the feature of single-point rubrics that appeals to instructors and students (IMHO) is their simplicity and ease of completion.

It might help to clarify that this is the student's assessment, based on their understanding of course standards AND their own perception of what was "useful", "valuable", "tactful", etc. I'm not sure that such subjective terms can ever be totally clear without way too much detail? Perhaps the suggestion to add an Examples column could help, so the students could quickly provide an example of what they appreciated or found lacking?

And a final reflection in terms of how such a tool could be used. You say, in the Gform, that their responses are anonymous but, perhaps, just as important, would be to make a brief statement of the value of sharing peer perspectives and how the data will be used. If not in the form, then whenever you introduce its use?

You could introduce an appreciative perspective both explicitly and by modeling how you yourself provide feedback/assessment information to the group?

In terms of your exploring the use of an anonymous tool - it would be helpful to think about the size of your cohort, and how valuable the feedback or how fair the assessment might be if the teacher is going to use it to create marks. My concern with peer feedback, particularly in larger groups, would be that it has to be considered in light of the participation level and attitude of the person providing the feedback. I've seen students provide fairly quick responses to peer feedback and, when I checked my notes (and the online course records), they sometimes had spent very little time participating themselves.

If the purpose of the peer feedback was to identify potential areas that could be improved rather than helping an instructor render an evaluation, I believe your rubric would be very useful.

Cheers


I can see Christina's point about the potential issue around students assigning themselves letter grades too early. However, I think if they are asked to apply the rubric regularly (you say monthly?) then they will have a bit of a continuum that might help them to see letter grades as an evaluation of where 'they were at' at a point in time rather than as a fixed label of performance or ability.

I think the hardest thing for some of my adult learners has been to move away from considering their learning as what can be measured by a grade, by a project, by my judgement. Your rubric, used over a period of time, could not only show them the value of the 3 Ms and the 4 factors, but also show them that their performance may fluctuate over time and that their overall learning and improvement will "make the grade" (so to speak ;-).

One question I had when I was reading your criteria for the Teamwork item: how well do you find students recognize their own behaviours and how frequently they occur? It seems to me that the students most in need of recognizing that they need improvement might be oblivious.

Do you respond to their self-assessment of the rubric if you feel they are not aware of some of their behaviours?

Thanks for sharing!

As Leonne mentioned, I appreciated your emphasis on the rubric as a tool that benefits students.

Your Powtoon is easy to follow and clearly spoken, and it adds another level by suggesting students self-evaluate before handing in an assignment and then cross-check the similarities and differences when they receive their marked assignment back. What a great way to have a really in-depth conversation with your instructor; as an instructor I would have found that so energizing.

Thanks for sharing!

Sylvia