Ok. So here’s the practical side: I sit in an office with a manager who says we need a new training program on employee performance management (correcting poor performers). Typically, I would ask a lot of questions to get at what the issue or problem is. Maybe disciplinary procedures and systems aren’t in place, or are not supported by management. Do managers understand and know where to go to enforce performance standards and discipline? Is it enough to “know” what to do, or do they need guided practice and the transfer of new skills? Maybe they don’t confront poor performers because senior management does not consistently support actions taken? Or are they just not clearly communicating performance expectations to employees? Could it be that employees don’t understand their written performance plan, or that there are consequences for not performing at desired levels? So we go on and on and round and round…until eventually the exasperated manager just says, “Bill, get me the conflict management training course for experienced supervisors…here’s the contact number for the vendor.”
Is an automated approach to ISD even possible or desirable? My assumption is that thousands of instructional specialists spend a good deal of time analyzing performance problems and issues to identify whether—and what—training would be most useful to close the performance gap. But we’re still doing this manually—through conversations that narrow the performance issue to skill building, education, or performance support solutions for the learner.
This leads to another question: In an age of increasing collaborative technologies for learning, is ISD still relevant? Learning technologies have emphasized community learning as a new model (wikis, forums, blogs, etc.). But is ISD dead? A structured approach to designing performance-based training might still be relevant. Airline pilots need structured training to teach flight skills, students of public speaking still require guided practice and feedback, and employee counselors still require interpersonal skills training and coaching to be effective mediators.
The question for this opening thread is whether an automated approach to the ISD process is still relevant, desirable, and practical. Here's some research to get you thinking, and I'll be online through SCoPE text chat at 7:30 p.m. tonight Eastern Time (United States) for those who want to expand more...
A structured approach can go a long way in helping teachers decide where to concentrate their efforts and to be mindful of quality when designing online experiences for their students. An iterative process will guide teachers to concentrate on learning and not simply the content. I do believe that using blogs, forums, et cetera, can assist would-be online instructional designers in the process of developing their courses. Currently, much of the training for teachers developing courses focuses on how to use the tools of the given learning management system. There often isn't any focus on the design process. Many teacher/designers in this context develop their courses in isolation. Here's a place where conversation, whether through blogs, forums, or face-to-face, can be a benefit. I don't believe that the design process can be automated. Quality is acquired through reflection and conversation. It doesn't happen automatically.
My other issue is faculty members' and administrators' unconscious incompetence when it comes to instructional design. Like a child watching skaters on a pond, they really believe the design process is easy until they strap the skates on for the first time.
I have seen some template-based design processes work fairly well in skill-based training where the learning goals are easily assessed because of their visibility, e.g., learning to tie a simple surgical knot. (Clear objective, visual demonstration, supported practice for accuracy, independent practice for speed, and assessment for competency can be repeated for a variety of skills and has been the template used in medicine for years.) Competencies like diagnostic reasoning are much more difficult to template because of the huge variability in the critical thinking processes required by different medical disciplines vs. those previously attained by an individual.
I have also been challenged to design "universal" templates in my teacher professional development/university online course design work, and agree with Deirdre that it only works to a point, perhaps for introductory learning or "training" where many must master similar skill sets.
As David points out: Many teacher/designers develop their courses in isolation. Here's a place where conversation, whether through blogs, forums or face-to-face can be a benefit. I don't believe that the design process can be automated. Quality is acquired through reflection and conversation.
One tool I employ all the time now is Bonk & Zhang's book: Empowering Online Learning: 100+ Activities for R2D2. If a talented face-to-face instructor with strong pedagogical sense and background has a manual like this, higher quality learning activities can emerge online with BACKGROUND rather than foreground IT support.
Here's my pet peeve: IT designers who, albeit unintentionally, create formatting requirements for courses in a program or institution that fly in the face of ease of use, ubiquitous technology, flow and, by association, learning! Sixteen clicks to get to the actual directions for an assignment, or materials sequenced from top (first) to bottom (final) so that as the weeks go by, scrolling down to find the current xyz actually adds significantly to "class" time! It seems rare to find an IT designer who brings a sense of "user friendly" and learner-centered thinking to general (template-like) formatting decisions in their own presentations and online trainings...and instructors new to online adopt what is modeled.
All these "pet peeves" are emerging, I think, because of the tension between templates and high-quality, high-end learning experiences. That tension, I believe, is why David claims it just doesn't happen automatically.
But I'm still listening...a good manual (like Bonk and Zhang's) can be a start. Perhaps there are other bridges?
Hello everyone, great to see another of these sessions unfolding. It feels like ages since the last one. I really love these sessions as a free space for thinking together.
This topic is interesting to me, as I'm sure it is for others here, from a number of angles. I feel that if you are wedded to the idea of 'instructing' people, then it may indeed be helpful to have some kind of template library. We're seeing more and more of this in open source education and in movements for representing course design. Interestingly, though, these people are very interested in representing the structure but completely overlook any representation of the 'experience' of those designs (this is a paper I'm working on at the moment). So, in this sense, templates, automation, etc. would only be part of the story that needs to be shared.
Following Piaget, I can't help but think that there is a huge difference between 'instruction' and 'learning' as a goal for education. The whole idea of being 'told' what to think and what to do may fit easily into a corporate view of thinking to be judged, managed, benchmarked, etc. by others. But it sits very uneasily in a humane, whole-person view of development.
So, from this latter perspective I would suggest a course is co-constructed by the participants and their learning-relationships each time it is implemented. You could still have a broad 'shell' that could be rolled out repeatedly or tweaked. You could even develop and build such a shell automatically. But it is the values behind it that need investment.
In saying all of that I guess that the design for instruction or learning could be automated if we are talking here about technology-based online development. We all re-cycle ideas, materials and structures. So maybe the answer is 'yes.'
This idea of designing 'instruction' does generate feelings of learning-farms and positivism though. It makes me think of the pursuit of total conformity which is a very sad view of learning. These days, we seem to think only of learning as something in the service of employment so maybe 'instruction' is the right word after all. ;)
Even so, if we automate the process and get rid of the people who do the design work (as always seems to be an underlying agenda nowadays), then we must surely lose any further creativity and only replicate the same shell or content endlessly. So maybe the answer is really 'no.'
University of Glasgow, Scotland
http://sharedthinking.info - empathic pedagogy
P.S. Thank you Sarah for the reference to Bonk and Zhang. I was unfamiliar with this particular book. I've been looking it over on Google:
Sounds like you've done a lot of good work in this area. When I look at automating processes, I'm thinking: how can we capture best practices in teaching (for any subject) and create an electronic performance support system for novices to help them quickly generate content, following well-established design standards? To put it another way: if 600 books on instructional design (for training or education contexts) all advise the same approach to design, then why can't we automate that thought process so instructors can concentrate on their own creative delivery of the content...or maybe I'm talking apples, oranges, and kittens. You need to remember that I come from a training background where even complex skills (conflict negotiation) require the same, predictable process in designing instruction and practice to achieve the goals of the training...
Here is an analogy. There are whole classes of operations humans can do quickly and reliably that computers can't do well (yet?). For example, computers are much worse than humans at sorting galaxies by category http://www.galaxyzoo.org/ or telling gender from a photo.
That's because some tasks - and quite a few educational tasks, at that - are inherently complex. As a whole, they have properties not following from their steps or their parts in a cause-effect manner: http://en.wikipedia.org/wiki/Complex_system
Complex tasks may be impossible to automate, at least at this stage of human know-how.
Perhaps an overall ISD "thinking process" is too complicated to map into a computer program as an EPSS. I liked your comment and put up a URL (above) that talks a lot about ISD from a trainer's perspective.
Having said that, are there not best practices in learning tools (e.g., when/how to use simulations; when/how to use critical thinking case studies; when/how to use immersive learning scenarios for particular outcomes)? I'm thinking of helping a new ISD professional with knowing the basics of where to start: So I'm creating a new course on negotiation skills for contract managers...what questions should I ask first about the learners/training outcomes that will point me to the most relevant and available tools to teach this skill? Bill
I have only recently done some work designing for one small business, but haven't any experience designing corporate training. Nicholas' remark "The whole idea of being 'told' what to think and what to do may fit easily into a corporate view of thinking to be judged, managed and benchmarked etc by others." was echoed in the article:
One reason could be (probably is) that creativity and innovation seem heretical, unmanageable, and un-measurable: a manager's nightmare. It is easy to ask someone to write x lines of code in 8 hours. How do you ask someone to be creative to degree x in 8 hours? It sounds absurd, to say the least!
The notion that complex problems may require novel responses reminded me of the tragic '98 Swissair crash off the coast of North America. As I remember the story, shortly after take-off, a fire occurred on board the aircraft. The flight recorder revealed that as the fire approached the cockpit, there was disagreement between the captain and co-pilot. The captain insisted on following procedure while circling and dumping fuel, whilst the co-pilot was encouraging the captain to land immediately, even over water if necessary. Ultimately, the aircraft crashed, killing everyone aboard.
Of course, this is an extreme example. Crash landing the aircraft may have resulted in a similar dire ending. However, while the fuel dumping procedure was the "official" correct response, the co-pilot's novel response may have saved lives.
Yes, I think that's an excellent point, David. To my mind, the task of creating and replicating a design, automating the production of the site/course, overlooks the values that are to be adopted.
Some of the problem comes from the need to have a fixed or known starting point and end-point in a course. Although on the face of it that seems sensible and reasonable, it's just stultifying in terms of creativity.
If it comes out of the group and they define their interests/agenda, it may be far more engaging and creative. The tutor can support that (it's not just facilitation).
That takes me to my other bugbear: the way we design learning based on individual learning and individual assessment. I know that we would need confidence that a person can be a pilot, but it doesn't mean that the best way to think about the design process is to start with individuals.
I guess I should check whether we're talking about designing a course with group learning in it, at the heart of it, and what we might mean by the idea of a group that learns.
I could go on and on but I need to shut myself up to let others come in. ;-)
Bill et al.,
Okay... I think we are getting a little sidetracked, but maybe not.
First, Bill: based on your replies, your perspective seems to relate to the automation of ID in the business/training world, and it seems to be ADDIE-based, if I am correct.
While you may think you are a behaviorist... I have my doubts that you are really a “true” behaviorist with an operant conditioning or S-R basis for doing training!
My guess is that all of the “behaviorists” are really cognitive behaviorists.
From behaviourism to cognitive behaviourism to cognitive development: Steps in the evolution of instructional design. Instructional Science, 13, 141-158.
(I don't have access to a search base for a university, but this article might prove interesting).
The business world has a very limited view of the use and purpose of ID/design. ADDIE lives and breathes there. The focus on the word “performance” has always left a bad taste in my independent mind: the focus is on the worker as a trainable item rather than an individual thinker responsible for their own learning. Don't get me wrong... I have been involved in that world, and the demands for performance may mean the difference between having and not having a job. It's just the attitude that seems to go along with some business practices: that the only important thing is to “perform” rather than to participate in the production of that outcome.
As for Deirdre and Julia, we are more likely what I call “w-system wayists.” In other words, the six Ws make up our approach, based on the ID theories and knowledge buried in our tacit knowledge banks. No single approach or method controls what we do, though we may have similar steps each time we look at an issue.
But the business world more often takes the road of the performance IDs,
and you have to remember where objectives/performance and behaviorist training really started: the military.
As far as developing design for training until it reaches automaticity, these publications might be of interest. I am still reading the 3rd one.
So what does all this mean for “automating” instructional design? What it means is that computers/software programs would find it difficult to follow our pathways of working unless they were highly adaptive with no set format. The costs of producing such a system would be high and the effectiveness could be impaired by the limitations. It would also not be reproducible the next time. The computer might get frustrated and melt down! :)!
As I said, the collection of data alone, such as “Designer's Edge” used to do in order to formulate design docs and decisions, would and could be possible, but it would still have to allow for flexibility in the user's input.
In addition, this discussion seems to be based on the presumption of using ADDIE, which is the trainer's favorite and is based on skill sets, not necessarily on integrating knowledge with skill. There is another model, called 4C/ID, which would make it much harder to automate ID.
I am using that link because it shows part of the model
This was the ID2's breakdown of the model ( which has changed since then)
Why automating may be limited
the whole presentation:
I have seen the program they are using to try to automate 4C/ID design, based on the work done originally at the Open University in the Netherlands. It is an extremely complicated program even for an experienced ID. The learning curve requires absolute knowledge of the theory behind the model. There are no screens showing on their website anymore, but pages 4, 7, and 9 in this article will give you an idea.
There are other approaches to design:
The book is that expensive on Amazon too:
I would love to read this one, but I don't have 500 to spare for reading!
So where does that leave us..see my reply to Bill's where to go question.
In addition, this discussion seems to be based on the presumption of using ADDIE, which is the trainer's favorite and is based on skill sets, not necessarily on integrating knowledge with skill. There is another model, called 4C/ID, which would make it much harder to automate ID.
To which I replied:
True! Isn't "skill" the home of training and development (see ASTD.ORG), just as "knowledge" is the home of learning, which we accomplish through guided reflection using such tools as discussion forums, listservs, blogs, etc.?
While you may think you are a behaviorist... I have my doubts that you are really a “true” behaviorist with an operant conditioning or S-R basis for doing training!
To which I replied:
Sd --> R --> Sr is the correct model here (let's not reduce Skinner to S-R only). Actually, training is a behavioral model based on the Sd-R-Sr chain:
Sd (discriminative stimulus): a training situation.
R (response): the student operates on the situation with a response (an operant, as opposed to a reflexive respondent like sneezing).
Sr (reinforcing stimulus): the student succeeds, and the behavior is reinforced. Now let's get more complicated:
The student received training on difficult conversation skills. This took place via a 2-hour webinar, because ISD is too complicated and rapid e-learning is what we do in the world now. Sd: the student (a manager) enters a situation with an argumentative employee during a performance review. R (response): the manager fumbles the conversation and leaves frustrated, without making the points about the employee's behavior. Sr: the employee's behavior pays off, so the employee is going to do this again... enter B.F. Skinner-Thimmesch.
Communication skills require fine-tuning (reinforcement, progressive skills training, etc.), so Bill ditches the webinar and says that guided practice with increasingly difficult situations is required. So the manager enters a six-week workshop with guided instruction: behavioral modeling of proper conversation management skills by a professional, followed by initial trial responses by the student (the manager). These workshops are integrated into the work world so that the manager can discuss training transfer issues (behaviorally speaking, generalization to other settings). The target behavior is reinforced and occurs in training, then transfers to the workplace, where the pesky employee finally admits to the manager that they are responsible for their own job performance and that they need to follow the instructions (Sd) of the supervisor as written on the job aid (work plan) so they can then be reinforced (paid and verbally complimented) for doing a good job...
Operant conditioning is active (not reactive) and reveals a complex scientific approach to studying human behavior. Application of OC to corporate training is well documented: http://www.ispi.org/content.aspx?id=54
A tool that takes an instructor/developer through a series of decision cycles with an algorithm of question prompts may work fine for many design problems. In the absence of an ID specialist, maybe it should be used rather than relying on the uninformed gut instincts of the would-be designer. Indeed, such a tool may better keep this individual focused on decision criteria rather than blindly selecting whatever random approach the individual may be aware of. However, how much reliance would you be willing to place on such a tool?

I've seen similar decision-making tools used for other applications. One that comes to mind is used in career preparation programs. It is a software program that asks the user a series of questions such as "Do you like to work with others or work alone?"; "Do you prefer working with your hands or solving mental problems?"; "Do you like working outside or inside?"; etc. At the end of the "interview," a list of possible careers is displayed, based on the user's responses to the questions. Something that has always bothered me about this program is that it doesn't inform the user why any particular career option made it to the final list. You will often get reactions like "Why is this telling me I should be a mortician?" Of course, the answer is that the program is based on simple filtering algorithms, as opposed to the more sophisticated, interpretive skills of an experienced career adviser. I agree with Maria's earlier comment: "Complex tasks may be impossible to automate, at least at this stage of human know-how."
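To make the "simple filtering algorithm" point concrete, here is a minimal sketch of how such a tool might work under the hood. The careers, traits, and questions are entirely hypothetical, not taken from any real program; the one twist is that this version records *why* each option survived the filter, which is exactly what the career tool described above fails to do.

```python
# Hypothetical career data: each career is tagged with yes/no traits.
CAREERS = {
    "mortician":    {"with_others": False, "hands_on": True,  "outdoors": False},
    "park ranger":  {"with_others": True,  "hands_on": True,  "outdoors": True},
    "data analyst": {"with_others": False, "hands_on": False, "outdoors": False},
    "carpenter":    {"with_others": False, "hands_on": True,  "outdoors": True},
}

QUESTIONS = {
    "with_others": "Do you like to work with others?",
    "hands_on":    "Do you prefer working with your hands?",
    "outdoors":    "Do you like working outside?",
}

def recommend(answers):
    """Filter careers by the user's yes/no answers, keeping the reasons."""
    results = {}
    for career, traits in CAREERS.items():
        if all(traits[key] == answers[key] for key in answers):
            # Store why this career matched, so the user never has to ask
            # "Why is this telling me I should be a mortician?"
            results[career] = [
                QUESTIONS[key] + " -> " + ("yes" if answers[key] else "no")
                for key in answers
            ]
    return results

matches = recommend({"with_others": False, "hands_on": True, "outdoors": False})
for career, reasons in matches.items():
    print(career, reasons)
```

The filter itself is trivial, which is the point: all the "intelligence" lives in the trait tags someone authored by hand, and the tool can only ever be as interpretive as that data.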
Bill, people seek your advice as a professional ID specialist when they could easily reference the research materials or any good ID guide on their own. I wonder why they do that? I presume they prefer to have a conversation with someone like you who can share his experience and give an informed response to their design problems. The questions you ask them to determine whether a design solution is required involve an interpretation of the information they share with you. Certainly, a tool or program may ask similar questions, but it blindly follows algorithms that do not pick up on nuances, nor does it relate on a human level to the person sitting in front of it.
During my MA work I read up on ISD, ADDIE, etc., but I don't recall design rubrics as such. Are there any that demonstrate a performance standard range (i.e., beginning, developing, capable, expert design)?
Though as I write this, I'm pondering how to scale the design's responsiveness to some key C's: context, content/curriculum, and client (learner/target audience) needs.
Is it different where the context and client needs are fairly stable, so that you might be able to develop a basic institutional design rubric, maybe with some sub-rubrics based on type of content (behavioral, cognitive, etc.)? But then where does the instructor's pedagogical approach figure?
Interesting to ponder.
I’m Paddy Fahrni (MDDE, Athabasca), just having a quick lurk, and I have a brief story from the front lines. I’m currently pushing to ready a blended settlement English language course. I’ve pulled several teachers into the development, and the best thing about them is not that they don’t freak out about moving to blended delivery in this traditionally class-based area, but that they are experienced teachers who can read their learners’ needs. I’m an experienced teacher as well. The course design fell into place as if we already had very similar templates in our heads. We all understood an educational approach that works well in this specific area. An important consideration was that the design be light enough, or porous enough, that the course teacher could flex it for diverse learners.
The online design aspects I transferred to my colleagues through constructing “my” area of the course, then walking them through noting the fidelity to the educational approach and demo-ing course upload. The provision of a model (guess this could be considered a template) was the most important thing to the teacher/builder group in the hurly-burly times approaching course delivery.
Quickly and perhaps vaguely, I’d say that the initial process was selection from a rubric (what works well at this time in this situation) and the actual course at delivery is a temporary template that will be adjusted with reference to a rubric.
Thank you for sharing your practice. Will the blended course be OER? I am working on the development of a blended English language course for Kigali Health Institute Francophone staff as well. I would love to have a look at the design of your course if it will be licensed under the creative commons.
Quest used to have a software program, which they tried to call an authoring system, that had lots of Word documents in it with pre-designed questions, etc., that were supposed to act as the design docs for projects. (Designer's Edge)
Dr. David Merrill and the ID2 group worked on ID Expert, but it was never produced commercially.
Today, there are a few different options, such as WIDS, VUE, and CompendiumLD (mapping). Articulate and Lectora are probably the current software packages that could help, but learning is still based on knowing what to put where in the process.
When I moderated TRDEV there were always requests for templates, but templates, like rubrics, are limitations. A template implies that there is one or a limited number of concrete and highly defined ways of designing instruction. Even a simple task may have multiple ways of doing the task/training, but the issue is which is most effective and efficient at that time. That may change and may vary highly between/among learners.
Templates cannot address algorithms that are constantly changing. There is often a definite lack of the task analysis and needs analysis that would define the performance steps and the desired outcomes, whether you call it constructing knowledge or constructing skills. Then the steps may need to change depending on the learner, so options are necessary, and templates cannot address that. The same happens when automating such processes.
In the 90s there were multiple attempts to use artificial intelligence to design learning, but the brain is too complex to be emulated for every person in one document/process.
Tennyson and Park were two prolific writers back then. (Search ERIC also.)
While rubrics could have some use, in my experience most of the ones I have seen are too nebulous to be effective. Words like "some," "most," "many," and "few" are not measurable or reproducible between learner experiences or among learners. Rubrics would need to be very specific about outcomes, but that still does not define the "learning process/skills" to be used in the "instructional materials." What I sometimes see is the assumption that, because it says "instructional design," it means lecture delivery, not the learning experiences covered in the session. Perhaps a better description would be a knowledge/skills session. It is where the "art" of design meets the "skills/knowledge" of ID/LD.
Can EPSS help? Yes, but such systems are usually meant for more generic tasks. There were some software programs to aid in media selection, but they are/were outdated by constantly changing technologies. If you have entire databases, tools, etc. in a complete system, it can make a difference in the hands of an experienced user and designer. The inexperienced can get lost in the possibilities if they cannot discriminate among the needs, resources, and demands of the product. Starting from design docs would be a better place for the new ID until they learn how to design the docs!
My 2 cents worth.
I always enjoy digging up past discussions to see what we were talking about way back when. The seminar can be accessed here with the login: Guest Guest http://vu.cs.sfu.ca/vu/tlnce/cgi-bin/VG/VF_dspcnf.cgi?ci=147 Click "full message view" to get all posts on one page.
As I recall, one conclusion from that discussion and other user testing was that the guide on how to use the tool was the most useful part, simply because it prompted individuals to consider questions in the design process. In fact, automation of the design process wasn't valued as much as the developers thought it would be.
Very persuasive, intriguing, and gave me a lot to think about! This also reminded me about the role of changing your perspective...here's a silly story to illustrate: I was in my backyard one day, desperately searching for my car keys (which I'd dropped). Frustrated, I thought for a moment about changing my visual perspective. Instead of facing the ground in my bent posture, I turned around and stepped up onto the storm doors to get a different look...and there they were!
Maybe automating the instructional design process for teachers is not the point at all. Perhaps the way to leverage technology for dynamic results is to adapt technology to individualized, learner-centered instruction. What do you think about adaptive learning technology? I saw this article from the Chronicle of Higher Education a few months ago and was intrigued...perhaps it applies to our discussion of how to use technology in instruction.
I read the article, but I have a few problems with their premise. First, one of the groups that was "successful" was in an accelerated stats course, which means high-achieving students, who are probably adaptable.
Using this type of program with "at-risk" students, or community college students who have problems completing associate degree programs and may not be as flexible in their learning, without first doing studies and research, may only accentuate the problems.
In addition, I wonder what all of the "constructivists" are saying, given that this is based on Skinner! Skinner works for some things, but not for promoting critical thinking skills!
As a further aside it might help to decide which area of ID we are trying to automate.
If you are dealing with an instructor in business/government/military, the skills are similar to a teacher's, but the development processes are a bit different.
These are only outlines, NOT the full competencies.
The book might help to clarify some of the issues of competency in design/development and how to develop those skills.
http://eric.ed.gov/PDFS/ED453803.pdf BUT they are being revised:
Third project here:
Personally, I think there are several competencies that were missed in the 2001 set. Implementation is very poorly discussed, and it is critical to the success of any learning design going live.
If someone is looking to develop competencies then this might be of use currently, but I am not certain if it covers new technology skills:
since the improved and consistent performance of the ID is what makes the difference.
This certification process may shed some light on processes
I think that some of the process, such as the documents, can be "automated" or digital when it comes to collecting the data and information, but some of the critical skills, such as SAA (situational awareness and analysis),
http://en.wikipedia.org/wiki/Situation_awareness may not be amenable to automation. Yes, the programmer can set algorithms, but it is that certain "something" the experienced ID recognizes, the part that cannot be captured in the "science," that is considered the "art" of the design process. Also, teams make considerable differences in that same SAA, which computers cannot "understand" yet (HAL in 2001 and 2010). ID can possibly be considered a "wicked problem," or at least an "ill-structured" one, in many situations, and those are not greatly amenable to automation or solution. This poses a problem when definite performance outcomes are expected.
This document might help to clarify some of my concerns.
FYI: the Rothwell book, now in its 4th revised edition rather than the 2nd, claims to have made major revisions in philosophy since the 1998 version. The model presented in the CJLT article has changed considerably.
Ok. Here's my issue on this: I've been a student, practitioner, and manager in the area of training and development for nineteen years. It has been my experience that in those years I keep re-reading the same research recommending key processes for determining whether a performance problem is related to training or something else. Those same articles that kick around the universe today on blogs continue to validate past practices, and even add some research practices of their own: for example, for performance-based training you need to apply simulations, role-play, or structured one-on-one practice; for information blasts (new company information), a webcast is enough. Let's call me tired at 50 :). In my world of training and development, the same thinking processes for assessing training needs, identifying preferred modes, and evaluating results have been consistent, predictable, and rather stable. So why in the world would a novice trainer have to pull down research articles, training workbooks, and blog discussions to learn the same information that he or she could learn in seconds by answering some sequential questions from an automated tool?
hmmmm. that felt good.:)
As a behaviorist I'll have to politely disagree. I think Skinner would call critical thinking "verbal behavior." In fact, applied behavior analysis has also been studied in its application to teaching critical thinking skills:
See the PowerPoint that goes with that site. PS: Am I the only behaviorist in the room??????
Bill, I think you must be one of only a few behaviorists still alive. ;-) Surely if we were just slaves to the environment, we would all just be living out a pre-scripted life. What about our individual self-determination or our collective agency?
How could we reflect, or step back and consider other options and wider frameworks, if we were just reacting the whole time? I can see some circumstances where we are predictable and where drill-and-practice can get the point across, but I don't see us as only reacting to stimuli.
The idea of verbal behaviour, I guess, is where you say that and I reply like this until I incorporate the script. I don't think we are organized as a set of scripts, one for each situation. It seems like a very reductive view of humanity, don't you think?
Now, now, now... let's not overstate things. I believe there are 7 1/2 behaviorists still alive (at the risk of reductionist thinking). I agree: life is more complicated. Did I mention I am also a Christian? My view of behavior analysis is that it has been the most useful and practical tool to set up and reinforce (transfer) skills training in the workplace. I've never gone beyond appreciating its fact-based approach to defining what works with human performance. WHY is not a matter I'm qualified to even get into... and thanks for posting!
Deirdre, my GP is exactly the same when it comes to empathy; he says, "Don't give me the background. Just tell me the problem." He might have studied with you (joking).
Have a great weekend.
That said, I've frequently said, regarding working with younger students and my own children, that B.F. Skinner is my friend!
Yes, you're probably right, Julia. I certainly didn't mean to appear pointed or rude. I apologise to you, Bill (and to you, Deirdre, about my last quip). I was really just trying to make a point in jest, or at least a very broad point.
I agree that for certain situations it can be very appropriate. I am certainly sorry if I caused anyone any offence. None was ever intended, and it's a good reminder to me to be more thoughtful and not to put my foot in my mouth so much.
I'm looking at a few of the programs you mentioned.
One last message and then I will shut up.
In addition to the Compendium LD which came out of the JISC/cetis environment, here are a couple more that you might be familiar with:
Xerte looks a lot like VUE, but you need an Apache/Linux server to install it (no Windows version!).
It has nice templated pages, if they fit your needs.
There is a training video on how to use it; it will play to the end!
I am going to try this one this weekend.
LAMS has more design features, especially collaborative ones.
Atelier, out of the LD background:
If you use Gmail, then you have indexed messages, all grouped by the original message and its replies, as long as the subject line does not change.
If you are going to automate, then try this for a starter:
Reload also has a SCORM checker, which is necessary if you are in the business/mil world.
While there are multiple books on training design, they are not necessarily comprehensive across the "ADDIE" process. Many steps can be missed. Each situation is unique, and without the ability of the software to adapt, things can fall through the cracks.
I tried to find out what happened to "Designers Edge", but they have not replied to my inquiries.
Take a look at WIDS.