Week 1: Introduction to Learning and Knowledge Analytics
What about learning analytics in the corporate sector?
So - for those of you that represent or have interest in the corporate space, what are your interests/concerns/thoughts around the use of analytics to improve learning and/or collaboration?
- Corporate training rarely has grades. Instead, performance reviews and business outcomes are the measure of success. In a regulatory environment, compliance is the significant measure.
- Analytics is most important for decision making at the senior level. Senior management wants to see business results for their learning spend. In a regulatory environment senior management needs summaries of compliance levels.
- Prediction would be most helpful in creating development plans, but that information would come from the Talent Management System, not the LMS.
- Corporations are interested in the value of social media but they want to be able to track the advantage of its use because they pay for it in one way or another (exposure to outside tools, cost of internal tools, time spent).
I would be very interested in how to measure business impact from learning activity.
While corporate training may be less likely to have grades (although I did have to score over 85% on a recent elearning module at work before the system would mark me as complete), businesses do/should/could measure performance in a way that I think is difficult in the academic world (no work management systems there).
Businesses with clearly defined KPIs and competencies should be able to determine: What is happening? What do we need to improve?
It should then "just" be a matter of figuring out how to foster and measure that improvement.
I think I mentioned we are working on a project that is crunching agent performance data pre- and post-implementation of new training, and preliminary results are quite positive (it is nice to know the training you spend months building is actually making a difference).
And, of course, because I see the process as cyclical, training outcomes should both be a product of strategic planning and performance management and seek to inform/improve those high-level decisions moving forward in a process of continuous improvement.
In short, I envision a world where the LMS and WMS (work management system) play nice with one another without the need to generate and interpret enormous Excel spreadsheets, all in the name of doing whatever it is we are doing better.
I do also have some significant concerns that building a system whose only goal is to increase ROI may actually, in the long term, cause more harm than good, but I'll save those thoughts for another time...
It seems to me that industry has to set targets for the completion of courses to satisfy accreditation (or mandatory compliance as Apostolos says). Once past the basics, many industrialists that I have dealt with in the past have been more interested in the outcomes of the learning, for example: What will the learner be able to do / do better / contribute more, etc.? What benefits will the company gain?
In addition, the cost/benefit has to be justified by the provider/HR department, to ensure more of the same (if that is what you want to encourage).
They (HR/provider) then need to persuade the learner (if the course is not to be compulsory) why they need to undertake the course: the immediate benefits, future prospects and relevance to their current role all apply here, but there may also be an element of who else is taking/has taken the course and the personal interests of the learner.
In this way I think that a 'Hunch'-based analysis might be beneficial in influencing course design and subject matter presentation, and may well be better placed at the end of a course rather than at the beginning to relate to uptake decisions, success rates and even completion rates. As long as the results are used. So much analysis is quickly reviewed and then filed away - never to be used again, or not looked at in the first place.
How many of those who looked at Hunch would return to it again in a year's time to perform a comparison? In addition, it may be difficult to get learners to complete an in-depth analysis. How many have answered all of Hunch's questions - and how many got bored before they finished?
I think the question I would like to ask is 'Who are the analytics for - the learning provider or the learner?' Which leads to 'Should the analytics be up-front or hidden?'
It is only possible to measure its effectiveness through its practical application in the real working environment, where you do not measure the specific knowledge acquired but rather the degree of achievement of individual and group goals.
You may train throughout life in different topics, not necessarily only those related to your current job.
Business training, especially for new and old workers, is usually provided in a simulation environment, and only at the last moment is access given to the real environment - to avoid mistakes.
Dolors: If, as you say, corporate training can only be measured by the results of applying the learning in the learner’s work, I would struggle to see the difference between corporate training and lifelong learning.
In my experience it is the knowledge that is gained that is assessed by corporate training and often courses are written that do not allow the learner to progress unless they prove their knowledge of the current study. In fact this is my argument against much of the current corporate training; that it only teaches knowledge, to be able to do something by rote, not the application of the knowledge or the development of skills. Of course there are courses that are skills based, but I do not know of many such online courses in industry. To write a course that can assess skills costs more than the general (off the shelf) courses that teach and assess the knowledge since the skills will be relevant to a particular company’s systems and the course applicable only to that company.
Francisco: I think that companies would be reluctant to allow trainees to use their real work during a course because of the risks involved. For example, in the timber industry (with which I am familiar), trainers often take on trainees from different companies but have to develop their own scenarios because of the secrecy many companies have about their processes.
In the UK, the term ‘Lifelong Learning’ is applied to any learning that is undertaken after leaving formal education. In theory online corporate training would fall into this category but, to me, it does not fit well due to the simplistic methods used (in order to assure assessment) and because it does not address the acquisition of the ability and skills needed to apply the knowledge. Lifelong learning is more about knowledge based upon experience (both one’s own and that of colleagues) and therefore bound up with skills and ability. The trick is, understanding how to capture this lifelong learning in a way that is meaningful to employers (current and future). I am hoping that learning analytics may supply some of the answers.
Indeed, learning is one thing and skills training is another. The first is permanent, while the second is tied to a particular work situation.
And it is true that the training has controls (tests) that measure the understanding of the basic concepts learned. However, I think this does not really measure learning but rather justifies the payment of training fees.
But my experience is that even in traditional higher education you have students doing internships in companies whose work has to be kept secret (my students are from software engineering). All goes well without even involving lawyers!
Joining this with John Fritz's post on students being the ones to act on their own data, I think it all makes even more sense: we learn our way with our personal and/or professional data and interests, and we share just what we want.
Is this utopia? I do think in many cases we can achieve it.
I also tend to think that some training does require more than the time one puts into completing the eLearning module (or classroom session). I think that post-training interventions and assistance really do make it much more likely that the information gained in training can be put to use.
What will the learner be able to do / do better / contribute more, etc.? What benefits will the company gain?
In the corporate world learning analytics has to tie to the return on investment for training - why would a commercial entity do it otherwise?
1. I am very involved with innovation. It would be interesting to think about how learning analytics can help me in creating business cases for learning innovation opportunities. Can it help in validating the business value of certain innovations? I am sure it can, but it requires the competences of asking the right questions and successfully navigating the maze of data ownership. This could allow us to use metrics and data (instead of opinion) to prioritise investment.
2. I have a complete fascination for methods like social network analysis or semantic similarity checking and how they could be used to make individuals work smarter. I firmly believe we are at the cusp of many practical implementations of these kinds of methodologies and it is good to be at the forefront of that.
The concern that I have is very much regulatory. Most attempts at doing social network analysis for example seem to be blocked by legal departments with privacy concerns. This needs to change in some way before it can start becoming more useful.
In my company, some of these tools under the umbrella 'Technology Adoption Program' are for this reason not available in selected European countries.
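To make the social network analysis idea above concrete, here is a minimal sketch of the kind of analysis meant. Everything in it is invented for illustration: the names, the edge list, and the idea that the data would come from collaboration logs (which is exactly where the privacy concerns mentioned above bite). It computes betweenness centrality from scratch, so it needs no external library.

```python
# Hypothetical sketch: spotting "knowledge brokers" in a collaboration
# network. Names and edges are invented; real input could come from
# email, chat, or co-authorship logs (subject to privacy constraints).
from collections import defaultdict, deque
from itertools import combinations

collaborations = [
    ("ana", "ben"), ("ana", "cy"), ("ben", "cy"),    # one tight cluster
    ("dev", "eli"), ("dev", "fay"), ("eli", "fay"),  # another cluster
    ("cy", "dev"),                                   # the only bridge
]

# Build an undirected adjacency map.
adj = defaultdict(set)
for a, b in collaborations:
    adj[a].add(b)
    adj[b].add(a)

def all_shortest_paths(s, t):
    """Enumerate every shortest path from s to t (fine for small graphs)."""
    queue, found, best = deque([[s]]), [], None
    while queue:
        path = queue.popleft()
        if best is not None and len(path) > best:
            break  # BFS: all remaining paths are longer than the shortest
        if path[-1] == t:
            found.append(path)
            best = len(path)
            continue
        for nxt in adj[path[-1]]:
            if nxt not in path:
                queue.append(path + [nxt])
    return found

# Betweenness centrality: people who sit on many shortest paths between
# colleagues are the likely brokers between otherwise separate groups.
score = defaultdict(float)
for s, t in combinations(sorted(adj), 2):
    paths = all_shortest_paths(s, t)
    for p in paths:
        for mid in p[1:-1]:
            score[mid] += 1.0 / len(paths)

broker = max(score, key=score.get)
```

In this toy network the two bridging employees ('cy' and 'dev') score far above everyone else, which is the sort of signal one might surface to help individuals (or teams) work smarter.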
The finality of the education and corporate sectors is different, and that has consequences for the analytics. The finality of education (from what I've read this week) is 'the grade', and thus learning and academic analytics will be about dashboards on how students are doing on their journey towards 'the grade' and how they compare with the rest of the group, and about signalling and remediation for students who might not get to 'the grade'. It will be about the Key Performance Indicators the government cares about: dropouts and people failing. The finality of the corporate sector is not an educational grade. It is not even an accreditation. (That is a semi/false finality that got into corporations when they adopted the university model in their corporate 'universities'. Only within the operational aspects of corporate 'universities' can you apply much of what education analytics would.) The finality of corporations is valuable performance of their employees. Therefore, in my humble opinion, you cannot properly introduce learning analytics in a corporation without also getting your act together on performance analytics. They go hand in hand. (Who said that learning IS the work again?)
We need to break out of this old idea that training and learning are different. And we need to change education/learning in both worlds. Students are getting out of school without the very specific skills needed for jobs today. Schools need to change and teach students how to learn. Corporations need to change and start educating their employees. Both need to create a much more open, creative and trusting environment.
Analytics relating to training tends to be focused on Kirkpatrick's approach, which isn't the only evaluation method out there. As noted by Brinkerhoff, "Achieving performance results from training is a whole-organization challenge. It cannot be accomplished by the training function alone. … Virtually all evaluation models are construed conceptually as if training were the object of the evaluation." (Brinkerhoff, http://aetcnec.ucsf.edu/evaluation/Brinkerhoff.impactassess1.pdf) (as cited by Jane Bozarth in Learning Solutions Magazine). I think the concept of training as an isolated function is an issue that goes beyond the analytics challenges. It's why training has received quite a bit of backlash lately (in favor of non-formal learning, for example). Just as training needs to be integrated into the work environment more seamlessly, analytics related to workplace learning also needs to be integrated into the overall work environment.
I couldn't agree more. I've recently been talking with my boss at work in the business world about the importance of moving towards creating efficient problem-solvers and innovators in the workplace (educating). Simultaneously, I see a shift in universities towards focusing on skills of importance in the workplace (training).
Both sides have a lot to learn from the other.
So let me try to better explain my point about why there are differences: a lot of discussions in this course so far seem to talk about using learning analytics within the boundaries of learning/training. It's about operational productivity to do things better (predict failure and remediate, better allocation of time and resources, analyse what works best, etc.). If we reduce learning analytics to that, there might be little difference between education and corporate training (which often just mimics the educational system). In corporations, learning is not a goal on its own. It only becomes relevant if it helps to achieve the real goal, which is valuable performance. (The same goes for most corporate functions; they only matter as much as they contribute to overall corporate performance.) So if learning analytics really wants to make a difference in corporations, it needs to see beyond its own box and stretch into performance analytics, align with trend detection in the business and workforce, etc. Corporate training is often criticised for its low integration within the business, insufficient alignment with business priorities and weak evidence of impact. Learning analytics has the potential to address these issues, as well as the operational sanity checks within your learning box.
I'm very much in line with Bert's comments that the scope of learning analytics must be wider than the learning (development) process and/or intervention only, and must include components that are relevant for the business.
Also agree with Hans that we haven't really (yet?) talked about the 'how' in LA for corporations.
Working only in the corporate setting myself, I have no clear answer nor examples of how to use/implement it. Key reason for that is the strong link you need with business intelligence & analytics (BIA) and the fact that so many corporations (still) have little to no strategy/approach/system in place.
So my question would be: are we at all able to deliver LA that is valuable for the business if and when the business has no clear understanding of what is 'valuable' at all?
(I know, I'm being a bit provocative here)
Using the 'Hunch' tool did trigger a thought:
1. (corporate) learning changes more and more into enabling employees to work smarter
2. One of the key trends today is the move to 'personal learning paths' (thanks Bert for sharing the paper).
3. Employees (especially 'knowledge workers') most likely know best a) what competencies/skills/knowledge they have, b) what they need for the job, and c) what they are missing (if not, ask yourself if you are a thorough knowledge worker!)
4. Looking at myself, I face increasing challenges finding the knowledge & learning that would really benefit me (especially as there is so much available), and would certainly welcome reliable data analysis that can help me out here.
Why don't we combine these statements and look at LA from the perspective of the individual, rather than the corporation and/or senior/line management? (on a side note, this could also contribute to stimulating a learning culture etc...)
In that case, the key question would be: how can we design/develop/implement LA to optimally support the individual 'learner' in their effort to become a smarter worker...?
We could do this by enabling the individual to build their own business case based on, for example, comparison-based analysis of present data, like peer references a la 'Hunch', combined with models/analytics that predict or compare the potential result/outcome.
I think Peter's point is important: to keep learners hooked into that bigger picture, the learner should be responsible for his/her own talent management (though this can be nurtured and supported by management). The ideal of LAK is to keep the worker/learner a self-directed participant in the work culture in order to give the learner a sense that he/she has some ownership in the business (and no, LAK is not currently implemented that way, either in education or in organizations).
I definitely agree that the education continuum (otherwise known as professional lifelong learning?) needs learning analytics to be seen in a much broader context than mere course statistics. With so much political input into the availability (or not) of education and training, that is hardly avoidable, and the challenge, as others have commented, is to stop the analytics becoming gameplay. My stance remains that at the grassroots level there should be a focus on making learning analytics useful for learners, but that does not stop them being useful to others as well. In a corporate setting, Dave Snowden has done some very useful work under the 'knowledge management' umbrella (and his blog is 'Cognitive Edge'). The diversity of academic labels used by different people does not, in this case, matter: it is the utility of the end-result that matters, along with the spread of that utility, its ethics and affordability.
One of the areas that greatly interest me is simulations for corporate training needs. I have done work in the IT and Financial sectors that show the power of simulations to bring together some complex information about how a learner navigated a simulated job situation. Scores are too simplistic to do justice to such complex tracking of learner progress and competence. Consequently, learning & knowledge analytics become more complex as well.
For example, let us consider a scenario that has multiple decision points (connected like in a graph) and multiple paths to the correct outcome. Let us assume that there is an ideal path (not hard to imagine in a highly disciplined process training). A learner's decision making trail or actions trail could be compared to the ideal path/trail and analytics could be programmed to infer from deviations to get a better and more comprehensive picture of learner performance. (also not unlike the notion of knowledge analytics being used to compare competency levels in a discipline).
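A minimal sketch of that idea, assuming decisions are logged as a sequence of node identifiers. The scenario, node names and trails below are all invented; the comparison uses a simple sequence-matching ratio rather than any particular commercial product's method.

```python
# Hypothetical sketch: comparing a learner's decision trail through a
# branching scenario against an ideal path. All node names are invented.
from difflib import SequenceMatcher

# The "ideal" trail through a disciplined process (e.g. a support call).
ideal_path = ["greet", "verify_id", "diagnose", "escalate", "close"]

def path_similarity(trail, ideal=ideal_path):
    """Return a similarity score in [0, 1] plus the learner's detours.

    SequenceMatcher's ratio is 2*M/T, where M is the number of matching
    decisions and T the combined length of both sequences, so a perfect
    trail scores 1.0 and every detour drags the score down.
    """
    matcher = SequenceMatcher(None, trail, ideal)
    # Opcodes of type "delete"/"replace" mark decisions the learner took
    # that are not on the ideal path - the detours worth inspecting.
    detours = [trail[i1:i2]
               for op, i1, i2, j1, j2 in matcher.get_opcodes()
               if op in ("delete", "replace")]
    return matcher.ratio(), detours

# A learner who wandered into two dead ends before recovering:
score, detours = path_similarity(
    ["greet", "upsell", "verify_id", "diagnose", "guess_fix",
     "escalate", "close"]
)
```

Here the learner's two detours ('upsell' and 'guess_fix') are surfaced explicitly, which is arguably more useful for remediation than the single similarity number; a richer version could weight deviations by how costly each dead end is in the scenario graph.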
If you know cricket, you would be familiar with a graph that shows runs scored vs overs for both teams with circles denoting fall of wickets, resulting in what are popularly called "worms" that deviate from each other on the graph as the match progresses.
The point is, these analytics move from being comparisons between numbers (Peter spent 5 more minutes than Pan on the Google group) to being comparisons and analytics based on patterns and paths.
Following on from this, I wonder:
- How can this pattern be presented to senior management (assuming they are not at 'our level of understanding') in a form that they appreciate?
- How can we be certain that the paths the employee takes in getting to the final point are something they learnt that contributes to the final result, or merely a waste of time until they found the right 'nodes'?
- How should the evaluation be designed to make it fair for every employee (because some people do take time to understand, after long-winded paths, etc.)?
- Does this mean an experienced 'wanderer' could achieve the KPI better than newbies? I don't think so either.
There's still further to go after this, and I believe the corporate sector is very particular about 'measurement' in evaluation.
Hope I'm on track with my points,
Kuala Lumpur 1:33AM
There are many visualization techniques, drawn from everyday life, that we could use - e.g. directional maps, heat maps, graphs, etc. - that could be presented to senior management.
How to be sure they are learning? The system could record dead-ends. Simulated scenarios, for example, often have pedagogical dead-ends only a couple of levels deep, post which the learner is forced to retrace her steps and try again. Again, what works in a simulation may not work in an open ended scenario unless there are checkpoints that enable the system to understand and measure progress. In some cases, the wandering around gathering accidental knowledge, may be a desired feature of the learning experience.
How to keep evaluation fair? No definite answer for that one. It depends on the metrics you are using for the evaluation. If time taken is a metric, then it would be applied uniformly to all learners. Having said that, simulations could have guide modes, or collaborative environments could have peers, facilitators or instructors who provide guidance. There is also often some value in letting an orientation happen if it is the first time the learner has encountered the simulation or the format.
Experienced wanderer = expert learner/performer? Maybe. Some people learn much better if they wander around getting a general sense before they settle down to a decided path.
You are absolutely on track