Archive for June, 2005
One of the big nagging issues to date in organizational learning, including eLearning, is ROI. In a recent interview with Karl Kapp for E-LearningGuru.com, Cisco's Tom Kelly said:
No one ever calculated the ROI on fax machines or email or anything like that. They seem to think the learning industry should focus more on benefits like improved quality or responsiveness and less on some of the harder business methods. In fact that’s what we did here for the first three or four years, we said the same thing. We said no one did ROI on email or voicemail because it is better to be connected than not connected. We asked, so why are you having this problem with learning? Isn’t the only reason to communicate to either teach or learn? And mostly the answer is “yes” even if you’re yelling. So then email and voicemail are two tools for e-learning. And if that’s true then we should say that all this stuff, even if it’s video-based, is content that drives both the formal and the informal parts of learning. And yes, in some cases, it is expensive, but ignorance costs a whole lot more.
I would personally agree that we shouldn’t have to do ROI on things that make people smarter and better at their job, and more connected to their companies, their goals, and their missions but I don’t write the rules.
Jay Cross firmly believes (and explains in detail in his eBook, Metrics) that:
- The metrics of training should be business metrics
- ROI is in the eye of the beholder
- Intangibles and services make old accounting models obsolete
- If you don’t have your sponsor’s support, you are toast
Chip Cleary recently wrote a great article in CLO magazine on this subject – Measuring Business Results Using Business Impact Analysis.
First, he states the problem:
Most learning executives find it frustratingly hard to determine how much a learning event benefits their organization. After all, the direct results sit inside employees’ heads, where you can’t see them. The business benefits then unfold over time, in the midst of many other forces that also influence end results. Because of this indirectness, the business impact of training is rarely evaluated.
Why is evaluation (in a business context) so difficult?
- The training is only one of many factors that drive results
- It is not possible to run controlled studies
- Existing metrics do not adequately measure performance
How can we do better?
The approach underlying BIA is to open up the black box by creating a simple causal analysis of how a given training course is expected to create value. With this analysis, we can then create an efficient method of quantifying the impact of a piece of training and measure its economic benefit.
There are many factors that cause peak performers to achieve better results than average performers. It is wasteful to try to target all of them with training. Some do not matter much to the business. Others are not amenable to training. So, instead, we focus on a set of the “critical mistakes” that separate peak from typical performance and that matter the most to the business.
To maximize and measure the benefit produced by training:
- Identify a set of potential critical mistakes.
- Estimate their cost.
- Provide a solution that eliminates the most costly mistakes.
- Measure the resulting reduction in the frequency of those mistakes.
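The arithmetic behind those four steps can be sketched in a few lines. A minimal, hypothetical illustration (the function name, mistakes, and figures are all made up, not from Cleary's article): the benefit of training is the reduction in how often each critical mistake occurs, weighted by what each occurrence costs.

```python
# Hypothetical sketch of the Business Impact Analysis arithmetic:
# benefit = sum over critical mistakes of
#           (frequency before - frequency after) * cost per occurrence

def training_benefit(mistakes):
    """Each mistake is (cost_per_occurrence, freq_before, freq_after) per year."""
    return sum(cost * (before - after) for cost, before, after in mistakes)

# Example: three invented critical mistakes a sales team makes, per year.
mistakes = [
    (5000, 40, 10),   # mis-scoped proposals
    (2000, 100, 60),  # missed follow-ups with prospects
    (8000, 12, 8),    # wrong product configuration quoted
]

print(training_benefit(mistakes))  # 262000
```

The point of the sketch is that once you have estimated cost and measured frequency, the economic benefit falls out directly, without needing a controlled study.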
The process reminds me of a post by Dave Pollard – The Cost Of Not Knowing. It’s like a practical, technical application of the business-context analysis of learning in organisations.
TagCloud is an automated Folksonomy tool. Essentially, TagCloud searches any number of RSS feeds you specify, extracts keywords from the content and lists them according to prevalence within the RSS feeds. Clicking on the tag’s link will display a list of all the article abstracts associated with that keyword.
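TagCloud's actual implementation isn't public, but the core idea it describes (extract keywords, rank by prevalence) is easy to sketch. A minimal illustration, assuming the feed entries have already been fetched as plain-text abstracts (the stopword list and function name are my own):

```python
import re
from collections import Counter

# A tiny stopword list for illustration; a real tool would use a larger one.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "for"}

def tag_cloud(texts, top_n=10):
    """Rank keywords by prevalence across a list of entry texts (e.g. RSS abstracts)."""
    words = (w for text in texts for w in re.findall(r"[a-z]+", text.lower()))
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return counts.most_common(top_n)

entries = [
    "Learning in organisations is non-linear",
    "Organisations need informal learning, not just training",
]
print(tag_cloud(entries, 3))
```

A real folksonomy tool would also weight the font size of each tag by its count and link each tag to the matching abstracts, but the ranking step is essentially this.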
As they say, know thyself! Here is the TagCloud of soulsoup for the last 10 posts.
And here is my reading-pattern cloud, created from the OPML exported from my Bloglines subscriptions.
This is the first part of my ‘Learning Actually’ series. Although the core of the series is technology-aided learning, the main theme is organizational learning using a holistic approach. Whenever I try to design a solution for organizational learning, be it theoretically or practically in my professional life, I face roadblocks arising from the faulty core of the traditional learning culture, clichéd training methodologies and, more acutely, the inability to distinguish between training and learning. After all, with a faulty learning strategy, your problems are not going to get solved by adding an ‘e’.
Hence ‘Learning Actually’.
Please share your comments, agreements and disagreements to make the series more complete.
is non-linear
Like problem solving in the real world, learning is non-linear. The problem starts when we try to carry our habit of ‘first learn then work’ from our schools to work organizations. In the real world, there is no such thing as the linear method.
Why? Because the real world is complex, not just complicated. The learning needs of knowledge workers are also complex, networked and asymmetrical. Think about a high-technology salesperson. Will in-depth knowledge of the product’s features, functionalities and benefits alone make him successful? Apart from explicit knowledge (which can be gathered through linear learning), he needs to know the competitive advantages of his product. He requires the skill of listening to his clients and promptly designing solutions which fit their needs. And throughout his sales cycle he needs to be on a learning curve. This can’t be achieved through a linear classroom-based model.
The metamorphosis of data into information, information into knowledge and knowledge into intelligence is complex and non-linear. The knowledge worker needs the knowledge inflow when he requires it. An excellent example of this can be found in an original paper published in Touchstone, written by E. Jeffrey Conklin & William Weil.
A study at the Microelectronics and Computer Technology Corporation (MCC) examined how people solve problems.
A number of designers participated in an experiment. Each was asked to design an elevator control system for an office building. All the participants were experienced, expert designers, but none had worked on elevator systems. Participants were asked to think out loud while they worked on the problem. The sessions were videotaped and then analyzed.
Traditional thinking, cognitive studies, and existing design methods all predicted that the best way to work on a problem like this was to follow an orderly and linear process, working from the problem to the solution. You begin by understanding the problem, which can include gathering and analyzing data. Once you have specified the problem and analyzed the data, you are ready to formulate, and then implement, a solution.
Traditional wisdom for solving complex problems: the “waterfall”
This is the pattern of thinking that we all assume we follow when faced with a problem. The conventional wisdom is that the more complex the problem, the more important it is to follow this orderly flow. If you work in a large organization, you have probably seen the waterfall model of problem solving enshrined in policy manuals, textbooks, internal standards for the design process, and the most advanced organizational tools and methods.
In the MCC study, however, the designers did not follow the waterfall model. They would start by trying to understand the problem, but would immediately jump to formulating potential solutions. Then they would go back to refining their understanding of the problem. Rather than being orderly and linear, the line plotting the course of their thinking looked more like a seismograph for a major earthquake, as illustrated in the diagram. We call this pattern both chaotic, for obvious reasons, and opportunity-driven, because in each moment the designers are seeking the best opportunity to progress toward a solution.
Actual pattern of problem solving: the “seismograph”
This non-linear process is not a defect, not a sign of stupidity or lack of training, but rather the mark of a natural learning process. It suggests that humans are oriented more toward learning (a process that leaves us changed) than toward problem solving (a process focused on changing our surroundings).
Of course, linear processes are quite appropriate for solving many problems, such as computing the square root of 1239 or choosing the shortest route to the new mall. But within organizations, such as corporations, institutions, and government, where lots of people work on complex issues, people are encountering a new class of much more difficult problems. We call these wicked problems because of the dynamic and evolving nature of the problem and the solution during the problem-solving process. It is these problems that the techniques described in this book are especially useful for solving.
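To make the contrast concrete: a "linear" problem like the square root of 1239 yields to a fixed, orderly procedure that repeats one step until it converges, with no backtracking to redefine the problem. A small sketch (using Newton's method, my choice of procedure, not one named by the authors):

```python
def sqrt_newton(x, tol=1e-9):
    """A strictly linear procedure: repeat one refinement step until done.

    Each iteration improves the guess; we never revisit what the problem is.
    """
    guess = x
    while abs(guess * guess - x) > tol:
        guess = (guess + x / guess) / 2
    return guess

print(sqrt_newton(1239))  # approximately 35.1994
```

Wicked problems offer no such fixed step to repeat, which is why the seismograph pattern appears instead.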
Let me give a simplistic, personal example. I recently bought a Nikon D70 Digital SLR that came with a 400-page manual. The manufacturer would hope that I read the manual completely before operating the camera. Naturally, I did not, and I am positive most of you wouldn’t either. I started using the camera straight out of the box. I got stuck at some point, referred to the manual and moved forward. Then I got stuck on something else, picked up the manual again, and the cycle continued. I am still on the learning curve, but hey, it works. I am confident you would have done it the same way.
Classroom-based learning in organisations doesn’t always work because it follows a linear pattern. Sequester a bunch of people in a room for three days and teach them about the product you are going to launch next quarter. What happens next? Where is the continuity? What if I get stuck? Where is my cheat sheet? How can I obtain tacit knowledge, such as my product’s advantage over the competitor I am going up against in a bid next week? All these questions remain unanswered.
Again! A huge pause in my blogging life. Personal and professional problems took over everything. My reading was also sporadic and limited (only small blogposts). I am trying to restart with a longish post (coming next). I’ll also publish a few link blogposts soon (gathered during April, May and June).