Some ideas for improving Performance Support
"Great trainers used to be defined by great delivery," Rossett says. "They were magnificent in the classroom. Nowadays, they have to be magnificent in the results." www.workforce.com
This Workforce.com quote once again illustrates that changed performance is the essential purpose of training. That purpose is easy to lose sight of amid the everyday bustle of creating content and the difficulty of pinning down valid performance measures.
Much of the time, the feedback we receive supports the theory that training does help, that our trainees are more successful, faster. That success suggests it is time to raise the bar. If we can get great feedback with only the most rudimentary forms of performance measurement, just think of what we could accomplish if we could actually measure success on the job.
In my experience, two things stymie performance measurement and support in a Learning and Development department: client focus and measurement abilities.
Much of the time, training is ordered: a supervisor is having trouble, senior management is starting a new initiative, what have you. We do our best to push back, to do a needs analysis and a task analysis to define the root problem and figure out what to do about it. My contention is that speaking with the learner during the analysis is already too late. We need to get in front of the problem, constantly supporting and communicating with our learner population. That way we can provide performance support tools and modify our training before problems arise. If we wait, the supervisor or manager must frequently detail a plan before getting the go-ahead to request training from L&D; going back for new approvals based on the needs analysis can be too time-consuming, or can increase the chance of a final veto.
This proactive strategy has problems of its own, but they must be surmounted if we are to push ahead as performance technologists.
The other problem is our measurement abilities. How can you judge training's impact on sales figures when your salespeople transfer between products (requiring re-training), quit, come from different backgrounds, have different-sized territories, serve different markets, and are judged by different success metrics?
My response is to hire a statistician. Your local university should have one tucked away in the business, psychology, sociology, or anthropology department. These disciplines understand the complexities of measuring human performance.
A statistician should be able to help you sort through the available data and decide what new measurements are needed. (And tell you what other measurements you could use when you can't get the reports you need.) Then they could analyze the data and coach you on what it means. The good news is that the consultation about the needed data would only be required the first year, with occasional updates as the department's focus changes.
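To make that concrete, here is a rough sketch of the kind of analysis I have in mind, written in Python with the pandas and statsmodels libraries. Everything in it is hypothetical: the file name, the column names, and the choice of controls would all come out of that first consultation with the statistician. The idea is simply a regression that estimates the training effect on sales while holding the confounders listed above constant.

```python
# A minimal sketch, assuming a per-salesperson export with one row each.
# All names (sales_by_rep.csv, sales, trained, tenure_years,
# territory_size, market) are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sales_by_rep.csv")

# 'trained' is 1 if the rep completed the program, 0 otherwise.
# The control terms absorb differences in experience, territory size,
# and market, so the coefficient on 'trained' (roughly) isolates the
# training effect rather than those pre-existing differences.
model = smf.ols(
    "sales ~ trained + tenure_years + territory_size + C(market)",
    data=df,
).fit()

print(model.summary())
# The coefficient on 'trained' estimates the average sales difference
# associated with training, holding the controls constant; its p-value
# and confidence interval indicate how much weight the estimate can bear.
```

A real statistician would push well beyond this (better controls, checks on the model's assumptions), but even a sketch like this shows why the raw before-and-after sales numbers aren't enough on their own.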
The analysis itself should take very little time as well (probably less than a day). For around $1,000 the first year, you could likely get real data to use in departmental strategy, budget requests, and so on. It's also likely that those solid numbers could justify spending much more on learning and development.