The Effectiveness of Learning – What Does Success Mean to You?

Success

This post follows on from “Back to Basics”, which outlined some key considerations for learning effectiveness and called for some refresher thinking about the fundamentals of what we do within our profession, how we do it, why, and how successfully we do it.

I’d like to reflect on “success” because, depending on context, success can represent a variety of outcomes.

From a learning and development function perspective, the ultimate value to an organisation is the achievement of pre-determined success. But how often do we not even consider what that success might look or feel like? So often, learning and development people are almost “led” into providing a learning intervention (or course) because it’s perceived to be the answer to a performance problem, or simply because “that’s what we have always done”.

Admittedly, there can be a dilemma here. On the one hand, we want to “get business” into the Learning and Development area, to be busy, seemingly wanted, important and well liked, so we accede to requests (and effectively sell ourselves short); on the other, we need to stand our ground when we believe learning will not solve the problem presented.

Perhaps the most valuable question we should be asking up front, not just of ourselves but of the perpetrator of the request, is “How will you know when this is successful?”. This question forces a response, but it’s the quality of the response that’s vital.

Hopefully, the response is grounded in performance terms, because that sets the foundation for the vitally important expectation and risk management aspects of the overall project management needed to design and execute the intervention efficiently.

Unfortunately, the response can be nebulous at best and meaningless at worst. Here are some simple examples that highlight the risk.

“How will you know when this (learning) is successful?”

Response: All staff will complete this training by December – the new LMS will prove it.
Risk: High
Comment: No indication of improved performance and no compelling reason to provide quality material.

Response: All staff will pass the on-line assessment of this e-learning course by December.
Risk: High
Comment: Although some level of success is suggested (passing the course by December), the assessment is on-line and may not demonstrate actual job proficiency.

Response: Every classroom will be full from now until December.
Risk: High
Comment: Justification for utilising classroom space appears to be the driver for success.

Response: Errors in processing will reduce.
Risk: High
Comment: With no measure of the extent of error reduction, it is difficult to gauge success, or whether the cost of providing the training exceeds the financial benefit attached to reduced errors.

Response: Sales will increase.
Risk: High
Comment: But by how much, and at what cost?

Response: Customers will be happier because staff will know what they are doing.
Risk: High
Comment: How does happiness translate into real benefit? How will we know when staff know what they are doing?

Response: Our customers are asking for well trained staff.
Risk: High
Comment: And the underlying problem is? How will we know when customers are satisfied?

Response: Customers complain about the Help Desk, even though we have our best people on it.
Risk: High
Comment: Suggests there could be a problem with the phone system, process, priorities, wait times, responses, or access to customer data or product specifications.

Response: We thought the people we are recruiting had the right background, but now we need to train them.
Risk: High
Comment: This could easily be a job design or specification problem that translates into a recruitment and selection problem.

Response: This will cut our OH&S costs – any reduction is a good thing.
Risk: High
Comment: Surely not if it costs more than “any reduction” to provide the training. The response suggests the organisation may be more interested in reducing costs than in improving the safety of its staff.

Response: This training is to be completed by everyone for the roll-out of our new customised Customer Support System because everyone needs to know what it’s about – and we get it free from the system vendor.
Risk: High
Comment: Sounds like a bargain! Perceived success seems to be based on making use of the freebie. But is it all necessary? Do all users of the system need the same amount of training? Do all topics suit all users? Why do non-users need to complete the training? If it’s free, it’s likely to be very generic. Can it be customised? Is the system easy to use? What will the nature of the calls to the internal Help Desk be? There are many questions to ask before arriving at some meaningful measures of success.

Response: By December, errors will be reduced by 30%.
Risk: Moderate
Comment: A better measure, but some types of errors could be overcome by different interventions (e.g. form design), so applying a learning intervention may not be successful.

Response: Following refresher training, we will see staff confidently using the new guidelines provided to them when speaking with customers on the phone, and by March next year customer complaints about product “x” will reduce by 30%, saving us $100K.
Risk: Low
Comment: This is a better response. There are clear and measurable performance outcomes, suggesting that more analysis of the problem has been undertaken. We have something to work with. The business has an expectation of success, and we can satisfy it through effective learning (and deliver real value). A possible solution is a combination of structured (formal) learning (the refresher referred to) and ongoing performance support (unstructured learning) interventions.

Do these examples sound/feel familiar?

Securing some common ground around what constitutes success and how to measure it is one of many initial steps, but, for me, it heads the list: unless success is clearly articulated, we run the risk of being unable to demonstrate our value contribution.

Worse still, if the problem is not resolved because the learning intervention was unnecessary, unwarranted or wrongly directed, be assured you will soon hear “we spent a lot of money on training that didn’t work”.

The next post will discuss needs assessment, expectation management and risk management in more detail.