<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=115389302216927&amp;ev=PageView&amp;noscript=1">
LET'S TALK


    2 min read

    Beyond Simple Likes and Dislikes: How to Really Evaluate E-Learning

    By Andrew Jackson on Wed, Mar 6, 2013

    I don't know about you, but the word evaluation can send a shiver down my spine. For many of us in learning and development it's a word with so many negative connotations that we sometimes fudge it or avoid thinking about it completely.

    I think these negative associations arise because, typically, we take too narrow a view of the word. For most people, evaluation is about whether or not the learners liked the course or the trainer - or the chocolate biscuits served up at break time.

    This kind of evaluation gives us little more than broad, hard-to-quantify opinions. What we really need to do is start thinking about evaluation as a means to identify what's effective about a piece of learning. And what's not.

    If we adopt this broader view of evaluation, then it has a place through the entire design and development process, not just at the end. This is true for any kind of learning, but is especially true for e-learning.

    I say that because, unlike classroom training, e-learning is more time-consuming and more expensive to refine once it's been created. If you are evaluating its potential effectiveness at every stage in the design process, it's much more likely to hit its target first time, avoiding the need for costly revisions.

    We can take a leaf out of the usability designers' book here. They do something very close to what I'm about to describe with website design. It's a simple, practical exercise which frequently gets overlooked or skipped over in a typical e-learning design process.

    Work 1-to-1 with some typical learners
    This is something you should do while you are still in the prototype or storyboard stage of development. The only difficult parts are getting access to a learner or two and co-ordinating diaries. I say 'only'. I know those can be two major difficulties. But it's worth persisting, because the dividends this exercise pays are tremendous.

    Sit with the learner. Have them evaluate the prototype or storyboard and give you their feedback. There are various things you can look at. How clear or understandable is the content? Are the proposed interactions or activities relevant and meaningful? Can they make sense of the overall interface and the specific navigation?

    Do this with a handful of learners and you'll very quickly get a sense of what is problematic or confusing for everyone and what is just a subjective opinion held by a single individual.

    You'll need to be a good note taker, because you'll usually get plenty of valuable comments which you won't want to forget. Better still (with the learner's permission), you might consider recording what they have to say.

    Jakob Nielsen tells a funny story about how website designers react the first time they do an activity like this. The first user is wheeled in and starts to look at the design. Some things just don't make sense. 'They must be a particularly stupid user, not to get that,' thinks the designer. Then the second user is wheeled in. Same problem with the design. Then the third. Same problem again. And so on. Until the designer 'gets it' and the penny drops: their design is the problem. Not the intelligence of the users.

    And that's the beauty of carrying out an exercise like this, during your e-learning development. It strips out any ego that might've found its way into the design. It forces you as the designer to really see how the learners react to it. In the end, this helps you make changes that your learners will only thank you for.
     

     

    Topics: Instructional Design, Measurement and evaluation, e-learning
    3 min read

    An Olympics Confession, Improving Performance and the Power of Kaizen

    By Andrew Jackson on Tue, Aug 21, 2012

    It's time to confess. In July 2005, when we learned we would be the hosts of the 2012 Olympic Games, I wasn't that fussed. I wasn't anti. But not being much of a sports fan, the excitement mostly passed me by.

    Little did I think that, 7 years later, I would be cheering Team GB along and delighting in the fantastic achievements of the winners and empathising so much with the losers.

    In case you're wondering, I haven't suddenly become a devoted sports fan, but I couldn't help being swept up by the interest we all have in seeing truly remarkable individuals succeed. And the L&D bit of my brain couldn't help being fascinated by how this group of people had achieved so much stunning success.

    Actually, my interest started a couple of weeks before the Olympics with Bradley Wiggins winning the Tour de France. (Another confession - I'd never heard of the bloke until about a week before the Tour de France started).

    In the deluge of press coverage following the competition, we started to get some insights into how that fantastic win came about.

    Several things grabbed my attention. First, when Wiggo and team announced their ambition, most people thought they were bonkers. Second, not only have they proved those doubters wrong, they have done so far sooner than even they had imagined they could. Finally, 2011 had been a truly abysmal year for them and anybody looking on from the outside would probably have laughed even louder at the possibility of them achieving their stated ambition.

    So what changed? What turned things around so rapidly and so decisively?

    I can't claim to have the absolute scoop on all this, but here's what I gleaned from watching interviews on TV and reading articles in the press.

    That truly abysmal year I just mentioned was the catalyst for change and, ultimately, success. It was reaching a terrible, crushing low in their performance that forced the team to step back, re-assess and re-think their entire approach.

    They went against conventional wisdom. From what I can understand, the conventional wisdom in the cycling world is that you get better by being in lots of competitions. That seems intuitive, doesn't it? Practice makes perfect, after all.

    They decided to go for the counter-intuitive: cut back on the number of competitions and focus instead on training and preparation for the competitions they were going to enter.

    They completely re-engineered their approach to training and preparation. This involved breaking the entire process down, examining every aspect in detail and squeezing performance improvements out of every last bit of it.

    This, it turns out, is the secret of Team GB's success, too. They refer to it as 'the science of marginal gains'. Dave Brailsford sums it up nicely in a recent BBC interview:

    "The whole principle came from the idea that if you broke down everything you could think of that goes into riding a bike, and then improved it by 1%, you will get a significant increase when you put them all together. There's fitness and conditioning, of course, but there are other things that might seem on the periphery, like sleeping in the right position, having the same pillow when you are away and training in different places. They're tiny things but if you clump them together it makes a big difference."

    The Japanese were the pioneers of something very similar in the world of business - you may have heard of kaizen. It's the 'continuous improvement of working practices'.

    Two things strike me about all this. First, most employees in most organisations are taught to fear failure in their day-to-day work almost as much as they fear receiving a redundancy notice. In fact, for many, the two are inextricably linked. If the first happens, the second will almost certainly follow.

    Yet, as the example of Team Wiggo shows, failure is sometimes the most powerful motivator for subsequent success. Nobody wants or sets out to fail. It feels awful when it happens and it can be soul destroying. And I'm certainly not suggesting organisations should go around encouraging their employees to fail.

    But, I'd bet a fairly large sum of money that organisations which take a grown-up view of failure are better places to work and, overall, end up being more successful.

    Second, because employees fear failure so profoundly, most follow conventional solutions. So in many organisations, everyone just chugs along in quiet desperation. Everyone knows it could be so much better, but who's going to rock the boat and suggest outrageously unconventional change? Only a brave soul, but oh boy, the ones who do are likely to reap the benefits.
    Topics: Instructional Design, Learning Psychology, Measurement and evaluation
    1 min read

    Instructional Designer Training: Integrating Practice in Your Design

    By Pacific Blue on Fri, Mar 16, 2012

    When we struggled to learn things or carry out new tasks as children, it's more than likely that our parents or teachers reminded us that 'practice makes perfect' or told us to keep going and to 'try, try, try again'.

    As adults, we might find those phrases irritating (or downright annoying); but, much as I hate to admit it, mums and teachers really did know best!

    This is borne out by some research into the use of practice activities in e-learning. It should be interesting to anyone involved in instructional design or instructional designer training.

    Comparing learning from two versions of an e-learning course (one offering more practice activities than the other), researchers found that the version with more practice activities increased learning for both higher and lower ability learners.

    In this study, both lower and higher ability learners who took the version with more practice activities scored 15% higher on end-of-course tests than those who had taken the version with fewer practice activities.

    So it does seem that if higher learner achievement is a key goal (and surely it will be), broadly speaking, more practice will mean better learning outcomes.

    The other key point in relation to practice activities is the pacing of learning through a course. A good many studies have been carried out around this. The research has consistently highlighted two key points:

    First, that spacing practice activities through a course really is more effective.

    Second, the benefit does not show up immediately. Longer term studies have revealed that over a period of several years, spaced practice leads to much better long-term retention of learning.

    Thinking about how more practice might help to make perfect within your learning design projects? Check out our impact and instructional design programme options. Flexible and modular. ITOL accredited.
    Topics: Instructional Design, Course Design
    3 min read

    The Trouble with the Virtual Classroom - Logged in and Gabbing Away

    By Andrew Jackson on Mon, Mar 12, 2012

    It's 10.30 at night. I'm sat in front of my computer logged in as a participant on a webinar being run in the US.

    No idea how many other people are taking part in this 'training' - but after two and a half hours of non-stop lecture and 30 minutes to go before we reach the finishing line, my attention levels are at a low point. I've already snuck off to the loo a couple of times and made several cups of tea.

    While the overall quality of the content is very good, the presentation is terrible and the interaction with the presenters zero. They spend so much time labouring each topic that it's easy to slip away for a couple of minutes and not have missed much by the time you come back.

    If you've had an online 'training' experience like this one, you'll understand why there is frequently such a gulf between an organisation's enthusiasm for the virtual classroom and the learners' lack of willingness to engage with it.

    And organisations seem to be very enthusiastic. In 2009 an ASTD survey revealed that 23% of companies surveyed were using online learning compared with only 10% in 2003. An IITT survey here in the UK last year showed that 44% of companies surveyed anticipated making greater use of live online learning.

    You can understand why. The cost and time savings are compelling and the technology is relatively cheap and easy to deploy. "What's not to like?" would be the question on the lips of many advocates.

    In theory it should be attractive to learners, too. The opportunity to get more focused, bite-sized training with minimum inconvenience would seem like a no-brainer.

    In practice, as my experience as a participant shows, for many learners online training is little more than a live, scheduled version of a deathly boring e-learning module - with minimal interaction.

    But, as always, it doesn't have to be this way. There are three key factors involved in making live online training successful:

    1. Is it a meeting, a webinar or a training session?
    Seems obvious when written down in black and white, but many people planning and running live online sessions are not particularly clear on what type of session it is they are running.

    An online meeting is a virtual equivalent of a face-to-face session where issues are discussed, information is shared and decisions are made.

    A webinar is much closer to a live seminar or lecture - largely presenter-focused, with potentially large numbers of attendees and limited interactivity between presenter and audience.

    A live online training session should aim to replicate what happens in a classroom. Trainer led, but with plenty of opportunity for activities and interactions between trainer and learners and between learners working in pairs or small groups.

    2. Is your trainer ready for the challenge?
    Understandably, many trainers feel that they can never replicate the real classroom experience online. You can't see the participants, you can't read their body language. They can't see you. How can it possibly work?

    The reality is different. If you take exactly the kind of care you would over preparing and running a classroom event, it's perfectly possible to run a thoroughly engaging, effective and successful online event. It does take time, however, to adjust to a very different medium and understand how to do things somewhat differently in the virtual classroom.

    3. Do you understand how to fully operate your virtual classroom?
    One critical success factor is familiarity with the software you are using to deliver your virtual classroom event.

    Unless you understand all the features and functions available to you, you'll never be able to design and deliver the best event you could. And this is no different, by the way, from taking into account all the different facilities, equipment and aids you have available when designing and delivering a classroom course.

    One final point - not all software is created equal. Some systems have more sophisticated ways of enabling interaction than others - something to be very aware of when making a buying decision.

    If your organisation is one of the 44% thinking about making use of (or perhaps wanting to make better use of) the virtual classroom, and you'd like to get best practice principles and hands-on experience of running a virtual event, we have several modules about virtual classroom design available in our impact and instructional design programme.
    Topics: Instructional Design, Course Design, Train the Trainer
    2 min read

    How Instructional Designers Can Manage Out of Control SMEs

    By Andrew Jackson on Fri, Feb 24, 2012

    We can all feel our pulse quicken and our emotions rise when we get the chance to talk or write about a topic that engages us totally.

    And we usually know lots about this topic. We can frequently talk about it for hours without getting bored. We can tell anyone willing to listen about its every last detail. In that sense, we are all subject matter experts (SMEs) in something.

    As instructional designers, when we have a talkative SME in front of us and limited time to get the information we want, it's worth remembering how our own passion for a particular subject matter can allow us to get carried away.

    So aside from being more empathetic to a talkative SME, is there anything else we can do to make our time with them more productive? I think there are four areas to consider when gathering content from SMEs. By the way, the greater the quantity of content you need to gather, the more you are likely to want to formalise the approaches below.

    Ownership
    Before any information gathering even happens, you need to take ownership of the process. This may involve becoming more assertive than normal: be quite specific about how you want the process to unfold, including the number of meetings you'll need, how long each meeting should be and how much time you'll need between meetings for reviewing and feedback.

    Planning
    Tempting as it might be to go into your early meetings knowing nothing, it's better to do some research to familiarise yourself with the subject matter area. Spend time creating a basic project plan. Clearly define your and their roles in the whole process. Formally identify the risks of not getting the required information in a timely fashion and communicate this to the project sponsor.

    Connecting
    Your initial research can pay dividends once you start interacting with your SME. Exhibiting some knowledge of his/her topic can help build rapport and, more importantly, establish your credibility. Earn trust by emphasising the confidentiality of your information gathering sessions and by promising a review of the content before it is made more widely available.

    As the content gathering progresses, aim to establish points of shared interest both within the subject matter area and outside. Most people appreciate a little interest in their life outside work.

    Focusing
    Set an agenda in advance of the meeting, clearly stating goals and expectations.
    During your content gathering sessions, regularly paraphrase, clarify and summarise what you have covered; use closed questioning techniques if your SME has a tendency to go off on tangents. After the session, collate the content into a structured document you can share with your SME for review and feedback.


    It's easy to dismiss some of the subject matter experts we deal with in our professional capacity as out of control windbags who want to bore us and our learners with every last detail of their knowledge.

    That may be true. But let's not forget, given the right topic and the opportunity, many of us can happily do the same.

    So with a bit of empathy and some detailed preparation and work before, during and after your content gathering, the analysis phase of your project need not be an out of control nightmare.

    If out of control SMEs are your current nightmare, check out the Analysis and Planning modules in our instructional design programme for help on dealing with this problem.
    Topics: Instructional Design, Course Design, e-learning
    1 min read

    Does Compliance E-Learning Have to Be Boring?

    By Andrew Jackson on Tue, Feb 14, 2012

    At a conference a year or so ago, I noticed that a seminar which drew a good crowd was entitled, "Who says e-learning compliance training has to be boring?". Well, not me, for sure.

    Perhaps I'm a bit naive, but even now (after many years in the world of learning) it shocks me that some people can shrug their shoulders and say, "Well, this material is pretty dry and boring, so we'll just have to accept that the way we deliver it is dry and boring". To me that's a bit like the designers at a car company saying, "Well, it's a bit difficult to design a really comfortable car seat, so we'll just fit the car with uncomfortable wooden benches and the passengers will have to lump it."

    Perhaps the acceptance of poor quality compliance training is linked to the box ticking mentality that often accompanies the dreaded 'c' word. We have to do the training - even though nobody wants to, so let's just collectively hold our noses and all be bored together - designers, trainers and delegates. Oh yes, and let's make it even worse, by delivering it as the most boring, sleep-inducing piece of e-learning you have ever seen.

    From this point of view, you'd think that compliance in a particular job role or organisation wasn't something anybody really needed to know or do. Yet it most definitely is.

    So rather than treating it as a dry abstract topic, why not relate it back to the context or contexts in which learners need to be compliant? Why not provide the learners with challenging, life-like scenarios and activities that require them to think about what they actually need to do to be compliant? How about some intrinsic, contextual feedback that vividly demonstrates the consequences of not being compliant or trying to cut corners?

    If you find yourself nodding your head in agreement but you are feeling a bit unsure about the effectiveness of your e-learning in general and your compliance e-learning in particular, take a couple of minutes to complete our E-Learning Impact Scorecard which can help you benchmark what you are doing against some of our recommended best practices. 
    Topics: Instructional Design, Course Design, e-learning
    2 min read

    Designing Training Programmes: What About Learner Self Awareness?

    By Pacific Blue on Mon, Jul 18, 2011

    When we are designing training programmes, how much should we consider learners' self-awareness of their learning preferences?

    At the risk of doing a Donald Rumsfeld (he of the 'known unknowns'), one of the things that I find fascinating about learning and knowledge transfer is whether we know what we know.

    In other words, how much are we really able to assess our own learning needs and preferences?

    On this topic, I offer you a fascinating piece of research carried out by a group of people with the snappy surnames of Schnackenberg, Sullivan, Leader and Jones.

    In their research, a group of learners taking an e-learning course were given a survey about their preferences for the amount of practice they do when learning - either high or low.

    The learners were then assigned to two different e-learning courses: one with a high level of practice, the other with minimal practice.

    Half the learners were given the version of the course that matched their preference, the other half were deliberately mismatched.

    I've written previously about the significance of practice activities in learning, so you may not be surprised to discover that regardless of their preference, those who took the version of the course with more practice scored significantly higher on a post-course test than those who had taken the version with minimal practice.

    First of all, this highlights the importance of practice activities in learning. But the results are important for another reason. They chime with quite a bit of other research that points to a frequent mismatch between what we think we want as learners and what actually produces results.

    In other words, our perceived preferences about how we like to learn are not always good indicators of the way we actually need to learn.

    If you are new to the world of learning and instructional design (or want to enhance your existing skills) then our modular impact and instructional design programme is very focused on practical tools, techniques and principles designed to get you better results from the training courses you design. The programme is ITOL accredited too, so you have the option to get a meaningful certificate or diploma from it upon completion.

    Topics: Instructional Design, Course Design, Learning Psychology
    2 min read

    Does Motivation Play a Role in Our Learning and Development?

    By Pacific Blue on Wed, Jul 6, 2011

    Is a successful learning experience purely about external factors or do our own internal beliefs and motivations play a part?

    We've all had good and bad learning experiences, so this is a fascinating question. How much is that success or failure purely down to external influences?

    If we go back to the 1930s, Thorndike's Law of Effect holds that a correct answer needs to be rewarded with a response. A 'Well done, that's the right answer' from the trainer helps strengthen the association between the question and the correct answer and increases the probability of a similar correct response the next time around.

    I think most people in the world of learning and development would broadly agree with this view. But this emphasises the external environment. What if we also put an individual's beliefs at the centre of the picture? It's likely that we then have several other factors to take into account.

    1. Beliefs about yourself
    Do you believe you can succeed and acquire the knowledge and skills you are setting out to learn? This level of belief varies tremendously and is influenced by existing knowledge and experience. Go outside of familiar territories and domains and it is likely our self-belief and confidence will plummet.

    2. Beliefs about the learning content
    Is the content interesting? Genuine personal interest makes learners far more willing to engage with content - even when it's dull and boring. If personal interest is low or non-existent, then we need to create situational interest. In other words, grab learners' attention and interest by making sure the learning content is well-crafted and engaging.

    3. Beliefs about the success or failure of learning
    Do learners believe the outcome they achieved was under or outside their control? Do they believe it was a poor trainer that caused them to fail or sheer good luck that they did well? Whether the outcome is positive or negative, research into something called attribution theory suggests a learner who believes an outcome was caused by factors outside their control is far less likely to be motivated to succeed in the future.

    By contrast, a learner who attributes success or failure to their own effort (or lack of it) is far more likely to be productive and put in more effort next time around.

    This suggests it is hugely important to foster an environment that encourages learners to understand (and believe) that the success of learning outcomes is clearly within their control.

    Of course, all of this is just scraping the surface of an immensely complex (and very interesting) area. But it's a good reminder that we shouldn't just focus on external factors (important as they are) when thinking about how to achieve successful learning.

      

    Topics: Instructional Design, Course Design, Learning Psychology