
    Recent posts by Andrew Jackson

    2 min read

    Lost in translation: why tech jargon drives me nuts

    By Andrew Jackson on Wed, Aug 27, 2025

    I’ve long been interested in technology and its potential to make life easier and more efficient. However, I’m definitely not a pointy-headed tech nerd. Tech jargon and user-unfriendly pieces of software drive me insane.

    One of my particular pet peeves is the tech world’s ability to make something simple and everyday sound much more complicated than it actually is.

    One of my favourite toe-curlers is ‘boot up’ or ‘reboot’. For the ordinary folk in the room, ‘start’, ‘switch on’ or ‘restart’ seems like a perfectly reasonable and well-understood alternative.

    And it’s amazing how, just when you thought you had heard all the weird and wonderful terms that the tech world can come up with, you discover a new one.

    I honestly don’t know if this has come into use recently with the arrival of AI or if it has been around for a while and I’d just never come across it before; but a newly discovered tech term for me is ‘ingest’ or ‘ingestion’.

    Now the standard usage, of course, would be in a biological or medical context.

    But in the best traditions of tech jargon, this slightly obscure verb/noun combination, which is definitely not used much in everyday speech, has become the standard way to describe how a file or doc has been uploaded and chunked by a GPT’s knowledge base.

    Granted, it is a term that most people will more or less understand; and, yes, there is some logic behind its use. In a biological/medical context, of course, it describes the full consumption/digestion of something into something else; and this is clearly the intent behind the tech usage.

    But really? Couldn’t we just say, the files have been uploaded and chunked?
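    For the curious, here’s roughly what ‘ingestion’ boils down to once you strip away the jargon: the uploaded text gets sliced into overlapping chunks ready for indexing. This is a minimal illustrative sketch of my own, assuming a simple fixed-size character chunker; real systems typically split on tokens or sentences instead.

```python
def chunk_text(text, chunk_size=200, overlap=20):
    """Slice text into overlapping fixed-size chunks (by character count)."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap  # step forward, keeping a little overlap
    return chunks

# 'Ingesting' a document is then just: upload, read, chunk, index.
document = "Some long file contents... " * 40   # stand-in for an uploaded doc
pieces = chunk_text(document)
```

    The overlap is there so that a sentence falling across a chunk boundary still appears whole in at least one chunk. In other words: the files have been uploaded and chunked.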

    Of course, the annoying thing about all of this jargon is that once it’s established, if you are working in a tech context in which it is used, at the very least you have to make the effort to understand it; and (shudder) eventually you’ll almost certainly find yourself using it.

    In short, sometimes you have no choice but to learn to speak the tech ‘lingo.’

    If you have read some of my recent posts, you’ll know that I’m in the early stages of building an AI-powered app for L&D called PerformaGo. I’m also keeping a diary-like blog which is documenting the journey ‘behind the scenes.’

    I’ve divided the diary into 4 broad sections, one of which is called, ‘Learning to speak API’. It’s all about understanding, unpacking and explaining the tech ‘lingo’ and concepts so they make sense to me and to ordinary readers like you.

    So, why not follow along if you are interested? And if you’ve got any particular tech-speak pet peeves, I’d love to hear from you.

    Topics: Learning Tech
    3 min read

    The Gold Beneath The Gloss: Unlocking Software's True Potential

    By Andrew Jackson on Fri, Aug 22, 2025

    This isn’t the first time that I’ve made reference to what I call Storyline’s ‘secret weapon’. A Storyline ‘advanced’ feature that gets buried away in the background because it’s neither very ‘sexy’ nor particularly easy to market. Yet learning to use it can transform the e-learning courses you build.

    In case you are wondering: I’m talking about Storyline variables.

    However, the purpose of today’s post is not to write extensively about Storyline variables. (If you are interested, I have written about Storyline variables here).

    What interests me today is the fact that lots of software/technologies often have some kind of ‘secret weapon’. In other words, a hidden something that a majority of users don’t know about, which is incredibly powerful or useful but somehow never gets the attention it deserves.

    Typically, the hidden something is conceptually abstract or technically a bit complex; so, it takes some care and effort to explain it clearly and easily. And it often takes some care and effort by software users to really reap the benefits of it.

    Which usually leaves it friendless. Tucked away in an obscure corner of the interface by UX and UI designers. Shunned by the marketers (just not ‘sexy’ enough). Glossed over by the technical writers and technology evangelists.

    And you can understand why. In the push to get a piece of software or a technology widely adopted as quickly as possible, the most popular and easiest to use features are, inevitably, going to get the most attention.

    And in lots of ways, that is a good, user-centred approach. After all, if the software or technology in question isn’t obviously solving some kind of problem or making it easier or possible to do something that was previously difficult or impossible to do, what’s the point?

    But this also misses an important point. Hiding that more difficult-to-explain feature probably results in inferior user output. Returning to the example I opened with, this is absolutely true in the case of Storyline variables. It’s not that you need to use them in every single course that you create. But never using them at all will absolutely reduce the creative potential and the effectiveness of your instructional design.

    So, why am I obsessing about all this, at the moment? Well, you may already know that I’m working on a new software tool for L&D called PerformaGo. And the AI technology that this software is designed to help L&D folk like yourself unleash, definitely has its own buried treasures. (Something I’ve been writing about here). In essence, it’s about how you provide specific knowledge to your custom GPT so that you get reliable, accurate output when it’s being used by a learner.

    Which means that this particular ‘buried treasure’ is going to play a very significant role in making the PerformaGo tool a success. The question of how that specific knowledge is packaged up and accessed can’t be glossed over, in the hope that some users will find it and work out how to use it. It needs to be front and centre.

    And that will be a challenge. Because in some respects this element is a bit abstract. It will definitely require some careful thought around how it is presented and explained to users within the software interface itself and in any related ‘help’ content or software onboarding.

    But once you consider the benefits for L&D folk of getting this ‘buried treasure’ front and centre, the challenge of achieving that goal seems small by comparison.

    If any of this interests you, or you like the idea of becoming an early adopter or pilot user of PerformaGo or you would just like to find out some more about the tool, why not join our early-bird waitlist.

    And you can follow a more ‘behind-the-scenes’ take on the journey to build the PerformaGo software tool in my online diary.

    Topics: Performance Support Learning Tech
    2 min read

    Instructional design success: it's within your grasp...

    By Andrew Jackson on Wed, Aug 13, 2025

    Regardless of the skill, the expertise or the situation, when we compare where we are currently with where we would like to be at some point in the future, the journey from point A to point B often feels pretty daunting.

    If you are looking with dismay at your current e-learning output and thinking about doing something more effective and interesting for your learners, I get that you might feel like you simply don’t have the time to do anything any better.

    But here’s the dirty little secret of e-learning development.

    It takes pretty much the same amount of time and effort to produce a really dull piece of e-learning as it does to create a really effective and interesting one.

    I’d encourage you to re-read that last sentence. Because many people think I have become slightly unhinged when they hear me say something like that.

    But here’s the thing that most people don’t realise.

    It’s the development bit of e-learning that is the most time-consuming. All the pointing and clicking in the authoring tool software is always the biggest and longest part of any project. Typically, I’d say that instructional design represents about 35% of the project time and the development represents the other 65%.

    And here are the two really crucial points. First, the instructional design phase is never that long, anyway. Second, it’s going to take much the same percentage of time, regardless of whether you do it poorly or brilliantly. 

    In other words, it’s about how you approach that instructional design phase and the tools and techniques you use while you are in that phase that make the difference, not the total amount of time you spend on it.

    Once you’ve made the mindset shift from thinking about your e-learning as predominantly knowledge-presentation to something more task and skills-focused, all you are doing is using different tools and techniques to ensure that the time spent on that instructional design phase produces something much more effective and creative.

    Now, a skills and task-focused piece of instructional design might take a little bit longer to implement in your authoring tool. But really not very much. And the payback you will get in terms of improved effectiveness and impact will easily outweigh that small bump in development time. 

    So if you think that creating really effective e-learning is out of reach, think again. It’s about designing smarter not longer.

     

    Looking for help with making your instructional design smarter and more effective? Take a look at our impact and instructional design programme.

    Topics: Instructional Design e-learning
    2 min read

    Supporting workplace 'Moments of Need'

    By Andrew Jackson on Tue, Aug 12, 2025

    I think there are many people in an L&D role who spend their days quietly tearing their hair out in frustration. They are what I call the order-takers. They came into the profession, like most of us do, because they like and care about people. They believe that learning can (and should) make a meaningful difference in someone’s professional life and career. Perhaps they experienced this in their own lives and wanted to help others achieve the same.

    But somewhere along the way, something went horribly wrong. The job they thought they would be doing wasn’t the job they actually found themselves doing.

    They became a harassed (and not very well-respected) internal supplier. Taking orders for courses and workshops others demanded from them. Courses and workshops that satisfy the demands of the order-giver but don’t do much to benefit the learners or the organisation they work in.

    Perhaps this describes your situation right now. Or perhaps you’ve been there, done that and escaped to pastures new. Either way, being caught in a cycle of delivering training that doesn’t really solve problems and doesn’t really improve workplace performance is deeply frustrating and ultimately, very demotivating.

    I’ve worked with scores of L&D teams over the years and witnessed people caught in the order-taking trap in a variety of sectors and industries. Almost everyone wants change. But how? How can we make the impact and build the influence that we keep saying we want, if just delivering excellent training is not enough?

    The truth is, we need to get better at enabling performance. Which means getting much, much closer to our learners’ real moments of need. For example, when someone is:

    • facing a new challenge on the job
    • making a decision under pressure, or
    • trying to apply a skill they should remember, but can’t quite recall

     

    Those are the moments where performance can either bumble along as always or start to excel given the right support. Things like a timely nudge; a helpful prompt; a short, smart answer that moves someone forward, in the flow of work.

    I learnt the importance of supporting those moments of need about 15 years ago, during a workshop run by Jim Kirkpatrick. And since then, I’ve spent years deeply frustrated by the fact that the concept is simple; but effective implementation of that concept is not.

    However, that frustration is no more. Right now, I’ve finally found a way to do something I’ve wanted to do for years. A way to help L&D professionals design and deliver intelligent, contextual support for those very moments of learner need just described above.

    This is not about trying to replace the learning experiences we already do well. It’s about adding a layer of performance-first thinking that gives our work more credibility, more relevance, and yes, more respect.

    The future of L&D isn’t about smarter content. It’s about smarter integration with how workplace performance actually flows.

    So, this is my new mission. And over the coming weeks, I’ll be sharing more about the journey that’s got me here — and the tools we’re building to help others add that layer of performance-first thinking.

    If any of this resonates and you’ve felt the same frustrations, then I hope you’ll come along with me.

    And if you would like a more personal take on this new mission of mine? I’m keeping a ‘behind the scenes’ diary of this new journey. You can follow the story here.

    Topics: Performance Support Learning Tech Learning Impact
    3 min read

    From Training Delivery to Performance Improvement

    By Andrew Jackson on Tue, Aug 5, 2025

    What I really want, what I’ve always wanted, is for L&D to actually make a difference.
    To stop being side-lined. To stop doing good, hard work that goes unrecognised.

    L&D professionals genuinely care about helping others grow. We want to have impact. We want to make a difference. But time and again, we share the same frustration:

    “We’re doing all this work and the business still doesn’t recognise our effort.”

    It’s a painful feeling. One I’ve heard expressed countless times across countless courses and workshops. But over the years, I’ve also had to face up to an uncomfortable truth:

    That lack of respect feels unfair but sometimes, it might be the symptom of a deeper problem we haven’t fully acknowledged.

    What do I mean by that? Well, people come to us with a ‘training’ need and ask us to create a course. If we see ourselves as the “training people” then, naturally, we want to oblige. So, we design workshops. Build e-learning. Roll out programmes. And then we wait for results - that rarely come.

    And the reason those results rarely come? Because, I believe, we’re focusing too much on delivering learning solutions… rather than supporting workplace performance.

    It’s not that the training we are providing is bad. Far from it. But training alone doesn’t move the needle - especially when people forget most of what they’ve learnt before they get a chance to use it. What actually makes a difference is what happens after the training: in the messy, unpredictable reality of work. The part L&D rarely reaches. The part where support and follow-up could really make a difference.

    Now, let’s be honest, the idea of supporting learner performance in the workplace is hardly a new idea. In fact, it’s been around for decades. And we are already very familiar with this kind of ‘just-in-time support’ when using apps and systems.

    But the ways and means to make this kind of support simple, scalable, and genuinely useful in other areas of the workplace just hasn’t been there. Until now.

    Because the arrival of AI, I believe, completely changes the performance support game.

    Of course, AI is being widely used in L&D already. But most of that use is focused on content production. Quicker instructional design. Faster course creation. Automating aspects of e-learning production.

    Useful? Absolutely. Transformational? Probably not. Because speeding up training design and production doesn’t fix the core problem. We don’t need more training, created more rapidly. We need smarter workplace support.

    We need tools that help people in the flow of their real work, not just when they happen to have time for a course.

    We need smarter ways to support problem solving, decision-making, and action-taking
    right at the moment of need.

    Over the last 12 months, I’ve become a bit obsessed with all this and with thinking about how we can turn this new technology into a practical, performance-focused solution. One that L&D teams can deploy easily. One that’s practical, low-friction, and grounded in the work people are actually doing.

    So, I’m working on a new approach. A new platform I’m calling PerformaGo that puts performance support at the heart of L&D.

    If you’re ready to move beyond training delivery and start designing for real-world results,
    join the waitlist and be the first to know more about PerformaGo and when it goes live.

    P.S. Curious about the journey behind this shift in focus?
    You can follow my personal diary, where I share the highs, lows, and learning curve of building an AI-first product from the ground up.

    Read my diary here

    Topics: Performance Support Learning Impact
    2 min read

    Does your e-learning look like a glorified PowerPoint presentation?

    By Andrew Jackson on Tue, Jul 15, 2025

    Forgive me for asking, but does it? Because so much e-learning I see still falls into this category.

    If you’ve answered ‘yes’ (however sheepishly) I’ll guess there’s a good chance that you chose your e-learning authoring software on the basis that it was easy to convert existing PowerPoint presentations into something more ‘interactive’.

    This is still a major selling point pushed by many software vendors. Import your PowerPoint slides into our gloriously quick and easy-to-use authoring tool and then add some interactivity to those slides. 

    Sounds like a miracle, doesn’t it? But guess what? You still have a PowerPoint presentation. It’s just that now it requires the learners to do endless clicking to get through it. 

    Now I know that at this point some people get frustrated with me and go, ‘but Andrew it’s all very well you having a go at me for creating a dull piece of PowerPoint-like e-learning but I haven’t got time to do all that instructional design stuff you think I should be doing.’

    Well, let’s take a step back from that for a minute.

    First, imagine you are the learner. Seriously, would you willingly click your way through what you’ve just produced? Unless you suffer from desperately low self-esteem and think it’s your lot in life to be miserable, then the answer must surely be ‘No’.

    Second, this is a bit like saying, ‘I’m a chef but I don't have time to do proper, authentic cooking. I rely entirely on pre-packaged food and a microwave. So if you come and eat here, you’ll just have to put up with second-rate food’.

    Bet you wouldn’t willingly book a table at that restaurant, would you?

    Finally, do you really expect your colleagues and your learners to respect your skills and professionalism if this is the best you can do? 

     

    Ready to take stock of your e-learning and see if there's some room for improvement? Start with our Discover Your E-Learning Impact Scorecard. It only takes a couple of minutes to complete and you get some rapid feedback on your current e-learning design strengths and the areas that could do with some TLC.

    Topics: Instructional Design e-learning
    2 min read

    Is the LMS market ripe for disruption?

    By Andrew Jackson on Mon, Jun 16, 2025

    I don’t have direct, hands-on experience with a huge number of Learning Management Systems but every single client we work with either has one in place or is in the process of acquiring one. And I’ve yet to hear a gushing recommendation about a single vendor or their system.

    What I do have direct experience of is numerous complaints about value for money, lack of transparency from vendors during the sales process and slow responses to fixing problems or getting customisation completed.

    Put less delicately, our clients frequently tell us that vendors tend to promise the earth, fail to deliver adequately on what they have promised, provide support in their own sweet time and charge an arm and a leg for the privilege.

    I usually hear all this in the context of, ‘can you recommend a good LMS because ours is terrible?’ I've considered a whole host of LMS options over the years sometimes for use by Pacific Blue and sometimes just to keep abreast of the market.

    But I’m never willing to ‘go there’ when a client asks for a specific recommendation because there are so many possible variables involved in how someone might use their LMS that it's impossible to say, 'use that one' without doing a deep dive into their current and potential future needs.

    Where people have only a limited initial need, I have in the past, suggested starting with a basic, cheap option to see how that works. It's often the case that only when you actually start using a system like this are you able to clearly identify what you need and what you don't need. This approach (if feasible) is a great way to test the water.

    But overall, I do marvel at the LMS market. It seems to me it’s hugely overcrowded with lots of look-alike products. 

    Which suggests to me that this is a market ripe for disruption.

    In recent years, there's been a smoke-and-mirrors exercise within the industry, as some vendors have morphed towards a magical new product called an LRS (Learning Record Store). If the old LMS doesn't quite fit the bill, we've come up with a marvellous new variation which you'll just love.

    While the possibility of much more granular tracking of learning activity is, on the face of it, quite appealing, this doesn't seem to be something that is really setting L&D on fire - or at least, not the L&D folk I talk to.

    So if there are any tech entrepreneurs out there, looking for a new opportunity, this could be it.

    And if anyone does use a system that they would love to shout about from the rooftops, I'd love to hear from you.

    Topics: e-learning software
    2 min read

    Does anyone really care about learning impact?

    By Andrew Jackson on Thu, May 15, 2025

    This, I believe, is the elephant in the room for most people in a learning and development role. The one thing people will sometimes pay lip service to. But the one thing they never really want to think about beyond an end of course ‘happy sheet’.

    You know what I’m talking about. It’s the question of quantifying in some form, the impact and effectiveness of the training we design and deliver.

    We have a real hard time getting clients to engage with this when we are talking to them about a potential development project. 

    After delivering some in-house instructional design, we even encounter resistance to taking part in our own light-touch follow-up programme, which helps evaluate the usefulness of what we deliver. It takes very little time to participate. It is completely free of charge. It is almost always helpful for those who do take part. Yet almost no-one wants to.

    A couple of thoughts. If you think evaluating impact is only about boiling success down to the last pound or penny of return on investment, I get why you might be very hesitant. There are loads of factors outside learning and development's direct control that can have an impact on the effectiveness of the training provided.

    So you could end up on the awards podium (or in the dog-house) when you really don’t deserve to be, just because under a certain set of circumstances, the numbers came out good (or bad).

    That is a bit like walking along the high wire with no safety net below. Understandably, not many people are up for that. 

    But thinking about impact and effectiveness doesn’t have to be like that. The alternative starting point is that we are not in the business of instantly achieving a prized metric but we are there to support continuous improvement, over the longer term.

    This, it seems to me, is a more honest approach and a more achievable one. It acknowledges from the start that a single ‘dose’ of training almost certainly won’t cure any performance problem or performance need.

    And it recognises that a more consistent course of ‘treatment’ over a period of time is what likely will. It also recognises that not all ‘patients’ are equal and some might need a longer course of ‘treatment’ than others.

    What I’m talking about is a well-designed approach to achieving impact, which starts with a training intervention, follows up after that with learners to see what’s working and what’s not, provides short, simple follow-up interventions, as required and gradually course adjusts to help achieve a good outcome both with current and future learners.

    It doesn’t have the big bang impact of a large ROI number but it almost certainly guarantees visible performance improvements over time and will most definitely raise your profile and the respect for learning and development across the organisation.

    In reality, it’s not hugely difficult to set up and build in this kind of approach at the beginning of a project. So it begs the question, what’s holding us back?

     

    Want to start taking a holistic approach to looking at the impact and effectiveness of the learning you design and deliver within your organisation? Take a look at the programme we offer.

    Topics: Instructional Design Measurement and evaluation
    2 min read

    Will learning technologies ever live up to their promise?

    By Andrew Jackson on Wed, Apr 16, 2025

    We've lived with a multitude of technologies for learning for decades now. 

    Each new technology that comes along is hailed as the next 'big thing', full of promise. We mostly get sucked in to the hype. Enthusiasm is high. Vast amounts of money are spent on said new technology. Yet the early promise is rarely fulfilled.

    Which begs the question, 'Why?' The problem, I believe, lies with us and not the technology.

    We lose sight of the fact that a particular technology is never the complete answer to improving the quality of learning. It is (and always will be) just another means to deliver it.

    But because technology adds a layer of complexity to the learning design process, we tend to get blinded by the mechanics of using the technology and focus too much on that.

    An example would be someone thinking, "If I get really good at using Storyline, I'll be able to create fantastic e-learning". Sad to say, being really good at using Storyline only results in being really good at using Storyline. That is to say, mastery of lots of features. The ability to use those features quickly and efficiently. The ability to troubleshoot and solve problems when the features don't work as expected.

    All of these skills are fantastic to have and would be an asset in any learning and development team. But, unfortunately, mastery of the technology alone does not guarantee an effective piece of learning. 

    As well as being really good at using the technology, you need to be smart about how you apply your learning design when using a particular technology.

    And here's the conundrum. People who are really good at learning design are often not so hot (or interested in) the technology. Those who are really good with the technology tend not to be so talented with the learning design. Finding someone who is equally talented in both areas is rare. If you have such a person in your team, do whatever you can to hold on to them.

    In the end, it'll be the smart application of your learning design that makes the difference between a run-of-the-mill piece of learning and a really impactful, effective one. So if you don't have a tech and design genius all rolled into one, you'll really need to find ways of getting closer collaboration and understanding between the design and the tech experts.

     

    Want to see how you are faring with your e-learning technology and design? A great place to start is our Discover Your E-Learning Impact Scorecard. It only takes a couple of minutes to complete and you'll get some personalised feedback almost instantly.

    Topics: Instructional Design e-learning e-learning software
    2 min read

    The L&D dilemma

    By Andrew Jackson on Wed, Apr 16, 2025

    When it comes to deciding about their own learning, do learners always know best?

    It pains me to say this, but there's quite a bit of research evidence to suggest that the honest answer would be 'quite a lot of the time, not'.

    This gives us L&D folk quite the dilemma.

    After all, we want the best for our learners. We don't want to appear like rigid authoritarians. Yet we also know that our suggestions or ideas can appear counter-intuitive to some learners and project sponsors.

    How often have you been asked to provide a specific type of learning solution to a group of learners, only to discover, once you start digging, that the requested solution is entirely unsuitable? And then met significant resistance to changing the original request?

    Often, the learners (or those requesting learning on their behalf) are trapped by their own limited experience of prior learning. If all they know, for example, is knowledge presentation e-learning and they are okay with this and haven't experienced any other type of e-learning, they will probably just request what's familiar and comfortable. Even though this might get poor outcomes or results.

    If, as another example, they have been told that on-demand videos are the new big thing in learning and that they'll get massively better results from using this medium, they will probably come to you asking for a video solution, utterly convinced this is what will work for them.

    All of this, I think, relates to problems with respect and perception. In many organisations, the learning and development function is viewed (rightly or wrongly) as not very effective. This means that people in the wider business don't especially value or respect the learning expertise that may be available.

    They look elsewhere for advice, work up their own ideas which are then presented to learning and development as a fait accompli.

    This can feel very frustrating and demoralising. After all, it's very unlikely that the very same people would pitch up in the marketing department, telling them how they wanted to see marketing campaigns and materials designed and executed.

    The key question, ultimately, is what we should do as learning and development folk to deal with this dilemma. First, we need to take a long hard look at ourselves. Are we as expert and professional as we could be? Are we sometimes stumbling along and just getting by?

    If you feel harassed and put upon, it's tempting to blame someone else: lack of resources, unrealistic timelines, a culture that is hostile to formal learning.

    All of these factors could be true. But unless we are prepared to think about how to start shifting perceptions and how to keep constantly evolving our skills and expertise, this particular L&D dilemma will not be resolved.

     

    Feeling frustrated and demoralised because your L&D expertise and skills are not valued or trusted? Check out our on-demand webinar on this topic: How to amplify learning impact: the journey from order-taker to trusted expert.

    Topics: Instructional Design