By: Julius Agyei Okanta and Candela Iglesias Chiesa
Have you found yourself trapped in a training that:
– was boring?
– was unidirectional?
– did not ask for your input or expertise?
– had no significant impact on your daily work?
If so, you are not alone! According to a Harvard Business Review article, 75% of 1,500 managers surveyed from 50 different organisations expressed dissatisfaction with their training at the workplace.
In global and public health especially, the stakes are too high to keep relying on poorly designed training not based on evidence from adult learning sciences.
At Alanda, we believe it’s time to ensure every training is not only engaging but also creates new learning that leads to behavioural change, and thus to results.
Myths vs Facts About Training
In today’s fast-paced work environment, many traditional training methods fall short. Below are four common training myths that no longer serve the needs of modern workplaces:
Myth 1: The goal of the training is the transfer of knowledge.
Fact: Effective training requires clear objectives, and pure knowledge transfer is rarely the real goal. Usually you want people to work or act in a new way (e.g., a healthcare worker engages differently with pregnant women) so that a tangible result can be achieved (e.g., pregnant women feel heard and respected and continue antenatal care visits).
Myth 2: Unidirectional training works (I mean, everyone is doing it so…)
Fact: Unidirectional training is our name for the kind where the person “holding the knowledge” stands in front of a group and talks and talks. You’ve likely experienced it (hint: it usually includes a lot of slides!). Research suggests this is not the most effective way to transfer knowledge, much less to generate a change in behaviour that leads to results. Interactive, bidirectional, or multidirectional methods like group discussions, scenario immersion, and supervised practice are not only more engaging but also more effective.
Myth 3: One-off training creates results
Fact: Sadly, most one-off training (i.e., training that happens only once, with no follow-up) will lead to poor results, especially if it is short and unidirectional and not paired with a strong follow-up strategy. Training is better seen as a continuous process, where a first training leads to subsequent refreshers or other ways to keep building on that learning and creating more opportunities to apply the changes. Changing behaviours is hard, so a single instance of training is rarely enough.
Myth 4: A pre- and post-training test is all I need to evaluate my training.
Fact: While immediate feedback is valuable, people being happy with the training and having learnt something is not why you set it up in the first place. You want to see changes in behaviour (e.g., how healthcare workers talk to pregnant women) and check that those changes actually lead to results (more pregnant women having positive antenatal care experiences and completing all their visits). The true success of a training needs to be measured later, not right after it ends, so that trainees have had time to apply their skills in real-life settings and results have a chance to accumulate.

Ok, great then, so no more one-off unidirectional training. But if not that, what?
Embracing Long-Term Training Strategies
What is the opposite of a one-off training? What can long term training strategies look like? Here are some of our favourites:
Continued Training Sessions: Regular, spaced-out training sessions (e.g. a couple of hours per week spread over 12 weeks, or a monthly engagement for a year) prove more effective than intensive one-time courses. This allows employees to gradually build skills, practise what they learned in each session, and come up with questions and tweaks.
Coached or supervised practice: Practising new skills under the guidance of a coach or supervisor who offers immediate feedback has proven highly effective across various fields, from surgical procedures to community health work.
In a supervised practice, the person learning the new skill applies it while a coach or supervisor observes and later (e.g. not in front of a patient) provides feedback on what went well and what to improve.
While the word “supervision” can have some negative connotations, supportive supervision which provides constructive feedback can significantly enhance skill development. To ensure this, it’s crucial to train supervisors in effective coaching techniques.
Peer-to-peer cross-checks: Peer-based learning methods, such as peer-to-peer cross-checks between teams or communities of practice, can significantly boost skill development and performance. They foster collaboration, encourage knowledge sharing, and give professionals opportunities to evaluate each other’s work, identify areas for improvement, and exchange best practices in a supportive environment.
If done correctly, these cross-checks can lead to some healthy competition between teams, and they are also great idea generators and ways to spread best practices.
For example, every quarter, labour room teams cross-check each other’s rooms and work based on a checklist of pre-established best practices, and then discuss the results, their own best practices and areas of opportunity.

But there is no point going to all the trouble of developing well-designed, long-term training without measuring results beyond satisfaction with the training and short-term retention of learning. So how do we evaluate our training?
Evaluating Training Effectiveness Using the Kirkpatrick Evaluation Model
As mentioned earlier, the effects of training cannot be measured right away, as it takes time for behaviours to change and results to improve.
The Kirkpatrick Training Evaluation Model assesses a training on four different levels:
- Reaction: Measures how participants respond to the training (did they like it, would they recommend it, etc.).
- Learning: Assesses the increase in knowledge or skills (e.g., through a pre and post-test quiz).
- Behaviour: Evaluates the change in behaviour on the job (usually some months after the training).
- Results: Determines the impact on the organisation’s outcomes and on the people the organisation serves (which will take some time and will likely be due to several factors and not just the training).
When you are clear on the results you are aiming for with training, both at the organisational level and for the people you serve, this model can help you determine how effective the training was in achieving them. This type of evaluation yields actionable insights that allow you to keep improving your training.

In global health, where the stakes are high and resources are often limited, effective training is non-negotiable.
It is time to move beyond training practices with limited effectiveness and embrace strategies based on evidence if we want to build stronger teams and improve health outcomes for communities worldwide.
We’d love to hear about any interesting training experiences you’ve had recently! Feel free to share your insights with us.
Thank you for reading! If you found this post valuable, here is how you can take the next step with us:
Book a call to discuss our services or reach out to us via email at projects@alandahealth.com. We’d love to help you take your project or training to the next level!
If you liked this post, you will also like:
- How a Health Needs Assessment Can Boost Your Project’s Impact
- Alanda’s 3 E’s Framework: Evidence-based, Evidence-generating and Evidence sharing
- Mastering Global and Public Health Presentation Skills: Your Passport to Impactful Communication
- After Action Reviews (AARs): a tool to openly discuss successes and failures of your health intervention
