Module 5

Training & Capability Building

Design and deliver effective training programmes that equip banking professionals with the skills and confidence to succeed in new ways of working — from needs analysis to measuring training effectiveness.

Module 5 — 90-second video overview

The Critical Role of Training in Change Adoption

Training is the mechanism through which people acquire the knowledge, skills, and confidence to work effectively in the new way. In the ADKAR model, training addresses two critical elements: Knowledge (understanding how to change) and Ability (being able to perform the new way of working in practice). Without effective training, even the most well-communicated, well-sponsored change initiative will fail at the point of adoption — people will want to change but will not know how, or they will know how in theory but will not be able to perform in practice.

In banking, training carries additional weight because of the regulated environment. Staff must not only learn new systems and processes — they must be demonstrably competent. Regulators expect banks to maintain evidence that staff are trained and competent for the roles they perform. The FCA's Senior Managers and Certification Regime (SM&CR) in the UK, and equivalent regimes in other jurisdictions, require firms to certify that staff in certain roles are "fit and proper" — which includes having the necessary skills and training. A bank that migrates to a new transaction monitoring system without adequately training its analysts is not just running an operational risk — it is potentially breaching regulatory requirements.

Training is also one of the highest-cost components of any banking change programme. A large-scale technology migration affecting hundreds of staff may require thousands of training hours — representing significant direct cost (trainer time, materials, system access), indirect cost (staff taken off production during training), and opportunity cost (operational capacity reduced during the training period). Designing training that is efficient, effective, and targeted is therefore not just good practice — it is a financial imperative.

Training Needs Analysis

A training needs analysis (TNA) is the structured process of identifying the gap between current capabilities and the capabilities required after the change. It answers three questions: What do people need to be able to do? What can they do now? What training is needed to close the gap?

Step 1: Define the future-state competency requirements. Working with process owners, system designers, and subject matter experts, map out exactly what each role will need to know and be able to do after the change. Be specific: not "use the new system" but "log into the new platform, navigate to the alert queue, apply the triage workflow, disposition alerts using the new decision framework, escalate cases to Level 2 using the integrated escalation tool, and generate daily activity reports."

Step 2: Assess current capabilities. Evaluate what the affected population can do today. This assessment should consider both the skills that transfer directly to the new way of working (and therefore do not need training) and the skills that are new or fundamentally different. In banking, this assessment often reveals significant variation within the same team — some analysts are highly proficient with technology and will adapt quickly; others have relied on the same system for years and will need intensive support.

Step 3: Identify the gaps. The gap between future requirements and current capabilities defines the training need. Gaps should be categorised by type:

  • Technical skills gaps — learning new systems, tools, and technical processes
  • Process skills gaps — following new workflows, applying new decision frameworks, using new escalation procedures
  • Behavioural skills gaps — working in new ways (e.g., shifting from checklist-based processing to judgement-based investigation, from individual work to team collaboration)
  • Compliance knowledge gaps — understanding new regulatory requirements, updated policies, or revised control procedures

Step 4: Prioritise training needs. Not all gaps are equal. A gap that affects the ability to perform a core regulatory function (such as sanctions screening or suspicious activity reporting) must be closed before go-live. A gap that affects a secondary or lower-risk function might be addressed through post-go-live coaching or self-directed learning. Prioritisation ensures that training resources are focused where they matter most.

Competency Gap Assessment

The competency gap assessment is a more detailed extension of the TNA that maps gaps to individuals or role groups. In banking, this typically takes the form of a competency matrix:

Competency | Required Proficiency | Current Proficiency | Gap | Training Priority
Navigate new platform interface | Proficient | None | High | Critical — pre-go-live
Apply AI-assisted triage workflow | Expert | None | High | Critical — pre-go-live
Write structured case narratives | Proficient | Basic | Medium | Important — pre-go-live
Generate management reports | Basic | None | Medium | Important — first 4 weeks
Use advanced analytics dashboard | Basic | None | Low | Desirable — first 8 weeks

This matrix allows the change manager to design training curricula that are calibrated — investing more time and intensity in critical, high-gap competencies and less in areas where the gap is smaller or the skill is less urgent.

For large populations, the competency gap assessment can be conducted through self-assessment surveys (where individuals rate their own proficiency), manager assessments (where line managers rate their team members), skills tests (practical assessments of current ability), or a combination of all three. In banking, where accuracy of assessment matters for regulatory competency assurance, manager assessments and skills tests are preferred over self-assessment alone.
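A competency matrix of this kind can also be held as structured data, which makes the gap calculation repeatable across large populations. The sketch below is illustrative Python, not a prescribed tool: the four-point proficiency scale, the `Competency` class, and the priority bands are assumptions, and in practice the regulatory-criticality input is a judgement call made with compliance stakeholders.

```python
from dataclasses import dataclass

# Hypothetical ordered proficiency scale; real scales vary by bank.
LEVELS = ["None", "Basic", "Proficient", "Expert"]

@dataclass
class Competency:
    name: str
    required: str  # target proficiency after the change
    current: str   # assessed proficiency today

    def gap(self) -> int:
        # Gap = number of proficiency levels the learner must climb.
        return max(0, LEVELS.index(self.required) - LEVELS.index(self.current))

def training_priority(comp: Competency, core_regulatory: bool) -> str:
    # Priority reflects both gap size and whether the competency
    # supports a core regulatory function (a judgement input).
    if comp.gap() == 0:
        return "None — existing skill transfers"
    if core_regulatory:
        return "Critical — pre-go-live"
    return "Important — pre-go-live" if comp.gap() >= 2 else "Desirable — post-go-live"

triage = Competency("Apply AI-assisted triage workflow", "Expert", "None")
print(training_priority(triage, core_regulatory=True))  # Critical — pre-go-live
```

Encoding the matrix this way lets the change manager re-run the prioritisation whenever assessments are updated, rather than maintaining the table by hand.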

Training Design Principles: Adult Learning Theory

Training for banking professionals must be designed according to the principles of adult learning theory (andragogy), as articulated by Malcolm Knowles. Adults learn differently from children, and training that ignores these differences will be ineffective regardless of how well the content is structured.

Adults need to understand the relevance. Before adults will invest effort in learning, they need to understand why the training matters and how it connects to their work. Every training session should begin with a clear statement of purpose: "By the end of this session, you will be able to triage alerts on the new platform — the same work you do today, but with better tools and a more efficient workflow."

Adults bring experience. Banking professionals often have years or decades of experience. Training that ignores or devalues this experience will be met with resistance. Effective training acknowledges existing expertise, builds on it, and explicitly connects new learning to familiar concepts: "The triage decision framework has the same underlying logic as your current approach — you are still assessing risk indicators, customer context, and transaction patterns. The difference is that the new platform surfaces this information automatically instead of requiring you to look it up manually."

Adults are self-directed. Adults prefer to have some control over their learning — the pace, the sequence, and the depth. Where possible, training should offer flexibility: self-paced e-learning modules that can be completed in any order, reference materials that can be consulted as needed, and practice environments where individuals can explore at their own pace.

Adults learn by doing. The most effective training for banking professionals is hands-on, practical, and immediately applicable. Lectures and presentations have limited value. Simulation exercises, sandbox environments, worked examples, and supervised practice on realistic scenarios produce far better outcomes.

Adults need immediate application. Learning that is not applied within days of acquisition is rapidly forgotten. Training should be timed as close to go-live as possible — typically 1-3 weeks before the learner will begin using the new system or process in production. Training delivered months in advance will need to be repeated.

Delivery Methods

Effective training programmes in banking use a blended learning approach that combines multiple delivery methods:

Classroom training (instructor-led). Best for: complex topics that require discussion, demonstration, and real-time Q&A. In banking, classroom training is particularly effective for process changes that require judgement (such as new investigation methodologies or revised escalation criteria), because the trainer can facilitate discussion around real scenarios and edge cases. Classroom sessions should be kept to a maximum of 3-4 hours with regular breaks — attention and retention drop sharply after this point.

E-learning (self-paced online modules). Best for: system navigation, compliance updates, and knowledge-based content that learners can absorb at their own pace. E-learning is efficient for large populations and geographically dispersed teams. However, it must be well-designed — interactive, scenario-based, and concise (15-30 minutes per module). Long, text-heavy e-learning modules with minimal interactivity are a waste of time and budget.

Simulation and sandbox environments. Best for: building practical proficiency with new systems. In banking, a sandbox environment loaded with realistic (but anonymised) test data allows analysts to practise processing transactions, navigate workflows, and make mistakes without risk. This is arguably the most valuable training investment for a technology migration — nothing builds confidence like successfully completing a task in a realistic simulation.

On-the-job training (OJT). Best for: developing proficiency in the live production environment after go-live. OJT involves supervised practice at the desk, where the learner processes real work with a trainer or experienced colleague available to coach, correct, and support. In banking operations, OJT is essential during the first 2-4 weeks post-go-live.

Buddy systems. Best for: providing continuous, informal support during the transition period. Each learner is paired with a "buddy" — an experienced colleague or an early adopter who has already become proficient on the new system. The buddy provides at-the-desk support, answers questions in real time, and provides psychological reassurance during the stressful first weeks of the transition.

Quick reference guides and job aids. Best for: providing at-the-desk reference materials that learners can consult while working. In banking, these might include step-by-step guides for common system tasks, decision trees for triage workflows, and checklists for quality assurance procedures. They should be concise (1-2 pages), visually clear, and laminated for desk use.

Measuring Training Effectiveness: The Kirkpatrick Model

Donald Kirkpatrick's Four-Level Model provides a comprehensive framework for evaluating whether training has achieved its objectives:

Level 1: Reaction. Did participants find the training useful, relevant, and well-delivered? Measured through post-training evaluation forms (often called "happy sheets"). While this is the easiest level to measure, it is the least valuable — people can enjoy training without learning anything useful.

Level 2: Learning. Did participants acquire the intended knowledge and skills? Measured through assessments — tests, quizzes, practical demonstrations, or simulation exercises. In banking, Level 2 assessment is particularly important for compliance-critical training, where the organisation must demonstrate that staff have achieved a defined level of competency.

Level 3: Behaviour. Are participants actually applying what they learned in their daily work? This is where most training programmes fail — there is a significant gap between knowing how to do something and actually doing it consistently under real-world conditions. Level 3 is measured through observation, manager feedback, quality assurance sampling, and performance data. In banking, this might involve reviewing a sample of the analyst's case decisions to determine whether they are applying the new triage methodology.

Level 4: Results. Has the training contributed to the business outcomes the change was designed to achieve? Measured through operational KPIs — alert processing time, error rates, STP rates, customer satisfaction, regulatory compliance metrics. Level 4 is the most valuable but most difficult to measure because it requires isolating the impact of training from all the other factors that affect business performance.

In banking change programmes, Level 2 and Level 3 are the critical measurement points. Level 2 provides assurance that people have the knowledge and skills (a regulatory requirement). Level 3 tells you whether the training has actually changed behaviour — and if it has not, it identifies where additional coaching, support, or reinforcement is needed.
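These two measurement points reduce to simple rates that can be tracked per team or per location. The sketch below is illustrative Python; the function names, the 500-case sample, and the 90% coaching threshold are assumptions for illustration, not prescribed values.

```python
def level2_pass_rate(scores: list[float], pass_mark: float) -> float:
    """Level 2 (Learning): share of learners meeting the assessment pass mark."""
    return sum(s >= pass_mark for s in scores) / len(scores)

def level3_adherence(cases_compliant: int, sample_size: int) -> float:
    """Level 3 (Behaviour): share of QA-sampled work items that follow
    the new methodology under real-world conditions."""
    return cases_compliant / sample_size

def needs_reinforcement(adherence: float, threshold: float = 0.90) -> bool:
    # Below-threshold teams get targeted coaching rather than blanket retraining.
    return adherence < threshold

# Example: a QA review samples 500 completed cases and finds 445 compliant.
rate = level3_adherence(445, 500)
print(rate, needs_reinforcement(rate))  # 0.89 True
```

The point of the threshold is that Level 3 data should trigger a response: a team below the line gets coaching and a re-sample, not a repeat of the original classroom course.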

Kirkpatrick's Four Levels of Training Evaluation

  • Level 1: Reaction — Did they value the training?
  • Level 2: Learning — Did they acquire knowledge?
  • Level 3: Behaviour — Are they applying it at work?
  • Level 4: Results — Did business outcomes improve?

Knowledge Transfer and Documentation

Training is a point-in-time activity; knowledge transfer ensures that learning endures beyond the training session. In banking, effective knowledge transfer includes:

Updated standard operating procedures (SOPs). All SOPs must reflect the new way of working before go-live. These become the definitive reference for how work should be performed and are the basis for ongoing training, quality assurance, and audit.

Training materials as reference resources. Training presentations, guides, and simulation exercises should be stored in an accessible, version-controlled repository so that they can be used for future reference, refresher training, and onboarding of new joiners.

Recorded training sessions. Where possible, record classroom and virtual training sessions so that they can be viewed by staff who were absent, new joiners who start after the programme, and individuals who want to revisit specific topics.

Knowledge base and FAQ repository. Create a searchable knowledge base where staff can find answers to common questions, troubleshooting guides for common issues, and tips from experienced users. In banking, this is often hosted on the internal intranet or a dedicated collaboration platform.

Subject matter expert (SME) network. Identify and formalise a network of SMEs who can provide ongoing support, answer complex questions, and contribute to the continuous improvement of processes and documentation.

Building Internal Capability vs External Delivery

Banking change programmes frequently face the decision of whether to build internal training capability or engage external training providers. Each approach has advantages:

External trainers bring expertise in training design and delivery, are immediately available, and can bring cross-industry perspectives. They are best suited for one-off, time-limited training needs — such as the initial training wave for a technology migration. However, they are expensive, they lack deep organisational context, and their knowledge leaves with them when the engagement ends.

Internal trainers have organisational context, understand the culture and operational reality, and remain available for ongoing training needs — new joiner onboarding, refresher training, and continuous development. Building internal capability requires upfront investment (identifying and training the trainers, developing materials, allocating their time), but it creates a sustainable, cost-effective training function that serves the organisation long after the programme ends.

In banking, the most effective approach is typically a hybrid model: external trainers deliver the initial training wave (leveraging their expertise in the new system or process), while simultaneously training a group of internal trainers through a "train the trainer" programme. The internal trainers then take over responsibility for ongoing training, refresher sessions, and new joiner onboarding. This approach captures the best of both worlds: external expertise for the initial wave, internal sustainability for the long term.

Banking Example: Designing a Training Programme for Reconciliation Analysts

A mid-tier European bank was implementing a new automated matching and reconciliation platform to replace a legacy system that had been in use for twelve years. The change affected 150 reconciliation analysts across two locations (London and Warsaw), who processed approximately 200,000 transactions per day across nostro, customer, and intercompany reconciliation streams.

Training needs analysis revealed significant gaps:

The new platform had a completely different user interface, navigation structure, and workflow management approach. No existing skills transferred directly to the new system. Analysts needed to learn: platform login and navigation, configuring match rules, reviewing and actioning automated match results, manually matching exceptions, creating and managing break investigations, generating reconciliation status reports, and escalating aged breaks.

In addition to system skills, the new operating model introduced process changes: analysts would shift from manually matching transactions (the legacy process) to managing exceptions from an automated matching engine. This required different skills — instead of comparing two data sources line by line, analysts needed to interpret automated match results, investigate the reasons for mismatches, and resolve exceptions using new diagnostic tools. The role was evolving from "data processor" to "exception investigator."

Competency gap assessment revealed three distinct population segments:

  • Segment A (60 analysts): Technology-comfortable, adaptable, moderate gap — would reach proficiency with standard training
  • Segment B (70 analysts): Process-expert but technology-cautious, significant gap — would need extended training and intensive desk-side support
  • Segment C (20 analysts): Long-tenured specialists with deep expertise in the legacy system, highest gap — would need the most intensive training and personal coaching

Training design used a blended learning approach:

Phase 1: Foundation e-learning (2 weeks before classroom training). Four self-paced e-learning modules covering platform overview, navigation basics, key terminology, and the new operating model concept. Each module was 20-25 minutes, with an end-of-module quiz. Completion was mandatory before attending classroom training.

Phase 2: Classroom training (3 days per group). Day 1 covered system navigation, match rule configuration, and automated match review — with extensive hands-on practice in a sandbox environment loaded with realistic transaction data. Day 2 covered exception management, break investigation, and manual matching — again with hands-on practice. Day 3 focused on reporting, escalation procedures, and end-to-end scenario exercises where analysts processed a full day's reconciliation in the sandbox.

Phase 3: Simulation exercise (1 week). After classroom training, analysts spent one week in a dedicated simulation environment processing "mock production" — realistic transaction volumes and scenarios designed to build speed, confidence, and muscle memory. The simulation was supervised by trainers who provided real-time coaching and feedback.

Phase 4: Parallel running with buddy support (2 weeks). For the first two weeks of go-live, the old and new systems ran in parallel. Each analyst was paired with a buddy — a member of the pilot group who had been using the new system for four weeks and had achieved proficiency. Buddies sat alongside their assigned analysts, providing at-the-desk support, answering questions, and helping resolve real-time issues.

Phase 5: Post-go-live coaching (4 weeks). After parallel running ended and the old system was decommissioned, dedicated coaches were available on the floor for four weeks to support analysts who were still building proficiency. Coaching focused on Segment B and Segment C analysts who had the largest gaps and the steepest learning curves.

Reconciliation Analyst Training Programme

  • Foundation E-Learning (2 weeks before go-live) — 4 self-paced modules, 20-25 minutes each
  • Classroom Training (3 days) — Day 1: platform navigation; Day 2: exception workflows; Day 3: full scenario practice
  • Simulation Exercise (1 week) — mock production environment with supervisor coaching
  • Parallel Running (2 weeks) — buddy support during live parallel operation
  • On-the-Job Coaching (4 weeks post-go-live) — targeted support and competency validation

Training effectiveness measurement used all four Kirkpatrick levels:

  • Level 1 (Reaction): Post-classroom evaluation forms — 91% rated the training as "effective" or "very effective"
  • Level 2 (Learning): End-of-Day-3 practical assessment — all analysts ultimately demonstrated the ability to perform core tasks in the sandbox, although 12 analysts from Segment C required additional practice sessions before achieving the required standard.
  • Level 3 (Behaviour): Week 4 quality review — a sample of 500 completed reconciliations per location was reviewed for accuracy and adherence to the new methodology. London achieved 94% compliance; Warsaw achieved 89% (targeted coaching was deployed to address the gap).
  • Level 4 (Results): Month 3 KPI review — automated match rates reached 78% (target: 75%), manual exception resolution time averaged 12 minutes (target: 15 minutes), and aged break volumes reduced by 35% compared to the legacy system baseline.
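A Level 4 review of this kind amounts to comparing achieved KPIs against targets, with the direction of improvement made explicit (a higher match rate is better; a shorter resolution time is better). A minimal sketch in illustrative Python, using the two KPIs above that have stated targets:

```python
# Each KPI records the achieved value, the target, and whether
# higher values are better (match rate) or worse (resolution time).
kpis = {
    "automated_match_rate":      {"actual": 0.78, "target": 0.75, "higher_is_better": True},
    "exception_resolution_mins": {"actual": 12,   "target": 15,   "higher_is_better": False},
}

def target_met(kpi: dict) -> bool:
    if kpi["higher_is_better"]:
        return kpi["actual"] >= kpi["target"]
    return kpi["actual"] <= kpi["target"]

results = {name: target_met(kpi) for name, kpi in kpis.items()}
print(results)  # both targets met for the figures above
```

Making the direction explicit avoids the common reporting error of treating every KPI as "higher is better" when several banking metrics (resolution time, error rates, aged break volumes) improve downwards.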

The programme invested approximately 6,000 training hours across all phases — a significant investment. But the structured, needs-based, blended approach meant that every hour was targeted at specific gaps, delivered through the most effective method for each competency, and measured for impact. The result was a successful transition with zero significant operational incidents during the migration and full analyst proficiency achieved within eight weeks of go-live.

In the next module, we will address one of the most challenging aspects of change management — understanding and managing resistance.

Module Quiz

5 questions — Pass mark: 60%

Q1. What is the first step in designing a training programme for a banking change initiative?

Q2. In Kirkpatrick's Four-Level Model, Level 3 (Behaviour) measures:

Q3. Which adult learning principle is MOST important when training experienced banking operations staff?

Q4. What is the primary advantage of a 'buddy system' during a banking technology migration?

Q5. When deciding between building internal training capability versus using external trainers, which factor MOST favours an internal approach?