Before you commit to a platform, you want to know what you are getting into. Here is the full picture.
Most L&D tools ask you to commit before you know whether they work. Long contracts, long implementation timelines, and a promise that results will come eventually.
Multiply is different. The pilot is designed to give you evidence before you make that call.
Here is exactly what the pilot involves, what you will see during it, and what success looks like when it ends.
How the Pilot Starts: The Transfer Readiness Diagnostic
Before a campaign starts, the Multiply platform runs a Transfer Readiness Diagnostic on the training programme you have chosen for the pilot.
This is the Predict phase of the Multiply Method. It answers one question: will the conditions for training transfer actually be in place when this programme runs?
The diagnostic surveys a sample of learners and their managers and reports four scores:
- Skill Deficit Score — Is the performance issue actually a knowledge gap, or something else?
- Environmental Barrier Score — Are there process or system blockers that will undermine transfer, regardless of the quality of the training?
- Manager Support Score — Are managers prepared and willing to support behaviour change?
- Transfer Readiness Score — The overall readiness signal, combining the three dimensions above
The output is one of three recommendations:
- Train with Confidence — conditions are right, proceed
- Fix First — address specific barriers before training runs, or transfer will not happen
- Don't Train — the problem is not a training problem, and a different approach is needed
Multiply is the only platform that will tell you not to train. That is a feature, not a caveat.
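Conceptually, the diagnostic's decision logic can be sketched in a few lines. This is an illustration only: the score names come from the article, but the thresholds, the equal weighting, and the function name are hypothetical, not Multiply's actual model.

```python
def diagnostic_recommendation(skill_deficit, env_barrier, manager_support):
    """Illustrative sketch of the Transfer Readiness Diagnostic.

    Inputs are 0-100 survey-derived scores. The cut-offs and the simple
    averaging below are hypothetical, chosen only to show the shape of
    the three-way recommendation.
    """
    # A low Skill Deficit Score means the performance issue is not a
    # knowledge gap, so training is the wrong tool.
    if skill_deficit < 30:
        return "Don't Train"
    # Specific blockers trump the overall signal: high environmental
    # barriers or unprepared managers must be fixed before training runs.
    if env_barrier > 70 or manager_support < 40:
        return "Fix First"
    # Transfer Readiness Score: here a plain average of the dimensions,
    # with the barrier score inverted (high barriers = low readiness).
    readiness = (skill_deficit + (100 - env_barrier) + manager_support) / 3
    return "Train with Confidence" if readiness >= 60 else "Fix First"
```

The ordering matters: a hard blocker (a missing manager or a broken process) should override a healthy-looking average, which is why the sketch checks the individual scores before the composite.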
The 90-Day Transfer Campaign
Once the diagnostic confirms training is the right approach, a Transfer Campaign launches around your programme. The campaign runs in three phases.
Prime: Before Training
In the fortnight before the training event, managers receive a briefing via Slack or Teams. They are told what behaviours learners will be working on, what to look for, and what their role is in the 90 days ahead. Learners receive pre-training reflections: short, specific prompts that connect the upcoming content to real work situations they already face.
Manager commitment is captured at this stage through the Alignment Gate. If commitment is not given, the system flags low manager readiness as a risk before the programme begins.
Perform: After Training (Days 1 to 90)
After the training event, a sequence of messages deploys via Slack or Teams across 90 days. These are split between learner reinforcement challenges and manager coaching prompts.
Learners receive spaced reinforcement: short challenges that ask them to apply specific behaviours in real work moments, not simulations. Managers receive coaching cards: specific conversation starters tied to the exact behaviours covered in training.
The platform monitors engagement throughout. If participation drops below a set threshold, early warning signals fire so the L&D team can intervene before transfer fails quietly.
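The engagement monitoring described above might work along these lines. A minimal sketch: the 50% threshold, the signal name, and the function itself are assumptions for illustration, not the platform's published implementation.

```python
def check_engagement(prompts_sent, prompts_completed, threshold=0.5):
    """Fire an early-warning signal when campaign participation drops.

    threshold is a hypothetical cut-off; the article does not publish
    the platform's real value.
    """
    if prompts_sent == 0:
        return None  # nothing deployed yet, nothing to flag
    participation = prompts_completed / prompts_sent
    if participation < threshold:
        # Surface the signal so L&D can intervene mid-campaign rather
        # than discovering the failure at Day 90.
        return {"signal": "low_engagement", "participation": participation}
    return None
```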
Prove: Days 30, 60, and 90
At three points across the campaign, pulse checks go out to learners and their managers. These are short and behavioural: not "did you enjoy the training?" but "are you doing this differently?"
At Day 90, the platform generates the Actual Transfer Score.
What You See: The Metrics
| Metric | What It Tells You |
|---|---|
| xTS (Expected Transfer Score) | Pre-training prediction of transfer likelihood, based on manager commitment, learner motivation, and programme alignment |
| ATS (Actual Transfer Score) | Post-campaign measure of whether learners are applying target behaviours on the job |
| TCHS (Transfer Climate Health Score) | Whether the work environment is supporting or blocking behaviour change |
| BPI (Business Performance Index) | Whether the training moved the business KPI defined before the programme launched, measured at Day 90 |
| MEI / LEI (Engagement Indices) | Manager and learner engagement with the campaign throughout |
| Kill Signals | Early warnings when pre-training indicators fall below acceptable thresholds |
The xTS is your prediction before the campaign runs. The ATS is your proof after. The gap between them tells you where transfer succeeded and where the environment let it down.
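Reading that gap can be reduced to a simple delta. The interpretation bands below are illustrative labels, not Multiply's reporting scale, and the function is a sketch rather than the platform's code.

```python
def transfer_gap(xts, ats):
    """Compare predicted (xTS) and actual (ATS) transfer, both on 0-100.

    Returns the gap plus a reading of it. The -15 band boundary is a
    hypothetical value chosen for illustration.
    """
    gap = ats - xts
    if gap >= 0:
        return gap, "transfer met or beat the prediction"
    if gap > -15:
        return gap, "modest shortfall: check the engagement indices"
    return gap, "large shortfall: the environment likely blocked transfer"
```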
What Success Looks Like
At the end of the pilot, you will have:
- A Transfer Readiness Diagnostic for one real training programme, with specific recommendations
- A completed 90-day Transfer Campaign with full manager and learner engagement data
- An Actual Transfer Score tied to a business KPI you defined before the programme ran
- A pilot report showing what happened, what the data means, and where to focus next
The standard we hold ourselves to: you should be able to take the final pilot report into your next leadership conversation and use it. If you cannot, the pilot has not done its job.
Timeline and What You Commit To
| Phase | Timing |
|---|---|
| Onboarding and setup | Week 1 |
| Transfer Readiness Diagnostic | Week 1 |
| Prime phase (before training) | 2 weeks before the training event |
| Transfer Campaign: Perform and Prove | Weeks 1–12 post-training |
| Final report and debrief | Week 12 post-training |
What you commit to: one training programme to run through the pilot, a clear view of the target behaviours and the business KPI you want to connect to, and manager access via Slack or Teams.
You do not need to change your LMS. You do not need to rebuild your training content. You upload what you have, and the platform builds the campaign from it.
Investment
The pilot costs nothing.
The only ask is honest feedback. You get boardroom-ready evidence. We get to see the system run in a real organisation. That is a fair exchange.
Key Takeaways
- The pilot starts with a Transfer Readiness Diagnostic — you know before the training runs whether conditions are right
- The 90-day Transfer Campaign automates manager activation, learner reinforcement, and behaviour measurement via Slack or Teams
- You finish with an Actual Transfer Score tied to a business KPI you defined at the start
Join the Pilot
If you have a training programme coming up in the next quarter and you want to know whether it will actually change behaviour on the job, this is how you find out.
