Building Routine, Building Belonging: Why In-Person Training Creates Real Change
The Remote Work Mistake
When the pandemic forced training online, something shifted in how we thought about delivery. If professionals could work from home, surely people could learn new skills from home?
The logic seemed reasonable. But it confused two different things: doing work you already know how to do versus learning something that requires you to fundamentally change how you see yourself.
You can deliver digital content remotely. You can teach new software features remotely. What you cannot do remotely is rebuild someone's belief in themselves. And that's what digital skills training actually needs to accomplish.
Five years ago we made a deliberate choice: we removed our remote option entirely. Not because virtual training is bad for everything, but because it's the wrong tool for what we do. This post is for funders who are considering virtual training because it's "more scalable" or "reaches more people." We want to be clear about what you'd be giving up.
What In-Person Training Actually Creates
When people show up to training in person, five things happen that cannot happen virtually:
1. Routine and Structure
Long-term unemployment doesn't just cost people jobs. It destroys routine. Days blur together. Sleep patterns shift. Motivation evaporates. The structure of showing up somewhere, at a specific time, with a specific purpose creates something that unemployment erases: rhythm.
The first week of in-person training is often just about rebuilding that rhythm. Getting up at a set time. Commuting. Being part of a group. Working specific hours. This matters more than most people realise—especially for long-term unemployed participants.
Virtual training can't create this. You can log in from bed. You can attend sporadically. There's no forcing function pushing you into routine. And routine is foundational to the confidence that leads to employment.
2. Belonging and Social Proof
Watching someone else figure something out changes what you believe is possible for you. It's one thing to have a trainer tell you "you can do this." It's another thing entirely to sit next to someone who looked just as confused as you yesterday and now they're building a functioning website.
This is social proof. It's one of the most powerful confidence-builders available. And it only happens in person.
In virtual training, everyone is in their own home. You don't see peers struggle. You don't see peers succeed in real time. The trainer tells you "others are figuring this out" but you don't experience it directly. The psychological impact is completely different.
3. Real-Time Response and Relationships
When someone's confidence is crumbling—and it always does at some point during challenging training—a trainer who knows them personally can intervene. They can see the shift in someone's face. They can pull them aside and have a real conversation. They can push them when they need pushing. They can reassure them when reassurance matters.
This relationship is built through presence. You cannot build a genuine relationship through a screen; you can build transactional interaction. But belief in someone, real personal belief, requires presence.
Virtual training loses this entirely. A trainer on a Zoom call has no idea that someone is about to quit. They don't see the moment of breakdown. They don't get to have the conversation that changes someone's mind.
4. Accountability That Matters
There's a difference between accountability to a computer and accountability to people sitting next to you.
When you're in a room with the same 15 people for five weeks, you build relationships with them. When someone commits to something and then shows up and does it, the other people see that. When someone struggles and gets support, it matters because it comes from people, not from a system.
This creates accountability that's psychologically meaningful. You don't want to let your group down. You want to show up because these people are counting on you. Virtual training removes all of that. You're accountable to no one but yourself, and self-accountability breaks down precisely when people need it most.
5. Embodied Learning
Humans don't learn just by absorbing information. We learn through our bodies: through movement, through spatial memory, through physical interaction with tools and spaces.
When someone sits at a real computer and builds a real website—feeling the keyboard, moving the mouse, seeing the immediate result of their actions—it becomes embodied knowledge. They remember it differently. They believe it differently.
Virtual training, where someone's own laptop is both classroom and training tool, produces a learning experience at one remove. They're learning on a screen, about things that happen on screens, in a way that's abstracted. The embodied knowledge never develops the same way.
This is why in-person digital skills training produces different outcomes than virtual digital skills training—even when the content is identical.
Why Virtual Training Fails at Confidence-Building
We're not saying virtual training can't work for some things. Online courses are great for people who already believe in themselves and just need knowledge. They're fine for professional development when someone's already employed and confident.
But for the population most councils are trying to help—long-term unemployed people who've had their confidence eroded, self-employed people struggling with imposter syndrome about marketing—virtual training is the wrong tool.
Virtual training can:
Transfer knowledge ✓
Create certificates ✓
Reach more people ✓
Virtual training cannot:
Build genuine belonging ✗
Create real-time relationships ✗
Rebuild routine and structure ✗
Deliver social proof ✗
Generate accountability that sticks ✗
And those are exactly the things that change outcomes.
What In-Person Training Costs (And Why It's Worth It)
Here's the honest truth: in-person training is more expensive. You need a physical space. You need trainers present full-time. You can't scale to 200 people simultaneously. Logistics are harder.
But you get outcomes. And outcomes are why you're funding training in the first place.
The maths is simple: if virtual training costs 30% less but produces 50% fewer employment outcomes, it's not actually cheaper. It's just shifting the cost from training to ongoing unemployment support, lost productivity, and unmet potential.
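A back-of-the-envelope sketch makes this concrete. Every figure below is hypothetical, chosen only to match the 30%/50% example above - none of it is drawn from our actual prices or results:

```python
# Hypothetical cost-per-outcome comparison (illustrative figures only).
def cost_per_employment_outcome(cost_per_participant, participants, employment_rate):
    """Total programme cost divided by the number of people employed afterwards."""
    employed = participants * employment_rate
    return (cost_per_participant * participants) / employed

# Virtual: 30% cheaper per head, but half the employment outcomes.
in_person = cost_per_employment_outcome(1000, 100, 0.50)  # £2,000 per person employed
virtual = cost_per_employment_outcome(700, 100, 0.25)     # £2,800 per person employed

print(f"In-person: £{in_person:,.0f} per employment outcome")
print(f"Virtual:   £{virtual:,.0f} per employment outcome")
```

On those illustrative numbers, the "cheaper" option costs 40% more per person actually employed, before you even count the ongoing unemployment support.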
In-person training costs more upfront but delivers real change. That's the deal funders should be evaluating.
What to Require of In-Person Training
We're not saying all in-person training is equally effective. But if you're evaluating in-person training providers, here's what matters:
Cohort size small enough for real relationships (We keep ours at 12-15. You can't build belonging in a room of 50.)
Same trainer and same group every day (Not rotating instructors, not people dropping in. Consistency builds relationship.)
Hands-on, real-world projects (Not exercises. Not simulations. Real work that creates real results.)
Intentional community building (Not incidental. The belonging is actively cultivated.)
Flexible but structured (People can't just bail, but the programme adapts to real-world demands.)
If an in-person programme has these elements, it will cost more than virtual training. It should. You're buying something that actually works.
The Funder's Choice
You can fund virtual training that's scalable, cheap, and produces completion certificates. You'll get good numbers for your report.
Or you can fund in-person training that's more expensive, smaller-scale, and produces actual employment and business growth. You'll get real outcomes.
Both exist. Both cost money. The question is what you want your investment to produce.
We believe that belonging, routine, relationships, and embodied learning aren't nice-to-haves in digital skills training. They're essential. Which is why we deliver them in person, and why we refuse to compromise on that regardless of pressure to scale or reduce costs.
If you're looking for partners who'll tell you virtual is "just as good" because it's cheaper or easier to deliver, you'll find them. But you'll be funding training that doesn't actually change lives the way in-person training can.
That's the choice. That's what we believe funders need to understand.
What Funders Miss When They Measure Completion Rates Instead of Confidence
The Metric That Lies
Every year, councils, combined authorities and other organisations fund training programmes and measure success using completion rates. 87% completion. 92% completion. 100% completion. These numbers look good in reports. They're also nearly meaningless.
A completion rate tells you that people showed up and finished. It doesn't tell you anything about whether their lives changed. It's possible to have a 95% completion rate and zero employment outcomes. It's possible to train 100 people and have none of them use what they learned.
But completion rates are easy to measure, they look impressive, and they don't require follow-up. So funders keep funding based on them.
This is costing councils real outcomes. Because training providers optimise toward what's measured. If completion rates are all that matter, you get programmes designed for easy completion, not for difficult, necessary change.
Why Completion Rates Are the Wrong Metric
Here's what completion rates actually measure: the trainer's ability to keep people in a room. That's it. They measure retention, not impact. And optimising for retention produces perverse outcomes.
A training provider can achieve 99% completion by making the course easy, letting people coast through with minimal effort, and requiring minimal actual learning. Participants leave with certificates but without confidence. No one gets employed. But completion rates are sky-high.
Alternatively, a trainer can push people through difficulty, which builds real confidence and changes outcomes, but some people drop out because it's genuinely hard. The completion rate drops to 85%, but of the people who finish, 60% are employed within 6 months. Which is better?
Any funder focused on completion rates would choose the first option. Any funder focused on real outcomes would choose the second.
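Run the comparison per 100 starters, using the hypothetical completion and employment figures from the two programmes above, and the gap is stark:

```python
# Employed people per 100 starters, using the hypothetical figures above.
def employed_per_100_starters(completion_rate, employment_rate_of_completers):
    return 100 * completion_rate * employment_rate_of_completers

easy = employed_per_100_starters(0.99, 0.0)   # optimised for completion: 0 employed
hard = employed_per_100_starters(0.85, 0.60)  # optimised for outcomes: 51 employed

print(f"Easy programme: {easy:.0f} employed per 100 starters")
print(f"Hard programme: {hard:.0f} employed per 100 starters")
```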
This is why completion rates persist as a metric: they make life easy for training providers and councils. They don't make life better for the people you're trying to help.
What Actually Matters: Confidence and Capability
If you want to know whether training worked, measure confidence.
Not self-reported confidence (which can be inflated). Real indicators of confidence:
Do they apply for jobs they wouldn't have applied for before? (For unemployed participants)
Are they actively using the skills they learned? (For self-employed participants)
Do they problem-solve when they hit obstacles, or do they assume they can't?
Are they willing to try new things related to their skills?
These are harder to measure than completion rates. They require follow-up. They require honest reporting. But they actually tell you whether training changed someone's belief in themselves.
The 6-Month Outcome Measure
This is why we track employment and business revenue at 6 months. Not because it's convenient. Because it answers the actual question: did this training change this person's life?
For unemployed participants:
Are they employed?
How long did it take to find employment?
Are they in jobs related to what they learned, or did the confidence from completing training help them succeed in any job?
For self-employed participants:
Has revenue increased?
Are they acquiring customers through the channels they learned about?
Have they hired additional people or expanded their business?
These metrics require honesty. Some people won't be employed at six months. Some businesses won't have grown. But you get a real picture of impact, not a manufactured story.
Why Councils Should Demand Better Metrics
You're investing taxpayer money. You have a responsibility to know whether that investment is working.
If a training provider can't or won't track outcomes at 6 months, there's a reason. Usually it's because the outcomes aren't strong enough to justify the investment.
Demand better:
Refuse to fund based on completion rates alone. Completion is not an outcome.
Require employment/business revenue tracking at 6 months. That's an outcome.
Ask how confidence is measured or assessed. If they don't have a method, they're not intentionally building it.
Demand honesty about who succeeded and who didn't. Real impact includes real numbers, not inflated claims.
The training providers worth funding are the ones who can honestly tell you: "Of 20 people who started, 18 completed. Of those 18, 14 were employed within 6 months in roles where they're using digital skills. Three were employed in unrelated roles but credit the confidence from training. One is still looking. Revenue for self-employed participants increased by an average of X%."
That's a real story. That justifies investment.
What It Means for Confidence-Building Training
Here's a critical insight: confidence-building training often has lower completion rates than content-delivery training.
Why? Because we actually push people through difficulty. We don't let people coast. We require them to struggle, problem-solve, and build real capability. Some people find this too hard and drop out. This is why completion rates are a terrible metric for evaluating training. They reward easy programmes and penalise challenging ones. But the challenging ones are the ones that actually work.
If you fund based on completion rates, you'll get easy training with poor outcomes. If you fund based on real outcomes, you'll get training that's appropriately challenging and actually changes lives.
The 5-Week Model: Why Confidence-Building Needs Time (But Not Too Much)
The Wrong Question Most Funders Ask
When organisations commission training, they often ask, "How fast can you deliver this?" The assumption is that faster delivery means more people trained, better ROI, and faster results. So training gets compressed.
But speed is the enemy of confidence-building. And confidence-building is what actually creates employment change.
The real question isn't "how fast can we deliver this?" It's "what's the minimum time needed for someone to genuinely believe they can do this?"
The answer is five weeks. Not two weeks. Not ten weeks. Five weeks.
Why Shorter Doesn't Work
In the first week of training, people are still in learning mode. They're absorbing new information, adjusting to the environment, figuring out whether they belong there. Confidence isn't built yet. They're just getting oriented.
By week two, they're starting to apply what they've learned. But the neural pathways are still forming. The new skills haven't become second nature. They still need close guidance.
Week three is where something shifts. They've done it enough times that it starts to feel less foreign. They're making mistakes and fixing them without immediately asking for help. They're starting to believe "I can actually do this."
Weeks four and five are consolidation. They're building speed, handling more complex projects, supporting each other as mentors, and internalising the belief that they're capable.
Compress this to three weeks and you get knowledge transfer. You don't get confidence. People leave before the belief shifts.
This matters for your investment. Three-week programmes produce higher completion rates (they're shorter, easier to get through), but they don't produce employment or business growth. Five-week programmes take longer but produce the outcomes that actually justify the funding.
Why Longer Doesn't Necessarily Work Better
But does a 10-week programme build even more confidence? Not necessarily. And it introduces new problems.
Dropout risk increases. The longer a programme runs, the more competing demands pull people away. Some lose childcare, jobs start, life happens. Ten weeks is a long commitment that not everyone can sustain, especially if they're already vulnerable to disruption.
Motivation peaks and dips. Week two is usually optimistic. Week four is often where doubt creeps back in. Weeks six to eight are where people get tired. The extended length actually makes it harder to maintain the momentum that builds confidence.
Diminishing returns. After five weeks of intensive, in-person, hands-on training, people have either built confidence or they haven't. Additional weeks don't meaningfully increase confidence. They just delay employment or business application.
Cost efficiency matters. If you're funding training out of a fixed budget, five-week cohorts allow you to train more people annually than 10-week cohorts, with better outcomes per pound spent.
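That last point is easy to sketch. The 46-week delivery year and cohort size of 14 below are purely illustrative assumptions, not our actual figures:

```python
# Rough annual throughput comparison.
TEACHING_WEEKS_PER_YEAR = 46  # illustrative assumption
COHORT_SIZE = 14              # illustrative assumption

def people_trained_per_year(cohort_length_weeks):
    cohorts = TEACHING_WEEKS_PER_YEAR // cohort_length_weeks
    return cohorts * COHORT_SIZE

print(people_trained_per_year(5))   # 9 cohorts a year -> 126 people
print(people_trained_per_year(10))  # 4 cohorts a year -> 56 people
```

Roughly twice as many people through the same delivery year, before the outcome difference per cohort is even counted.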
Why Five Weeks Is The Sweet Spot
Five weeks is long enough to:
Build genuine mastery (sustained, repeated practice, not one-off exercises)
Create the social bonds that generate belief (people support each other through the hard part)
Move through the confidence dip (weeks 3-4 are hard; week 5 is the breakthrough)
Complete substantive projects that prove capability
Establish new routine and belonging
Five weeks is short enough to:
Maintain momentum without burnout
Keep people committed despite competing demands
Sustain the intensity needed for real change
Finish before motivation naturally dips
Allow people to apply learning while it's fresh
This is based on what actually works, not on what's convenient. We could run two-week programmes if speed was the goal. We could run eight-week programmes if we wanted to maximise contact hours. But neither would change lives the way five weeks does.
In-Person Five-Week Training Changes the Equation
Here's what makes five weeks work: it's in person.
Virtual training, even if it ran for five weeks, wouldn't work the same way, because the key ingredient is absent: presence.
When people are in the same room for five weeks, they build relationships. They see each other struggle and succeed. They hold each other accountable. They create a sense of belonging that directly builds confidence. A trainer who knows them personally can push them in ways that matter. They can spot when someone's confidence is crumbling and intervene before they quit.
None of that happens virtually. So the timeline changes. You'd need more weeks in a virtual model to build the same confidence. And you'd still fall short because the relational element is missing.
What This Means for Your Funding Strategy
If you're designing a digital skills training intervention, don't optimise for speed. Optimise for outcomes. That means:
Reject two or three-week programmes (they don't work)
Be cautious of programmes longer than six or seven weeks (diminishing returns)
Demand in-person delivery (it's essential to the model)
Measure outcomes at 6 months (not completion rates)
Plan for cohorts that repeat (multiple five-week cohorts throughout the year beats fewer longer cohorts)
Five weeks isn't arbitrary. It's the result of understanding what actually builds confidence and what allows people to apply that confidence in their jobs and businesses.
Why Other Training Providers Focus on Content & Certificates But Miss Confidence
The Problem With How Most Digital Skills Training Works
When organisations commission digital skills training, they're typically looking for one thing - people with new skills who can get jobs or grow businesses. Sounds straightforward, right? But here's what's actually happening in most training rooms across the country: participants are learning content they'll forget, earning certificates that don't change hiring decisions, and leaving with exactly the same amount of confidence they arrived with.
This isn't a failure of the trainers. It's a structural failure of how digital skills training is designed.
Most providers operate on a simple model: deliver content, test knowledge, issue a certificate, call it a win. The logic seems sound. Skills → Certificate → Employment. But it breaks down in the real world because it's missing the crucial middle piece: belief.
Why Content Alone Doesn't Create Change
A participant can learn how to build a basic website. That's knowledge. But learning how to build a website in a training room, with an instructor available, in a low-stakes environment, is fundamentally different from believing you can build one when you're alone at your desk, a client is waiting, and doubt creeps in.
Here's what other providers miss: employment doesn't happen because of skills. It happens because of confidence.
We've seen this repeatedly over 10 years. Someone completes a digital skills course, gets the certificate, and then... nothing changes. They don't apply for jobs that need those skills. Self-employed people don't use the marketing tactics they learned. Why? Because somewhere between learning and doing, confidence collapsed.
This is a specific problem. It's not that the training was bad. It's that training designed around content delivery doesn't address the psychological barrier between "I know how to do this" and "I believe I can do this."
The Certificate Problem
Certificates are easy to count. They look good on reports. They're tangible proof that training happened. But they're also a red herring for what actually matters.
Employers don't hire people based on certificates from a five-week digital skills course. They hire people who:
Believe they can learn on the job
Aren't frozen by self-doubt when they encounter something new
Have proved they can finish what they start
Can work with others without imposter syndrome taking over
Traditional training providers optimise for what's easy to measure (completion rates, certificates issued) rather than what actually changes lives (confidence, employment, business growth).
When you fund training that prioritises certificates, you're funding something that looks impressive on paper but doesn't move the needle on the outcomes that matter: jobs filled, businesses growing, long-term unemployment reduced.
What Funders Actually Need
Councils, combined authorities, and other organisations are investing in people's futures. Your job is to fund interventions that work. That means funding training that builds both skills and the confidence to use them.
The distinction matters because it changes what you should be measuring, how you should be evaluating success, and which training partners are worth commissioning.
If a training provider tells you their success metric is "completion rates" or "certificates issued," they're measuring the wrong thing. If they tell you they track employment outcomes 6 months later and business revenue increases post-training, they understand what actually matters.
What Digital Gum Does Differently
We started 10 years ago because we got tired of watching people leave training programmes unchanged. We designed our approach around confidence-building, not content delivery.
Our programmes teach practical digital skills but we teach them in a way that builds confidence simultaneously. Participants work on real projects, support each other, prove to themselves they're capable, and leave knowing they can handle what comes next.
We're honest about our focus: skills without confidence don't create change. That's not marketing language. That's what we've learned actually works.
If you're commissioning training and want to see real outcomes—employment, business growth, reduced long-term unemployment—then you need a partner who understands this fundamental gap.
Knowing How vs Believing You Can: Why Confidence Gaps Block Employment
The Gap That Changes Everything
There's a critical moment in every training programme. It comes right after someone learns how to do something. In that moment, two very different things can happen:
Scenario 1: They feel a sense of capability. They think, "I learned this. I can do this. Maybe I can apply for that job." Confidence builds. They take action.
Scenario 2: They feel relief the training is over. They think, "That was hard. I don't know if I could actually do that on my own. I'll probably mess it up." Confidence collapses. Nothing changes.
The difference isn't in what they learned. It's in what they believe about themselves.
This is the confidence gap, and it's where most digital skills training programmes fail. They create knowledge but don't bridge the gap between knowing and believing.
What Research Shows About Confidence and Employment
Psychology research calls this self-efficacy, the belief in your ability to succeed in a specific situation. It's not the same as self-esteem (how you feel about yourself generally). Self-efficacy is specific: "Can I actually do this particular thing?"
Here's what matters for employment. Self-efficacy is a stronger predictor of job success than actual skill level.
Someone with moderate digital skills but high self-efficacy will apply for jobs, ask for help when needed, persist through problems, and learn on the job. Someone with strong digital skills but low self-efficacy will avoid opportunities, assume they'll fail, and retreat when something gets difficult.
Long-term unemployment specifically damages self-efficacy. When you've been out of work for months or years, you stop believing you're capable. The longer the unemployment lasts, the more your belief in your own ability erodes regardless of your actual skills.
This is critical for funders to understand: if you fund training that doesn't address this confidence erosion, you're funding something that can't reverse the damage.
The Employment Paradox
Here's what we see repeatedly: someone who's been unemployed for 18 months learns digital skills. On paper, they're now "job ready." But they don't apply for jobs. Why?
Because knowledge isn't enough. They learned how websites work and how to use Google Ads. But they don't believe they can do it in a real work situation with real pressure. They don't believe they won't freeze up. They don't believe they can problem-solve when something goes wrong.
This is the gap. And most training programmes don't address it.
Why In-Person Matters for Building Confidence
Confidence isn't built in isolation. It's built through experience, through seeing yourself succeed in front of others, through getting real feedback from a real person, through belonging to a group working toward something together.
This is why in-person training is non-negotiable for confidence-building. You cannot build genuine self-efficacy through a screen. You can transfer knowledge. You can create the appearance of training. But you cannot create the conditions where someone actually believes in themselves.
When someone learns a new skill in a room with others who are also struggling and learning, when they see other people figure things out, when a trainer gives them live feedback on their work, when they're part of a group that's collectively problem-solving - that's when belief starts to change.
Virtual training removes all of that. It's transactional. Content moves from trainer to participant with no relational connection, no social proof that "I'm capable," and no accountability beyond a screen.
The Three Elements of Confidence-Building
Genuine confidence in employment situations comes from three things:
1. Mastery — Actually succeeding at something difficult. Not just learning about it, but doing it, struggling through it, and completing it. This has to be real, not simulated.
2. Social proof — Seeing others like you succeed at the same thing. When you watch a peer figure something out, it changes what you believe is possible for you.
3. Personal relationship with someone who believes in you — A trainer who gives you honest feedback, pushes you when you need pushing, and demonstrates that they genuinely believe you can do this.
Virtual training can deliver mastery experiences (sometimes). It cannot deliver social proof or personal belief-building. Those require presence.
What This Means for Your Commissioning Decisions
If you're funding digital skills training because you want to reduce long-term unemployment or help self-employed people grow, you need a provider who understands this confidence gap.
Questions to ask:
How do you build confidence, not just teach content?
How do you create the conditions where people believe they can use what they've learned?
Why is in-person training essential to your model?
How do you measure confidence alongside skills?
Any provider worth funding should have clear answers to these. If they tell you they use virtual training, you should ask why. The answer "it's more scalable" or "it's more convenient" tells you they're optimising for delivery, not for changing lives.
The ROI of Confidence: Why Soft Skills Matter More Than You Think
A few months ago, we were talking to someone commissioning employment programmes. They were sceptical about our focus on confidence-building.
"I need people job-ready," they said. "I need them with skills employers want. Not just feeling better about themselves."
Fair point. We get it. When you're spending public money on training, you want tangible outcomes. Skills you can list on a CV. Qualifications. Things that sound concrete.
But here's what we've learned after ten years of doing this: technical skills alone don't get people jobs. Confidence does.
Or more accurately: technical skills plus confidence gets people jobs. Technical skills minus confidence just gets you someone who knows how to do something but won't apply for roles, freezes in interviews, or quits after two weeks because they don't believe they belong there.
That's a harder sell than "we'll teach them Excel." But it's true.
What Employers Say They Want (Then Struggle to Find)
Employer surveys keep saying the same thing. The top skills they struggle to find aren't technical. They're:
Communication
Teamwork
Problem-solving
Reliability
Willingness to learn
Ability to take feedback
These are soft skills. Or employability skills. Or whatever we're calling them this year.
And before anyone says "well obviously, those are just basic expectations" - yes, they are. But loads of people don't have them. Or don't have the confidence to demonstrate them.
We've worked with people who are perfectly capable of working in a team, but in an interview they clam up and can't articulate an example. People who could solve problems, but don't believe they're "problem-solvers" so don't even apply for roles asking for that.
The technical skill isn't the blocker. The confidence to show they have it is.
Why Technical Skills Alone Aren't Enough
Here's a scenario we see fairly often. Someone's been unemployed for 18 months. They do an online course in digital marketing. They pass. They get a certificate. They understand SEO, social media, content marketing, all of it. Six months later, they're still unemployed. Why? Not because they didn't learn the skills. Because they didn't believe they were good enough to apply for jobs. Or they applied but couldn't articulate their value in interviews. Or they got a job but left after a fortnight because they felt like a fraud.
This isn't rare. This is common. The technical knowledge was there. The confidence to use it wasn't. We've also seen the opposite. Someone with basic skills but loads of confidence gets hired over someone with better qualifications but no self-belief. Happens all the time. Employers hire people they think will fit in, contribute, and stick around. Confidence signals all of that.
The Research
There's decent research on this, though we're not going to pretend it's all watertight.
Studies on job search behaviour show that confidence (or "self-efficacy" in research terms) predicts:
How many jobs people apply for
Whether they follow up on applications
How they perform in interviews
Whether they persist after rejection
People with higher confidence apply for more jobs, come across better in interviews, and keep going when they get knocked back. People with lower confidence apply for fewer roles (often below their capability), freeze in interviews, and give up quicker.
There's also research showing that long-term unemployment erodes confidence over time. The longer someone's out of work, the less they believe they're employable. Which makes them less employable, not because of skills but because of how they present.
It's a vicious circle. And skills training alone doesn't break it.
What Confidence Actually Looks Like in Practice
Let's be specific about what we mean by confidence, because it's not just "feeling good about yourself."
In an employment context, confidence means:
Believing you can learn new things - "I don't know this yet, but I can figure it out"
Believing you have something to offer - "I can contribute to this team/company"
Ability to articulate your value - Actually saying what you're good at without apologising
Resilience to rejection - Applying for another job after getting turned down
Willingness to try things you've not done before - Taking on tasks outside your comfort zone
These are all learnable. And they all affect whether someone gets and keeps a job.
Can You Actually Measure Confidence?
Yes, though not as neatly as you can measure "completed Excel module." There are validated frameworks for measuring self-efficacy and confidence. We use a simple one - nothing fancy, just a questionnaire before and after the programme asking people to rate statements like:
"I feel capable of learning new skills"
"I believe I could get a job in the next three months"
"I feel confident explaining what I'm good at"
"I would feel comfortable in a job interview"
People rate these on a scale. We compare before and after. It's not perfect, but it gives you a direction of travel.
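As a minimal sketch of how that before-and-after comparison might be scored - the statements are quoted from the list above, but the 1-to-5 scale and the ratings below are illustrative assumptions, not our actual instrument:

```python
# Simplified scoring of a before/after confidence questionnaire.
# The 1-5 scale and the example ratings are illustrative assumptions.
STATEMENTS = [
    "I feel capable of learning new skills",
    "I believe I could get a job in the next three months",
    "I feel confident explaining what I'm good at",
    "I would feel comfortable in a job interview",
]

def average_shift(before, after):
    """Mean change across statements, each rated 1 (disagree) to 5 (agree)."""
    return sum(a - b for a, b in zip(after, before)) / len(before)

before = [2, 1, 2, 1]  # one participant's hypothetical pre-programme ratings
after = [4, 3, 4, 3]   # the same participant's post-programme ratings

for statement, b, a in zip(STATEMENTS, before, after):
    print(f"{statement}: {b} -> {a}")
print(f"Average shift: +{average_shift(before, after):.1f} points per statement")
```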
We also track behavioural stuff:
How many jobs did someone apply for before vs after?
Did they attend an interview?
Did they actually take the job they were offered? (Some people get offers but turn them down because they don't believe they can do it)
These are proxy measures for confidence. Not perfect, but useful.
What We've Seen (When We're Allowed to Track It)
The tricky thing about measuring confidence is that sometimes the outcomes don't look like what you'd expect. Someone completes a digital skills programme and gets a job in retail. The digital skills weren't directly relevant to the role, but the experience of learning, working in a team, and finishing something built enough confidence to apply. Would that count as a success in your impact metrics? Depends how you measure it.
Or someone learns the technical skills but takes six months before they actually apply for jobs. Not because they didn't learn, but because it took that long to believe they could do it.
Or someone gets a job, leaves after two weeks because they feel like they don't belong, then gets another job three months later and stays. Was the programme a failure because they left the first job? Or a success because they eventually stuck at something?
This is why confidence is hard to measure. The outcomes are real, but they're messy and they don't always show up in neat three-month reports.
How We Build Confidence (Without It Being Touchy-Feely)
We don't do affirmations or motivational speeches. That's not what builds confidence. What does build it?
Completing something - Starting a project and finishing it. Sounds basic, but for people who've been unemployed for months or years, finishing things matters.
Working with others - Being part of a team, contributing ideas, seeing that your input helps. Real-world evidence that you can work with people.
Getting feedback from someone who isn't your mum - The charity we work with says "this is useful" or "this helped us." That's different from a tutor saying "well done." It's external validation.
Doing something you didn't think you could - Building a website when you thought you weren't technical. Presenting to a group when you thought you'd freeze. Proving to yourself you're more capable than you thought.
Having someone believe in you - Instructors who treat you like you're capable, not like you're broken. That matters more than people realise.
These aren't soft and fluffy. They're practical experiences that build evidence for yourself that you can do things.
What We're Not Saying
We're not saying technical skills don't matter. They do.
We're not saying everyone just needs a confidence boost and they'll be fine. Some people need other support - mental health, housing, addiction services. Confidence-building isn't a magic fix.
We're not claiming we can measure confidence perfectly. We can't.
And we're definitely not saying that every unemployed person lacks confidence. Some very confident people are unemployed for loads of reasons that have nothing to do with self-belief.
What we are saying is this: if you're commissioning skills training and only focusing on technical skills, you're missing half the picture.
The Uncomfortable Truth
Confidence-building is harder to sell than skills training.
"We'll teach them digital marketing" sounds concrete. Measurable. Clear.
"We'll help them believe they're capable" sounds woolly. Unmeasurable. Unclear.
But we keep seeing the same pattern: people with skills and no confidence don't get jobs. People with confidence and basic skills do.
So we can either keep commissioning programmes that look good on paper but don't change employment rates, or we can invest in programmes that address the actual barrier.
Which is often confidence, not capability.
If you're commissioning employment programmes and want to talk about how you measure what actually matters - not just what's easy to measure - we're happy to chat.
No hard sell. Just a conversation about what we've seen work and what we're still figuring out.
We focus on what actually gets people into work, which often isn't what you'd expect. Based in Reading, working with commissioners across the UK.