Social Value Engine

What is sensitivity analysis?

Sensitivity analysis is a technique that helps you test how your social value results change when you adjust key assumptions.

While the Social Value Engine provides standardised proxies, you set your own deflators (deadweight, attribution, displacement, drop-off) and beneficiary numbers. Sensitivity analysis lets you explore “what if” scenarios around these inputs.

Please note that sensitivity analysis is not required to create a valid social value calculation. It’s an additional step that some organisations find helpful for building confidence in their results and communicating uncertainty to stakeholders.

When might sensitivity analysis be useful?

Consider using sensitivity analysis if:

  • You’re presenting to stakeholders who may question your assumptions
  • You have uncertainty about key figures (e.g., exact beneficiary numbers, duration of impact)
  • You want to identify which assumptions matter most to your overall results
  • You’re seeking funding and want to show you’ve stress-tested your claims
  • You’re comparing multiple intervention options and want to understand risk

You probably don’t need sensitivity analysis if:

  • You’re conducting a straightforward internal evaluation
  • Your assumptions are well-evidenced and widely accepted
  • You’re early in programme development and need quick directional insights
  • Time and resources are limited

How sensitivity analysis works

The Social Value Engine gives you standardised outcome values, but you control:

  1. Beneficiary numbers – How many people experience each outcome
  2. Leakage – Benefits going outside your intended target area/population
  3. Deadweight – What percentage would have happened anyway
  4. Attribution – How much credit your organisation can claim
  5. Drop-off – How benefits decline over time
  6. Displacement – Whether your impact reduces impact elsewhere
  7. Duration – How many years the impact lasts

Sensitivity analysis means recalculating your social value using different assumptions for these variables to see how results change.
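
If it helps to see how these variables combine, here is a minimal sketch in Python. The function names, parameters, and the compounding treatment of drop-off are illustrative assumptions made for this article, not part of the Social Value Engine itself:

```python
def annual_value(proxy, beneficiaries, leakage, deadweight, attribution, displacement):
    """Value of one outcome in year 1, after applying the standard deflators."""
    return (proxy * beneficiaries
            * (1 - leakage)
            * (1 - deadweight)
            * attribution
            * (1 - displacement))


def total_value(year_one, duration, drop_off):
    """Sum the value over the duration, reducing it by the drop-off rate each year after year 1."""
    return sum(year_one * (1 - drop_off) ** year for year in range(duration))
```

Changing any one of these inputs and re-running the calculation is all sensitivity analysis involves.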

Practical example: Youth Employability Programme

Let’s look at a 6-month training programme helping unemployed young people gain employment.

Base case calculation

Outcome: Person moves from unemployment to employment
Social Value Engine proxy: £18,678 per person per year

Your assumptions:

  • Beneficiaries: 35 people (a 70% success rate from 50 participants)
  • Duration: 2 years
  • Leakage: 0% (all beneficiaries are in target area)
  • Deadweight: 15%
  • Attribution: 70%
  • Drop-off: 25% per year after year 1
  • Displacement: 5%

Year 1 value:
£18,678 × 35 × (1 – 0) × (1 – 0.15) × 0.70 × (1 – 0.05) = £369,521

Year 2 value:
£18,678 × 35 × (1 – 0) × (1 – 0.15) × 0.70 × (1 – 0.05) × (1 – 0.25) = £277,141

Total social value: £646,662
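
As a quick check, the base case can be reproduced in a few lines of Python (or on a calculator); the variable names below are purely illustrative:

```python
proxy = 18_678            # £ per person per year (Social Value Engine proxy)
beneficiaries = 35
leakage, deadweight, attribution, displacement = 0.00, 0.15, 0.70, 0.05
drop_off = 0.25           # applied after year 1

year_1 = proxy * beneficiaries * (1 - leakage) * (1 - deadweight) * attribution * (1 - displacement)
year_2 = year_1 * (1 - drop_off)

print(round(year_1))           # 369521
print(round(year_2))           # 277141
print(round(year_1 + year_2))  # 646662
```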

Testing different scenarios

Now manually recalculate using different assumptions:

Scenario 1: conservative success rate

  • Change beneficiaries from 35 to 25 people (50% success rate)
  • Keep all other assumptions the same
  • Result: £461,901 (29% lower)

Scenario 2: lower attribution

  • Change attribution from 70% to 50%
  • Keep all other assumptions the same
  • Result: £461,901 (29% lower)

Scenario 3: shorter duration

  • Change duration from 2 years to 1 year only
  • Keep all other assumptions the same
  • Result: £369,521 (43% lower)

Scenario 4: combined conservative case

  • 25 beneficiaries + 50% attribution + 1 year duration
  • Result: £188,531 (71% lower)

Scenario 5: optimistic case

  • 40 beneficiaries + 85% attribution + 3 years duration
  • Result: £1,185,860 (83% higher)

Results summary table

Scenario | Beneficiaries (success rate) | Attribution | Duration | Total value | % change
Base case | 35 (70%) | 70% | 2 years | £646,662 | –
Conservative success | 25 (50%) | 70% | 2 years | £461,901 | -29%
Lower attribution | 35 (70%) | 50% | 2 years | £461,901 | -29%
Shorter duration | 35 (70%) | 70% | 1 year | £369,521 | -43%
Combined conservative | 25 (50%) | 50% | 1 year | £188,531 | -71%
Optimistic case | 40 (80%) | 85% | 3 years | £1,185,860 | +83%
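
The table can also be reproduced programmatically. The sketch below is one possible way to loop over the scenarios; the scenario definitions and the compounding drop-off after year 1 are assumptions made for this example, and leakage is omitted because it is 0% throughout:

```python
PROXY = 18_678        # £ per person per year
DEADWEIGHT = 0.15
DISPLACEMENT = 0.05
DROP_OFF = 0.25       # compounds each year after year 1

def total(beneficiaries, attribution, years):
    year_1 = PROXY * beneficiaries * (1 - DEADWEIGHT) * attribution * (1 - DISPLACEMENT)
    return sum(year_1 * (1 - DROP_OFF) ** y for y in range(years))

scenarios = {
    "Base case":             (35, 0.70, 2),
    "Conservative success":  (25, 0.70, 2),
    "Lower attribution":     (35, 0.50, 2),
    "Shorter duration":      (35, 0.70, 1),
    "Combined conservative": (25, 0.50, 1),
    "Optimistic case":       (40, 0.85, 3),
}

base = total(*scenarios["Base case"])
for name, inputs in scenarios.items():
    value = total(*inputs)
    print(f"{name:<22} £{value:>12,.0f}  {value / base - 1:+.0%}")
```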

From this analysis, you learn:

Duration matters most – Whether impact lasts 1, 2, or 3 years creates the biggest swing in results. This suggests it’s worth investing in tracking job retention rates.

Success rate is controllable – The number of beneficiaries who gain employment is something your programme can directly influence through design improvements.

You have a credible range – You can confidently say your programme generates between £189k (very conservative) and £1.19m (optimistic), with £647k being the most realistic estimate.

How to do this manually

Step 1: Calculate your base case

Use the Social Value Engine to calculate your social value with your best estimates for all variables.

Step 2: Identify key uncertainties

Ask yourself: Which of my assumptions am I least confident about? Which could reasonably be different?

Step 3: Choose 2-4 variables to test

Don’t test everything – focus on the inputs you’re most uncertain about or that stakeholders might question.

Step 4: Define your ranges

For each variable, decide on realistic alternative values:

  • Conservative/pessimistic
  • Your base case
  • Optimistic

Step 5: Recalculate manually

For each scenario, plug the new numbers into your calculation. You can do this in a spreadsheet or even with a calculator.
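
For instance, testing the lower-attribution scenario only means swapping one number into the same calculation. A sketch using the figures from the example above (the names are again illustrative):

```python
proxy, beneficiaries = 18_678, 35
deadweight, displacement, drop_off = 0.15, 0.05, 0.25

def two_year_total(attribution):
    year_1 = proxy * beneficiaries * (1 - deadweight) * attribution * (1 - displacement)
    return year_1 + year_1 * (1 - drop_off)   # year 1 plus the dropped-off year 2

print(round(two_year_total(0.70)))  # 646662 - base case
print(round(two_year_total(0.50)))  # 461901 - lower attribution
```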

Step 6: Present your range

When reporting results, you might say:

“Our analysis indicates social value of £646,662. Under conservative assumptions (lower success rates and shorter impact duration), this could be as low as £189,000. With strong programme performance, it could reach nearly £1.2 million.”

Tips for using sensitivity analysis

  • Keep it simple – Testing 3-4 scenarios is usually enough. 
  • Be realistic – Use plausible alternative values, not extreme outliers.
  • Focus on what matters – Test variables you’re uncertain about, not ones that are well-evidenced.
  • Document your thinking – Note why you chose particular ranges so you can explain your reasoning.
  • Use it as a learning tool – Sensitivity analysis often reveals where you need better data or stronger evidence.

Remember

Sensitivity analysis is a tool, not a requirement. Many excellent social value assessments are created without it. Use it when it adds value to your analysis and communication, not because you feel you must.

The Social Value Engine provides robust, standardised proxies. You can apply them thoughtfully to your context, and whether you choose to test different scenarios is entirely up to you and your organisation’s needs.

 

 
