Appendix three - MEL framework

Step by step guide



Step 1: Define SMART Objectives against the OA outcomes your use case supports

Create Specific, Measurable, Attainable, Relevant, and Time-Bound (SMART) objectives for your use case that support the three OA strategic outcomes: being an independent initiative, becoming integral national data infrastructure, and reducing inequalities in the sport and physical activity sector. This step should be completed during 'Understand and plan'.

Step 2: Define measurable KPIs for each objective

Key Performance Indicators (KPIs) will be used to measure whether the objectives you set have been achieved or are on track. Some examples could include:

  • increased levels of activity within disproportionately affected demographics

  • commitment to inclusive governance

  • new metrics demonstrating ambition to reach under-represented groups

  • increased open access to opportunities data

  • participatory case studies

  • increased level of engagement with a diverse range of groups across sectors

This step should be completed during 'Understand and plan'.
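
If a use case team keeps its objectives and KPIs in a structured form, linking each SMART objective to the OA outcome it supports and to the KPIs that will evidence it makes later monitoring easier. The sketch below is a minimal, hypothetical illustration of such a record; the field names and example values are not part of the framework.

```python
# Hypothetical record linking one SMART objective to the OA outcome it
# supports and the KPIs that will be used to evidence it.
# Field names and example values are illustrative only.
objective = {
    "oa_outcome": "Reducing inequalities in the sport and physical activity sector",
    "smart_objective": (
        "Increase recorded activity sessions among women and girls in the "
        "pilot area by 10% within 12 months of launch"
    ),
    "kpis": [
        "Monthly count of activity sessions booked by women and girls",
        "Number of partners publishing open opportunity data",
    ],
    "review_milestones": ["month 3", "month 6", "month 12"],
}

# A simple completeness check: every objective should name an outcome,
# at least one KPI and at least one review milestone.
assert objective["oa_outcome"]
assert objective["kpis"] and objective["review_milestones"]
```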

Step 3: Measure use case delivery

Monitoring is the systematic measurement of the use case pilot's progress: the KPIs are assessed continually over the life-cycle of the pilot against pre-agreed milestones. The information collected generates a useful evidence base that can be reviewed regularly to demonstrate each use case’s contribution to OA's outcomes. This step should be completed during ‘Develop’. One suggested way to monitor performance is to use a ‘RAG’ rating (red, amber, green), as set out in the table below.

| Percentage | RAG monitoring | Meaning |
| --- | --- | --- |
| 0-40% | Blocked or not relevant | We failed to make progress and/or the KPIs are not relevant to us any more. |
| 40-70% | In progress | We made progress, but fell short of completion. |
| 70-100% | Accomplished | We delivered. |
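
If KPI progress is tracked as a percentage of completion, the banding above can be applied automatically. The sketch below assumes the usual mapping of red, amber and green onto the three bands, and treats 40% and 70% as the start of the higher band (the table's ranges overlap at those values); the function name is illustrative.

```python
def rag_status(percent_complete: float) -> str:
    """Map a KPI completion percentage to the RAG bands in the table above.

    Assumes red = blocked or not relevant, amber = in progress and
    green = accomplished, with 40% and 70% counted in the higher band.
    """
    if not 0 <= percent_complete <= 100:
        raise ValueError("percent_complete must be between 0 and 100")
    if percent_complete >= 70:
        return "Green - accomplished"
    if percent_complete >= 40:
        return "Amber - in progress"
    return "Red - blocked or not relevant"


# Example: a KPI at 55% completion would be reported as amber.
print(rag_status(55))  # Amber - in progress
```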

Step 4: Share learning

The intention is to collate the shared learning from use case communities to build an evidence base demonstrating the impact of OA. This will help to improve and grow the initiative. With these efforts, we hope to increase the levels of activity within disproportionately affected demographics, including:

  • children and young people

  • older people

  • women and girls

  • people from lower socioeconomic backgrounds

  • disabled people

  • ethnically diverse communities

The key to learning is the application of insights to improve use case work. Learning involves making improvements, either during the use case rollout or for the next time you do similar work. Tools to support learning and reflection include summary reports of KPIs, process evaluations and case studies. This step should be completed during ‘Learn and publish’.

Further reading

  • Improving impact: Our approach to monitoring, evaluation and learning

  • Sport England Evaluation Framework