3 Evaluation Planning: RUFDATA
CSET use a range of planning tools to assist practitioners in developing
a framework for evaluation and indicators of performance. For identifying,
planning and collecting evaluation data we use the RUFDATA tool, which
consists of a series of topics, each with a set of questions that allow
you to develop your evaluation in a methodical way.
The RUFDATA tool is suitable for developing an evaluation plan, or for
planning an individual evaluation project listed on a wider plan. In essence,
the process is the same.
Evaluation Preparation: RUFDATA - a theoretical explanation 3A (pdf 209kB)
Saunders, M. (2000) Beginning an evaluation with RUFDATA: theorising
a practical approach to evaluation planning. Evaluation, Vol. 6, No. 1, pp. 7-21
Evaluation Preparation: RUFDATA Headings 3B (pdf slides 435kB) (pdf Handout 200kB)
This PDF file includes the headings used in the RUFDATA planning framework;
it can be used as a basis for workshops or group discussion.
Things to do
Review the RUFDATA questions and materials
to decide which will be most suitable for you to use in your context.
Presentation 3B provides a basis for general discussion and for
identifying answers relevant to the evaluation plan itself; the
activity sheets in 3C are suitable for individual evaluations within
your plan, and the table format is more suited to the plan itself.
In developing the overall evaluation plan
it is worth ensuring that those involved have either participated
in the evaluation preparation activities or received the outcomes
and ideas generated during that process.
Remember, it is not possible to evaluate everything,
so you will need to decide on priorities. It saves time and increases
institutional commitment if key institutional stakeholders help
decide priorities and the relevant member of the senior management
team or committee signs off the plan.
A key evaluation activity in year 1 is to
confirm and evaluate your data collection, recording and reporting
mechanisms. You may have begun this process as part of the evaluation
preparation; nevertheless, the importance of establishing a robust
set of ‘core participant data’ means that participant
data, progression and attainment are key activities in your evaluation
plan.
NB: If you believe you have your data sorted, then
you could focus your evaluative activity on identifying the enabling
and process impact indicators to disseminate your good practice.

Evaluation Preparation: RUFDATA Record Sheet 3C (pdf 60kB) (word 45kB)
This activity template includes two formats for the RUFDATA questions
and prompts. The portrait version is most suitable for planning individual
evaluations. The landscape version provides a tabular format for recording
multiple evaluations and could be used as an action plan for the evaluation
plan itself.
Evaluation Preparation: RUFDATA - a worked example 3D (pdf 135kB)
This information sheet includes the bullet points and ideas generated
by an HEI working through the RUFDATA framework. Although institution-specific,
it illustrates how the RUFDATA framework supports the identification
of broad ideas suitable as a basis for a specific evaluation plan.
NB: As individual evaluations planned using RUFDATA become available, these
will be posted on the website for others to adapt and modify; see Evaluation
Evidence.

RUFDATA: Further Guidance
RUFDATA operates as a framework for guiding the evaluation process. The
metaphor of a 'framework' is apt: a framework should provide a generic
context for action in which some recognisable shaping characteristics
are evident, but within which a wide range of actions is possible.
Our interpretation of the context in which widening participation (WP) teams
work suggests the need for a framework that would enable each WP team to fulfil
some basic evaluation activity and to develop a reporting mechanism for its
activities to the wider University within a coherent management process.
RUFDATA is an acronym for the procedural decisions that can shape evaluation
activity within a widening participation strategy based in a University
or an Aimhigher Partnership. RUFDATA provides the basis for a series of
decisions that can frame an institutional or partnership widening participation
evaluation plan, as well as a plan for a specific activity.
Reasons
What are our Reasons and Purposes for evaluation?
These could be planning, managing, learning, developing, accountability
Uses
What will be the Uses of our evaluation?
They might be providing and learning from embodiments of good practice,
staff development, strategic planning, PR, provision of data for management
control
Focus
What will be the Foci for our evaluations?
These include the range of activities, aspects and emphases to be evaluated;
they should connect to the priority areas for evaluation
Data
What will be our Data and Evidence for our evaluations?
Numerical, qualitative, observational, case accounts
Audience
Who will be the Audience for our evaluations?
Community of practice, commissioners, yourselves
Timescale
What will be the Timing for our evaluations?
When evaluation should take place: coincidence with decision-making cycles,
life cycle of projects
Agency
Who should be the Agency conducting the evaluations?
Yourselves, external evaluators, or a combination of both
