9 Evaluation Dissemination
The RUFDATA evaluation framework used by CSET includes three sections that relate to dissemination. The overall PURPOSE and USE will both shape your answers to the questions about AUDIENCE, and these in turn influence the strategies you might use to disseminate your evaluation. Decisions about dissemination also connect with the different evaluation perspectives of accountability, development and knowledge (see below).
Questions about evaluation dissemination
The PURPOSE and USE of your evaluation will influence:
- why you disseminate
- to whom you disseminate
Thinking about your AUDIENCE will help you decide:
- what you disseminate
- how you disseminate
- when you disseminate
Evaluation Dissemination: Dissemination Strategies 9A (pdf slides kB)
This information sheet includes ideas for different ways of disseminating that might be included within a communication plan.

Evaluation Dissemination: Dissemination Opportunities 9B (pdf slides kB)
This list of websites includes details of where you might disseminate findings from your evaluation.
Dissemination and Aimhigher communication plans
Aimhigher Partnerships are required to develop a communication plan; this provides an ideal framework to capture and record decisions about how, what and when you will disseminate your findings to different AUDIENCES. Although HEIs are not required to produce a separate communication plan, they may find it useful to discuss dissemination strategies with institutional colleagues who have responsibility for working with the press, developing news pages for the website, and producing internal electronic and paper newsletters. Whilst the RUFDATA model encourages a more planned approach to dissemination, this should not stop you pursuing valuable serendipitous and unpredicted opportunities to disseminate your findings.
Dissemination and evaluation perspectives
The section of the toolkit about evaluation
perspectives outlines three umbrella uses of evaluation: accountability,
development and knowledge. Each of these perspectives has a different
audience with respect to dissemination.
Dissemination for Accountability
There are several layers of accountability, influenced by external and internal drivers, which determine what you have to do and what you choose to do with respect to dissemination. It is likely that there will be a difference between dissemination to:
- meet the external and conditional requirements of a funder, which may include a significant amount of monitoring data as well as providing an opportunity to disseminate some of your evaluative findings
- demonstrate your commitment and accountability to the wider communities of practice with whom you work (e.g. other Aimhigher or widening participation practitioners), which may emphasise evidence relating to examples of good practice; depending on the context and format, the artefacts you produce may encompass ideas contributing to your dissemination for development.
Things to do
Identify as early as possible whether your funder has specific requirements regarding dissemination. This allows you to begin to gather monitoring data and consider how you will incorporate your evaluative data from the start. For instance:
- Is there a particular form? (This is often the case with reporting monitoring data, which may include space and opportunity to disseminate some of your evaluative findings.)
- When does the form need completing and in what format? (Sometimes this will need 'signing off' by specific individuals or groups, so you need to think carefully about the timing.)
- What other ways can you communicate and disseminate your findings to funders and others? See dissemination strategies 9A.
- Who else are you accountable to? (It is important to think about the way in which you disseminate your findings to them and ensure it is in an appropriate format.)
Dissemination for Development
One of the features of this type of dissemination is that it is underpinned by the intention and hope that you can use the process and the artefacts you produce to generate additional learning for yourself and thus enhance the quality of what you do. The importance of, and interest in, feedback lies in gaining further insights into how you might develop your work in the future.
There is, however, likely to be a difference between dissemination to:
- members of a specific team who are working together on developing
and delivering a particular activity, which may include informal verbal
feedback as well as a written account that you might use as a basis
for the other two forms of dissemination for development. (TEAM)
- members of your own institution or Aimhigher partnership, with whom you may be willing to share a 'warts and all' account under Chatham House Rules, rather than a more sanitised or edited version of a report which often has the 'less successful components' removed. (HEI / AIMHIGHER PARTNERSHIP)
- members of a wider community of practice or network of interested practitioners who are already engaged in, or planning to deliver, similar activities in the future. This form of dissemination for development may include top tips, good practice case studies, short summaries submitted in response to others' requests (e.g. Action on Access Good Practice Guides), summaries in response to individual requests, or even answers provided in other people's electronic surveys as part of their evaluation activities. (SECTOR)
Using evaluation as an aid to developing activities is one of the more
rewarding elements for those who undertake evaluation as one part of their
work. This is because it enables you to think about what you are doing
and legitimately take time to consider how the findings from your evaluation
can improve your future action. In some situations it can even inform a decision to stop an activity, as well as to modify one.
Things to do
The needs of the audience also influence dissemination for development. When thinking about the:

TEAM: identify when you will reflect on the evaluative evidence gathered and try to achieve a balance between a reactive and a proactive response. See the cycle for an overview of the process of reflection – action – review – reporting.
- A reactive review might include reading through the evaluation feedback forms to help decide what changes need making for the next event.
- A proactive approach might include a planned review of emerging issues from a series of events; these will have been collated by a member of the team and ideally include some initial analysis. This process might be repeated throughout the duration of an evaluation but eventually lead to the production of reports or presentations for the HEI/AIMHIGHER PARTNERSHIP or SECTOR.

HEI/AIMHIGHER PARTNERSHIP: identify when you will disseminate your findings (existing or special events) and what you hope to gain from the process.
- Within both organisational contexts there will be a range of existing meetings at which you can gain a captive audience who will be willing to read a short paper or listen to a short presentation and ask questions or offer feedback.
- At other times you may wish to invite individuals to a special event. This is more likely to be appropriate towards the end of an evaluation when you have more to disseminate. Who you invite may depend on whether your focus is on deciding the future of an activity or on promoting good practice, which may have much in common with dissemination to the sector.

SECTOR: identify where, to whom and how you will disseminate your findings in order to have the biggest impact. Although the focus may remain a developmental one, as you disseminate to the sector you are beginning to move into disseminating for knowledge.
- When disseminating at conferences or workshops, try to find out who the audience might be so that you can select the information that is most likely to be relevant to them. For instance, policy makers will have different interests and ask different questions to a group of practitioners.
- Similarly, a regional audience might lead to developmental opportunities for working in partnership, whereas a national audience may result in more requests for information, which may help raise the profile of your work but not aid you in developing your own provision.
- Think about who is best placed to disseminate your findings. Is it possible or appropriate to invite some of the participants to help you in this process?
- Although a presentation may be the most immediate means of obtaining developmental feedback, written reports, press releases, or items placed on an email discussion list or website can also provide a useful means of disseminating for development.
- See dissemination strategies 9A for other ideas for disseminating for knowledge.
|
Dissemination for Knowledge
The purpose of evaluation for knowledge is to obtain a deeper understanding of some specific area or policy field. It is more typical of levels 4 and 5 evaluation, which is more concerned with macro or sector-wide practices and longer-term strategic objectives. It is not expected that the scale of individual evaluations will automatically lead to dissemination for knowledge; however, as discussed in the evaluation practicalities, the rigorous collection of data and clarity about the context and status of the evaluation findings will make it possible for smaller-scale evaluations to be included in meta-analysis.
Things to do
- Consider working in partnership with others to develop a combined evaluation project that has the potential to contribute to the wider knowledge base.
- Identify other projects with which you might share your evaluation findings to enable them to become part of the evidence base of a meta-analysis.
- Explore within your HEI or Aimhigher Partnership whether there are individuals interested in, or already undertaking, research whose purpose is to contribute to knowledge (for instance, a postdoctoral student).
- Use others' research to inform your own practice and decisions about how to develop your activities: for instance, enable staff to attend relevant conferences to learn more about a specific topic, and allow them the time to read, discuss and share ideas from evaluation reports with others in the team.
- It is a common concern amongst practitioners that they face pressure to fit the action into their daily timetable; if the benefits of engaging with knowledge arising from evaluation are to be realised, they need to have time. It is also worth remembering that whilst some will relish this opportunity, others will prefer to focus on action but welcome the opportunity to hear brief updates from their colleagues.
- Identifying suitable people to undertake the different elements of all stages of the evaluation process, from data collection through to dissemination, is important.
Disseminating Evaluation Evidence
For examples of evaluative reports please see evaluation evidence. If you have a good idea of how to disseminate your evaluation findings, we would be happy to disseminate it on this site; please email Ann-Marie Houghton.
Return to 'Toolkit' Structure:
Ten features of evaluation