This post is a short detour from the Big Data path, and a brain dump of ways of working I find helpful when commissioning an evaluation, or when responding to an ITT (invitation to tender) to deliver one (if you have ever asked me for advice, it will look familiar!).
As I started to explore big data analytics in more detail, I found it reassuring that, despite the mountains of new stuff I was looking at, there were similarities with my established ways of working. In particular, the importance of a good strategy (or design) and the questions you need to ask to get there.
I have been using (and updating) these questions to inform my approach for nearly ten years. You can ask them of yourself if you are commissioning an evaluation, or of your client, to help meet their needs. They are particularly useful for getting all stakeholders on the same page before you begin in earnest.
- What stage of development are we at (is the project or intervention at an early stage of development and still evolving, or more stable)?
- What do we want to know, for what purpose, and for what audiences?
- When do we need to know this by?
- What evidence already exists that is relevant to what we are doing?
- What data do we currently have or are we likely to have in the near future?
- What resources do we have – money, people’s time and expertise?
- Who else wants to know about this? Can we work together? Would our objectives be properly aligned or would it over-complicate things?
- What can we do ourselves and what are the tasks where we need to access additional skills and support?
I find these questions to be fundamental for good design, managing expectations, and saving time and money. Some other steps I find useful are:
- A little bit of philosophy: pausing to ask, what does 'knowing' mean in this context? What is a 'learning'? What counts as 'evidence'? There are plenty of detailed epistemological debates out there for those who want them. This open access BMJ supplement is one example from health care improvement, and BBC Radio 4's In Our Time programmes on philosophy are helpful for developing this kind of thinking.
- Planning for detailed planning: it is very rare for all the information needed for a good design to be available during a bidding process. If you are commissioning, I would recommend sharing as much detail as you can, perhaps using the questions above as a starting point; this will save time once you start. If the written bids and interviews have already provided an opportunity to get into the detail of how the work will be approached, the planning phase at the start of the project should be shorter and run more smoothly.
- Preparing a stakeholder map: a stakeholder map is a useful way of addressing the second question above. It should include stakeholders internal to the organisation commissioning the work as well as their external audiences. The map should be a 'live' document, and it provides a useful focus for working effectively with whoever is responsible for communications.
- Costing for a ‘translation’ phase: the stakeholder map will be invaluable here, for example in recognising all those audiences that won’t want to read a formal evaluation report. The skills required to produce a rigorous evaluation are not the same as those required to produce engaging outputs suitable for different stakeholders. Either make sure the team doing the work has this mix of skills, or slice off part of the budget for different outputs – summaries, design and infographics, events. Include this ‘translation’ from a traditional final report (or ‘master document’, as I think of it) into other materials and communications in the overall time-frame for the work. Translation is also about people having sufficient understanding to communicate accurately and effectively about research and evaluation. This is often an issue with quantitative data and analysis, and I refer to some useful resources in this previous post.