17. Methodological Playgrounds

I first heard the term 'methodological playground' in 2014, in a podcast from the Institute of Education's Professor Carey Jewitt. She was talking about her work with digital artists and social researchers, synthesising methods to open up different perspectives, generate imaginative research questions, and create a wider range of research tools.

Around that time I was working on an evaluation for Tempo Time Credits. In addition to the usual quants and quals skills needed for a large, complex evaluation, our core delivery team included a design specialist. Design skills proved vital for developing a data collection process focused on participants' experience – making it more engaging, less technical, and potentially valuable to the participants as well as to the evaluation. Some of the tools looked more like games, with packs of cards and maps to fill in.

Gamifying data collection is a way to increase collaboration and participation, and potentially to drive down the recruitment costs of research. Talking in a language people understand, and thinking about the end user to humanise the research process, draws heavily on the human-centred approach we hear about from the design world. For example, this paper from 2010 describes how a survey was packaged as a game on Facebook called Friendsense – an example discussed in detail in section 3.5.3 of Salganik's excellent book on digital research, Bit by Bit (although this project would arguably raise a number of access and ethical issues today).

Since I first listened to Jewitt’s podcast, the theme of creativity and blurring traditional boundaries in methods has grown stronger. Some examples:

• Data science and social science are becoming more integrated in academia: from Cardiff's Social Data Science Lab and Cambridge University's Undergraduate Quantitative Methods Centre teaching quants skills to budding social scientists so they can work with massive data sets, to the University of Essex's dedicated undergraduate degree in sociology with data science.
• Big qual sees qualitative researchers using computational skills to combine and explore qualitative data sets in new ways – valuable, for example, for adding contextual depth to systematic reviews (see the sketch after this list).
• Evaluation communities have recognised the important intersection with design. In 2016 the American Evaluation Association focused its annual conference on exploring how design and evaluation could be better integrated. More recently, the Australian Evaluation Society ran workshops on the role of evaluation skills in supporting design processes within organisations – recognising that many evaluators don't know the language or the promise of design, while many designers don't think to draw on the skills of evaluation and evaluative thinking.
• Design specialists Snook have been exploring how to combine design and data, pushing designers and data experts out of their comfort zones to work together more closely on services that are user-friendly and sustainable.
• The National Centre for Research Methods has an online resource introducing creative research methods, led by Helen Kara. It covers arts-based research, research using technology, mixed-methods research, and transformative research frameworks.
• The importance of storytelling: Alex Evans's book The Myth Gap, for example, highlights why evidence and arguments alone are not enough to make change – you need the unifying narratives that storytelling provides. Similarly, is statistics in crisis when the numbers provide only an overview that no longer connects with, or makes sense to, the individual?
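
To give a flavour of what big qual can look like in practice, here is a minimal, hypothetical sketch in Python: it pools two collections of interview transcripts and runs a simple keyword-in-context search across the combined corpus. The folder names and search term are invented for illustration, and real big qual projects use far richer tooling than this.

```python
# A minimal, hypothetical sketch of 'big qual': pooling two qualitative
# data sets (interview transcripts) and exploring them computationally
# with a simple keyword-in-context (KWIC) search. Folder names and the
# search term are invented for illustration.
import re
from pathlib import Path

def load_transcripts(folder: str, study: str) -> list[dict]:
    """Read every .txt transcript in a folder, tagging each with its study."""
    return [
        {"study": study, "file": p.name, "text": p.read_text(encoding="utf-8")}
        for p in Path(folder).glob("*.txt")
    ]

def keyword_in_context(docs: list[dict], term: str, window: int = 40) -> list[str]:
    """Return short snippets of surrounding context for each occurrence of `term`."""
    hits = []
    for doc in docs:
        for match in re.finditer(term, doc["text"], flags=re.IGNORECASE):
            start = max(match.start() - window, 0)
            end = min(match.end() + window, len(doc["text"]))
            snippet = doc["text"][start:end].replace("\n", " ")
            hits.append(f'{doc["study"]}/{doc["file"]}: ...{snippet}...')
    return hits

# Combine transcripts from two separately collected studies...
corpus = (load_transcripts("study_a_transcripts", "A")
          + load_transcripts("study_b_transcripts", "B"))

# ...and explore the pooled data: where do participants talk about 'trust'?
for line in keyword_in_context(corpus, "trust"):
    print(line)
```

Even something this simple shows the appeal: once the data sets share a common structure, a qualitative researcher can ask questions across hundreds of transcripts that would be impractical to read line by line.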

So Mark Birkin's description of a '4th paradigm in research' should be expanded beyond integrating sources of big data. Cross-functional teams now need to consider more than traditional quants and quals skills, and also think about context expertise, data and computer science, data stewardship roles, and design and storytelling capabilities – with team members having at least a high-level understanding of each other's disciplines so they can work together effectively.

What do we mean by a high-level understanding of different disciplines?

It goes beyond technical understanding: do we understand each other's languages, or know how to translate what we do in a way a non-expert can comprehend? Are our data protection and ethical standards (both statutory and best practice) aligned? What is it to 'know' something in your field? How do I quality-assure work outside my own expertise? Is everyone clear about IP rights?

And underpinning all of this is a need to centre work on issues of real concern, putting people at the heart of the work, articulating questions clearly and precisely, and being able to reflect on your practice in relation to others. (For more, I recommend Mathematica's panel on the intersections of data science and social science, and the Q&A section (from approximately minute 46) of this Sage Ocean panel on the skills social scientists need for the future.)

Why should commissioners and users of research and evaluation evidence be interested in (and willing to pay for) new configurations in delivery teams?

Even if you have a 'simple' intervention, the way it interacts with its context, and the kinds of question we are interested in – does it work, for whom, in what contexts, what else works, how much does it cost, what don't we know? – are likely to be complex. In reality, most interventions are not simple in practice; they do not fit linear diagrams.

At the 2018 CECAN conference, Matthew Taylor called for evaluation that is responsive to the policy-making cycle, making available rich accounts of change and of systems, with public engagement and experience being critical. 'Systems perspective' and 'systems change' are both popular terms; to be meaningful, they must amount to more than a descriptive systems map – richer accounts of change gathered from more, and more diverse, viewpoints, and systematic evaluation to understand what encourages outcomes or failures across a system. Penny Hawkins emphasised the need for deliberative democratic evaluation, where people don't only respond but are involved in framing the work and participating in it.

These needs can be better met through a plurality of methods, combining the best of quants and quals with new approaches from design, storytelling, and data science.
