2016-12-15

Happy Holidays! In celebration of a great year, we've collected a list of favourite resources of 2016 that are freely available online, as a bit of an end-of-year gift to share with you. The resources aren't necessarily new - but they are things we discovered for the first time, or rediscovered, in 2016. We have a wide range of cool tools, guides, examples and other resources that help us do evaluation better and think about it differently.

We had some great suggestions for this list from people in the BetterEvaluation community and we'd really like to thank everyone who wrote back or tweeted to us about their favourite resource. We hope you find it as interesting as we did to see what people's picks were and why.

The resources are clustered around the Rainbow Framework - so that if you are interested in more resources in a particular area, you can click through to find them - with more general resources at the end.

Are we missing something you found enormously useful or engaging in 2016? Please leave us a note in the comments with a link and tell us why it's your pick!

Favourite Evaluation Resources of 2016

Manage an evaluation or an evaluation system

View MANAGE Task Cluster

The World Clock Meeting Planner: This one seems almost too simple for inclusion, but then again, it is one of our most used tools. This tool lets you figure out the best time to talk to people from multiple time zones - and we highly recommend this if timezone arithmetic gives you as big a headache as it does us.
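If you'd rather script the arithmetic than use the web tool, here's a minimal sketch in Python using the standard-library zoneinfo module (Python 3.9+). The meeting time and time zones are made up for illustration:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# A hypothetical meeting time, anchored in UTC
meeting = datetime(2016, 12, 15, 14, 0, tzinfo=ZoneInfo("UTC"))

# Where the participants are (made-up list)
zones = ["Australia/Melbourne", "Europe/London", "America/New_York"]

# Convert once per participant and print their local wall-clock time
for zone in zones:
    local = meeting.astimezone(ZoneInfo(zone))
    print(f"{zone}: {local:%a %d %b, %H:%M}")
```

Letting a time-zone database do the conversion, rather than adding offsets by hand, also handles daylight saving correctly - which is where most of the headaches come from.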

The GeneraTOR: Our interactive tool prompts you for particular information to produce a draft Terms of Reference document, which can then be shared, reviewed and finalised with other stakeholders.

Capacity development - Dennis Bours drew our attention to his favourite 2016 resource: Scott Chaplowe's book Monitoring and Evaluation Training and its accompanying Resource Page (which we also came across recently). The page includes over 150 resources for M&E practice and capacity development, hand-picked from the authors' research for the book (most of them hyperlinked and freely available online). Although the book is not entirely free, Chapter 1 – M&E Training that Makes a Difference and Chapter 5 – What makes a good M&E Trainer? are available for download. You can read Dennis' full review of the book here.

Define what is to be evaluated

View DEFINE Task Cluster

Online, interactive software for drawing logic models: One of our enduring blog posts discusses the nuts and bolts of drawing logic models. Since that post, two new tools, Dylomo and Theory Maker, were submitted to BetterEvaluation this year, and we think it's great that people are thinking about new ways to make the process of drawing logic models as easy and accessible as possible.
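Neither tool requires any coding, but if you're curious how a logic model can be drawn programmatically, here's a minimal sketch using the open-source graphviz Python package - this is our own generic illustration, not how Dylomo or Theory Maker work, and the stage names and file name are invented:

```python
from graphviz import Digraph  # pip install graphviz (also needs the Graphviz binaries)

# A deliberately generic pipeline logic model: each stage feeds the next
model = Digraph("logic_model", graph_attr={"rankdir": "LR"})  # left-to-right layout

stages = ["Inputs", "Activities", "Outputs", "Outcomes", "Impact"]
for stage in stages:
    model.node(stage, shape="box")
for earlier, later in zip(stages, stages[1:]):
    model.edge(earlier, later)

model.render("logic_model", format="png", cleanup=True)  # writes logic_model.png
```

The appeal of a text-based approach is that the model lives in a few lines you can version and revise, rather than in a diagram you have to redraw.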

Sketch maps for articulating mental models: We're so grateful Rosemary Cairns shared her experiences in a blog on articulating mental models - she gives some great insights from her experience in the field about getting people to express how they think a program or project works. It's definitely worth a read if you haven't already.

Frame the boundaries for an evaluation

View FRAME Task Cluster

A fill-in-the-blank exercise: 'What I want to know from the [program name] evaluation is ________.' We found this in Michael Quinn Patton's book Utilization-Focused Evaluation (2008, pp. 49-51). You can read the full extract on Google Books here - it's a lovely anecdote about Michael trying to engage a room full of hostile stakeholders in order to identify a set of evaluation questions and concerns. With both sides growing increasingly frustrated and the evaluation on the verge of being called off, Michael asks everyone in the group to fill in the blank 10 times - and this, and the subsequent process of narrowing down everyone's questions, completely turns the evaluation around. It's definitely worth a read. Also check out page 52 (Exhibit 2.3), which lists five Criteria for Utilization-Focused Evaluation Questions.

Checklists for KEQs: The CDC Evaluation Questions Checklist was recently recommended to us by Robin Kuwahara, who wrote: "Colleagues of mine at CDC’s National Asthma Control Program created a useful checklist for assessing potential evaluation questions. The list is grounded in the evaluation literature and has benefitted from the practice wisdom of evaluators who serve in a range of capacities and agencies." The list has an emphasis on the importance of involving stakeholders in developing questions.

Another KEQ checklist that we'd recommend is Lori Wingate and Daniela Schroeter's Evaluation Questions Checklist for Program Evaluation, which distills and explains criteria for effective evaluation questions. It is housed on the Evaluation Checklists page of the Evaluation Center at Western Michigan University's website - which I'd highly recommend if you are into checklists (and who isn't, am I right?).

Describe activities, outcomes, impacts and context

View DESCRIBE Task Cluster

Examples of using big data in evaluation: In our various interactions with evaluators this year (including sessions on innovation in evaluation in Ottawa and Sydney), big data has been identified as something very few evaluators have experience or training in using for evaluation. The report Integrating Big Data into the Monitoring and Evaluation of Development Programmes by Michael Bamberger for Global Pulse is a call to action to encourage and inspire development agencies and evaluators to collaborate with data scientists to find innovative ways of using big data in development. It includes examples of ways that big data and related information and communications technologies are already being used in programme monitoring, evaluation and learning.

Analyzing data with Excel – Ann K Emery’s set of 50 two-minute videos covers topics such as cleaning and tidying data, exploring data, and analysing and reporting – and includes ways of using Excel to quickly analyse qualitative data, such as text responses to questionnaires.

Data Visualization checklist by Ann K Emery and Stephanie Evergreen - Updated in May 2016, this checklist should be printed out and stuck up on your wall above wherever you work so that there's no way you can miss it. For those who want more information, you can find Stephanie Evergreen's pages and recommended resources on BetterEvaluation under Visualise Data and Develop Reporting Media.

Evidence from previous evaluations - 3ie's Impact Evaluation Repository was recommended to us by Tricia Petruney via Twitter, who called it a "golden global good" - and we agree. The Repository is an index of all published impact evaluations of development interventions. All studies in the Impact Evaluation Repository have been screened to ensure they meet 3ie’s inclusion criteria, and in August 2016 a massive updating exercise of the IER was completed, bringing the total number of evaluations and links to original studies to 4,260 - quite an undertaking!

Visualising missing data - the Newfoundland and Labrador chapter of the Canadian Evaluation Society recommended an AEA365 blog post by Tony Fujs, which provides detailed instructions on how to visualise cases with missing data - and explains why this is important. (If you don't know the AEA365 blog series, do check it out for its amazing range of insights and resources, delivered one a day all year round.)
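We can't reproduce Tony's post here, but to give a flavour of the idea, here's a minimal sketch (our own illustration, not Tony's method) that renders a dataset's missingness as a heatmap using pandas and matplotlib - the toy survey data and column names are invented:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Toy survey responses with gaps (all values invented)
df = pd.DataFrame({
    "age":       [34, np.nan, 29, 41, np.nan, 55],
    "income":    [52000, 48000, np.nan, np.nan, 61000, 58000],
    "satisfied": ["yes", "no", "yes", np.nan, "no", "yes"],
})

# df.isna() gives a True/False grid of missingness; imshow renders it
# so that dark cells mark the missing values, case by case.
plt.imshow(df.isna(), aspect="auto", cmap="gray_r", interpolation="none")
plt.xticks(range(len(df.columns)), df.columns)
plt.ylabel("case (row number)")
plt.title("Missing values by case and variable")
plt.show()
```

Seeing the pattern matters: missingness clustered in particular cases or variables tells a very different story (and calls for different handling) than values missing at random.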

Understand Causes of outcomes and impacts

View UNDERSTAND CAUSES Task Cluster

Pathways to change: Evaluating development interventions with qualitative comparative analysis (QCA) - This report by Barbara Befani contains a step-by-step guide on how to apply QCA to real-life development evaluation and ensure its quality, including common mistakes and challenges. To quote Rick Davies in his review on M&E News: "This is an important publication, worth spending some time with. It is a detailed guide on the use of QCA, written specially for use by evaluators. Barbara Befani has probably more experience and knowledge of the use of QCA for evaluation purposes than anyone else. This is where she has distilled all her knowledge to date. There are lots of practical examples of the use of QCA scattered throughout the book, used to support particular points about how QCA works. It is not an easy book to read but is well worth the effort because there is so much that is of value. It is the kind of book you probably will return to many times."
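For a small taste of the mechanics, here's a minimal sketch (ours, not from Befani's report) of the first step in a crisp-set QCA: building a truth table and checking how consistently each configuration of conditions is associated with the outcome. The conditions, cases and values are all invented:

```python
import pandas as pd

# Invented crisp-set data: two binary conditions and a binary outcome per case
cases = pd.DataFrame({
    "funding":      [1, 1, 0, 1, 0, 0],
    "local_buy_in": [1, 0, 1, 1, 0, 1],
    "outcome":      [1, 0, 1, 1, 0, 0],
})

# Group cases by their configuration of conditions; 'consistency' is the
# share of cases with that configuration in which the outcome occurred.
truth_table = (
    cases.groupby(["funding", "local_buy_in"])["outcome"]
         .agg(n="size", consistency="mean")
         .reset_index()
)
print(truth_table)
```

The real analytical work in QCA - calibrating set membership, resolving contradictory configurations and minimising the truth table - comes after this step, and that is exactly what Befani's guide walks through.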

Synthesise data from one or more evaluations

View SYNTHESISE Task Cluster

Free e-Book by Judy Oakden and Melissa Weenink - What’s on the rubric horizon: Taking stock of our current practice and thinking about what is next. This e-book explores some of the challenges Judy and Melissa have encountered using rubrics in their practice, and includes feedback from a practice-based session at the 2015 ANZEA Conference in Auckland, New Zealand, which explored the difficulties evaluators face with rubrics.

Report and Support Use of findings

View REPORT AND SUPPORT USE Task Cluster

The Monitoring & Evaluation and Climate Change Interventions is a weekly online broadsheet of curated sector news and information on M&E, knowledge management, learning, capacity development and informed decision-making in a changing climate. It's a really handy place to check for the latest opinions and goings-on in the area, and we always find something new and interesting when we drop in. We also think it's worth highlighting in relation to our Report and Support Use tasks: thinking about new avenues for sharing reports and findings, and starting conversations about them.

General resources

A recommendation from BE Member Jess Noske-Turner: the Monitoring and Evaluation for Participatory Theatre for Change guide. Participatory Theatre for Change (PTC), like other forms of participatory communication, has typically been one of the ‘hard to measure’ approaches to addressing social and development challenges. This guide offers practical guidance and tool suggestions for implementing monitoring and evaluation in PTC programs, and highlights considerations and approaches for process and quality monitoring of PTC. Jess writes: "This is not a brand-new resource, but I have only just discovered it, perhaps because I assumed that it would be very specific to Participatory Theatre. In fact, I can see that this guide could be adapted and used for a range of different kinds of C4D. I think the strength is the way the theory of change is articulated - which is quite simple but specific - and the way the rest of the guide is built around that."

Tom Archibald let us know his 2016 pick via Twitter - the IIED Briefing Paper: Realising the SDGs by reflecting on the way(s) we reason, plan and act: the importance of evaluative thinking. Tom says: "This resource stood out for me because in the laudable push to evaluate the SDGs, there is a risk of approaching the task as a purely technical endeavor, consisting of the application of predetermined metrics as a sort of compliance activity. This publication compellingly makes the case that evaluation, especially in complex and adaptive contexts such as the SDGs, requires 'the skills and dispositions of critical thinking,' and that 'all of us—evaluators, policymakers, parliamentarians, implementers and the general public—must also think evaluatively.'"

Lanie Stockman had two top picks for us:

1. The Pelican Initiative: Platform for Evidence-based Learning & Communication for Social Change (which we are also huge fans of). In particular, there were two threads that Lanie really appreciated: "[The threads on] evaluation terms of reference and randomised control trials were energetic and relevant. The discussions confirmed that: (1) if evaluation terms of reference are not sensible, there's a good chance the evaluation report won't be either! and (2) It's ok to question Randomised Control Trial evaluation designs as the 'gold standard' - a range of knowledge forms are legitimate and important. Ultimately the evaluation questions should guide method." - Both very good take-away points!

2. Developmental Evaluation by Michael Quinn Patton. Lanie's take-away from this book was: "It's all about the relationships! This book reinforced that the evaluator-programmer relationship is critical if the evaluation report is going to come off the dusty shelf and actually be used for learning and program improvement." You can read an overview of the book by Michael Quinn Patton himself here, and we'd also recommend checking out our theme page on Developmental Evaluation for some additional resources.

Thanks to everyone who helped us with this list. Tell us your top 2016 pick in the comments below!
