I spend a lot of time thinking about how to best meet the needs of my clients. One of the ways to do this is by keeping up with innovations in the field of evaluation and other related fields. So naturally one of the questions I had been asking myself during the past year leading up to the 2016 American Evaluation Association Conference (#Eval16) was, “what are the current trends in evaluation?” Here are some themes that I’m hearing in my own work and those that I heard at Eval16:

  • Design
  • Databases
  • Evaluation Capacity Building
  • Cultural Responsiveness
  • Communication

Let’s look at each of these areas in more depth:

1. Design

Given that the theme for Eval16 was Evaluation+Design, it should come as no surprise that the field of evaluation is increasingly focusing on design. And while President John Gargani’s 2012 prediction that “evaluation reports will become obsolete” hasn’t quite happened yet, our reports are incorporating many design elements and are evolving into shorter, more visually engaging communications pieces.

That being said, it seems we’re not all fully on board with this design concept yet. When Heather Fleming, Founder of Catapult Design, asked the plenary audience who considers themselves to be a designer, very few hands were raised. I can relate, as I’ve personally struggled with adopting the title of designer. But we are designers! We design evaluations, we design data analysis plans, we design reports, we design communications materials. And with the tools that leaders in the field like Stephanie Evergreen, Ann Emery, and Chris Lysy are developing, we will design even better materials. So in 2017 I hope more evaluators will embrace this role and play around with creativity in their work. Come on, say it with me, “I am a designer.”

2. Databases

Increasingly I’m being asked which programs organizations can use to store their information in one central location. I’m hearing a lot of frustration from staff about funders requiring them to use certain databases, as well as about a lack of communication across data systems. This is making it difficult for organizations to understand, analyze, and use their data.

The State of Evaluation 2016 found that “55% of nonprofit organizations are using 4 or more data storage methods.” I mean, really, who can keep track of data in 4+ different databases – that’s confusing for everyone! So I think we’re going to begin to see even more community-based organizations moving toward customizable CRMs and databases like Salesforce and Efforts to Outcomes (ETO).

3. Evaluation Capacity Building

This year the phrase “evaluation capacity building” was included in the title of 20 AEA conference sessions while another 9 session titles mentioned “capacity building.” Evaluation capacity building (ECB) is not a new concept. In fact, the 2000 AEA Conference Presidential Theme was Evaluation Capacity Building. That being said, I think we still have more work to do around ECB.

  • “28% of nonprofit organizations exhibit promising evaluation capacity” (State of Evaluation 2016). This is the same percentage as reported in 2012.
  • 69% of respondents said their foundation invests too little in “improving grantee capacity for data collection or evaluation” (Benchmarking Foundation Evaluation Practices). That’s a lot of respondents reporting a need for ECB!

While there’s a lot of great work happening around ECB, very little information is being disseminated across consultants, organizations, and countries. I’m looking forward to seeing how the AEA Organizational Learning and Evaluation Capacity Building Topical Interest Group moves forward with the ECB Commons Project to help bridge these gaps. I’m hoping this next year brings more ECB collaborations and information sharing.

4. Cultural Responsiveness

The phrase “culturally responsive” was mentioned in the title of 16 AEA sessions while “cultural responsiveness” was mentioned in one other. Unfortunately, I think sometimes that’s where our attention to cultural responsiveness ends in evaluation – in the title or as a catch-phrase used in reports. But I noticed something different at Eval16 as compared to previous conferences; I heard colleagues talking about the intersection of culture and evaluation both inside and outside of conference sessions.

One of the things that really impacted me was when Dr. Nicole Bowman read A Cherokee Prayer during the Opening Plenary. The prayer reminded me of the importance of understanding the origins of our field of evaluation, the diverse backgrounds of our participants, our own backgrounds, as well as the history and culture of the organizations with which we work. As evaluators, we must continue to recognize, discuss, and be responsive to cultural context and the cultural diversity of individuals, communities, and organizations.

Want to learn more about cultural responsiveness in evaluation? Check out Continuing the Journey to Reposition Culture and Cultural Context in Evaluation Theory and Practice or read the AEA Statement on Cultural Competence in Evaluation.

5. Communication

The word “communication” was included in the title of 11 conference sessions. This is not surprising, since our role as evaluators requires us to communicate with many different entities, including participants, staff, media, policymakers, and foundations. It’s important for our field to understand how to target messages to different audiences and to deliver findings in captivating and innovative ways (see #1 Design).

However, I think the inclusion of communication in the sessions this year speaks to something deeper. The sessions didn’t just focus on how to communicate; they focused on intentional and transparent communication. They encouraged vulnerability (want to learn more about vulnerability? Check out Brené Brown’s work!).

  • John Gargani coined the word “Breart” to describe the need to think with both our brains and our hearts in evaluation
  • Michael Quinn Patton encouraged attendees to walk the talk of learning from failure by doing it ourselves
  • Stephanie Evergreen talked about embracing and learning from your evaluation failures saying, “the only way out is through” and to get over your fear, you have to “fail big, often, and in front of other people”
  • Sharon Rallis discussed the importance of trust in evaluation and how one way to build trust is to have the confidence and willingness to be vulnerable

So there you have it. I think we’re going to see and hear a lot more about design, databases, evaluation capacity building, cultural responsiveness, and communication/vulnerability as the field of evaluation grows and evolves.

Now it’s your turn, what trends are you seeing in your own work? Share your insights in the comments and on social media.