On 3 & 4 May, 120 people gathered at the sold-out Complexity and Evaluation Conference 2018 to share learnings from the evaluation edge of complexity and systems change. Here, we post the resources, tools and questions that were shared.
Day 1: framing systems change and approaches to evaluating it
Our partner – Asia Pacific Social Impact Centre at Melbourne Business School – turned on a spectacular learning venue and good Melb coffee on arrival. After opening protocols, Conference designers Kerry Graham, Liz Skelton, Kate McKegg and Jess Dart framed the Conference by building a shared understanding and language for systems change (slides: 1. Day 1_opening session_What is system change)
The Conference designers were joined by Mark Cabaj for a Plenary Panel exploring the question: What frames do you use to understand systems change in order to evaluate it? They unpacked 3 different approaches (slides: 2. Day 1_plenary panel_Frames to understand system change_small):
- Principles-based approach
- Critical systems approach
- Theory of change approach
Participants went deeper into two of these approaches through a plenary workshop on using a principles-based approach (Handout: 3. Day 1_plenary workshop_Principles approach to Systems Change_Handout) and a ‘lessons from the field’ session on using a theory of change approach (slides: 4. Day 1_plenary session_Theory of change approach_small).
After lunch, participants self-organised into two break-out sessions on:
- Measuring progress and impact for social innovation processes (slides: 5. Day 1_break out session_social innovation_small)
- Tracking progress and evaluating place-based reform. This session introduced participants to 2 frameworks and 6 tools (slides: 6. Day 1_Breakout_place based_small)
Day 1 closed with a deep dive into the case study – Children and Youth Area Partnership (slides: 12. Day 1_closing case study_CYAP _small) and networking drinks, grazing food and loads of conversation.
Day 2: from framing systems change to measuring it
In changing gears towards measuring (not framing) systems change, the plenary session How do we know we are on track for systems change – using and blending evidence to make sense of the quality, value and importance of systems change efforts was a cracker. The slides and handout are below:
- Slides: 13. Day 2_plenary session_how do we know we are on track_small
- Handout: 14. Day 2_ plenary session_ What We Know So Far-Scaling Evaluation_Handout
Participants moved into two sessions of participant-directed learning with workshops to develop and apply the principles generated on Day 1, and case consultations (process: 15. Day 2_ Evaluation Case Consultation process_Handout).
The Conference closed with the awesome speaker team responding to the top learning questions generated by participants across the Conference. One of those questions was about contribution analysis (slides: 16. Day 2_closing session_contribution analysis_small), but in a not-so-unexpected twist, the top questions were less about evaluation and more about engagement and collaboration. You can see the questions at 17. Day 2_closing session_Top learning questions.
With thanks to….
… the speakers for creating such an engaged learning environment
… our partners for the time they invested and the way we worked together
… the CFI team who always deliver over and above
… and to all those who participated, learned, challenged and grew the field of collaborating for impact in Australia. We salute you!