Conference workshop program: Tuesday 26 September 2023

> Download a printable conference workshop program

View Monday conference program here.

8–9am REGISTRATION
9am–12:30pm WORKSHOP PROGRAM

The marathon, not the sprint: Building an evaluation career that lasts (half day)

Jade Maloney, Jo Farmer, Sharon Marra-Brown 

> Details
> Register

Changing world, changing roles: Evaluator at the table in systems innovation (full day)

Jess Dart, Zazie Tolmer; via video: Charlie Leadbeater (UK), Jennie Winhall (Denmark)

> Details
> Register

Designing credible and useful impact evaluations (full day)

Brad Astbury, Andrew Hawkins

> Details

By us, for us: First Nations leadership in evaluation design, governance, delivery, analysis and translation (full day)

Amunda Gorey, Veronica Turner, Jen Lorains

> Details

 

Mixed methods evaluation design for transformative purposes (half day)

Donna M Mertens

> Details
> Register

Creative evaluation and engagement: Purpose-driven evaluation, supporting changemakers to bring about transformative change and create a more whole, beautiful, and just world (full day)

Nora Murphy Johnson, Rafael Johnson, Kate McKegg

> Details
> Register

12:30–1:30pm LUNCH
1:30–5pm WORKSHOP PROGRAM

Evaluation and Value for Money: An inter-disciplinary, participatory approach using evaluative reasoning and mixed methods (half day)

Julian King 

> Details
> Register

Dart, Tolmer, Leadbeater, Winhall continued

Astbury, Hawkins continued

Gorey, Turner, Lorains continued

The realist difference: Tips and explanations for evaluators using a realist approach (half day)

Emma Williams, Cara Donohue

> Details
> Register

Johnson, Johnson, McKegg continued

 

WORKSHOP DETAILS

Categories:
A. Foundational evaluation skills and capabilities
B. New tools, approaches and ways of thinking for a transforming context
C. Advanced evaluation topics


The marathon, not the sprint: Building an evaluation career that lasts

presented by Jade Maloney, Jo Farmer, Sharon Marra-Brown   HALF DAY (MORNING)  |  CATEGORY: A, B, C

Join us for an interactive workshop that will equip you with practical strategies and tools to build and sustain a fulfilling career. 

As evaluators, we spend a lot of time honing our technical skills. Some of us also spend time thinking about a trauma-informed approach to delivery. There are plenty of resources to help us with these, but fewer to help us manage the challenges we encounter as evaluators – from finding that we’re unwelcome and supporting people through evaluation anxiety, to identifying and responding to vicarious trauma.

Yet these are the kinds of skills we need to sustain ourselves and our teams, as our work involves grappling with some of society’s most challenging problems, engaging with people in difficult circumstances, listening to stories (narrated by individuals or writ large in the numbers) that are sometimes dark, and supporting staff to hear criticisms of programs they are passionate about. They’re even more important as the conception of the evaluator’s role shifts from independent outsider to someone who brings their whole self to the work and engages effectively with people with lived experience.

Through a combination of knowledge-building and reflective activities – drawing on leadership and evaluation literature – the workshop will cover a range of topics that support a sustainable career. These include:

  • identifying personal values
  • managing imposter syndrome
  • protecting your own and others’ wellbeing
  • building your own board of directors
  • navigating ethical dilemmas by drawing on your values

By the end of the workshop, participants will have a clear sense of who they are as evaluators, the practices they can adopt to thrive in their roles – as individuals and team leaders – and the core they can come back to when the going gets tough.

The workshop aligns with Domain 1 of the AES Evaluators’ Professional Learning Competency Framework (Evaluative Attitude and Professional Practice) by supporting participants to proactively plan how they will maintain their integrity and build their professional practice. It also aligns with Domain 6 (Interpersonal Skills) by building self- and team-management skills.

This workshop covers an advanced evaluation topic, but is suitable for evaluators at all stages of their careers.

Jade Maloney is the CEO of ARTD, with 15 years’ experience managing and overseeing evaluations in disability, mental health and suicide prevention. She has completed a 12-month program with Women and Leadership Australia and grown ARTD’s tools for coaching, mentoring and leading teams in challenging contexts. She is an experienced facilitator, has previously run interviewing skills workshops for the AES, and convened the 2019 conference.

Jo Farmer brings policy expertise in workplace mental health and wellbeing, as well as practical and academic experience in designing trauma-informed evaluation, both for participants and evaluation teams.

Sharon Marra-Brown regularly manages evaluations in challenging contexts, manages teams of lived experience evaluators, and facilitates engaging workshops for people with all levels of experience in evaluation.

> back to overview > register


Evaluation and Value for Money: An inter-disciplinary, participatory approach using evaluative reasoning and mixed methods

presented by Julian King   HALF DAY (AFTERNOON)  |  CATEGORY: B, C

This workshop provides practical guidance, underpinned by sound theory, for evaluating Value for Money (VfM). It unpacks a process of explicit evaluative reasoning (using rubrics) and the use of mixed methods. A sequence of steps will be shared to help evaluators and commissioners to develop and use context-specific definitions of good VfM. These definitions provide a system for ensuring the evaluation: is aligned with the design and context of the policy or program; engages stakeholders in evaluation design and sense-making; collects and analyses credible evidence; draws sound conclusions; and answers VfM questions. The approach is intuitive to learn and practical to use. 

Participants will learn how to: frame an evaluative question about VfM; develop rubrics setting out VfM criteria and standards; combine multiple sources of evidence; incorporate economic evaluation within a VfM framework where feasible and appropriate; interpret the evidence on a transparent basis; and present a clear and robust performance story, guided by the rubrics. 

The workshop involves a mix of PowerPoint presentations, group discussions and examples. Participants will receive optional pre-workshop reading and a post-workshop take-home pack including a copy of the slides, exercises and links to online resources.

This workshop includes a brief overview of economic methods of evaluation (e.g. cost–benefit analysis) including considerations for determining when to use them in a VfM assessment. It doesn’t provide detailed instruction in the design and implementation of economic evaluations. There are courses already on offer that focus on economic methods of evaluation.

Julian King specialises in evaluation and value for money. His PhD research developed the Value for Investment system, which is the focus of this workshop. The system is used worldwide to evaluate complex and hard-to-measure programs and policies. In 2021 Julian received the AES Evaluation Systems Award in recognition of its widespread application. Julian has over 20 years of evaluation and facilitation experience and has delivered this workshop for evaluation associations, private companies and NGOs on every continent except Antarctica.

> back to overview > register


Changing world, changing roles: Evaluator at the table in systems innovation

presented by Jess Dart, Zazie Tolmer; via video: Charlie Leadbeater (UK), Jennie Winhall (Denmark)   FULL DAY  |  CATEGORY: B

We are at a critical historical juncture. Inequity is increasing, climate change is well on its way to the point of no return, and the robots are here. The time for change is now. Disruption offers strategic opportunities to reset the system into a better place. In many forums and spaces people are having critical discussions about equity, racial justice, decolonisation, and climate change. With this comes new actors, new funding streams, and new data flows.

Evaluation is critiqued for contributing to locking inequitable systems in place. How do evaluators show up differently, and how do evaluators play a constructive role in this fast-changing space? This workshop explores the role of evaluators in System Innovation and puts to the test the evaluation field’s broadly accepted competencies around the purpose and logic of evaluation.

The workshop will start with an overview of the Systems Innovation field before proposing a conceptual framework to explore the role of evaluation in System Innovation. The framework has three elements: the Four Keys to unlocking and shifting systems (power, resource flows, purpose, and relationships); the system levels (micro, meso and macro); and the three phases of the System Innovation journey.

Using this framework, we will examine how evaluation can be useful, and at times a hindrance, at each of the three phases of System Innovation. We will be joined via video by Charlie Leadbeater and Jennie Winhall of systeminnovation.org in the UK and Denmark, who will introduce the Four Keys. We will touch on relevant evaluation approaches and much-needed adaptations, and end with a critical examination of how evaluators might best position themselves as one actor at the table and what skills and aptitudes are needed for this work. Throughout the day we will use a mixture of presentations, small-group discussion and real case studies.

Dr Jess Dart is a fellow of the AES, a recipient of two awards, including Outstanding Contribution to Evaluation (2018), and holds a PhD in evaluation. She has over 25 years of experience in evaluation and works as a generalist across many sectors, domestic and international, not-for-profit and government. She has published in the American Journal of Evaluation and New Directions for Evaluation. Jess has presented pre-conference workshops more than ten times, at AES and other conferences; she receives very positive feedback and is considered a highly engaging facilitator.

Zazie Tolmer has 17 years' experience as an evaluator and worked for seven of these as an evaluation consultant in Australia. Zazie has delivered workshops and training including a workshop at the AES in Perth and has presented at the AES, EES and several other conferences.

Jennie Winhall, The System Shift Labs, Copenhagen, Denmark, is an expert in user-centred design, social innovation, and translating big policy ideas into action through services that people love. Jennie is the founder of ALT/Now in the UK, which runs programmes for practical system innovation, and was until recently the Director of Social Innovation for the Rockwool Foundation in Denmark. [Jennie will present remotely.]

Charlie Leadbeater, The System Shift Labs, London, UK, is the author of Living on Thin Air and We-Think: mass innovation not mass production. Charlie works internationally as an adviser on innovation strategy to companies, cities, and governments. After leaving the Financial Times, where he was Industrial Editor, he became an independent writer, policy adviser and social innovator. [Charlie will present remotely.]

> back to overview > register


Designing credible and useful impact evaluations

presented by Brad Astbury, Andrew Hawkins   FULL DAY  |  CATEGORY: A, B, C

This workshop focuses on the principles and logic associated with the selection, design, and application of different kinds of impact evaluations, ranging from randomised control trials to theory-driven and qualitative approaches to causal inference. Different approaches to impact evaluation will provide more or less accurate (as well as precise) answers to different questions, e.g.: 

  1. Did the program have an impact?
  2. What was the size of the program’s impact?
  3. What was it about the program that had impact?
  4. In what situations and for which people did the program activities have an impact?
  5. Should we continue the program, scale it up, modify it, better target it, or do something different to maximise future impact?

A single approach or combination of approaches may be more or less appropriate depending on the questions you are seeking to answer and other contingencies, such as the nature of the evaluand and the evaluation setting (e.g. time, budget, data availability, stakeholder information needs, degree of uncertainty that can be tolerated and evaluator expertise).

Practical aspects of delivering high-quality and useful impact evaluations will be explored in this workshop through case applications. Participants will learn about:

  • Different understandings of impact evaluation, including:
    – Differences between causal descriptions and causal explanations
    – Differences between causal inference and effect sizes
    – The way that systems approaches deal with causality and attribution
  • Approaches to classifying quantitative and qualitative impact evaluation designs
  • Ways in which impact evaluation designs differ and what this means for practice
  • Beyond the hierarchy of methods: dangers involved in relying too heavily on any one approach to conducting impact evaluation 
  • A contingent approach: considerations for selecting and combining impact evaluation designs and methods based on situational analysis.

This workshop aligns with competencies in the AES Evaluators’ Professional Learning Competency Framework. The identified domains are: Domain 1 – Evaluative attitude and professional practice; Domain 2 – Evaluation theory; and Domain 4 – Research methods and systematic inquiry.

The workshop is designed for both new and experienced evaluators and commissioners of evaluation. 

Associate Professor Brad Astbury is a mixed method evaluator and health systems researcher with expertise in combining diverse forms of evidence to improve the quality and use of evaluation findings. For the past two decades Brad has worked with local, state and national government agencies, industry and the not-for-profit sector to design and deliver evaluations and enhance integration of evaluation processes into organisational systems. His primary areas of evaluation practice are health care, mental health, education, community services and justice. He is interested in the practical application of evaluation theory and methodology to support continuous improvement and evidence-informed decision making about what works, for whom, under what circumstances, why and at what cost.

Brad is passionate about advancing the theory, practice and use of evaluation through:

  • situationally responsive evaluation that is tailored to the level of complexity and stage of program development, consumer needs and equity considerations, intended uses of the evaluation, and available time, budget and data
  • delivering high-quality training and capacity-building, including the development of policies and tools that support individuals and organisations to embed evaluation into everyday practice
  • application of methods to strengthen causal inference in impact evaluations, such as theory-based evaluation, realist evaluation and Qualitative Comparative Analysis (QCA)
  • the use of evaluation-specific methodology, such as rubrics to synthesise evidence and provide evaluative conclusions about the merit, worth and significance of a program or policy
  • strategies to enhance program sustainability and scale up innovations
  • systematic reviews using mixed method approaches to build knowledge of effective solutions to policy challenges
  • meta-evaluation to improve the quality of evaluation processes and findings.

Brad is an active contributor to professional societies for evaluators in Australia, Europe and North America and publishes regularly in leading evaluation journals. He currently serves on the editorial advisory board for Evaluation – The International Journal of Theory, Research and Practice, the Evaluation Journal of Australasia and New Directions for Evaluation.

Andrew Hawkins has worked at ARTD for 15 years as a full-time evaluator. In that time, he has directed or managed more than 200 substantial evaluation projects and logged over 17,000 hours of direct client service delivery. Andrew considers evaluation to be intertwined with reasoned action and strategy in a complex and uncertain world. He is a pragmatist, a realist and a systems thinker living in a world often split between empiricists and constructivists. Andrew focuses on logical analysis, explicating and testing the rationale that underpins a policy or program, and cost-effective means of gathering evidence and insight for decision making. His clients value his ability to listen deeply, probe, anticipate their needs and facilitate new understandings. Andrew is often asked to bring this focus on reasoned action to strategy and program design, as well as to the development of monitoring and evaluation frameworks and approaches for impact evaluation. In his preferred line of work he is less focused on the question of ‘what works’ and more on ‘how can we make it work?’ He is an unrepentant evaluation fanatic who sees our ability as a species for good evaluation – that is, decision making about priorities and action – as the most important requirement for the survival of our species and for creating value for life on earth.

Andrew is a member of the editorial advisory board for the Evaluation Journal of Australasia. He is the founding Co-Chair of the Australasian Evaluation Society’s (AES) Systems Evaluation Special Interest Group (SIG) and an honorary fellow with Charles Darwin University’s Northern Institute, where he published ‘Realist evaluation and randomised controlled trials for testing program theory in complex social systems’ and contributed to ‘RAMESES II reporting standards for realist evaluations’. Most recently Andrew has been active in the development of a more pragmatic theory of evaluation that shifts the focus from social science research to reasoned action and managing the risk of program failure (www.propositionalevaluation.org). He holds a Master of Administrative Law and Public Policy (2007, University of Sydney) and a Bachelor of Arts, Psychology (Hons) (2000, University of Sydney).

> back to overview 


By us, for us: First Nations leadership in evaluation design, governance, delivery, analysis and translation

presented by Amunda Gorey, Veronica Turner, Jen Lorains   FULL DAY  |  CATEGORY: B

Evaluators work with people from all walks of life and cultural backgrounds. We hold a responsibility to understand, engage with, respect and respond to local cultural contexts in our evaluation approaches, methods and management. Evaluation work with First Nations people and programs is a critical enabler for genuine evaluation findings and community empowerment. 

‘Aboriginal people have been researched to death. It’s time we researched ourselves back to life’ (William Tilmouth, Children’s Ground Chair).

At Children’s Ground (CG), First Nations leaders, staff and families are designing, governing, delivering and translating evidence into practice every day. Drawing on CG’s evaluation framework and 10 years of practical experience (strengths and challenges), the workshop aims to build participants’ evaluation capacity through engagement in new ways of thinking about and doing evaluation with (not to or for) First Nations people, and to develop new knowledge, understandings and strategies for embedding First Nations leadership at all stages of evaluation.

This full-day workshop will involve practical experiences/examples, case studies, and interactive activities, providing opportunities to: 

  • Reflect on current/planned evaluation practice
  • Unpack strengths/challenges of First Nations leadership in western and cultural evaluation 
  • Explore applications of First Nations evaluation and data sovereignty
  • Plan approaches/strategies for First Nations leadership in evaluation 

Practical examples and resources will be worked through, with time for sharing, peer learning and questions.

Participants will gain an increased understanding of and practical strategies to embed First Nations leadership in evaluation, including understanding and differentiating between First Nations leadership in western evaluation and cultural evaluation approaches, and finding the balance between both. 

This workshop builds knowledge and skills in the following Evaluators’ Professional Learning Competency Framework domains: Evaluative Attitudes/Practice; Culture/Context; Research Methods; and Management and Evaluation Activities.

There are no prerequisites for participation. The workshop is suitable for evaluators of any experience level who are currently evaluating, or planning to evaluate, First Nations programs, policies or services.

Amunda Gorey is an Arrernte woman living in Alice Springs with her three children. Her traditional connections to land are Irpmangkere (south-west of Alice Springs) and Irlmpe (north of Alice Springs). She is an artist, an experienced community health researcher and a specialist in facilitating First Nations/Western relationships. Amunda is currently the Co-Coordinator of Research & Evaluation at Ampe-kenhe Ahelhe (Children’s Ground Central Australia), working with First Nations staff, families and communities to undertake a 25-year longitudinal evaluation of the impact of the Children’s Ground Approach in Central Australia. She is also a member of the Ingkerrekele Arntarnteareme, the local First Nations governance group for Children’s Ground in Central Australia, which leads the design, delivery and decision-making that informs strategy and operations in Central Australian communities. Amunda has worked on a number of research projects with First Nations communities across the Northern Territory, undertaking and supporting research. She has undertaken many formal research training sessions and courses and has applied her learning to community researcher roles and other work. Amunda brings important practical experience and theoretical knowledge in research and evaluation with First Nations communities. In all her roles, she has worked as a liaison between Western and First Nations organisations, ensuring community voices are upheld and respected. She works continuously to ensure cultural safety in research and evaluation processes and, importantly, is able to translate evaluation work and concepts into Arrernte language to enable genuine understanding and leadership by local people in all research and evaluation work.

Veronica Turner is an Arrernte woman. Her traditional lands are Sandy Bore (Mpweringke Anapipe or Alenyerrekatherre) outstation. She speaks Arrernte and English. Veronica is currently a Co-Director at Ampe-kenhe Ahelhe (Children’s Ground Central Australia), as well as a Cultural Advisor and Senior Arrernte Educator at Children’s Ground. She is also a member of the Ingkerrekele Arntarnteareme, the local First Nations governance group for Children’s Ground in Central Australia, which leads the design, delivery and decision-making that informs strategy and operations in Central Australian communities. Working alongside western-trained managers and early childhood educators, Veronica is also responsible for the co-development of learning resources and for ensuring the learning programs are delivered in line with what Arrernte people have said they want for their children. As a First Nations staff member and leader, Veronica has been involved in leading the longitudinal evaluation of the impact of the Children’s Ground Approach in Central Australia since 2017. She brings important cultural safety to evaluation processes and has made extensive contributions to First Nations analysis and reporting.

Jen Lorains is a non-First Nations woman, living and working on Arrernte country in Alice Springs. She has undergraduate and postgraduate qualifications in applied social research and over 20 years' experience designing and undertaking research with communities and services. Jen has undertaken research and evaluation with and for a diverse range of stakeholders, including local and state governments, non-government organisations, research institutions, universities and community led research initiatives. Since 2016 Jen has been the Director of Research & Evaluation at Children’s Ground. She travels and works across Children’s Ground’s Central Australian and Top End operations, working with each community, through participatory approaches, to evaluate and evidence the impact of Children’s Ground’s systems reform and integrated service platform. Jen has worked in research and evaluation roles with services and communities in early childhood, school transition, mental health promotion, consumer participation in health, education, youth engagement and healthy sporting clubs. She has undertaken research with a range of cultural groups including First Nations Australians and refugee communities.

> back to overview 


Mixed methods evaluation design for transformative purposes

presented by Donna M Mertens   HALF DAY (MORNING)  |  CATEGORY: B, C

Evaluators can be contributors to consciously addressing inequities in the world by the way they design their evaluations. Transformative mixed methods designs are explicitly constructed to serve this purpose. This workshop is designed for evaluators who want to learn how to use mixed methods for transformative purposes to better address the needs of members of marginalized communities, such as women, people with disabilities, people living in poverty, racial/ethnic minorities, and religious minorities. It addresses mixed method strategies that can enhance the ability of evaluation designs to contribute to addressing social inequities. Participants will learn how to use a transformative lens to identify those aspects of culture and societal structures that support continued oppression and how to apply mixed methods designs to contribute to social transformation. Interactive learning strategies will be used including whole group discussion and working in small groups to apply the design of a transformative mixed methods evaluation to a case study. 

This workshop is designed for intermediate level evaluators. Some evaluation design experience is needed to add the transformative mixed methods design skill set to participants’ repertoire. 

Donna M Mertens, Professor Emeritus at Gallaudet University, specialises in transformative research and evaluation methodologies that support social, economic, and environmental justice and human rights, as seen in her publications Mixed Methods Research; Program Evaluation Theory and Practice (2nd ed.); Mixed Methods Design in Evaluation; and Research and Evaluation in Education and Psychology (5th ed.). She consults with organisations such as the UN International Fund for Agricultural Development, UN Women, and Engineers Without Borders Canada. Mertens served as editor of the Journal of Mixed Methods Research, President of the American Evaluation Association (1998), and a founding board member of the International Organization for Cooperation in Evaluation and the Mixed Methods International Research Association.

> back to overview > register


The realist difference: Tips and explanations for evaluators using a realist approach

presented by Emma Williams, Cara Donohue   HALF DAY (AFTERNOON)  |  CATEGORY: C

This workshop is targeted at experienced evaluators who are interested in trialling ‘scientific realist’ evaluation as proposed by Pawson and Tilley, and at evaluators who have begun to use this realist approach and would like to check that their evaluation designs and practice are fully realist. Addressing Domains 2 and 4 of the AES Professional Learning Competency Framework, the workshop will cover:

  • How and why the early design phase of realist evaluations differs from other evaluation approaches, even other theory-based evaluations;
  • How and why realist qualitative methods such as interviewing differ from other approaches, including other semi-structured interview methods;
  • How and why realist quantitative methods such as surveys differ from other types of surveys;
  • How and why sampling (for qualitative or quantitative work) for a realist evaluation is different from sampling in other approaches;
  • How and why data analysis (for qualitative or quantitative data) for a realist evaluation is different from data analysis in other approaches;
  • How and why reporting findings from a realist evaluation is different from reporting findings from other types of evaluations;
  • How and why ethics applications for realist evaluations differ from others.

Workshop participants will be asked to bring with them an example of a realist evaluation they have conducted or a topic they would like to evaluate using a realist approach. Small group work will enable participants to know whether (and why) every stage of their evaluation design and practice is ‘really realist’, and to develop the skills to make any adjustments required. By the end of the workshop, participants will be able to identify realist evaluation practice at all stages of evaluation and justify their practice to clients, ethics review bodies and peer reviewers.

Emma Williams has had a long career in evaluation, research and program development in Canada and Australia, moving between academia, public service and private practice. She is a member of the RREALI (Realist Research, Evaluation and Learning Initiative) team and a Credentialed Evaluator with experience in realist, observational and participatory evaluations on topics such as throughcare, family violence, service access, employment, environmental issues and international development. She has also conducted innovative research in areas such as urban design and language acquisition and has a special interest in child empowerment and in evaluation ethics.

Cara Donohue has been a research fellow with the Realist Research, Evaluation and Learning (RREALI) group at CDU’s Northern Institute since 2020 and in the evaluation field for twelve years. She specialises in evaluation, research, and program design, particularly using realist and theory-based approaches. Cara has a work background in the international and community development fields, and has worked in university, international and domestic NGO settings. She has experience with programs serving at-risk youth, low-income, disabled, refugee, Indigenous, and rural populations.

> back to overview > register


Creative evaluation and engagement: Purpose-driven evaluation, supporting changemakers to bring about transformative change and create a more whole, beautiful, and just world

presented by Nora Murphy Johnson, Rafael Johnson, Kate McKegg   FULL DAY  |  CATEGORY: B

The purpose of this workshop is to introduce participants to a way of embodying a radically new evaluation purpose. The workshop content will be guided by a set of principles and move through four phases: align, learn, adapt, and embody. The facilitators will introduce participants to the roots, guiding principles, methods and essential strategies of Creative Evaluation and Engagement (CE&E). The workshop facilitators will also support participants in learning how to practice purpose-driven, principles-guided evaluation in a supportive, generous, and constructive way. It will introduce participants to the steps and tools to practice evaluation in ways that are self-aware as well as aware of the interconnectedness and wholeness of living systems.

The workshop is directly relevant to all domains of the AES Evaluators’ Professional Learning Competency Framework. It will combine face-to-face and online facilitation with four of the world’s leading evaluation guides and facilitators. It will be engaging, creative, highly participatory, and full of inspiration. This workshop is for those with at least some understanding of evaluation and systems change, but we welcome all levels. There are no prerequisites other than being prepared to bring your whole self and to be curious and creative.

Kate McKegg is the director of The Knowledge Institute Ltd (www.knowledgeinstitute.co.nz) and a member of the Kinnect Group (www.kinnect.co.nz), as well as of Tuakana Teina, an indigenous-led collective based in the Waikato region of New Zealand. She is also a co-founder, along with Nan Wehipeihana and Nora Murphy, of the Developmental Evaluation Institute (https://developmental-evaluation.org/about), and a founding member and past Convenor of the Aotearoa New Zealand Evaluation Association (ANZEA). Kate is co-editor of New Zealand’s only evaluation text, Evaluating Policy and Practice, a New Zealand Reader (2003). She is also co-editor, along with Michael Quinn Patton and Nan Wehipeihana, of the book Developmental Evaluation: Real World Applications, Adapted Tools, Questions Answered, Emergent Issues, Lessons Learned, and Essential Principles (Guilford Press, New York, 2015).

Seeing the potential in a complexity-aware evaluation practice to support change, Kate has been drawn to developmental evaluation and other creative forms of evaluation practice because of her deep commitment to social and environmental justice and equity. She has worked alongside many people in complex settings who are innovating to create systems change and has seen the possibilities that a different kind of evaluative practice can bring. 

A. Rafael Johnson, MFA, is the Vice-President of Inspire to Change, where he uses the methodologies of the arts to understand systems, organizations, and programs. His fiction and essays have appeared in Temenos Journal, AEA365, Callaloo, Kweli Journal, African American Review, and the anthology Excavating Honesty: An Anthology of Rage and Hope in America. Andy is an adjunct faculty member at the Minneapolis College of Art and Design, where he currently teaches Creative Analytics with Nora Murphy Johnson. He holds an MFA in Creative Writing from The University of Alabama and is a fellow at Kimbilio Fiction. His newest book, Creative Evaluation and Engagement: The Essentials (co-authored with Nora Murphy Johnson), revives Michael Quinn Patton’s 1982 classic Creative Evaluation. Essentials positions the arts as an essential way of knowing and communicating, and prepares changemakers to collect inspiring data and build a body of evidence that can make the world more whole, just, and beautiful.

Nora Murphy Johnson, PhD, is the President of Inspire to Change and its principal investigator. Nora believes that all systems of people and institutions are connected and that all parts of the system need to be strong and healthy. Nora encourages clients and stakeholders to think outside of our boxes and disciplines and create a coherent shared vision for something greater than what exists now. Evaluation can be an integral part of working towards this vision. Nora works towards understanding (1) how principles-focused, developmental evaluation can be used for systems change and social justice, (2) ways to create a coherent and shared vision that allows for contextualized learning and adaptation, and (3) how to best engage people in useful evaluations that inform and inspire. Nora is best known for her publications Nine Guiding Principles to Help Youth Overcome Homelessness: A Principles-Focused Developmental Evaluation (Developmental Evaluation Exemplars, 2015) and Connecting Individual and Societal Change (Stanford Social Innovation Review, 2020). Her newest book, Creative Evaluation and Engagement: The Essentials (co-authored with A. Rafael Johnson), revives Michael Quinn Patton’s 1982 classic Creative Evaluation. Essentials positions the arts as an essential way of knowing and communicating, and prepares changemakers to collect inspiring data and build a body of evidence that can make the world more whole, just, and beautiful. She holds a PhD in Evaluation Studies from the University of Minnesota.

> back to overview > register