Conference workshop program: Monday 25 September 2023

>>> DOWNLOAD a printable conference workshop program

View Tuesday conference workshop program here.


8–9am REGISTRATION
9am–12:30pm WORKSHOP PROGRAM: First half of full day workshops
12:30–1:30pm LUNCH
1:30–5pm WORKSHOP PROGRAM: Second half of full day workshops; Afternoon half-day workshop

WORKSHOP DETAILS

Categories:
A. Foundational evaluation skills and capabilities
B. New tools, approaches and ways of thinking for a transforming context
C. Advanced evaluation topics


Bias in program evaluation: a practical introduction

presented by Samantha Abbato    FULL DAY | CATEGORY: A

Biases are ever-present in evaluation processes, methods, and evaluative judgements. Often unseen and unspoken, they have an insidious impact on the credibility of evaluations and on the decision-making and actions that result. In this practical workshop, we look through a critical lens to illuminate error and bias in (1) evaluation processes, (2) quantitative, qualitative, and mixed evaluation methods, and (3) evaluative judgements, reporting and decision-making.

To start, we define bias and explain the difference between errors of noise and bias. We introduce a low-jargon ‘bias for beginners’ field guide to help you name and identify common biases in evaluation. You will learn the roles of and differences between systematic and random error, precision and accuracy, and reliability and validity; explore pervasive cognitive biases such as over-confidence bias, confirmation bias and the Dunning-Kruger effect; learn about common errors in scientific measurement resulting from pre-held theories and cultural blind spots that are ever-present in evaluation work; and become able to identify and address the bias of common evaluation methods.

We then go out into the ‘field’, using the field guide as a lens to identify major biases and investigate their impacts on program evaluations through practical, real-world case studies.

Finally, we share practical strategies you can take back to improve your own evaluation projects and to assess the merit of others’ evaluation results and conclusions before using them for decision-making.

Learning outcomes include:

  • Understanding of the difference between noise and bias and the impact of each on evaluation processes, methods, and judgements
  • Knowledge of major types of biases impacting evaluation processes, methods, and judgements
  • Ability to identify errors and biases in evaluation work
  • Knowledge of actions and strategies to minimise bias in evaluation projects.

The workshop aligns with all seven of the AES Professional Learning Competency domains.

Dr Samantha Abbato is a senior evaluation consultant with more than twenty years of experience and strong methodological expertise across a range of qualitative and quantitative disciplines. Sam’s academic grounding in quantitative methods is built on a bachelor's degree in mathematics and physiology, a Master of Public Health, and a PhD in epidemiology and biostatistics. She also has extensive qualitative training in medical anthropology (PhD, UC Berkeley). Through this training, Sam has gained a depth of knowledge and experience in identifying and addressing bias and other errors, which she applies to evaluation.

Sam employs a utilisation-focused approach to evaluation practice and consistently uses mixed methods, case studies and collaborative processes that incorporate skills transfer to clients. As an independent consultant, Sam has assisted various clients including non-government organisations and local, state, and Commonwealth governments.

As the director of Visual Insights People since 2013, she has introduced a pictures and stories approach to evaluation and evaluation capacity building. Using cartoons, animation, and checklists, Visual Insights People has created a range of engaging evaluation training tools for use in AES workshops and client team training. Sam is the recipient of evaluation awards including the AES 2015 award for best evaluation publication.

 > register


Scaling innovation for impact

presented by Matt Healey    FULL DAY | CATEGORY: B

Many organisations emphasise the importance of innovation as a means of addressing complex problems. Part of any innovation process is usually a question of scalability: to what extent is the innovation, intervention or program scalable? It’s an important question, but one that can be difficult to answer given the diversity of innovations and needs that exist.

Given that, the purpose of this workshop is to present a generalisable framework and process for understanding and assessing the scalability of innovations and interventions. Over the course of the day participants will: 

  • Learn evidence-based theory, concepts and requirements for scalability assessment and scaling processes
  • Practise techniques to assess the different components of scalability of an innovation or intervention
  • Produce a scalability assessment summary and recommendations
  • Develop an embedded scaling assessment plan and ongoing evaluation framework.

This workshop has been designed from the bottom up to be highly practical and hands-on, and builds on an extensive evidence base for understanding scalability. It also draws on the facilitator’s own experience in evaluating the scalability of innovations in domestic and international contexts, and the associated practicalities of planning for scale. The workshop will involve a mix of lecture-style content, small group exercises and individual activities guided by a comprehensive workbook and templates. Supplementing this will be space for individual reflection designed to support participants in considering how they can apply what they have learned to their own context.

The workshop is equally suited to evaluators, program managers and those with a role in developing innovations or interventions.

Given the complexity associated with assessing scalability, as well as the highly practical nature of the session content, this workshop addresses the following AES Professional Learning Competencies: Competency 1 (Evaluative Attitude and Professional Practice) and Competency 4 (Research Methods and Systematic Inquiry), and contributes to Competency 5 in choosing appropriate methods.

Matt Healey is a Senior Consultant and Co-Founder of First Person Consulting. He works on projects that address complex areas in the health, social justice and environmental sectors, including mental health and suicide prevention, health promotion, e-waste and the circular economy, and social innovation.

Matt has a strong interest in the intersection of design, systems and evaluation, and the role that bringing different areas of practice together can play in supporting the scaling of effective interventions and innovations. Over the last five years, he has worked with dozens of clients to understand what works, why and the different mechanisms by which these innovations can scale further.

 > register


LGBTIQA+ awareness and inclusivity for evaluators

presented by Alison Barclay, Philippa Moss, Ruth Pitt    HALF DAY (MORNING) | CATEGORY: A

Evaluators have increasingly recognised the need for evaluation to be more culturally responsive and more inclusive of ‘broader spectrums of identity, culture and ways of being’. In working with diverse communities, we need to keep learning, reflecting, and translating this recognition into improved practice. 

Meridian is a community-controlled, peer-led organisation that provides health and social support services in the ACT for the LGBTIQA+ community, people impacted by HIV, sex workers and people who use drugs. This workshop provides an evaluation-specific version of Meridian’s popular LGBTIQA+ Awareness & Inclusivity workshops, which are based on peer-based lived experience, professional experience and academic education. 

The training will comprise multimedia presentation, discussion and practical activities. Participants will leave with practical tools and resources to help their organisations embrace diversity and to improve how they consider LGBTIQA+ inclusion in the design and delivery of evaluations.

By the end of the workshop, participants will be able to: 

  • recognise diversity within the human experience of sex, sexuality and gender
  • confidently use appropriate language and terminologies to describe LGBTIQA+ people and communities
  • reflect on the history and ongoing impact of stigma and discrimination experienced by LGBTIQA+ people
  • apply best practice for LGBTIQA+ inclusion when designing surveys and data collection systems
  • consider a range of options for improving LGBTIQA+ inclusion in their evaluation practice.

The workshop is suitable for participants with any level of evaluation experience who are new to working with LGBTIQA+ communities. It may also be suitable for those with experience working with LGBTIQA+ communities who are new to evaluation and who wish to learn more about data collection standards. 

The workshop will support evaluators to build foundational evaluation skills and capabilities, as described in the professional competency standards from Domain 3 (Attention to Culture, Stakeholders, and Context) and Domain 6 (Interpersonal Skills).

Ruth Pitt is Collective Action's evaluation specialist, with diverse experience in consulting, government and not-for-profit organisations. Her qualifications include a Master of Public Health and a Graduate Certificate in Conflict Resolution, which she completed as part of a Graduate Degree Fellowship at the East-West Center. Her training experience includes a Certificate IV in Training and Assessment, and delivery of workshops for the AES at AES18, AES22 and at the 2019 Autumn intensive.

Alison Barclay is the Director and Founder of Collective Action. She specialises in working with communities who experience social injustice to support their involvement in the design and evaluation of programs that impact their lives. Collective Action has worked with peer-led organisations around Australia, including Meridian, the WA AIDS Council, Gender Agenda and AFAO. Alison holds an MA in gender and development studies. Alison is regularly engaged as a facilitator of evaluation and service design workshops.

Philippa Moss is the CEO of Meridian and the previous Chair of the Board of National LGBTI Health Australia. In 2015 she was awarded the ACT Telstra Business Women's Award for Purpose and Social Enterprise, along with the Australian Institute of Management's Not for Profit Manager of the Year (ACT) award. Philippa has a long and successful history of working in the community sector, developing sector capacity. As a member of the LGBTIQA+ community, Philippa is passionate about centring lived experience in all aspects of Meridian's work, including evaluation. Under Philippa's leadership, Meridian has grown from a small HIV organisation to the leading provider of LGBTIQA+ health and wellbeing services in the ACT. Meridian also delivers tailor-made training programs to improve the broader community's understanding of sex and gender diversity.

 > register


Transforming evaluation through Indigenist tools and methods

presented by Bobby Maher, Megan Williams, Corinne Hodson, Oumoula McKenzie, Gulwanyang Moran     HALF DAY (AFTERNOON) | CATEGORY: B

The workshop is an opportunity for participants to critically reflect and learn new approaches and tools for supporting the leadership of Aboriginal and Torres Strait Islander people in evaluation. In doing so, the workshop reflects the rights of Indigenous people to self-determine and to ensure Indigenous peoples’ ways of knowing, ways of being and ways of doing are centred in collaborative and respectful relationships with communities and organisations. Learning and using a transformative approach in evaluation requires participants to consider their positioning and lifeworlds in the evaluation process, to reflect on processes that privilege Indigenous peoples’ expertise, cultural protocols, responsibilities and priorities, and to plan for the required transfer of knowledges between generations and for Country.

The knowledge and skills considered in this workshop are especially important given significant legislation and sector changes requiring cultural safety in health care, for example, as well as the need for health and climate justice and the work of rectifying institutional injustices and inequities impacting Aboriginal and Torres Strait Islander people.

Through this interactive workshop, we will present the Ngaa-bi-nya Aboriginal evaluation framework and share how it has been widely used to engage key stakeholders, select data, and guide data analysis and reporting. We will provide a demonstration and together practise applying the lens of Ngaa-bi-nya and Indigenous Data Sovereignty to a health program, embedding principles of the AES Cultural Safety Framework. We will also critically reflect on the implications of new cultural safety legislation in health and how progress toward meeting it might be evaluated. This has implications for sectors beyond health, and for transforming professional development and evaluation curriculum, workforce development and supervision, performance indicators, and evaluation methodology and measurement tools – focussing on the future, across cultures, sectors and skills.

Bobby Maher is an Aboriginal woman (Yamatji); her ancestral links are to the Kimberley (Kija), Pilbara (Njamal) and Noongar Nations. Bobby is a PhD candidate and Research Associate at the National Centre for Aboriginal and Torres Strait Islander Wellbeing Research, Australian National University. Her PhD research focuses on collective capability in Indigenous evaluation practice in Australia. Bobby has extensive experience working in Aboriginal and Torres Strait Islander health and social policy, including in the Australian Commonwealth Government departments – the Indigenous Affairs Group, Prime Minister and Cabinet (PMC) and Health (DoH) – and in the non-government sector as a sexual health educator for Aboriginal youth and Aboriginal communities in WA for Sexual Health Quarters (formerly FPWA). Bobby holds a Master of Philosophy (Applied Epidemiology) and a Bachelor of Applied Science (Indigenous Australian Research) (Honours) from Curtin University. She has experience in qualitative, quantitative and community-based participatory research, including evaluation. Bobby is also a member of the Maiam nayri Wingara Indigenous Data Sovereignty Collective and the Global Indigenous Data Alliance.

Corinne Hodson is a Ngunnawal/Wiradjuri Aboriginal woman with ancestral connections to the Riverina region of NSW. Corinne currently lives on the lands of the Darkinjung people on the Central Coast of NSW, where she works as the Manager, Community Engagement and Partnerships with Barang Regional Alliance. Corinne is the chair of the Ngiyang Wayama Aboriginal Data Network Central Coast – the only Aboriginal community-led data network in the country. Corinne is also a member of Gilibanga.

Megan Williams is Wiradjuri through paternal family and has worked for over two decades advocating for the use of Aboriginal and Torres Strait Islander people’s expertise in health service design and evaluation, research, ethics and university curriculum, especially to improve access to health care for people in prison and prevent incarceration. Megan authored the Ngaa-bi-nya Aboriginal evaluation framework, published in the Evaluation Journal of Australasia. Megan is a director of Yulang Indigenous Evaluation, an Aboriginal-led company, and a professor of Indigenous health at UTS. Megan holds MRFF, NHMRC, government and industry research funding, and has had local and national roles including as a Human Research Ethics Committee chairperson and as a Health Sociology Review associate editor, including for the Yuwinbir Special Issue. Megan is currently a member of the AIHW National Prisoner Health Information Committee and the Corrective Services NSW Aboriginal Advisory Council. Megan has been miimi (sister) of the Mibbinbah community organisation for 15 years and is Chairperson of the independent media company Croakey.org.

Gulwanyang Moran is a proud Birrbay and Dhanggati woman of the Gathang language group in NSW. Gulwanyang has authored book chapters for the University of Sydney on the concepts of power, place and space in decolonising methodologies in research, and applies these concepts, as well as Birrbay and Dhanggati ways of knowing, being and doing, in evaluation and language revitalisation spaces. A proud member of Gilibanga, Gulwanyang is passionate about broadening the scope of evaluation theory and practice utilised in the evaluation sector to better consider the use of Indigenist tools and methods. Gulwanyang often draws correlations between modern and traditional practices of evaluation and sees evaluation as ceremony. Gulwanyang is big on Ngukalil, which in Gathang means ‘I give, you give’ and speaks to our ways of knowledge sharing, respect, and reciprocity in thought leadership environments. Gulwanyang often provokes reflection on power through her story sharing. Operationalising the principles of Indigenous data sovereignty and Indigenous data governance, she believes, is vital to evaluative thinking and approaches that impact or benefit First Nations peoples.

Darren Clinch is a Badimia man from Yamatji in the mid-west of Western Australia. Darren's background is in health: he holds a Master of Public Health and has spent more than nine years with the Department of Health and Human Services, the last 2.5 of those years working in System Intelligence and Analytics, providing geospatial support and business intelligence development. Darren specialises in data visualisation and in demystifying data for First Nations community organisations and people. Darren has presented extensively on the topic of Indigenous Data Sovereignty and Governance and is a huge advocate for growing the fledgling Blackfulla data and tech community.

Oumoula McKenzie is a strong Yankunytjara, Pitjantjatjara, and Arrernte man from Alice Springs, NT, and a proud father. Having spent many years working with the Ngaanyatjarra Pitjantjatjara Yankunytjatjara Women’s Council, he has worked tirelessly to empower his community toward self-determination and governance. Oumoula is particularly passionate about working with future Anangu leaders, helping them to become advocates within their communities, led by culture. Oumoula uses his skills in animation and graphic design to strengthen the voice of First Nations people, and has been using his creative talents to educate communities on the importance of Indigenous data sovereignty and what that looks like via digital animations.

 > register


Changing the Theory of Change? Integrating change theories from other disciplines into standard ToC practice so we can better inform responses to complex situations and a climate-changing world

presented by Brian Keogh, Kara Scally-Irvine      FULL DAY | CATEGORY: A

This workshop seeks to extend the ‘Theory of Change’ through established and researched models of change from disciplines outside of evaluation. This supports a more comprehensive understanding of the change process, enabling evaluators to present their work well-grounded in wider theories. Ultimately, we hope this will increase the utility of a fundamental evaluation tool for an increasingly unpredictable future.

The workshop will start with the classic approach to creating program logics, involving a series of interconnected ‘if–then’ propositions. From this foundation, other change theories will be introduced via three conceptual ‘lenses’, highlighting other components that are fundamentally important to achieving change, beyond what is typically captured in a program logic:

  1. People – This section explores why and how recognising and understanding personal, group and broader social psychological states during change is important, and how to consider this when developing an evaluation approach and/or framework.
  2. Power – We delve into why systematically considering power and power dynamics at the individual, group and intervention level is important, and discuss different theoretical frameworks for doing this.
  3. Process – Using the lenses above, we revisit and discuss how to enrich the program logic process. The Theory of Change is a great logical starting point, but psychological state, timing, power, and breadth and depth of influence are critical enablers of the progression of a ‘logical’ (predictable) change process.

The workshop extends domain 2 of the evaluator’s competency framework, 'Theoretical Foundations'. In doing so, it also touches on all the other domains of the framework. It is best suited to intermediate or advanced practitioners.

The workshop will be highly interactive to help participants understand the relevance of these additional lenses. This will primarily be through activities focused on applying them to an adapted real-life climate change project, which exemplifies the complexity evaluators are increasingly facing.  

Brian Keogh has over 20 years' experience in creating and evaluating programs and business models. His consultancy work ranges from facilitating the change required for effective strategic plans and business cases, to carrying out detailed impact, efficiency, and cost-benefit evaluations. He is particularly interested in systems, systems change and the integration of strategy and evaluation. His early studies concentrated on the work and methodologies of Saul Alinsky (American community activist), Margaret Barry (Waterloo Resident Action Group) and Charles Perkins (Indigenous civil rights activist), with a focus on how to empower disadvantaged communities to implement change. In his later MBA studies, he specialised in managing change – learning and understanding how power within organisations is created and exercised. He is one of the Systems Special Interest Group co-chairs and worked with Dr Ralph Renger through the mid-west of America learning Systems Evaluation Theory. Understanding systems and systems change is the foundation of all his work. His most recent work has been with the WaterFix Residential and Cooler Classrooms programs for the NSW Climate Change Fund, auditing the Sydney Drinking Water Catchment and evaluating the Cooks River Alliance of Councils. He has helped establish the evaluation frameworks for the Murray Darling Basin Authority, the NSW Office of Water and the Sydney Catchment Authority. In between this climate/water preoccupation, he has been working with performing arts venues such as Melbourne's Arts House and the Sydney City Recital Hall on creating their strategic plans and solidifying how they would like to change the world.

Kara Scally-Irvine is a Kiwi with over 20 years’ experience in research, monitoring, and evaluation. Kara’s academic background spans zoology, management, and psychology, with a special interest in co-management. Her early studies reflect a lifelong recognition that addressing our collective problems requires holistic and joined-up thinking and approaches. Her career started in science policy before she returned to complete her PhD, which is where Kara’s systems journey formally started. She joined what was at the time the only fully interdisciplinary school in Australasia, at the University of Queensland, and used systems thinking and interdisciplinary approaches to explore the factors preventing Integrated Conservation and Development Projects (ICDPs) from being successful. She wove system modelling together with grounded theory and linked her findings back to the literature via social capital, power, and resilience theory. Since accidentally falling into the evaluation profession 12 years ago, she has worked in a diverse range of sectors, including science research and innovation, sport and recreation, education, and international development. In recent years she has been working in a capability support role for New Zealand’s Crown Research Institutes to help embed evaluative thinking and practice into science and research to maximise research impact. Kara is an active member of the evaluation community: she currently sits on the AES Relationships subcommittee and New Zealand subcommittee, and co-convenes/chairs the AES Systems SIG. She is also Convenor (chair) of ANZEA, currently in her second term.

 > register


Principles and practice of economic evaluation: lessons from health care and relevance to other sectors

presented by Michael Drummond     FULL DAY | CATEGORY: C

Economic evaluation is essential in measuring and understanding the costs and outcomes of public policy interventions, making better informed decisions about resource allocation and determining priority groups for inclusion in interventions. But as a specialised discipline within evaluation, it is not always well understood, which can lead to selecting program options with lower value propositions than alternatives, and missed opportunities to improve outcomes and systems.  

The purpose of this workshop is to provide a grounding in the key principles of economic evaluation, and explore the issues that can arise in the use of economic evaluation. 

The workshop will first cover study design, the measurement of costs and benefits, discounting of future events, and the characterisation of uncertainty. Second, issues in the use of economic evaluation will be discussed, drawing on the presenter’s work in economic evaluation in healthcare policy and management. These include issues in developing programs and guidelines, in transferring economic evaluation results from one location to another, and in using economic evaluation to determine priority groups for inclusion in interventions.

Participants will be asked to work through case study examples from the health sector – where economic evaluators must negotiate a difficult trade-off between public value and the widely held value of the ‘pricelessness’ of life – in both a high- and a lower-income setting, and to discuss their policy decision and justification, drawing on what they have learned in the workshop.

Attendees are encouraged to bring examples from their own work for discussion, to enable them to better understand how to apply economic evaluation principles in their sector. 

The objective of the workshop is to support evaluation commissioners, program designers and decision makers to feel confident in their understanding of economic evaluation’s benefits and potential issues, and in applying economic evaluation reasoning in their work. It aims to support evaluators to deepen their understanding of how to apply insights from economic evaluation approaches in various settings, and to be able to better support optimal decisions through utilising economic data and reasoning.

Michael Drummond's main field of interest is in the economic evaluation of health care treatments and programmes. He has undertaken evaluations in a wide range of medical fields including care of the elderly, neonatal intensive care, immunization programmes, services for people with AIDS, eye health care and pharmaceuticals. He is the author of two major textbooks and more than 700 scientific papers. He has been President of the International Society of Technology Assessment in Health Care, and the International Society for Pharmacoeconomics and Outcomes Research. In October 2010 he was made a member of the National Academy of Medicine in the USA. He has advised several governments on the assessment of health technologies and chaired one of the Guideline Review Panels for the National Institute for Health and Care Excellence (NICE) in the UK. He is currently Co-Editor-in-Chief of Value in Health and has been awarded three honorary doctorates, from City University (London), Erasmus University (Rotterdam) and the University of Lisbon. He was a member of the Steering Group for the 2020-22 NICE Methods Review.

 > register


Measurement, evaluation and learning today

presented by Froukje Jongsma, Shani Rajendra    FULL DAY | CATEGORY: A, B

MEL is the process of measuring, evaluating, and learning about the progress and results of any change initiative. MEL works best when it is embedded into everyday ways of working and should commence at the start of an initiative and continue all the way through. MEL is practised by change makers (communities, program teams, commissioners, and funders) and supports them to surface and understand the impact that their initiative is having, while also providing evidence and insights to help them continually adapt their work to help achieve better outcomes. It guides change makers to:  

  • get clear and on the same page about the change they want to make and how to work towards it
  • work out what numbers and stories to collect to measure progress and results
  • draw out insights about what is working, what is not working, and why to adapt the work
  • set and answer the big questions about things like whether the program is worthwhile, whether it is equitable, etc.
  • incorporate the values of stakeholders into the judgement-making process 
  • powerfully report to their community and others about what they have learned and what difference they have made.

In this workshop, we will introduce you to the principles of MEL and the components of building and implementing a MEL system. We will work with a practical case study and small group discussions. We will also explore how MEL can work across different contexts, focussing on community engagement, collaboration and co-design. Beginners are welcome at this workshop.

Why MEL?

MEL is being increasingly adopted by community groups, not-for-profits and governments, both here in Australia and in the international development space. The strong focus on the L in MEL is particularly applicable to today’s context of complexity and crisis – the need to take an adaptive approach to social and environmental initiatives is critical. MEL is also increasingly being adopted in diverse cultural contexts and led by local teams. The ability of MEL to be adaptable and led by local people is becoming critical given the move towards decolonisation and locally led approaches. The slight shift in terminology from monitoring to measurement also marks the increasing use of digital approaches and public data sets, and reflects the move towards 'social impact measurement'.

Froukje Jongsma is a Senior Consultant at Clear Horizon with 10+ years of experience in supporting teams and communities to develop effective MEL plans. She co-led the co-design process of Hands Up Mallee’s MEL Framework, winner of the 2022 SIMNA Award in the category ‘Outstanding Collaboration in Social Impact Measurement’. Other examples of her work include designing the MEL system and tools for the University of South Australia’s multiple award-winning Community Connect program together with students, and working with the Maningrida community in Northwest Arnhem Land to co-create a two-worlds MEL plan for the Maningrida Youth Strategy collaboration, which brings Aboriginal and non-Aboriginal approaches to MEL together. Froukje has extensive experience supporting people and organisations to develop a deep understanding of the issues they are aiming to solve and to translate their insights into impactful strategies, action, and results. A key highlight of her career has been co-leading the Connected Beginnings work in Galiwin’ku together with Yolŋu leaders.

Shani Rajendra is a Principal Consultant at Clear Horizon and has been working in the field of evaluation for over five years. Shani has extensive experience in community-led initiatives, organisational strategy, and social enterprise. She specialises in incorporating design thinking into evaluative practice and primarily works in community-led or systems change interventions as well as in organisational strategy. Shani has co-developed over 20 MEL plans, including with diverse community groups and in complex settings. An experienced researcher and facilitator, Shani brings her appreciation for diversity (in thought and culture) and a commitment to collaboration to her work in design and evaluation. She is most passionate about community-led change and is committed to supporting people to shift the systems that they are within.

 > register