2015 Conference Schedule

2015 Pre-Conference Workshops

Thursday, September 10, 2015, Koʻolau Ballrooms, Kāneʻohe

Register Now


A. Actionable Evaluation: Evaluative Reasoning and Practical Methodology

Presenter: Jane Davidson

9:00 am - 4:00 pm, Thursday, September 10, 2015 (lunch provided, 12:00-1:00 pm)

Cost: $115.00

This workshop is the antidote to evaluations that get lost in indicators, metrics, observations, and stories. If evaluation feels more like a measurement or opinion-gathering exercise, or if you wonder whether there is something clearer and more valuable you could be delivering for clients (or commissioning, if you are a client), then this workshop is for you. It will show you how to keep your eye on the big picture, ask the most important questions, and deliver well-reasoned answers that people can clearly understand and use to inform action.

The focus is on how we use evaluative reasoning to ask and answer big-picture questions about the quality, value, and importance of program design, implementation, and outcomes, and about the program's overall worth. We make the evaluative reasoning systematic and transparent through the development and application of an evaluation-specific methodology: rubrics. The emphasis is on asking the right high-level evaluative questions and generating clear, direct answers that are both well-reasoned and well-evidenced.

You will learn:
  • How to write a set of big-picture overarching questions to guide the evaluation;
  • How to develop and use rubrics to get clear, defensible answers to these questions;
  • How to respond appropriately to simplistic indicator-based thinking and accusations of subjectivity;
  • Tips for evaluation conceptualization and reporting to help you deliver clear, to-the-point, and actionable evaluations.

This workshop combines interactive mini-lectures, whole-group discussions, and small-group exercises. It is not a research methods workshop.

The target audience is less seasoned or beginning evaluators and evaluation clients who may or may not be familiar with applied research methods (not a prerequisite) but who have the niggling feeling there’s “something more” they need in their toolkits: something simple but not simplistic.

 

Hailing from Aotearoa/New Zealand and holding a doctorate from Claremont Graduate University in California, Jane runs her own evaluation consulting firm, serving central government and other clients across multiple sectors. Her work includes evaluation capacity building, training and development, strategic evaluation advice and support, and collaborative and participatory evaluation, as well as independent evaluations.

Jane is a winner of AEA's Marcia Guttentag Promising New Evaluator Award, co-editor of the Journal of Multidisciplinary Evaluation, and former Associate Director of The Evaluation Center at Western Michigan University and Director of WMU's Interdisciplinary Ph.D. in Evaluation.

She is author of Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation (SAGE, 2004), which was recommended by AEA President Debra Rog in her Evaluation 2009 presidential address and is widely used internationally by both practitioners and graduate students.

Jane has presented numerous keynote addresses and professional development workshops internationally, including those for the American Evaluation Association, the UK Evaluation Society, the Evaluator’s Institute (U.S.A.), the Aotearoa New Zealand Evaluation Association, and the University of South Africa.

Visit Jane's website: http://realevaluation.com/



B. From Idea to Impact: Developing a Meaningful Logic Model

Presenter: Jack Barile

9:00 am - 12:00 pm, Thursday, September 10, 2015 (lunch provided, 12:00-1:00 pm)

Cost: $65.00; register for the afternoon session too and receive a discount!

Goals: Participants will learn
  • why creating and maintaining a logic model is so important.
  • the fundamentals of creating a logic model from scratch.
  • how to identify common barriers to using a logic model.
  • how to communicate logic model components to diverse audiences.

A logic model is a graphical representation of the relationships among a program's settings, activities, outcomes, and intended impacts. Logic models were first conceived in the late 1970s (Wholey, 1979) but were rarely used in system-level planning and evaluation prior to the late 1990s (Julian, 1997; Julian, Jones, & Deyo, 1995). Despite their rather slow integration into system-level evaluations, logic models can, and likely should, be used as the basis for both large and small program evaluation activities. Logic models give programs the opportunity to communicate a unified set of goals and objectives that all stakeholders can understand. Specifically, they allow organizations to articulate the need for specific programs, describe the settings in which those programs operate, identify how programmatic activities work toward their goals and objectives, and show how the programs ultimately impact their communities. Moreover, logic models allow stakeholders to ensure that the aims of the program align with the mission of the organization.

Format: Interactive with several hands-on activities.

Target audience: Beginners and those who would like to refresh their understanding of how to develop and discuss logic models with stakeholders; evaluators and program managers.

Agenda:
  • Introduction and objectives
  • The logic of a logic model
  • Common formats and presentation (with activity)
  • Steps in constructing a logic model
  • Constructing basic logic models for your own organizations (with activity)
  • Identifying cracks in the logic (with activity)
  • Communicating ideas effectively

Jack Barile joined the faculty at the University of Hawaiʻi at Mānoa in the fall of 2012 after a two-year postdoctoral fellowship at the US Centers for Disease Control and Prevention. He is trained as a community psychologist and is interested in conditions and services that affect health and well-being. Over the past ten years, Jack has conducted process- and outcome-focused evaluations for outpatient mental health programs, housing services, afterschool programs, and community collaboratives. He currently works with local and national nonprofits to identify ways to make the best use of data they already have and to develop sustainable evaluation methods for ongoing programmatic feedback.

Visit Jack's website: http://www.jackbarile.com/

 

Register Now

 


C. From Idea to Impact: Putting Logic Models to Work

Presenter: Jack Barile

1:00 pm - 4:00 pm, Thursday, September 10, 2015 (lunch provided, 12:00-1:00 pm)

Cost: $65.00; register for the morning session too and receive a discount!

Goals: Participants will learn

  • how to match program monitoring and measurement activities to short- and long-term goals.
  • how logic models and measurement activities can be used to answer research questions.
  • how to use their findings to inform stakeholders, make programmatic decisions, and make revisions to their logic model.

Logic models can and should be used as the basis for program monitoring and measurement activities. Oftentimes, however, logic models, program monitoring activities, and evaluators' research questions are conceived and carried out as separate endeavors. This workshop aims to demonstrate how to connect these components.

Format: Interactive with several hands-on activities.

Target audience: Those who are familiar with logic model basics; those interested in learning ways to use logic models; evaluators and program managers.

Agenda:

  • Matching logic model components to a measurement plan (with activity)
  • Matching a measurement plan to process and outcome oriented research questions (with activity)
  • Communicating your findings to stakeholders
  • Making programmatic decisions and revising your logic model (with activity)


See Jack's bio above.

 
Visit Jack's website: http://www.jackbarile.com/

updated 05/18/2015