H-PEA Summer Workshop: Become a Data Visualization Pro!

A hands-on training on creating data visualizations in Microsoft Excel. 

Thursday, June 8, 3:00, at Hālau ʻĪnana (2482 S. Beretania St.)




2017 H-PEA Conference

September 21 and 22, 2017

Ko'olau Ballrooms in Kane'ohe


We are happy to announce 

Michael Quinn Patton 

as our keynote speaker!

Michael Quinn Patton, an independent consultant, has more than 40 years of experience conducting applied research and program evaluation. 


H-PEA invites you to participate in one or more of the following opportunities at the annual conference:

  • Demonstration
  • Roundtable (Work-in-progress/Issue & discussion)
  • Paper Presentation
  • Symposium
  • Poster session

To meet demand for hands-on presentations, we will hold separate sessions for demonstrations and roundtable discussions, and we encourage proposals for these types of presentations.

Learn more and submit a Proposal using the link to the Proposal Submission Form! 

This form requires a summary, title, abstract, theme, intended audience, and relevance statement.


Workshops on Thursday, September 21st

  One 3-hour (half-day) Workshop $70.00

  Two 3-hour (full-day) Workshops $120.00

Important Dates:

  • May 18th (5pm HST): Proposal submission deadline
  • June 1st (5pm HST): Notification of acceptance
  • Sept. 1st (5pm HST): Early-bird registration deadline 
  • Sept. 1st (5pm HST): Last day to get 100% refund on registration
  • Sept. 16th (5pm HST): Last day to get 50% refund on registration; no refunds past this date
  • Workshops: Thursday, Sept. 21, 2017, 9:00am-4:00pm
  • Conference: Friday, Sept. 22, 2017, 8:00am-4:00pm

H-PEA Best Poster Awards In Honor of Lois-ellin Datta  

First place:      $50

Second place: $30

Third place:     $20

Conference attendees will have the opportunity to evaluate and score posters that will be on display throughout the day. 

Please check out the poster post on evergreendata!


H-PEA 2017 Conference Workshops

Developmental Evaluation

Michael Quinn Patton

Thursday, September 21st, 9am-12pm

Developmental evaluation provides evaluative information and feedback to social innovators, and their funders and supporters, to inform the adaptive development of change initiatives in complex dynamic environments. Developmental evaluation brings to innovation and adaptation the processes of asking evaluative questions, applying evaluation logic, and gathering and reporting evaluative data, providing timely feedback to inform and support the development of innovative projects, programs, initiatives, products, organizations, and systems-change efforts.

Participants will learn:

  • The five types of Developmental Evaluation (DE)
  • When DE is appropriate
  • The 8 DE principles
  • Strengths and weaknesses of the DE approach
  • The niche of DE

Principles-focused Evaluation

Michael Quinn Patton

Thursday, September 21st, 1pm-4pm

Evidence about program effectiveness involves systematically gathering and carefully analyzing data about the extent to which observed outcomes can be attributed to a program’s interventions. It is useful to distinguish three types of evidence-based conclusions:

  1. Single evidence-based program. Rigorous and credible summative evaluation of a single program provides evidence for the effectiveness of that program and only that program.
  2. Evidence-based model. Systematic meta-analysis (statistical aggregation) of the results of several programs all implementing the same model in a high-fidelity, standardized, and replicable manner, and evaluated (ideally) with randomized controlled trials, to determine the overall effectiveness of the model. This is the basis for claims that a model is a “best practice.”
  3. Evidence-based principles. Synthesis of case studies, including both processes and outcomes, of a group of diverse programs or interventions all adhering to the same principles but each adapting those principles to its own particular target population within its own context. If the findings show that the principles have been implemented systematically, and analysis connects implementation of the principles with desired outcomes through detailed and in-depth contribution analysis, the conclusion can be drawn that the practitioners are following effective evidence-based principles.

Principles-focused evaluation treats principles as the intervention and unit of analysis, and designs an evaluation to assess both the implementation and the consequences of principles. Principles-focused evaluation is a specific application of developmental evaluation, because principles are the appropriate way to take action in complex dynamic systems. This workshop will be the worldwide premiere of principles-focused evaluation training; specific examples and methods will be part of the training.

Participants will learn:

  • What constitutes a principle that can be evaluated
  • How and why principles should be evaluated
  • Different kinds of principles-focused evaluation
  • The relationship between complexity and principles
  • The particular challenges, strengths, and weaknesses of principles-focused evaluation

Michael Quinn Patton

Former President of the American Evaluation Association; recipient of both the Alva and Gunnar Myrdal Award from the Evaluation Research Society for "outstanding contributions to evaluation use and practice" and the Paul F. Lazarsfeld Award for lifetime contributions to evaluation theory from the American Evaluation Association. He is the author of eight evaluation books, including the 4th editions of Utilization-Focused Evaluation (2008) and Qualitative Research and Evaluation Methods (2015); his books have been used in over 500 universities worldwide.

Questions about this workshop may be addressed to mqpatton@prodigy.net.



Quasi-experimental Designs: When Experimental Designs Are Not Good Enough

John P. Barile

Thursday, September 21st, 9am-12pm

Evaluators often pursue experimental designs because of their perceived superiority, but these designs are often inappropriate and can yield misleading conclusions. This workshop will provide an overview of quasi-experimental techniques, give examples of when each technique is most appropriate, and offer a tutorial on conducting propensity score matching for program evaluation.

Propensity score matching is a specific approach to conducting a quasi-experimental evaluation. It is used to pair individuals, classrooms, or communities that received an intervention with individuals, classrooms, or communities that did not. Propensity score matching techniques use data on why individuals self-select into a program in order to match them to statistically similar individuals who did not receive the program. This workshop will show how one-to-one and full matching (matching more than one control case to each treated case) can be conducted using the SPSS and R statistical packages.

Participants will:

  • Learn about different quasi-experimental approaches to conducting program evaluations
  • Learn when each approach is most appropriate given available data and program characteristics
  • Learn how to conduct propensity score matching with data in SPSS and R
  • Learn how matched data can be used to determine the impact of a program
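The workshop itself uses SPSS and R, but the core idea behind one-to-one matching — estimate each unit's probability of receiving the treatment, then pair each treated unit with the control whose probability is closest — can be sketched in plain Python. Everything below (the toy data and the `fit_propensity` and `one_to_one_match` helpers) is illustrative only, not workshop material:

```python
import math

def fit_propensity(X, treated, lr=0.1, epochs=500):
    """Estimate propensity scores P(treated | X) with a simple
    logistic regression fit by batch gradient descent.

    X: list of feature lists; treated: list of 0/1 indicators.
    Returns the estimated score for each unit.
    """
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, ti in zip(X, treated):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - ti                      # gradient of log-loss
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gwj / n for wj, gwj in zip(w, gw)]
        b -= lr * gb / n
    return [1.0 / (1.0 + math.exp(-(b + sum(wj * xj for wj, xj in zip(w, xi)))))
            for xi in X]

def one_to_one_match(scores, treated):
    """Greedy one-to-one matching: pair each treated unit with the
    unused control whose propensity score is closest.
    Returns (treated_index, control_index) pairs."""
    controls = [i for i, t in enumerate(treated) if t == 0]
    pairs = []
    for i in (i for i, t in enumerate(treated) if t == 1):
        best = min(controls, key=lambda c: abs(scores[c] - scores[i]))
        pairs.append((i, best))
        controls.remove(best)                 # each control used at most once
    return pairs

if __name__ == "__main__":
    # Hypothetical toy data: one covariate, strongly related to treatment.
    X = [[0.1], [0.2], [0.9], [1.1], [0.15], [1.0], [0.3], [0.95]]
    treated = [0, 0, 1, 1, 0, 1, 0, 1]
    scores = fit_propensity(X, treated)
    print(one_to_one_match(scores, treated))  # each treated unit paired with a distinct control
```

This greedy nearest-neighbor sketch omits refinements a real analysis would use (calipers, full matching, balance diagnostics); production work would typically rely on established tools such as R's matching packages rather than hand-rolled code.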

John (Jack) P. Barile

Jack Barile is an assistant professor at the University of Hawai‘i at Mānoa in the Department of Psychology. Jack's research concerns ecological determinants of health-related quality of life and program evaluation. This line of research includes the study of individual and neighborhood-level factors associated with social disadvantage and well-being. Prior to coming to UH Mānoa, Jack served as a research fellow at the US Centers for Disease Control and Prevention.


Social Action Evaluation Using Photography: From Needs Assessment and Program Development to Implementation and Evaluation

Jacqueline Ng-Osorio & Anna Smith Pruitt

Thursday, September 21st, 1pm-4pm

“A picture is worth a thousand words,” so goes the old adage. Able to convey complex notions in succinct form, photos can be powerful and persuasive tools. Photovoice, a participatory research methodology, uses photography, critical analysis, and group discussion to capture the perspectives and experiences of marginalized people in an effort to give voice to underserved communities and populations. Participants become active researchers at each stage of the research process, providing insight through their analysis of their photos and assisting in the dissemination of findings.

This method is especially useful when evaluating programs that work with marginalized groups, youth, and indigenous populations. By giving program participants and stakeholders a voice in the evaluation, Photovoice can lead to more accurate accounts of people’s experiences with the program, resulting in richer and more valuable data. Although particularly beneficial for evaluations of programs serving marginalized groups, Photovoice can also be useful for engaging various stakeholders at different stages of evaluation, from engagement and needs assessment through program implementation to process and outcome evaluations.

This workshop will provide hands-on instruction for using Photovoice in various evaluation projects, with an emphasis on the ultimate goal of Photovoice: achieving social action through the dissemination of results.

Participants will:

  • Learn how Photovoice has been used in evaluations and research that work with different underserved communities and populations;
  • Learn how different populations can be engaged through community-based participatory action research (CB/PAR) to achieve project and evaluation goals;
  • Practice analyzing photos in a Photovoice format; and
  • Explore how social action may be facilitated through Photovoice projects.

Participants are encouraged to bring a picture answering the question: 

“What does evaluation look like for me/my organization?”

Format: Interactive with hands-on training in Photovoice

Target audience: Beginners to Intermediate

Jacqueline Ng-Osorio

Dr. Ng-Osorio is responsible for assessment and evaluation for both the Department of Nursing and the Department of Dental Hygiene. She also works with graduate ‘IKE AO PONO students as an advisor. Prior to coming to UH Mānoa Nursing and Dental Hygiene, Dr. Ng-Osorio worked as a researcher at Kamehameha Schools focusing on the K-12 internal and external programs. Her research focuses on Native Hawaiian adolescents, healthy lifestyle behaviors including physical activity and eating habits, as well as the relationship between education and health outcomes. She also provides training and workshops on the qualitative methodology Photovoice.

Anna Smith Pruitt

Anna Smith Pruitt is a doctoral candidate in the Community & Cultural Psychology program at the University of Hawai‘i at Mānoa. Her research focuses on the ways in which individuals and communities impact and are impacted by environmental and contextual factors (including historical, cultural, and social factors). By taking an ecological and feminist intersectional approach, her research attempts to quantify and explain the ways in which individual and contextual factors interact to impact community and individual health and quality of life. In particular, her research interests include university-community partnerships, developing methodologies that capture context, innovative qualitative methods (e.g., Photovoice), and the interaction between research, evaluation, and social policy. Her research has practical application with an ultimate goal of social justice and equitable distribution of resources.



See Event Photos

Contact Us

Hawai'i-Pacific Evaluation Association

P.O. Box 283232, Honolulu, HI 96828

H-PEA is a tax-exempt charitable organization under Section 501(c)(3) of the Internal Revenue Code and is eligible to receive tax-deductible contributions.

Copyright 2016 - Hawai'i-Pacific Evaluation Association
