
Hawai'i-Pacific Evaluation Association



A. Presenting Data Effectively by Stephanie Evergreen - SOLD OUT

Presenter: Stephanie Evergreen

9:00 am - 4:00 pm, Thursday, September 8, 2016 (lunch provided, 12:00-1:00 pm)

Cost: $115.00


Crystal clear graphs, slides, and reports are valuable – they save an audience's mental energy, keep a reader engaged, and make you look smart. In this workshop, attendees will learn the science behind presenting data effectively and will leave with direct, pointed changes they can apply immediately to significantly increase impact. The workshop will address principles of data visualization, report, and slideshow design that support legibility, comprehension, and retention of our data in the minds of our clients. Together we will focus on how to make visual sense of data and distribute it in a way that readers will discuss, remember, and love. Distribution formats include dashboards, infographics, and simple interactive games. Grounded in visual processing theory, the principles will enhance attendees' ability to communicate more effectively with peers, colleagues, and stakeholders.

Attendees will receive loads of handouts and materials, exclusively available to workshop participants, that will guide them through the tried-and-true process of developing visuals that teach, engage, and make them look like rockstars.

Dr. Stephanie Evergreen is an internationally recognized speaker, designer, and researcher. She is best known for bringing a research-based approach to helping researchers better communicate their work through more effective graphs, slides, and reports. She holds a PhD in interdisciplinary evaluation from Western Michigan University, which included a dissertation on the extent of graphic design use in written research reporting. Dr. Evergreen has trained researchers worldwide through keynote presentations and workshops, for clients including Time, Verizon, Head Start, American Institutes for Research, Rockefeller Foundation, Brookings Institute, and the United Nations. She is the 2015 recipient of the American Evaluation Association's Guttentag award, given for notable accomplishments early in a career. Dr. Evergreen is co-editor and co-author of two issues of New Directions for Evaluation on data visualization, and she writes a popular blog on data presentation. Her book, Presenting Data Effectively: Communicating Your Findings for Maximum Impact, was published by Sage in Fall 2013 and was #1 in Social Science Research on Amazon in the US and UK for several weeks. Her second book, Effective Data Visualization, will be published in Spring 2016.

B. Using a Validity Argument to Plan Better Surveys

Presenter: George Harrison

9:00 am - 12:00 pm, Thursday, September 8, 2016 (lunch provided, 12:00-1:00 pm)

Cost: $65.00; register for the afternoon session too and receive a discount!


Freakonomics author Stephen Dubner characterizes survey responses as the lowest form of data. The biggest problem is that self-report data are prone to biases we seldom encounter in other data collection methods, such as assessments or direct observations. Another problem is that we often rush through survey development and assume we have a good instrument. Surveys are not going away, though. They address attitudes, perceptions, and behaviors that are not easily measured using other methods. The question is, how can we strengthen the claims we make based on survey responses? The answer is simple: we present a validity argument. A validity argument substantiates your claims about the interpretations of the survey data by drawing from multiple sources of evidence. More importantly, planning a validity argument can help you plan your survey. By working backwards, we anticipate weaknesses in the argument, consider our resources and stakes, and focus our survey development to make the strongest case possible.

Validity argument evidence can include data from content experts, cognitive interviews, factor analysis, correlations with other variables, and unintended consequences. But not all of these are necessary for every survey. Decisions about which types of evidence to include depend on the nature of the constructs you are measuring, the feasibility of collecting the evidence, and the stakes of the survey's use. Using a graphical Toulmin argument model, together with knowledge of the types of validity evidence and a description of the stakes and feasibility, participants will be invited to develop a skeleton of their own validity argument. By planning a validity argument, participants will gain insight into ways they can improve their instrument development and strengthen the credibility of their survey results.

Participants will

  • gain an understanding of what validity is,
  • learn about common threats to validity in self-report questionnaires,
  • learn what types of evidence can constitute a validity argument,
  • develop a graphical Toulmin argument model for their own survey project, and
  • discuss ways that a survey project can be planned so that validity can be strengthened.

Level: Intermediate. Participants with experience in writing survey questions and an understanding of how surveys are used in evaluation will gain the most insight.

Dr. George Harrison is an assistant professor of evaluation in the Curriculum Research & Development Group at the University of Hawai‘i at Mānoa, where he also serves as cooperating faculty in the Department of Educational Psychology and teaches courses in assessment, statistics, and survey research. He specializes in instrument development for use in evaluating educational programs and in conducting research on evaluation. His recent work has been in measuring ‘non-cognitive’ skills such as metacognition. He has published articles on assessment and evaluation in measurement and evaluation journals.

C. Focus Groups 101: Reading Between the Numbers

Presenters: Marissa Vasquez Urias and Ana Bravo

1:00 pm - 4:00 pm, Thursday, September 8, 2016 (lunch provided, 12:00-1:00 pm)

Cost: $65.00; register for the morning session too and receive a discount!

What stories do your survey results tell you? Quantitative methods may answer the What, Where, When, and How questions your organization asks, but how do you obtain the Why? Focus group interviews reveal the stories surrounding a common topic, issue, or experience that quantitative methods may not reveal. This insight is rich and valuable data needed to deliver thoughtful and optimal interventions and services to our students or clients. However, using focus groups for obtaining data requires deliberate planning, execution, and analysis. This workshop will provide you with the knowledge and skills to effectively use focus group interviews as a method to supplement your current methods of evaluation. By the end of this workshop, you will have developed a preliminary focus group plan from your organization's existing quantitative data.

Participants will

  • understand how data can inform praxis and decision making
  • design an effective focus group protocol
  • practice focus group facilitation

Participants are encouraged to bring (or have online access to) a sample survey result or quantitative data from their own organization.

Format: Interactive with hands-on training on designing and conducting focus groups.

Target audience: Beginner to intermediate

Marissa Vasquez Urias, Ed.D., is an assistant professor in the Department of Administration, Rehabilitation, and Postsecondary Education (ARPE) at San Diego State University (SDSU).

As the Associate Director of the Minority Male Community College Collaborative (M2C3) at SDSU and a Faculty Affiliate with Project MALES at UT Austin, Marissa is actively engaged in critical and applied research that challenges structural and cultural practices that prevent equitable outcomes for historically underserved students in higher education. In particular, Marissa's work examines factors that facilitate the success of male students of color, particularly Latino and African American men, in community colleges. As a scholar-practitioner, Marissa also serves as the Managing Editor for the Journal of Applied Research in Community College (JARCC) and as a Board Member for the Council for the Study of Community Colleges (CSCC).

Marissa earned an associate degree from Southwestern College in San Diego, CA. She then earned a bachelor’s degree in English from the University of California, Berkeley; a master’s degree in Counseling with a specialization in College Counseling and Student Development from the University of San Diego; and an Ed.D. in Educational Leadership from San Diego State University.

Ana Bravo, M.A., is a counseling faculty member in the First Year Experience (FYE) program at Kapi‘olani Community College. In addition to assisting students in their transition to college, she also serves as an Assessment Coach for Student Affairs and the Counseling Assessment Coordinator. Her assessment experience ranges from academic affairs to student services, including course assessment, accreditation, high school outreach, college prep, career exploration, federally and state-funded student services programs, student veterans, student leadership, and community organizations. Her passion lies in improving the access, retention, and success of underrepresented and marginalized students.

She holds B.A. degrees in Psychology and Sociology from the University of California, Los Angeles (UCLA), and an M.A. in Counseling with a specialization in College Counseling and Student Development from the University of San Diego, and she is finishing her Ed.D. in Educational Leadership at San Diego State University. Using student voices, her dissertation research explores the community college experience of Filipino American students in Hawai‘i.

updated 05/03/2016

Contact Us

Hawai'i-Pacific Evaluation Association

P.O. Box 283232, Honolulu, HI 96828

H-PEA is a tax-exempt charitable organization under Section 501(c)(3) of the Internal Revenue Code and is eligible to receive tax-deductible contributions.
