
2012 Pre-Conference Workshops

Thursday, September 6, 2012, Koʻolau Ballrooms, Kāneʻohe


A. Practical Program Design: Principles and Tools for Evaluators (Session 1)

by John Gargani and Stewart Donaldson

9:00 am-12:00 pm, Thursday, September 6, 2012

This hands-on workshop consists of two half-day sessions. In the first session, participants will learn concrete tools they can use to design new programs or strengthen the designs of existing programs. In the second session, participants will apply these tools to their own programs—new or existing—and produce a complete “design sketch.”

Please note that attendance at the second session requires attendance at the first. However, participants may choose to attend the first session only.

Participants may want to attend the workshop in teams that collaborate on the development, management, and/or evaluation of the same program.

By the end of the workshop, participants will be able to use concrete tools to design new programs and improve the designs of existing programs. The workshop is organized around the five elements of a program design. Participants will learn about each element and how it relates to program theory and social science research. We will outline some common modes of program failure and explain how specific features of a program’s design may help prevent them. Participants will reinforce what they learn by working in small groups on hands-on activities drawn from evaluation practice.


B. Practical Program Design: Principles and Tools for Evaluators (Session 2)

by John Gargani and Stewart Donaldson

1:30-4:30 pm, Thursday, September 6, 2012


This session will provide participants with an opportunity to apply what they learned in the first session to the designs of their own programs. Participants may design a new program from scratch or work to improve the design of an existing program. With the assistance of the instructors, participants will produce a comprehensive “design sketch” that includes all five elements of a program design. The session will conclude with a “studio walkthrough” in which participants describe their designs and receive constructive feedback from attendees and the instructors. Participants can choose to craft their designs individually or in teams. The design studio may be of particular interest to teams of professionals who are developing, managing, and/or evaluating the same program.

 


John Gargani is the President and Founder of Gargani + Company, Inc., a program design and evaluation firm located in Berkeley, California.


He helps clients—nonprofit organizations, foundations, corporations, and government agencies—achieve their social missions.


Over the past 20 years, he has designed innovative programs and curricula; directed randomized trials of educational reforms; developed new reading, writing, science, and math assessments; and created novel technologies that measure how people think. His work has taken him to diverse settings, including public housing projects, museums, countries adopting free market economies, and 19th century sailing ships.


He shares his knowledge of program design and evaluation in his blog, published articles, workshops, and speaking engagements.


He holds a Ph.D. in Education from UC Berkeley, where he studied measurement and evaluation; an M.S. in Statistics from New York University’s Stern School of Business; and an M.B.A. from the University of Pennsylvania’s Wharton School of Business.


 

Stewart I. Donaldson is Professor and Chair of Psychology, Director of the Institute of Organizational and Program Evaluation Research, and Dean of the School of Behavioral and Organizational Sciences at Claremont Graduate University. Dr. Donaldson is also currently the Director of the American Evaluation Association’s (AEA) Graduate Education Diversity Internship (GEDI) Program and is serving a 3-year elected term on the AEA Board. 


He leads the Certificate for the Advanced Study of Evaluation Program at Claremont (a distance education program for working professionals), has taught numerous university courses and professional development workshops, and has mentored and coached more than 100 graduate students and working professionals during the past two decades. He has also provided applied research and evaluation services to more than 100 different organizations.


He is a fellow of the Western Psychological Association and serves on the editorial boards of the American Journal of Evaluation, New Directions for Evaluation, Evaluation and Program Planning, the Journal of Multidisciplinary Evaluation, and SAGE Research Methods Online. Professor Donaldson has authored or co-authored more than 200 evaluation reports, scientific journal articles, and chapters. His recent books include: Social Psychology and Evaluation (2011); Advancing Validity in Outcome Evaluation: Theory and Practice (2011); Applied Positive Psychology: Improving Everyday Life, Health, Schools, Work, and Society (2011); Teaching Psychology Online (in press); Emerging Practices in Development Evaluation (forthcoming); The Future of Evaluation in Society: A Tribute to Michael Scriven (forthcoming); What Counts as Credible Evidence in Applied Research and Evaluation Practice? (2008); Program Theory-Driven Evaluation Science: Strategies and Applications (2007); Applied Psychology: New Frontiers and Rewarding Careers (2006); and Evaluating Social Programs and Problems: Visions for the New Millennium (2003).


Dr. Donaldson has been honored with Early Career Achievement Awards from the Western Psychological Association and the American Evaluation Association.


C. Survey Boot Camp: Maximize the Value of Your Surveys

by Katherine Tibbetts and Jim Dannemiller

1:30-4:30 pm, Thursday, September 6, 2012

Whether you write surveys yourself, direct others to write them, or just read the ones that other people write, this workshop will provide the understanding it takes to design survey content that works.


Surveys have always been our “go-to” tool for getting information from people, and Survey Monkey has put that tool in the hands of anyone with a laptop. The number of surveys out there has skyrocketed, but their quality and effectiveness have not kept pace. If you want to make your surveys stand out from the competition, accurately measure attitudes and behaviors, and produce results that work, then this workshop is for you!


Participants in this workshop will get hands-on experience developing surveys from the ground up: designing an inquiry, writing effective questions, choosing the most effective response options, and optimizing the flow. The workshop is designed for beginner-to-intermediate-level survey developers and users.


What you will gain from this workshop:

  • Tools that walk you through the survey design process
  • Guidelines for writing items that
    • reduce unnecessary respondent burden and
    • increase survey completions
  • Design tips to improve quality of responses
  • Methods to increase response rates
  • Lunch and an afternoon snack

 

Katherine Tibbetts conducts educational program evaluation and research at Kamehameha Schools. Her current role at KS includes technical support to KS program staff for program monitoring and evaluation and research related to the well-being of Native Hawaiians. Kathy has taught educational research methods, classroom assessment, survey research, and evaluation methods courses at the University of Hawaiʻi at Mānoa.



 




Jim Dannemiller is President of SMS Research. He began his career with nine years of writing surveys for the Survey Research Office at the University of Hawaiʻi before moving to SMS. In addition to his job, Jim has taught part time for most of his career, offering courses in survey design, marketing research, communications research, and statistics at the University of Hawaiʻi at Mānoa, Chaminade University, and Hawaii Pacific University over the past 15 years. He has experience in all aspects of survey research, including design, instrumentation, sampling, data collection, data processing, analysis, and reporting. Lest he become a dull boy, Jim enjoys Hawaiian music, genealogy, cooking, collecting fountain pens, and Wahine volleyball.


Contact Us


Hawai'i-Pacific Evaluation Association

P.O. Box 283232, Honolulu, HI 96828


H-PEA is a tax-exempt charitable organization under Section 501(c)(3) of the Internal Revenue Code and is eligible to receive tax-deductible contributions.


Copyright 2016 - Hawai'i-Pacific Evaluation Association
