ARCHIVE: H-PEA EVENTS
2020 Annual Conference and Pre-Conference Workshops
2019 Annual Conference and Pre-Conference Workshops
Keynote speaker - Nicole Bowman
Demonstrations, papers, posters, symposium and roundtable sessions
2018 Annual Conference and Pre-Conference Workshops
Keynote speaker - David Fetterman
Demonstrations, papers, posters, symposium and roundtable sessions
Conference evaluation report
2017 Annual Conference and Pre-Conference Workshops
2016 Annual Conference and Pre-Conference Workshops
2016 March Workshop: How can evaluative thinking and practice inform performance models for organizations and personnel?
There are many ways in which we measure, count, and determine whether something is worth the effort. In Australia and many other countries, new government legislation requires government-funded entities to become more transparent in their practice and to develop a more cohesive narrative about their worth, or impact, for the betterment of society. This places the executives of such entities in a position of needing evaluative thinking and practice to guide how they build a narrative that documents and demonstrates this type of impact. In thinking about where to start, executives, project managers, and program managers may consider this workshop as a professional development opportunity to explore both the intended and unintended consequences of performance models as tools of evaluation.
This workshop will offer participants an opportunity to unpack the place of performance models as an evaluative tool through the following:
Dr. Lyn Alderman is the Associate Director, Academic Quality and Standards, Chancellery at Queensland University of Technology (QUT). She holds a PhD in evaluation and serves as President of the Australasian Evaluation Society and Editor of the Evaluation Journal of Australasia. Dr. Alderman led the implementation of two major evaluation systems for her university and has received a national award for QUT’s approach to standardized reporting, in which the evaluative framework leads to curriculum conversations.
2015 Annual Conference and Pre-conference Workshops
2015 July-August Workshop Series: Qualitative Content Analysis using Microsoft Access
Content analysis stands at the intersection of qualitative and quantitative methods. It begins with qualitative data but codes the data for both quantitative and qualitative analysis. The workshop will use a method I have developed to code and manage data for qualitative content analysis using Microsoft Access, which is readily available in the pro versions of Microsoft Office for the PC. With this method you code the data from its original source INTO the database, as opposed to dedicated qualitative software in which you code ON data that is already in digital form. My method can be used with text or visual data, in any language that Microsoft can input and read. In the three sessions of this workshop you will learn how to make an Access database and data collection form that exactly fit the needs of your project, and you will do some initial coding. You will also learn to use Access queries for data cleaning and qualitative data analysis, and to export quantitative data from Access to Excel or SPSS. The workshop will be hands-on, using your own project data.
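The workshop itself uses Microsoft Access, but the core idea — coding data from its original source INTO a database, then querying the codes for quantitative analysis — can be sketched in Python with the standard-library sqlite3 module. The table layout, sources, and codes below are invented for illustration, not the workshop's actual schema:

```python
import sqlite3

# In-memory database standing in for the workshop's Access database.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE codings (
        id      INTEGER PRIMARY KEY,
        source  TEXT,   -- original document or image being coded
        segment TEXT,   -- the passage (or a description of it)
        code    TEXT    -- analyst-assigned code
    )
""")

# Coding INTO the database: each row is entered as the analyst
# reads the original source material (rows here are invented).
rows = [
    ("interview_01", "We meet every week to plan.", "collaboration"),
    ("interview_01", "Funding ran out in March.",   "resources"),
    ("interview_02", "The team shares one office.", "collaboration"),
]
con.executemany(
    "INSERT INTO codings (source, segment, code) VALUES (?, ?, ?)", rows
)

# An aggregate query, like those built in Access: code frequencies
# for quantitative analysis, exportable to Excel or SPSS as CSV.
freq = con.execute(
    "SELECT code, COUNT(*) FROM codings GROUP BY code ORDER BY code"
).fetchall()
print(freq)  # [('collaboration', 2), ('resources', 1)]
```

Queries over the same table can also serve the data-cleaning step the workshop describes, for example selecting rows with missing or inconsistent codes.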
Dr. Patricia Steinhoff is Professor and Department Chair in the Department of Sociology at the University of Hawaii at Manoa. She is a Japan specialist, speaks and reads Japanese, and conducts research on Japanese society and on Japanese Studies in the United States. Dr. Steinhoff's research involves the use of MS Access and she teaches a graduate course on qualitative content analysis using Access.
Qualitative data breathes life into our research by centering the voices of our participants. But how do we approach collecting and analyzing qualitative data in evaluation contexts? In this workshop, you'll have the opportunity to learn more about qualitative methodologies, engage in discussions on utilizing qualitative data in evaluation with colleagues from multiple fields, and apply your knowledge and skills to real qualitative data sets in a team-oriented setting.
Erin Kahunawaikaʻala Wright is from Kalihi, Oʻahu where she still resides. She serves as an assistant professor of Educational Administration at the University of Hawaiʻi at Mānoa's College of Education. Her research interests focus on Native Hawaiian and Indigenous student success in higher education.
Nalani Balutski is from Kahaluʻu, Oʻahu, and is a Research & Assessment specialist faculty member at the University of Hawaiʻi at Mānoa (UHM). She is the Principal Investigator for several federal grants that support Native Hawaiian higher educational access and success. Nalani is an Educational Administration doctoral student at UHM and previously served on the H-PEA board.
by Gina Cardazone & Ryan Tolman, University of Hawaiʻi at Mānoa
Social network analysis (SNA) can be a useful means of evaluating collaborations between individuals, organizations, or other entities. This hands-on demonstration of social network analysis will include an overview of SNA and examples of how SNA has been used in evaluation, as well as a guided introductory step-by-step “tour” of the SNA software program UCINET. The examples will highlight varied approaches to data collection and analysis, with an emphasis on how evaluation studies can benefit from SNA. The step-by-step guided tour will introduce attendees to the basics of preparing and importing data, quantifying and visualizing networks, and presenting their data. Attendees will also be given a brief overview of other SNA software options.
Please note: Attendees who wish to take part in the guided demonstration must bring their own PC laptop computers with UCINET software pre-installed. A 90-day free trial of UCINET is available for download at https://sites.google.com/site/ucinetsoftware/downloads
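UCINET is Windows-only software, but the basic workflow the tour covers — entering tie data, quantifying a network, and identifying central actors — can be sketched with the open-source Python library networkx. The collaboration ties below are invented for illustration:

```python
import networkx as nx

# Hypothetical collaboration data: each pair is a tie between two
# organizations in an evaluation study (names are invented).
edges = [
    ("Org A", "Org B"),
    ("Org A", "Org C"),
    ("Org B", "Org C"),
    ("Org C", "Org D"),
]
G = nx.Graph(edges)

# Degree centrality: which organization collaborates most widely?
centrality = nx.degree_centrality(G)
most_central = max(centrality, key=centrality.get)
print(most_central)  # Org C, which has ties to all three others
```

The same graph object can be passed to networkx's drawing functions for a quick visualization, analogous to UCINET's companion tool NetDraw.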
Evaluators can use Access to manage data—from data entry to creating reports. This two-day workshop will cover the basics of Access database tables and queries. We will guide you through the process with hands-on exercises in which you create tables and work with sample data to build common types of queries.
FACILITATOR: Jason Badua currently works at Hui Ho‘omalu, a program of Partners in Development Foundation, as a Quality Assurance & Improvement Specialist. He graduated from Hawai‘i Pacific University with a Bachelor’s in Business Administration. Jason uses Access to manage and analyze various forms of data in his line of work.
Workshop Flyer
Learn the power of Excel’s VLOOKUP (vertical lookup) function and pivot tables to efficiently manage and analyze data. We’ll walk you through real-life examples and then you will practice using a sample data set (or bring your own). We have tips for getting VLOOKUP and pivot tables to work on the first try!
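For readers who script their analysis, the same two operations have close analogues in Python's pandas library: merge() plays the role of VLOOKUP (matching rows on a key column) and pivot_table() the role of an Excel pivot table. The sample data here is invented:

```python
import pandas as pd

# Invented sample data: survey responses plus a lookup table of sites.
responses = pd.DataFrame({
    "site_id": [1, 1, 2, 2, 2],
    "score":   [4, 5, 3, 4, 2],
})
sites = pd.DataFrame({
    "site_id":   [1, 2],
    "site_name": ["Honolulu", "Hilo"],
})

# VLOOKUP analogue: pull site_name into responses by matching site_id.
merged = responses.merge(sites, on="site_id", how="left")

# Pivot-table analogue: mean score per site.
pivot = merged.pivot_table(index="site_name", values="score", aggfunc="mean")
print(pivot)
```

As in Excel, the lookup fails visibly when a key is missing: with how="left", unmatched site_ids get a blank (NaN) site_name rather than an error.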
FACILITATORS: Monica Stitt-Bergh and Nalani Balutski regularly use Excel to manage and analyze evaluation data. They’re self-taught Excel users and believe in sharing knowledge.
Monica Stitt-Bergh is an assistant specialist at the Assessment Office, University of Hawai‘i at Mānoa. She serves as a consultant for and offers workshops on program-level assessment of student learning and program evaluation. Her classroom experience includes teaching courses on written communication and social science research methodology. She has published articles and book chapters on writing program evaluation, self-assessment, and writing across the curriculum.
Nalani Balutski is the Research & Evaluation Coordinator for Kōkua a Puni Native Hawaiian Student Services with the Hawai‘inuiākea School of Hawaiian Knowledge, University of Hawai‘i at Mānoa. Her research involves Native Hawaiian students within the UH system and she has designed and conducted internal evaluation activities for Title III programs.
See our Resources page for a link to the National Science Foundation's 2010 User-Friendly Handbook for Project Evaluation.
Short presentation and talk story with Dr. Ernest R. House, a prominent evaluation theorist and practitioner well known for his attention to values and ethics in evaluation. Dr. House has graciously agreed to speak about his work on evaluation standards and how the concepts are being applied by evaluators generally and in specific evaluation contexts.
Dr. House may be best known as author of classic works related to values and ethics in evaluation, including Evaluating with Validity (1980), Professional Evaluation: Social Impact and Political Consequences (1993), Values in Evaluation and Social Research (1999, co-authored with Kenneth Howe), and the chapter “Deliberative Democratic Evaluation” (New Directions for Evaluation, Issue 85, 2000, co-authored with Kenneth Howe). Even those who have not read Dr. House’s work will recognize the influence of ideas he has championed on the practice of evaluation. For example, his work was foundational for the theme of the 2010 AEA conference, which highlighted the three standards of quality identified by Dr. House in Evaluating with Validity -- Truth, Beauty, and Justice. His work is widely cited in the evaluation literature, with recent examples found in the contexts of deliberative democratic evaluation, transformative research and ethics, and evaluation ethics in working with underserved communities and internationally. A brief bio of Dr. House is reproduced below.
Ernest R. House is Emeritus Professor in the School of Education at the University of Colorado at Boulder. His primary interests are evaluation and policy analysis. Previously, he was an associate professor at the Center for Instructional Research and Curriculum Evaluation (CIRCE) at the University of Illinois, Urbana-Champaign. He is the 1989 recipient of the Harold E. Lasswell Prize presented by Policy Sciences and the 1990 recipient of the Paul F. Lazarsfeld Award for Evaluation Theory, presented by the American Evaluation Association. He has authored numerous books and peer reviewed articles on evaluation and policy. He was editor of New Directions in Program Evaluation (1982 to 1985) and has served on the editorial board of several professional journals in evaluation. He has been a visiting scholar at UCLA, Harvard, and New Mexico, as well as in England, Australia, Spain, Sweden, Austria, and Chile. He has served on several advisory boards or committees related to STEM education including: evaluation advisory committee for the Office of Studies and Evaluation at NSF; evaluation advisory committee for the Statewide Systemic Initiatives at NSF; expert panel for review of federal education programs in STEM for the Federal Coordinating Council on Science, Technology and Education; and on the evaluation advisory board for the Education and Human Resources Directorate at NSF. The projects he has led include: audit of the Promotional Gates Program evaluation for the Mayor's Office in New York City (1981), assessment of environmental education policies in Europe for OECD (1992), National Center for Research on Evaluation, Standards and Student Testing (1990-1995) and a study of evaluation issues and policies in a large-scale organization for NSF (1991-1996). He received an A.B. in English at Washington University, an M.S. in Secondary Education at Southern Illinois University, and an Ed.D. at the University of Illinois. 
(source: http://www7.nationalacademies.org/bose/House%20Biosketch.html)
Papers and posters
Conference schedule and details
Evaluation report of the conference and workshops (PDF)
Executive Summary only (PDF)
Papers and posters
Conference schedule and details
Hazel Symonette: PowerPoint and Excerpt from article: Cultivating Self as Responsive Instrument
Lois-ellin Datta: Keynote
Pre-conference workshop descriptions
- Ho‘omau I Nā ‘Ōpio: Recognizing Youth Developmental Assets and Hawaiian Cultural Connectedness, Kathy Tibbetts
- The Five "R's" of Culturally Responsive Evaluation within a Native Hawaiian Context: Relationship, Rigor, Relevance, Resilience, and Responsibility, Anna Ah Sam, Herb Lee, Darlene Martin, & Verlie Ann Wright
- Addressing Data Collection Challenges in a Complex Community-Based Health Evaluation, Gina Cardazone, Landry Fukanaga, & Christy Nishita
Evaluation report of the conference and workshops
H-PEA sponsored lunch and networking with featured speaker Roger Chesswas, Director of Research and Chief Operations Officer for PREL (Pacific Resources for Education and Learning). Roger discussed the evaluation of Pacific-CHILD (Communities with High Standards in Literacy Development), a professional development program aimed at improving classroom practices and increasing student achievement.
- "Practical Program Evaluation: A Program Theory Approach" (Stewart Donaldson & Christina Christie)
- "Squishy and Marvin, Same Old/Really Different, 21 Dead Babies, and Other Adventures in Evaluationland" (Lois-ellin Datta)
- "Shifting Evaluation Theory Toward Empiricism: Helping to Ensure Credible and Useful Evaluation" (Christina Christie)
- "What Counts as Credible Evidence in Contemporary Evaluation Practice?" (Stewart Donaldson)
Keynote Address by Lois-ellin Datta. Fusion Cuisine: Luncheon Reflections on the Future of Evaluation in the Hawaiʻi-Pacific Rim