Session: Innovations in Evaluation Theory, Methods and Models
Three dimensions of process in collaborative approaches to evaluation: How do they play out in practice?
Stream: Evaluation Foundations and Methodology
Thursday, October 24, 2024
4:15 PM - 4:30 PM PDT
Location: C125-126
Abstract Information: Participatory and collaborative approaches to evaluation (CAE) have great potential to accommodate contextual and cultural complexity and to help integrate different ways of knowing into the evaluation process. Twenty-five years ago, Cousins and Whitmore (1998) published a chapter in New Directions for Evaluation that described practical and transformative streams of participatory and collaborative approaches. The chapter also highlighted three fundamental process dimensions, each of which can be conceptualized as a semantic differential scale: (i) control of evaluation decision making (evaluator-led vs. program community-led), (ii) diversity of program community participation (primary users vs. diverse community participation), and (iii) depth of program community participation (consultative input vs. comprehensive engagement with the evaluation). Over the years this chapter has received, and continues to receive, global recognition (hundreds of citations, awards and other recognition). In this paper, I track the application of the three-dimensional process framework. Through systematic search protocols, I identified 50 peer-reviewed empirical articles published between 2000 and 2024 that reference the dimensions of process. Half of the studies were from the US, 25% from Canada, and 25% from Europe, Australasia, and other jurisdictions. From this sample, I culled a set of 15 case applications of CAE as the base sample for analysis. The questions guiding my analysis are:
1. To what extent can the case applications be categorized as either practical participatory evaluation (P-PE) or transformative participatory evaluation (T-PE)?
2. How are case applications profiled based on semantic differential scores on each of the three dimensions of process?
3. Do process profiles differentiate P-PE vs. T-PE applications? If so, how?
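As an illustrative sketch of the profiling idea described above, each case can be scored on the three process dimensions and the resulting profiles compared across the two streams. The 7-point scale, the dimension ordering, and the case scores below are hypothetical assumptions for illustration, not data from the study:

```python
# Hypothetical sketch: profiling CAE cases on the three process dimensions,
# each scored on an assumed 7-point semantic differential scale
# (1 = evaluator-led / primary users only / consultative input;
#  7 = community-led / diverse participation / comprehensive engagement).
from statistics import mean

# Hypothetical coded cases: (stream, control, diversity, depth)
cases = [
    ("P-PE", 4, 2, 4),
    ("P-PE", 3, 3, 4),
    ("T-PE", 5, 6, 6),
    ("T-PE", 6, 6, 5),
]

def mean_profile(stream):
    """Average (control, diversity, depth) profile for one stream."""
    rows = [c[1:] for c in cases if c[0] == stream]
    return tuple(round(mean(col), 1) for col in zip(*rows))

for s in ("P-PE", "T-PE"):
    print(s, mean_profile(s))
# → P-PE (3.5, 2.5, 4.0)
# → T-PE (5.5, 6.0, 5.5)
```

Comparing the two mean profiles is one simple way to ask whether process profiles differentiate P-PE from T-PE applications (research question 3).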
In the analysis I am using Dedoose, a mixed-methods data analysis software package, including an inter-rater agreement protocol to assure coding validity. The empirical case applications were coded according to goals (P-PE, T-PE), the three dimensions of process, design and methods, and results. Not all of the articles were readily codable on every code, but a sufficient number were for the present purposes. Preliminary results confirm some expectations with regard to process profiles. Case applications that were decidedly practical in orientation (P-PE) tended to exhibit balanced control, limits on the number of program community member participants, and intermediate-level engagement with the evaluation process by the participants. Articles coded as T-PE, on the other hand, were more likely to involve a wider array of program community members. These preliminary findings will continue to undergo analysis leading up to the development of a full paper prior to the conference. The paper contributes a unique perspective on the conference theme, which features the engagement of new and emerging evaluators. Evaluation capacity building (ECB) as we know it can be either direct (e.g., courses, degree programs, workshops, professional development experiences) or indirect, through participation in authentic CAE projects. The ongoing analysis of CAE case applications in this project will reveal whose voices are being heard, how, and to what end.

Reference: Cousins, J. B., & Whitmore, E. (1998). Framing participatory evaluation. In E. Whitmore (Ed.), Understanding and practicing participatory evaluation (New Directions for Evaluation, No. 80, pp. 3-23). Jossey-Bass.
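The abstract mentions an inter-rater agreement protocol but does not specify the statistic; Cohen's kappa is a common choice for two raters, and kappa-based measures underlie Dedoose's training-center agreement reports. A minimal sketch, with hypothetical rater codes rather than data from the study:

```python
# Cohen's kappa for two raters coding the same items into nominal
# categories: chance-corrected agreement, (p_o - p_e) / (1 - p_e).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of items coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's category marginals.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters assigning P-PE / T-PE goal codes
# to ten case applications.
a = ["P", "P", "T", "T", "P", "T", "P", "P", "T", "P"]
b = ["P", "P", "T", "P", "P", "T", "P", "T", "T", "P"]
print(round(cohens_kappa(a, b), 2))  # → 0.58
```

Here the raters agree on 8 of 10 items (p_o = 0.80), but chance alone would yield p_e = 0.52 given the marginals, so kappa corrects the apparent agreement down to about 0.58.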