Conducting participatory data analysis in evaluations of government programs
Stream: Government and Public Policy
Thursday, October 24, 2024
2:50 PM - 2:55 PM PST
Location: E147-148
Abstract Information: Participatory data analysis is an evaluation methodology that brings together a group of impacted individuals, such as program participants, to collaboratively interpret data, uncover meaning, and inform evaluation recommendations. The benefits of participatory data analysis sessions include ensuring that individuals affected by a program have a voice in the evaluation, integrating multiple perspectives into data analysis and interpretation, and building program leadership's trust in and buy-in for evaluation findings. Summit is currently conducting an evaluation of a federal agency's internal program (while the evaluation will be complete at the time of presentation, this presentation will not identify the agency). As part of this mixed-methods evaluation, Summit is conducting two participatory data analysis sessions focused on several evaluation tasks. This presentation will focus on the benefits and challenges of conducting participatory data analysis sessions with federal employees and other government program staff. While using this methodology with federal employees has many benefits (as described above), it also presents challenges. For example, federal staff have demanding jobs supporting the American public by advancing their agency's mission, so it can be difficult to secure engagement in a session that requires reviewing materials in advance of a 2- to 3-hour (or longer) discussion. Additionally, the staff most relevant to the topics these sessions cover may not be data scientists or evaluators; while they bring subject-matter expertise, they may be unfamiliar with conducting data analysis and drawing meaning from large data summaries.
Drawing on the participatory data analysis sessions Summit is conducting as part of this evaluation, this presentation will discuss the challenges encountered, how the team mitigated each one, lessons learned to inform the design of future sessions, and the benefits to both participants and the evaluation team stemming from the successful completion of these two sessions.