California site visit is part of broader push to incorporate community priorities and formal evaluation into decision making
September 29, 2021 | Lisa A. Frazier | community engagement, evaluation
Center for HOPES evaluators Saira Nawaz and Anne Trinh recently returned from a West Coast site visit to AltaMed’s vaccination outreach program. The program aims to increase COVID-19 vaccinations among the Latinx community in Southern California by bolstering a community-based workforce of health workers.
This work is funded by the Health Resources and Services Administration (HRSA) and led by AltaMed, the nation’s largest federally qualified health center. The Center for HOPES is charged with evaluating the program’s effectiveness at increasing vaccinations in the target communities. The site visit was written into our scope of work specifically to ensure that sites are collecting the data required by the funder as well as information that is useful to AltaMed and its partner organizations in serving their communities.
Building relationships during site visits
Evaluation isn’t just about monitoring and measuring. It involves listening, learning, and offering support to community partners. Beyond fulfilling terms required by a granting organization, site visits provide the opportunity for program evaluators to build deeper relationships with local organizations and the people they serve. During their visit, Nawaz and Trinh heard from a broader group of leadership and frontline staff than they had been connected to virtually, and they were able to observe first-hand the need, implementation, and impact of AltaMed’s vaccination outreach efforts. (Photos courtesy Fourth Street Bridge Strategies)
While in Merced, CA, Trinh and Nawaz met with members of the Golden Valley Health Centers corporate team. They discussed partner strategies for outreach and education, overcoming barriers to vaccine uptake, and supportive services to address those barriers. Being on site also allowed them to provide additional technical assistance: “We were supposed to be training for data collection and evaluation but spent a lot of time supporting teams in finalizing their work plans for the grant,” says Trinh.
In San Ysidro (southern San Diego County), Nawaz and Trinh visited a vaccination site run out of San Ysidro Health Center’s mobile clinic. Adapting to the needs of the community, the clinic had been transformed from providing general preventive health services to providing COVID-19 vaccines exclusively. Trinh and Nawaz met with community health workers and learned about their workflow as they collected the data required by HRSA; they also saw several community members come by to get vaccinated.
Community health workers employed by Latino Health Access, based in Santa Ana, conduct vaccine education outside a local elementary school. The workers dress up as the COVID-19 virus and as syringes to draw the attention of kids and their parents during school pick-up. They talk to parents about the vaccine, answer their questions, and help them set up vaccination appointments.
Bridging research to practice through evaluation
Beyond fulfilling the terms of a grant, evaluation processes are an opportunity to communicate and integrate community priorities into program research and public health practice. Evaluations provide community organizations with a rich source of feedback about how their programs operate and the impact they have on their communities, identifying both what is working well and areas where design and implementation could improve. For example, geo- and time-coded data can reveal areas of particularly high traffic for mobile vaccine clinics and suggest where and when to relocate units for greater impact.
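As a rough illustration of that kind of analysis, the short sketch below tallies hypothetical geo- and time-coded visit records by site and hour of day to surface high-traffic windows. The field names and records are invented for illustration and are not AltaMed’s or HRSA’s actual reporting format.

    # Hypothetical sketch: sites, timestamps, and field names are invented
    # for illustration only, not actual program data.
    from collections import Counter
    from datetime import datetime

    # Each record notes where and when a mobile-clinic vaccination happened.
    visits = [
        {"site": "school pick-up, Santa Ana", "timestamp": "2021-08-14 14:45"},
        {"site": "school pick-up, Santa Ana", "timestamp": "2021-08-14 15:10"},
        {"site": "farmers market, Merced",    "timestamp": "2021-08-14 17:05"},
        {"site": "school pick-up, Santa Ana", "timestamp": "2021-08-15 14:50"},
    ]

    # Tally vaccinations by (site, hour of day) to see where and when demand peaks.
    traffic = Counter()
    for v in visits:
        hour = datetime.strptime(v["timestamp"], "%Y-%m-%d %H:%M").hour
        traffic[(v["site"], hour)] += 1

    # The busiest site-hour combinations suggest where to station mobile units next.
    for (site, hour), count in traffic.most_common(3):
        print(f"{site} around {hour}:00 -> {count} vaccinations")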
Evaluations can also provide valuable feedback to researchers and granting organizations about what and how to measure program processes and outcomes. A common criticism from grantees is that evaluation requirements involve collecting and reporting data that, in their view, do not reflect the essential elements of the program. Center for HOPES evaluators are trained both in the standard evaluation methods and measures of major federal agencies and in community engagement techniques, and they see the two as equally valuable and essential in carrying out program evaluations.
In the course of their work, Center evaluators therefore collaborate with community partners to identify criteria that the communities themselves feel are missing from the standard menu. For example, AltaMed and its partners view measures of health equity, civic engagement, and social capital as essential to understanding the relative success of their outreach programming. Our evaluators are thus including such measures in their data collection alongside those required by HRSA.
Because they reflect and provide information to grantor and grantee alike, program evaluations are a valuable mechanism for communication and adaptation at either end of the funding relationship. At the Center for HOPES, we view our evaluation services as a means of improving both accountability and trust between communities and funders.
Evaluation as a national health policy priority
Formal evaluation – of programs and of policies – has increasingly become a priority for funders and policymakers, including the federal government. The Biden administration recently launched evaluation.gov, a hub to learn about the federal evaluation community, plans, and activities. The initiative’s vision statement contends that, “[e]valuation plays a critical role in evidence-based policymaking, helping agencies determine what is and is not working well and answer questions regarding why, for whom, and under what circumstances.” Evaluation is thus critical to understanding how effective government and government-funded programs are at serving Americans.
The Center for HOPES commends federal investment in evaluation activities for the sake of evidence-based policymaking and accountability to the people. Because we have observed the additional value that evaluation can provide as a means of balancing communication and decision-making authority across funders and communities, we will continue to practice and advocate for community-engaged evaluation of health and social programming.