
Your Evaluation Questions Answered: Part 3

In April, I was invited to participate in the American Heart Association’s Tobacco Endgame Center for Organizing and Engagement’s Affinity call to share ideas about how evaluation can inform and support project decisions and actions. Or as Paul Knepprath, the Tobacco Endgame Center’s project director, put it: How to use evaluation to win campaigns. 

Although the presentation was not recorded, we thought it would be beneficial for you to see the questions participants had and our responses. So, we’ve been sharing them in a series of articles in the TCEC newsletter. This final installment is Part 3; you can read Part 1 and Part 2 here.

If you have evaluation questions that you’d like answered, contact our friendly evaluation associates at tcecTA@phmail.ucdavis.edu for a speedy response!

Questions about Training

How can we get data collectors to understand the importance of training and collecting high-quality data?

Every training should start by explaining the need for and purpose of the data and how it will be used, so that data collectors understand they are part of important work that depends on high-quality data. Go over the entire instrument and protocol. Provide ample time to practice. Assess data collector accuracy and readiness. Don’t assume that because collectors are adults or work in tobacco control, they automatically understand the instrument.

Along the way and at the end, talk to your team about the data collection process. Their experiences can make for great stories at coalition meetings, city council hearings, and future data collection trainings, as well as on social media.

What is the best way to condense training for youth partners on evaluation data recording?

If anything, training for youth needs to be expanded, not condensed. Youth usually need training on the topic content first to become familiar with the issue. Then they need training on the data collection instrument and protocol. They need ample practice time, assessment of accuracy and readiness, and monitoring in the field to ensure quality data. For that reason, it may make sense to break the training into chunks over several days if possible.

How do we ensure quality control on surveys run by volunteers or subcontractors?

Your Project Director or Internal Evaluation Project Manager (IEPM) is responsible for reviewing and monitoring every aspect of evaluation work. That means ensuring that data collection starts with a carefully designed instrument and protocol; leading or overseeing data collector training, with built-in opportunities to observe and assess each collector’s accuracy and adherence to the protocol; setting quality standards and accuracy rates in the subcontractor’s contract; and doing initial and periodic field checks of data collector accuracy (by shadowing collectors and checking their coding or results against your own). Each RFA specifies the evaluation oversight the project must provide.

See the California Tobacco Control Evaluation Guide and TCEC's resources on Data Collector Training.

Questions about Sharing Results 

What are some best practices/resources/trainings for data visualization?

For some basics on creating data visualizations, see TCEC’s page on Data Visualization. 

If you're interested in a deeper dive into data visualization best practices, we recommend these resources: 

What is the best way to use or share the data that we collect with decision makers?

While it can depend on their individual preferences, decision makers are usually busy people with little time to read detailed reports. Instead, they want easily digestible chunks of information: a brief summary of the findings and their implications for decision making, such as a 1-2 page fact sheet with data visualizations and bullets that summarize key messages and make actionable recommendations. Be sure to cite data sources, though, or even attach the full report as an appendix in case they or their staff want to scrutinize the details.

See the California Tobacco Control Evaluation Guide.

Miscellaneous Topics

What method would be best for evaluating community engagement?

The answer depends on what kind of community engagement you have in mind.

If you are talking about community involvement in your coalition or volunteering on activities, tracking the diversity of who participates can help you understand where gaps exist and which populations you might want to recruit further. Projects working on asset objectives often want to build skills and capacity within communities, so tracking the types of activities people are involved in, conducting knowledge tests and education/participant surveys, and monitoring how participants apply new skills (e.g., serving as a spokesperson, writing letters, giving testimonials) all help assess the transfer of knowledge or skills. The most common method for assessing the community at large is to conduct public opinion surveys that gauge awareness and support of tobacco control issues and measure how these issues affect people’s lives.

Can evaluation be used to recruit coalition members or other community partners?

Almost any evaluation results can be used to recruit people to the cause. When people see how a problem affects them or their community, they get mad and want to join the effort to make a difference. So documenting the scope of the problem and/or comparing it to other communities can spur people to work for change. Showing whose voices are NOT at the table (on the coalition or involved in the effort) might also encourage some to engage.

See the California Tobacco Control Evaluation Guide sections on Data Sharing and Reciprocity.

How do you find a good evaluator?

Ask projects that use evaluation well about their evaluators. Interview potential evaluators and ask about their processes, their availability, and, if they have other CTCP-funded projects, how they will manage the demands of multiple projects with similar deadlines (e.g., around progress reporting time). Look at work products created by potential evaluators (scores and feedback on final evaluation reports, data collection instruments they have created from scratch, evaluation activity summaries, and workplans they helped develop) and make sure these are clear and logical.

For guidance and questions to ask, see TCEC's resources on Working with Evaluators.

Explore qualifications in CTCP’s evaluator directory (password required).

See also the California Tobacco Control Evaluation Guide.

What tobacco evaluation resources do we have for ages 5-11?

Because CTCP-funded projects focus on policy, systems, and environmental (PSE) change, they do not typically work with children. Instead, they aim to educate, engage, and empower voting-age adults who have power and sway over policymakers. They also recruit and train high school-aged teens to get on the bandwagon so they’re ready to go at 18! For students younger than middle school age, the Tobacco Use Prevention Education (TUPE) programs handle in-school education.

What if the order of activities doesn’t make sense? How do we change that?

You can talk to your evaluator or contact the Tobacco Control Evaluation Center to explore what order would make the most sense. However, you must seek approval from your CTCP Program Consultant before making any changes to your workplan. Except for activities with specified due dates in your LLA Guidelines or Request for Applications, flexibility tends to be granted when the logic for a proposed change makes sense. See your specific procurement documents for more information about required activities and deadlines.

See the California Tobacco Control Evaluation Guide.

How much latitude do evaluators have to do evaluation activities that are not standard CTCP activities?

It depends. In workplans, CTCP likes to see innovation, but only if activities make logical sense and move the objective forward. If you can make a good case for the value and purpose of an activity, Program Consultants will often make accommodations or substitutions for non-required evaluation activities within the allowable terms of your contract or procurement. Feel free to connect with TCEC first: we can advise you about the feasibility of your idea, help you fine-tune it, or lend it additional credibility.

See the California Tobacco Control Evaluation Guide, in particular the link to Activity 15: "Other—Not Listed."
