Evaluation Design

Are you writing a proposal or working on a project? Incorporating an evaluation into project design and implementation is essential for understanding how well the project achieves its goals. The Evaluation Planning and Pathway resources presented here provide guidance and tools to effectively design and carry out an evaluation of your project.

Evaluation design is broken down into 5 steps, which you can explore in detail using NEO’s Evaluation Booklets. Basic steps and worksheets are presented in the tabs above. While you are developing your evaluation plan, it is important to consider the context of your program. Context can change how your evaluation is designed and implemented, because different settings and populations have different needs and different abilities to participate in evaluations. With this in mind, the 5 steps are applied to 4 common populations (K-12 Health, Rural Health, Race & Ethnicity, and LGBTQIA+ Health) in pathways.

[Image: 5 steps to evaluation and integration of pathways]

Each pathway presents special considerations, resources, and tools for carrying out an evaluation of a project working with that population. You can explore the pathways embedded in the evaluation steps below or explore each pathway on its own using the menu buttons above. Each pathway includes a real-life example of a program creating an evaluation plan that walks step-by-step through the evaluation planning process. At the conclusion of each pathway is an example evaluation plan created using the 5 steps to evaluation.

Working on a project with a small award (less than $10,000)? Use the Small Awards Evaluation Toolkit and Evaluation Worksheets to assist you in determining key components for your evaluation!


5 Steps to an Evaluation

Main Step 1

The first step in designing your evaluation is a community assessment. A community assessment helps you determine the health information needs of the community, the community resources that would support your project, and information to guide you in your choice and design of outreach strategies. The community assessment phase is completed in three segments:

  1. Get organized
  2. Gather information
  3. Assemble, interpret, and act on your findings

Get Organized


Gather Information

  • This phase includes gathering data from different external and internal sources to inform the development of your program and evaluation plan.
  • Collect data about the community, such as its demographics, health information needs, and existing resources that could support your project

Assemble, Interpret, and Act on your findings

Main Step 2

The second step in designing your evaluation is to make a logic model. The logic model is a helpful tool for planning a program, implementing it, monitoring its progress, and evaluating its success.


Make a Logic Model

  • Consider how the program's logic model will assist in determining what is, or is not, working in the program's design to achieve the desired results.
  • Be open to new and alternative patterns in your logic model. Have stakeholders and community members review it, paying particular attention to how change occurs across the model, and listen to those who can speak to whether the strategies are beneficial and likely to be successful.
  • How to create a logic model
    • Outcomes are results or benefits of your project – why you are doing the project (a sketch of a complete logic model follows this list)
      • Short-term outcomes such as changes in knowledge
      • Intermediate outcomes such as changes in behavior
      • Long-term outcomes such as changes in individuals’ health or medical access, social conditions, or population health
      • More information on outcomes
    • Blank Logic Model Template Worksheet: Logic Models
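To make the structure concrete, here is a minimal sketch in Python of a logic model represented as a simple data structure; the program, inputs, activities, outputs, and outcomes shown are hypothetical placeholders, not part of any NNLM template or required format.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A simple container mirroring the columns of a logic model."""
    inputs: list[str] = field(default_factory=list)                 # resources invested
    activities: list[str] = field(default_factory=list)             # what the program does
    outputs: list[str] = field(default_factory=list)                # direct products of activities
    short_term_outcomes: list[str] = field(default_factory=list)    # e.g., changes in knowledge
    intermediate_outcomes: list[str] = field(default_factory=list)  # e.g., changes in behavior
    long_term_outcomes: list[str] = field(default_factory=list)     # e.g., changes in health or access

# Hypothetical health information outreach program
model = LogicModel(
    inputs=["NNLM award", "2 librarians", "MedlinePlus materials"],
    activities=["Monthly health literacy workshops at the public library"],
    outputs=["12 workshops delivered", "150 participants reached"],
    short_term_outcomes=["Participants can name a trustworthy source of health information"],
    intermediate_outcomes=["Participants consult MedlinePlus before self-treating"],
    long_term_outcomes=["Improved ability to manage chronic health conditions"],
)

# Print the model column by column
for column, items in vars(model).items():
    print(column.replace("_", " ").title())
    for item in items:
        print(f"  - {item}")
```

Filling in a structure like this, column by column, makes it easy to check that each outcome can be traced back to an activity and an input.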

Tearless Logic Model


Knowledge, Attitudes, & Behaviors

  • Programming should consider improvement in the knowledge-attitude-behavior continuum among program participants as a measure of success (a sketch of how such change might be measured follows this list). The process of influencing health behavior through information, followed by attitude change and subsequent behavior change, should be documented in the logic model.
  • Evaluations that focus on behavior change are more likely to require Institutional Review Board (IRB) approval.
  • Knowledge Acquisition:
    • What is the program trying to teach or show program participants?
    • What understanding would a program participant gain over the course of the program? Often knowledge acquisition occurs as a short-term outcome of the program.
    • Be sure to examine not only what is learned, but where, when, and how program participants will learn it.
  • Attitude Change:
    • What mindsets or beliefs is the program seeking to build or change? Be sure to consider cultural differences in attitudes within the community you are working with.
    • Are there misconceptions about the topic, and do those beliefs change after the program has been implemented?
    • To what extent do participants agree with statements related to the new material presented?
  • Behavior Change:
    • After some time has passed since implementation of the program, are participants' actions different from those they exhibited before the program began?
    • Are the new behaviors in alignment with the expectations of the program?
  • Note that most NNLM projects focus on the dissemination of health information/health education and often do not take place over a long enough period of time to observe behavior change.
    • As such, examining/measuring behavior change may be out of the scope of the NNLM-funded project unless the project runs for multiple cycles over an extended period of time.
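As a rough illustration of how knowledge and attitude change might be quantified, the sketch below compares hypothetical matched pre- and post-program survey scores in Python; the scoring scales and the numbers themselves are invented for the example.

```python
from statistics import mean

# Hypothetical matched pre/post survey scores for the same participants:
# a 0-10 knowledge quiz and 1-5 agreement with an attitude statement.
pre  = [{"knowledge": 4, "attitude": 2}, {"knowledge": 6, "attitude": 3}, {"knowledge": 5, "attitude": 2}]
post = [{"knowledge": 8, "attitude": 4}, {"knowledge": 7, "attitude": 4}, {"knowledge": 9, "attitude": 3}]

def average_change(measure: str) -> float:
    """Mean per-participant change on one measure (post minus pre)."""
    return mean(after[measure] - before[measure] for after, before in zip(post, pre))

print(f"Average knowledge gain: {average_change('knowledge'):+.1f} points")
print(f"Average attitude shift: {average_change('attitude'):+.1f} points")
# Behavior change would require a later follow-up survey, which is often
# outside the time frame of an NNLM-funded project.
```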

Main Step 3

The third step in designing your evaluation is to select measurable indicators for outcomes in your logic model. Indicators are observable signs of your outcomes and will help you measure your achievement. Identify indicators that align to the outcomes in your logic model. Tracking these indicators over time will help you reflect upon your progress and help you collect and show results.
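One informal way to keep indicators tied to the outcomes in your logic model and track them over time is sketched below in Python; the outcomes, indicators, targets, and quarterly values are all hypothetical placeholders.

```python
# Each logic-model outcome is paired with a measurable indicator,
# a target, and the values observed so far (hypothetical data).
indicators = {
    "Participants can name a trustworthy health information source": {
        "indicator": "% of post-workshop respondents naming MedlinePlus or a similar source",
        "observations": {"Q1": 0.45, "Q2": 0.60, "Q3": 0.72},  # quarterly survey results
        "target": 0.70,
    },
    "Participants return for follow-up sessions": {
        "indicator": "Number of returning attendees per workshop",
        "observations": {"Q1": 5, "Q2": 9, "Q3": 14},
        "target": 12,
    },
}

# Report the most recent value for each indicator against its target.
for outcome, data in indicators.items():
    latest = list(data["observations"].values())[-1]
    status = "on track" if latest >= data["target"] else "below target"
    print(f"{outcome}\n  indicator: {data['indicator']}\n  latest value: {latest} ({status})\n")
```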


Measurable Indicators


Demographic Data

  • Collect data on demographics as part of your survey process. It is important to understand your participants' backgrounds and how that may affect their engagement with your program.
  • During analysis, compare differences in backgrounds against outcome results (a sketch of such a comparison follows this list). If gaps are identified in reaching certain populations, consider adjustments to program implementation to serve all participants equitably.
  • More information on collecting demographic data
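The sketch below illustrates one simple way such a comparison might look in Python; the demographic grouping (urban/rural) and the scores are hypothetical and stand in for whatever demographic questions and outcome measures your survey actually uses.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical post-program records: each row holds a participant's
# self-reported demographic group and an outcome score from the survey.
records = [
    {"setting": "urban", "score": 8}, {"setting": "urban", "score": 7},
    {"setting": "rural", "score": 5}, {"setting": "rural", "score": 6},
    {"setting": "urban", "score": 9}, {"setting": "rural", "score": 4},
]

# Group outcome scores by demographic category.
by_group = defaultdict(list)
for row in records:
    by_group[row["setting"]].append(row["score"])

# Compare average outcomes across groups to spot gaps in who the program reaches or serves.
for group, scores in by_group.items():
    print(f"{group}: n={len(scores)}, mean score={mean(scores):.1f}")
```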

Main Step 4

The fourth step in designing your evaluation is to create the evaluation plan. An evaluation plan describes how the project will be evaluated. It includes a description of the evaluation's purpose, the evaluation questions, a timetable/work plan, a description of the data collection tools to be used, an analysis framework, and a section articulating how data will be used and disseminated. An evaluation plan is often a key component of a grant proposal but will also serve as your guide for implementing the evaluation. This phase is completed in three segments (an informal sketch of the plan's components follows the list):

  1. Defining evaluation questions
  2. Developing the evaluation design
  3. Conducting an ethical review
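As an informal illustration, the sketch below lays out the typical components of an evaluation plan as a Python dictionary; every entry is a hypothetical placeholder for a small health information outreach project, not prescribed language.

```python
# Hypothetical evaluation plan components for a workshop-based outreach project.
evaluation_plan = {
    "purpose": "Determine whether the workshop series improves participants' ability to find reliable health information.",
    "evaluation_questions": [
        "Did we deliver the number of workshops we planned? (process)",
        "Did participants' knowledge of trustworthy sources increase? (outcome)",
    ],
    "timetable": {"pre-surveys": "Month 1", "post-surveys": "Month 6", "analysis and reporting": "Months 7-8"},
    "data_collection_tools": ["Pre/post survey", "Attendance logs"],
    "analysis_framework": "Compare pre/post survey scores; summarize attendance by site.",
    "use_and_dissemination": "Share findings with library leadership and in the final NNLM report.",
    "ethical_review": "Submit to the local IRB for an exemption determination before data collection begins.",
}

# Print the plan one component at a time.
for component, detail in evaluation_plan.items():
    print(f"{component}: {detail}")
```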

Defining Evaluation Questions

Evaluation questions help define what to measure and provide clarity and direction to the project.

Process evaluations and outcome evaluations are two common types of evaluations with different purposes. Consider which makes the most sense for you and the objectives of your evaluation (you DO NOT need to do both). Then explore the resources to inform your evaluation plan.

Process Questions - Are you doing what you said you'd do?

  • Process evaluation questions address program operations – the who, what, when, and how many related to program inputs, activities, and outputs.
  • The CDC recommends a structured process for developing process evaluation questions that reflect the diversity of stakeholder perspectives and the program's most important information needs.


Outcome Questions - Are you accomplishing the WHY of what you wanted to do?

  • Outcome evaluation questions address the changes or impact seen as a result of program implementation.
  • Use the same CDC process described for process evaluation questions to develop your outcome evaluation questions.
    • Consider whether the impact assessed relates to short-term, intermediate, or long-term outcomes outlined in your logic model.
  • Outcome Objective Blank Worksheet (bottom section) Book 2 Worksheet: Outcome Objectives

Evaluation Design

  • Evaluation design influences the validity of the results.
  • Most NNLM grant projects will use a non-experimental or quasi-experimental design (a sketch comparing the two follows this list).
    • Quasi-experimental evaluations include surveys of a comparison group - individuals not participating in the program but with similar characteristics to the participants - to isolate the impact of the program from other external factors that may change attitudes or behaviors.
    • Non-experimental evaluations only survey program participants.
    • If comparison groups are part of your evaluation design, use a 'do no harm' approach that makes the program available to those in the comparison group after the evaluation period has ended.
  • More information on Evaluation Design
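The sketch below contrasts the two designs using hypothetical change scores (post-survey minus pre-survey) in Python; the numbers are invented, and a real analysis would also consider sample size and statistical uncertainty.

```python
from statistics import mean

# Hypothetical change scores (post minus pre) on the outcome measure.
participants = [3, 4, 2, 5, 3]   # people who took part in the program
comparison   = [1, 0, 2, 1, 0]   # similar people who did not (quasi-experimental design only)

participant_change = mean(participants)
comparison_change = mean(comparison)

# In a non-experimental design you can only report the participants' change.
print(f"Average change among participants: {participant_change:+.1f}")

# In a quasi-experimental design the comparison group helps separate the
# program's effect from changes that would have happened anyway.
print(f"Average change in comparison group: {comparison_change:+.1f}")
print(f"Estimated program effect: {participant_change - comparison_change:+.1f}")
```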

Ethical Considerations

[Image: Belmont Report principles]

Consider the Belmont Report principles to examine ethical considerations:

  • Respect for Persons: Protecting the autonomy of all people, treating them with courtesy and respect, and allowing for informed consent
  • Beneficence: The philosophy of "Do no harm" while maximizing benefits for the research project and minimizing risks to the research subjects
  • Justice: Ensuring that reasonable, non-exploitative, and well-considered procedures are administered fairly and equally, with a fair distribution of costs and benefits to potential research participants

Trauma-Informed Evaluation

  • Asking someone about trauma is asking that person to recall potentially difficult events from their past.
  • If it is absolutely necessary for the evaluation to ask questions about potentially traumatic events, incorporate a trauma-informed approach to collect the data in a sensitive way.
  • Amherst H. Wilder Foundation Fact Sheet on Trauma-Informed Evaluation
  • More information on Trauma-Informed Evaluation

Ethical Review

  • The Institutional Review Board (IRB) is an administrative body established to protect the rights and welfare of human research subjects recruited to participate in research activities.
    • The IRB is charged with the responsibility of reviewing all research, prior to its initiation (whether funded or not), involving human participants.
    • The IRB is concerned with protecting the welfare, rights, and privacy of human subjects.
    • The IRB has authority to approve, disapprove, monitor, and require modifications in all research activities that fall within its jurisdiction as specified by both the federal regulations and institutional policy.
  • More information on the IRB, including contact information for your local IRB
  • Depending on the nature of the evaluation, the IRB may exempt a program from approval, but an initial review by the Board is recommended for all programs working with minors.
