After writing outcome measures, you must collect data to determine whether your program is meeting its established goals. AWA created a concise OM chart, as shown in last week’s blog article. Now it must gather the right mix of information to show evidence that its onboarding program is performing against those outcome targets. In this final article in the Brighter Strategies Performance Series, we’ll show you how AWA collected and analyzed data, and then used that data to report program results.

6. Determine appropriate data collection processes, and collect data.

As you will learn when you embark on your own OM process, there’s a lot of data out there! It’s important to choose the right types of data for your unique organization and particular program. Qualitative (“soft”) data consists of descriptions and can be observed, while quantitative (“hard”) data consists of numbers and can be measured.

AWA used the following sources for data collection:

  • Organization records: show historical data on the onboarding program’s past performance
  • Stakeholder surveys: report past and current new-employee satisfaction with orientation
  • Objective observation of participants: reveals program participants’ knowledge, skills, abilities, and behaviors on a qualitative level, from the perspective of an outside observer
  • Interviews: reveal the same qualitative data, with the added perspective of individual participants

7. Analyze data to better understand program achievements.

After collecting data, you must interpret what it says about your outcomes. Total the data for the program as a whole, and also organize it by participant sub-group, as appropriate. AWA organized its data into the following categories:

  • Onboarding program location
  • Employee lifecycle duration of each participant
  • Team or department of each participant

The key to making data meaningful is uncovering comparative conclusions and showing trends over time. AWA completed the following steps to analyze its OM data:

  • Compared results of data collection to individual outcomes’ targets for each indicator
  • Compared data to that of benchmarked competitors, as available
  • Identified differences between major sub-categories
  • Determined the beginning data points for tracking trends over time (this was the first round of OM reporting)
  • Identified any outlier data points that seemed to represent unusually positive or negative outcomes (for example, a new employee left the organization a week after accepting his job because of a family tragedy)
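For readers who work with their data in a spreadsheet export or script, the comparison, sub-group, and outlier steps above can be sketched in a few lines of Python. Everything below is hypothetical: the indicator names, locations, targets, and results are illustrative stand-ins, not AWA’s actual data.

```python
from statistics import mean, stdev

# Hypothetical indicator results for an onboarding program, organized by
# one sub-category (program location). All values are illustrative.
results = [
    {"indicator": "recites mission",  "location": "HQ",    "target": 80, "actual": 86},
    {"indicator": "recites mission",  "location": "Field", "target": 80, "actual": 74},
    {"indicator": "90-day retention", "location": "HQ",    "target": 90, "actual": 93},
    {"indicator": "90-day retention", "location": "Field", "target": 90, "actual": 58},
]

def compare_to_targets(rows):
    """Compare each collected result to its indicator's target."""
    return [{**r, "met_target": r["actual"] >= r["target"]} for r in rows]

def summarize_by(rows, key):
    """Aggregate results by a sub-category (e.g., program location)."""
    groups = {}
    for r in rows:
        groups.setdefault(r[key], []).append(r["actual"])
    return {k: mean(v) for k, v in groups.items()}

def flag_outliers(rows, threshold=1.0):
    """Flag results whose gap between actual and target is unusually far
    from the norm. With only a handful of data points, a loose
    one-standard-deviation threshold is used purely for illustration."""
    gaps = [r["actual"] - r["target"] for r in rows]
    mu, sigma = mean(gaps), stdev(gaps)
    return [r for r, g in zip(rows, gaps) if abs(g - mu) > threshold * sigma]
```

Here, `compare_to_targets` shows which indicators hit their targets, `summarize_by(results, "location")` surfaces differences between major sub-categories, and `flag_outliers` isolates unusually positive or negative results for follow-up, analogous to the family-tragedy departure AWA investigated before drawing conclusions.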

Here is one of the most important insights to remember as you go through this process in your own organization: data is not always right, and it’s OK for you to question it. That said, make inferences and draw conclusions by looking at multiple data points, rather than leaning too heavily on any single one.

8. Report data findings in a user-friendly format.

Data collection and analysis are only as good as your data reporting allows them to be. Presentation is critical to making the information you gathered useful. The scope of information included in the report depends on your audience (for example, senior leaders versus program staff).

For its data report, AWA used the template below to present findings to its Board of Directors – the group that originally requested the OM performance process. The charts and tables created throughout the OM process are especially helpful for concise and visually appealing reporting.

Sample Evaluation Report Template

1.     Title page: name of the organization, name of the program, and date

2.     Table of contents (outlining the components below)

3.     Description of the outcome measurement process, including the plan, team members, and timeline

4.     Overview of the program being evaluated, including its history, mission, participants, and supplementary logic model (if applicable)

5.     List of identified outcomes, indicators, and targets

6.     Description of data collection methodology, types of data collected, and processes used to collect and analyze data

7.     Summary of findings from data analysis, including interpretations and caveats

8.     List of recommendations, such as decisions that must be made about the program, immediate focus needs, and outcome measurement process revisions for the future
*Possible appendices: organization/program assessment results, such as completed 7-S framework analysis, logic model, chart of outcomes and indicators, and data summaries

9. Use the findings and take action to improve the quality of the program.

The final stage of outcome measurement involves using all that you have done up until this moment to achieve the process’s initial goal: improve the program! AWA created the Action Planning Chart below for this purpose. It is populated with specific steps for the first outcome defined in last week’s article.

Action Planning Chart

Outcome: Onboarding will teach staff about AWA’s unique organizational vision.
  • Objective: Begin to assimilate new employees into AWA culture on Day One.
  • Actions: Trainers will teach AWA’s mission, vision, and values; trainers will distribute printed and digital copies of AWA’s mission, vision, and values; employees will recite AWA’s mission, vision, and values from memory.
  • Responsibility: The onboarding program team
  • Timeline: 30 days after participants begin the program

Outcome: Orientation training will effectively prepare staff to complete their job tasks.

Outcome: New employee attrition will remain low as a result of employee onboarding.


For practice, and to continue the conversation around outcome measurement, we encourage you to complete the chart for AWA’s second and third outcomes. Use the comment box below to state your suggested objective, actions, responsibility, and timeline.

We at Brighter Strategies are happy to talk with you more about outcome measurement and performance improvement strategies, as well as the specific opportunities and challenges your nonprofit is facing. Please contact us at 703-224-8100 for more information.