I met Audrey Loper in 2017 while teaching a data visualization workshop to her grantees. Her data visualization progress after that workshop has been phenomenal. Keep up the good work, Audrey!

A few years ago, I found myself in a frustrating predicament. I was writing dozens of evaluation reports for my grantees. Next to none of them were being read. I scheduled webinars, conference calls, and face-to-face meetings to talk through the results. Each time, it was more or less the same: I was verbally summarizing a report no one had read. I assumed folks were too busy, or that they didn’t really care that much about the results. Then I went to one of Ann’s data visualization workshops and had a heart-to-heart with myself. The problem wasn’t my grantees’ busy schedules or lack of interest in outcomes – it was the report.

Before

I was sending my sites six pages of mind-numbingly boring text, accompanied by a few yawn-worthy tables and graphs:

I was making my readers work way too hard. Instead of reporting agency outcomes front and center on page one, I was expecting folks to wade through six pages to find them, each on a separate page. Instead of titling my tables and figures in a way that drew attention to the take-home message, I was using titles like ‘Figure 1. Percent of Respondents Who Correctly Answered Questions on Sexual Knowledge.’ Instead of using color for emphasis, I was using color with the enthusiasm of a toddler. It was time for a reboot.

After

I started out by talking to my grantees. I wanted to make sure that this time around, they got something they would actually read. And hopefully use. My sites told me that my suspicions about what was wrong with the reports were absolutely right. I wasn’t making it easy for them to find what they needed, and the few who actually read the report were having to repackage what I sent them to share with their stakeholders.

Based on Ann’s data viz best practices and the feedback from the grantees, I made myself the following to-do list:

  1. Don’t bury the lead. Start off with the important stuff, then follow with details.
  2. Limit yourself to one page (front and back).
  3. No blocks of narrative text!
  4. Use color for emphasis.

Here’s what I came up with:

Based on Ann’s data viz best practices and the feedback from the grantees, I made myself the following to do list: 1) Don’t bury the lead. Start off with the important stuff, then follow with details. 2) 3) Limit yourself to one page (front and back). No blocks of narrative text! 4) Use color for emphasis.

On page one, after the title, I listed all three outcome objectives, the target and actual values, and added a check mark to clearly identify which had been met. The green in the title matches the green in the arrow. The dumbbell plots that follow clearly show pretest and posttest values for each objective if folks want to dig deeper.

Page two is reserved for data about program implementation and demographics. (In my previous report, these hogged all the space on page one.) I designed a snazzy new graphic to depict whether or not the grantee met the target number of youth at the correct dosage level. For the demographics, I created side-by-side bar charts so readers could more easily compare the age, gender, and race/ethnicity of the two groups.

It’s not perfect, but it’s better. Much better. I pilot tested the changes with a few sites (the lucky few who actually read the six-page report), got their feedback, and made a few more tweaks. The response was overwhelmingly positive: Grantees could easily find their results, make sense of them, and communicate them to their stakeholders. What more could a girl ask for?

Audrey Loper has worked in public health for the past 15 years, with a focus on data and evaluation. In her most recent position, she served as the Evaluation Consultant for North Carolina’s Teen Pregnancy Prevention Initiatives. It was during this time that Audrey got hooked on data viz best practices. She now spends a great deal of time steering her friends away from pie charts. She currently works as an Implementation Specialist at the National Implementation Research Network, which is housed at UNC-Chapel Hill’s Frank Porter Graham Child Development Institute.