
    Don’t Order All the Data on the Menu

    Updated on: Apr 18th, 2012
    Data Visualization

    Several of the managers at my youth center are developing a training curriculum for staff. They just finished the pilot cohort with about 30 of our staff members, and we’ve had a lot of conversations about different ways to evaluate the trainings.

    Here are some ideas we’ve thrown around together:

    • Administering satisfaction surveys at the end of each session to help the instructors tweak the subsequent sessions
    • Administering a pre/post test at the beginning and end of the 8-session series to measure new knowledge gained from the training
    • Interviewing a couple of staff members throughout the series to better understand how their attitudes and/or their everyday work might be changing little by little
    • Developing an observational rubric so that senior managers can visit the programs and rate the extent to which the training’s principles are being put into action (kind of like how principals visit classrooms and often use structured rubrics to assess teachers)
    • Holding a focus group at the end of each series (or at least with the pilot group) to get ideas for improving the trainings
    • Surveying or interviewing staff a few months after the training to see whether they’ve applied the information they’ve learned

    One of the best parts of my job is watching non-evaluators get excited about evaluation. These managers are so great because they’re supportive of data and want to use evaluation to really improve the trainings.

    After we discussed this “menu” of data collection options and how each method serves a different purpose, they asked if I had any final advice.

    I said, “Make sure you don’t collect too much data. It’s great that you’re enthusiastic about evaluation, but 6 months from now, this could really backfire. You could be spending more time doing evaluation than doing the trainings. If the data isn’t telling you something useful, stop collecting it, or don’t collect it in the first place if you don’t think it’s going to help you run better workshops.

    “This is a data menu, after all. You’re not ordering all the data collection options on the menu. You’re just ordering one or two dishes that can give you the answers you’re most hungry for.”

    When I finished talking, there was silence for a few moments. Then I saw a smile. And then a few laughs! They seemed pleasantly surprised (and relieved!) to hear a data enthusiast like me talking about the value of a simple, useful evaluation system.

    Is this such a foreign idea? Do most evaluators follow a “more data is better” philosophy?

    More about Ann K. Emery
    Ann K. Emery is a sought-after speaker who is determined to get your data out of spreadsheets and into stakeholders’ hands. Each year, she leads more than 100 workshops, webinars, and keynotes for thousands of people around the globe. Her design consultancy also overhauls graphs, publications, and slideshows with the goal of making technical information easier to understand for non-technical audiences.

    4 Comments

  • Mimi says:

    Hey Ann, I was relieved to read this entry!
    As you know, I’m managing a research project, and no, I’m not a data evaluation expert or even a project design expert. But I sometimes become overwhelmed by the sheer volume of data that the Principal Investigators want to collect, some of which are strongly advised against by fellow economists who have led randomized controlled trials. I have even been prepared to shoot down ideas of randomizing training among our data collectors and shopkeepers who are selling our products (luckily, I didn’t need to) because it’s a data point completely irrelevant to the actual research question.
    It’s comforting to know that there are some researchers who understand the value in thoughtful and focused data collection.

    • AnnKEmery says:

      Hi Mimi,
      I’m relieved too! I’m glad I’m not the only person who favors simple data systems. I’d love to hear more about your project. Perhaps a series of guest posts?
      Talk to you soon, Ann

  • […] Even then, I still wonder whether we’re simply collecting too much data… […]

  • […] I’ve certainly had similar experiences. Mostly, I’ve seen program staff get so excited about data that they want to collect more, and more, and then even more data. You can read about one of my experiences here. […]

