    How can findings influence decision-making? [Guest post by Jonathan O’Reilly]

    Updated on: Aug 21st, 2012

    The dusty shelf report: The best way to keep evaluation results far, far away from decision-making

    After spending the past couple of years as an internal evaluator, I decided to start addressing other internal evaluators’ questions, comments, and concerns. I’ll be sharing their questions on my blog, connecting them to other evaluators, and offering advice from my own experiences with internal evaluation.

    Here’s a question from Jonathan O’Reilly, my friend from the Washington Evaluators. Jonathan recently accepted an internal evaluation position with the Circuit Court for Prince George’s County in Maryland. He writes:

    “I’d like to know more about internal evaluators’ experience with translating research to practice. My experience as an external evaluator witnessed the final report being the absolute end product – whether or not the client used the recommendations or had working groups around the evaluation report were beyond our involvement. In my new position, I am more optimistic about my evaluation findings being used to effect change.

    As an internal evaluator my vision is to call together a working group to present evaluation findings and start the conversation about modifying procedures where necessary. What has your experience as an internal evaluator been with getting research findings to be a key part of administrative decision-making?”

    Do you have any advice for this internal evaluator? Please share the good karma below.

    More about Jonathan O’Reilly
    In my 10 years of public service, my efforts have gone towards creating insights through research and data analytics. As a trained analyst and SAS-certified programmer, I am an extremely detail-oriented and structured thinker. Yet I strive to bring granular findings up to a “big picture” level in order to inform business and policy decisions. As a leader, I aim to lead by example in terms of work ethic and values. Accuracy, timeliness, and actionability are my professional values when it comes to creating research and analytics products. I consider myself lucky to be in a position to lead a team that together can accomplish more than I alone could ever achieve.

    5 Comments

  • At my job, it’s relatively easy. We have an internal committee of the organization’s leadership that reviews evaluation findings, as well as a board that specifically looks at evaluation findings and their implications for the agency.
    In my opinion, there has to be a culture of evaluation that is encouraged and cultivated for this to take place.

  • Jonathan O. says:

    Karen,
    I agree that a culture of evaluation, or at least performance measurement, must be in place for findings to gain traction. Are the evaluators included in your organization’s internal committee and board meetings when problems or findings are being addressed? Such collaboration may help to minimize any misinterpretation of data by a less technical audience. This didn’t quite happen at my previous organization (caveat: I was external), which left me concerned about the direction “the evaluated” took after the evaluation.
    Best

  • Jonathan, my very practical advice would be to try to use as many of your evaluation findings as you can in every meeting you attend and every interaction you have. The formal meetings and presentations are great, but I have found the best way to get people to actually use evaluation findings is to present them when they are most relevant to an ongoing conversation.
    For example, in an old position, during a meeting with senior staff to plan out programming for the next year, I presented evaluation findings that showed which programs were getting good outcomes and which were underperforming. That information really shifted the tone of the meeting from ‘how do we plan to continue these programs’ to ‘how can we improve these programs.’
    Timing is critical – you could have the best evaluation data in the world, but if you present it after decisions have been made, the data won’t get used. You need to think of your evaluation findings (and other sources of data) as resources for others, and you need to be ready to share those resources whenever they can help to inform a conversation or decision.

  • Jonathan,
    I think it really depends on the organization. I’ve worked as an internal evaluator and been part of the management team that made key decisions. Our data were brought to bear on lots of major decisions, and when decisions were in the planning stages we were often asked to present data to help determine what direction to go in. But this is a bit different from really using the findings to make the more critical decisions, such as whether to continue funding a program.
    As an external evaluator, I’ve also had clients who made us a part of their team, invited us to all key meetings, gave us access to staff, and kept the information loop flowing at all times. Of course, I’ve also had clients who took the final report and never looked at it again.
    So I guess I would say that if the organizational culture is one that values data-driven decision-making, it may not matter whether you are internal or external.
    Taj

  • Great question, and some great comments. I think it is really important to get to grips with this issue – because change is often so hard to achieve, yet suggesting changes is such a key part of our role.
    I am taking more and more notice in my work of where organisations are in terms of being a “learning organisation”. A couple of questions about how past evaluations have been used are usually a good indicator of where you are going to end up – applying that old adage about past behaviour being the best predictor of future behaviour. This helps me get my bearings on what is going to be possible.
    I am also tending to focus on (to use a term from the training evaluation literature) barriers to transfer. That is, working with the program team to identify what change is actually “actionable” versus wishful thinking. That process of working with the program team, from presenting findings to creating actionable recommendations, builds buy-in and acceptance of the evaluation. A great start to effective change.
    Coming back to the OP: “… My experience as an external evaluator witnessed the final report being the absolute end product – whether or not the client used the recommendations or had working groups around the evaluation report were beyond our involvement….” Maybe our responsibility needs to be wider: evaluation as part of a process of change, as opposed to producing products?
