
    How can you help people when they are afraid? [Guest post by Elizabeth Silverstein]

    Updated on: Sep 11th, 2012

Hello Evaluation community! My name is Elizabeth Silverstein and I am probably as novice an evaluator as you will ever find. Currently I am a lowly graduate student in Public Affairs in Wisconsin, but I have been spending this summer interning for a government agency in DC. It was this internship that introduced me to evaluation, where I think I have finally found an intellectual home.

While I find most of evaluation fascinating, one aspect that continues to worry me is what we call in our office “evaluation apprehension.” It represents all of the negative feelings that many clients seem to carry concerning evaluations.

I find this idea that clients would be afraid of evaluations both baffling and frightening. For me, evaluation represents the ability to do everything cheaper, faster and more efficiently. The lackluster performance of many innovative programs I saw during my study abroad is what first turned me toward the evaluation light.

    “But wait!” I thought, “if you just thought it through, you could figure out why the new well you built is being used for garbage instead of its intended purpose!” And yet, many people see evaluators as the enemy, as someone who is coming in to tell people what they are doing incorrectly in their program.

    My question for the evaluation community is, how can you help people when they are afraid of what you might find? Is this apprehension as widespread as it appears, representing perhaps a greater publicity problem for evaluators? Finally, what is the best way you have found to combat this evaluation apprehension?

    — Elizabeth Silverstein

    More about Elizabeth Silverstein
Growing up outside Chicago, I knew I wanted to dedicate my life to helping people. After graduating with a Master’s in Public Affairs, I took on roles in the non-profit and private consulting world to try to find where my technical skills fit with my drive to help people. We live in a world where data is all around us, and I am passionate about helping people use this data to live better lives. After completing my training at Fullstack Academy of Code, I look forward to writing beautiful code to unlock the potential of the data all around us.

    6 Comments

• Elizabeth, evaluation apprehension is definitely real, and it becomes more pronounced the closer you get to front-line or ground-level staff.
    In situations where I think people will be uncomfortable with evaluation work, I stay away from jargon or any terms that may be laden with negative connotations. Never say ‘evaluation’ – say you are here to try to make things better. Never say ‘data’, say ‘information’. Just being aware of your language and the words you use can really make a difference in overcoming fears of evaluation.
    Depending on your personality, you could also use emotional appeals or humor to help with the situation. Appealing to people’s desire to help others is something I’ve done in the past – so evaluation isn’t a negative activity that will force change, but rather it is a positive activity that will improve the lives of others. I’d also recommend being sympathetic to the burdens evaluation will place on people – don’t ignore these – but explain that they are necessary steps to making sure you can improve things for service recipients.

  • Elizabeth, it’s great that you’ve had this insight so early in your career! I’ve learned that I need to explain very early and very clearly that my work has nothing to do with personnel evaluation or finances. I’m not an auditor and I don’t hire or fire anyone. When front line staff see an evaluator walk in with a laptop or notepad, the first thing they assume is that I am evaluating THEM, not the program. If I can explain otherwise it’s much easier to help staff relax and look on the process as one of helping the program improve.

  • I agree with Isaac about staying away from jargon, and Maria about stressing that it’s not about individual job performance. I have worked on a few process/implementation evaluations of programs that are in early stages of development, where frontline staff can feel especially put on the spot by evaluation–they feel that evaluation means they are expected to produce significant positive results, when they are still trying to figure out how to do the program. I try to emphasize the different stages of eval, and that an important aspect of the eval process and goals is to help others (including the evaluators) learn about how the program works–the program staff are the experts on that, not me. For evals that are more outcome/impact focused, I try to make time at the beginning of the process for the frontline staff to identify questions that they have about how the program is doing and include those questions in the design. This is a way to help the program staff feel more personally invested in the process, and hopefully less threatened. Also, it’s helpful to make it clear in the initial conversations that you value staff input on how to conduct the evaluation in a way that minimizes the burdens both on the staff and the clients. And as Isaac mentions, I find it helpful to focus on how the evaluation can help bring what they know to others who are trying to do similar work.

• I’d recommend cultivating greater empathy. It’s essential to recognize that your counterparts often have perfectly valid reasons for their apprehension, and some of them have probably had experiences that turned out differently than you would expect. For instance, doing things “cheaper, faster, and more efficiently” may mean the organization needs only two program assistants where they now have three or four. That’s someone’s livelihood. As evaluators we need to know our limitations, and we typically have little control over how findings, conclusions, and recommendations are used; it is not always as we might wish.
    Don’t be so quick to assume people haven’t ‘thought things through’, either. If people pick up that vibe from you, they’re right to be apprehensive. As Diane says, the program staff are the experts on what’s going on, how, and why. They may not have it assembled in an evaluation framework but they’ve thought about it for much longer and from many angles you have not considered yet. What seems obvious to a novice at first glance might be something that’s been tried and found not to work.
    My experience in international development evaluation leads me to suggest deep reflection on the combat metaphor. The evaluator can’t “win”, and won’t even be there to face the evaluation’s consequences.
    Promising sunshine and roses is more likely to undermine than build trust. Focus instead on working hard to learn and appreciate the local perspective. Learn about the program, and while you’re there find ways to contribute a different perspective or insights that people on the ground can use.

  • Elizabeth Silverstein says:

Thank you all for your replies! I think that all four of your responses offer clear and easily implementable ways to combat this problem. I really appreciate your insight and hope to put it to use in my (hopefully near-future) evaluations.

• These comments are all excellent. I also try to remind clients that evaluation allows them to show off their successes, as well as highlight areas where they can improve. There is so much deficit focus in the current environment that non-profit and educational staff start to feel beaten down and forget the impact they are having on those they serve. It is rare to have a program that does not have some level of success, and speaking with clients to find out what is important to them really helps frame the evaluation in terms that are meaningful to them. Reminding them that the end goal is to do X (help youth in detention reintegrate, etc.), and that you are there to provide information to help them reach that goal, or to show in what ways they are already moving toward achieving it, makes them much more open. Of course, building trust always takes time, and you will need to stay true to this approach throughout the evaluation process.

