
    My Ongoing Quest to Define a “Culture of Learning”

    Updated on: Apr 5th, 2012
    Data Visualization

    Data, research, and evaluation have been on everyone’s radar for a while now. Schools took data seriously after NCLB in 2001, and nonprofits and their donors are also starting to take a closer look at their numbers.

    Data is exciting, but using data to inform action is even better. How exactly do we use data?

    Everyone seems to agree that building an organizational culture of learning (within the school, youth center, etc.) is important. But how exactly do we build an organizational culture of learning?

    In my quest to find resources about organizational learning, I’ve found that there’s little agreement or clarity about what a “culture of learning” really means.

    Here’s my running list of what an organizational “culture of learning” might look like:

    • There are organizational policies and procedures in place that fit Michael Quinn Patton’s description of a fully integrated and highly valued internal evaluation system
    • Evaluation is institutionalized. Every program, department, project, etc. has a logic model, collects their own data, and uses it.
    • The logic models are made with SMART goals in mind
    • Programs collect quantitative and qualitative data
    • Qualitative data collection, like interviews and focus groups, is culturally-competent
    • Programs use formative and summative assessments to guide decision-making
    • Formative assessments are woven into the program’s design. For example, an advocacy program in which youth testify to local congressmen will need to teach public-speaking skills, so why not design simple rubrics to assess their public-speaking progress over time? Youth are hungry for feedback and love seeing their own improvement.
    • Staff members take the initiative to monitor data without nudging from evaluators
    • Evaluation results get used to improve programs
    • Staff members demonstrate critical thinking skills when thinking and talking about their program
    • Staff members demonstrate depth of knowledge when thinking and talking about their program
    • Capacity building takes place on a day-to-day basis
    • Evaluation is conducted because everyone wants to improve programming, not because a report is due to funders.
    • Curiosity about outcomes. Staff members start with some research questions, find some answers in the data together with the evaluator, and come away with even more questions.
    • Staff members comment, “These numbers are great, but how can we measure progress??” Staff understand that measuring outputs is not the same as measuring outcomes.
    • Staff want feedback, especially individual feedback, so they can improve their work.
    • Staff want to design satisfaction surveys and are genuinely concerned about the participants’ experiences.
    • Staff look forward to discussing results together.
    • Staff learn about another organization’s performance management system or read evaluation reports about other agencies and comment, “But this doesn’t really tell me anything… They didn’t even say whether the youth benefited! They just flashed some numbers and talked about how many people were served.”
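    To make the outputs-versus-outcomes distinction above concrete, here’s a minimal sketch in Python. The participant records and rubric scores are made up for illustration: counting heads is an output, while pre-to-post change on a rubric is an outcome.

    ```python
    # Hypothetical participant records from a public-speaking program,
    # each with a rubric score (1-5) before and after the program.
    participants = [
        {"name": "A", "pre_score": 2, "post_score": 4},
        {"name": "B", "pre_score": 3, "post_score": 3},
        {"name": "C", "pre_score": 1, "post_score": 4},
    ]

    # Output: how many people were served. Easy to count,
    # but it says nothing about whether anyone benefited.
    output = len(participants)

    # Outcome: did participants actually improve? Here, the
    # average pre-to-post gain on the rubric.
    gains = [p["post_score"] - p["pre_score"] for p in participants]
    outcome = sum(gains) / len(gains)

    print(f"Served {output} youth (output)")
    print(f"Average rubric gain: {outcome:.1f} points (outcome)")
    ```

    A report that stops at the first number has measured an output; the second number is what tells you whether the youth benefited.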

    What did I miss?

    More about Ann K. Emery
    Ann K. Emery is a sought-after speaker who is determined to get your data out of spreadsheets and into stakeholders’ hands. Each year, she leads more than 100 workshops, webinars, and keynotes for thousands of people around the globe. Her design consultancy also overhauls graphs, publications, and slideshows with the goal of making technical information easier to understand for non-technical audiences.
