Flashback to Mrs. Eckhart’s sixth grade classroom in small-town Iowa. The lesson for the day: Where do storms come from? The teacher opened the floor for questions and a small, pale boy with brown hair raised his hand: “Why does the sky look green when a storm is coming? And what makes the hail? Is that the same thing that causes wall clouds?”

“Paul, why don’t we give someone else a chance to ask questions,” Mrs. E replied with a slight bite in her voice. I must have driven her crazy.

I’ve always loved asking questions. Maybe that’s why I gravitated toward the field of evaluation and impact analytics. But I was never really trained to ask questions effectively. And you probably weren’t either.

A few weeks ago, I published a post about how to ask better data analysis questions. Now, I’d like to share a few tips for asking better questions of your coworkers and clients to solve problems and understand impact.

1. Choose the Right Time and Place. If your question would take five minutes or less to answer, give someone a call or stop by their desk. For longer discussions, set up an appointment. For executives, ask their assistant or direct reports about their communication preferences, so you can choose the best time and place to get your question answered.

2. Prepare. Don’t hold a conversation with a client or colleague until you’re crystal clear about your intended outcome. Then, create a discussion guide that you can follow when talking about the problem you’re trying to solve or question you’re trying to understand. The questions in your guide should start general and then become more specific. You may want to mark the three most critical questions to be sure you cover them if the conversation wanders.

3. Provide Context. You will naturally want to dive right into the nitty-gritty details. For example, your real question may be, “Why do 5% of the case notes we record indicate the client ‘attended’ a session when the session length was only 1 minute?” Don’t lead with this. Make sure the person you’re talking to understands the purpose of the conversation before asking detailed questions.

One tip I learned the hard way is to respect power dynamics when setting context at the start of a meeting. If you’re not the most senior person in a group discussion, give that person the opportunity to kick things off. Once everyone at the table knows the purpose of the meeting and why you’re asking questions, then you can dive into the detail.

4. Balance Macro and Micro Questions. Chances are you’ve been aggregating questions about a particular subject. When presenting these questions to clients or colleagues, start high level. I find that a basic opening question along the lines of:

“Walk me through how you do…?”

“Tell me about your experience with…?”; or

“What do you want to see happen when…?”

…often gets me 80% of what I need to know. And sometimes these broad, open-ended questions have led me to learn things I didn’t even know to ask!

5. Ask Follow-up Questions. Surely there was something interesting or unexpected in your interviewee’s answer to your initial broad questions. Trust your gut and follow up on it, especially if you suspect there might be more to the story than they’re letting on. Don’t just take my word on this — author and consultant Peter Block provides some great tips on how to behave authentically in critical conversations.

6. Acknowledge Feelings. Some of the most useful follow-up questions I ask explore others’ motivations and reactions to their circumstances. Presumably, this challenge or question has existed for quite a while. Consider the following questions:

Why is now the right time to tackle this question or problem?

How does not knowing the answer make other staff and clients feel?

What are other people in the organization (or even your interviewees) doing to perpetuate the problem?

Despite the problems, what’s been working well?

7. Manage Cadence. Pay attention to the cadence of the interview. Some people ask and answer questions aggressively, talking over others like they’re placing a trade on Wall Street. Others treat a conversation like a therapy session, requiring a lot of silence before opening up. Learn to read the cadence of an interview quickly and match your style to your interviewees’ to maximize the amount and quality of information you get.

Finally, remember that it’s up to you to engineer a good ending to conversations exploring important questions or problems. Regardless of what happened, give genuine thanks to those who shared their knowledge and experiences. Acknowledge the fact they had other priorities, and share with them what to expect from you or others next.

Asking good questions is a critical skill for anyone trying to build data-literate and impact-literate teams. Do you have other questions about how to promote data literacy in your organization? I’ll respond directly to any comments and also include your thoughts in my next post.

Asking good questions of the data at hand is an important skill for teams and organizations making the world a better place. But all this questioning and analysis can have a shadow side — asking the wrong questions can waste time and distract your team from your most important goals. Here are five common traps social sector leaders fall into when asking questions, and how to avoid them.

1. Questions without an objective: Those of us who work in the nonprofit or social enterprise space often give up financial gain to pursue work we find interesting, challenging, and meaningful. But when we’re thinking about investing resources to answer questions about the work we do, “interesting” is not good enough. Is it clear to your staff why your question is important? For example:

OBSCURE OBJECTIVE: “I just saw one of my friends share a really interesting article from the Huffington Post about the link between family stressors and early childhood health outcomes. Can you tell me how these are related in our clients?”

CLEAR OBJECTIVE: “As we talked about last year, one of our top learning priorities is to understand and promote positive health outcomes for our clients. Can you help us analyze whether our children’s health indicators have any relationship to their parents’ reported stress levels?”

2. Questions that have low strategic value: A second criterion for considering questions to answer through data analysis is, “Does the answer help us improve our ability to fulfill our mission?” For example, questions that test your assumptions or logic of how change happens are often strategically important, even if they make you or your staff uncomfortable. So too are questions that have clear implications for your budget or resource allocation.

LOW VALUE: “How many clients expressed satisfaction in our financial counseling program last year? Is there a better number that demonstrates how well our program is working?”

HIGH VALUE: “Our program assumes that because clients set unique financial goals and have unique circumstances, the amount of support each will receive must vary widely. However, are there any cutoffs where additional support from us is unlikely to help a client increase his or her financial well-being?”

3. Questions where you don’t have expectations: A third important aspect of a good analysis question is: Is the answer testable? If you don’t have any internal expectations or external points of comparison, you can’t interpret whether the answer you get is good or bad.

NOT TESTABLE: “What was the effect of our program to increase parental resilience among the at-risk families we serve?”

TESTABLE: “Other programs like ours using the same assessment tools find they have a small positive effect on parental resilience, as measured by an effect size (Cohen’s d) of 0.3 or greater. What’s our effect size?”
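If you're curious what that benchmark actually involves, Cohen's d is just the difference in group means divided by the pooled standard deviation. Here's a minimal sketch in Python; the resilience scores below are invented for illustration, not real program data:

```python
import statistics

def cohens_d(treatment, comparison):
    """Cohen's d: difference in means divided by the pooled standard deviation."""
    n1, n2 = len(treatment), len(comparison)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(comparison)
    # Pooled standard deviation, weighted by each group's degrees of freedom
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(comparison)) / pooled_sd

# Hypothetical resilience scores for program participants vs. a comparison group
program_group    = [14, 16, 15, 17, 13, 18, 16, 15]
comparison_group = [13, 14, 12, 15, 13, 14, 12, 16]

d = cohens_d(program_group, comparison_group)
print(f"Cohen's d = {d:.2f}")  # compare against the 0.3 benchmark from similar programs
```

A d of 0.2 is conventionally read as a small effect, 0.5 as medium, and 0.8 as large, which is what makes the 0.3 comparison point in the question above a meaningful target.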

4. Questions that have been answered before: In the world of nonprofits and social enterprises, there is a commonly held belief that each organization is unique beyond comparison. While this is true to a certain extent, chances are the issue you’re addressing has been around for a while. Before you start mining your own data, please spend a few hours on Google Scholar or at your local college or university reviewing the research literature on the issue you’re addressing. You may not find the answer you need, but you may find out how to ask a better question.

ALREADY ANSWERED: “Is the Head Start approach to early childhood education really effective?”

NOT YET ANSWERED: “As a startup early childhood education provider, how does the social and emotional development of our children compare to that seen in similar Head Start programs?”

5. Questions where the opportunities for action are unclear: Perhaps the most important aspect of a good analysis question is whether the results are actionable. Will the results potentially cause you to change your strategy, execution, or culture? If not, the question may not be worth exploring.

NOT ACTIONABLE: “Can you give me a summary of client outcomes for each of my direct reports? I’m not sure we have enough data to talk to them about the results, but I think looking at this could be interesting.”

ACTIONABLE: “Can you tell me which workshops last year were most engaging for our clients based on their feedback? We only have budget this year for three workshops and I need to decide which ones we should focus on.”

One final thought — as a changemaker in the social sector, timing is everything. Sometimes it might seem that you can’t make a decision without more analysis; other times you don’t have time for any analysis; often you’ll have these thoughts at the same time! One strategy is to set aside time for answering critical questions about your finances, programs, and operations every year. As questions come up, save them for your Annual Analysis project. Prepare your staff to be engaged, and if you need outside experts or volunteers, get them lined up. Then focus your attention on your most important, timely, and strategic questions.

Do you have other tips to help social sector leaders ask smarter questions? Send them to me and I’ll share your thoughts in an upcoming post.

This post originally appeared on the American Evaluation Association's blog.

Over the last two years I worked as the Data and Evaluation Manager at the San Francisco Child Abuse Prevention Center (SFCAPC), a mid-size nonprofit focused on ending child abuse in San Francisco. This was a fantastic learning experience — I worked with 50+ staff members ranging from policy advocates to social workers, helping them use our data to serve clients better.

About two months into this job, I read a book that changed how I approach this work entirely:

I believe great people to be those who know how they got to where they are, and what they need to do to go where they’re going. They go to work on their lives, not just in their lives… They compare what they’ve done with what they intend to do. And when there’s disparity between the two, they don’t wait very long to make up the difference. — Michael E. Gerber, The E-Myth Revisited

Reading Gerber’s book convinced me of the importance of systems and habits in helping people succeed at their jobs. This particular nonprofit had a database that held client information — Efforts to Outcomes — but few habits for improving that system and using its data to serve our clients better. So, I set out to create habits for how we did our “data” work.

I read through books, blogs, and websites, and talked to mentors and friends to get a sense of what other organizations do. I found many resources that shaped my approach to creating these new “data habits”. Some of the best are:

The books, whitepapers, and newsletter from the Leap of Reason Institute — The free e-books on this site by Mario Morino and David Hunter, and the institute’s recent whitepaper, The Performance Imperative, are the resources I recommend most often to others.

Sheri Chaney Jones’ book Impact and Excellence — Jones’ book contains many useful strategies from her consulting practice for helping clients generate insights from their data when resources are scarce.

The Root Cause Blog — Root Cause is another consulting firm that supports nonprofits in creating effective data and evaluation habits; they share some of their tricks on their blog.

The Data Analysts for Social Good professional association — Members get access to 25+ webinars on topics ranging from justifying the return on investment of data analysis to introductory analytical techniques using Excel, R, and other platforms.

After reading these, I was convinced that developing regular habits would be critical to my organization’s ability to use its data to improve the lives of clients. But exactly what those habits should be remained unclear.

So with the support of our programs staff, I began experimenting. Over the last 18 months, we arrived at six specific habits that helped me provide consistent value to our staff.

Keep a reporting calendar: Organizations are often required to submit detailed program participant and activity data for certain government contracts. I created a comprehensive calendar of when each upload was due and who needed to review the data before it was submitted.

Define data integrity controls: Data integrity controls minimize the risk that information in a database is incorrect. For example, we scrubbed our database of test data monthly and audited a sample of new families to verify the accuracy of data entry each quarter. We summarized data integrity controls in a spreadsheet outlining each procedure, information source, performer, reviewer, and results.
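To make this concrete, a basic integrity control can be expressed as a simple automated check. The sketch below flags sessions recorded as “attended” but lasting under a minute, the kind of anomaly a regular audit should surface; the record structure, field names, and threshold are invented for illustration, not the actual Efforts to Outcomes schema:

```python
from dataclasses import dataclass

@dataclass
class SessionRecord:
    client_id: str
    attended: bool
    length_minutes: int

def flag_suspect_sessions(records, min_length=2):
    """Return records marked 'attended' whose length suggests a data-entry error."""
    return [r for r in records if r.attended and r.length_minutes < min_length]

# Invented sample records for illustration
records = [
    SessionRecord("A101", attended=True, length_minutes=45),
    SessionRecord("A102", attended=True, length_minutes=1),   # suspect
    SessionRecord("A103", attended=False, length_minutes=0),  # not attended, fine
]

for r in flag_suspect_sessions(records):
    print(f"Review data entry for client {r.client_id}: "
          f"attended but session lasted {r.length_minutes} minute(s)")
```

Checks like this can run on a monthly schedule alongside the test-data scrub, with any flagged records routed to the same reviewer named in the controls spreadsheet.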

Review dashboards: During the first week of each month, I sent a performance dashboard to each of the program managers. Managers discussed these metrics with their teams, then shared explanations for variances and any action items they planned to take at the managers’ meeting the following week.

Schedule time for troubleshooting and report development: To build staff buy-in, I needed to be responsive to database troubleshooting and report development needs. I tracked time for these tasks and blocked out time for them weekly. In an average week, I spent 2–10 hours troubleshooting and training, and 5–15 hours developing self-service reports staff could use to access program data themselves.

Automate annual development data pulls: A significant “data” responsibility was pulling data for our development team, including demographics and unduplicated client counts. Working with our development team, I developed a self-service report designed to answer 80% of their “stats” asks, saving everyone time.

Have a data analysis process: Stakeholders across our organization came to me with many good questions to explore in our data which I just didn’t have time to answer. I created a master tracker of these questions and set aside several weeks each year to explore those that were most critical. This “Annual Data Analysis” process set expectations and created a process to focus our limited analysis time.

These habits helped us save time, set expectations, and create lasting systems. They’re far from perfect, but they were a huge step forward for us. Now that I’m no longer working for the organization, I realize these habits not only helped organize the work but also enabled continuity between me and my successor.

If you’ve read this and are working in the social sector, I’d love to know — What are the data & evaluation habits you’ve found valuable at your organization?