
The web seems awash these days with talk of “Big Data” ushering in new ways to look at the world and make strategic decisions. But for now, most of us don’t have dedicated data scientists, or Hadoop Clusters, or the financial resources to pour into these systems.

So how can we help our organizations make better decisions and design better programs with the systems and resources we have?

I recently read a book called How to Measure Anything by Douglas W. Hubbard. It’s not a new book, but I heard it mentioned at a Meetup I recently attended, and it sounded intriguing. It’s written for people who want to use data in their decision-making processes, but who lack formal training in statistics or data science.

It’s changed the way I look at the strategic role of data. Here are two simple, yet powerful ideas I picked up:

1. If it matters, you can measure it somehow.
A central premise of the book is that anything important enough to be part of your strategy can and should be measured in some way. Since many of us work with mission-driven organizations—or would like to—this can be a bit hard to swallow at first. After all, how do you quantify the story of a child who has learned to read, or put a value on a life-saving medical treatment?

But does collecting and sharing those stories bring in donations? Does it nab you more volunteers? How does it relate to your primary metrics – people served, dollars-toward-cause, petition signatures, etc.?

Consciously or not, you’ve attached some concept of value to each of those things. The point is not to say that these less-tangible things only matter because they affect the bottom line or some other metric-du-jour.

Rather, it’s to create a point of reference so that discussions about the risks and rewards of different strategies can be compared in some objective way.

Do you really need this much precision? Photo credit: Flickr user stevegroom

2. Be OK with ‘just enough’ data.
It’s easy to tie ourselves in knots on a quest for precision and completeness. We want to know exactly what the response rate will be on a new email campaign we’re testing, or exactly how many new volunteers signed up after a recruitment event. It’s tempting to scale up measurement and prediction efforts to try to reduce error.

More data means better decisions, right?

But improved accuracy has a cost. The first few measurements or estimates you make are easy to get and tell you a lot, while each additional bit of precision tells you less and costs more to acquire—more time, money, and attention.
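The diminishing returns fall out of the basic math of sampling: for a simple proportion, the margin of error shrinks only with the square root of the sample size, so quadrupling your data merely halves your uncertainty. A quick sketch (the 10% email response rate here is an assumed figure, purely for illustration):

```python
import math

p = 0.10  # assumed email response rate, for illustration
for n in [25, 100, 400, 1600]:
    # 95% margin of error for a proportion: 1.96 * sqrt(p(1-p)/n)
    moe = 1.96 * math.sqrt(p * (1 - p) / n)
    print(f"n={n:4d}: ±{moe:.1%}")
```

Going from 25 to 100 responses roughly halves the error (about ±11.8% down to ±5.9%), but the next halving takes 300 more responses, and the one after that 1,200 more.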

For some critical decisions, you may actually need to be 95% certain that you’re right. Most of the time, though, the cost of being wrong is not that high, and a lot less precision will do just fine.

The goal of measurement is not to eliminate uncertainty (which is impossible), but to reduce uncertainty to an acceptable level so a decision can be confidently made.

See if you can find a way to collect just a bit of data without investing much effort, and then ask yourself: “Is this enough information to make a decision?” The book dives into the math required for this kind of analysis, and breaks it down in a way that’s pretty friendly to statistics neophytes like me.
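One of the book’s handiest shortcuts, the “Rule of Five,” makes this concrete: take just five random samples from any population, and there’s a 93.75% chance the population’s median falls between the smallest and largest of the five (each sample has a 50% chance of landing above the median, so all five landing on the same side happens only 2 × 0.5^5 = 6.25% of the time). A simulation sketch, using a made-up population for illustration:

```python
import random

random.seed(42)

# Hypothetical population: minutes per week each of 1,000 supporters
# spends volunteering (numbers invented for illustration).
population = [random.lognormvariate(3, 1) for _ in range(1000)]
true_median = sorted(population)[500]

# Rule of Five: draw five people at random and check whether the
# population median lies between the sample's min and max.
trials = 10_000
hits = 0
for _ in range(trials):
    sample = random.sample(population, 5)
    if min(sample) <= true_median <= max(sample):
        hits += 1

print(f"Median captured in {hits / trials:.1%} of trials")  # typically close to 93.75%
```

Five quick measurements won’t tell you the median precisely, but they bound it with surprisingly high confidence, often enough to decide.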

I’d recommend How to Measure Anything for anyone who’s interested in making better decisions based on data. Do you have a resource you’d recommend? Share it in the comments below.

This is the second of a three-part series in which I’ll share some lessons drawn from the world of software development that can be applied to the social good sector. Today’s post is about using data to make better decisions. Read the first part about recognizing obstacles to action for what they are here.

One of the defining features of Scrum (the software development methodology we use here at Idealist) is the regular opportunity for “retrospectives.” Once a week the team gathers to talk about what went well during the previous week, and what we want to change for the next week.

The key here is the short cycles—they allow us to experiment with semi-crazy ideas, because we’re only committing to them for a week. If they don’t work, we throw ‘em out the next week. It’s very low risk.

Photo via Nomad_Soul on Shutterstock.

An obvious benefit of this “inspect and adapt” habit is that it allows us to continuously improve our processes. A less obvious benefit is that it creates a culture of empiricism. Whenever we can, we bring real data to the retrospective.

We might start off with an instinct or a hypothesis like, “I wonder if we’d get more done if we aimed higher next week” (which is a valid question, not a foregone conclusion).

We can then test that hypothesis immediately and a week later gather to look at the results. We aimed higher—did we or didn’t we get more done?

Inspect, adapt, change the world

Nonprofits and other worldchangers use inspect and adapt processes as well, of course. The staff at Single Stop USA, for example, are working to end poverty. They keep students in school by helping them and their families navigate the world of public benefits, providing them with access to tax preparation support in addition to legal and financial counseling.

Since its founding in 2001, Single Stop has pursued that goal with laser-like focus, while understanding that its approach must be nimble enough to evolve based on empirical data.

Nate Falkner, Single Stop’s Vice President of Strategy, says the organization makes better use of data than any he’s worked with. For example, they’ve used data to identify potential partners to help distribute their programs.

Early on, they looked at studies showing that giving community college students at risk of dropping out just two to three hundred dollars often made the difference between staying in school and leaving.

A lightbulb went off, and the Single Stop team realized community colleges were ideal partners. Single Stop’s programs could serve as a dropout prevention strategy for the colleges (on average, Single Stop clients receive benefits and services worth over $1,000), while the colleges could provide Single Stop with access to a large number of potential clients and an infrastructure through which to expand.

Similarly, after gathering data showing that the phrase “tax preparation support” carries less stigma than “government benefits” (think politically charged terms like “food stamps” and “welfare”), Single Stop refined its messaging to potential clients. They focused their outreach message on the tax preparation parts of their program, drawing in clients who later became interested in their other resources.

Let out your inner data nerd

When it comes to developing an “inspect and adapt” process, we recommend keeping the following in mind:

1. Schedule time for reflecting on process, and treat it as sacred.
It can be tempting to skip the retrospective when other things seem more pressing, but we’ve found that treating it as sacred has kept us sharp.

2. Minimize the risks associated with innovating on process.
We limit our experiments to one week, which allows us to try out some pretty dramatic ideas. You’ll often hear someone say, “It’s only for a week, guys” during our retrospective sessions. This reduces anxiety for people who tend to be averse to big changes.

3. Adapt the process; don’t move the goalposts.
As Nate says, “Our mission is ending poverty, and that doesn’t change. We’re being smart and nimble about how we approach that discussion and how we approach stakeholders on their terms.”

How have you used “inspect and adapt” techniques to innovate on your internal processes?

We do, so our developers put together a real-time visualization of our traffic around the world.

Our traffic from 2 p.m. today.

The height of each bar represents the number of visitors we have at any one time. This image is from 2 p.m. NYC time.

You can see that most of our visitors are coming from the East Coast of the U.S., while the West Coast is just starting the day.

We’re also getting quite a bit of traffic from Colombia, Peru, and Argentina to our Spanish-language site, Idealistas.org. It’s early evening in much of Europe and Africa, so we still have some visitors from France, Germany, Ghana, and Cameroon.

Hi there! We’re Kim and Diana, Idealist’s Community Support Team. We read and respond to all the messages you send us, monitor the site’s content, and generally help you get the most out of your Idealist experience.

Got a question about Idealist? Get in touch: idealist.org/contact-us

We learn all kinds of fascinating things on the site and in conversation with you. Here’s our first-ever Idealist Index (inspired by the old Harper’s feature), with a bunch of things we’ve spotted recently:

185,831: Number of people who registered on Idealist.org in 2011
9,590: Organizations that joined Idealist in 2011
10,430,742: Unique visitors to Idealist in 2011
20: Percent increase in number of jobs posted in January 2012 vs. January 2011

5: Babies born to Idealist staff members in the last year
0: Pencils in the Idealist.org NYC office

12: Listings on the site that include the word “ukulele”
1: Number of those listings located in Hawaii

One final number: 9,237. That’s how many messages we received through the Contact Us page in 2011. We love hearing from you, so please let us know what you think of Idealist, if it’s helped you connect with any geeky-nerdy Hawaiians, or if you have questions about any of these numbers. Leave a comment below or connect with us through our Idealist profiles: Kim’s here and Diana’s here.