You have probably heard of a FOIA (Freedom of Information Act) request, most likely in the context of journalism. Journalists often submit FOIA requests to obtain information that is not otherwise publicly available but is key to an investigative reporting project.

There may be times when you, as an evaluator, are researching a topic and your work could be enhanced with information that requires submitting a FOIA request. For instance, while working as EvaluATE's external evaluator, The Rucks Group needed to complete a FOIA request to learn how evaluation plans in ATE proposals have changed over time and to document how EvaluATE may have influenced those changes. Toward that goal, we sought to review a random sample of ATE proposals funded between 2004 and 2017. However, despite much effort over an 18-month period, we still needed to obtain nearly three dozen proposals. We had to get these proposals via a FOIA request primarily because the projects were older and we were unable to reach either the principal investigators or the appropriate person at the institution. So we submitted a FOIA request to the National Science Foundation (NSF) for the outstanding proposals.

For me, this was a new and, at first, mentally daunting task. Now, having gone through the process, I realize I need not have been nervous, because completing a FOIA request is actually quite simple. These are the elements you need to provide:

Nature of request: We provided a detailed description of the proposals we needed and what we needed from each proposal. We also provided the rationale for the request, but I do not believe a rationale is required.

Delivery method: Identify the method through which you prefer to receive the materials. We chose to receive digital copies via a secure digital system.

Budget: Completing the task could require special fees, so you will need to indicate how much you are willing to pay for the request. Receiving paper copies through the US Postal Service can be more costly than receiving digital copies.

It may take a while for the FOIA request to be fulfilled. We submitted the request in fall 2018 and received the materials in spring 2019. The delay may have been due in part to the 35-day government shutdown and a possibly lengthy process for principal investigator approval.

The NSF FOIA office was great to work with, and we appreciated staffers’ communications with us to keep us updated.

Because access is granted only for a particular time, pay attention to when you are notified via email that the materials have been released to you. In other words, do not let this notice sit in your inbox.

One caveat: When you submit a FOIA request, you may be encouraged to acquire the materials through other means. Submitting a FOIA request to colleges or state agencies may be an option for you.

While FOIA requests should be made judiciously, they are useful tools that, under the right circumstances, could enhance your evaluation efforts. They take time, but thanks to the law backing the public’s right to know, your FOIA requests will be honored.

Survey developers typically spend a great deal of time on the content of questionnaires. We struggle with what items to include, how to word each question, and whether an item should be closed-ended or open-ended; the list of considerations goes on. After all that effort, we generally spend far less time on a small element that is incredibly important to web surveys: the subject line.

I have come to appreciate the extent to which the subject line acts as a "frame" for a survey. In simplistic terms, a frame is how a concept is categorized. Framing is the difference between calling an unwanted situation a challenge versus a problem. A significant body of literature suggests that the nature of a frame will produce particular types of behaviors. For instance, my firm recently disseminated a questionnaire to gain feedback on the services that EvaluATE provides. As shown in the chart below, we initially received about 100 responses. With that questionnaire invitation, we used the subject line EvaluATE Services Survey. Based on past experience, we would have expected the next dissemination to garner about 50 responses, but we got closer to 90. So what happened? We had started playing with the subject line.

EvaluATE’s Director, Lori Wingate, sent out a reminder email with the subject line, What do you think of EvaluATE? When we sent out the actual questionnaire, we used the subject line, Tell us what you think. For the next two iterations of dissemination, we had slightly higher than expected response rates.

For the third dissemination, Lori conducted an experiment. She sent out reminder notices but manipulated the subject lines. There were seven different subject lines in total, each sent to about 100 different individuals. The actual questionnaire disseminated had a constant subject line of Would you share your thoughts today? As you see below, the greatest response rate occurred when the subject line of the reminder was How is EvaluATE doing?, while the lowest response rate was when Just a few days was used.
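For readers who want to try a similar experiment, the random-assignment design described above can be sketched in a few lines of Python. This is an illustrative sketch only, not EvaluATE's actual tooling: the recipient addresses are placeholders, and aside from the two subject lines named in this post, the candidate subject lines are hypothetical.

```python
import random

def assign_subject_lines(recipients, subject_lines, seed=42):
    """Shuffle recipients and deal them into roughly equal groups,
    one group per candidate subject line."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    groups = {line: [] for line in subject_lines}
    for i, email in enumerate(shuffled):
        groups[subject_lines[i % len(subject_lines)]].append(email)
    return groups

def response_rate(responded, group):
    """Fraction of a group's recipients who appear in the set of responders."""
    return sum(1 for email in group if email in responded) / len(group)

# Hypothetical example: 700 recipients split across 7 subject lines.
recipients = [f"person{i}@example.org" for i in range(700)]
lines = [
    "How is EvaluATE doing?",   # named in the post (highest response rate)
    "Just a few days",          # named in the post (lowest response rate)
    "Subject line C",           # remaining lines are placeholders
    "Subject line D",
    "Subject line E",
    "Subject line F",
    "Subject line G",
]
groups = assign_subject_lines(recipients, lines)
for line, group in groups.items():
    print(f"{line!r}: {len(group)} recipients")
```

After the reminders go out, feeding each group and the set of completed responses into `response_rate` gives the per-subject-line comparison the experiment relies on.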

These results aren’t completely surprising. In the 2012 presidential election, the Obama campaign devoted much effort to identifying subject lines that produced the highest response rates. They found that a “gap in information” was the most effective. Given that explanation, one may wonder why the subject line Just a few days garnered the lowest response rate, since it too presents a gap in information. The reason is unclear. One possibility is that an incongruity between the sense of urgency implied by the subject line and the actual importance of the email’s topic made respondents feel tricked, so they opted not to complete the survey.

Taken together, these findings tell us that a “rose by any other name would not smell as sweet”: what something is called does make a difference. So when you are designing your next web survey, make sure crafting the subject line is part of the design process.

Deciding what to measure (and what not to measure) to gather evidence of impact can be a daunting task, but it doesn’t need to be. In this webinar, Lana Rucks, an ATE external evaluator, will provide a step-by-step approach to deciding what should be measured as an indication of impact. Using an actual ATE project as a framework, attendees will learn how the varying aspects of evaluation (e.g., logic modeling, operationalizing variables, triangulation) come together in the real world. Regardless of whether you are in the planning phase or have already started implementing your project, you’ll walk away knowing how to better communicate the story of your project/center’s impact.

EvaluATE is supported by the National Science Foundation under grant numbers 0802245, 1204683, 1600992, and 1841783. Any opinions, findings, and conclusions or recommendations expressed on this site are those of the authors and do not necessarily reflect the views of the National Science Foundation.