
Greetings! Caitlyn A. Bukaty here sharing some exciting insights with you during Disabilities and Underrepresented Populations Topical Interest Group week.

Today I want to offer a few ideas my fellow evaluators might find helpful in making their evaluations more accessible to a wide range of stakeholders. This information comes from my experience collecting feedback from young adults with intellectual disabilities who participated in a workplace problem-solving intervention, but one of my favorite features of these techniques is how helpful they are to a wide range of stakeholders! This concept is known as Universal Design. The premise is that an option you might offer to one group of stakeholders, for example those who have difficulty reading, actually makes accessing your evaluation materials easier for other groups, such as stakeholders for whom English is not their first language or those with visual impairments.

Without further ado, let’s explore some ideas to help your evaluations reach for the stars in terms of accessibility!

Hot Tips:

Add pictures – A well-connected photo can help stakeholders link a question to a certain event, or clarify a response.

In this example, a series of questions is linked to a particular part of the intervention using a picture of the person with whom participants interacted:

Go digital – Offering traditionally “print” materials in digital format opens up a universe of accessibility for stakeholders. Users can access screen reader software, text-to-speech features, and even translation applications to better understand the material. This is even more effective if materials are offered on a mobile-friendly platform, as mobile web access is widely reported to have overtaken desktop computer use.

Be all ears – Prepare to accept responses from your stakeholders in a variety of creative ways. Offering stakeholders multiple options for response may mean gathering responses from those who would not have been able to participate via a single mode of response. Written or typed responses to forced-choice and open-ended questions may be traditional, but what if someone wants to dictate a response…can you make a scribe available in person or via telephone to support his or her participation? How about a participant wishing to record a response? This can be achieved via a voice or video recorder on many mobile devices. Depending on the question, a pictorial response, such as indicating time spent on a circle graph, might even encourage respondents to participate.

Rad Resources:

Creative Commons Zero (CC0) Imagery – This is the name given to images free from copyright restrictions. In addition to taking or requesting photos specific to the topic of your evaluation, there are resources linking you directly to CC0 images, such as Pixabay and Unsplash. Web search platforms such as Google Images also allow you to specify reuse policies during an image search.

The idea behind today’s post is to maximize stakeholder participation by inviting them to take part in an evaluation in whatever way is most convenient and effective. To learn more about universal design geared towards materials development and response check out the Universal Design for Learning materials offered through CAST.

The American Evaluation Association is hosting the Disabilities and Underrepresented Populations TIG (DUP) Week. The contributions all week are focused on engaging DUP in your evaluation efforts. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Mimi Doll, and I am the owner of Candeo Consulting, Inc., an independent consulting firm that builds organizations’ capacity to create meaningful change in the communities they serve. Today I’d like to talk about scope creep. Sometimes we can prevent scope creep with good planning; other times, no matter how good our preparation is, clients either don’t have a clear sense of what they want or simply change their minds.

Hot Tip:

Always Develop a Scope of Services and Contract. Developing a detailed scope of services, including project tasks, work hours, pricing, timeline, and roles and responsibilities, makes clear to the client what services and deliverables you plan to provide, and those you don’t. Your scope serves as a communication tool about how you will proceed with the project and provides your client an opportunity to react and clarify their expectations about the work. Similarly, your contract lays out a legally enforceable agreement about how you and your client will conduct business together, including key issues such as services offered, payment terms, data ownership, contract termination, and renewability. Should you reach that “worst case” scenario when you and your client reach an impasse, your contract makes clear the parameters to which you’ve agreed.

Hone Those Communication Skills. Sometimes there are client-consultant disagreements about how a project should proceed, even after the contract has been signed. These moments call for strong communication skills: listen actively to your client, state your positions clearly, manage strong emotions (yours and your client’s), and maintain professionalism. Remember, conflicts often arise from differing perceptions of a situation rather than objective facts; it’s important to be able to take the client’s perspective. Make your goal coming to a mutual agreement.

Be Clear on Your Own Standards. When the client’s expectations about the project change between start and finish of the work, it’s important to be clear about your own standards by writing them down. Consider the following:

Logistics & Scope Changes: How does this impact your project’s time frame, budget, and staffing? Where can you be flexible and where can you not? Do alterations erase company profits or place too great a burden on your time and staffing capacity?

Work Quality/Integrity & Scope Changes: Do requested alterations reduce the quality or rigor of data collection, create conflicts of interest, or lessen the impact of your work? In some cases these decisions are clearly outlined by professional standards, while other times we must develop our own professional standards.

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week with our colleagues in the CEA AEA Affiliate. The contributions to aea365 all this week come from our CEA members.

My name is Kate Rohrbaugh and I am Co-Chair of the Business, Leadership, and Performance TIG along with Michelle Baron. I’m a Research Team Leader at a consulting firm in Virginia leading a group studying capital project organizations and teams in the process industries. Today I’d like to talk about the renaming of our TIG and the tools we used to conduct this work.

When I accepted my current position five years ago, I had to rethink my AEA TIG membership because I had been a faithful member of education-related TIGs, which were no longer relevant to my work. The number of TIGs at AEA can be overwhelming at times, but it also offers a wide variety of “homes” to evaluators regardless of content area. In my new position I turned to the Business and Industry TIG, where I found a small but dedicated group of professionals. I “lurked” with this group for a year, and within a short time (since it was a smaller group), I was able to take an active role in the leadership of this TIG.

In discussions with the leadership of the TIG and at AEA, we determined that the name of the TIG was unnecessarily limiting both presenters and audience – evaluation issues in for-profit organizations are relevant to a wide variety of evaluation professionals in both private and public sectors. For this reason, we canvassed the membership and, working closely with the AEA staff and board, identified a new name for our TIG.

Rad Resources

AEA maintains a list of members in each TIG and faithfully protects the membership from unnecessary contact, but this list was a great resource for contacting our members about the desire to change the name of the TIG and soliciting ideas for renaming it.

To canvass our membership, we turned to old faithful Survey Monkey, which met our simple needs for collection and analysis.

To discuss the results with the TIG leadership located across and outside the United States, we turned to FreeConferenceCall.com, which is exactly what you think it is.

We are excited about AEA 2012 in Minneapolis and hope to see lots of new faces at our presentations and business meeting!

The American Evaluation Association is celebrating the Business, Leadership, and Performance TIG (BLP) Week. The contributions all week come from BLP members.

My name is John Kramer and I am currently a Research Fellow at the Institute for Community Inclusion, University of Massachusetts Boston. My work focuses on research and evaluation of employment outcomes of people with disabilities, participatory research, and aging issues for families of people with disabilities. I am a new member of the American Evaluation Association.

Tip:

Universal Design Principle 3 is “simple and intuitive.” Incorporating clear, simple language in your writing, while also providing concrete, everyday examples, improves access in two ways:

it clarifies your intention as a writer and helps you focus on the basic idea you are trying to convey

it allows for more stakeholder access and participation.

Hot tips:

Use plain language. This means substituting simpler words for more complex ones. It also means writing sentences that are free of excessive subordination. Also, try to avoid unnecessary modifiers like “really, totally, very, only, quite,” which may interfere with clarity.

Use concrete, accessible examples including images when helpful. Try to think of examples to illustrate your writing that are easy to picture and relate to. Using images is a good approach as well when appropriate.

Use clear, parallel examples in your writing. For instance, if you frame an example as noun, verb, recipient noun, then make sure all your examples use the same order of presentation.
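The plain-language tip above about cutting unnecessary modifiers can even be partly automated. Here is a minimal sketch of a checker that flags the filler words the post lists; the function name and word list are illustrative, not part of any published tool:

```python
# Hypothetical sketch: flag filler modifiers in a draft so the writer can
# consider cutting them. The word list comes from the tip above.
import re

FILLER_MODIFIERS = {"really", "totally", "very", "only", "quite"}

def flag_fillers(text):
    """Return (word, character position) pairs for filler modifiers in text."""
    hits = []
    for match in re.finditer(r"[A-Za-z']+", text):
        if match.group().lower() in FILLER_MODIFIERS:
            hits.append((match.group(), match.start()))
    return hits

draft = "This is a really very simple check."
for word, pos in flag_fillers(draft):
    print(f"Consider cutting '{word}' at position {pos}")
```

A flagged word is not always wrong, of course; the point is simply to prompt a second look at each one.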

Rad Resources

There are many good resources for how to incorporate plain language and images into your work. A few especially helpful ones around the web are:

Plainlanguage.gov – A website by the United States federal government that gives useful strategies and examples for using plain language.

Grammar Girl – A website that provides basic tips and tricks to clarify your writing. Not for cognitive access per se, but elements can be useful in UD.

Picture Planner – A website that illustrates an example of how pictures can be used to facilitate cognitive access.

Creative Commons – Here you can find free pictures that you can use, often with attribution, to illustrate your work and writing.

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members.

My name is Kylie Hutchinson. I am an independent evaluation consultant and trainer with Community Solutions Planning & Evaluation. I give regular webinars and workshops on evaluation topics for the AEA and CES and tweet weekly at @EvaluationMaven.

Have you ever felt bored sitting in a conference presentation, or frustrated by a presenter with poor delivery skills? I know that I have, particularly when I see great content getting lost in a lackluster or poorly-paced presentation. At a recent conference I amused myself during several sessions by starting a list of 25 Do’s and Don’ts for effective conference presentations. Even if you’re not the world’s greatest public speaker, there are many simple, low-tech things you can do to dramatically increase your audience’s engagement and your effectiveness as a presenter.

Plan to arrive at the annual conference on Wednesday October 24. The Data Visualization and Reporting TIG is teaming up with the Potent Presentations Initiative to hold a Slide Clinic, from 6:10-7pm. Bring your slide deck to get one-on-one coaching and advice from trained TIG members before you present. Keep an eye on the DVRTIG and p2i sites for details on the Slide Clinic’s location.


When combining survey data with in-depth interviews, national guidelines can help. Our UMMS evaluation team, with expertise in quantitative and qualitative methods, is studying the Massachusetts Patient Centered Medical Home (PCMH) Initiative. In this project, 46 primary care practices with varying amounts of PCMH experience will transform over a 3-year period and achieve National Committee for Quality Assurance (NCQA) PCMH recognition. Three members from each practice completed a quantitative survey as the baseline assessment of medical home competency.

The assessment results surprised us. A group of practices with two years of PCMH experience scored lower than the novice groups, when we had expected just the opposite. So we looked to our qualitative results, comparing code summary reports to the quantitative results. The NIH mixed methods guide calls this approach to integrating multiple forms of data ‘merging’.

The guide describes ‘connecting’ as well. To connect, we included the quantitative analyses in the semi-structured guides used for subsequent qualitative data collection. With these results we understood the novice groups’ advantage. Integrating data further reinforced the importance of teamwork in evaluation work.
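The ‘merging’ approach described above amounts to pairing each practice’s quantitative score with its qualitative code summary for side-by-side review. A minimal sketch of that pairing follows; the practice IDs, scores, and summaries are invented for illustration:

```python
# Hypothetical sketch of 'merging' quantitative and qualitative data by practice.
# All data values here are invented for illustration.

survey_scores = {            # baseline medical-home competency scores
    "practice_A": 62,
    "practice_B": 78,
}
code_summaries = {           # summaries of qualitative interview codes
    "practice_A": "team distracted by a competing records rollout",
    "practice_B": "strong leadership buy-in",
}

def merge_by_practice(scores, summaries):
    """Pair each practice's score with its code summary for side-by-side review."""
    merged = {}
    for practice in scores.keys() & summaries.keys():
        merged[practice] = {"score": scores[practice],
                            "summary": summaries[practice]}
    return merged

for practice, record in sorted(merge_by_practice(survey_scores,
                                                 code_summaries).items()):
    print(practice, record["score"], "-", record["summary"])
```

In practice the merged view is read jointly by the quantitative and qualitative team members, which is exactly where a puzzling score can be explained by the interview data.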

Lessons Learned:

Form an interdisciplinary team. We established a ‘mixed methods subgroup’ in which quantitative and qualitative team members work jointly rather than in parallel. In a team the focus shifts from ‘this approach versus that approach’ to ‘what approach works best’. Regular meeting times allow the members to learn to work together. Our team originally formed to investigate a single puzzling result but has expanded its work to merge quantitative and qualitative staff satisfaction data.

Connect your data. We plan to continue using quantitative results in semi-structured interview guides to collect qualitative data. The qualitative results provided an in-depth understanding of the quantitative assessment and the opportunity for interviewees to comment on their practices’ transformation.

Rad Resources:

Best Practices for Mixed Methods Research in the Health Sciences – The National Institutes of Health Office of Behavioral and Social Sciences Research commissioned this guide, released in 2010. Easily accessible online, it contains seven sections of advice for conducting a mixed methods project, along with lists of key references and resources.

The American Evaluation Association is celebrating Mixed Methods Evaluation TIG Week. The contributions all week come from MME members.