Most evaluators use a combination of tools to assess the efficacy of their programs and the satisfaction of partners and participants.
As you consider which tools best match what you need to learn about impact and whom you want to measure, consider these questions:
According to the American Society on Aging, quantitative methods are best to:
Use qualitative methods to:
Regardless of the method, remember to collect only the information you are going to use, and to use all of the information you collect.
You may encounter confidentiality concerns when assessing older adults. Logical solutions include promising discretion, keeping responses anonymous, and reporting only aggregated results. In addition, the privacy regulations mandated by the Health Insurance Portability and Accountability Act are a barrier to collecting medical data, such as blood pressure or medication usage, in long-term care facilities. Residents must sign waivers allowing you access to this information.
Leaders of arts and aging programs universally comment that older adults vote with their feet. And they are rarely shy about sharing their opinions with you. While this is useful feedback, include a variety of evaluation tools to get an objective and credible picture of the program’s impact. In order of complexity, these tools are:
Use these three measures to obtain answers to basic quantitative questions, such as:
Informal feedback is information you receive from participants, caregivers, family members, partners, and anyone else involved with the program, plus what you overhear at the community sharing of the art or simply in the hallway. These anecdotes will often enliven an evaluation report or presentation. To gather them effectively, ask open-ended questions and listen.
Informally, observe participants as they perform or exhibit and audience members who join you in honoring the art created by the older adults. In a session or rehearsal, note
For a more formal observation, use specific questions to ensure consistency among observers and across multiple sessions.
The Visual Analog Recreational Music-Making (RMM) Assessment (VARMMA), developed by Dr. Barry Bittman, is designed to rate six parameters:
The assessment uses a five-point rating scale: 0 = none; 1 = minimal; 2 = at times; 3 = often; and 4 = frequent.
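Ratings collected on a scale like this can be averaged per parameter across sessions to show trends over time. The sketch below is a hypothetical illustration in Python; the parameter names are placeholders, since the six actual VARMMA parameters are not listed in this excerpt.

```python
from statistics import mean

# Placeholder names -- the six actual VARMMA parameters are not listed here.
PARAMETERS = ["param_1", "param_2", "param_3", "param_4", "param_5", "param_6"]

# Rating scale: 0 = none; 1 = minimal; 2 = at times; 3 = often; 4 = frequent

def average_ratings(sessions):
    """Average each parameter's 0-4 rating across observed sessions.

    `sessions` is a list of dicts mapping parameter name -> rating.
    """
    return {p: mean(s[p] for s in sessions) for p in PARAMETERS}

# Invented example: one participant's ratings from three sessions
sessions = [
    {"param_1": 2, "param_2": 3, "param_3": 1, "param_4": 4, "param_5": 0, "param_6": 2},
    {"param_1": 3, "param_2": 3, "param_3": 2, "param_4": 4, "param_5": 1, "param_6": 2},
    {"param_1": 4, "param_2": 2, "param_3": 3, "param_4": 3, "param_5": 2, "param_6": 3},
]
averages = average_ratings(sessions)
print(averages["param_1"])  # average of 2, 3, 4
```

Averaging this way makes it easy to compare early sessions with later ones, or one participant group with another.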
In addition to providing invaluable documentation that contributes to marketing, public relations, and advocacy efforts, videography is an effective evaluation tool. Instead of attending sessions and recording observations on the spot, videotape the session and conduct analysis later. A researcher at the University of Oklahoma is reviewing videos of three different sessions of the Alzheimer’s Poetry Project and cataloging the responses of participants. She will compare these findings with reactions to other types of intervention, such as pet therapy and volunteers reading aloud from the newspaper.
These surveys assess participants’ satisfaction with a program. They can also gauge motivations and expectations for participating, and collect both quantitative data and, through open-ended questions, anecdotes. To get audience members’ reactions to the community sharing of the art, use a short response card rather than a longer survey. Traditionally, responses are anonymous, though you can include an optional name field.
The W. K. Kellogg Foundation has the following tips for making written evaluation surveys as effective as possible:
Biased: Don’t you agree that professional teaching artists should earn more money than they currently earn?
Neutral: Do you believe professional teaching artist salaries are a little lower than they should be, a little higher than they should be, or about right?
The Legacy Works program of Elders Share the Arts asks participants:
To administer a program survey, encourage participants to complete it at the last session. You will net a higher return rate than with a mail-in survey. When conducting audience surveys or response cards, include them in performance and exhibit programs, and have a staff member or volunteer hand them out to attendees. Set up several visible boxes to collect the completed forms. Don’t forget to provide pencils or pens.
Interviews enable you to explore issues in depth and follow up on answers provided in written surveys. Follow these steps when planning and conducting interviews:
1. Determine a sample of participants to interview.
2. Schedule interviews.
3. Develop an interview guide of evaluation questions.
4. Conduct the interview.
5. Listen well.
6. Probe as needed to clarify answers or elicit more details.
7. Take notes.
8. Summarize on a form.
Select interviewees who represent the diversity of participants in terms of age, gender, ethnicity, education, experience in the art form, and other factors.
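One way to draw a representative sample is to stratify participants on a key attribute and pick a few interviewees from each group. The following Python sketch is a hypothetical illustration; the attribute name and roster data are invented.

```python
import random
from collections import defaultdict

def stratified_sample(participants, key, per_group, seed=0):
    """Pick up to `per_group` interviewees from each stratum.

    `participants` is a list of dicts; `key` names the attribute to
    stratify on (e.g. gender, age band, experience in the art form).
    """
    rng = random.Random(seed)  # fixed seed so the draw is repeatable
    groups = defaultdict(list)
    for p in participants:
        groups[p[key]].append(p)
    sample = []
    for members in groups.values():
        sample.extend(rng.sample(members, min(per_group, len(members))))
    return sample

# Invented roster, stratified by experience in the art form
participants = [
    {"name": "A", "experience": "beginner"},
    {"name": "B", "experience": "beginner"},
    {"name": "C", "experience": "experienced"},
    {"name": "D", "experience": "experienced"},
    {"name": "E", "experience": "experienced"},
]
interviewees = stratified_sample(participants, "experience", per_group=1)
print(len(interviewees))  # 2 -- one per experience level
```

In practice you would stratify on several attributes at once, but the principle is the same: every group of interest contributes at least one voice.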
Focus groups are more efficient than interviews because one person’s comments often stimulate others to contribute related ideas. Sometimes people are less candid and more cautious in a group, however, because they feel intimidated by the perceived or actual social status of others. When you use this method, ask the same questions as you would in a one-on-one interview, and assemble the group using similar criteria.
For visual or literary arts, you can use the art created by older adults—and students—to assess change over the course of a program. Review their respective portfolios at the beginning and end of the program to measure skill development. In the performing arts, you can assess skill development over time through participants’ performances. Of course, we have made the point repeatedly that older adults benefit as much from the process of making art as from the product; nevertheless, portfolio assessment is an option.
Even though experimental design sounds intimidating and may be time and labor intensive, it is an effective means of capturing how participants change because of your program. Both written surveys and guided observation that test knowledge, beliefs, attitudes, behaviors, skills, condition, or status fit in this category when they are administered before and after the sessions. This method is also known as a pre-test/post-test design. The obvious challenge is that older adults—and young people in an intergenerational program—may be affected by other variables as diverse as pet therapy, church, or another class.
Adding a control group to your experimental design further enhances the evaluation because it helps eliminate alternate explanations of experimental results. Members of a control group who do not participate in the program must be comparable in terms of numbers and demographics to the experimental group—the participants. Once you identify control group members, administer the same pre- and post-test at roughly the same time.
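The pre-test/post-test comparison with a control group reduces to simple arithmetic: subtract the control group's average change from the participants' average change, leaving the change attributable to the program. A minimal Python sketch, with invented scores:

```python
from statistics import mean

def mean_change(pre, post):
    """Average post-minus-pre change across matched respondents."""
    return mean(b - a for a, b in zip(pre, post))

# Invented pre/post scores on the same attitude survey
program_pre, program_post = [10, 12, 11, 13], [14, 15, 13, 16]
control_pre, control_post = [11, 10, 12, 13], [12, 10, 13, 13]

program_change = mean_change(program_pre, program_post)  # average gain: 3.0
control_change = mean_change(control_pre, control_post)  # average gain: 0.5

# Change attributable to the program, net of outside influences
net_effect = program_change - control_change
print(net_effect)  # 2.5
```

Here the control group also improved slightly (perhaps from those other variables, like pet therapy or another class), so the program's net effect is smaller than the participants' raw gain.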
If you have not planned adequately for evaluation, or have a limited budget and want to use a modified experimental design, ask respondents at the end of the program to reflect on how they have changed; this approach is known as a retrospective pre-test.
Surveys or tests that are appropriate for use in an experimental design evaluation include:
Be aware that analyzing some established measurement tools may require special expertise or a specific scoring key. Research these types of tools on the Internet, or consult with your evaluation partner. Your area agency on aging, or your local or state arts agency, may have a staff member knowledgeable about evaluation or be able to point you to someone who can help.
Appendix 7: Evaluation Tools