Creativity Matters: The Arts and Aging Toolkit

8.4: Implementing the Evaluation

Most evaluators use a combination of tools to assess the efficacy of their programs and the satisfaction of partners and participants.

As you consider which tools best match what you need to learn about impact and whom you want to measure, consider these questions:

According to the American Society on Aging, quantitative methods are best to

Use qualitative methods to:

Regardless of the method, remember to collect only the information that you are going to use and use all of the information that you collect.

You may encounter concerns about confidentiality when assessing older adults. Promising discretion and anonymity and aggregating results are logical solutions. In addition, the privacy regulations mandated by the Health Insurance Portability and Accountability Act (HIPAA) are a barrier to collecting medical data, such as blood pressure readings or medication usage, in long-term care facilities; residents must sign waivers giving you access to this information.

Leaders of arts and aging programs universally comment that older adults vote with their feet, and they are rarely shy about sharing their opinions. While this is useful feedback, include a variety of evaluation tools to get an objective and credible picture of the program’s impact. In order of complexity, these tools are:

Attendance, Enrollment, and Membership Data

Use these three measures to obtain answers to basic quantitative questions, such as:

Informal Feedback  

Informal feedback is the information you receive from participants, caregivers, family members, partners, and anyone else involved with the program, plus what you overhear at the celebration of the community sharing of the art, or in the hallway. These anecdotes will often enliven an evaluation report or presentation. To gather them effectively, ask open-ended questions and listen.

Observation  

Informal or formal observation includes what you see with your own eyes and through the lens of a camera. Observation is often used with people with dementia, who may not be able to complete surveys or interviews themselves.

Informally, observe participants as they perform or exhibit and audience members who join you in honoring the art created by the older adults. In a session or rehearsal, note

For a more formal observation, use specific questions to ensure consistency among observers and across multiple sessions.

Program Example: Formal Observation  

The Visual Analog Recreational Music-Making (RMM) Assessment (VARMMA), developed by Dr. Barry Bittman, is designed to rate six parameters:

  1. Attentiveness (appearing connected to and observant of the RMM activity)
  2. Active participation (the state of actually performing the designated RMM activity)
  3. Socialization (positive interaction with others in the group)
  4. Positive mood/affect (a cooperative favorable disposition)
  5. Happiness/contentment/joy (signifies a pleasurable or satisfying experience)
  6. Meaningful self-expression (appropriateness of one’s contribution and actions to the program content)

The assessment uses a five-point rating scale: 0 = none; 1 = minimal; 2 = at times; 3 = often; and 4 = frequent.
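
The published instrument has its own forms and scoring procedures; purely as a rough illustration, the sketch below (with invented ratings and field names) shows how observers’ 0 to 4 ratings on the six parameters might be recorded and averaged across sessions to track change over time.

    # Rough sketch only: tallying observer ratings on the six VARMMA-style
    # parameters. The ratings and field names below are invented for
    # illustration and are not part of the published instrument.
    from statistics import mean

    PARAMETERS = [
        "attentiveness",
        "active_participation",
        "socialization",
        "positive_mood",
        "happiness_contentment_joy",
        "meaningful_self_expression",
    ]

    # One dict per observed session: parameter -> rating (0 = none ... 4 = frequent)
    sessions = [
        {"attentiveness": 3, "active_participation": 2, "socialization": 2,
         "positive_mood": 3, "happiness_contentment_joy": 3, "meaningful_self_expression": 1},
        {"attentiveness": 4, "active_participation": 3, "socialization": 3,
         "positive_mood": 4, "happiness_contentment_joy": 3, "meaningful_self_expression": 2},
    ]

    # Average each parameter across sessions to see where change is occurring.
    for parameter in PARAMETERS:
        ratings = [session[parameter] for session in sessions]
        print(f"{parameter}: mean rating {mean(ratings):.1f} across {len(ratings)} sessions")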

 

In addition to providing invaluable documentation that contributes to marketing, public relations, and advocacy efforts, videography is an effective evaluation tool. Instead of attending sessions and recording observations on the spot, videotape the session and conduct analysis later. A researcher at the University of Oklahoma is reviewing videos of three different sessions of the Alzheimer’s Poetry Project and cataloging the responses of participants. She will compare these findings with reactions to other types of intervention, such as pet therapy and volunteers reading aloud from the newspaper.

Program Surveys  

These surveys assess participants’ satisfaction with a program. They can also gauge motivations and expectations for participating, and they collect quantitative data as well as anecdotes gathered through open-ended questions. A short response card, rather than a longer survey, is the best way to get audience members’ reactions to a community sharing of the art. Traditionally, responses are anonymous, though you can include an optional name field.

The W. K. Kellogg Foundation has the following tips for making written evaluation surveys as effective as possible:

  1. Make the questions short and clear, ideally no more than 20 words. Be sure to give the respondents all the information they will need to answer the questions.
  2. Avoid questions that have more than one central idea or theme.
  3. Keep questions relevant to the problem.
  4. Do not use jargon. Your target population must be able to answer the questions you are asking. If they are not familiar with professional jargon, do not use it.
  5. Avoid words that are not exact (e.g., generally, usually, average, typically, often, and rarely). If you do use these words, you may get information that is unreliable or not useful.
  6. Avoid stating questions in the negative.
  7. Avoid introducing bias. Slanted questions will produce slanted results.
  8. Make sure the answer to one question relates smoothly to the next. For example, if necessary add “if yes . . . did you?” or “if no . . . did you?”
  9. Give exact instructions to the respondent on how to record answers. For example, explain exactly where to write the answers: check a box, circle a number, etc.
  10. Provide response alternatives. For example, include the response “other” for answers that don’t fit elsewhere.
  11. Make the questionnaire attractive. Plan its format carefully using subheadings, spaces, etc. Make the survey look easy for a respondent to complete. An unusually long questionnaire may alarm respondents.
  12. Decide beforehand how the answers will be recorded and analyzed.

Biased Question  

Don’t you agree that professional teaching artists should earn more money than they currently earn?

Unbiased Question

Do you believe professional teaching artist salaries are a little lower than they should be, a little higher than they should be, or about right?
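
Tip 12 above recommends deciding in advance how answers will be recorded and analyzed. As a rough illustration only, the sketch below codes the unbiased question above as three fixed response options plus a catch-all, then tallies invented responses; none of this comes from the Kellogg Foundation material.

    # Rough sketch: coding and tallying responses to the unbiased question above.
    # The responses are invented; a real analysis plan would be written before
    # the survey is administered, as tip 12 recommends.
    from collections import Counter

    RESPONSE_OPTIONS = ["a little lower", "about right", "a little higher"]

    # Responses as they might be transcribed from returned surveys.
    responses = ["about right", "a little lower", "a little lower", "", "about right"]

    counts = Counter(r if r in RESPONSE_OPTIONS else "no answer / other" for r in responses)
    total = len(responses)

    for option in RESPONSE_OPTIONS + ["no answer / other"]:
        n = counts.get(option, 0)
        print(f"{option}: {n} ({n / total:.0%})")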

 

Program Example: Program Survey Questions  

The Legacy Works program of Elders Share the Arts asks participants:

 

To administer a program survey, encourage participants to complete it at the last session; you will net a higher return rate than with a mail-in survey. For audience surveys or response cards, include them in performance and exhibit programs and have a staff member or volunteer hand them out to attendees. Set up several visible boxes to collect the completed forms, and don’t forget to provide pencils or pens.

Appendix 7 includes two examples of evaluation surveys: the New Horizons Music Evaluation Form and the Empowerment Group Care Partner Survey, which is targeted to family caregivers.

 

Interviews  

Interviews enable you to explore issues in depth and follow up on answers provided in written surveys. Follow these steps when planning and conducting interviews:

  1. Determine a sample of participants to interview.
  2. Schedule interviews.
  3. Develop an interview guide of evaluation questions.
  4. Conduct the interview.
  5. Listen well.
  6. Probe as needed to clarify answers or elicit more details.
  7. Take notes.
  8. Summarize on a form.

Select interviewees who represent the diversity of participants in terms of age, gender, ethnicity, education, experience in the art form, and other factors.

Focus Groups  

Focus groups are more efficient than interviews because one person’s comments often stimulate others to contribute related ideas. Sometimes people are less candid and more cautious in a group, however, because they feel intimidated by the perceived or actual social status of others. When you use this method, ask the same questions as you would in a one-on-one interview, and assemble the group using similar criteria.

Portfolio Review  

For the visual or literary arts, you can use the art created by older adults, and by students in intergenerational programs, to assess change over the course of a program. Review their respective portfolios at the beginning and end of the program to measure skill development. In the performing arts, you can assess skill development over time through participants’ performances. Of course, we have made the point repeatedly that older adults benefit as much from the process of making art as from the product; nevertheless, portfolio assessment is an option.

Experimental Design  

Even though experimental design sounds intimidating and may be time and labor intensive, it is an effective means of capturing how participants change because of your program. Both written surveys and guided observation that test knowledge, beliefs, attitudes, behaviors, skills, condition, or status fit in this category when they are administered before and after the sessions. This method is also known as a pre-test/post-test design. The obvious challenge is that older adults—and young people in an intergenerational program—may be affected by other variables as diverse as pet therapy, church, or another class.

Adding a control group to your experimental design further enhances the evaluation because it helps eliminate alternate explanations of experimental results. Members of a control group who do not participate in the program must be comparable in terms of numbers and demographics to the experimental group—the participants. Once you identify control group members, administer the same pre- and post-test at roughly the same time.
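
As a rough illustration of the pre-test/post-test logic (with invented scores, not drawn from any instrument named in this chapter), the sketch below compares the average change in the participant group with the average change in a comparable control group.

    # Rough sketch of a pre-test/post-test comparison with a control group.
    # Scores are invented; a real evaluation would use an established survey
    # (see Appendix 7) and its own scoring key.
    from statistics import mean

    participants_pre = [12, 15, 11, 14, 13]
    participants_post = [16, 18, 14, 17, 15]

    control_pre = [13, 14, 12, 15, 13]
    control_post = [13, 15, 12, 14, 14]

    def average_change(pre, post):
        """Mean per-person change from pre-test to post-test."""
        return mean(after - before for before, after in zip(pre, post))

    program_change = average_change(participants_pre, participants_post)
    control_change = average_change(control_pre, control_post)

    print(f"Participants changed by {program_change:+.1f} points on average")
    print(f"Control group changed by {control_change:+.1f} points on average")
    # The gap between the two changes is the portion of improvement that cannot
    # be explained by outside influences shared with the control group.
    print(f"Change attributable to the program: {program_change - control_change:+.1f} points")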

If you have not planned adequately for evaluation or have a limited budget and want to use a modified experimental design, ask respondents at the end of the program to reflect on how they have changed.

Surveys or tests that are appropriate for use in an experimental design evaluation include:

Be aware that analyzing some established measurement tools may require special expertise or a specific scoring key. Research these tools on the Internet, or consult with your evaluation partner. Your area agency on aging or your local or state arts agency may have a staff member knowledgeable about evaluation or be able to point you to someone who can help.

Appendix 7 includes an example of a survey used in experimental design evaluation: the Empowerment Group Survey: Initial and Six Months.



Appendix 7: Evaluation Tools

Memories in the Making Evaluation Instrument

New Horizons Music Evaluation Form

Empowerment Group Care Partner Survey

Empowerment Group Survey: Initial

Empowerment Group Survey: Six Months

How to Measure an Objective