Tips & Takeaways
- Listen to your families. Evaluation is the only surefire way to ensure you're giving families what they want and producing the most successful product you can.
- You can do evaluation in-house, cheaply, and quickly. Tips below will help.
- Even small amounts of data will uncover trends that will help you design and enhance your program or exhibit.
Quick & Easy To-Dos
- When you're choosing a program or exhibit topic, do a quick survey asking 10 families what interests them. You may be surprised by the results. You can use a survey tool like this one from the USS Constitution Museum.
- Developing a text panel, object label, or program directions? Put out a draft and observe how your public interacts with it. Are they reading your labels? Following directions correctly? Having a family conversation prompted by your design? If not, you may need to revise. When appropriate, ask a couple of families an open-ended, non-leading question or two that will help you understand why the element isn't doing what you want it to do.
- An easy way to find out how visitors are using your space is a Timing & Tracking Study. Watch 15 families and you'll start to see trends that can inform future design.
Here is an example of an easy evaluative tool to discover which program or exhibit topics families find most interesting.
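A quick topic survey like this can be tallied with pen and paper, but if you capture responses digitally, a few lines of Python rank the topics for you. This is a minimal sketch; the topic names and responses are invented for illustration:

```python
from collections import Counter

# Hypothetical responses from a quick 10-family topic survey
# (topic names are made up for illustration).
responses = [
    "life at sea", "naval battles", "life at sea", "shipbuilding",
    "life at sea", "naval battles", "shipbuilding", "life at sea",
    "famous sailors", "life at sea",
]

tally = Counter(responses)

# Print topics from most to least popular.
for topic, count in tally.most_common():
    print(f"{topic}: {count} of {len(responses)} families")
```

Even with only ten families, a ranking like this is usually enough to show which one or two topics clearly lead the pack.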
Why Listen to Your Visitors?
“Incorporate audience evaluation in our decision-making process. Asking, listening to, and learning from visitors is of paramount importance.”
Fischer, Daryl, Best Practices in Cultivating Family Audiences, 4.
You may think you've covered all the bases at the design table. But you'd be surprised at what you realize when you bring your exhibit or program in front of a real family audience. The USS Constitution Museum's entire All Hands on Deck exhibit was designed by prototyping and evaluating with its intended audience, families. As Director of Exhibits Robert Kiihne explains in this article, the exhibit's original design and the final design are markedly different thanks to the families who helped shape the exhibit through their participation and feedback.
Researcher Minda Borun's article, "Why Listen to the Visitor?," recounts a particular example from the All Hands on Deck exhibit: testing visitor preferences among three different voices in exhibit labels (contemporary questions, historical questions, and (simulated) quotes) and 1st- vs. 3rd-person voice. Evaluating this aspect, along with the entire exhibit, resulted in a successful, award-winning family engagement exhibit that would not have been possible with a behind-closed-doors, one-and-done design.
You Can Do This: Evaluating Success and Failure on a Budget
How do you know if you've achieved your goals? Do you have to hire a big, expensive firm to get you the data you need? No. You can do it cheaply, simply, and effectively in-house by practicing Little E versus Big E evaluation. In this video, Heather Nielsen and Beverly Sheppard discuss some ways you can evaluate success in-house with your own staff and still get the data you need:
What You Need to Know
5 Types of Evaluation
1. Front-End Evaluation
Front-end evaluation is conducted at the beginning of a project, when you are developing an exhibit’s or program’s themes and content. It concentrates on getting input from your potential audience to find out what they know, what they would like to know, and how this information could be presented in meaningful ways. It will often illustrate misconceptions and surprising interests audiences bring to your subject matter. Minda Borun gives several examples in this article.
Front-end evaluation can:
- Help you develop clear themes that mean the same thing to you and your audience
- Lead to specific content and presentation elements to counter misconceptions about your subject
- Identify content that really interests your audience
- Identify content your audience does not and will not care about
- Provide your audience with input into the exhibit development process
- Get some core visitors excited about your upcoming project
- Help you develop a successful PR plan
Front-end evaluation can take the form of interviews, focus groups, or surveys.
2. Formative Evaluation
Formative evaluation is conducted when you have something to show your audience.
You might try testing:
- Drafts of text panels, instructions, or even object labels
- Interactive prototypes
- Rough cuts of media pieces
Try showing a clipboard with two rough versions of a text panel to your visitors. Do they have a preference? Would they read the whole piece? Are they clear about what you are trying to communicate?
The USS Constitution Museum found formative evaluation extremely helpful. In this article, Minda Borun discusses one occasion in which the formative evaluation process completely changed an exhibit element at the USS Constitution Museum, resulting in a far better and far more effective multi-generational experience.
You can find many more examples throughout the prototyping pages on this site. Many times during formative evaluations our audience suggested changes and additions we had not considered that improved the overall product significantly. We have come to see formative evaluation as making the audience a partner in exhibit development. Building exhibits is expensive, and formative evaluation cannot be skipped if you are to be successful – especially with a family audience.
3. Remedial Evaluation
Remedial evaluation is formative evaluation of the installed exhibition. You are still looking for things that don’t work and have reserved funds to fix them. At this point, you can look at visitors’ use of thematic groupings of exhibits, determine whether or not the main themes or areas are seen and understood, observe whether traffic patterns take visitors to all parts of the exhibition or if there are significant areas being missed, and find out visitors’ responses to the lighting and ambiance of the exhibition.
4. Summative Evaluation
Are exhibits and programs ever really finished? There is often room for major improvements after the opening.
Summative Evaluation is conducted when your audience can experience the total “package.” It often reveals problems that were not, or could not be, identified during the earlier stages of development.
- Do visitors understand which way you intend them to go?
- Can they find instructions or key text panels?
- Do graphics make sense in place?
- Does your audience understand the major themes of your exhibit or program?
- How much time does the average visitor spend in the exhibit or program?
- Are there elements of the exhibit everyone seems to miss?
- Is your program the right length?
Some museums use temporary orientation and directional signage when opening an exhibit because they know summative evaluation will identify necessary changes for the final versions of these exhibit elements.
Forms of summative evaluation:
- Visitor tracking, timing, and behavioral coding – simple, effective, and honest! Observe your visitors in the exhibit with a floorplan. Where do they stop? What do they interact with or read? Do they talk to each other about your content? How much time do they spend in each part of your exhibit?
- Exit interviews – asking visitors to give you a little feedback at the end of your exhibit. Do they get the major themes? Did they enjoy the exhibit? Do they have suggestions? Exit interviews can be short and informal or long and comprehensive.
5. Timing and Tracking Evaluation
Observing families for a simple tracking and timing evaluation study is easy, informative, and objective. It requires minimal training so everyone from frontline staff to your board president can participate. All you need is a clipboard, a floorplan of your exhibition, and a watch.
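Once the observation sheets are filled in, the tallying is just as low-tech. As a rough illustration (the exhibit element names and timings below are invented), a short Python sketch shows how per-element stop counts and average dwell times surface trends:

```python
from collections import defaultdict

# Hypothetical timing-and-tracking observations: one record per stop,
# as (family_id, exhibit_element, seconds_spent). Names are invented.
observations = [
    (1, "gun deck", 120), (1, "rigging climb", 45),
    (2, "gun deck", 90),  (2, "sailor's mess", 200),
    (3, "gun deck", 60),  (3, "rigging climb", 30),
]

stops = defaultdict(list)
for _, element, seconds in observations:
    stops[element].append(seconds)

# Two simple trend indicators: how many families stopped at each
# element, and the average dwell time when they did.
for element, times in sorted(stops.items()):
    avg = sum(times) / len(times)
    print(f"{element}: {len(times)} stops, avg {avg:.0f}s")
```

Even a tally like this over 15 families quickly distinguishes the elements everyone stops at from the ones everyone walks past.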
A USS Constitution Museum staff person unobtrusively observes a family interacting with a facilitator (in red vest).
Developing Your Evaluation Instrument
Just like designing a program or exhibit, constructing your instrument, whether it's a questionnaire or an interview, takes time. The questions that sound best at your computer may fall flat in front of an audience. Prototype your instrument; test, revise, and test again until you get something that works.
Examples of Evaluative Tools & Instruments
Exhibit Observation Form
Staff from the USS Constitution Museum used this form to evaluate an exhibit element, "Where's the Iron in 'Old Ironsides,'" in their 1812 "Old Ironsides" Discovery Center. The results of the observations made clear that staff needed to clarify the directions, further explain some scientific concepts, and make a computer monitor visible to a 360-degree audience.
Program Observation Form
After developing new family programs for the Bicentennial of the War of 1812, USS Constitution Museum staff used this form to focus on observing the types of engagement prevalent during the program. Staff learned that despite their best efforts, adults were not participating actively with their children. As a result, these programs were redesigned to be more family friendly. Read about that process here.
Summative Program Evaluation Form
This form, front and back shown below, was designed by USS Constitution Museum staff in collaboration with Engage Families Project Evaluator Marianna Adams to capture summative data on families' reactions to redesigned programs. On the back, observers recorded successful and unsuccessful engagements they found, along with ideas for improvement. After observing a family, the observers surveyed that family, allowing project staff to compare the observations with the families' perceptions. The instrument showed us how successful the programs had become at engaging all ages of a multigenerational group.
Why should I use this technique?
While some families love to share their thoughts and honest opinions about a program with you – the good and the bad – others may feel uncomfortable. Comment boards or posters provide families the opportunity to respond to the program with some anonymity. Adding a comment board or poster to your program can help you see if key thematic messages come through and evaluate the impact of the experience.
Families tell us what they think at this exhibit comment board. Plexiglas covers a few sample comments below the prompt while adhesive notepads and golf pencils rest in a shelf that doubles as a writing ledge.
What makes questions or prompts effective?
This is the tricky part. Writing an effective question or prompt requires testing and adaptation. At the USS Constitution Museum, we use comment boards in exhibits and programs to gauge how families respond to key thematic messages and demonstrate the impact of different experiences.
A few years back, we started an exhibit comment board asking “What does USS Constitution mean to you?” and received many nice comments on the symbolic nature of the Ship.
After installing a 7-minute show about battle at sea, we asked “How did this show change your view of battle aboard USS Constitution?” and received more nuanced comments. The range of responses – patriotic, anti-war, pro-war, and empathic – provided a healthy barometer of positive and negative comments that reflected both the honor and the brutality of war.
At the conclusion of a redesigned family program, we used two reflective prompts, “What I learned about my CHILD(REN)…” and “What I learned about my ADULT…” Families lingered at the comment posters sharing touching and challenging reactions.
A mother and son add their opinions to a pair of comment posters after a family program. Large sheets of paper taped to a wall with a handwritten prompt feel informal, but proved very effective.
Keep in mind, regardless of the question or prompt asked, a percentage of people will comment on their experience at your library or museum in general, and some just like to post inappropriate comments in public. We often post a sample of the appropriate comments to help get people started.
Responses to the prompts “What I learned about my ADULT…” and “What I learned about my CHILD(REN)…” including “I learned that we can finally worked [sp] together” and “My daughters are creative, critical thinkers :)”.
Now, you try it!
You don’t need fancy materials to start a comment board or poster. Get creative with materials you already have and that families feel comfortable using like large sheets of paper, foam-core boards, adhesive notepads, stickers, washable markers, golf pencils, and whiteboards.
How Much Data Do You Need?
Doing evaluative work yourself? Do you need it done quickly and/or cheaply? In this post, you'll find out how much data you really need to get a clear picture of what you're evaluating.
Marianna Adams also provides advice about data collection in this video clip. She argues that you've gotten enough when you start to see reliable patterns:
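One way to put that "reliable patterns" rule of thumb into numbers is to track a running proportion as observations come in and stop collecting once it settles. This is a minimal sketch with invented yes/no data; both the data and the 5-percentage-point threshold are illustrative assumptions, not a standard:

```python
# Hypothetical yes/no observations (1 = family read the label,
# 0 = did not), in the order they were collected.
observed = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0]

running = []
reads = 0
for n, value in enumerate(observed, start=1):
    reads += value
    running.append(reads / n)

# Crude stopping rule: if the last five running proportions span less
# than 5 percentage points, more data is unlikely to change the picture.
recent = running[-5:]
stable = max(recent) - min(recent) < 0.05
print(f"running proportion: {running[-1]:.2f}, stable enough: {stable}")
```

With this made-up data the proportion settles around 70% after roughly 20 observations – a small-scale illustration of why modest samples are often enough to reveal a pattern.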
Evaluation Can Be Fun!
It doesn't have to be boring or tedious. Surveys and written questionnaires are the most common but not always the best instrument for capturing the nuances of the visitor experience. In this post, Marianna Adams reflects on some of the problems with traditional evaluation and suggests more imaginative options. She encourages museum professionals to think creatively about how to get the data the museum wants and have it match the spirit of the experience.
"But Evaluation is an Imposition"
Perhaps it is with a boring, complicated survey. But make your evaluation into a fun, interesting conversation and you'll have the public hooked. Marianna Adams explains:
Often we tack on an evaluation to the end of our programs, and it can feel like a test to an already over-served public. Is that the right time and methodology? Marianna Adams has championed the use of embedded evaluation. She explains the process here:
In this video, Beth Fredericks and Marianna Adams discuss a real world example of embedding evaluation in Beth's snow globe program at the Boston Children's Museum:
Why Asking "Did You Have Fun?" Doesn't Get You the Results You Want
Whenever we ask this question, we get high marks and nothing useful to improve upon. The same goes for "What could we do better?"; most often the answer is "nothing." Marianna Adams addresses how to ask the right questions to get useful data.
Using Photography and Video for Evaluation
Why and when should I use this technique?
Observing families in action during a program or inside an exhibit provides a wealth of information you can use at any stage in the evaluation process. That said, even experienced evaluators can feel overwhelmed trying to actively observe families and record detailed notes at the same time. Photographs and videos can lessen the pressure and provide a different perspective on what’s working and what’s not.
What kinds of information can photos and videos provide?
Someone said a picture’s worth a thousand words, but one picture isn’t enough in this instance. If you’re lacking a solid collection of images during a program, try any or all of these suggestions:
- Ask a volunteer or coworker to photograph an upcoming program.
- If that’s not an option, make your contact information (including e-mail address) readily available and ask participating families to send you photographs taken with their personal devices.
Capturing the moment takes planning
Don’t expect that just because the camera’s on, you’re recording useful evaluation data.
Make a clear plan of what to record and think about the challenges you may face in advance. Can your camera’s microphone pick up the voices?
You can do this cheaply and easily
Remember, you don’t need high-quality images to evaluate. A smartphone or inexpensive digital camera will do the job just fine.