These resources are designed to help you develop an effective and efficient evaluation strategy tailored to your event. They include generic survey templates suitable for a variety of events and target audiences; these templates represent a middle ground, aimed at maximising both feedback and comparability across the range of events.
We have also included suggestions for other evaluation methods: more detailed approaches for those who want greater depth, and simpler approaches that sacrifice depth in order to achieve a higher response rate.
The following advice on event evaluation can also be downloaded for use offline:
Inspiring Australia Evaluation Resources [PDF 566 KB].
There are many different ways to collect evaluation responses. What you choose to do will depend very much upon your event and your audience. Some of the options for collecting evaluation responses are:
Paper-based surveys – Paper surveys are arguably one of the easiest ways to evaluate an event. Five survey templates are provided, to cover different audiences.
Online surveys – You may wish to use an online survey instead. We provide an example, and can help create an online survey specifically for your event.
Observations – Observations are subjective, but they can provide rich insights into who is in your audience and what they’re doing at your event.
Bean polls – If an evaluation form is inappropriate for your event, you can use a simple bean poll where you ask people to rate their opinion on a single question. A bean poll can also be used in conjunction with other evaluation methods like surveys.
It is essential to consider the event or program aims when you are planning your evaluation. You need to be clear about what you are trying to achieve before you can work out how to assess whether you have achieved it.
Inspiring Australia has four broad aims. They are to:
inspire target groups and get them to value scientific endeavour
attract increasing national and international interest in science
critically engage target groups with key scientific issues
encourage young people to pursue scientific studies/careers.
These aims cover a range of more specific objectives, such as informing people about a particular topic or equipping people with critical thinking skills. Effective evaluation should consider the aims in terms of delivery of specific and targeted outcomes.
The Framework for Evaluating Impacts of Informal Science Education Projects [PDF 665 KB] contains a chapter describing the use of logic models to explicitly state and visually represent aims, objectives and outcomes. The development of this framework was supported by the USA’s National Science Foundation with the aim of advancing the field of informal science education. Case studies and reports can be obtained from the Center for Advancement of Informal Science Education: caise.insci.org.
Five Generic Learning Outcomes (GLOs) were developed by the Museums, Libraries and Archives Council in the UK, as part of its Inspiring Learning Framework, to describe the benefits of attending museums, libraries and archives. The GLOs, together with the National Science Foundation-supported Framework for Evaluating Impacts of Informal Science Education Projects, have informed the development of the Inspiring Australia Evaluation Resources. The GLOs are:
Enjoyment, inspiration, creativity
Attitudes and values
Skills
Knowledge and understanding
Activity, behaviour and progression.
Common questions in the survey templates have been chosen to cover all five GLOs. A list of suggested optional questions [DOCX 17 KB] is provided and related back to the GLOs.
It is useful to note down the desired outcomes of your event and read through the list of optional questions. Which of these questions might be most useful to get information about the specific objectives and outcomes of your event?
If you have questions on any aspect of the resource kit, or would like more information, please feel free to contact us.