Evaluating Science Communication Projects

Affiliation
Science Communication Consultant, Southern Science
Summary

Published on SciDev.Net, this article provides straightforward, practical advice on evaluating science communication projects that are often part of a wider campaign to engage people in science. Drawing on a variety of communication strategies, these projects may include public talks, debates, exhibitions, publications, science theatre, television documentaries, and citizen-centred initiatives like consensus conferences. The goals of such projects may include encouraging young people to consider science careers, supporting dialogue and informed decision-making about science and society issues, promoting public acceptance of new technologies, changing attitudes or behaviour, or simply fostering a science "culture".

For author Marina Joubert, the "why" - the rationale for evaluating such projects - is clear: evaluation allows one to improve a project as it develops, document what does and does not work, and assess how effectively goals have been met. It is the "how" - the actual strategy for measuring the achievement of such science communication goals - that shapes Joubert's guidance, which is divided into the following sections:

  • Planning for evaluation - Key points include: Develop a credible evaluation plan from the very start, considering carefully what needs to be evaluated and why, how the evaluation strategy will be implemented, and how the information will be used. The setting of clear and measurable objectives is crucial ("don't be too ambitious"). Participation and collaboration could be helpful in this process; Joubert suggests involving co-workers in designing the evaluation plan, and also drawing on evaluation reports from other science communicators.
  • Types of evaluation - Key points include: Both quantitative and qualitative evaluations can be important in determining a project's effectiveness - a determination which needs to happen before, during, and after a project. That is, early feedback, continuous monitoring/periodic checks, and a summative evaluation are all useful tools in guiding and assessing a science communication project.
  • Evaluation tools - Key points include: The appropriateness of the specific tools chosen depends on the nature and context of a project, but may incorporate:
    • observational tools such as photographs and video footage, which can be used to observe people's behaviour in science centres/museums, or to understand how they interact/respond during a public debate
    • website monitoring, which can be used to assess how people navigate a science communication website and to incorporate user-friendly response options to gather online feedback
    • simple questionnaires (to document what worked) and/or an attendance register (to obtain a demographic profile of an audience) for events
    • interviews and/or focus group discussions to seek defined feedback on a project's relevance
    • analysis of press clippings and/or radio/television coverage of an event or science communication programme to help judge a project's wider impact
  • Evaluation in practice - Key points include: Keep the evaluation simple, practical, and user-friendly. Following pre-testing, choose tools that are easy to implement, tailored to a specific project's unique challenges and objectives, and suited to the intended audience (e.g., if working with teenagers, use young interviewers and avoid a formal "clipboard" approach to evaluation). Incentives can be offered to encourage people to provide feedback.
  • Using evaluation results - Key points include: Make sure all collaborators know how the evaluation results will be used; position the evaluation as an empowerment tool; feed results (both positive and negative) into future activities; share results broadly (including mistakes); and communicate outcomes to team members, partners, and sponsors.

Editor's note: This article is part of SciDev.Net's broader E-Guide to Science Communication.

Source

SciDev.Net Weekly Update, January 6-12, 2007.