Barriers to stakeholder participation in citizen science
The key barriers to using citizen science faced by science, policy and practice stakeholders.
The UKEOF-funded report Understanding Motivations for Citizen Science includes a useful summary of the barriers and challenges to stakeholder participation in citizen science projects. The summary below is adapted from the original table to highlight the key barriers faced by science, policy and practice stakeholders. Whilst these summaries are not exhaustive, this information will be useful for anyone looking to collaborate with other stakeholder groups on a citizen science or crowdsourcing initiative, as well as to identify potential challenges in advance.
Data quality and biases
- Inadequate equipment (e.g. low quality sensors)
- Mistrust data from non-professionals
- Biases influencing decision to participate
- Scalability of data
Example: Quality of data is not at a level for use on a wide scale
- Partnerships with local authorities
Example: Local authorities can lack manpower to assist
- Patchiness of data
Example: Statistical techniques available to even out patchiness
- Specific evidence need beyond the scope of citizen science
Example: “it wasn’t our direct need, and with low resources that wasn’t our priority” (Scientist, monitoring, policy)
Peer review / mistrust
- Peer reviewer reservations during publication process
- Citizen science frowned upon by colleagues
Example: One university scientist was told they "should be doing proper science". Another explained: "We as scientists are often a bit snobbish about our ability in comparison to other people's ability, and you see that in the response to citizen science from a lot of the policymaking community. Citizen science equals poor data, that's their starting point" (Scientist, monitoring, policy)
- Institutional reservations about citizen science
Example: Need to get board members on side
Requirement of specialist equipment / knowledge
- Training required
Example: A scientist indicates specialist training is required due to challenges of identification: "Gone are the days where you used to have a huge visible injury on vegetation has gone, because there's been acute exposure to everything." (Scientist, policy, monitoring)
- Difficulties of recruitment and commitment of volunteers
Example: "And then continued engagement, the enthusiasm barrier, because you get a drop off, an exponential drop off of participation as time goes on. So how do you keep the exponential drop off … as low as possible?" (Practitioner, science, engagement)
- Inaccessible sites
Example: Linked to patchiness of data, "It's difficult to tell people to go to a site that they think will be rubbish as well, I think if you want to see lots of dragonflies you go to a good dragonfly site rather than just anywhere" (Scientist, policy, monitoring)
- Unable to keep up with technological developments
Example: Once technology is in place, it must be maintained and updated
- Getting people to use technology
Example: "a survey that started off on paper, it's actually very hard to move people over" (Practitioner, science, engagement)
- Crowded marketplace for citizen science projects in certain areas
Example: Technology and online data options are flooding the market with similar citizen science projects
Time consuming / resource issues
- Promoting citizen science
Example: Launching apps requires time and resources for promotion and maintenance
- Communication
Example: No time to explain key scientific ideas (e.g. recording absence is as important as presence)
- Slow process
Example: Policy wants answers yesterday
- Individual interactions
Example: Individual requests for support are time-consuming
- Mobilising and maintaining a citizen science project
Example: Time spent validating, verifying, selecting appropriate technology and calibrating sensors; lack of funding, short-termism of funding, inability to prove the concept, no funding for essential technical development
- Volunteers threaten job opportunities/security of professionals
- Lack of interpretation may lead to poorly-informed public demands
Institutional politics
- Liability of organisers if they don't act on citizen data
- Unaware of using citizen science data
Example: Scientists often use published data sets unaware that they are citizen science data
- Nobody championing citizen science at a high level
Example: Projects should identify someone at a high level to champion the project institutionally
- Differing science and engagement objectives
Example: "you have to decide where you sit along the spectrum for the mass engagement versus data quality question" (Practitioner, science, engagement)
- Activities require legislative approval
Example: Challenges arise when species become scientific instruments, which is legislated by the Home Office
- Lack of interest in engaging the public through citizen science
Unaware of audience
- Lack of attention to needs and expectations of the citizen science audience
Example: Must understand that demands on participants' time and the project's requirements may differ
- Little acknowledgement of different types of volunteer
Example: Volunteer types differ (e.g. paper-based volunteers), and projects don't want to lose them over data-quality concerns
- Need more volunteers
Example: There aren't enough volunteers participating
- Survey design by committee, by professionals only
Example: Too much discussion of small issues, but a decision eventually has to be taken
Survey design / implementation issues
- Lack of clear research question
Example: Avoid "reverse engineering to a question"; instead "[be] led by a question" (Practitioner, science, engagement)
- Survey is inaccessible
Example: Participants must be able to understand the questions
- Language barrier (scientific and linguistic)
Example: Avoid over-complication
- Assumption that people have access to the internet and to a mobile phone
Example: Not everyone has a mobile phone on a data plan, nor access to the internet
- Assumption that people are comfortable with technology
Example: Technological literacy should never be assumed
- Over-reliance on web-based solutions
Example: Web-based solutions can be exclusionary
- Designed a 'boring', yet scientifically important, survey
Example: “Well, it could be that it’s a really … important square, and when you go there and count your … your butterflies, you might only, you might see none, you might see one. And that’s a really boring day out” (Scientist, policy, monitoring)