
When people trust science, they can make better decisions, follow helpful rules and work together on big problems like health, climate change and new technology. But if people stop trusting science, it’s easier for false information to spread, and harder to solve those problems.
One way scientists are trying to build more trust is by being more open and honest about themselves and their work. The idea of “open science” means sharing data, explaining how experiments are done and even publishing results from tests that didn’t go as planned. Scientists are also being asked to tell people if they have any financial interests that might affect the quality of their work.
But as a philosopher of science and public policy, I argue that some forms of openness can actually reduce trust.
Science isn’t perfect, because scientists are human beings. They can make mistakes and have opinions that affect how they think. But some people may still believe in the “storybook” idea that scientists are always impartial and never make mistakes. They expect science to be better than it actually can be.
People can stop trusting scientists when those high expectations aren’t met. And even when scientists are mostly doing things right, people may still stop trusting them simply because they aren’t perfect.
For example, in the US, a law passed in 2013 made doctors tell their patients if they had any connections to drug companies or other groups. Afterwards, researchers found that people started trusting doctors less. That’s because many people think doctors should never have those kinds of connections. So when they found out doctors sometimes do, they felt disappointed and lost trust.
Another example was when the Climatic Research Unit at the University of East Anglia was hacked in 2009, leaking thousands of emails and forcing transparency about the work of climate scientists. This led to alarm among some members of the public, who believed they had found evidence that data contradicting the idea of global warming was being covered up.
Numerous inquiries found that there was no wrongdoing and that the East Anglia scientists were engaging in normal scientific practices. But the publication of data and correspondence without adequate context led some to see a conspiracy.
Indeed, some research shows that being open about science can make people trust it less. However, it’s not a simple relationship: other research shows that being open can also make people trust it more. So we have a puzzle – being open can both help and hurt trust in science.

To understand this puzzle, we need to look at what’s being shared. One possible explanation is that people lose trust when the news is bad – when a disclosure shows that scientists aren’t as perfect as people thought. But if the news is good and matches what people already believe about scientists, it may make them trust science even more.
This might suggest scientists should be open only about the good things. And if there isn’t any good news to share, it might seem easier to lie – to hide the bad news and make up something good.
Some people believe that good scientists never have conflicts of interest, and that scientists who do have them must be doing something wrong – even though that’s not always true. So you could argue that, because scientists generally deserve to be trusted, it’s OK to lie about what might be seen as conflicts of interest in order to maintain that trust. This is called a “noble lie”.
But most people believe lying like this is wrong. Experts in politics say the public has a right to know what their governments and scientists are doing – and much of the public would probably agree. Plus, lying only works if no one finds out, and history shows that the truth usually comes out in the end.
The idea of a noble lie is what I call a fake solution. It doesn’t really fix the problem. I would argue it just shows that people don’t understand enough about how scientists and science work.
Scientists aren’t completely unbiased. Everyone is subject to some level of bias or outside pressure, which may or may not affect their work. And science doesn’t prove things incontrovertibly. It makes the best guesses it can based on the evidence.
If we could help people see that scientists are human, not infallible but still capable of good work despite their biases, then we arguably wouldn’t need to lie. Being open and honest could actually help build trust, because people might better understand how science really works.
Scientists know that they aren’t perfect, and neither is the practice of science. But they haven’t done a great job of explaining that to the public. If we want people to trust science as much as it deserves, we need to help them really understand how it works.
Byron Hyde receives funding from the Wellcome Trust.