As part of Journal of Applied Ecology’s efforts to discuss how real-world impact can be achieved following research, we’re talking to authors about their studies. In this post, Edward Straw discusses his first experience with sharing research via social media.
I think it’s fair to say that my experience publishing my first paper was atypical. Not many papers, let alone first papers, end up glued to the front door of a pesticide company by protestors. So, I wanted to take a moment to reflect on science communication and present you with some of the behind-the-scenes numbers on what a widespread science communication campaign actually looks like.
For a little context, my first paper was a fairly simple experiment where we sprayed herbicides on bees and recorded which bees died. We found that some types of herbicides killed the bees, which we didn’t expect, as herbicides are supposed to be non-lethal to bees.
The numbers
After the paper came out, I figured it’d be a good idea to write a Twitter (now known as X) thread about it. So, one morning early in lockdown I sat down for about an hour and typed one out. That Twitter thread ended up being seen by a colossal 623,000 people. Vastly more than I’d anticipated.
I was tracking engagement using Altmetric and could see that the thread went far and wide, reaching every corner of the planet. There were Tweets in a dozen languages about it, and it spread onto Facebook, news sites, Wikipedia and Reddit (reaching its front page via R/Science). On Twitter, my thread got 38,461 engagements (likes, re-tweets or replies).
Reading other people’s comments about my research was rather depressing. Specifically, the number of comments (from both a pro- and anti-pesticide perspective) that clearly showed the commenter hadn’t even read the paper’s abstract before posting about it. To put numbers to this, over half a million people read the thread on Twitter and 38,461 interacted with it, while just a paltry 1,875 people clicked through to the paper via Twitter. That’s 0.3% of viewers, and 5% of ‘interactors’, actually clicking through to the paper. More people will offer their opinion on an article than will actually read the article itself. That statistic says a lot about the quality of online discussions about science.
Using conflict
Had I known I’d be reaching probably my largest ever audience, I’d have spent more than an hour writing the thread and been a touch more delicate in how it was written. One of the reasons I’d have spent longer is that I work on bees (which everyone loves) and pesticides (which a lot of people dislike), which makes it an evocative topic.
So, in publishing a paper finding a pesticide could (under certain conditions) kill bees, a lot of environmentalists rallied around the flag and said pesticides were the worst thing in the world. Conversely, a lot of farmers got their backs up and called the research terrible, called me an ‘idiot’ in Swedish and described me as someone who ‘doesn’t understand farming’.
I will add that a few months later I published a paper finding that the world’s most used herbicide, glyphosate, doesn’t kill bees. Once that one came out, the teams swapped around: the farmers liked me, while some environmentalists called me a ‘propagandist for Monsanto’. The mob is fickle; take little heed of it!
The conflict my thread generated is, in my mind, why the article was so widely shared. This is an unfortunate feature of social media: conflict generates engagement, and engagement triggers the algorithm to promote the post. But what it did teach me is that you can harness conflict to promote science. If done deliberately, and ethically, it can be a powerful tool for driving engagement.
An example of when I consciously employed this was an article I wrote for the Irish Farmers Journal advocating a ban on pesticides for use in gardens. A totally honest title for the article would have been ‘Let’s ban garden pesticides’, and there’s nothing wrong with that title, but it is boring. Farmers aren’t going to click on that article because it’s not relevant to them. But I wanted farmers to click on the article, because I wanted to influence this stakeholder group and get them behind the idea.
So, to really get the clicks, I went with the slightly provocative title: ‘Let’s ban pesticides – just not the ones you’re thinking about’. A combination of conflict and clickbait. Conflict because most farmers don’t like the idea of banning pesticides. And clickbait because the title obviously doesn’t tell the whole story. To me, this was an ethical application of these principles because the article delivered on the promise of the title. Here, I used conflict to get a group of people with a particular viewpoint (don’t ban pesticides) to interact with a set of policy ideas they may well agree with but wouldn’t normally engage with.
Sometimes, we as scientists forget that our science communication content is competing for attention with everything else in a newspaper, magazine or social media feed, all of which is optimised for views. And it’s not just low-brow content either: increasingly, reputable publications are competing for views using internet-savvy methods. So, for scientists to gain attention, they need to stand out. Leaning into conflict, in a conscious manner, can help drive engagement.
Confirmation bias
As part of the dissemination strategy for the bee-herbicide paper, an American biodiversity charity reached out to me to write a press release for the American market. They wanted to use the research to promote better pesticide regulation. Over a few drafts we wrote up a press release calling for better pesticide testing. All was going well until they inserted a paragraph at the end talking about how glyphosate (the main ingredient in the herbicides we tested) was harmful to wildlife, and calling for it to be banned.
This was an odd addition, as the research explicitly found that glyphosate wasn’t causing any toxicity in this set-up. A direct quote from the abstract: “…demonstrating that the active ingredient, glyphosate, is not the cause of the mortality.” I pointed this out and asked for that paragraph to be removed. They refused, so I politely declined to work with them further. They published the press release about the paper without reference to me and with the paragraph criticising glyphosate.
What this highlighted for me was that people will read whatever they want into science and will use it to further their agenda however they want. Saying “you are misinterpreting my research” isn’t enough to get some people to change their behaviour. The actual science can become irrelevant if it isn’t furthering an agenda.
Theory in practice: Glue
The paper I’m framing this blog around really was glued to the front door of a pesticide company by protestors. While this isn’t really central to the content of the blog, it is kind of exciting. By introducing this conflict right at the top of the article, it encourages a reader to read on.
The story behind this is that in 2022 German Extinction Rebellion protestors targeted Bayer’s central offices. They glued themselves to the ground and glued a bunch of ‘anti-pesticide’ papers to the doors of the building. Whether these protestors really understood what the paper was about, I’ll never know. Similarly, I’ll never know whether the pesticide company employees read the paper that’d been glued to their front door. But, if real life is anything like social media, there’s just a 5% chance that someone who’s interacted with the paper has actually read it.
Learn more about the impact of work published in Journal of Applied Ecology in our latest Editorial, and check out associated articles in our accompanying Virtual Issue.
This blog is written solely by the author, and independently of the other authors on the original paper.