Quantifying the Impact of Skeptical Science Rebuttals in Reducing Climate Misperceptions
Abstract. Misinformation about climate change causes societal damage in a number of ways, and consequently, resources are required to support interventions that counter its influence. Aiming to meet this need, Skeptical Science is a highly visited website featuring 250 rebuttals of misinformation about climate change. The rebuttals are written at multiple levels – basic, intermediate, and advanced – in order to reach as wide an audience as possible. This study collects survey data from visitors to the website, measuring their belief in climate facts and myths before and after reading a rebuttal. Our data show that a plurality of visitors were already highly convinced regarding climate facts, indicating that many visitors come to the site not to answer unresolved questions but to gather resources and answers. We found that the rebuttals were effective in reducing belief in climate myths but that some rebuttals show a concerning reduction in belief in climate facts. The greatest improvement occurred with visitors who began with the most inaccurate climate perceptions. This indicates that the website is useful for two main audiences – those who are convinced about climate change but looking for material to support their own climate communication efforts, and those who disagree with climate facts but are open to new information. We examine potential ways that Skeptical Science rebuttals could be updated to improve their performance in raising climate literacy and critical thinking skills.
Status: final response (author comments only)
RC1: 'Comment on egusphere-2025-3812', Anonymous Referee #1, 31 Aug 2025
This manuscript reports the findings of an in vivo impact evaluation of climate myth debunkings on people who arrived at Skeptical Science via a Google search query. Assessing the impact of attempts to debunk climate misinformation in vivo is a worthy research goal, and the research reported in this manuscript appears to be well conducted. Overall, I find this paper to be useful and will be pleased to see it published.
That said, I encourage the authors to consider how they might make the paper easier to read and its key learnings easier to grasp.
The Introduction section, for example, is a bit disjointed. Lines 26 to 42 could be restructured to clarify the more “subtle and subversive impacts” of climate misinformation. It isn’t clear why the “evolution of SkS rebuttals” section was included, given that the focus of the paper is an evaluation of current SkS content. Moreover, as a reader, I would like to see additional citations to substantiate statements about relevant prior research.
In the Methods, I would like to see a better explanation of how participants arrived at the research experience. The paper states they arrived via a Google search, but it would help readers understand the situation better if we knew what they were looking for. Is it the case that the Google search took them directly to a specific debunking, and can we safely presume that they asked Google a direct question related to that debunking? Readers need to better understand the research participants’ information search expectations to understand how they may have reacted to the question asked by the researchers. A minor point: I find the term “modal” (as in “modal pop-up screen”) confusing in this context. Is there an alternative term that can be used instead?
The Results section is currently a bit hard to follow. The current flow of information is, essentially, “here is the data” and “this is what that data means.” I realize this is a standard and common way of reporting research findings, but I encourage the authors to consider inverting the order of presentation to make it easier for readers to understand what the data mean. In essence, I’m suggesting the authors lead with a series of statements about the findings (e.g., the debunkings made visitors less likely to believe the climate myths being debunked) and then show the data that support each claim.
I found the qualitative investigation of the best and worst performing rebuttals to be extremely useful, but I also found it difficult to follow. Here too, I feel the detailed explanations make it challenging for readers to grasp the key learnings, and I recommend leading with the key learning and then providing the details that justify the conclusion.
Lastly, in the Discussion section, I was not clear on why the authors concluded that “most visitors (45%) already strongly agreed with climate facts” (because 45% is not “most”), or that “they were looking for information to assist them in responding to climate misinformation.” I’m not saying these conclusions are unjustified, but I am saying I would find the statements more credible if they were better defended.
The paragraph beginning on line 201, which addresses why some of the debunkings performed poorly, would be better placed after the discussion of the overall pattern of findings: that belief in climate myths declined, but so too did belief in climate facts.
Minor points:
Line 26: the phrase “climate perceptions” does not fully convey the intended idea.
Line 224: the phrase “a decrease in climate facts” does not accurately convey the intended idea.
Conclusion:
This paper is publication-worthy, but I would prefer to see it strengthened before publication. I thank the authors for their important work.
CC1: 'Reply on RC1', David Crookall, 05 Sep 2025
This is David Crookall, lead guest ed of the special issue.
Just a quick note to say thank you for an extremely helpful review.
Citation: https://doi.org/10.5194/egusphere-2025-3812-CC1
AC1: 'Reply on CC1', John Cook, 15 Sep 2025
Likewise, on behalf of the coauthors of the paper, thanks for your very helpful comments - we look forward to implementing them and strengthening our paper.
Citation: https://doi.org/10.5194/egusphere-2025-3812-AC1
AC4: 'Response to RC1's comments', John Cook, 31 Oct 2025
Many thanks for your detailed comments. We have made a number of updates to our manuscript in response (which we think has resulted in a stronger paper). Here are our changes/responses:
Introduction: We’ve reworded the intro to clarify that subversive effects include the cancelling-out effect, polarization, and self-censoring by scientists. We also reordered them, beginning with the cognitive effect of cancelling out, followed by the social effects of polarization and self-censoring. Regarding the evolution of SkS rebuttals section, given that it was only two paragraphs, we have trimmed it down and integrated it into the previous section to streamline the Introduction. One justification for retaining part of this history is that it documents an evolution of content design that brought the rebuttals more in line with debunking best practices recommended by psychological research (which becomes relevant in the Discussion, where we establish that we still have further to go).
Methods: We added clarifying text in Methods to specify that participants would have conducted a Google search and then clicked on a Skeptical Science link in the search results. We also added a footnote clarifying that analysing the search phrases used to find our site is beyond the scope of this study, but that we speculate (more tentatively in the revision) on the purpose of most readers coming to Skeptical Science in the Results and Discussion sections. Regarding the term “modal pop-up screen”: modal is the technical term used in the web design industry but is not commonly known outside it. We’ve updated the text for clarity to “modal (an industry term for a pop-up box overlaying the webpage)”.
Results: The text was restructured to lead with key findings and then dig into the statistical details, improving the readability of the results (especially for readers not accustomed to psychology papers, which is likely to be the case for most readers of this journal). We did the same for the qualitative results, placing a summary of the key learning (the top three rebuttals clearly explained the replacement fact, the bottom three didn’t) early in this part of the Results.
Discussion: We’ve changed the wording at the start of the Discussion to be more accurate: “most visitors (66.5%) already agreeing with climate facts, and 46.3% of visitors showing strong agreement with the fact or strong disagreement with the myth.” We’ve also qualified the inference about why they’re visiting: “This implies that a large proportion of visitors may be coming to SkS not because they were unsure about a particular climate fact or myth but because they were looking for information to assist them in responding to climate misinformation”. We’ve also restructured the Discussion, moving the paragraph on why some of the debunkings performed poorly so that it immediately follows the discussion of the overall pattern of findings (belief in climate myths declined, but so too did belief in climate facts). We’ve also slightly fleshed out the discussion on discernment, incorporating newly published research that is highly relevant (Simchon et al., 2025).
Citation: https://doi.org/10.5194/egusphere-2025-3812-AC4
RC2: 'Comment on egusphere-2025-3812', Anonymous Referee #2, 10 Sep 2025
This article presents the results of a survey on the website Skeptical Science and attempts to determine the degree to which the climate myth rebuttals on the website change visitors’ viewpoints on the scientific consensus about climate change. This work is evidently very important in today’s world, where climate misinformation is becoming easier to spread while the need to act on the changing climate is all the more urgent. Therefore, I believe the research in this piece of work to be highly publishable in this journal.
However, I recommend that the authors improve the quality of some aspects of the article to make it easier to follow, and to avoid making assumptions about why some rebuttals seemingly did not perform well, without any qualitative evidence to back this up (see below).
Introduction – It feels like there is possibly too much background information in this section, especially as it is presented without context as to why it is necessary to know before reading the rest of the study. In particular, I do not feel that Section 1.2 (Evolution of SkS Rebuttals) is necessary, as it has no relevance to the survey or the discussion of the study. I acknowledge that background information about SkS is essential, but feel that Section 1.1 alone is adequate in preparing the reader for the rest of the article.
A glaring omission when reading the introduction was a specific research aim, or purpose of the study. All the background information was presented without knowing the context behind it – why do we need to know what has been described in Sections 1-1.1? What hypothesis or research question are you seeking to answer by doing this study?
Methods – I feel like a lot of trust is being placed on the assumption that all visitors that completed the second survey read the webpage and took in what was being said. For example, some people could have quickly scrolled to the bottom of the webpage and filled in the survey without reading the content of the website. Has some filter been applied where responses completed below a minimum time are removed from analysis? If not, I feel this should be considered. On a related note, I can’t seem to find Table S1 in Supplementary Material (where there is supposedly a distribution of times) – only Appendices.
Results – It would be interesting to know whether the “level” of the rebuttal read by the visitors had any effect on their change in perception of climate facts (though I concede this is probably impossible to find out in retrospect). However, my main concern with this section comes from the analysis of individual rebuttals (from Fig. 5) and why the “worst-performing” ones are not believed. Most of the arguments here as to how the rebuttals could be improved come from the authors, and are not the result of the survey itself. Therefore, I find it confusing to make comments like “rebuttal x fails to take into account y and z” when there is no qualitative evidence from the survey that this is why visitors did not believe the rebuttal. While I think the comments made about these individual rebuttals in this way are useful and the reasoning is sound, this more speculative analysis would be better placed in the discussion section.
Discussion – I take issue with the statement on line 182 that suggests that 45% is “most visitors” – it is less than half! I do however agree that more visitors to the site agreed with climate facts than didn’t, so this sentence should be rephrased for clarity. I do not, however, agree that the implication (line 184) of this means that most visitors are “looking for information to assist them in responding to climate misinformation” – again this is not asked in the survey. These visitors could have been looking for detailed, accurate scientific details about certain aspects of climate change from a reliable source for individual research purposes, or just general interest. The wording around this implication could therefore be loosened to accept that other reasons exist as to why such people would visit the site.
Citation: https://doi.org/10.5194/egusphere-2025-3812-RC2
CC2: 'Thank you', David Crookall, 14 Sep 2025
This is David Crookall, lead guest ed of the special issue.
Just a quick note to say thank you for an extremely helpful review.
Citation: https://doi.org/10.5194/egusphere-2025-3812-CC2
AC2: 'Reply on CC2', John Cook, 15 Sep 2025
Thanks for your very helpful comments. You make a robust point - 45% is not "most"! :-) We look forward to implementing your comments in our revision which will improve our paper.
Citation: https://doi.org/10.5194/egusphere-2025-3812-AC2
AC3: 'Response to RC2's comments', John Cook, 31 Oct 2025
Many thanks for your feedback which we have implemented, substantially improving the manuscript. We made changes to the following sections:
Introduction: We have substantially trimmed the Introduction (and incorporated Section 1.2 into Section 1.1). However, we do provide some justification for retaining part of the evolution of the rebuttal design, which happened to bring the rebuttals more in line with debunking best practices. This becomes relevant in the Discussion, where we come to the conclusion that there is still more work to be done. Text has also been added at the end of the Introduction specifying the research question.
Methods: The omission of the reading-time distributions from the Appendix was an error; these have been added as Appendix A7. On the topic of excluding participants based on reading speed, we have also added the following text: "While some participants showed fast reading speeds, they weren’t excluded from the analysis as they were representative of real-world skimming behaviour and hence offered external validity."
Results: Unfortunately our data wasn’t sufficiently distributed across the different levels to facilitate such an analysis of effectiveness across rebuttal levels. Part of the reason is that visitors were by default shown the lowest rebuttal level available if the link didn’t go to a specific level, resulting in basic-level rebuttals being "over-represented" in our data. Regarding the qualitative analysis of the top three/bottom three rebuttals, the Results section documented the inclusion/exclusion of two debunking elements: the replacement fact and the fallacy explanation. We deem this appropriate for the Results – a straightforward documentation of the results of our qualitative analysis. To clarify the purpose of our qualitative analysis in the Results section, we have added new text laying out a terse summary of the overall results in the opening paragraph of this section; we then explore the implications of these results in the Discussion section.
Discussion: We’ve updated the wording at the start of this section to accurately characterise our results, clarifying that, as you say, most visitors agreed with facts. As for the implication of looking for information, we have also added qualifying text as you’re correct that we can’t infer as strongly as we did in the text.
Citation: https://doi.org/10.5194/egusphere-2025-3812-AC3
CC3: 'Comment on egusphere-2025-3812', Louise Arnal, 29 Sep 2025
Publisher’s note: the content of this comment was removed on 1 October 2025 since the comment was posted by mistake.
Citation: https://doi.org/10.5194/egusphere-2025-3812-CC3
CC4: 'Comment on egusphere-2025-3812', Theresia Bilola, 16 Oct 2025
*Note*
As someone with a science to policy communication focus, I find the research particularly compelling but I also recognize my tendency to focus on practical implications and messaging strategies may limit the depth with which I engage with the more technical psychological underpinnings of the study.
Overall Evaluation
This article offers a timely and valuable contribution to climate communication research by examining how online interventions can reduce misinformation using large-scale, naturalistic survey data. It presents nuanced findings on rebuttal effectiveness and also provides practical recommendations for improving communication strategies. However, despite its relevance and rigour, the manuscript requires major revisions, particularly a clearer discussion of methodological limitations, generalizability, and statistical interpretations before it is ready for publication.
Results
The publication is clear, highlighting that:
Belief in climate myths decreases significantly after rebuttal exposure.
The greatest positive shifts occur among those who began with the lowest levels of accuracy.
Alarmingly, some rebuttals also reduce agreement with facts, producing a paradoxical outcome.
In terms of structure and clarity:
The manuscript is well-structured and readable (Lines 21-36). The figures are effective in illustrating key results, though some could benefit from clearer labelling (e.g., emphasizing what ‘positive values’ in Figures 4 and 5 specifically mean)(Lines 121-143). The appendices and supplementary resources are thorough, providing transparency and reproducibility.
The discussion is sufficiently balanced, acknowledging both strengths (debunking myths) and weaknesses (occasional decreases in fact acceptance). The exploration of why certain rebuttals succeed or fail is especially useful. The authors’ recommendations (integrating clear replacement facts, explicitly explaining fallacies, and redesigning rebuttals) are constructive and aligned with current psychological research (lines 201–211).
Need for Improvement
The experimental design is clear and well-documented. Recruiting participants directly from website visitors through randomized exposure to factual or myth statements provides ecological validity. The use of pre- and post-survey questions allows for direct measurement of changes in perceptions. However, there is potential for improvement based on the following:
Deeper reflection on the role of cognitive load and readability in shaping rebuttal effectiveness (see references to accessibility concerns in lines 76–83 and future redesign plans in 201–211).
Consideration of additional outcome variables such as motivation to engage in climate discussions, not just shifts in factual belief (a gap noted around lines 186–191, where the authors describe “teaching the choir to sing” but do not measure motivational changes).
Explicit recognition of the limits of single-exposure interventions, especially given that misinformation effects often accumulate over time (lines 219–226 discuss live website challenges but do not address the cumulative effects of repeated misinformation exposure).
Sample Bias
The majority of site visitors were already strongly convinced of climate facts (lines 116–118, 182–195). This makes the sample not representative of the broader public. While the authors acknowledge this, more explicit discussion of how this biases results (and potential improvements in future research) would be helpful.
Measurement of Engagement
The study assumes scrolling to the end equals reading, but the median time spent on the site was only one minute (lines 107–110, 113–115). This is arguably too short to process complex rebuttals. A deeper discussion of how ‘engagement’ was measured, or an acknowledgement of its limitations, would strengthen the paper.
Qualitative Dimension Missing
The article relies solely on survey data. Open-ended feedback from participants could provide deeper insights into why some rebuttals succeed or fail. The reliance on a single Likert-scale item per participant (rather than multiple questions per construct) constrains reliability (lines 209–211). Future iterations could triangulate beliefs with qualitative or behavioural data. This triangulation could be through open-ended feedback: collecting short written responses from participants after they engage with rebuttals can reveal the reasoning behind their reactions and explain why certain rebuttals succeed or fail. Behaviour tracking is another option: monitoring user interactions, such as time spent reading, scrolling patterns, or click behaviour, can provide indirect evidence of engagement and belief change. This would complement self-reported data.
Overemphasis on Quantitative Shifts
While numbers are useful, communication effectiveness may also depend on affective or motivational changes (e.g., increased willingness to talk about climate change). The current focus on survey outcomes (lines 219–225) overlooks these dimensions. The study could consider future work that broadens the outcome measures.
Citation: https://doi.org/10.5194/egusphere-2025-3812-CC4
CC5: 'Thank you for CC4', David Crookall, 16 Oct 2025
This is David Crookall, Guest Ed of the special issue on climate education. Thank you, Theresia, for your excellent community comment. In my view, it can be taken as a review (referee comment).
Citation: https://doi.org/10.5194/egusphere-2025-3812-CC5
AC5: 'Response to CC4's comments', John Cook, 31 Oct 2025
Many thanks for your very specific comments, which have led us to make a number of updates - especially addressing limitations in our research (which points to potential changes when we commence data collection on our website again, which we are planning to do). We've made the following revisions:
Figures: The text “(positive values mean increase in accuracy)” is now in the captions for Figures 4 and 5.
Experimental design: We have added some references providing extra depth on readability, but we note that we did examine the readability levels of our rebuttals to see if they correlated with change in accuracy, finding a non-significant correlation, so we didn’t delve into a deeper literature review as we didn’t see that as the most productive avenue to greater rebuttal effectiveness.
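For illustration, a minimal sketch of such a readability-versus-outcome check might look like the following; the file name, column names, and the choice of the Flesch-Kincaid grade (via the textstat package) are hypothetical stand-ins, not the analysis actually used for the paper.

```python
# Hypothetical sketch: correlate each rebuttal's readability with the mean
# pre/post change in accuracy observed for that rebuttal. File and column
# names are illustrative only.
import pandas as pd
import textstat
from scipy.stats import pearsonr

# One row per rebuttal: full text plus the mean accuracy change for its readers.
rebuttals = pd.read_csv("rebuttal_outcomes.csv")

# Flesch-Kincaid grade level as a simple readability proxy.
rebuttals["fk_grade"] = rebuttals["text"].apply(textstat.flesch_kincaid_grade)

r, p = pearsonr(rebuttals["fk_grade"], rebuttals["mean_accuracy_change"])
print(f"Readability vs. accuracy change: r = {r:.2f}, p = {p:.3f}")
```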
Reading time: We acknowledge that for some readers the reading time was disturbingly short. However, we didn’t exclude those responses from the analysis because 1) preliminary analysis found that doing so didn’t affect our statistical results, and 2) skimming behaviour is representative of how some readers browse websites, so we considered that it offered external validity regarding the impact of the website on readers. We have added discussion of this issue in the Methods.
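As a rough illustration of that kind of robustness check, the sketch below compares the pre/post belief shift with and without a minimum reading-time cutoff; the file name, column names, and 30-second threshold are hypothetical, not the study's actual variables or criteria.

```python
# Hypothetical robustness check: does excluding very fast readers change the
# pre/post comparison? Column names and the threshold are illustrative only.
import pandas as pd
from scipy import stats

def belief_change_test(df: pd.DataFrame):
    """Paired t-test comparing post-rebuttal ratings with pre-rebuttal ratings."""
    return stats.ttest_rel(df["post_belief"], df["pre_belief"])

responses = pd.read_csv("survey_responses.csv")  # hypothetical export

# Full sample, skimmers included (as in the manuscript).
full = belief_change_test(responses)

# Restricted sample: drop responses below a minimum reading time.
MIN_SECONDS = 30
kept = responses[responses["seconds_on_page"] >= MIN_SECONDS]
restricted = belief_change_test(kept)

print(f"All responses (n={len(responses)}): t={full.statistic:.2f}, p={full.pvalue:.3f}")
print(f"Reading time >= {MIN_SECONDS}s (n={len(kept)}): t={restricted.statistic:.2f}, p={restricted.pvalue:.3f}")
```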
Limitations: We’ve fleshed out the limitations section of the Discussion, specifying that future studies should collect more specific data targeting concepts such as reader motivation. We discussed the limits of single-exposure interventions and the challenges of repeated misinformation exposure. We added text discussing the benefits of more controlled lab experiments (including recruiting representative participant samples), although the fact that this study’s participants were representative of the readership of the SkS website means our results possess external validity for our specific context. We agree wholeheartedly that future research (which we hope to conduct after the website’s redesign) should include qualitative open-ended questions and mention this in the Discussion section - that is definitely our intent next time around. We appreciate the advice on behaviour tracking as well and will talk to the developers about the technical feasibility of this approach.
Citation: https://doi.org/10.5194/egusphere-2025-3812-AC5
Viewed
| HTML | PDF | XML | Total | BibTeX | EndNote |
|---|---|---|---|---|---|
| 737 | 113 | 25 | 875 | 26 | 21 |