Bad Faith
I’ve been thinking a lot this past month about the concept of “bad faith”, particularly as it relates to peer review, in light of the so-called “Sokal Squared” hoax. There isn’t space here to go into much detail, but in short, a trio of academics crafted 21 “fake” papers and submitted them to several cultural studies journals; 7 were accepted, which they felt revealed serious problems with these journals and the field of cultural studies in general. If you’re interested in more details (and why they didn’t really prove what they claimed to prove), I recommend this article in Slate, but I think this Twitter thread from one of the reviewers of the fake papers is worth focusing on.
In their article exposing the hoax, the hoaxers claimed that the respectful comments provided by this reviewer and others betrayed an underlying problem with the field, even though the article was ultimately rejected. The reviewer, however, notes that while he thought the paper had a lot of problems, he wanted to be constructive and provide feedback on what he took to be an argument presented in good faith.
And it is this assumption, that manuscripts are submitted in good faith, that undergirds all of peer review, even in the more basic sciences. When an article is discovered to have been submitted in bad faith, when it contains fraudulent data, it is not uncommon to see claims that peer review failed. And while it is true there are instances where reviewers and editors should have caught obvious manipulations, every editor and reviewer must take it on faith that the data and images provided for every paper are what the authors say they are: for example, that the gels provided actually do relate to those specific proteins, and that the numbers in the tables correspond to actual recordings.
But it doesn’t have to be just a matter of faith. As it says in the Bible, “thus also faith by itself, if it does not have works, is dead” (James 2:17). In this passage, the writer of James is saying that faith itself is meaningless if it is not backed by good deeds; in the context of peer review, I would argue that transparency is the analogue to works. As we push for increasing transparency and availability of data, through initiatives such as FAIR (see below) and others, we cannot completely stop bad actors, but we are certainly making their work harder. It’s one thing to manipulate a single image or table in an article, but it simply takes more time and effort to fabricate all the underlying data, and the accessibility of that data to independent reviewers increases the chances of being caught, if not during peer review, then at some point after publication.
It would be misguided to approach every author as a potential fraud: even when problems are found, it can be hard to discern whether they are mistakes or something worse. But as science becomes more transparent, more open, we will be better able to see that an author is “justified by works, and not by faith alone.”
Jonathan Schultz
Editor-in-Chief, Science Editor
Note: This month we’re starting a new interactive feature called “Question of the Month,” so please be sure to scroll all the way down to see how you can contribute.
Recent Early Online Articles & the Acronym of the Month
As noted above, journals and organizations are working hard to support and promote transparency and data accessibility in scientific research. In this recent article by Shelley Stall and co-authors, you’ll find a great example of an initiative to create “New Author Guidelines Promoting Open and FAIR Data in the Earth, Space, and Environmental Science”.
This also provides a perfect opportunity to introduce a new ongoing feature, the Acronym of the Month. Scientific publishing is larded with acronyms (and the occasional initialism), so we thought it would be helpful to regularly discuss some in this space, making you less likely to be caught off guard when they come up in meetings. This month, it’s the aforementioned FAIR, which stands for “Findable, Accessible, Interoperable and Re-usable Data”; you can find additional information about the FAIR data principles on the website of FORCE11 (you’ll have to look that one up on your own).
Hot Articles from Recent Issues (For CSE Members only)
As a CSE member benefit, once Science Editor articles are moved to an issue, they are available only to CSE Members for one year.
This month, CSE Members should be sure to check out Author Surveys: Insights into Iterative Author Survey Campaigns. In this article, Jessica Rucker and Jody Plank of the American Chemical Society provide helpful tips, and things to avoid, when creating author surveys. One key piece of advice: “Test, test, and retest your surveys before launch.”
Not a CSE member? Additional membership info along with instructions for becoming a member of the Council of Science Editors can be found here.
CSE Members are reminded to update their member profiles following these steps so they can access all of Science Editor.
From the Archives
Following the themes of environmentalism and conservation highlighted in some of the articles in the most recent issue of Science Editor, here’s a delightful article from April 2009 on “Rachel Carson, Science Editor” [pdf].
Resource of the Month
Being an editor and working at a scientific publication requires keeping up with a rapidly changing scientific and publishing landscape, so each month I plan to highlight a resource that will hopefully make this at least a little bit easier.
This month’s suggested resource is courtesy of Editorial Board member Lyndsee Cordes: The Conscious Style Guide.
As she notes, the Conscious Style Guide is “a resource for inclusive writing and editing. I’ve used this for questions around pronouns, preferred terms, etc. There’s also a newsletter with interesting reads from around the web and updates on new entries.”
Question of the Month
We end this month with what we hope will be a new ongoing feature, the Question of the Month. Each month we’ll ask a different question about experiences related to scientific editing and publishing, and we want you, the reader, to provide your answer. If we receive enough interesting responses, we’ll compile them into a future Science Editor article.
This month’s question: If you could provide one piece of advice to budding science editors on how they can improve their editing skills, what would it be?
If you have a response, email it to scienceeditor@councilscienceeditors.org. A special thanks to Editorial Board member Erin Nyren for this month’s question (which inspired this feature).
Feedback and suggestions are always welcome at scienceeditor@councilscienceeditors.org.
We are also always looking for new submissions or article suggestions you may have; for more details, see our Information for Authors.