Citation is a powerful form of social communication that is largely unmanaged. Reference lists have more than quadrupled in length since the 1970s, yet there have been few important advances in how citations are used or understood. The impact factor reduces all of this to a fractional quantitative relationship. Citations can be used to create unfounded authority by expanding modest claims through a series of distortions. They can be critical of the source yet still count as a positive intellectual debt in impact-factor calculations. They can point to low-quality evidence, which is difficult to detect under time pressure or in busy editorial offices.
Because citations are slippery, scholars, researchers, and academics read the literature with caution. Editors have no useful tools for evaluating the quality of the reference lists in papers that they receive. And once a citation takes hold in the literature, there is little to stop it from being increasingly distorted with reuse.
In short, citations need help, editors need help, and readers need a tool that makes it easy for them to discern inadequate or commendable citation practices.
SocialCite is being introduced to begin to address those fundamental issues in a way that is compatible with normal workflows and current technologies.
Using a simple JavaScript widget that is easily inserted into online reference lists, SocialCite gives readers of the literature a two-click method for evaluating citations as a natural part of their reading workflow. If they wish, readers can also mark the type of citation they encounter: whether it critiques the source, cites assertions, cites evidence, or cites authority.
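For illustration only (the actual SocialCite widget and its data formats are not specified in this article), a two-click rating might translate into a small record like the one sketched below; the endpoint, field names, and types are hypothetical.

```typescript
// Hypothetical sketch only: SocialCite's real widget, endpoint, and field
// names are not documented in this article, so everything below is illustrative.

type CitationType = "critique" | "assertion" | "evidence" | "authority";

interface CitationRating {
  citingDoi: string;           // article whose reference list is being read
  citedDoi: string;            // reference being rated
  appropriate: boolean;        // dimension 1: is the citation appropriate?
  strongEvidence: boolean;     // dimension 2: does it point to strong evidence?
  citationType?: CitationType; // optional extra step described above
}

// One record per two-click rating, sent from the reference-list widget.
async function submitRating(rating: CitationRating): Promise<void> {
  await fetch("https://api.socialcite.org/ratings", { // hypothetical endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(rating),
  });
}
```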
There are long lists of the qualities that citations can possess, but SocialCite boils them down to two major dimensions: Is the citation “appropriate”? Does the citation point to “strong evidence”? SocialCite can limit its vocabulary in this way because it has been designed as a network tool. Most citations exist in multiple settings, in multiple journals, and in at least a small family of disciplines. If signals from across the literature are pooled, millions of data points become possible, creating a statistical mesh that can be analyzed to derive a number of new and useful measurements.
The data are even more robust because every citation has two sides: the citing article, author, and journal on one side, and the cited article, author, and journal on the other.
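To make the idea of pooling signals concrete, here is a minimal sketch of how ratings gathered from many citing articles and journals could be collapsed into per-paper statistics; the record shapes and field names are assumptions for illustration, not SocialCite's actual data model.

```typescript
// Hypothetical aggregation sketch: pooling ratings of the same cited paper
// across many citing articles and journals. All field names are illustrative.

interface RatingRecord {
  citingJournal: string;
  citedDoi: string;
  appropriate: boolean;
  strongEvidence: boolean;
}

interface CitedPaperStats {
  ratings: number;             // how many ratings the cited paper has received
  appropriateShare: number;    // fraction of ratings marked "appropriate"
  strongEvidenceShare: number; // fraction marked "strong evidence"
}

// Collapse network-wide ratings into per-paper summary statistics.
function aggregateByCitedPaper(records: RatingRecord[]): Map<string, CitedPaperStats> {
  const tallies = new Map<string, { n: number; appropriate: number; strong: number }>();
  for (const r of records) {
    const t = tallies.get(r.citedDoi) ?? { n: 0, appropriate: 0, strong: 0 };
    t.n += 1;
    if (r.appropriate) t.appropriate += 1;
    if (r.strongEvidence) t.strong += 1;
    tallies.set(r.citedDoi, t);
  }
  const stats = new Map<string, CitedPaperStats>();
  for (const [doi, t] of tallies) {
    stats.set(doi, {
      ratings: t.n,
      appropriateShare: t.appropriate / t.n,
      strongEvidenceShare: t.strong / t.n,
    });
  }
  return stats;
}
```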
The “appropriateness” measure refers mostly to the act of citing: Is the citation made in good faith, accurate, and free of distortion or expansion? Some articles become common sources of distortion, passing a critical threshold from evidence into belief, and false hubs of authority can form around such citation echo chambers. SocialCite’s goal is to detect when that occurs and to limit the damage before years of misdirected research questions emanate from a distorted authority hub. To that end, SocialCite will create a Care Index that shows which journals, authors, and papers exercise the greatest care in what they cite and in how they are cited.
The “strong-evidence” dimension focuses more on the cited work: Does the article being cited provide high-quality evidence? Too often, cited sources are underpowered, flawed in some manner, or merely overviews of other evidence that contribute nothing new of their own. Once SocialCite is in place, a paper may accumulate many ratings of high or low quality across the sources that cite it. SocialCite will create a Quality Index that shows which journals, authors, and papers cite the best evidence or are themselves the sources of the best evidence.
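The article does not define how the Care Index or the Quality Index would be calculated; purely as an illustration, the sketch below treats each as a simple share of positive ratings at the journal level, a placeholder rather than SocialCite's actual method.

```typescript
// Placeholder formulas only: how SocialCite computes its indices is not
// specified here. These treat each index as a share of positive ratings on
// the citing side; the real Care Index is described as also reflecting how
// a journal's own papers are cited.

interface JournalRating {
  citingJournal: string;   // journal of the citing article
  appropriate: boolean;    // was the citation judged appropriate?
  strongEvidence: boolean; // did it point to strong evidence?
}

// Care Index placeholder: share of a journal's rated citations judged appropriate.
function careIndex(ratings: JournalRating[], journal: string): number {
  const own = ratings.filter((r) => r.citingJournal === journal);
  return own.length === 0 ? 0 : own.filter((r) => r.appropriate).length / own.length;
}

// Quality Index placeholder: share of a journal's rated citations judged to
// point to strong evidence.
function qualityIndex(ratings: JournalRating[], journal: string): number {
  const own = ratings.filter((r) => r.citingJournal === journal);
  return own.length === 0 ? 0 : own.filter((r) => r.strongEvidence).length / own.length;
}
```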
Testing with more than two dozen scientific researchers and editors revealed a strong awareness of the problem that SocialCite seeks to address. Participants also reported that reference lists are consistently used as part of the reading process, and they viewed the absence of a feedback tool around this major intellectual activity as a significant deficiency in the current design of online journals. Finally, all agreed that two or three clicks would pose no barrier to using SocialCite.
To encourage its use, SocialCite will offer all participants a free dashboard in addition to immediate feedback for any interaction. The personal dashboard for individual users will keep track of rated citations and provide summary data on journals that they routinely use. The dashboard for publishers will flag problematic citations in their journals and present summary data on all their titles.
Publishers will benefit from the increased use of their HTML pages and from the message that participation in SocialCite sends to researchers and readers: We care about quality, we understand that you use citations, and we have a technology that benefits your core activities.
We are also envisioning an editorial tool for use once SocialCite’s data are robust enough to support strong inferences about journals and papers. Our vision is a preflight tool for manuscripts at any stage of the review process. The tool would flag citations that are potentially problematic, either because they are being used inappropriately across multiple sources or because they point to evidence that is consistently rated as low quality. It would only flag potential problems and recommend extra editorial attention. In addition, we hope to develop a tool that would flag authors who are routinely associated with papers that distort citations or that are regarded as low quality.
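As a rough sketch of how such a preflight check could work, assuming per-reference statistics like those above are available, the function below flags references whose pooled ratings fall short of invented thresholds; every name and number here is illustrative rather than a description of the planned tool.

```typescript
// Hypothetical preflight sketch: flag references in a submitted manuscript
// whose network-wide ratings suggest misuse or weak evidence. Thresholds and
// data shapes are invented for illustration only.

interface NetworkStats {
  ratings: number;             // how many ratings the reference has received
  appropriateShare: number;    // fraction of ratings marked appropriate
  strongEvidenceShare: number; // fraction marked strong evidence
}

interface Flag {
  doi: string;
  reason: "often cited inappropriately" | "consistently rated as low-quality evidence";
}

// Check each reference against pooled ratings; the tool flags, it never rejects.
function preflightCheck(
  referenceDois: string[],
  stats: Map<string, NetworkStats>,
  minRatings = 25,       // ignore references with too little signal
  appropriateFloor = 0.6,
  evidenceFloor = 0.5,
): Flag[] {
  const flags: Flag[] = [];
  for (const doi of referenceDois) {
    const s = stats.get(doi);
    if (!s || s.ratings < minRatings) continue; // not enough data to judge
    if (s.appropriateShare < appropriateFloor) {
      flags.push({ doi, reason: "often cited inappropriately" });
    }
    if (s.strongEvidenceShare < evidenceFloor) {
      flags.push({ doi, reason: "consistently rated as low-quality evidence" });
    }
  }
  return flags; // editors decide whether any flag deserves extra attention
}
```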
SocialCite’s potential is important, but its success depends on one crucial factor: the network effect. It must be installed in a large number of journals if it is to generate robust, reliable data across disciplines. That is why our business model imposes no costs on publishers who want to install it (other than any work a platform provider might do to install the widgets). On the basis of our user research, we believe that the widget will increase engagement with publisher sites, especially their HTML pages. SocialCite was enough of a draw that, when asked how they would rate a citation in a printed PDF, many users said they would return to the SocialCite-enabled paper online to rate citations that had caught their attention.
Social media are changing the world. News stories trend on Twitter and Facebook. Political and athletic careers are bolstered or torpedoed by social media’s power and transparency. Science is advancing through social-media initiatives like Zooniverse. It is time for the power of social media to come to the evidence base of science by allowing scientists to indicate the quality and appropriateness of citations in the literature that they rely on. SocialCite has the potential to match the increased velocity and quantity of publication with increased quality and interactivity.
If you would like to be an early adopter or simply learn more, visit us at www.socialcite.org for more information and to register for your free account.
KENT ANDERSON is the founder of Caldera Information Solutions, LLC, and SocialCite.