Authorea as Submission Platform: Value Adds for Journals
We spend a lot of time talking about the value Authorea can provide research collaborators as well as the open science movement as a whole (don’t worry, there’s more to come!), but today let’s turn the tables.
How can Authorea not only help journals expedite peer review, typesetting, and submission, but also increase overall article quality and impact?
Authorea provides a simple writing, editing, and reviewing platform for researchers across the globe. The same tools could be put to work by journal referees. With Authorea’s in-context Comments, reviewers can leave highly contextualized notes anchored to specific passages (no vague lists of changes or requests for more information). Further, depending on the privacy settings of the refereed article (public, all co-authors, or reviewers only), reviewers may feel some positive pressure to complete their duties promptly and thoroughly. It’s amazing what a little competition and supervision can do.
Format and Submission Guidelines
So you want to submit your painstakingly researched article to Journal X. Journal X is a good journal, easily in the top five of your field. Great. Journal X has a website, a page specifically describing the desired format and file type (and tone, verb tense, grammar, etc.), and a place to send or drop your file. Wonderful. So you leap through all those hoops, submit, and then…this.
With Authorea, you can bypass all the submission slog with one click, getting published (or rejected) that much faster. Authorea also lets you open your work to wider scrutiny earlier, helping you clear up issues before they jeopardize your publication chances (see Linus’s Law below). What’s more, in the sad event Journal X rejects your manuscript, you can quickly and easily re-format your work for Journal Y, a comparable journal that might “get it” more. After all, Nature held up publishing on the Krebs cycle in 1937.
Data & Code, at Your Fingertips
With data, code, and interactivity baked into the Authorea interface, reviewers can easily fact-check figures and claims, confirming that they get the same results. This lets reviewers crack down on less-than-scrupulous researchers as well as honest-to-goodness errors, making it easier than ever to ensure the rigor of published work.
Linus’s Law: Wider Community, Bigger Effects
This last point cuts to the core of what truly open science can offer. But first, a relevant digression on bottom-up design in open source programming:
Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone.
Or, less formally, “Given enough eyeballs, all bugs are shallow.”
I dub this: “Linus’s Law”.
My original formulation was that every problem “will be transparent to somebody”. Linus [Torvalds] demurred that the person who understands and fixes the problem is not necessarily or even usually the person who first characterizes it. “Somebody finds the problem,” he says, “and somebody else understands it. And I’ll go on record as saying that finding it is the bigger challenge.” - Eric S. Raymond
This is the HUGE value add for researchers and publishers, as well as for Science (the amorphous process as well as the community) and Humanity. With a sufficient number of Authorea-based reviewers - distributed across the world, across skill levels, and across areas of expertise - any gaps in the code, data, experiments, figures, methodologies, statistics, or conclusions will jump out at someone. That incongruity could then be flagged for the official journal referees to wrestle with.
No longer would a few choice referees be the sole arbiters of “true results”. Going a step further, Authorea community members could re-create pre-publication articles’ experiments. This is where forking comes in: there could be real reproduction and immediate advancement of research, with credit and confirmation (and even collaboration) given where it’s due. It may seem idealistic, but it is by no means impossible.
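Since Authorea articles are backed by Git, the forking-and-reproduction workflow above can be sketched in a few commands. This is a local simulation under stated assumptions: the repository names, file names, and commit messages are hypothetical stand-ins, with a local directory playing the role of a hosted article.

```shell
# Local sketch of forking an article for reproduction.
# "original-article" stands in for a hypothetical hosted article repo.
git init -q original-article
echo "Result: x = 42" > original-article/results.md
git -C original-article add results.md
git -C original-article -c user.email=author@example.org -c user.name=Author \
    commit -qm "initial article"

# A community member "forks" the article by cloning it,
# then records their reproduction attempt as a new commit:
git clone -q original-article reproduction
echo "Reproduced: x = 42" >> reproduction/results.md
git -C reproduction -c user.email=reviewer@example.org -c user.name=Reviewer \
    commit -qam "reproduction notes"
```

Because the clone keeps the full history, the original author’s commits and the reproducer’s commits sit side by side, so credit and confirmation travel with the work.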