At Authorea, we want to change the way scientists communicate and share their research. This includes giving all the information behind figures a place to live: by letting readers and reviewers access your data and code, your results can be easily reproduced and extended.

ADDING DATA TO ARTICLES

It’s really easy to incorporate Jupyter Notebooks as well as datasets in your articles. This guide illustrates the step-by-step procedure for adding any dataset to an Authorea article: navigate to the Data folder and add files there (they are added to Authorea’s underlying Git repository).

ADDING DATA “BEHIND” FIGURES

You can also associate data and Jupyter Notebooks with a specific figure. This guide illustrates the step-by-step procedure for adding a figure to an Authorea document and attaching data and Jupyter Notebooks to it. Whenever an Authorea document detects “data behind a figure”, a data flag icon appears next to the figure, as in the figure below. The icon alerts you, your collaborators, and readers that data is available behind the figure.
Authorea has a powerful new commenting framework built into the editor. Each document has comment bubbles to the right of the text that can be toggled to leave or view comments.

HOW TO COMMENT

To insert a comment anywhere in the document, click on the comment icon that appears in the right margin of your document.
Thomson Reuters’s reference management system ENDNOTE makes it easy to store, share, annotate, and export your citations in selected formats.

FIRST TIME SETUP IN ENDNOTE

Adding references from EndNote is easy. You’ll need to do a little one-time prep work first.

STEP 1 Download the BibTeX output style from the EndNote homepage.

STEP 2 Add the BibTeX output style to your EndNote Styles folder.

STEP 3 Open EndNote and, through the Edit drop-down menu, go to Output Styles > Open Style Manager...
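Once the BibTeX output style is selected, EndNote exports each reference as a plain-text BibTeX entry. For orientation, an exported journal article looks roughly like the following (the citation key, names, and field values here are purely illustrative, not taken from any real library):

```bibtex
@article{smith2014example,
  author  = {Smith, Jane and Doe, John},
  title   = {An Example Article Title},
  journal = {Journal of Examples},
  year    = {2014},
  volume  = {1},
  pages   = {1--10}
}
```

Entries in this format can be pasted or imported directly into any BibTeX-aware tool.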
A recent article titled The spin rate of pre-collapse stellar cores: wave driven angular momentum transport in massive stars was written on Authorea and submitted to the Astrophysical Journal (ApJ) and to the arXiv as a preprint. While waiting on peer review from the ApJ, the authors want to test Authorea as a platform for OPEN PEER-REVIEW. By going to the document’s page, you can comment on a section, figure, observation, sentence, or the whole piece. The authors and other commenters can respond and further the discussion. And it’s all out in the open, just how science was meant to be. But it doesn’t stop there. You can also view full-size, high-resolution versions of the paper’s figures, as well as easily follow links in the References at the bottom of the page. In the paper, the authors show for the first time how internal gravity waves, excited in the turbulent layers of stars at least ten times more massive than the Sun, can radically change their internal rotation rate. In particular, these waves – somewhat analogous to ocean waves – can determine how rapidly the stellar core spins around its axis when the star is about to die and become a supernova. The spin of a pre-supernova core is important because it deeply affects the stellar explosion and determines the rotation rate of the stellar remnant (neutron star or black hole).
The peer review process is a pillar of modern research, verifying and validating the ever-increasing output of academia. While the academic community agrees that some process of review is necessary to ensure the quality of published research, not everybody agrees on the best approach. In particular, doubts have been cast on the current peer review process: most journals select and assign one anonymous referee (few journals assign two or more) who is in charge of reviewing the manuscript and recommending it for publication or rejection. The argument is that the current peer review system is becoming inadequate. Here’s an incomplete list of issues:

- Research is increasingly collaborative, complex, and specialized. Thus, it is less likely that one or a few referees can have the necessary expertise (and time) to properly handle many modern articles. Simply put, THE AVERAGE NUMBER OF AUTHORS PER PAPER HAS BEEN STEADILY INCREASING IN THE LAST FEW DECADES, WHILE THE NUMBER OF REFEREES PER PAPER HAS NOT.
- “Publication pressure” means there is a growing number of papers to referee. This need cannot easily be met, since scholars, who must constantly publish and engage in the “funding race”, HAVE LESS TIME TO DEDICATE TO COMMUNITY SERVICE (in a “single referee” system the review process is very time consuming).
- Given the anonymous nature of peer reviewing manuscripts, RESEARCHERS WHO VOLUNTEER THEIR VALUABLE TIME AND KNOWLEDGE DON’T GET RECOGNITION for contributing.
- Cases of peer-review scams, mostly from predatory open access publishers, have grown in number over recent years. A number of journals, exploiting the publication-pressure climate, accept and publish articles with LITTLE OR NO PEER REVIEW.
- Similarly, there are reports of fraud in which authors review their own or close friends’ manuscripts to give favorable reviews.
Simple, mundane tasks can take huge tolls on our productivity. A growing body of research is quantitatively demonstrating the existence of willpower depletion, so don’t be a statistic. Authorea helps cut out the friction associated with academic writing and editing, so you can get back to doing what you love instead of just writing about it.

HERE’S THE SITUATION: You’ve spent months (or even years) running and reproducing experiments, keeping meticulous notes, collecting and annotating data, writing code, writing grant applications, presenting your progress and, occasionally, sleeping. All for that slam-dunk publication. Which, finally, you are writing up.

BUT THERE ARE SOME PROBLEMS. From stylistic preferences and building consensus with your colleagues, to back-up paranoia, to re-re-re-formatting your article for the ~~reach~~, ~~next best~~, ~~achievable~~ journal of last resort, Authorea has you covered.

AN AUTHOREA-TATIVE DIFFERENCE As the recent 200-plus-author CERN paper demonstrates, Authorea can really kill it when it comes to collaborating, on any scale. This is particularly powerful, given the well-feared notion that communication complexity increases as the SQUARE of the number of people on a project. Regardless of your level of distribution (worldwide or just down the hall), with Authorea, everyone you care to include can view, edit, comment, commit, and even upload and review data, code, and figures. Let’s consider that last point for a minute. No longer will you have to fumble for flash drives, attach and contextualize via email, or compile, edit, and view in separate programs. All the data and code associated with your figures is online, in the upper left-hand folder, for your collaborators to play with. Further, if your document is public, members of the wider Authorea community can comment, verify, and even fork it - increasing your FF and contribution to science. Pretty sweet.
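The “square of the number of people” worry comes from the standard pairwise-channel count popularized by Brooks’s The Mythical Man-Month: a team of n people has n(n-1)/2 possible two-way communication channels, which grows quadratically. A quick sketch of the arithmetic (the function name is ours, purely illustrative):

```python
def communication_channels(n: int) -> int:
    """Number of distinct pairwise communication channels in a team of n people.

    Each unordered pair of collaborators is one channel: n choose 2 = n*(n-1)/2.
    """
    return n * (n - 1) // 2


if __name__ == "__main__":
    # Channels grow roughly as the square of team size:
    for team_size in (2, 5, 10, 200):
        print(f"{team_size:>3} people -> {communication_channels(team_size)} channels")
```

For a 200-author paper like the CERN example above, that is nearly 20,000 potential pairwise channels, which is why a shared single-document workflow beats emailing drafts around.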
Authorea, given the oft-made comparisons to GitHub and Google Docs, also helps with versioning updates and the distributed editing of your manuscript. Let’s say your PI isn’t thrilled with your phrasing or explanation in the Discussion. With Authorea, you can: lock the section while you edit it (i.e., no one is looking over your shoulder, judging); commit the update for all to see (oh, how they will marvel); get real-time feedback through additional comments and edits; and, when your PI has a change of heart, easily revert to the section’s previous version (by clicking on that handy “History” clock icon). So, Authorea provides a platform for collaborative writing and review of your manuscript, an easy and automated citation mechanism, a one-stop repository for all your figures’ data, code, and editing, and even lets you get pre-publication feedback from your peers. What’s more, Authorea will also format your manuscript for the journal of your choice - text, figures, bibliography and all - at the click of a button.

TWO QUESTIONS:

1. Why _wouldn’t_ you use Authorea for your next collaborative publication?
2. _What would you do with the time saved_ when you’d otherwise be emailing around drafts and data, sharing and modifying code, clarifying, citing, and formatting?

Let us know in the comments!
In a breakthrough victory for open access, the EU’s European Medicines Agency (EMA) approved a system last week that provides researchers and the public with the vast majority of data from clinical trials. While generally resistant to such developments, some pharmaceutical companies are already opening up their data to scrutiny. In the US, the NIH’s clinicaltrials.gov hosts a similar database for voluntary submission of public and private clinical trial results. By January 1, 2015, however, all companies in the EU will be REQUIRED BY LAW to submit trial data for newly approved drugs. The FDA is considering adopting such a policy, with safety and efficacy data-mining projects in mind. As the analysis from ScienceInsider notes: “Published journal articles often contain the main outcomes, ... but lack detailed data and information about study design, efficacy, and safety analysis, which might shed a different light on the results when analyzed by others; moreover, some trials aren’t published at all.” The AllTrials campaign has argued that the details of every trial should be publicly available for anyone to study. Traditional publication formats disconnected from modern needs? A move toward data-rich scientific content? Opening up the process of verification and analysis to a wider audience? It’s as if science was always meant to be open, or something. Naturally, there are caveats to the ruling. Only identified researchers can download searchable trial results and data, while registered public users can only view results on-screen. Further, certain types of commercially relevant data may be redacted by companies, with the EMA providing an 18-month window before completed trial results are finalized and posted. Still, this represents a huge step forward for widespread access to, and synthesis of, information that could be critical for improving patient outcomes.
Long sought by many researchers, the beneficial network effects of open trial data have been lauded in the literature, with comparisons made to successes in the open-source community. In one example, data sharing led to rapid analysis and determination of treatment for a deadly _E. coli_ outbreak in 2011. By broadly applying standard protocols to ease the access and use of clinical trial information, researchers contend, we will see huge health care improvements. Results include learning which treatments are best in which circumstances, determining contraindications faster, and increasing adoption and innovation rates in treatments. Here’s hoping EMA’s actions are successful, the FDA approves similar measures, and science in aggregate opens up to take advantage of these synergistic network effects. EMA Q&A release