In recent coverage of a massive meta-analysis of the Google Scholar archives, the top-ten “elite” journals are compared to “the rest” across several broad disciplines. For papers published from 1995 to 2013, the number of top-1000-cited papers appearing in non-elite journals rose by 64% on average (here, “elite” means the ten most-cited journals in a given category; “non-elite” means the rest). Lest you worry these represent the _only_ cited articles in non-elite journals: the total share of citations going to non-elite articles rose from 27% to 47% over the same period.

Part of the reason for this shift is digitization. In the conclusion of the paper, the team responsible for Google Scholar (launched ten years ago, in November 2004) states:

> Now that finding and reading relevant articles in non-elite journals is about as easy as finding and reading articles in elite journals, researchers are increasingly building on and citing work published everywhere.

With the introduction of exactingly searchable databases, the playing field is indeed leveling for access to, and awareness of, all tiers of journals, splashy-high-impact or otherwise. This naturally leads to faster and more efficient scientific work. (Imagine getting even closer, accessing new developments and discoveries in near-real time. If you think the rate of progress in science is dizzying _now_...)

Not mentioned, however, is that fields have grown more specialized, and publishers have responded by producing more specialty-specific journals. This may partly account for the increased share of non-elite citations: a groundbreaking article published in a lower-impact specialty journal becomes a necessary citation in many subsequent papers in that and related fields. Another interesting question for future studies is how open-access journals measure up in citation rate. It has also been documented that high-impact, elite journals have higher rates of retraction.
Do the high-impact works from non-elite journals show comparable rates of retraction? Given their high impact, many of the explanations offered for elite journals’ higher retraction rates should still apply (i.e., increased exposure and thus increased scrutiny). Regardless, it is clear that academic publishing is being rethought and that changes are underway. Hopefully scientists return to their roots of open discourse and dissemination of their data SO WE CAN GET FURTHER, FASTER, TOGETHER.
The traditional way to publish scientific work is to write a narrative describing the experiments performed and the conclusions drawn. Nowadays, the pressures of funding and journal impact factor generate a vicious circle that promotes, at the very least, inflation of the minimum number of relevant findings required for publication and the over-stretching of claims. As a consequence, the problem of reproducibility in science has risen to the attention of the media, including The New York Times and The Wall Street Journal.
BIRDING FOR SCIENCE

In a noteworthy display of citizen science, a 20+ year project has resulted in a publication exposing ecological shifts caused by climate change. Known as Project FeederWatch, an international community of birding enthusiasts watches their feeders during winter, recording counts of the species present. By aggregating and analyzing time-resolved data across different regions (specifically the Eastern U.S. in this study), researchers found dramatic poleward shifts, with warm-adapted species increasingly dominating counts. Since the project started in 1990, rising average winter temperatures have correlated with groups of birds that are seen together (owing to similar environmental adaptations) making up higher proportions of the total further north. The current study estimates that this shift in species proportions is advancing at approximately 70 km per decade.

While the evidence for the effects of climate change on ecology is compelling and documented throughout the literature, the authors note potential sources of deviation in their study: random variation in migration patterns, increases in food supplementation, habitat destruction, and variation in the distribution of natural food sources are possible confounding factors. The last one, in fact, could itself be an effect of climate change, a point the authors emphasize: climate change is not just affecting the species studied, but the broader ecosystems as well. In further support of their conclusions, the same magnitude of species shift is seen in European studies using the same statistical models.

Beyond the content of the study, the sourcing of the data is intriguing. As of its 26th season, Project FeederWatch has over 20,000 participants. For a small fee and 1 to 4 half-days during each two-day observation period, volunteers get a bird identification poster, a detailed protocol guide, a summary of the previous winter’s data, and THE CHANCE TO CONTRIBUTE TO SCIENCE. They even provide their own bird feed, feeder, and water.
Dozens of papers have used these datasets, and accumulated graphs and trends are viewable online. Real, impactful research from people doing what they love: enjoying nature in its many-feathered ways. This is what inspires young minds and gives citizens a deeper connection to the world around them. If it weren’t for these contributions and COLLABORATIVE EFFORT, ecological studies of this scale would be fewer and further between. With technological advances constantly distracting us, how could we better harness our connectivity to help citizens explore, observe, and make a difference in the world through science?
In this week’s _Nature_, a special issue on the evolution of modern academic institutions, Arizona State University (ASU) President Michael Crow and his vision of the New American University are profiled. Appointed President in 2002 after serving as Executive Vice Provost at Columbia University, Crow began restructuring ASU. His goal: to shape it into A HUB OF MULTIDISCIPLINARY RESEARCH, ENTREPRENEURSHIP, AND INNOVATION. Twelve years into Crow’s tenure, ASU has expanded its campus, founding new research institutes like the Biodesign Institute, the School of Earth and Space Exploration, and the School of Human Evolution and Social Change. The university’s growth in funding and collaboration is also remarkable. From the _Nature_ piece:

> ASU’s funding numbers show that grant-givers find the cross-disciplinary approach attractive. From 2003 to 2012, the university’s FEDERALLY FINANCED RESEARCH PORTFOLIO GREW BY 162%, vastly outpacing the average increase seen at 15 similar public institutions. ... The number of funded projects with principal investigators in two or more departments rose by 75% between 2003 and 2014.

While the article further notes that ASU’s publication rate has more than doubled, it asserts that the university’s scientific profile has hardly been raised, citing largely unchanged proportions of publications in high-profile journals or with high numbers of citations. This analysis doesn’t account for some important factors:

1. Since ASU’s funding has more than doubled, one should expect an explosive BOOM in publications and citations over the next few years;
2. Collaborating, whether within the same institution or across the world, is fraught with challenges, so pace may lag;
3. The research from ASU’s new institutes and far-reaching collaborations is inherently different (innovation is, by definition); as Crow said, “We don’t want to ask the same questions as other institutions,” so there aren’t yet large circles to cite these early works;
4.
Building off the above point, it can take years for journal articles to accumulate even a portion of their lifetime citations (an argument against impact factor).

Even without these caveats, ASU’s progression is striking, and THE ESSENCE OF CROW’S NEW AMERICAN UNIVERSITY MODEL PRESCIENTLY ANTICIPATED RECENT DEVELOPMENTS IN THE MODERN UNIVERSITY. Observing ASU’s emphasis on cross-disciplinary entrepreneurship and innovation (E&I) through research, one notes similar trends at top-tier universities across the world. With job markets in flux, a rapidly changing economy, and an ever-increasing focus on science and technology, schools attract students and build connections to business through E&I hubs, an explicit goal of Crow’s vision for ASU. Rethinking and reimagining research and education at academic institutions is critical for universities and their students to remain competitive. Best of all, science and society will both benefit. Here’s to hoping the New American University expands beyond Phoenix, Arizona.
Authorea has a powerful new commenting framework built into the editor. Each document has comment bubbles to the right of the text that can be toggled to leave or view comments.

HOW TO COMMENT

To insert a comment anywhere in the document, click on the comment icon that appears in the right margin of your document.
At Authorea, we want to change the way scientists communicate and share their research. This includes giving all the information behind figures a place to live: by letting readers and reviewers access your data and code, your results can be easily reproduced and extended. ADDING DATA TO ARTICLES It’s really easy to incorporate Jupyter Notebooks as well as datasets in your articles. This guide illustrates the step-by-step procedure for adding any arbitrary dataset to an Authorea article: navigate to the Data folder and add files there (they get added to Authorea’s underlying Git repository). ADDING DATA “BEHIND” FIGURES You can also associate data and Jupyter Notebooks with a specific figure. This guide illustrates the step-by-step procedure for adding a figure to an Authorea document and attaching data and Jupyter Notebooks to it. Whenever an Authorea document detects “data behind a figure”, a data flag icon appears next to the figure, as with the figure below. The icon alerts you, your collaborators, and your readers that data is available behind the figure.
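Under the hood, adding a dataset amounts to committing a file into the article’s repository. As a minimal local sketch of that Git workflow (the folder layout and file names here are hypothetical; the Authorea web interface performs the equivalent steps for you):

```shell
# Create an article folder with a data/ subfolder (names are hypothetical)
mkdir -p my-article/data

# Drop a small example dataset into the data folder
printf 'temp_C,count\n21.5,34\n' > my-article/data/measurements.csv

# Initialize a repository and commit the dataset so it travels with the article
git -C my-article init -q
git -C my-article add data/measurements.csv
git -C my-article -c user.name="Reader" -c user.email="reader@example.com" \
    commit -q -m "Add dataset behind Figure 1"

# The dataset is now versioned alongside the manuscript
git -C my-article log --oneline
```

Because the data lives in version control rather than as an email attachment, every revision of a figure’s underlying numbers stays traceable.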
Friday, an op-ed piece _actually_ titled “Academic Science Isn’t Sexist” went up on the _New York Times_ blog (a version appeared in the Sunday Review). It was about academic research and the lack of sexism therein. The two editorialists are co-authors of a recently released analysis on the subject (it _is_ beautifully open access, and much of the raw data is available). The piece and the paper claim sexism has largely waned in academic research, the result of shifts away from a previously sexist, male-dominated academy. Further, they claim that any remaining incongruities between male and female enrollment, advancement, and achievement are artifacts and anecdotal. Academic research is completely gender-blind now; any differences are largely the product of society-at-large and earlier life decisions (like the choice to play with dolls/cute animals versus trucks/destructive robots). Huh. The response from the science blogging community and Twittersphere was immediate and is still ongoing. Jonathan Eisen responded Halloween night, soon after the piece was posted. His immediate critique was that the op-ed acknowledges reports of “physical aggression” without ever addressing them in the data or analysis (even the 60+ page research paper is short on coverage). The assumption: they are also anecdotal? So everything is actually fine? Probably not: one linked article details accounts of _sexual misconduct in field work_ involving biology, anthropology, and other social sciences, disciplines the authors above highlight as _largely welcoming and open to women_. Emily Willingham provides excellent analysis of the data presented in the paper and of the broader debate at hand. It turns out there are numerous discrepancies and avoided topics of analysis (e.g., salary figures often had statistically significant differences by gender; women more often reported lack of inclusion; more details in her impeccable post).
Likewise, Matthew Francis covered the story, emphasizing the need to actively address these still-existent problems rather than ignore them: even a little explicit encouragement of female students in the face of implicit discouragement (like he sees in his native field of physics) is often all that’s needed. The ever-emphatic PZ Myers rounds out the debate by breaking down the major reasoning and assumptions in the original paper, with characteristic gusto. So what exactly were the original authors thinking? A handful of distributed scientists were able to challenge the key arguments of their paper, using the paper’s own data and citations, in their free time over the weekend. Talk about peer review. Seriously though, what were they thinking? I would _like to think_ that this was actually a brilliantly orchestrated publicity stunt to get more attention on this critical issue. AFTER ALL, WHO IS GOING TO BLOG/TWEET/COUNTER-OP-ED “ACADEMIC SCIENCE IS SLIGHTLY LESS SEXIST THAN WHEN MALE ACADEMICS COULD STILL SMOKE IN THEIR OFFICES”? Because when you look at the data, the background on this issue, and the immediate response from the community, it’s obvious academic research isn’t now some utopian meritocracy brimming with equality. There are still institutional and systemic biases. Whether they are tied to gender, race, sexual preference, or need, or bound up in an archaic publishing system that is all too easily gamed, we have a long way to go before things can be considered “fair”. What might a fair system even look like?