Public Articles
Engineering a Table Tennis Smart Racket for Game Analysis and Coaching
Geoscience Papers of the Future: Lessons Learned from Practicing Reproducible Research, Open Science, and Digital Scholarship
and 15 collaborators
The Geosciences Paper of the Future Initiative was created by the EarthCube OntoSoft project and its Early Career Advisory Committee formed by 30 geoscientists in different disciplines in order to disseminate best practices for reproducible publications, open science, and digital scholarship. The Initiative consists of three major efforts:
the compilation of best practices from a variety of community organizations (e.g., ESIP, RDA), scientific societies (e.g., AGU, AAAS, CODATA), curators (e.g., IEDA, NSIDC), and publishers (e.g., Nature, Science);
the dissemination of best practices through training sessions at major scientific conferences (e.g., AGU, GSA, ASLO, CEDAR) and at research institutions (e.g., WHOI, USGS). The training materials, including a summary checklist for authors, are openly available and show authors how to manage their scholarly identity, reputation, and impact throughout their careers; and
the publication of a special issue of the AGU Earth and Space Science journal on Geoscience Papers of the Future containing articles that illustrate how to apply these best practices in different geosciences areas, with another special issue of the journal Geophysics under way.
A Geosciences Paper of the Future follows best practices to document all the associated digital products that result from the research reported in the paper. This means that a paper would include:
Data available in a public repository, including documented metadata, a clear license specifying conditions of use, and citable using a unique and persistent identifier
Software available in a public repository, with documentation, a license for reuse, and citable using a unique and persistent identifier
Provenance of the results by explicitly describing the series of computations and their outcome in a workflow sketch, a formal workflow, or a provenance record, possibly in a shared repository and with a unique and persistent identifier
These best practices are described in detail in \cite{Gil-etal-ess16}.
The Geoscience Papers of the Future published to date not only serve as exemplars of how to implement best practices, but also expose limitations of existing cyberinfrastructure capabilities to support scientists in their work.
In this paper, we give a synthesis of perspectives by GPF authors contrasting the approaches used to implement GPF best practices in their own disciplines, the lessons learned, the challenges encountered, and the benefits found. We should summarize here the main findings.
The paper starts with an overview of the articles that illustrates the breadth of disciplines, motivations, and approaches covered by all the GPFs. We then compare the different papers along common dimensions. We discuss the benefits and the challenges found. We conclude with prospects for the future.
NOTE from 5/15/17 meeting: Add a comment about the different levels of reproducibility.
Comparison of the Differential Cross Section of High Energy Gamma Rays using Classical and Quantum Electromagnetism
In an experiment conducted in 1923, Arthur Holly Compton observed the inelastic scattering of photons by charged particles at high energy and correctly predicted the result through a derivation made possible by attributing particle-like momentum to light quanta. This result established the particle-like behavior of light and continued the discussion of the wave-particle duality of light. This was significant because the behavior of light was widely accepted as purely wave-like in the study of classical electromagnetism. The modern interpretation of light considers it both wave and particle.
At the time of Compton’s discovery, light was largely understood as a wave. One example of the wave-like behavior of light was observed in the slit experiments, in which light diffracted and interfered like water waves rather than particles. In 1905, this view was challenged by the explanation of the photoelectric effect \cite{Einstein_1905}, which tied the likelihood of electron emission from a metal illuminated by light to the frequency of the light rather than its intensity. This interpretation suggested that light may behave more like a stream of particles than a wave phenomenon. The observation gave rise to the expression of the photon energy in terms of frequency, \begin{equation} E = h f \end{equation} where \(h\) is Planck’s constant and \(f\) is the frequency of the light. The result was inconsistent with classical electromagnetism, which claimed that scattering is determined by the intensity of the light.
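For context, treating light as quanta of energy \(hf\) and momentum \(hf/c\), and applying conservation of energy and momentum to scattering off an electron at rest, yields the standard Compton wavelength-shift relation \begin{equation} \lambda^{\prime}-\lambda=\frac{h}{m_{e}c}\left(1-\cos\theta\right) \end{equation} where \(\lambda\) and \(\lambda^{\prime}\) are the incident and scattered wavelengths, \(m_{e}\) is the electron mass, and \(\theta\) is the scattering angle.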
Nature Communications Template
Location and Routing Algorithms. The Kalman Algorithm (Kuzmin)
and 1 collaborator
The Kalman filter is an efficient recursive filter that estimates the state vector of a dynamic system from a series of incomplete and noisy measurements. It is named after Rudolf Kalman and was first described in 1960 \cite{Selcuk2002}.
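To make the recursive predict-and-correct structure concrete, the following is a minimal one-dimensional Kalman filter sketch in Python; the constant-state model and the noise variances are illustrative assumptions, not parameters taken from the article.
\begin{verbatim}
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.05, x0=0.0, p0=1.0):
    """Recursive 1-D Kalman filter for a constant state observed with noise.

    q  -- process-noise variance (assumed value)
    r  -- measurement-noise variance (assumed value)
    x0 -- initial state estimate
    p0 -- initial estimate variance
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: with a constant-state model only the uncertainty grows.
        p = p + q
        # Update: blend prediction and measurement using the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

# Example: noisy observations of a true value of 1.0.
rng = np.random.default_rng(0)
z = 1.0 + rng.normal(0.0, 0.2, size=50)
print(kalman_1d(z)[-5:])
\end{verbatim}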
Parallel implementations of a TV-\(L^{1}\) image-denoising algorithm
Image acquisition involves many hardware and software stages that introduce error sources. These appear as visual artifacts in the image, typically recognized as noise. This can be especially noticeable in images acquired at low levels of illumination, such as in night photography.
Two common manifestations of noise in digital images are Gaussian noise and salt-and-pepper noise. Gaussian noise is typically associated with errors in detection. It produces pixel values that vary within a quasi-normally distributed range about the “true” value at that point in the image. Salt-and-pepper noise typically arises from transmission errors, and the pixel value is recorded as either fully on or fully off (in grayscale, white or black). Removing these artifacts can be desirable from an aesthetic perspective or in order to pre-process images for other workflows. Common techniques to address this include Gaussian blurring, which takes the convolution of an image with a Gaussian kernel window, and median filtering, which replaces pixels with the median value of a sliding window. These techniques can reduce noise; however, they are also susceptible to blurring edges of features.
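As a baseline for the variational methods discussed below, a minimal sketch of these two classical filters using OpenCV's Python bindings might look as follows; the kernel sizes and file names are illustrative.
\begin{verbatim}
import cv2

# Load a grayscale image (file name is illustrative).
img = cv2.imread("noisy.png", cv2.IMREAD_GRAYSCALE)

# Gaussian blurring: convolve with a 5x5 Gaussian kernel
# (sigma is derived from the kernel size when passed as 0).
gauss = cv2.GaussianBlur(img, (5, 5), 0)

# Median filtering: replace each pixel with the median of a 5x5 window,
# typically more effective against salt-and-pepper noise.
median = cv2.medianBlur(img, 5)

cv2.imwrite("gaussian.png", gauss)
cv2.imwrite("median.png", median)
\end{verbatim}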
The total variation technique was introduced in 1992 by Rudin, Osher and Fatemi (ROF) \cite{Rudin_1992} as an alternative denoising method. The method works by iteratively constructing a function u on a domain Ω that minimizes an energy functional relative to an input function f: \begin{equation} \min_{u}\int_{\Omega}\left\Vert{\nabla{u}}\right\Vert + \lambda\int_{\Omega}(u-f)^2 \end{equation} with ‖ ⋅ ‖ being the L2 norm. The first term is the total variation of the image and acts as a regularizer in the minimization. The second term is a squared L2 data-fidelity term, which makes the problem strictly convex so that it has a unique solution \cite{chambolle:hal-00437581}.
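For the ROF model, a minimal sketch using the Chambolle solver available in scikit-image is shown below; the noise level and the weight value are illustrative, with the weight roughly playing the role of the inverse of \(\lambda\) above.
\begin{verbatim}
import numpy as np
from skimage import data, util
from skimage.restoration import denoise_tv_chambolle

# Build a noisy test image (Gaussian noise; parameters are illustrative).
clean = util.img_as_float(data.camera())
noisy = clean + np.random.normal(0.0, 0.1, clean.shape)

# ROF-style total-variation denoising: a larger weight removes more
# noise at the cost of smoothing fine detail.
denoised = denoise_tv_chambolle(noisy, weight=0.1)
\end{verbatim}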
Replacing the second term above with an L1 norm leads to the TV-L1 model: \begin{equation} \min_{u}\int_{\Omega}\left\Vert{\nabla{u}}\right\Vert + \lambda\int_{\Omega}\left\vert{u-f}\right\vert \end{equation} Unlike the ROF model, the TV-L1 model is not strictly convex, so its solution is not necessarily unique; however, when applied to discrete images, it tends to offer better performance at removing salt-and-pepper noise than the ROF model and is thus an important image processing technique. It is a standard method implemented in many computer vision packages, including OpenCV \cite{noauthor_opencv:_2017}.
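As a purely illustrative sketch of the underlying minimization (not the algorithm used by OpenCV's denoise_TVL1 routine, nor the parallel implementations studied in this paper), a naive subgradient-descent iteration on the discrete TV-L1 energy can be written as follows, assuming periodic boundary conditions and an image scaled to [0, 1].
\begin{verbatim}
import numpy as np

def tv_l1_denoise(f, lam=1.0, tau=0.05, eps=1e-8, iters=200):
    """Naive subgradient descent on the TV-L1 energy (illustrative only).

    f   -- noisy image as a float array in [0, 1]
    lam -- weight of the L1 data-fidelity term
    tau -- descent step size (illustrative value)
    """
    u = f.copy()
    for _ in range(iters):
        # Forward differences approximate the gradient of u.
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        norm = np.sqrt(ux ** 2 + uy ** 2) + eps
        px, py = ux / norm, uy / norm
        # Backward differences give the discrete divergence of (px, py).
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        # Descent step on the TV term plus the L1 data-fidelity term.
        u = u + tau * (div - lam * np.sign(u - f))
    return np.clip(u, 0.0, 1.0)
\end{verbatim}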
A QTAIM assessment of the transannular interactions in medium-sized ring compounds
Transforming Introductory Astronomy at USC: Spring 2017 Progress Report
and 1 collaborator
Finding the Neighboring Regions of Russia and Classifying Them by Population Using the SPARQL Query Language
and 1 collaborator
Abstract
The article investigates the most important properties of the Wikidata item “Federal subjects of Russia”. Using SPARQL queries, we obtained the number of instances of the item “Oblasts of Russia” and the number of all currently existing federal subjects of the Russian Federation (oblasts, republics, federal cities, krais, autonomous oblasts, autonomous okrugs, and former administrative-territorial units were considered), built a graph of neighboring subjects of the Russian Federation and neighboring countries, and drew a map showing the population of individual subjects of the Russian Federation. In addition, we examined how completely the property “shares border with” is filled in for each instance of the items under consideration. In the course of the work, the corresponding property fields in Wikidata were filled in. The reader will become acquainted with the computational processing of Wikidata and with the visualization of information about the regions of Russia.
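As an illustration of the kind of query described above, here is a minimal Python sketch that counts federal subjects of Russia and lists their neighbors via the public Wikidata SPARQL endpoint. The item identifier Q43263 (“federal subject of Russia”) and the exact query shape are assumptions for illustration and should be verified against Wikidata; P31 (“instance of”) and P47 (“shares border with”) are the standard property identifiers.
\begin{verbatim}
from SPARQLWrapper import SPARQLWrapper, JSON

# Public Wikidata SPARQL endpoint.
endpoint = SPARQLWrapper("https://query.wikidata.org/sparql")

# Q43263 = "federal subject of Russia" (assumed; verify on Wikidata),
# P31 = instance of, P47 = shares border with.
endpoint.setQuery("""
SELECT ?subject ?subjectLabel ?neighborLabel WHERE {
  ?subject wdt:P31 wd:Q43263 .
  OPTIONAL { ?subject wdt:P47 ?neighbor . }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en,ru". }
}
""")
endpoint.setReturnFormat(JSON)

rows = endpoint.query().convert()["results"]["bindings"]
subjects = {r["subjectLabel"]["value"] for r in rows}
print(f"Federal subjects found: {len(subjects)}")
for r in rows[:10]:
    print(r["subjectLabel"]["value"], "->",
          r.get("neighborLabel", {}).get("value", "-"))
\end{verbatim}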
How to conduct a meta-analysis
The reservoir pressure concept: a response to the controversy
and 2 collaborators
Draft
and 1 collaborator
Most genetic association studies focus on common variants, but rare genetic variants can play major roles in influencing complex traits \cite{Pritchard_2001,Schork_2009}. The rare susceptibility variants identified through sequencing have the potential to explain some of the ‘missing heritability’ of complex traits \cite{Eichler_2010}. However, standard methods to test for association with single genetic variants are underpowered for rare variants unless sample sizes are very large \cite{Lee_2014}. The lack of power of single-variant approaches holds in fine-mapping as well as genome-wide association studies.
In this report, we are concerned with fine-mapping a genomic region that has been sequenced in cases and controls to identify disease-risk loci. A number of methods have been developed to evaluate the disease association of both single variants and multiple variants in a genomic region. Besides single-variant methods, we consider three broad classes of methods for analysing sequence data: pooled-variant, joint-modelling and tree-based methods. Pooled-variant methods evaluate the cumulative effects of multiple genetic variants in a genomic region. The score statistics from marginal models of the trait association with individual variants are collapsed into a single test statistic, either by combining the information for multiple variants into a single genetic score or by evaluating the distribution of the pooled score statistics of individual variants \cite{Lee_2014}. Joint-modelling methods identify the joint effect of multiple genetic variants simultaneously. These methods can assess whether a variant carries any further information about the trait beyond what is explained by the other variants. When trait-influencing variants are in low linkage disequilibrium, this approach may be more powerful than pooling test statistics for marginal associations across variants \cite{Cho_2010}. A local genealogical tree represents the ancestry of the sample of haplotypes at each locus in the genomic region being fine-mapped. Haplotypes carrying the same disease-risk alleles are expected to be related and to cluster on the genealogical tree at a disease-risk locus. Tree-based methods assess whether trait values co-cluster with the ancestral tree for the haplotypes (e.g., \citeNP{Bardel_2005}). \citeNP{Mailund_2006} developed a method to reconstruct and score genealogies according to the case-control clusters.
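To make the pooled-variant idea concrete, the following is a minimal sketch of a simple burden-style test: rare-variant genotypes in a region are collapsed into a single per-individual count, which is then tested for association with case-control status by logistic regression. The data layout and the frequency threshold are illustrative assumptions, not the specific methods compared in this report.
\begin{verbatim}
import numpy as np
import statsmodels.api as sm

def burden_test(genotypes, phenotype, maf_threshold=0.01):
    """Simple burden-style pooled-variant test (illustrative sketch).

    genotypes -- (n_individuals, n_variants) array of 0/1/2 allele counts
    phenotype -- (n_individuals,) array of 0/1 case-control status
    """
    # Keep only rare variants (assumes genotypes are coded by the minor allele).
    maf = genotypes.mean(axis=0) / 2.0
    rare = genotypes[:, maf < maf_threshold]
    # Collapse the rare variants into one genetic score per individual.
    burden = rare.sum(axis=1)
    # Logistic regression of case-control status on the burden score.
    X = sm.add_constant(burden)
    fit = sm.Logit(phenotype, X).fit(disp=0)
    return fit.pvalues[1]  # p-value for the burden term
\end{verbatim}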
In practice, true trees are unknown. However, cluster statistics based on true trees represent a best case for detecting association because tree uncertainty is eliminated. Burkett et al. used known trees to assess the effectiveness of such a tree-based approach for detecting rare disease-risk variants in a candidate genomic region under various models of disease risk in a haploid population. They found that Mantel statistics computed on the known trees outperform popular methods for detecting rare variants associated with disease. Following Burkett et al., we use clustering tests based on true trees as benchmarks against which to compare the popular association methods. However, unlike Burkett et al., who focus on detection of disease-risk variants, we here focus on localization of the association signal in the candidate genomic region. Moreover, we use a diploid disease model instead of a haploid disease model.
In this article, we compare the performance of selected rare-variant association methods for fine-mapping a disease locus. Our investigation focuses on localizing the association signal to the region between 950 kbp and 1050 kbp within a 2-Mb candidate genomic region. To motivate our study, we use variant data simulated from coalescent trees. Our work on localization of the association signal extends that of Burkett et al., which investigated the ability to detect association signal in the candidate region, without regard to localization. To illustrate ideas, we start by working through a particular example dataset as a case study for insight into the selected association methods. We next perform a simulation study involving 200 sequencing datasets and score which association method localizes best overall. Our results indicate the potential of an ancestral-tree-based approach for localizing the association signal.
Renewable And Sustainable Energy Reviews Template
Non Baryonic Dark Matter with a Concentration on Cold Dark Matter from Supersymmetry
Ever since astronomers began studying galaxies and their mass-to-luminosity relationship, it has been clear that something is wrong. There is a missing-luminosity problem: a large fraction (about 90%) of the matter in a galaxy is non-luminous. Possible dark matter candidates include cold dark matter, warm dark matter, hot dark matter, and axions. After years of research, astronomers and physicists can rule out warm and hot dark matter, but cold dark matter and axions both remain highly probable candidates for dark matter. In this report, I explain the different candidates for dark matter and their implications.
edx Phot1x report template (2016/11)
This is the final report for the Mach-Zehnder interferometer device, with both simulated and experimental results. The PDF of this report can be found at https://upload.siepic.ubc.ca/uploads/EBeam_JamesClay.pdf
Another attempt at using this tool for writing articles
Cancer Research - Priority Report
Cancer Research - Review Template
Cancer Research - Perspectives
Cancer Research - Public Issues