AUTHOREA

Preprints

Explore 39,076 preprints on the Authorea Preprint Repository

A preprint on Authorea can be a complete scientific manuscript submitted to a journal, an essay, a whitepaper, or a blog post. Preprints on Authorea can contain datasets, code, figures, interactive visualizations and computational notebooks.

Curriculum Vitae: Alyssa A. Goodman
Alyssa Goodman

February 15, 2022
NAME: Alyssa A. Goodman
OFFICE: Astronomy Department, Harvard University, Cambridge, MA 02138; 617-495-9278
HOME: 485 Concord Avenue, Lexington, MA 02421
WEB SITE: scholar.harvard.edu/agoodman
ORIGIN: July 1, 1962, New York, New York
Fragment Oriented Molecular Shapes
Ethan Hain

and 2 more

September 11, 2015
Molecular shape is an important concept in drug design and virtual screening. Shape similarity typically uses either alignment methods, which dynamically optimize molecular poses with respect to the query molecular shape, or feature vector methods, which are computationally less demanding but less accurate. The computational cost of alignment can be reduced by pre-aligning shapes, as is done with the Volumetric-Aligned Molecular Shapes (VAMS) method. Here we introduce and evaluate Fragment Oriented Molecular Shapes (FOMS), where shapes are aligned based on molecular fragments. FOMS enables the use of _shape constraints_, a novel method for precisely specifying molecular shape queries that provides the ability to perform partial shape matching and supports search algorithms that function on an interactive time scale. When evaluated using the challenging Maximum Unbiased Validation dataset, shape constraints were able to extract significantly enriched subsets of compounds for the majority of targets, and FOMS matched or exceeded the performance of both VAMS and an optimizing alignment method of shape similarity search.
Open Source Molecular Modeling
David Koes
Somayeh Pirhadi

and 2 more

August 27, 2015
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry.
Asteroseismology Can Reveal Strong Internal Magnetic Fields in Red Giant Stars
Jim Fuller
Matteo Cantiello

and 4 more

July 21, 2015
_This is the author’s version of the work. It is posted here by permission of the AAAS for personal use, not for redistribution. The definitive version was published in Science, Vol. 350, no. 6259, pp. 423-426, DOI: 10.1126/science.aac6933_ Internal stellar magnetic fields are inaccessible to direct observations and little is known about their amplitude, geometry and evolution. We demonstrate that strong magnetic fields in the cores of red giant stars can be identified with asteroseismology. The fields can manifest themselves via depressed dipole stellar oscillation modes, an effect that arises from a magnetic greenhouse effect that scatters and traps oscillation mode energy within the core of the star. The _Kepler_ satellite has observed a few dozen red giants with depressed dipole modes, which we interpret as stars with strongly magnetized cores. We find that field strengths larger than $\sim\! 10^5 \,{\rm G}$ may produce the observed depression, and in one case we infer a minimum core field strength of $\approx\!\! 10^7 \,{\rm G}$.
Fasta-O-Matic: a tool to sanity check and if needed reformat FASTA files
Jennifer Shelton
Sue Brown

and 1 more

July 16, 2015
As the sheer volume of bioinformatic sequence data increases, the only way to take advantage of this content is to more completely automate robust analysis workflows. Analysis bottlenecks are often mundane and overlooked processing steps. Idiosyncrasies in reading and/or writing bioinformatics file formats can halt or impair analysis workflows by interfering with the transfer of data from one informatics tool to another. Fasta-O-Matic automates handling of common but minor format issues that otherwise may halt pipelines. The need for automation must be balanced by the need for manual confirmation that any formatting error is actually minor rather than indicative of a corrupt data file. To that end, Fasta-O-Matic reports any issues detected to the user with optionally color-coded and quiet or verbose logs. Fasta-O-Matic can be used as a general pre-processing tool in bioinformatics workflows (e.g. to automatically wrap FASTA files so that they can be read by BioPerl). It was also developed as a sanity check for bioinformatic core facilities that tend to repeat common analysis steps on FASTA files received from disparate sources. Fasta-O-Matic can be set with format requirements specific to downstream tools as a first step in a larger analysis workflow. Fasta-O-Matic is available free of charge to academic and non-profit institutions at https://github.com/i5K-KINBRE-script-share/read-cleaning-format-conversion/tree/master/KSU_bioinfo_lab/fasta-o-matic.
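One of the minor format issues mentioned above, unwrapped sequence lines, can be illustrated with a short sketch (a hypothetical snippet, not code from Fasta-O-Matic itself; the function name and default width are assumptions):

```python
# Hypothetical illustration of FASTA line-wrapping, one of the minor
# format fixes described above; not code from Fasta-O-Matic itself.
def wrap_fasta(text, width=60):
    """Re-wrap sequence lines to `width` characters so strict parsers
    (e.g. some BioPerl readers) accept the file."""
    out, seq = [], []

    def flush():
        joined = "".join(seq)
        out.extend(joined[i:i + width] for i in range(0, len(joined), width))
        seq.clear()

    for line in text.splitlines():
        if line.startswith(">"):       # header: emit any pending sequence first
            flush()
            out.append(line.strip())
        elif line.strip():             # sequence line: buffer for re-wrapping
            seq.append(line.strip())
    flush()
    return "\n".join(out) + "\n"

# 14 bases re-wrapped at width 6:
print(wrap_fasta(">seq1\nACGTACGTACGTAC\n", width=6))
```

Buffering whole records before re-wrapping, rather than wrapping line by line, keeps the output correct even when input lines have inconsistent lengths.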
Publication bias evaluations are not routinely conducted in clinical oncology systema...
David Herrmann
Jonathan Holmes

and 4 more

July 13, 2015
Background: Publication bias (PB) can cause an exaggerated estimate of summary effects in systematic reviews (SRs). The extent of PB assessment by SRs within oncology journals remains to be determined. Methods: This study looked at SRs from high impact factor oncology journals between 2007 and 2015 using a PubMed search. Articles were sorted and coded for PB. An additional assessment of PB from unevaluated SRs was performed using Egger’s regression and the trim-and-fill method. Findings: Of 182 included SRs, 52 performed a PB assessment. The most common form of assessment was a funnel plot supplemented by Egger’s regression or Begg’s test (44%, 23/52). PB was a routine finding in these SRs (19%, 10/52). SRs that stated following a reporting guideline frequently failed to do so with regard to assessing PB. The magnitude of effect sizes generally decreased when conducting our independent assessments of PB among SRs in our sample that did not evaluate for it. Interpretation: Our study shows that there exists an underutilization of PB assessments by SRs in clinical oncology. Additionally, the methodological validity of SRs can be increased by adhering to reporting guidelines, and through the search of grey literature and clinical trials registries. Funding: No external source of funding.
Public-Friendly Open Science
Matteo Cantiello

February 21, 2017
In the 21st century science is growing more technical and complex, as we gaze further and further while standing on the shoulders of many generations of giants. The public often has a hard time understanding research and its relevance to society. One of the reasons for this is that scientists do not spend enough time communicating their findings outside their own scientific community. Obviously there are some exceptions, but THE RULE IS THAT SCIENTISTS WRITE CONTENT FOR SCIENTISTS. Academia is often perceived as an ivory tower, and when new findings are shared with the outside world, this is not done by scientists, but by the media or even the political class. The problem is that these external agents do not have the necessary background to digest and properly communicate this knowledge to the rest of society. They often misunderstand, over-hype and in some cases even distort the results and views of the scientific community. IT’S IRONIC AND SOMEWHAT FRIGHTENING THAT THE DISCOVERIES AND RECOMMENDATIONS FOR WHICH SOCIETY INVESTS SUBSTANTIAL ECONOMIC AND HUMAN CAPITAL ARE NOT DIRECTLY DISSEMINATED BY THE PEOPLE WHO REALLY UNDERSTAND THEM. At the same time, transparency and reproducibility are at stake in the increasingly complex world of research, which is still using old-fashioned tools when packaging and sharing content. This is not only a big problem for research itself, but can give science a bad name in the eyes of public opinion, which increasingly neither understands nor trusts the work of scientists. To the average taxpayer science is often cryptic, with most recently published papers behind a paywall and the majority of research virtually inscrutable. In this scenario it is hard for the public to access and capture the relevance of scientists’ work. I strongly believe that a society that does not trust its scientists is set on a dangerous course. ACTION ITEMS.
To improve the situation, 21st century scientists need to: 1. Learn to efficiently share and communicate their research with the public at large. 2. Make their research more transparent and reproducible, so that it can be trusted and better understood by their peers and the public at large. 21st century scientists need to produce “PUBLIC-FRIENDLY OPEN SCIENCE” (PFOS).
From Walras’ auctioneer to continuous time double auctions: A general dynamic theory...
Jonathan Donier

and 1 more

July 08, 2015
In standard Walrasian auctions, the price of a good is defined as the point where the supply and demand curves intersect. Since both curves are generically regular, the response to small perturbations is linearly small. However, a crucial ingredient is absent from the theory, namely transactions themselves. What happens after they occur? To answer this question, we develop a dynamic theory for supply and demand based on agents with heterogeneous beliefs. When the inter-auction time is infinitely long, the Walrasian mechanism is recovered. When transactions are allowed to happen in continuous time, a peculiar property emerges: close to the price, supply and demand vanish quadratically, which we empirically confirm on Bitcoin. This explains why price impact in financial markets is universally observed to behave as the square root of the excess volume. The consequences are important, as they imply that the very fact of clearing the market makes prices hypersensitive to small fluctuations.
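The final claim, that quadratic vanishing of supply and demand yields square-root price impact, can be sketched in one line (a heuristic under the assumption of a marginal order density $\rho$ vanishing linearly around the price $p^*$, not the paper's full derivation):

```latex
\rho(p) \simeq L\,\lvert p - p^{*}\rvert
\;\Longrightarrow\;
V = \int_{p^{*}}^{p^{*}+\Delta p} L\,(p - p^{*})\,\mathrm{d}p
  = \frac{L}{2}\,\Delta p^{2}
\;\Longrightarrow\;
\Delta p = \sqrt{2V/L} \;\propto\; \sqrt{V}.
```

Executing a volume $V$ against liquidity that thins out linearly near the price therefore moves the price as $\sqrt{V}$, the impact law quoted above.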
Quality of systematic review and meta-analysis abstracts in oncology journals
Chelsea Koller
Sarah Khan

and 6 more

July 04, 2015
Abstract Purpose: The purpose of this study was to evaluate the quality of reporting in the abstracts of oncology systematic reviews using PRISMA guidelines for abstract writing. Methods: Oncology systematic reviews and meta-analyses from four journals - The Lancet Oncology, Clinical Cancer Research, Cancer Research, and Journal of Clinical Oncology - were selected using a PubMed search. The resulting 337 abstracts were sorted for eligibility and 182 were coded based on a standardized abstraction manual constructed from the PRISMA criteria. Eligible systematic reviews were coded independently and later verified by a second coder, with disagreements handled by consensus. One hundred eighty-two abstracts comprised the final sample. Results: The number of included studies, information regarding main outcomes, and general interpretation of results were described in the majority of abstracts. In contrast, risk of bias or methodological quality appraisals, the strengths and limitations of evidence, funding sources, and registration information were rarely reported. By journal, the most notable difference was a higher percentage of funding sources reported in Lancet Oncology. No detectable upward trend was observed on mean abstract scores after publication of the PRISMA extension for abstracts. Conclusion: Overall, the reporting of essential information in oncology systematic review and meta-analysis abstracts is suboptimal and could be greatly improved. Keywords: Review, Systematic; Meta-Analysis; Cancer; Medical Oncology; Abstracting as Topic; Funding
Utilization of clinical trials registries in obstetrics and gynecology systematic rev...
Michael Bibens
A. Benjamin Chong

and 2 more

June 29, 2015
ABSTRACT Objectives: We evaluated the use of clinical trials registries in published obstetrics and gynecological systematic reviews and meta-analyses. Methods: A review of publications between January 1, 2007, and December 31, 2015, from six obstetrical and gynecological journals (_Obstetrics & Gynecology, Obstetrical & Gynecological Survey, Human Reproduction Update, Gynecologic Oncology, British Journal of Obstetrics and Gynaecology, and American Journal of Obstetrics & Gynecology_) was completed to identify eligible systematic reviews. All systematic reviews included after exclusions were independently reviewed to determine if clinical trials registries had been included as part of the search process. Studies that reported using a trials registry were further examined to determine whether trial data was included in the analysis. Results: Our initial search resulted in 292 articles, which was narrowed to 256 after exclusions. Of the 256 systematic reviews meeting our selection criteria, 47 utilized a clinical trials registry. Eleven of the 47 systematic reviews found unpublished data, and added the unpublished trial data into their results. Conclusion: A majority of systematic reviews in clinical obstetrics and gynecology journals do not conduct searches of clinical trials registries or do not make use of data obtained from these searches.
Factual errors in a recent paper by Westerhof, Segers and Westerhof in Hypertension
Kim H. Parker
Alun Hughes

and 1 more

June 16, 2015
But facts are chiels that winna ding An downa be disputed – from _A Dream_ by Robert Burns (1786) (But facts are fellows that will not be overturned, And cannot be disputed) _Wave separation, wave intensity, the reservoir-wave concept, and the instantaneous wave-free ratio (2015) N Westerhof, P Segers and BE Westerhof, Hypertension, DOI: 10.1161/HYPERTENSIONAHA.115.05567_ Hereinafter referred to as [WSW]. This paper by three distinguished workers in the field of cardiovascular mechanics concludes that the reservoir pressure and instantaneous wave-free ratio are ’... both physically incorrect, and should be abandoned’. These are very strong conclusions which, if they were opinions, could only be debated. Reading the paper in detail, however, reveals that it contains numerous factual errors in its discussion of these two entities. Since facts are different from opinions, we believe that it is essential that these errors be corrected before they gain credence by repetition. False facts are highly injurious to the progress of science, for they often endure long; but false views, if supported by some evidence, do little harm, for every one takes a salutary pleasure in proving their falseness. – Charles Darwin (1871) Because we are naturally prejudiced about the validity of both the reservoir pressure (Pres) and instantaneous wave-free ratio (iFR), having been involved in the conception and development of both ideas, we will try to present our arguments as transparently and fairly as possible. As far as possible we will demonstrate the errors by direct quotations from the paper. The whole paper¹ is available from the Hypertension web site and should be consulted directly if there are any questions about our treatment of the text.
Approximately two thirds of the paper is taken up with a discussion of wave separation and wave intensity from the point of view of the more usual Fourier-based methods of analysing cardiovascular mechanics, frequently called the impedance method. This part of the paper is, as far as we can see, both insightful and free of major errors. We found some of the discussion about wave intensity analysis thought-provoking and agree with most of their conclusions. We recommend the first two-thirds of this paper to anyone interested in arterial mechanics. In contrast, the last third of the paper, starting with the final sentence of the section ’Summary of Wave Separation and WIA’ is riddled with errors of interpretation and, more importantly, contains a number of mistakes (or in Darwin’s terms ’false statements of fact’) that need to be corrected. Instead of dealing with these errors chronologically, we will point out the fundamental errors first and then deal with their sequelae.
Participatory action research about Figshare user experiences at the University of Me...
Cobi Calyx

and 1 more

November 01, 2017
Participation & feedback are welcome! Please email me on cobi.smith@unimelb.edu.au (which is treated as private unless you explicitly consent to sharing) or tweet [@cobismith](https://twitter.com/cobismith) (public) if you'd prefer not to comment on this working paper using Authorea's features. Please note this is an open notebook and is intended to be part of an open science research project, which means if you choose to share information here your contributions are in the public domain. See the University of Melbourne research protocols for more information: http://www.orei.unimelb.edu.au/content/when-approval-needed
Transcranial Direct Current Stimulation: Theory, Treatment of Major Depressive Disord...
Shan H. Siddiqi

June 04, 2015
BACKGROUND AND THEORY The use of non-invasive brain stimulation for the treatment of various neuropsychiatric disorders, including major depressive disorder (MDD), has rapidly expanded recently. Transcranial direct current electrical stimulation (tDCS), variants of which have been used experimentally for psychiatric, neurologic, and physical rehabilitation applications, has garnered a great deal of attention. While it is not yet FDA-approved for any indication, its promise is related to its low cost and wide range of applications; although the breadth of its applicability has been questioned due to heterogeneous data, this heterogeneity has been attributed to methodological variability. The safety and tolerability of tDCS were outlined by an early study including 567 sessions in 102 patients. The most common adverse effects were mild tingling/itching at the stimulation site and moderate fatigue. Less frequent effects included headaches (11.8%), nausea (2.9%), and insomnia (0.98%), all of which were mild and transient. The underlying theory is that tDCS modulates the excitability of certain cortical regions by passage of a small electrical current through conducting pads applied to the scalp in a minimally painful manner. While the precise mechanism is not fully understood, it likely enhances cortical excitability at the anode and depresses it at the cathode. Proposed mechanisms have been based on data demonstrating relationships between tDCS stimulation and neuropharmacologic effects, cortical electrophysiology, and functional neuroimaging changes. Effects of tDCS on neuroplasticity and cortical excitability have been shown to be differentially modulated by agents affecting neurotransmission via serotonin (citalopram), dopamine (L-dopa), NMDA (dextromethorphan and d-cycloserine), and GABA (lorazepam).
Electrophysiologic changes include differential modulation in the presence of agents that modulate sodium channels (carbamazepine) and calcium channels (flunarizine). Active tDCS shows significant increases in prefrontal cortex activity as measured by functional near infrared spectroscopy (fNIRS), a technique used to measure cortical oxygenation, during and after stimulation – notably, fNIRS measurements may be limited by interference due to extracranial blood flow and inability to assess deeper structures, so they merely approximate the functional magnetic resonance imaging (fMRI) signal in superficial structures. Stimulation also increases fMRI activation and connectivity of the underlying cortical regions and hippocampi, though the clinical significance of this is uncertain given that this same study found no behavioral changes.
A Framework for Mitigating the Biases in Barometric Dust Devil Surveys...
Brian Jackson

May 27, 2015
BACKGROUND Dust devils are small-scale (few to many tens of meters) low-pressure vortices rendered visible by lofted dust. They usually occur in arid climates on the Earth and ubiquitously on Mars. Martian dust devils have been studied with orbiting and landed spacecraft and were first identified on Mars using images from the Viking Orbiter. On Mars, dust devils may dominate the supply of atmospheric dust and influence climate, pose a hazard for human exploration, and may have lengthened the operational lifetime of Martian rovers. On the Earth, dust devils significantly degrade air quality in arid climates and may pose an aviation hazard. The dust-lifting capacity of dust devils seems to depend sensitively on their structures, in particular on the pressure wells at their centers, so the dust supply from dust devils on both planets may be dominated by the seldom-observed larger devils. Using a Martian global climate model, it has been shown that observed seasonal variations in Mars’ near-surface temperatures could not be reproduced without including the radiative effects of dust, with dust estimated to contribute more than 10 K to the heating budget. Thus, elucidating the origin, evolution, and population statistics of dust devils is critical for understanding important terrestrial and Martian atmospheric properties and for in-situ exploration of Mars. Studies of Martian dust devils have been conducted through direct imaging of the devils and identification of their tracks on Mars’ dusty surface \citep[cf.][]{Balme_2006}. Studies with in-situ meteorological instrumentation have also identified dust devils, either via obscuration of the Sun by the dust column or their pressure signals. Studies have also been conducted of terrestrial dust devils and frequently involve in-person monitoring of field sites. Terrestrial dust devils are visually surveyed, directly sampled, or recorded using in-situ meteorological equipment.
As noted in the literature, in-person visual surveys are likely to be biased toward detection of larger, more easily seen devils. Such surveys would also fail to recover dustless vortices. Recently, terrestrial surveys similar to Martian dust devil surveys have been conducted using in-situ single barometers and photovoltaic sensors. These sensor-based terrestrial surveys have the advantage of being directly analogous to Martian surveys and are highly cost-effective compared to the in-person surveys (in a dollars-per-data-point sense). In single-barometer surveys, a sensor is deployed in-situ and records a pressure time series at a sampling period ≲1 s. Since it is a low-pressure convective vortex, a dust devil passing nearby will register as a pressure dip discernible against a background ambient (but not necessarily constant) pressure. Figure [fig:conditioning_detection_b_inset] shows a time series with a typical dust devil signal.
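A single-barometer dip search of the kind described above can be sketched in a few lines (illustrative only; the running-median window and dip threshold are assumptions, not the survey's actual detection pipeline):

```python
# Sketch of dip detection in a barometric pressure time series:
# estimate the slowly varying ambient background with a local running
# median, then flag samples that fall well below it.
def find_dips(pressure, window=5, threshold=0.3):
    """Return indices where pressure drops more than `threshold`
    below the local running-median background."""
    n = len(pressure)
    dips = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        background = sorted(pressure[lo:hi])[(hi - lo) // 2]  # local median
        if background - pressure[i] > threshold:
            dips.append(i)
    return dips

# A flat ambient signal with one sharp 0.8-unit dip at sample 10:
series = [1000.0] * 20
series[10] = 999.2
print(find_dips(series))  # [10]
```

Because the background is a median rather than a mean, a short-lived vortex signature barely perturbs the baseline estimate, while slow ambient drifts are absorbed into it.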
How structure-directing agents control nanocrystal shape: PVP-mediated growth of Ag n...
Tonnam Balankura

May 25, 2015
KINETIC WULFF PLOT Away from equilibrium, the NC shape is governed by the kinetics of inter- and intrafacet atom diffusion, as well as by the kinetics of deposition to various facets. At nonequilibrium growth conditions, the resulting shapes are expected to be different from the thermodynamic shapes. Examples of well-known kinetic shapes include nanowires and highly branched (bi- and tripod) structures. When NCs grow beyond a critical size, the relative atom deposition rate to various facets becomes a major influence on the NC shape. In this kinetically-controlled growth regime, the kinetic Wulff construction can predict the shape evolution of faceted crystal growth based on the surface kinetics. Using a 3-dimensional shape evolution calculation method, we correlate the relative flux of Ag atom deposition to {111} and {100} facets, $F_{111}/F_{100}$, with the resulting kinetic Wulff shape in the reversible octahedron-to-cube transformation. This transformation is observed in the seed-mediated growth of Ag NCs, in which the shape-controlling parameter is the concentration of poly(vinylpyrrolidone) (PVP) in the solution. The constructed kinetic Wulff plot is shown in Fig. [fig:kinetic-wulff]. The construction of the kinetic Wulff plot is described in the supporting information. When the relative flux to {111} facets is less than half of the flux to {100} facets, the octahedron is predicted as the kinetic Wulff shape. As $F_{111}/F_{100}$ increases, we observe a shape progression from octahedra to cubo-octahedra, then to truncated cubes, and eventually to cubes once $F_{111}/F_{100}$ exceeds a threshold value. To study the mechanism by which SDAs impart shape selectivity, we use the seed-mediated Ag polyol synthesis in the presence of PVP as our model. We utilize large-scale MD simulations to quantify $F_{100}$ and $F_{111}$ using _in-silico_ deposition and potential of mean force calculations.
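The logic of the kinetic Wulff construction invoked above can be summarized compactly (a textbook-level sketch with geometric prefactors omitted): each facet sits at a distance from the particle center proportional to its growth velocity, which in the deposition-limited regime tracks the facet flux,

```latex
\frac{h_{111}}{h_{100}} \;\propto\; \frac{v_{111}}{v_{100}} \;\propto\; \frac{F_{111}}{F_{100}},
```

so fast-growing facets grow themselves out of the final shape: small $F_{111}/F_{100}$ leaves the slow {111} facets bounding an octahedron, while large $F_{111}/F_{100}$ leaves the {100}-bounded cube, with cubo-octahedral intermediates in between.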
Tools and pipelines for BioNano data: molecule assembly pipeline and FASTA super scaf...
Jennifer Shelton
Cassondra Coleman

and 7 more

May 20, 2015
BACKGROUND: Genome assembly remains an unsolved problem. Assembly projects face a range of hurdles that confound assembly. Thus a variety of tools and approaches are needed to improve draft genomes. RESULTS: We used a custom assembly workflow to optimize consensus genome map assembly, resulting in an assembly equal to the estimated length of the _Tribolium castaneum_ genome and with an N50 of more than 1 Mb. We used this map for super scaffolding the _T. castaneum_ sequence assembly, more than tripling its N50 with the program Stitch. CONCLUSIONS: In this article we present software that leverages consensus genome maps assembled from extremely long single molecule maps to increase the contiguity of sequence assemblies. We report the results of applying these tools to validate and improve a 7x Sanger draft of the _T. castaneum_ genome. KEYWORDS: Genome map; BioNano; Genome scaffolding; Genome validation; Genome finishing
A prevalence of dynamo-generated magnetic fields in the cores of intermediate-mass st...
Dennis
Matteo Cantiello

and 4 more

May 17, 2015
_This is the author’s version of the work. It is posted here for personal use, not for redistribution. The definitive version was published in Nature on 04 January 2016, DOI:10.1038/nature16171_ Magnetic fields play a role in almost all stages of stellar evolution. Most low-mass stars, including the Sun, show surface fields that are generated by dynamo processes in their convective envelopes. Intermediate-mass stars do not have deep convective envelopes, although 10% exhibit strong surface fields that are presumed to be residuals from the stellar formation process. These stars do have convective cores that might produce internal magnetic fields, and these might even survive into later stages of stellar evolution, but information has been limited by our inability to measure the fields below the stellar surface. Here we use asteroseismology to study the occurrence of strong magnetic fields in the cores of low- and intermediate-mass stars. We have measured the strength of dipolar oscillation modes, which can be suppressed by a strong magnetic field in the core, in over 3,600 red giant stars observed by the _Kepler_ mission. About 20% of our sample show mode suppression, but this fraction is a strong function of mass. Strong core fields only occur in red giants above 1.1 solar masses, and the occurrence rate is at least 60% for intermediate-mass stars (1.6–2.0 solar masses), indicating that powerful dynamos were very common in the convective cores of these stars.
Reanalyzing Head et al. (2015): No widespread p-hacking after all?
C.H.J. Hartgerink

May 06, 2015
Statistical significance seeking (i.e., p-hacking) is a serious problem for the validity of research, especially if it occurs frequently. Head et al. provided evidence for widespread p-hacking throughout the sciences, which would indicate that the validity of science is in doubt. Previous substantive concerns about their selection of p-values indicated that they were too liberal in selecting all reported p-values, which would result in including results that would not have been interesting to p-hack. Despite this liberal selection of p-values, Head et al. found evidence for p-hacking, which raises the question of why p-hacking was detected despite it being unlikely a priori. In this paper I reanalyze the original data and show that Head et al.’s results are an artefact of rounding in the reporting of p-values.
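To see how rounding in reported p-values can distort the narrow bins used to detect p-hacking, consider a toy example (the bin edges here are illustrative assumptions, not Head et al.'s exact procedure):

```python
# Toy demonstration: p-values uniform on [0.03, 0.05) look balanced in two
# narrow bins below 0.05, but rounding to two decimals (as journals often
# report) collapses them onto 0.04 and 0.05, emptying both open bins.
exact = [round(0.03 + 0.0001 * i, 4) for i in range(200)]  # 0.0300 .. 0.0499
reported = [round(p, 2) for p in exact]                    # two-decimal reporting

def bin_counts(ps):
    """Counts in (0.040, 0.045] and (0.045, 0.050) -- narrow bins of the
    kind used to test for a bump just below the 0.05 threshold."""
    lo = sum(1 for p in ps if 0.040 < p <= 0.045)
    hi = sum(1 for p in ps if 0.045 < p < 0.050)
    return lo, hi

print(bin_counts(exact))     # (50, 49): nearly balanced before rounding
print(bin_counts(reported))  # (0, 0): bin contents are artefacts of precision
```

The bin contents depend entirely on reporting precision, not on researcher behavior, which is the flavor of artefact the reanalysis argues for.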
The accretion histories of brightest cluster galaxies from their stellar population g...
Paola Oliva-Altamirano

May 04, 2015
_Sarah Brough, Jimmy, Kim-Vy Tran, Warrick J. Couch, Richard M. McDermid, Chris Lidman, Anja von der Linden, Rob Sharp_
Ontology-based Learning Content Management System in Programming Languages Domain
Anton Anikin
Alexander Dvoryankin

and 3 more

May 01, 2015
INTRODUCTION A learning content management system (LCMS) is a computer application that allows creating, editing and modifying learning content, as well as organizing, deleting and maintaining it from a central interface. The LCMS provides a complex platform meant for developing learning content used in e-learning educational systems. Many LCMS packages available on the market also contain tools that resemble those used in learning management systems (LMS), and most assume that an LMS is already in place. The emphasis in an LCMS is the ability for developers to create new learning content in accordance with learning objectives as well as the cognitive characteristics and experience of the learner. Most content-management systems have several aspects in common: a focus on creating, developing, and managing content for on-line courses, with far less emphasis placed on managing the experience of learners; a multi-user environment that allows several developers to interact and exchange tools; a learning object repository containing learning materials, which are commonly used components that are archived so as to be searchable and adaptable to any on-line course. A new trend in LCMS development is using the Smart Learning Content (SLC) approach. Apart from adaptive personalization and sophisticated forms of feedback, smart learning content often also authenticates the user, models the learner, aggregates data, and supports learning analytics. That is especially important in computer science education because of the expediency of using program and algorithm visualization tools, automatic assessment, coding tools, algorithm and program simulation tools, problem-solving tools and other learning resources that process input data provided by the learner and generate customized output. The same approach can be used to generate adaptive learning content based on some content elements.
So the creation of SLC implies personalized search of learning resources and adaptive visualization of information retrieval. In this paper we describe an ontology-based learning content management system that allows creating new smart learning content in the programming languages domain in the form of a personal learning collection.
Use of the Temperament and Character Inventory to predict response to repetitive tran...
Shan H. Siddiqi

April 29, 2015
ABSTRACT OBJECTIVE: We investigated the utility of the Temperament and Character Inventory (TCI) in predicting antidepressant response to rTMS. BACKGROUND: Although rTMS of the dorsolateral prefrontal cortex (DLPFC) is an established antidepressant treatment, little is known about predictors of response. The TCI measures multiple personality dimensions (harm avoidance, novelty seeking, reward dependence, persistence, self-directedness, self-transcendence, and cooperativeness), some of which have predicted response to antidepressants and cognitive-behavioral therapy. A previous study suggested a possible association between higher self-directedness and rTMS response specifically in melancholic depression, although this was limited by the fact that melancholic depression is associated with a limited range of TCI profiles. METHODS: Sixteen patients in a major depressive episode completed a TCI prior to a clinical course of rTMS over the DLPFC. Treatment response was defined as ≥50% decrease in Hamilton Depression Rating Scale (HDRS). Baseline scores on each TCI dimension were compared between responders and non-responders via paired t-test with Bonferroni correction. Temperament/character scores were also subjected to regression analysis against percentage improvement in HDRS. RESULTS: Ten of the sixteen patients responded to rTMS. T-scores for Persistence were significantly higher in responders (48.3, 95% CI 40.9-55.7) than in non-responders (35.3, 95% CI 29.2-39.9) (p=0.006). Linear regression revealed a correlation between persistence score and percentage improvement in HDRS (R=0.65±0.29). CONCLUSIONS: Higher persistence predicted antidepressant response to rTMS. This may be explained by rTMS-induced enhancement of cortical excitability, which has been found to be decreased in patients with high persistence. Personality assessment that includes measurement of TCI persistence may be a useful component of precision medicine initiatives in rTMS for depression.
The human experience with intravenous levodopa
Shan H. Siddiqi
Natalia Abrahan
and 5 more

April 24, 2015
ABSTRACT OBJECTIVE: To compile a comprehensive summary of published human experience with levodopa given intravenously, with a focus on information required by regulatory agencies. BACKGROUND: While safe intravenous use of levodopa has been documented for over 50 years, regulatory supervision for pharmaceuticals given by a route other than that approved by the U.S. Food and Drug Administration (FDA) has become increasingly cautious. If delivering a drug by an alternate route raises the risk of adverse events, an investigational new drug (IND) application is required, including a comprehensive review of toxicity data. METHODS: Over 200 articles referring to intravenous levodopa (IVLD) were examined for details of administration, pharmacokinetics, benefit and side effects. RESULTS: We identified 144 original reports describing IVLD use in humans, beginning with psychiatric research in 1959-1960 before the development of peripheral decarboxylase inhibitors. At least 2781 subjects have received IVLD, and reported outcomes include parkinsonian signs, sleep variables, hormones, hemodynamics, CSF amino acid composition, regional cerebral blood flow, cognition, perception and complex behavior. Mean pharmacokinetic variables were summarized for 49 healthy subjects and 190 with Parkinson disease. Side effects were those expected from clinical experience with oral levodopa and dopamine agonists. No articles reported deaths or induction of psychosis. CONCLUSION: At least 2781 patients have received i.v. levodopa with a safety profile comparable to that seen with oral administration.
Orthostatic stability with intravenous levodopa
Shan H. Siddiqi
Mary L. Creech RN, MSW, LCSW
and 2 more

April 20, 2015
Intravenous levodopa has been used in a multitude of research studies due to its more predictable pharmacokinetics compared to the oral form, which is used frequently as a treatment for Parkinson’s disease (PD). Levodopa is the precursor for dopamine, and intravenous dopamine would strongly affect vascular tone, but peripheral decarboxylase inhibitors are intended to block such effects. Pulse and blood pressure, with orthostatic changes, were recorded before and after intravenous levodopa or placebo—after oral carbidopa—in 13 adults with a chronic tic disorder and 16 tic-free adult control subjects. Levodopa caused no statistically or clinically significant changes in blood pressure or pulse. These data add to previous data that support the safety of i.v. levodopa when given with adequate peripheral inhibition of DOPA decarboxylase.
An Atlas of Human Kinase Regulation
David Ochoa
Pedro Beltrao
and 1 more

April 15, 2015
The coordinated regulation of protein kinases is a rapid mechanism that integrates diverse cues and swiftly determines appropriate cellular responses. However, our understanding of cellular decision-making has been limited by the small number of simultaneously monitored phospho-regulatory events. Here, we have estimated changes in activity in 215 human kinases in 399 conditions derived from a large compilation of phosphopeptide quantifications. This atlas identifies commonly regulated kinases as those that are central in the signaling network and defines the logic relationships between kinase pairs. Co-regulation along the conditions predicts kinase-complex and kinase-substrate associations. Additionally, the kinase regulation profile acts as a molecular fingerprint to identify related and opposing signaling states. Using this atlas, we identified essential mediators of stem cell differentiation, modulators of Salmonella infection and new targets of AKT1. This provides a global view of human phosphorylation-based signaling and the necessary context to better understand kinase driven decision-making.
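The co-regulation idea in this abstract — kinases whose activity profiles rise and fall together across conditions are predicted to be associated, while anti-correlated profiles indicate opposing signaling states — can be sketched with a plain Pearson correlation over a small activity matrix. The kinase names and values below are illustrative assumptions, not data from the atlas:

```python
from statistics import mean

# Hypothetical kinase activity estimates across five conditions
# (rows: kinases, columns: conditions); values are illustrative only.
activity = {
    "KINASE_A": [1.2, -0.5, 0.8, 2.0, -1.1],
    "KINASE_B": [1.0, -0.4, 0.9, 1.8, -0.9],   # profile similar to A: co-regulated
    "KINASE_C": [-1.1, 0.6, -0.7, -1.9, 1.0],  # mirrored profile: opposing state
}

def pearson(x, y):
    """Pearson correlation between two equal-length activity profiles."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r_ab = pearson(activity["KINASE_A"], activity["KINASE_B"])
r_ac = pearson(activity["KINASE_A"], activity["KINASE_C"])
print(r_ab > 0.9, r_ac < -0.9)
```

A strongly positive correlation flags a candidate kinase-complex or kinase-substrate association; a strongly negative one flags opposing regulation, which is the basis of the "molecular fingerprint" comparison of signaling states.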