Discover and publish cutting-edge, open research.

Browse 66,610 multi-disciplinary research preprints

Featured documents

Michael Weekes

and 11 more

Nick K. Jones1,2*, Lucy Rivett1,2*, Chris Workman3, Mark Ferris3, Ashley Shaw1, Cambridge COVID-19 Collaboration1,4, Paul J. Lehner1,4, Rob Howes5, Giles Wright3, Nicholas J. Matheson1,4,6¶, Michael P. Weekes1,7¶

1 Cambridge University Hospitals NHS Foundation Trust, Cambridge, UK
2 Clinical Microbiology & Public Health Laboratory, Public Health England, Cambridge, UK
3 Occupational Health and Wellbeing, Cambridge Biomedical Campus, Cambridge, UK
4 Cambridge Institute of Therapeutic Immunology & Infectious Disease, University of Cambridge, Cambridge, UK
5 Cambridge COVID-19 Testing Centre and AstraZeneca, Anne Mclaren Building, Cambridge, UK
6 NHS Blood and Transplant, Cambridge, UK
7 Cambridge Institute for Medical Research, University of Cambridge, Cambridge, UK

*Joint first authorship
¶Joint last authorship
Correspondence: [email protected]

The UK has initiated mass COVID-19 immunisation, with healthcare workers (HCWs) given early priority because of the potential for workplace exposure and risk of onward transmission to patients. The UK’s Joint Committee on Vaccination and Immunisation has recommended maximising the number of people vaccinated with first doses at the expense of early booster vaccinations, based on single-dose efficacy against symptomatic COVID-19 disease.1-3

At the time of writing, three COVID-19 vaccines have been granted emergency use authorisation in the UK, including the BNT162b2 mRNA COVID-19 vaccine (Pfizer-BioNTech). A vital outstanding question is whether this vaccine prevents asymptomatic SARS-CoV-2 infection, as well as symptomatic COVID-19 disease, because sub-clinical infection following vaccination could continue to drive transmission.
This is especially important because many UK HCWs have received this vaccine, and nosocomial COVID-19 infection has been a persistent problem. Through the implementation of a 24 h-turnaround PCR-based comprehensive HCW screening programme at Cambridge University Hospitals NHS Foundation Trust (CUHNFT), we previously demonstrated the frequent presence of pauci- and asymptomatic infection amongst HCWs during the UK’s first wave of the COVID-19 pandemic.4 Here, we evaluate the effect of first-dose BNT162b2 vaccination on test positivity rates and cycle threshold (Ct) values in the asymptomatic arm of our programme, which now offers weekly screening to all staff.

Vaccination of HCWs at CUHNFT began on 8th December 2020, with mass vaccination from 8th January 2021. Here, we analyse data from the two weeks spanning 18th to 31st January 2021, during which: (a) the prevalence of COVID-19 amongst HCWs remained approximately constant; and (b) we screened comparable numbers of vaccinated and unvaccinated HCWs. Over this period, 4,408 (week 1) and 4,411 (week 2) PCR tests were performed on samples from individuals reporting well to work. We stratified HCWs <12 days or ≥12 days post-vaccination because this was the point at which protection against symptomatic infection began to appear in the phase III clinical trial.2

26/3,252 (0·80%) tests from unvaccinated HCWs were positive (Ct<36), compared to 13/3,535 (0·37%) from HCWs <12 days post-vaccination and 4/1,989 (0·20%) tests from HCWs ≥12 days post-vaccination (p=0·023 and p=0·004, respectively; Fisher’s exact test, Figure).
This suggests a four-fold decrease in the risk of asymptomatic SARS-CoV-2 infection amongst HCWs ≥12 days post-vaccination, compared to unvaccinated HCWs, with an intermediate effect amongst HCWs <12 days post-vaccination.

A marked reduction in infections was also seen when analyses were repeated with: (a) inclusion of HCWs testing positive through both the symptomatic and asymptomatic arms of the programme (56/3,282 (1·71%) unvaccinated vs 8/1,997 (0·40%) ≥12 days post-vaccination, 4·3-fold reduction, p=0·00001); (b) inclusion of PCR tests which were positive at the limit of detection (Ct>36, 42/3,268 (1·29%) vs 15/2,000 (0·75%), 1·7-fold reduction, p=0·075); and (c) extension of the period of analysis to include the six weeks from 28th December 2020 to 7th February 2021 (113/14,083 (0·80%) vs 5/4,872 (0·10%), 7·8-fold reduction, p=1×10⁻⁹). In addition, the median Ct value of positive tests showed a non-significant trend towards an increase between unvaccinated HCWs and HCWs ≥12 days post-vaccination (23·3 to 30·3, Figure), suggesting that samples from vaccinated individuals had lower viral loads.

We therefore provide real-world evidence for a high level of protection against asymptomatic SARS-CoV-2 infection after a single dose of BNT162b2 vaccine, at a time of predominant transmission of the UK COVID-19 variant of concern 202012/01 (lineage B.1.1.7), and amongst a population with a relatively low frequency of prior infection (7·2% antibody positive).5

This work was funded by a Wellcome Senior Clinical Research Fellowship to MPW (108070/Z/15/Z), a Wellcome Principal Research Fellowship to PJL (210688/Z/18/Z), and an MRC Clinician Scientist Fellowship (MR/P008801/1) and NHSBT workpackage (WPA15-02) to NJM. Funding was also received from Addenbrooke’s Charitable Trust and the Cambridge Biomedical Research Centre. We also acknowledge contributions from all staff at CUHNFT Occupational Health and Wellbeing and the Cambridge COVID-19 Testing Centre.
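The comparisons above rest on Fisher's exact test applied to 2×2 tables of positive versus negative PCR results. As an illustrative sketch only (not the authors' code), a two-sided Fisher's exact test can be computed with the Python standard library, using the unvaccinated versus ≥12 days post-vaccination counts from the letter:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums hypergeometric probabilities of every table (with the same
    margins) whose probability does not exceed that of the observed one.
    """
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d

    def p_table(x):
        # Hypergeometric probability of seeing x in the top-left cell.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # Small tolerance guards against float round-off on the equality case.
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Counts from the letter: 26/3,252 positive tests (unvaccinated)
# vs 4/1,989 positive tests (>=12 days post-vaccination).
p = fisher_exact_two_sided(26, 3252 - 26, 4, 1989 - 4)
```

The test sums the probabilities of all tables at least as extreme as the observed one, matching the usual two-sided convention.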

Guangming Wang

and 4 more

Tam Hunt

and 1 more

Tam Hunt [1], Jonathan Schooler
University of California Santa Barbara

Synchronization, harmonization, vibrations, or simply resonance in its most general sense seems to have an integral relationship with consciousness itself. One of the possible “neural correlates of consciousness” in mammalian brains is a combination of gamma, beta and theta synchrony. More broadly, we see similar kinds of resonance patterns in living and non-living structures of many types. What clues can resonance provide about the nature of consciousness more generally? This paper provides an overview of resonating structures in the fields of neuroscience, biology and physics and attempts to coalesce these data into a solution to what we see as the “easy part” of the Hard Problem, which is generally known as the “combination problem” or the “binding problem.” The combination problem asks: how do micro-conscious entities combine into a higher-level macro-consciousness? The proposed solution in the context of mammalian consciousness suggests that a shared resonance is what allows different parts of the brain to achieve a phase transition in the speed and bandwidth of information flows between the constituent parts. This phase transition allows for richer varieties of consciousness to arise, with the character and content of that consciousness in each moment determined by the particular set of constituent neurons. We also offer more general insights into the ontology of consciousness and suggest that consciousness manifests as a relatively smooth continuum of increasing richness in all physical processes, distinguishing our view from emergentist materialism. We refer to this approach as a (general) resonance theory of consciousness and offer some responses to Chalmers’ questions about the different kinds of “combination problem.”

At the heart of the universe is a steady, insistent beat: the sound of cycles in sync….
[T]hese feats of synchrony occur spontaneously, almost as if nature has an eerie yearning for order.
Steven Strogatz, Sync: How Order Emerges From Chaos in the Universe, Nature and Daily Life (2003)

If you want to find the secrets of the universe, think in terms of energy, frequency and vibration.
Nikola Tesla (1942)

I. Introduction

Is there an “easy part” and a “hard part” to the Hard Problem of consciousness? In this paper, we suggest that there is. The harder part is arriving at a philosophical position with respect to the relationship of matter and mind. This paper is about the “easy part” of the Hard Problem, but we address the “hard part” briefly in this introduction.

We have both arrived, after much deliberation, at the position of panpsychism or panexperientialism (all matter has at least some associated mind/experience and vice versa). This is the view that all things and processes have both mental and physical aspects. Matter and mind are two sides of the same coin. Panpsychism is one of many possible approaches that address the “hard part” of the Hard Problem. We adopt this position for all the reasons various authors have listed (Chalmers 1996, Griffin 1997, Hunt 2011, Goff 2017). This first step is particularly powerful if we adopt the Whiteheadian version of panpsychism (Whitehead 1929).

Reaching a position on this fundamental question of how mind relates to matter must be based on a “weight of plausibility” approach, rather than on definitive evidence, because establishing definitive evidence with respect to the presence of mind/experience is difficult. We must generally rely on examining various “behavioral correlates of consciousness” in judging whether entities other than ourselves are conscious – even with respect to other humans – since the only consciousness we can know with certainty is our own.
Positing that matter and mind are two sides of the same coin explains the problem of consciousness insofar as it avoids the problems of emergence because under this approach consciousness doesn’t emerge. Consciousness is, rather, always present, at some level, even in the simplest of processes, but it “complexifies” as matter complexifies, and vice versa. Consciousness starts very simple and becomes more complex and rich under the right conditions, which in our proposed framework rely on resonance mechanisms. Matter and mind are two sides of the coin. Neither is primary; they are coequal.  We acknowledge the challenges of adopting this perspective, but encourage readers to consider the many compelling reasons to consider it that are reviewed elsewhere (Chalmers 1996, Griffin 1998, Hunt 2011, Goff 2017, Schooler, Schooler, & Hunt, 2011; Schooler, 2015).  Taking a position on the overarching ontology is the first step in addressing the Hard Problem. But this leads to the related questions: at what level of organization does consciousness reside in any particular process? Is a rock conscious? A chair? An ant? A bacterium? Or are only the smaller constituents, such as atoms or molecules, of these entities conscious? And if there is some degree of consciousness even in atoms and molecules, as panpsychism suggests (albeit of a very rudimentary nature, an important point to remember), how do these micro-conscious entities combine into the higher-level and obvious consciousness we witness in entities like humans and other mammals?  This set of questions is known as the “combination problem,” another now-classic problem in the philosophy of mind, and is what we describe here as the “easy part” of the Hard Problem. Our characterization of this part of the problem as “easy”[2] is, of course, more than a little tongue in cheek. The authors have discussed frequently with each other what part of the Hard Problem should be labeled the easier part and which the harder part. 
Regardless of the labels we choose, however, this paper focuses on our suggested solution to the combination problem.  Various solutions to the combination problem have been proposed but none have gained widespread acceptance. This paper further elaborates a proposed solution to the combination problem that we first described in Hunt 2011 and Schooler, Hunt, and Schooler 2011. The proposed solution rests on the idea of resonance, a shared vibratory frequency, which can also be called synchrony or field coherence. We will generally use resonance and “sync,” short for synchrony, interchangeably in this paper. We describe the approach as a general resonance theory of consciousness or just “general resonance theory” (GRT). GRT is a field theory of consciousness wherein the various specific fields associated with matter and energy are the seat of conscious awareness.  A summary of our approach appears in Appendix 1.  All things in our universe are constantly in motion, in process. Even objects that appear to be stationary are in fact vibrating, oscillating, resonating, at specific frequencies. So all things are actually processes. Resonance is a specific type of motion, characterized by synchronized oscillation between two states.  An interesting phenomenon occurs when different vibrating processes come into proximity: they will often start vibrating together at the same frequency. They “sync up,” sometimes in ways that can seem mysterious, and allow for richer and faster information and energy flows (Figure 1 offers a schematic). Examining this phenomenon leads to potentially deep insights about the nature of consciousness in both the human/mammalian context but also at a deeper ontological level.

Susanne Schilling

and 9 more

Jessica Mead

and 6 more

The construct of wellbeing has been criticised as a neoliberal construction of western individualism that ignores wider systemic issues, including the increasing burden of chronic disease, widening inequality, concerns over environmental degradation and anthropogenic climate change. While these criticisms overlook recent developments, there remains a need for biopsychosocial models that extend theoretical grounding beyond individual wellbeing, incorporating overlapping contextual issues relating to community and environment. Our first GENIAL model \cite{Kemp_2017} provided a more expansive view of pathways to longevity in the context of individual health and wellbeing, emphasising bidirectional links to positive social ties and the impact of sociocultural factors. In this paper, we build on these ideas and propose GENIAL 2.0, focusing on intersecting individual-community-environmental contributions to health and wellbeing, and laying an evidence-based theoretical framework on which future research and innovative therapeutic interventions could be based. We suggest that our transdisciplinary model of wellbeing - focusing on individual, community and environmental contributions to personal wellbeing - will help to move the research field forward. In reconceptualising wellbeing, GENIAL 2.0 bridges the gap between psychological science and population health systems, and presents opportunities for enhancing the health and wellbeing of people living with chronic conditions. Implications for future generations, including the very survival of our species, are discussed.

Mark Ferris

and 14 more

Introduction

Consistent with World Health Organization (WHO) advice [1], UK Infection Protection Control guidance recommends that healthcare workers (HCWs) caring for patients with coronavirus disease 2019 (COVID-19) should use fluid resistant surgical masks type IIR (FRSMs) as respiratory protective equipment (RPE), unless aerosol generating procedures (AGPs) are being undertaken or are likely, when a filtering face piece 3 (FFP3) respirator should be used [2]. In a recent update, an FFP3 respirator is recommended if “an unacceptable risk of transmission remains following rigorous application of the hierarchy of control” [3]. Conversely, guidance from the Centers for Disease Control and Prevention (CDC) recommends that HCWs caring for patients with COVID-19 should use an N95 or higher level respirator [4]. WHO guidance suggests that a respirator, such as FFP3, may be used for HCWs in the absence of AGPs if availability or cost is not an issue [1].

A recent systematic review undertaken for Public Health England (PHE) concluded that: “patients with SARS-CoV-2 infection who are breathing, talking or coughing generate both respiratory droplets and aerosols, but FRSM (and where required, eye protection) are considered to provide adequate staff protection” [5]. Nevertheless, FFP3 respirators are more effective in preventing aerosol transmission than FRSMs, and observational data suggest that they may improve protection for HCWs [6].
It has therefore been suggested that respirators should be considered as a means of affording the best available protection [7], and some organisations have decided to provide FFP3 (or equivalent) respirators to HCWs caring for COVID-19 patients, despite a lack of mandate from local or national guidelines [8].

Data from the HCW testing programme at Cambridge University Hospitals NHS Foundation Trust (CUHNFT) during the first wave of the UK severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic indicated a higher incidence of infection amongst HCWs caring for patients with COVID-19, compared with those who did not [9]. Subsequent studies have confirmed this observation [10, 11]. This disparity persisted at CUHNFT in December 2020, despite control measures consistent with PHE guidance and audits indicating good compliance. The CUHNFT infection control committee therefore implemented a change of RPE for staff on “red” (COVID-19) wards from FRSMs to FFP3 respirators. In this study, we analyse the incidence of SARS-CoV-2 infection in HCWs before and after this transition.

How it works

Upload or create your research work
You can upload Word, PDF, and LaTeX files, as well as data, code, Jupyter notebooks, videos, and figures. Or start a document from scratch.
Disseminate your research rapidly
Post your work as a preprint. A Digital Object Identifier (DOI) makes your research citeable and discoverable immediately.
Get published in a refereed journal
Track the status of your paper as it goes through peer review. When published, it automatically links to the publisher version.
Learn More

Most recent documents

Muhammad Usman Hadi

and 8 more

Ritwik Raj Saxena

and 1 more

Objective: Techniques that are based on artificial intelligence, specifically machine learning, have played a major role in the enhancement of pharmacological methodologies and the development of medical treatments, especially those that are individualized or fall within the province of precision medicine. In this article, we examine how graph neural networks have revolutionized certain important aspects of pharmacology.

Background: Pharmacological data is replete with unidirectional as well as bidirectional associations, with regard to, for example, drug interactions, patient-centered medicine, precision medicine, multi-omics data analysis, drug discovery, optimization of experimental processes, and other fields. These associations can be more readily modeled using advanced computational methods and machine learning techniques like graph neural networks. The revolutionary advancements in the field of data mining have further fueled the need to create models that can resolve pharmacological correlations and dependencies into facilely interpretable outcomes.

Methods: We conducted a literature review to find documents that provide relevant information about our objectives. With a comprehensive search plan in place, we sequestered applicable articles and studied them to identify pertinent points that assisted our understanding of graph neural networks as a tool to improve, automate, and simplify practical applications in pharmacology and pharmacotherapeutics.

Conclusion: The review of relevant research has confirmed our hypothesis that graph neural networks can be used to create an innovative, lasting, and radical departure in pharmaceutical therapeutics. Graph neural networks can automate and simplify many tasks based on the large and complex datasets that are inherent in pharmacological science.
Such techniques can help us achieve innovative methods in therapeutics using extant pharmaceuticals and in the development of new drugs, and therefore bode well for the future of healthcare.
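As a toy illustration of the core operation behind the graph neural networks discussed above (a sketch, not any specific model from the literature), one round of mean-aggregation message passing over a hypothetical drug-interaction graph can be written as:

```python
def message_pass(features, edges):
    """One round of mean-aggregation message passing.

    features: dict mapping node -> feature vector (list of floats).
    edges: iterable of directed (src, dst) pairs.
    Each node's updated vector is the mean of its own vector and the
    vectors of its in-neighbours; stacking such rounds (plus learned
    weights and nonlinearities) is the essence of a GNN layer.
    """
    updated = {}
    for node, feat in features.items():
        incoming = [features[src] for src, dst in edges if dst == node]
        stacked = [feat] + incoming
        updated[node] = [sum(col) / len(stacked) for col in zip(*stacked)]
    return updated

# Hypothetical drug-interaction graph: an edge means "interacts with".
features = {"drugA": [1.0, 0.0], "drugB": [0.0, 1.0], "drugC": [0.0, 0.0]}
edges = [("drugA", "drugB"), ("drugB", "drugC")]
out = message_pass(features, edges)
```

A trained GNN would interleave such aggregation steps with learned weight matrices; this sketch shows only how directed associations propagate information between neighbouring drugs.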

Bilal Zahid Hussain

and 2 more

This research paper presents a comprehensive investigation into the development of a novel custom neural network model for intrusion detection systems (IDS). In the current era of rapid data transfer facilitated by the internet and advancements in communication technologies, the security of sensitive information is of paramount concern. As attackers continuously devise new methodologies to steal or tamper with data, IDSs face significant challenges in effectively detecting and mitigating intrusions. While extensive research has been conducted to enhance IDS capabilities, the need for improved detection accuracy and reduced false alarm rates remains a pressing issue. Moreover, the identification of zero-day attacks continues to pose a formidable obstacle. In contrast to conventional IDS approaches that heavily rely on statistical methodologies and rule-based expert systems, this study embraces data mining techniques, specifically Neural Networks (NNs), to overcome the limitations associated with large datasets. This research paper proposes a meticulously designed custom neural network model that leverages machine learning (ML) algorithms to analyze contemporary host activity and cloud service data. The paper extensively discusses the utilized dataset, meticulously evaluates the performance of various classifiers, and introduces our innovative neural network model. Emphasizing the significance of our model in anomaly detection, the findings underscore the importance of robust ML models to ensure the efficacy and longevity of deployed defensive systems. By capitalizing on its innovative design and leveraging the power of ML algorithms, our model not only addresses the limitations of traditional IDS approaches but also paves the way for enhanced accuracy, reduced false alarms, and improved resilience against zero-day attacks.
This research contributes to the advancement of the field, shedding light on the novel possibilities and remarkable innovation offered by our custom neural network model in safeguarding critical information in an increasingly hostile digital landscape.

Bilal Zahid Hussain

and 1 more

In the contemporary era of rapid technological advancement, the Industrial Internet of Things (IIoT) has become a pivotal element in revolutionizing industrial operations. This paper delves into the escalating cybersecurity challenges posed by the sprawling networks of IIoT, accentuating the inadequacy of traditional cybersecurity methods in the face of sophisticated cyber threats. We introduce machine learning (ML) as a transformative approach to fortify the cybersecurity landscape of IIoT systems. Our research primarily focuses on the application of machine learning algorithms to detect, analyze, and counteract diverse cyber threats in IIoT environments. These algorithms are trained to recognize and respond to a spectrum of cyber threats, thereby enhancing the resilience of IIoT networks. We present a novel Convolutional-GRU autoencoder model, which demonstrates superior performance over traditional machine learning models in terms of accuracy, precision, recall, and F1-score. This model is adept at learning and adapting from complex data patterns, ensuring robust defense against cyber intrusions. We also address the challenges in applying ML to IIoT cybersecurity, considering the varied nature of IIoT devices and the dynamic landscape of cyber threats. This study is an important stride towards enhancing IIoT cybersecurity, highlighting the symbiotic relationship between ML and IIoT. It serves as a foundation for future research and a guide for current implementations, aiming to create more secure, reliable, and efficient IIoT environments. By exploring the potential of ML in cybersecurity, we pave the way for a new era in industrial digital protection, one that is adaptable, forward-thinking, and resilient against the ever-evolving digital threats.

Wonhyun Lee

and 2 more

Accurately predicting the extent of compound flooding events, including storm surge, pluvial, and fluvial flooding, is vital for protecting coastal communities. However, high computational demands associated with detailed probabilistic models highlight the need for simplified models to enable rapid forecasting. The objective of this study was to assess the accuracy and efficiency of a reduced-complexity, hydrodynamic solver – the Super-Fast INundation of CoastS (SFINCS) model – in a probabilistic ensemble simulation setting, using Hurricane Ike (2008) in the Texas Gulf Coast as a case study. Results show that the SFINCS-based framework can provide probabilistic outputs under reasonable simulation times (e.g., less than 4 hours for a 100-member ensemble on a single CPU). The model agrees well with observed data from NOAA tidal stations and USGS gage height stations. The ensemble approach significantly reduced errors (average 16%) across all stations compared to a deterministic case. The ensemble improved overall performance and revealed wider flood extents and lower depths. Sensitivity studies performed on ensemble sizes (1,000, 189, 81) and lead times (1 to 3 days before landfall) further demonstrate the reliability of flood extent predictions over varying lead times. In particular, counties adjacent to the Trinity River Basin had ≥80% probability of exceeding the critical 3-m flood threshold during Hurricane Ike. Our study highlights the effectiveness of the SFINCS-based framework in providing probabilistic flood extent/depth forecasts over long lead times in a timely manner. Thus, the framework constitutes a valuable tool for effective flood preparedness and response planning during compound flooding.
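The probabilistic flood products described above reduce, at each location, to the fraction of ensemble members whose simulated depth meets a threshold (e.g. the critical 3-m level). A minimal sketch with invented member depths:

```python
def exceedance_probability(depths, threshold):
    """Fraction of ensemble members whose simulated flood depth at a
    given location meets or exceeds the threshold (same units, e.g. m)."""
    return sum(d >= threshold for d in depths) / len(depths)

# Invented 100-member ensemble depths (metres) at one location.
member_depths = [2.5] * 15 + [3.2] * 85
prob = exceedance_probability(member_depths, 3.0)  # fraction >= 3 m
```

A map of such per-cell fractions is what yields statements like "≥80% probability of exceeding the 3-m threshold".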

Zhen Zhou

and 19 more

Using Unmanned Aerial Systems (UAS) equipped with optical RGB cameras and Doppler radar, surface velocity can be efficiently measured at high spatial resolution. UAS-borne Doppler radar is particularly attractive because it is suitable for real-time velocity determination, because the measurement is contactless, and because it has fewer limitations than image velocimetry techniques. In this paper, five cross-sections (XSs) were surveyed within a 10 km stretch of Rönne Å in Sweden. Ground-truth surface velocity observations were retrieved with an electromagnetic velocity sensor (OTT MF Pro) along each XS at 1 m spacing. Videos from a UAS RGB camera were analyzed using both Particle Image Velocimetry (PIV) and Space-Time Image Velocimetry (STIV) techniques. Furthermore, we recorded full waveform signal data using a Doppler radar at multiple waypoints across the river. An algorithm fits two alternative models to the average amplitude curve to derive the correct river surface velocity: a one-peak Gaussian model or a two-peak Gaussian model. Results indicate that both the river flow velocity and the propwash velocity caused by the drone can be found in XSs where the flow velocity is low, while the drone-induced propwash velocity can be neglected in fast and highly turbulent flows. To verify the river flow velocity derived from Doppler radar, a mean PIV value within the footprint of the Doppler radar at each waypoint was calculated. Finally, quantitative comparisons of OTT MF Pro data with STIV, mean PIV and Doppler radar revealed that UAS-borne Doppler radar could reliably measure the river surface velocity.
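The model-selection step described above (one-peak versus two-peak Gaussian fit to the average amplitude curve) can be sketched as follows. This is an illustrative stand-in, not the paper's algorithm: it uses a crude grid search on synthetic data, and every parameter value is invented:

```python
import math
from itertools import combinations

def gauss(v, amp, mu, sigma):
    """Single Gaussian evaluated at velocity v."""
    return amp * math.exp(-((v - mu) ** 2) / (2.0 * sigma ** 2))

def rss(vs, amps, peaks):
    """Residual sum of squares of a sum-of-Gaussians model."""
    return sum((a - sum(gauss(v, *p) for p in peaks)) ** 2
               for v, a in zip(vs, amps))

def nearest_amp(vs, amps, mu):
    """Observed amplitude at the sample closest to mu."""
    i = min(range(len(vs)), key=lambda k: abs(vs[k] - mu))
    return amps[i]

def fit(vs, amps, n_peaks, mus, sigmas):
    """Crude grid-search fit of a one- or two-peak Gaussian model;
    returns (peaks, residual sum of squares)."""
    cands = [(nearest_amp(vs, amps, mu), mu, s) for mu in mus for s in sigmas]
    if n_peaks == 1:
        groups = ([c] for c in cands)
    else:
        groups = (list(pair) for pair in combinations(cands, 2))
    return min(((g, rss(vs, amps, g)) for g in groups), key=lambda t: t[1])

# Synthetic average-amplitude curve: a slow propwash-like peak plus a
# faster river-flow peak (all values invented for illustration).
vs = [i * 0.05 for i in range(41)]                     # velocities, m/s
amps = [gauss(v, 1.0, 0.3, 0.15) + gauss(v, 0.6, 1.2, 0.15) for v in vs]
grid_mus = [round(i * 0.1, 1) for i in range(21)]
peaks2, rss2 = fit(vs, amps, 2, grid_mus, (0.1, 0.15, 0.2))
peaks1, rss1 = fit(vs, amps, 1, grid_mus, (0.1, 0.15, 0.2))
```

When the two-peak model fits markedly better, its two centres separate the drone-induced and river-flow velocity components; a real implementation would use a proper nonlinear least-squares optimizer rather than a grid.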

Cindy S Y Lim

and 3 more

Deep learning (DL) phase picking models have proven effective in processing large volumes of seismic data, including successfully detecting earthquakes missed by other standard detection methods. Despite their success, the applicability of existing extensively-trained DL models to high-frequency borehole datasets is currently unclear. In this study, we compare four established models (GPD, U-GPD, PhaseNet and EQTransformer) trained on regional earthquakes recorded at surface stations (100 Hz) in terms of their picking performance on high-frequency borehole data (2000 Hz) from the Preston New Road (PNR) unconventional shale gas site, in the United Kingdom (UK). The PNR-1z dataset, which we use as a benchmark, consists of continuously recorded waveforms containing over 38,000 previously catalogued seismic events, with magnitudes ranging from -2.8 to 1.1. Remarkably, three of the four DL models recall a good fraction of the events and two might satisfy the monitoring requirements of some users without any modifications. In particular, PhaseNet and U-GPD demonstrate exceptional recall rates of 95% and 76.6%, respectively, and detect a substantial number of new events (over 15,800 and 8,300 events, respectively). PhaseNet’s success might be attributed to its exposure to a more extensive and diverse instrument dataset during training, as well as its relatively small model size, which might mitigate overfitting to its training set. U-GPD outperforms PhaseNet during periods of high seismic rates due to its smaller window size (400 samples compared to PhaseNet’s 3000-sample window). All models start missing events below Mw -0.5, suggesting that the models could benefit from additional training with microseismic datasets. Nonetheless, PhaseNet may satisfy some users' monitoring requirements without further modification, detecting over 52,000 events at PNR.
This suggests that DL models can provide efficient solutions to the big data challenge of downhole monitoring of hydraulic-fracturing induced seismicity as well as improved risk mitigation strategies at unconventional exploration sites.

Browse more recent preprints

Powerful features of Authorea

Under Review
Learn More
Journals connected to Under Review
Ecology and Evolution
Clinical Case Reports
Land Degradation & Development
Mathematical Methods in the Applied Sciences
Biotechnology Journal
Plant, Cell & Environment
International Journal of Quantum Chemistry
PROTEINS: Structure, Function, and Bioinformatics
All IET journals
All AGU journals
All Wiley journals
Featured Collection
Featured communities
Explore More Communities

Other benefits of Authorea


A repository for any field of research, from Anthropology to Zoology


Discuss your preprints with your collaborators and the scientific community

Interactive Figures

Not just PDFs: you can publish d3.js visualizations, graphs, data, code, and Jupyter notebooks

Documents recently accepted in scholarly journals

Tanja Kalic

and 21 more

Background: Recent studies indicated that fish-allergic patients may safely consume certain fish species. Multiplex IgE testing facilitates the identification of species tolerated by individual patients. Methods: Sera were collected from 263 fish-allergic patients from Austria, China, Denmark, Luxembourg, Norway and Spain. Specific (s) IgE to parvalbumins (PVs) from 10 fish species along with IgE to 7 raw and 6 heated fish extracts was quantified using a research version of the ALEX 2 assay. IgE-signatures of individual patients and patient groups were analyzed using SPSS and R. Results: sIgE to alpha-PV from ray, a cartilaginous fish, was not detected in 78% of the patients while up to 41% of the patients, depending on their country of origin, tested negative for at least one beta-PV. sIgE values were highest for mackerel and tuna PVs (>10 kUA/L) and significantly lower for cod (4.9 kUA/L) and sole PVs (2.55 kUA/L). 17% of the patients, although negative for PVs, tested positive for the respective fish extracts. Based on the absence of IgE to PVs and extracts, up to 21% of the patients were identified as potentially tolerating one or more bony fish. Up to 90% of the patients tested negative for ray. The probability of negativity to one fish based on negativity to others was calculated. Negativity to tuna and mackerel emerged as a good marker of negativity to additional bony fish. Conclusion: Measuring sIgE to PVs and extracts from evolutionary distant fish species indicates bony and cartilaginous fish species for tolerance-confirming food challenges.


and 3 more

Hypoxia Inducible Factor (HIF), the main actor in the cell response to hypoxia, represents a potential target in cancer therapy. HIF is involved in many biological processes such as cell proliferation, survival, apoptosis, angiogenesis, iron metabolism and glucose metabolism. This protein regulates the expressions of Lactate Dehydrogenase (LDH) and Pyruvate Dehydrogenase (PDH), both essential for the conversion of pyruvate to be used in aerobic and anaerobic pathways. HIF upregulates LDH, increasing the conversion of pyruvate into lactate which leads to higher secretion of lactic acid by the cell and reduced pH in the microenvironment. HIF indirectly downregulates PDH, decreasing the conversion of pyruvate into Acetyl Coenzyme A which leads to reduced usage of the Tricarboxylic Acid (TCA) cycle in aerobic pathways. Upregulation of HIF may promote the use of anaerobic pathways for energy production even in normal extracellular oxygen conditions. Higher use of glycolysis even in normal oxygen conditions is called the Warburg effect. In this paper, we focus on HIF variations during tumour growth and study, through a mathematical model, its impact on the two key metabolic genes PDH and LDH, to investigate its role in the emergence of the Warburg effect. Mathematical equations describing the enzyme regulation pathways were solved for each cell of the tumour represented in an agent-based model to best capture the spatio-temporal oxygen variations during tumour development caused by cell consumption and reduced diffusion inside the tumour. Simulation results show that reduced HIF degradation in normoxia can induce higher lactic acid production. The emergence of the Warburg effect appears after the first period of hypoxia before oxygen conditions return to a normal level. The results also show that targeting the upregulation of LDH and the downregulation of PDH could be relevant in therapy.
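The qualitative regulation described above (HIF up-regulating LDH, indirectly down-regulating PDH) can be caricatured as a pair of ODEs integrated with forward Euler. This is a toy model with invented rate constants, not the paper's agent-based equations:

```python
def simulate(hif, steps=2000, dt=0.01, k1=1.0, d1=0.5, k2=1.0, d2=0.5):
    """Forward-Euler integration of a toy LDH/PDH regulation model.

    HIF upregulates LDH (production term k1*hif) and indirectly
    downregulates PDH (production term k2/(1+hif)); d1 and d2 are
    first-order decay rates. All constants are invented.
    """
    ldh, pdh = 0.0, 1.0
    for _ in range(steps):
        ldh += dt * (k1 * hif - d1 * ldh)
        pdh += dt * (k2 / (1.0 + hif) - d2 * pdh)
    return ldh, pdh

ldh_norm, pdh_norm = simulate(hif=0.1)  # low (near-normoxic) HIF level
ldh_high, pdh_high = simulate(hif=2.0)  # elevated HIF, e.g. hypoxia or
                                        # reduced degradation in normoxia
```

At steady state the toy model gives LDH proportional to HIF and PDH inversely related to it, reproducing the qualitative shift towards lactate production that the abstract associates with the Warburg effect.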

Heba Saber and 1 more

Cristina Teixeira and 2 more

Objective: To estimate time trends in the frequency of severe perineal tears (SPT) in Portugal and their relationship with episiotomy. Design: Nationwide register-based study using the national inpatient database. Setting: All Portuguese public hospitals. Population: All women with a singleton vaginal delivery between 2000 and 2015. Methods: Time-trend analysis using joinpoint regression models was performed to identify time trends in the prevalence of SPT and of risk factors, including episiotomy. Poisson regression models were fitted to assess the association between episiotomy and SPT. Main Outcome Measures: Annual percentage change (APC) with 95% confidence interval (95% CI) in the prevalence of SPT and its risk factors; adjusted relative risk (RR) with respective 95% CI. Results: Of 908,889 singleton vaginal deliveries, 20.6% were instrumental deliveries, 76.7% with episiotomy, and 0.56% were complicated by SPT. SPT decreased among women with non-instrumental deliveries and no episiotomy from 2009 onwards (1.3% to 0.7%), whereas SPT kept increasing in women with episiotomy for both non-instrumental (0.1% in 2000 to 0.4% in 2015) and instrumental deliveries (0.7% in 2005 to 2.3% in 2015). Episiotomy was associated with a decrease in SPT, with adjusted RR varying between 2000 and 2015 from 0.18 (95% CI: 0.13-0.25) to 0.59 (95% CI: 0.44-0.79) for non-instrumental deliveries and from 0.45 (95% CI: 0.25-0.81) to 0.50 (95% CI: 0.40-0.72) for instrumental deliveries. Conclusions: The episiotomy rate could safely decrease further, as the main factor driving SPT rates seems to be increased awareness and reporting of SPT, particularly among women who underwent an episiotomy.
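The annual percentage change (APC) reported above is the quantity joinpoint models estimate per trend segment: the slope of a log-linear fit to prevalence over calendar year, back-transformed. A minimal sketch of that calculation, on an invented prevalence series, is:

```python
# Sketch of APC estimation from a log-linear trend (one joinpoint segment).
# The prevalence series below is invented for illustration only.
import math

def apc(years, prevalences):
    """Least-squares fit of log(prevalence) = a + b*year; APC = (e^b - 1)*100."""
    n = len(years)
    ys = [math.log(p) for p in prevalences]
    mx = sum(years) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(years, ys)) / \
        sum((x - mx) ** 2 for x in years)
    return (math.exp(b) - 1) * 100

# Hypothetical SPT prevalence (%) growing exactly 8% per year over a segment.
years = list(range(2005, 2016))
prev = [0.7 * 1.08 ** (y - 2005) for y in years]

print(f"APC = {apc(years, prev):.1f}% per year")
```

Joinpoint software additionally searches for the years where the slope changes (e.g. the 2009 inflection reported for non-instrumental deliveries without episiotomy) and fits one such segment on each side.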

Mariem Gdoura and 8 more

Introduction: SARS-CoV-2 serology testing serves multiple purposes, provided an efficient test is chosen. We evaluated and compared 4 different commercial serology tests, three of which had Food and Drug Administration (FDA) approval. Our goal was to provide new data to help guide the interpretation and choice of serological tests. Methods: Four commercial tests were evaluated: Cobas® (Roche®; total anti-N antibodies), VIDAS® (Biomerieux®; IgM and IgG anti-RBD antibodies), Mindray® (IgM and IgG anti-N and anti-RBD antibodies) and Access® (Beckman Coulter®; IgG anti-RBD antibodies). A positive panel (n=72 sera) obtained from confirmed COVID-19 patients and a negative panel (n=119) of pre-pandemic sera were tested. Analytical performances were determined and ROC curves were drawn to assess the manufacturers' thresholds. Results: A large variability between the tests was found. Mindray® IgG and Cobas® showed the best overall sensitivity, 79.2% (95% CI: 67.9-87.8). Cobas® showed the best sensitivity after D14: 85.4% (95% CI: 72.2-93.9). The best specificity was noted for Cobas®, VIDAS® IgG and Access® IgG (100%; 95% CI: 96.9-100). Access® had the lowest sensitivity, even after D14 (55.5%; 95% CI: 43.4-67.3). VIDAS® IgM and Mindray® IgM showed the lowest specificity and sensitivity rates. Overall, only 43 of 72 sera gave concordant results (59.7%). Cut-offs retained for significantly better sensitivity and accuracy, without significantly altering specificity, were 0.87 for VIDAS® IgM (p=0.01), 0.55 for VIDAS® IgG (p=0.05) and 0.14 for Access® (p<10⁻⁴). Conclusion: Although these tests are FDA approved, each laboratory should perform its own evaluation of commercial tests. Inter-test variability raises the concern that seroprevalence estimates may vary significantly depending on the serology test used.
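The sensitivity, specificity and confidence intervals reported in such evaluations follow directly from panel counts. A minimal sketch using a Wilson score interval (one common choice; the study's exact CI method is not stated here) is below; the count of 57/72 true positives is an example chosen to reproduce the 79.2% figure, not data taken from the paper.

```python
# Illustrative calculation of sensitivity/specificity with Wilson 95% CIs
# from positive- and negative-panel counts. Counts are examples only.
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def performance(tp, fn, tn, fp):
    """Sensitivity and specificity with their 95% CIs."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, wilson_ci(tp, tp + fn), spec, wilson_ci(tn, tn + fp)

# Example: 57/72 positive sera detected; all 119 pre-pandemic sera negative.
sens, sens_ci, spec, spec_ci = performance(tp=57, fn=15, tn=119, fp=0)
print(f"sensitivity {sens:.1%} (95% CI {sens_ci[0]:.1%}-{sens_ci[1]:.1%})")
print(f"specificity {spec:.1%} (95% CI {spec_ci[0]:.1%}-{spec_ci[1]:.1%})")
```

Re-deriving performance this way against a local panel is exactly the per-laboratory verification the conclusion recommends.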
