Public Articles
Thermal Testing of Non-Volatile Memories
and 2 collaborators
Whether new non-volatile memory (NVM) technologies still function properly in space remains an open research question. Because manufacturer data sheets for electrical components are based on data collected from tests done on Earth, and most electrical components are not designed with low Earth orbit (LEO) operation in mind, it is not a foregone conclusion that these components will operate according to manufacturer specifications in space conditions. These arduous conditions pose an immense engineering challenge: designing components that are hardened against wild temperature fluctuations, bombardment from galactic cosmic radiation, solar particle events, and more \cite{Keys_2008} \cite{Xapsos_2012}.
From Euclidean to Riemannian Centers of Classes: Information Geometry for SSVEP Classification
and 2 collaborators
Brain Computer Interfaces (BCI) based on electroencephalography (EEG) rely on multichannel brain signal processing. Most of the state-of-the-art approaches deal with covariance matrices, and Riemannian geometry has indeed provided a substantial framework for developing new algorithms. Most notably, a straightforward algorithm such as Minimum Distance to Mean (MDM) yields competitive results when applied with a Riemannian distance or divergence. This applied contribution assesses the impact of several distances/divergences on a real EEG dataset, as the invariances embedded in those distances/divergences influence the classification accuracy. Riemannian centers of classes compare favorably with Euclidean ones both in terms of quality of results and of computational load. Riemannian distances cope with signal variabilities and reduce the adverse effect of artifacts in the EEG signal.
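At test time, the MDM classifier mentioned in this abstract reduces to computing one distance per class. A minimal sketch (assuming NumPy and SciPy; the affine-invariant Riemannian metric below stands in for whichever distance or divergence is being evaluated, and the function names are our own):

```python
import numpy as np
from scipy.linalg import eigvalsh

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD covariance matrices.

    Uses the identity d(A, B) = sqrt(sum_i log^2 lambda_i), where the
    lambda_i are the generalized eigenvalues of the pencil (B, A),
    i.e. the spectrum of A^{-1} B.
    """
    lam = eigvalsh(B, A)                      # generalized eigenvalues
    return np.sqrt(np.sum(np.log(lam) ** 2))

def mdm_predict(cov, class_means):
    """Minimum Distance to Mean: assign `cov` to the nearest class center."""
    dists = {label: airm_distance(cov, mean)
             for label, mean in class_means.items()}
    return min(dists, key=dists.get)
```

Note the metric's invariances doing the work: `airm_distance` is unchanged under any invertible congruence `A -> W A W^T`, which is why artifacts that rescale or mix channels perturb the classification less than with a Euclidean distance.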
Progmosis: Evaluating Risky Individual Behavior During Epidemics Using Mobile Network Data
and 4 collaborators
The possibility to analyze, quantify and forecast epidemic outbreaks is fundamental when devising effective disease containment strategies. Policy makers are faced with the intricate task of drafting realistically implementable policies that strike a balance between risk management and cost. Two major techniques policy makers have at their disposal are: epidemic modeling and contact tracing. Models are used to forecast the evolution of the epidemic both globally and regionally, while contact tracing is used to reconstruct the chain of people who have been potentially infected, so that they can be tested, isolated and treated immediately. However, both techniques might provide limited information, especially during an already advanced crisis when the need for action is urgent.
In this paper we propose an alternative approach that goes beyond epidemic modeling and contact tracing, and leverages behavioral data generated by mobile carrier networks to evaluate contagion risk on a per-user basis. The individual risk represents the loss incurred by not isolating or treating a specific person, both in terms of how likely it is for this person to spread the disease and how many secondary infections he or she will cause. To this end, we develop a model, named Progmosis, which quantifies this risk based on movement and regional aggregated statistics about infection rates. We develop and release an open-source tool that calculates this risk based on cellular network events. We simulate a realistic epidemic scenario, based on an Ebola virus outbreak, and find that gradually restricting the mobility of a subset of individuals reduces the number of infected people after 30 days by 24%.
While these results are promising, it is important to underline that this is only initial, foundational work and to stress some key points. First, this paper focuses on a theoretical model rather than on its actual translation into a real-world system. In particular, centralized deployments of this model would pose several ethical questions, as they would require access to user data. Decentralized deployments, in which user mobility data never leaves a user's mobile device, are possible and should be preferred, as they fully protect user privacy. Second, results are generated from computer-based simulations under specific assumptions; social factors and technical difficulties might greatly affect results obtained in the real world. Third, this risk-assessment tool is not designed specifically for implementing containment measures based on mobility restrictions. For example, it could be used to advise users about the most appropriate behavior given their risk profile (e.g., willingly changing their own behavior, seeing a doctor, and similar); users would then choose whether to follow the advice. Finally, the simulations were run on call data records from a country that is, according to the WHO, Ebola-free \cite{who_senegal_2014}, and this work was not commissioned by Orange or by any other entity in preparation for a real-world disease outbreak.
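The shape of a per-user risk score like the one described above can be caricatured in a few lines. Everything here (the function name, the exposure weighting, the `r0` default) is a hypothetical illustration of the idea, not the actual Progmosis formula:

```python
def contagion_risk(time_fraction_by_region, infection_rate_by_region, r0=1.5):
    """Toy per-user contagion risk score (illustrative, NOT Progmosis itself).

    Exposure is the user's time-weighted average of regional infection
    rates; the score scales that by the user's own infection (the 1)
    plus the expected secondary infections they would cause (r0).
    """
    exposure = sum(frac * infection_rate_by_region.get(region, 0.0)
                   for region, frac in time_fraction_by_region.items())
    return exposure * (1.0 + r0)

# A user splitting time between a low-rate and a high-rate region scores
# higher than one staying entirely in the low-rate region.
commuter = contagion_risk({"A": 0.5, "B": 0.5}, {"A": 0.1, "B": 0.3})
homebody = contagion_risk({"A": 1.0}, {"A": 0.1, "B": 0.3})
```

The point of such a score is the ranking it induces: mobility restrictions (or advice) can then target the top-scoring users first, which is the intervention the simulation in the abstract evaluates.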
Shining a Light on Dark Matter
and 3 collaborators
The objective of our project is to understand the mystery that is dark matter, a key area of physics, and of cosmology in particular. Not only does dark matter make up a significant part of the composition of the universe (around 27%), it also remains an elusive and unknown quantity. Our research into the topic mostly comes from a cosmological standpoint, but studying dark matter requires a broad knowledge base. From experimental particle physics looking at the very small, up through astrophysics and on to the Universe-wide scale that we explore through cosmology, this report aims to explore many aspects of the subject. By looking at galaxy rotation curves and dark matter halos, the implications of hot and cold dark matter and the candidate particles associated with them, and the direct and indirect detection of dark matter, this report aims to validate, or reject, the major theories of dark matter.
Medium and Large Mammal Richness in the Refugio de Vida Silvestre y Marino Costera Pacoche
and 7 collaborators
Shining a Light on Dark Matter, Lancaster Version
and 3 collaborators
The objective of our project is to understand the mystery that is dark matter, a key area of physics, and of cosmology in particular. Not only does dark matter make up a significant part of the composition of the universe (around 27%), it also remains a rather elusive and unknown quantity. Though our research into the topic mostly comes from a cosmological standpoint, dark matter is a multi-disciplinary area drawing from many fields. From experimental particle physics looking at the very small, up through astrophysics and on to the Universe-wide scale that we explore through cosmology, this report aims to explore many aspects of the subject. By looking at galaxy rotation curves and dark matter halos, the implications of hot and cold dark matter and the candidate particles associated with them, and the direct and indirect detection of dark matter, this report aims to validate, or reject, the major theories of dark matter.
Fitted EOFs
and 1 collaborator
This manuscript discusses our thoughts on the use of fitted EOFs for climate studies. Fitted EOF analysis is an extension of traditional EOFs that uses multivariate regression to extract EOFs and associated PCs encapsulating the relationship between predictors and response. Fitted EOFs of ENSO and volcanic aerosols are identified by estimating the impact of these factors on a grid of surface temperature anomalies in the Tropics. We mapped the influence of ENSO and volcanoes on temperature and removed these impacts to provide an adjusted reconstructed grid of surface temperature (1856-2011). Spatial gaps are filled with ordinary kriging, as gappy data are a bottleneck for EOF analysis. ENSO accounts for more variability in surface temperature than volcanoes. The adjusted annual average temperature time series indicates warming, as does the unadjusted version; however, it plateaus prominently after 2000.
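For readers unfamiliar with the machinery: classical EOF analysis, the starting point that fitted EOFs extend with a regression step, is just a singular value decomposition of the anomaly field. A minimal sketch (the function name and the time-by-space array layout are our own choices, not from the manuscript):

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """Classical EOF analysis of a (time, space) anomaly field via SVD.

    Returns the leading spatial patterns (EOFs), their principal-component
    time series (PCs), and the fraction of variance each mode explains.
    """
    anomalies = field - field.mean(axis=0)        # remove the time mean
    U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
    variance_fraction = s ** 2 / np.sum(s ** 2)
    eofs = Vt[:n_modes]                           # spatial patterns
    pcs = U[:, :n_modes] * s[:n_modes]            # scaled time series
    return eofs, pcs, variance_fraction[:n_modes]
```

Keeping all modes reconstructs the anomaly field exactly (`pcs @ eofs`), which is also why gap-free input matters: the SVD has no notion of missing values, hence the kriging pre-fill described above.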
(As it stands, I've just taken a LaTeX template for the journal Frontiers here, but I wouldn't anticipate that we would ever submit it there. I've also added some section outlines. As a primary goal, the abstract should make the general significance and conceptual advance of the work clearly accessible to a broad readership. References should not be cited in the abstract. Refer to http://www.frontiersin.org/ or Table [Tab:01] for abstract requirements and length according to article type.)
Keywords: EOFs, ENSO, volcanic aerosols, warming, Tropics, surface temperature
Review of cpd-11-4003-2015
I feel this manuscript provides a strong rationale and a very helpful description of the PlioMIP2 experiment. I recommend that it be published subject to some revisions. I look forward to the actual experiment and hope that some interesting science will emerge from it. Below I suggest some big revisions to the ensemble of simulations requested; I am happy to discuss with the authors directly whether these revisions truly represent better value for resources.
The vast majority of the simulations are required solely for the forcing factorisation. You might want to consider just how important you feel this component of the research is. I worry whether the number of simulations required really justifies the extra effort: they need 6x as much computation as just doing the PlioMIP2 entry card, but surely gain nothing like six times the information (since all CMIP6 models must do the DECK, I'm not counting the preindustrial run). You may want to present the factorisation as a sub-experiment; otherwise PlioMIP2 appears really daunting.
Solution and Response to Dark Matter
Kaizen Event Strategy: A Review of Literature
and 2 collaborators
ABSTRACT
Continuous improvement methodologies typically put their best effort into planning, doing, checking, and acting (correctively) on all the processes that occur in organizations. Likewise, they aim to maximize the efficient use of productive and material resources through long-term projects, which are expected to lead the company to better results. However, these initiatives often neglect the development and promotion of the human resources who make such improvement possible, which may also jeopardize its sustainability over time. In this article, the Kaizen Event Strategy is reviewed and analyzed through current concepts of continuous improvement. A characterization and a critical practice are also proposed that improve understanding of the concept and leverage the success of its implementation. Finally, some criteria are proposed to ensure that processes and people improve in tandem within organizations, through short-term projects that are properly defined and focused.
Keywords: Continuous Improvement, Process Management, Kaizen, Lean, Competences and Skills
Comparison of Physical and Artificial Intelligence Models for Flood Prediction
and 1 collaborator
Floods are a natural phenomenon that occurs when rainfall is so frequent or so heavy that the soil's absorption capacity is exceeded, causing the water to change course and spread into the areas adjacent to it (SDAB, 2009). When these phenomena occur in populated urban environments, the consequences become more pronounced, since damage occurs not only at the environmental level but also at the social and economic levels, requiring large investments in support for victims and in the recovery of the flooded areas (CAR, 2011). In this regard, World Bank data \cite{mundial2012analisis} show that floods account for 43% of destroyed homes and around 10% of the loss of human lives. Moreover, two of the high-climate-variability events that pose the greatest threat in Colombia are the "El Niño" and "La Niña" phenomena. The former is characterized by droughts and water scarcity, producing forest fires; La Niña, in turn, brings greater soil moisture saturation, which leads to events such as landslides and flash floods in river systems, occurring in Colombia especially in the Andean, Caribbean, and Pacific regions. In January 2011 it was necessary to declare a state of economic, social, and ecological emergency throughout Colombian territory because of the devastating effects of the floods. The Corporación Autónoma Regional (CAR) of the Bogotá River basin maintains that probabilistic models are needed to estimate climate variability and identify rises in river volume, since such models can drive natural-disaster alerts and provide useful information for decision-making on emergency prevention (CAR, 2011).
In this regard, hydrology has traditionally relied on flood-forecasting methods based on linear regression \cite{pandey1999comparative}, which measure the relationship between the dependent and independent variables of the phenomenon \cite{weisberg2005applied}. Their great drawback has been the problems and limitations they face in prediction, not only because of the climate change taking place on Earth \cite{huffman2001geographic}, or the difficulty of calibration and the robust optimization tools required \cite{kia2012artificial}, but because these phenomena are nonlinear, which makes this type of predictive model inappropriate \cite{dawson2006flood, aqil2007analysis}. As the preceding paragraphs illustrate, although traditional methods have been of great help in forecasting floods, researchers have taken on the task of studying new, more efficient models with greater forecasting accuracy.
Another type of flood-forecasting method is physical models based on hydraulic principles, which explain the behavior of river channels through physical laws combined with differential equations. One such physical model is implemented in the Hec-Ras software, created by the United States Army \cite{tisseuil2010statistical}. The great problem with physical models is the amount of information they require, in terms of hydrometeorological variables (discharge, water level, precipitation, runoff, among others), as well as geological and topographic characteristics of the channel, such as terrain bathymetry, soil types, rating curves, and runoff parameters \cite{merwade2008gis, kia2012artificial}. This prevents the application of such models where basins have not been characterized in terms of storage capacity, water catchment, and probable flood zones around the river \cite{werner2006regional, park2012integrated, callow2013studying}.
In addition, various studies have estimated geographic parameters in rivers, with the hydraulic simulation carried out in the Hec-Ras software \cite{guida2015strategic, manfreda2014investigation, dimitriadis2016comparative}. Flood modeling has also been analyzed by assessing flood risk through 3D, 2D, and 1D hydrological simulation systems with Hec-Ras \cite{zazo2015analysis}. Among these investigations, the use of the hydraulic system stands out for providing greater geographic detail of the area; likewise, the software performs a more appropriate analysis of the model parameters for flood prediction.
On the other hand, prediction models for future events have recently been developed that integrate artificial-intelligence techniques, which have a flexible mathematical structure capable of identifying complex nonlinear relationships between input and output data in cases where it is difficult to describe the process with physical equations \cite{seckin2013comparison}. Some of the most widely used artificial-intelligence tools for flood forecasting worldwide are soft-computing techniques such as Artificial Neural Networks (ANNs), mathematical models inspired by neurological processes that simulate the functioning of the brain to solve problems \cite{kalteh2013monthly, wang2009comparison}, and neuro-fuzzy systems (ANFIS), which combine the previously mentioned tools and can be used to build forecasting models \cite{aqil2007analysis}.
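To make the linear-versus-nonlinear contrast above concrete, here is a minimal, entirely synthetic sketch (NumPy only; the data, network size, and learning rate are illustrative assumptions, not taken from any of the cited studies): a one-hidden-layer ANN fitted by gradient descent to a threshold-like rainfall-to-level relationship that a linear regression cannot capture.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical synthetic data: river level responds nonlinearly
# (threshold-like) to rainfall once soil absorption is exceeded.
x = rng.uniform(0.0, 1.0, size=(200, 1))                  # scaled rainfall
y = np.maximum(x - 0.4, 0.0) ** 1.5 + rng.normal(0, 0.01, size=(200, 1))

# Linear-regression baseline (the traditional approach discussed above).
A = np.hstack([x, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
mse_lin = float(np.mean((A @ coef - y) ** 2))

# One-hidden-layer ANN trained with full-batch gradient descent;
# tanh gives it the nonlinearity a linear regression lacks.
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(5000):
    H = np.tanh(x @ W1 + b1)                              # hidden layer
    err = (H @ W2 + b2) - y                               # prediction error
    gW2 = H.T @ err / len(x); gb2 = err.mean(axis=0)      # backpropagation
    dH = (err @ W2.T) * (1.0 - H ** 2)
    gW1 = x.T @ dH / len(x); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse_nn = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
```

On this toy series the network's mean squared error falls below the linear baseline's, mirroring the nonlinearity argument of \cite{dawson2006flood, aqil2007analysis}; real studies would of course use multivariate hydrometeorological inputs rather than a single rainfall series.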
Fake Source Injection with lcogtsnpipe
Research Olympics
What Really Happened: Benjamin Franklin's Kite Experiment
Authorea User Spotlight: Jenna Morgan Lang
and 1 collaborator
Interdisciplinarity: Working Together Takes Work
A big challenge, but one that I enjoy, is that the important problems, many of the most societally relevant ones, can no longer be solved with physics alone, as for the transistor, or biology alone, as for the polio vaccine. It is increasingly the case that we need to bring different groups of people together from very different disciplines to partner and tackle important problems. By analogy, we can no longer act like golf or tennis players; we now have to think in terms of baseball or football. A baseball team will not be successful if it is full of shortstops.
7 Crazy Things You Didn't Know About DNA
Data Visualization: Create Powerful Infographics
and 1 collaborator
When the Obstacle is the Course: Job Security in Academia
This post is part of the series called Obstacles in Academia, which aims to highlight the many challenges young scientists face today.
Authorea Partners with Italian Doctoral Association
and 1 collaborator
What Really Happened: Darwin's Finches
The Decline of Accuracy in Science Communication: Who is to Blame?
Journal Of Management Template