Influence of Depth of Interaction upon the Performance of Scintillator Detectors
and 6 collaborators
T-SNE visualization of large-scale neural recordings
and 2 collaborators
Multi-Frequency Electrical Impedance Tomography Data Collected From Stroke Patients
and 2 collaborators
Electrical Impedance Tomography (EIT) could be used as a rapid and non-invasive technique to image and diagnose ischaemic or haemorrhagic stroke. However, there are currently no suitable imaging/classification methods which can be applied to human stroke data. In part this is due to the complexity of the problem itself, but it is also affected by a lack of available data on which to evaluate different techniques. Multi-frequency EIT data (alongside MRI/CT) have been collected from 23 stroke patients and 10 healthy volunteers, as part of a clinical trial in collaboration with the Hyper Acute Stroke Unit (HASU) at University College London Hospital (UCLH). Data were collected at 17 frequencies between 5 Hz and 2 kHz, with 31 current injections, yielding 930 measurements at each frequency. The raw data, collected simultaneously on all channels using an EEG amplifier sampling at 16 kHz, are also made available.
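The measurement counts above can be sketched numerically. Note this is only an illustration of the arithmetic, not the released file format: the variable names and the 30-voltages-per-injection split are assumptions inferred from 930 / 31 = 30.

```python
# Hypothetical sketch of the dataset's layout (names and the per-injection
# split are assumptions, not the released format).
N_FREQS = 17       # frequencies between 5 Hz and 2 kHz
N_INJECTIONS = 31  # current injection pairs per frame
N_VOLTAGES = 30    # assumed voltage measurements recorded per injection

per_frequency = N_INJECTIONS * N_VOLTAGES
print(per_frequency)  # 930, matching the figure quoted in the abstract
```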
The reservoir pressure concept: a response to the controversy
and 2 collaborators
A versatile and reproducible multi-frequency Electrical Impedance Tomography system
and 4 collaborators
A highly versatile EIT system, nicknamed the ScouseTom, has been developed. The system allows control over current amplitude, frequency, number of electrodes, injection protocol and data processing. A Keithley 6221 current source is used, along with a 24-bit EEG system for voltage recording. Custom PCBs interface with a PC to control the measurement process, electrode addressing and triggering external stimuli. The performance of the system has been characterised using resistor phantoms in experiments representative of human scalp recordings, with an overall SNR of 77.5 dB (n=343), stable across a four hour recording and across frequencies from 20 Hz to 20 kHz. The ScouseTom was used successfully in four modes of operation: time difference, triggered averaging, multi-frequency EIT and impedance spectrum measurements, in experiments investigating stroke and evoked potentials in both rat and human recordings. The experimental procedure is controlled by software and is readily adaptable to new paradigms. Where possible, commercial or open-source components have been used, to minimise the cost and complexity in reproduction. All of the hardware designs and software for the system have been released under an open source licence, encouraging contributions and allowing for rapid replication.
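A figure like the quoted 77.5 dB SNR can be understood as the mean signal amplitude over the standard deviation of its fluctuations, expressed in decibels. The sketch below is not the ScouseTom processing code; the noise level and signal amplitude are made-up numbers chosen only to illustrate the calculation on a simulated resistor-phantom channel.

```python
import numpy as np

def snr_db(samples):
    """SNR in dB: mean amplitude over the std of its fluctuations."""
    samples = np.asarray(samples, dtype=float)
    return 20.0 * np.log10(np.abs(samples.mean()) / samples.std(ddof=1))

rng = np.random.default_rng(0)
# Simulated channel: a 1 mV demodulated signal with ~0.13 uV of noise,
# 343 repeats as in the phantom experiments quoted above.
readings = 1e-3 + rng.normal(0.0, 1.3e-7, size=343)
print(snr_db(readings))  # on the order of the quoted ~77 dB
```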
T-SNE visualization of large-scale neural recordings - Supplementary
Arterial Blood Pressure During Diastole (Draft)
and 1 collaborator
Three models of arterial pressure during diastole; a single exponential with zero asymptote, a single exponential with a non-zero asymptote, and a double exponential with a zero asymptote; are fitted to measurements of pressure in the ascending aorta. The results indicate that the single exponential with zero asymptote fits the data relatively poorly. Both the single exponential with a non-zero asymptote and the double exponential fit the measured pressure during diastole remarkably well. These two models, however, diverge significantly at longer times, commensurate with diastoles extended by missing or ectopic beats. We conclude that the best choice of model can only be ascertained by looking at the arterial pressure during abnormal extended diastoles.
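The divergence argument can be illustrated with the two well-fitting model forms. All parameter values below are synthetic, chosen only so the curves roughly agree over a normal diastole; they are not the paper's fitted values.

```python
import numpy as np

# Two of the candidate diastolic-pressure models described above.
def single_exp_asymptote(t, p0, p_inf, tau):
    # Single exponential decaying toward a non-zero asymptote p_inf
    return (p0 - p_inf) * np.exp(-t / tau) + p_inf

def double_exp_zero(t, a1, tau1, a2, tau2):
    # Sum of two exponentials, both decaying toward zero
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.array([0.6, 3.0])  # end of a normal diastole vs an extended one, s

# Illustrative parameters (mmHg, s) that nearly coincide early on
p_a = single_exp_asymptote(t, 80.0, 35.0, 0.4)
p_b = double_exp_zero(t, 45.0, 0.4, 35.0, 10.0)

print(p_a - p_b)  # small gap at 0.6 s, much larger gap at 3.0 s
```

The second model keeps decaying toward zero while the first flattens at its asymptote, so only pressures recorded during abnormally long diastoles can tell them apart, which is the paper's conclusion.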
The curse of dimensionality reduction. - Or how (not) to reverse engineer the brain.
I have been working in neuroscience for seven years already, but my PhD, in a slightly tangential way, was in molecular biophysics / protein mechanics. It was about figuring out the folding dynamics of protein molecules so small that nobody yet knew how to measure them appropriately. Soon after I started, I realized I had landed myself in an interesting situation. Although most biophysicists / physical chemists work in a molecular biology / biochemistry lab, I ended up working in the Molecular and Nanoscale Physics group of the Department of Physics and Astronomy of my university. My project demanded a strong collaboration with the biochemists and molecular biologists across the road from us (someone had to mutate and purify the proteins I was all too happy to blast with very strong laser beams), but I also spent large chunks of my time interacting with the engineers and physicists in my group.

It was maybe three years into my studies when I started piecing together the reasons I often felt as if I were suffering a bit from a split personality. I especially remember the monthly lab meetings where the physicists would visit the molecular biologists’ offices (or vice versa) and we would all sit down to discuss the progress of a common project (mine being one of those). We all seemed to share the same overarching goal, i.e. to understand how proteins fold and how structure leads to function. Everybody agreed that this was how we would be able to prevent and correct misfolding in the cell and create artificial proteins with novel functionalities. Yet the question of how to go about that goal was what appeared to divide us, at a level that went all the way down to our most basic and assumed notions of how to do science, or even what science is in the first place. Over the course of my degree it dawned on me that there was an oft-used word which appeared to possess two very different meanings according to which group you asked.
The magic word, and in my view the cause of many a misunderstanding, was ’model’. To the molecular biologist, to model a protein was to collect all the data there was to collect for it (structural, dynamic, thermodynamic and what have you), maybe draw some qualitative and descriptive (i.e. language-based) assumptions about how that protein did what it did, and then collate it all together in a nice publication. For the physicist, modeling a protein meant to collect the least amount of data possible and use it to create, expand, tweak, or disprove one of the caricature-like but quantitative descriptions of how all proteins under all conditions fold and function. And there you had it: a model as an exhaustive collection of true information, collated into an easy-to-use look-up table (written in English or another language), or a model as a summarizing set of principles (written in maths) that are general but always a bit wrong. Or, to put it differently, models driven by data, or data driven by models.