
Bayesian Analysis of High Throughput Measurement Data
  • Eric Ma

Corresponding Author: [email protected]

Abstract

Duplicate or triplicate experimental replicates are commonplace in the high-throughput (HT) literature, yet whether this practice is statistically defensible has not been tested. To address this issue, we use probabilistic programming to extend Kruschke's Bayesian two-sample comparison model. Using the model with simulated data, we show that a small increase in the number of replicate experiments quantitatively improves measurement accuracy. We also provide posterior densities for statistical parameters used in the evaluation of HT data. Finally, we provide an extensible, open-source implementation that ingests data structured in a simple format and produces posterior densities of estimated measurement and assay-evaluation parameters.
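To make the approach concrete, the sketch below shows a minimal Kruschke-style Bayesian two-sample comparison written with the PyMC probabilistic-programming library. The data, variable names, and priors are illustrative assumptions for exposition, not the authors' exact model or code; the paper's extensions to this baseline are not reproduced here.

```python
import numpy as np
import pymc as pm

# Hypothetical replicate measurements for two conditions (illustrative only).
control = np.array([1.02, 0.98, 1.05])
treatment = np.array([1.31, 1.25, 1.40])

pooled = np.concatenate([control, treatment])
mu_prior_sd = 2 * pooled.std()

with pm.Model() as best_model:
    # Group means: broad normal priors centred on the pooled mean.
    mu_c = pm.Normal("mu_control", mu=pooled.mean(), sigma=mu_prior_sd)
    mu_t = pm.Normal("mu_treatment", mu=pooled.mean(), sigma=mu_prior_sd)

    # Group standard deviations: weakly informative half-normal priors (an assumption).
    sigma_c = pm.HalfNormal("sigma_control", sigma=pooled.std())
    sigma_t = pm.HalfNormal("sigma_treatment", sigma=pooled.std())

    # Shared normality parameter; small nu permits heavy tails (outlier robustness).
    nu = pm.Exponential("nu_minus_one", lam=1 / 29.0) + 1

    # Student-t likelihoods over each group's replicate measurements.
    pm.StudentT("obs_control", nu=nu, mu=mu_c, sigma=sigma_c, observed=control)
    pm.StudentT("obs_treatment", nu=nu, mu=mu_t, sigma=sigma_t, observed=treatment)

    # Posterior densities for the comparison quantities of interest.
    diff_means = pm.Deterministic("difference_of_means", mu_t - mu_c)
    pm.Deterministic(
        "effect_size", diff_means / pm.math.sqrt((sigma_c**2 + sigma_t**2) / 2)
    )

    trace = pm.sample(2000, tune=1000)
```

Sampling this model yields full posterior densities for the group means, standard deviations, their difference, and the effect size; increasing the number of replicates per group tightens these posteriors, which is the kind of accuracy comparison the abstract describes for simulated data.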