\item Sensitivity – How sensitive are results to changes in inputs or to assumed boundary conditions?
\end{itemize}

Similar, and perhaps more difficult, problems pertain to simulation data. While such data may be perfectly precise in a numerical sense, simulations typically rely on many parameters, assumptions, and/or approximations.

In principle, if the above are specified, and the quantitative metrics meet user requirements, the data can be used with a high level of confidence. A similar approach to defining data quality was recently proposed within the context of the Nanotechnology Knowledge Infrastructure Signature Initiative within the National Nanotechnology Initiative \cite{DRLs}.

An often posed question in the research community with regard to data associated with peer-reviewed journal articles is that of peer review of the data itself. Indeed, it has been reported that approximately 50\% of data being reviewed for submission to The American Mineralogist Crystal Structure Database contained errors \cite{downs}.

The elements defined above represent the key criteria by which to judge the quality of the data. General pedigree and provenance information are typically conveyed in most research articles, though they may be provided in insufficient detail to reproduce the data. The remaining elements of validation, verification, uncertainty, and sensitivity are relatively loosely defined within materials science and engineering, and best practices have not generally been developed for each element or, where developed, are not in widespread use.
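
As one illustration of how the sensitivity element could be made quantitative (a sketch only, not a prescription drawn from the cited initiatives; the symbols $S_{ij}$, $y_i$, $x_j$, and $\Delta x_j$ are introduced here for illustration), a normalized sensitivity coefficient can be estimated for each reported result with respect to each input or boundary-condition parameter, for example by finite differences about the nominal inputs:
\begin{equation}
S_{ij} \;=\; \frac{\partial y_i}{\partial x_j}\,\frac{x_j}{y_i}
\;\approx\; \frac{y_i(x_j + \Delta x_j) - y_i(x_j)}{\Delta x_j}\,\frac{x_j}{y_i(x_j)},
\end{equation}
where $y_i$ is a reported result, $x_j$ an input or assumed boundary condition, and $\Delta x_j$ a small perturbation. Values of $|S_{ij}|$ much greater than unity would flag results that are strongly sensitive to the corresponding input or assumption, and reporting such coefficients alongside the data is one possible route toward the kind of best practice noted above.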