In analytical chemistry, the detection limit, lower limit of detection, or limit of detection (LOD), also called analytical sensitivity (not to be confused with statistical sensitivity), is the lowest quantity of a substance that can be distinguished from the absence of that substance (a blank value) with a stated confidence level (generally 99%).
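As a concrete illustration, one common convention estimates the LOD as the mean blank signal plus roughly three standard deviations of the blank. The sketch below assumes that convention; the function name, the choice of k = 3, and the blank readings are illustrative, not a prescribed procedure:

```python
import statistics

def limit_of_detection(blank_signals, k=3):
    """Estimate LOD as mean(blank) + k * sd(blank); k ~ 3 corresponds to roughly 99% confidence."""
    return statistics.mean(blank_signals) + k * statistics.stdev(blank_signals)

# Hypothetical replicate blank readings (instrument signal units).
blanks = [0.11, 0.09, 0.12, 0.10, 0.08, 0.11, 0.10]
print(limit_of_detection(blanks))  # signals above this are distinguishable from the blank
```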
[Figure: a calibration curve plot showing limit of detection (LOD), limit of quantification (LOQ), dynamic range, and limit of linearity (LOL).]
In analytical chemistry, a calibration curve, also known as a standard curve, is a general method for determining the concentration of a substance in an unknown sample by comparing the unknown to a set of standard samples of known concentration. [1]
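A minimal sketch of the method, assuming a simple linear response over the dynamic range; the standard concentrations and signals below are invented for illustration:

```python
import numpy as np

# Hypothetical standards: known concentrations vs. measured instrument response.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])       # e.g. mg/L
signal = np.array([0.02, 0.21, 0.39, 0.98, 1.95])

# Least-squares fit of signal = slope * conc + intercept over the linear range.
slope, intercept = np.polyfit(conc, signal, 1)

def concentration_of(unknown_signal):
    """Invert the fitted calibration line to estimate an unknown sample's concentration."""
    return (unknown_signal - intercept) / slope

print(round(concentration_of(0.60), 2))  # unknown read off against the standards
```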
Because of the complex interrelationship between analytical method, sample concentration, limits of detection, and method precision, analytical quality control is managed using a statistical approach to determine whether the results obtained lie within an acceptable statistical envelope.
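One common form of such an envelope, assumed here for illustration, is Shewhart-style control limits of mean ± k standard deviations of historical control-sample results; the function and data below are a hypothetical sketch, not a prescribed QC scheme:

```python
import statistics

def within_envelope(qc_history, new_result, k=3):
    """Accept a result only if it lies within mean ± k*sd of prior QC measurements."""
    mu = statistics.mean(qc_history)
    sd = statistics.stdev(qc_history)
    return abs(new_result - mu) <= k * sd

history = [5.02, 4.98, 5.05, 4.97, 5.01, 5.03]  # hypothetical control-sample results
print(within_envelope(history, 5.04))  # True: inside the statistical envelope
print(within_envelope(history, 5.40))  # False: flagged for investigation
```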
There are differing definitions within laboratory quality control, wherein "analytical sensitivity" is defined as the smallest amount of substance in a sample that can accurately be measured by an assay (synonymous with the detection limit), and "analytical specificity" is defined as the ability of an assay to measure one particular organism or substance, rather than others, in a sample.
Detection bias occurs when a phenomenon is more likely to be observed in a particular set of study subjects. For instance, the syndemic involving obesity and diabetes may mean doctors are more likely to look for diabetes in obese patients than in thinner patients, inflating the apparent rate of diabetes among obese patients because of skewed detection effort.
In statistical quality control, the CUSUM (or cumulative sum control chart) is a sequential analysis technique developed by E. S. Page of the University of Cambridge. It is typically used for monitoring change detection. [1] CUSUM was announced in Biometrika in 1954, a few years after the publication of Wald's sequential probability ratio test.
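A minimal sketch of the standard one-sided (upper) tabular CUSUM, with slack value k and decision interval h; the parameter values and data below are illustrative:

```python
def upper_cusum(samples, target, k, h):
    """Accumulate excesses over target + k; signal a shift once the running sum exceeds h."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target) - k)
        if s > h:
            return i  # index at which an upward shift is signalled
    return None  # no shift detected

data = [10.1, 9.9, 10.0, 10.2, 10.6, 10.7, 10.8, 10.9]
print(upper_cusum(data, target=10.0, k=0.25, h=1.0))  # signals at index 6
```

A lower-sided chart mirrors this by accumulating shortfalls below target − k; monitoring both directions detects shifts either way.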
The sensitivity index (also called the discriminability index or detectability index, commonly written d′) is a dimensionless statistic used in signal detection theory. A higher index indicates that the signal can be more readily detected.
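Under the usual equal-variance Gaussian model, d′ can be computed from hit and false-alarm rates as the difference of their z-scores; the rates below are hypothetical:

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """d' = z(hit rate) - z(false-alarm rate) under equal-variance Gaussian assumptions."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(false_alarm_rate)

print(round(d_prime(0.85, 0.20), 3))  # larger d' means the signal is easier to detect
```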
In statistics, Grubbs's test or the Grubbs test (named after Frank E. Grubbs, who published the test in 1950 [1]), also known as the maximum normalized residual test or extreme studentized deviate test, is a test used to detect outliers in a univariate data set assumed to come from a normally distributed population.
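A sketch of the two-sided test, assuming the standard t-distribution formula for the critical value; the significance level and data are illustrative:

```python
from math import sqrt
import statistics
from scipy import stats

def grubbs_statistic(data):
    """G = max |x_i - mean| / sd: the maximum normalized residual."""
    mu, sd = statistics.mean(data), statistics.stdev(data)
    return max(abs(x - mu) for x in data) / sd

def grubbs_critical(n, alpha=0.05):
    """Two-sided critical value from the t-distribution with n - 2 degrees of freedom."""
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    return (n - 1) / sqrt(n) * sqrt(t**2 / (n - 2 + t**2))

data = [9.8, 10.1, 9.9, 10.0, 10.2, 14.5]  # hypothetical sample with one suspect point
print(grubbs_statistic(data) > grubbs_critical(len(data)))  # True: flag as an outlier
```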