A calibration curve plot showing limit of detection (LOD), limit of quantification (LOQ), dynamic range, and limit of linearity (LOL). In analytical chemistry, a calibration curve, also known as a standard curve, is a general method for determining the concentration of a substance in an unknown sample by comparing the unknown to a set of standard samples of known concentration. [1]
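The procedure can be sketched in a few lines: fit a straight line to the standards, then invert the fit to read off the unknown. The concentrations and signals below are hypothetical illustration values, not data from the original source.

```python
import numpy as np

# Hypothetical standards: known concentrations (e.g. mg/L) and the
# instrument response measured for each one.
conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
signal = np.array([0.02, 0.51, 1.01, 1.98, 4.05])

# Least-squares fit of signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

# Invert the line to estimate an unknown sample's concentration from
# its measured signal (valid only inside the linear range, below LOL).
unknown_signal = 1.50
unknown_conc = (unknown_signal - intercept) / slope
```

This is only meaningful between the LOQ and the limit of linearity; outside that range the straight-line inversion no longer holds.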
In thermodynamics, the Volume Correction Factor (VCF), also known as Correction for the effect of Temperature on Liquid (CTL), is a standardized computed factor used to correct for the thermal expansion of fluids, primarily liquid hydrocarbons, at various temperatures and densities. [1]
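As a rough sketch of how such a factor is applied, the snippet below uses a simplified exponential thermal-expansion form in the style of the API MPMS tables. The expansion coefficient and temperatures are hypothetical placeholders, not values from any official table.

```python
import math

def ctl(temp_c, base_temp_c=15.0, alpha=0.00095):
    """Correction for the effect of Temperature on Liquid (CTL).

    Simplified exponential form; alpha (thermal expansion
    coefficient per degC) is a hypothetical illustration value.
    """
    dt = temp_c - base_temp_c
    return math.exp(-alpha * dt * (1.0 + 0.8 * alpha * dt))

# Correct an observed volume at 25 degC back to the 15 degC base:
observed_volume = 1000.0               # litres measured at 25 degC
corrected = observed_volume * ctl(25.0)
```

In practice the coefficient depends on the product's base density and is looked up from standardized tables rather than hard-coded.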
The formal definition of calibration by the International Bureau of Weights and Measures (BIPM) is the following: "Operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties (of the calibrated instrument or ...
The calibration curve that does not use the internal standard method ignores the uncertainty between measurements. The coefficient of determination (R²) for this plot is 0.9985. In the calibration curve that uses the internal standard, the y-axis is the ratio of the nickel signal to the yttrium signal.
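The ratio construction can be sketched as follows. The nickel and yttrium signal values are hypothetical; the point is that dividing by the internal standard cancels drift that affects both channels equally.

```python
import numpy as np

# Hypothetical data: nickel (analyte) and yttrium (internal standard)
# signals measured for each standard concentration.
conc = np.array([1.0, 2.0, 5.0, 10.0])
ni_signal = np.array([120.0, 238.0, 610.0, 1195.0])
y_signal = np.array([1000.0, 990.0, 1010.0, 995.0])

# Internal-standard calibration plots the Ni/Y signal ratio on the
# y-axis instead of the raw Ni signal.
ratio = ni_signal / y_signal
slope, intercept = np.polyfit(conc, ratio, 1)

# Coefficient of determination (R^2) for the ratio calibration line.
pred = slope * conc + intercept
r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)
```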
Custody transfer measurements involve measurements in pipelines, storage tanks, and transportation tanks (tankers, trailers or railway tanks); the whole fuel distribution process must be traceable. Measurements can be made in volume or mass units (or both), so various metering methods are commonly used. [3]
A classical torsion wire-based du Noüy ring tensiometer. The arrow on the left points to the ring itself. The most common correction factors include Zuidema–Waters correction factors (for liquids with low interfacial tension), Huh–Mason correction factors (which cover a wider range than Zuidema–Waters), and Harkins–Jordan correction factors (more precise than Huh–Mason, while still ...
Gravimetric analysis describes a set of methods used in analytical chemistry for the quantitative determination of an analyte (the ion being analyzed) based on its mass. The principle of this type of analysis is that once an ion's mass has been determined as a unique compound, that known measurement can then be used to determine the same analyte's mass in a mixture, as long as the relative ...
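The conversion from the weighed compound back to the analyte is a simple molar-mass ratio (the gravimetric factor). The worked example below uses the classic chloride-as-AgCl determination with standard molar masses; the sample and precipitate masses are hypothetical.

```python
# Gravimetric analysis sketch: chloride determined by precipitating
# it as AgCl and weighing the dried precipitate.
M_CL = 35.45      # molar mass of Cl, g/mol
M_AGCL = 143.32   # molar mass of AgCl, g/mol

def chloride_mass(agcl_mass_g):
    # Gravimetric factor M(Cl)/M(AgCl) converts precipitate mass to
    # analyte mass: each mole of AgCl contains one mole of Cl.
    return agcl_mass_g * (M_CL / M_AGCL)

sample_mass = 0.5000   # g of sample dissolved (hypothetical)
precipitate = 0.2814   # g of dried AgCl collected (hypothetical)
pct_cl = 100 * chloride_mass(precipitate) / sample_mass
```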
Calibration involves taking three readings: one with an empty tube (R₀), one with the tube filled with the calibration reference material, and one with the tube filled with the sample (Rₛ). Some balances feature an auto-tare function that eliminates the need for the R₀ measurement. [11] The first two readings provide a calibration constant (C).
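The two-reading calibration can be sketched as follows, assuming the readings are oscillation periods of a vibrating tube, where density is proportional to the difference of squared periods, and assuming the empty-tube reading corresponds to zero density (in practice air is often used and its small density accounted for). All numeric readings are hypothetical.

```python
# Oscillating-tube density measurement sketch (assumptions noted above).
def calibration_constant(r0, r_ref, rho_ref):
    # The empty and reference readings fix the constant C in
    # rho = C * (R^2 - R0^2).
    return rho_ref / (r_ref ** 2 - r0 ** 2)

def sample_density(r0, rs, c):
    return c * (rs ** 2 - r0 ** 2)

# Hypothetical readings, with water (998.2 kg/m^3) as the reference:
C = calibration_constant(r0=2.000, r_ref=2.500, rho_ref=998.2)
rho = sample_density(r0=2.000, rs=2.400, c=C)
```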