Definition - What does Nanometer (nm) mean?
A nanometer (nm) is a metric unit of length equal to one billionth (1×10⁻⁹) of a meter. It is commonly used in nanotechnology, the engineering of structures and devices at extremely small scales.
The ability to monitor corrosion at the nanometer scale is a powerful tool for developing a fundamental understanding of surface chemical processes. Coating thickness is often measured on the nanometer scale, and corrosion progression can also be tracked in nanometers.
Corrosionpedia explains Nanometer (nm)
The nanometer is often used to express dimensions on the atomic scale; the diameter of a helium atom, for example, is about 0.1 nm, and that of a ribosome is about 20 nm. The nanometer is also commonly used to specify the wavelength of electromagnetic radiation near the visible part of the spectrum; visible light spans roughly 400 to 700 nm. The angstrom, equal to 0.1 nanometer, was formerly used for these purposes.
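The unit relationships above (1 nm = 10⁻⁹ m, 1 angstrom = 0.1 nm) are simple scale factors. As a minimal illustration, the conversions can be sketched in Python; the function names here are hypothetical, chosen only for this example:

```python
# Scale factors stated in the text.
NM_PER_M = 1e9          # 1 meter = 1e9 nanometers
NM_PER_ANGSTROM = 0.1   # 1 angstrom = 0.1 nanometers

def m_to_nm(meters: float) -> float:
    """Convert a length in meters to nanometers."""
    return meters * NM_PER_M

def angstrom_to_nm(angstroms: float) -> float:
    """Convert a length in angstroms to nanometers."""
    return angstroms * NM_PER_ANGSTROM

# A nanometer is one billionth of a meter:
print(m_to_nm(1e-9))        # -> 1.0 nm
# The former atomic-scale unit, the angstrom:
print(angstrom_to_nm(1.0))  # -> 0.1 nm
```

The same factors apply to the examples in the text: a helium atom (about 0.1 nm) is roughly one angstrom across, and visible light at 400 nm corresponds to 4×10⁻⁷ m.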
Measurements at the nanometer scale can resolve the structure of crack tips, rates of crack growth and the roughness of fracture surfaces of materials.
A wide variety of imaging and analysis methods with resolutions down to the nanometer scale can be used to examine crack and corrosion film characteristics. For example, the early stages of passivity breakdown and localized corrosion can be observed with scanning tunneling microscopy (STM), a tool used to investigate the initiation of localized corrosion at the nanometer scale.
In electroplating, nanometers are used to measure plating thickness. For example, a layer of zinc atoms approximately 10 nanometers thick can be observed as a silvery layer on a copper cathode. Nanometers are also used to measure film thickness in chemical vapor deposition (CVD).