Mohs Hardness Scale
Definition - What does Mohs Hardness Scale mean?
The Mohs hardness scale is an ordinal series of ten minerals, ranked from softest to hardest, used to gauge a material's resistance to scratching. A material's hardness is determined by testing which of the ten reference minerals, or other minerals of known equivalent hardness, can scratch it.
In corrosion prevention, the scale helps identify mineral and material pairings that are incompatible for certain applications because the harder material will scratch the softer one. A scratched surface gives corrosion-causing agents a path into the metal, increasing the likelihood of corrosion.
The Mohs hardness scale may also be known as the Mohs scale of mineral hardness.
Corrosionpedia explains Mohs Hardness Scale
The Mohs hardness scale was devised in 1812 by the German mineralogist Friedrich Mohs (1773-1839). It underpins the Mohs hardness test, which makes hardness a practical diagnostic property for most minerals.
The hardness scale is as follows:

1. Talc
2. Gypsum
3. Calcite
4. Fluorite
5. Apatite
6. Orthoclase (feldspar)
7. Quartz
8. Topaz
9. Corundum
10. Diamond

In this series of ten minerals, arranged from softest to hardest, each mineral can scratch any mineral with a lower number and can itself be scratched by any mineral with a higher number.
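The scratch-comparison logic can be sketched in code. The following is a minimal illustration, not a standard tool: the `MOHS_SCALE` mapping and the `can_scratch` helper are names chosen here for the example.

```python
# Mohs reference minerals, softest (1) to hardest (10).
MOHS_SCALE = {
    "talc": 1, "gypsum": 2, "calcite": 3, "fluorite": 4, "apatite": 5,
    "orthoclase": 6, "quartz": 7, "topaz": 8, "corundum": 9, "diamond": 10,
}

def can_scratch(scratcher: str, target: str) -> bool:
    """Return True if `scratcher` can scratch `target`.

    A mineral scratches another when its Mohs number is strictly higher.
    The scale is ordinal, so equal numbers mean neither reliably scratches
    the other; that case returns False.
    """
    return MOHS_SCALE[scratcher] > MOHS_SCALE[target]

print(can_scratch("quartz", "fluorite"))  # True: quartz (7) > fluorite (4)
print(can_scratch("talc", "diamond"))     # False: talc (1) < diamond (10)
```

Note that because the scale is ordinal, the numbers only express order, not proportional hardness: the absolute hardness gap between corundum (9) and diamond (10) is far larger than that between talc (1) and gypsum (2).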