What Does Mohs Hardness Scale Mean?
The Mohs hardness scale is an ordinal series of ten reference minerals, arranged from softest to hardest, used to gauge a material's resistance to scratching. The material in question is tested against the ten scale minerals, or against other minerals of known, corresponding hardness, to determine which of them can scratch it.
In corrosion prevention, the scale is used to identify material pairings that are incompatible for a given application because the harder material will scratch the softer one. A scratched surface provides an avenue for corrosion-causing agents to permeate the metal, increasing the probability of corrosion.
The Mohs hardness scale may also be known as the Mohs scale of mineral hardness.
Corrosionpedia Explains Mohs Hardness Scale
The Mohs hardness scale was devised in 1812 by the German mineralogist Friedrich Mohs (1773-1839). It underpins the Mohs hardness test, which makes hardness a reliable diagnostic property for most minerals.
The hardness scale is as follows:
Mineral Hardness Scales

| Mineral | Mohs | Vickers (kg/mm²) |
|------------|------|-------------------|
| Talc | 1 | 27 |
| Gypsum | 2 | 61 |
| Calcite | 3 | 157 |
| Fluorite | 4 | 315 |
| Apatite | 5 | 535 |
| Orthoclase | 6 | 817 |
| Quartz | 7 | 1161 |
| Topaz | 8 | 1567 |
| Corundum | 9 | 2035 |
| Diamond | 10 | 10000 |
As shown in the table above, the ten minerals are arranged from softest to hardest, so that any mineral can be scratched by a mineral with a higher number. The Vickers values also show that the scale is ordinal rather than linear: the step from corundum (9) to diamond (10) represents a far larger jump in absolute hardness than the step from talc (1) to gypsum (2).
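To make the comparison concrete, here is a minimal Python sketch that encodes the table above as a dictionary and performs the scratch-test comparison. The `can_scratch` function and the data layout are illustrative assumptions rather than any standard API; only the mineral names and hardness values come from the table.

```python
# Minimal sketch of a Mohs scratch-test comparison (illustrative only).
# Mineral names and hardness values are taken from the table above;
# the dictionary layout and can_scratch() are assumptions for this example.

MOHS_SCALE = {
    # mineral: (Mohs hardness, Vickers hardness in kg/mm2)
    "talc":       (1, 27),
    "gypsum":     (2, 61),
    "calcite":    (3, 157),
    "fluorite":   (4, 315),
    "apatite":    (5, 535),
    "orthoclase": (6, 817),
    "quartz":     (7, 1161),
    "topaz":      (8, 1567),
    "corundum":   (9, 2035),
    "diamond":    (10, 10000),
}

def can_scratch(scratcher: str, target: str) -> bool:
    """Return True if scratcher is harder than target on the Mohs scale.

    A mineral scratches any mineral with a strictly lower Mohs number;
    this simple model treats equal hardness as "no scratch".
    """
    return MOHS_SCALE[scratcher][0] > MOHS_SCALE[target][0]

if __name__ == "__main__":
    print(can_scratch("quartz", "fluorite"))  # True: Mohs 7 > Mohs 4
    print(can_scratch("gypsum", "corundum"))  # False: Mohs 2 < Mohs 9
```

For example, quartz (Mohs 7) will scratch a fluorite surface (Mohs 4), but gypsum (Mohs 2) will not mark corundum (Mohs 9).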