Hardness Test

Definition - What does Hardness Test mean?

A hardness test is a method employed to measure the hardness of a material. Hardness refers to a material’s resistance to permanent indentation.

There are numerous techniques for measuring hardness, and each can yield a different hardness value for the same material. Hence, the result of a hardness test is method-dependent, and each measured value must be labeled with the test and scale used.

Corrosionpedia explains Hardness Test

Hardness tests are extensively used to characterize a material and to determine whether it is appropriate for its intended purpose. All hardness tests use a specifically shaped indenter that is harder than the material under test. The indenter is pressed into the test surface with a controlled force, and the size or depth of the resulting indent is measured to determine the hardness value.
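
As a concrete illustration of how an indent measurement becomes a hardness number, the standard Brinell formula converts the applied load and the measured indent diameter into a Brinell hardness (HB) value. The sketch below uses the published formula HB = 2F / (πD(D − √(D² − d²))); the sample load and indent diameter are hypothetical readings, not values from this article:

```python
import math

def brinell_hardness(force_kgf, ball_dia_mm, indent_dia_mm):
    """Brinell hardness: HB = 2F / (pi * D * (D - sqrt(D^2 - d^2))).

    force_kgf:     applied load in kilograms-force
    ball_dia_mm:   diameter D of the spherical indenter, in mm
    indent_dia_mm: measured diameter d of the indent, in mm
    """
    D, d = ball_dia_mm, indent_dia_mm
    return (2 * force_kgf) / (math.pi * D * (D - math.sqrt(D**2 - d**2)))

# Hypothetical example: 3000 kgf load, 10 mm ball, 4.0 mm indent diameter
hb = brinell_hardness(3000, 10, 4.0)
# A harder material leaves a smaller indent, which gives a higher HB value.
```

Note the inverse relationship: shrinking the indent diameter in the call above raises the computed hardness, which is exactly the intuition behind "resistance to permanent indentation."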

Hardness tests are beneficial because:

  • The hardness test is easy to conduct.
  • Results can be obtained within 30 seconds.
  • Tests are relatively cost effective.
  • Finished components can be subjected to testing without being damaged.
  • Parts of almost any shape and surface size can be tested.

The major applications of hardness tests are to verify the type of heat treatment to be used on a part and to identify if a material possesses the required properties for its intended use. This makes hardness tests beneficial in industrial applications.

The five most common hardness scales are:

  • Knoop
  • Vickers
  • Rockwell
  • Brinell
  • Shore
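
Each of these scales relates load and indent geometry through its own formula, so the same material gets a different number on each scale. As one illustration, the standard Vickers formula computes hardness from the mean diagonal of the square pyramidal indent, HV = 1.8544 F / d²; the load and diagonal below are hypothetical sample values:

```python
def vickers_hardness(force_kgf, diag_mm):
    """Vickers hardness: HV = 1.8544 * F / d^2.

    force_kgf: applied load in kilograms-force
    diag_mm:   mean diagonal d of the square indent, in mm
    """
    return 1.8544 * force_kgf / diag_mm**2

# Hypothetical example: 30 kgf load, 0.5 mm mean diagonal
hv = vickers_hardness(30, 0.5)
```

Because the formulas and indenters differ, converting between scales (e.g. Vickers to Rockwell) requires empirical conversion tables rather than algebra, which is why reported values must always name their scale.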

This definition was written in the context of Corrosion