
Magnetic Pull-Off Gauge (Type I)

Last updated: July 24, 2017

What Does Magnetic Pull-Off Gauge (Type I) Mean?

A magnetic pull-off gauge (type I) is a measurement device used to determine the thickness of a coating without damaging it. It does this by measuring the magnetic attraction between the iron-based (ferrous) substrate beneath the coating and a magnet attached to the gauge. Typically, a thicker coating results in a weaker measured magnetic force.


Corrosionpedia Explains Magnetic Pull-Off Gauge (Type I)

A magnetic pull-off gauge (type I) consists of a scale, a spring and a magnet. When the gauge is brought near a ferrous (magnetic alloy) object, the magnetic attraction compresses the spring, and the scale indicates the amount of compression. Because the attractive force grows as the magnet moves closer to the substrate, a thin coating leaves a smaller gap and produces a stronger force, while a thick coating produces a weaker force for the same alloy and coating material.
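The relationship above can be sketched as a toy model. This is an illustrative sketch only: the falloff law, standoff term, calibration constant and spring constant below are hypothetical values chosen to show the trend (thicker coating, weaker force, smaller scale reading), not data for any real instrument.

```python
# Toy model of a Type I magnetic pull-off gauge reading.
# All constants are hypothetical calibration values for illustration.

def magnetic_force(thickness_um: float, k_cal: float = 400.0) -> float:
    """Attraction weakens as the coating thickness increases the
    magnet-to-substrate gap. Inverse-square falloff is assumed here;
    units are arbitrary."""
    standoff = 1.0 + thickness_um / 100.0  # hypothetical gap term
    return k_cal / standoff ** 2

def spring_compression(force: float, spring_k: float = 50.0) -> float:
    """Hooke's law: the scale reads the compression x = F / k."""
    return force / spring_k

# A thicker coating yields a weaker force, hence a smaller scale reading.
thin_reading = spring_compression(magnetic_force(25.0))    # 25 µm coating
thick_reading = spring_compression(magnetic_force(250.0))  # 250 µm coating
assert thick_reading < thin_reading
```

In practice the gauge is read directly against a calibrated scale rather than computed; the sketch only captures the monotonic force-versus-thickness relationship the scale encodes.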

The major advantage of a magnetic pull-off gauge is that it does not damage the coating being measured. This eliminates the need for any rework after the measurement has been taken. The major disadvantage of a magnetic pull-off gauge is that it only works with magnetic materials. The spring will not compress if there is no magnetic force present.


