Definition of hardness scale
i. The scale by which the hardness of a mineral is determined by comparison with a standard. The most frequently used and quoted hardness scale was devised by Friedrich Mohs about 1820. Another commonly used hardness scale is the Vickers scale, which is based on the depth and width of the indentation made by a loaded indenter (usually a diamond in the form of a square-based pyramid). The Brinell scale also characterizes indentation hardness, but uses a small steel ball as the indenter.
The Mohs scale is a relative scale of hardness based on ten standard minerals; a mineral with a higher number can scratch any mineral with a lower number, so calcite (3) can scratch gypsum (2) but not fluorite (4).
A human fingernail has a hardness of about 2.5, so a fingernail scratch readily distinguishes gypsum from calcite. Caution: steel is often cited as having a hardness of 5 to 5.5, but many high-quality steels have a Mohs hardness of 6 or more.
Mohs hardness | Mineral
1 | Talc
2 | Gypsum
3 | Calcite
4 | Fluorite
5 | Apatite
6 | Orthoclase (feldspar)
7 | Quartz
8 | Topaz
9 | Corundum
10 | Diamond
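The scratch-test logic described above is purely ordinal: a material scratches another only if its Mohs number is higher. The short Python sketch below illustrates this; the MOHS mapping and the can_scratch helper are illustrative names introduced here, not part of any standard library.

```python
# Minimal sketch of the Mohs scratch-test comparison.
# The MOHS dict and can_scratch() are illustrative, not a standard API.

MOHS = {
    "talc": 1, "gypsum": 2, "calcite": 3, "fluorite": 4, "apatite": 5,
    "orthoclase": 6, "quartz": 7, "topaz": 8, "corundum": 9, "diamond": 10,
    "fingernail": 2.5,  # approximate hardness of a human fingernail
}

def can_scratch(scratcher: str, target: str) -> bool:
    """Return True if `scratcher` is harder than `target` on the Mohs scale."""
    return MOHS[scratcher] > MOHS[target]

if __name__ == "__main__":
    print(can_scratch("calcite", "gypsum"))      # True:  3 > 2
    print(can_scratch("calcite", "fluorite"))    # False: 3 < 4
    print(can_scratch("fingernail", "gypsum"))   # True:  2.5 > 2
    print(can_scratch("fingernail", "calcite"))  # False: 2.5 < 3
```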
See also: hard, hardness, hardness test