==Calibration: testing and adjusting==

===Zeroing===
On most micrometers, a small [[wrench#spanner wrench|pin spanner]] is used to turn the sleeve relative to the barrel, so that its zero line is repositioned relative to the markings on the thimble. There is usually a small hole in the sleeve to accept the spanner's pin. This calibration procedure cancels a zero error: the problem that the micrometer reads nonzero when its jaws are closed.

===Testing===
A standard one-inch micrometer has readout divisions of 0.001 inch and a rated accuracy of ±0.0001 inch<ref>{{cite web |url=http://www.starrett.com/download/222_p1_5.pdf |title=Archived copy |access-date=2010-01-19 |url-status=dead |archive-url=https://web.archive.org/web/20110716132738/http://www.starrett.com/download/222_p1_5.pdf |archive-date=2011-07-16 }} GENERAL MICROMETER INFORMATION</ref> ("[[thou (length)#In machining|one tenth]]", in machinist parlance). Both the measuring instrument and the object being measured should be at room temperature for an accurate measurement; dirt, lack of operator skill, and misuse (or abuse) of the instrument are the main sources of error.<ref>{{cite web |url=http://www.mahr.de/index.php?NodeID=13120 |title=Micrometer Accuracy - Mahr Metrology |access-date=2009-06-12 |url-status=dead |archive-url=https://web.archive.org/web/20110719060139/http://www.mahr.de/index.php?NodeID=13120 |archive-date=2011-07-19 }} MICROMETER ACCURACY: Drunken Threads and Slip-sticks</ref>

The accuracy of micrometers is checked by using them to measure [[gauge block]]s,<ref>BS EN ISO 3650: "Geometrical product specifications (GPS). Length standards. Gauge blocks" (1999)</ref> rods, or similar standards whose lengths are precisely and accurately known. If the gauge block is known to be 0.75000 ± 0.00005 inch ("seven-fifty plus or minus fifty millionths", that is, "seven hundred fifty thou plus or minus half a tenth"), then the micrometer should measure it as 0.7500 inch. If the micrometer measures 0.7503 inch, then it is out of calibration: the 0.0003 inch error is three times the rated accuracy of ±0.0001 inch.

Cleanliness and [[#Low but consistent torque|low (but consistent) torque]] are especially important when calibrating—each tenth (that is, ten-thousandth of an inch), or hundredth of a millimetre, "counts". A mere speck of dirt, or slightly too much squeeze, obscures whether the instrument can read correctly. The solution is simply [[wikt:conscientious#Adjective|conscientiousness]]—cleaning, patience, due care and attention, and repeated measurements (good repeatability assures the calibrator that the technique is working correctly).

Calibration typically checks the error at 3 to 5 points along the range, of which only one can be adjusted to zero. If the micrometer is in good condition, the errors are all ''so near to zero'' that the instrument seems to read essentially "dead-on" all along its range; no noticeable error is seen at any locale. In contrast, on a worn-out micrometer (or one that was poorly made to begin with), one can "chase the error up and down the range", that is, ''move'' it up or down to any of various locales along the range by adjusting the sleeve, but one cannot ''eliminate'' it from all locales at once.
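The arithmetic of such a multi-point check can be sketched as follows. This is a minimal illustration, not a procedure from the cited sources: the block sizes, the readings, and the ±0.0001 inch tolerance are assumed values for the example.

<syntaxhighlight lang="python">
# Sketch: checking a micrometer against gauge blocks at several points
# along its range. All readings here are hypothetical illustrations.

RATED_ACCURACY = 0.0001  # inch ("one tenth" in machinist parlance)

# (gauge block nominal, micrometer reading), in inches
checks = [
    (0.25000, 0.25002),
    (0.50000, 0.50004),
    (0.75000, 0.75030),  # the out-of-calibration example from the text
    (1.00000, 1.00006),
]

for nominal, reading in checks:
    error = reading - nominal
    verdict = "OK" if abs(error) <= RATED_ACCURACY else "OUT OF CALIBRATION"
    print(f"{nominal:.5f} in: error {error:+.5f} in -> {verdict}")

# Adjusting the sleeve subtracts one constant offset from every reading.
# Zeroing the worst point "chases the error": if the errors differ from
# point to point (a worn or poorly made screw), no single offset can
# bring every point within tolerance at once.
offset = 0.75030 - 0.75000  # sleeve adjustment that zeroes the 0.75 in point
for nominal, reading in checks:
    error = (reading - offset) - nominal
    verdict = "OK" if abs(error) <= RATED_ACCURACY else "OUT OF CALIBRATION"
    print(f"after adjustment, {nominal:.5f} in: error {error:+.5f} in -> {verdict}")
</syntaxhighlight>

The second loop shows why sleeve adjustment alone cannot fix a worn screw: it shifts every reading by the same constant, so zeroing one point simply moves the error to other points along the range.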
Calibration can also include checking the condition of the tips (flat and parallel), the ratchet, and the linearity of the scale.<ref>{{cite web|title=Archived copy|url=http://ittc.sname.org/2006_recomm_proc/7.6-02-04.res.pdf|url-status=dead|archive-url=https://web.archive.org/web/20111005092058/http://ittc.sname.org/2006_recomm_proc/7.6-02-04.res.pdf|archive-date=2011-10-05|access-date=2011-08-04}} ITTC – Recommended Procedures : Sample Work Instructions Calibration of Micrometers.</ref> Flatness and parallelism are typically measured with a gauge called an optical flat, a disc of glass or plastic ground with extreme accuracy to have flat, parallel faces. When the micrometer's anvil and spindle are placed against it, interference bands of light can be counted, revealing the amount of geometric inaccuracy; each band corresponds to about half a wavelength of the light used (roughly 0.3 μm, or about 11.6 millionths of an inch, with a monochromatic helium source).

Commercial machine shops, especially those that do certain categories of work (military or commercial aerospace, nuclear power, medical, and others), are required by various [[standards organization]]s (such as [[International Organization for Standardization|ISO]], [[American National Standards Institute|ANSI]], [[American Society of Mechanical Engineers|ASME]],<ref name="B89.1.13 - Micrometers">[https://www.asme.org/products/codes-standards/b89113-2013-micrometers ASME B89.1.13 - 2013 Micrometers].</ref> [[ASTM International|ASTM]], [[SAE International|SAE]], [[Aerospace Industries Association|AIA]], [[United States Military Standard|the U.S. military]], and others) to calibrate micrometers and other gauges on a schedule (often annually), to affix a label to each gauge giving it an ID number and a calibration expiration date, to keep a record of all the gauges by ID number, and to specify in inspection reports which gauge was used for a particular measurement.

Not all calibration is an affair for metrology labs. A micrometer can be calibrated on-site anytime, at least in the most basic and important way (if not comprehensively), by measuring a high-grade gauge block and adjusting to match. Even gauges that are calibrated annually and are within their expiration timeframe should be checked this way every month or two if they are used daily; they will usually check out as needing no adjustment.

The accuracy of the gauge blocks themselves is traceable through a chain of comparisons back to a master standard such as the [[International prototype of the metre|international prototype of the meter]]. This bar of metal, like the [[:File:CGKilogram.jpg|international prototype of the kilogram]], is maintained under controlled conditions at the [[International Bureau of Weights and Measures]] headquarters in France, one of the principal [[measurement standards laboratory|measurement standards laboratories]] of the world. These master standards have extreme-accuracy regional copies (kept in the national laboratories of various countries, such as [[NIST]]), and metrological equipment makes the chain of comparisons. Because the meter is now defined by reference to the speed of light, the international prototype of the meter is not quite as indispensable as it once was, but such master gauges are still important for calibrating and certifying metrological equipment. Equipment described as "NIST traceable" means that its comparison against master gauges, and their comparison against others, can be traced back through a chain of documentation to equipment in the NIST labs. Maintaining this degree of traceability requires some expense, which is why NIST-traceable equipment is more expensive than equipment that is not; but applications needing the highest degree of quality control mandate the cost.
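The band-counting arithmetic of the optical-flat test described above can be sketched in the same way. This is an illustration under stated assumptions: the helium wavelength is one common choice for flat work, the band counts are hypothetical, and the helper function is invented for the example.

<syntaxhighlight lang="python">
# Sketch: converting counted interference bands to a flatness estimate.
# Each full band corresponds to about half a wavelength of the light used.
# The light source and the band counts below are illustrative assumptions.

WAVELENGTH_NM = 587.6  # monochromatic helium source, common in flat work

def flatness_from_bands(bands):
    """Approximate height deviation, in micrometres, implied by a band count."""
    return bands * (WAVELENGTH_NM / 2) / 1000.0  # half wavelength per band; nm -> um

for bands in (1, 2, 5):  # hypothetical counts seen against an anvil face
    um = flatness_from_bands(bands)
    millionths = um / 0.0254  # micrometres -> millionths of an inch
    print(f"{bands} band(s) ~ {um:.2f} um ~ {millionths:.1f} millionths of an inch")
</syntaxhighlight>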
===Adjustment===
A micrometer that has been zeroed and tested and found to be off might be restored to accuracy by further adjustment. If the error originates from the parts of the micrometer being worn out of shape and size, then restoration of accuracy by this means is not possible; rather, repair (grinding, lapping, or replacing of parts) is required. For standard kinds of instruments, in practice it is often easier, faster, and no more expensive to buy a new one than to pursue refurbishment.