How accurate must test equipment be in order to satisfy the regulations?
The reason for asking is that I now have access to three pieces of test gear for insulation and continuity testing. If I use a 1% tolerance 10 MΩ resistor as a test item, I get three different results from them. One is a Megger, one a Seaward, and the third is a Fluke. The Fluke is the most accurate, with the other two reading about 8% low. If I then use two multimeters and Ohm's law, the result comes out almost exactly 10 MΩ. Similar differences appear on the "low ohms" ranges with a 1 Ω standard resistor!
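For anyone who wants to repeat the comparison, here is a rough sketch of the arithmetic: the Ohm's-law check (one meter reading volts across the resistor, the other reading current through it) and the percent error of each tester against the reference value. All the voltage, current, and tester readings below are made-up illustrative numbers, not figures from my bench:

```python
# Illustrative check of an unknown resistance via Ohm's law (R = V / I),
# then percent error of each tester against a known 1% reference resistor.
# Every numeric value here is a hypothetical example.

NOMINAL_OHMS = 10e6  # 1% tolerance 10 MOhm reference resistor

# Ohm's-law method: one multimeter measures voltage across the resistor,
# a second measures the current flowing through it.
measured_volts = 9.98      # hypothetical voltmeter reading (V)
measured_amps = 0.998e-6   # hypothetical ammeter reading (A)
r_ohms_law = measured_volts / measured_amps
print(f"Ohm's law result: {r_ohms_law / 1e6:.2f} MOhm")

# Percent error of each insulation tester relative to the reference:
readings = {"Fluke": 9.95e6, "Megger": 9.2e6, "Seaward": 9.2e6}  # hypothetical
for name, reading in readings.items():
    error_pct = (reading - NOMINAL_OHMS) / NOMINAL_OHMS * 100
    print(f"{name}: {reading / 1e6:.2f} MOhm ({error_pct:+.1f}%)")
```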
When "calibrating" items of test gear what adjustments are made as looking inside say the megger there appear to be no adjustmets at all?