A critical component of the IR test itself is the DC test voltage level applied during the process. The leakage current that can be measured through an insulation's dielectric material depends directly on the applied test voltage. IEEE, NETA, and ABS standards all confirm that, when performing an IR test, a higher test voltage provides a greater ability to detect any defects present in the insulation materials. Those defects, such as dirt or moisture, break down the insulation, causing the insulation resistance to drop to an unacceptable level and eventually making the equipment unsafe to operate.

Typically, a 500 VDC or 1000 VDC test voltage is used for low-voltage equipment, and either a 2500 VDC or 5000 VDC test voltage is used for medium- and high-voltage equipment. IEEE Std 43-2000 and NETA MTS-2011 both contain industry-standard guidelines for choosing the correct minimum test voltage when performing IR testing on equipment operating at various voltage levels. These minimum test voltages must always be adhered to in order to measure insulation resistance accurately in all electrical equipment; a test performed at a lower voltage is, at best, inaccurate and misleading.
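The voltage-selection rule described above can be sketched as a small helper function. This is a minimal illustration only: the class boundaries used here (250 V, 1000 V, and 5000 V) are assumptions chosen to reflect the typical values mentioned in the text, not the actual tabulated minimums, which must be taken from IEEE Std 43-2000 or NETA MTS-2011.

```python
def min_ir_test_voltage(rated_voltage: float) -> int:
    """Return a typical DC test voltage (in volts) for IR testing.

    The thresholds below are illustrative assumptions; consult the
    IEEE Std 43-2000 / NETA MTS-2011 tables for required minimums.
    """
    if rated_voltage <= 250:
        # Small low-voltage equipment (assumed cutoff)
        return 500
    elif rated_voltage <= 1000:
        # Low-voltage equipment
        return 1000
    elif rated_voltage <= 5000:
        # Medium-voltage equipment
        return 2500
    else:
        # High-voltage equipment
        return 5000


print(min_ir_test_voltage(480))    # → 1000 (e.g. a 480 V motor)
print(min_ir_test_voltage(4160))   # → 2500 (e.g. a 4160 V switchgear bus)
```

The key point the function encodes is that the test voltage scales up with the equipment's operating class, so that defects remain detectable; it should never be rounded down below the standard's minimum for the equipment under test.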