APPLICATION NOTE

Why Calibrate Your Cable Certifier?

Introduction

You are serious about your cable test instruments. You buy top brands, and you expect them to be accurate. You know some people send their testers to a lab for calibration, and you wonder why. After all, they are all electronic – there’s no meter movement to go out of calibration. What do those calibration folks do, anyhow – just change the battery?

These are valid concerns, especially since you can’t use your tester while it’s out for calibration. But let’s weigh them against some others. For example, what if an event rendered your tester less accurate, or maybe even unsafe? What if you are working with tight tolerances, where accurate measurement is key to the proper operation of expensive processes or safety systems? What if you are trending NEXT (near-end crosstalk) or loss measurements across your projects, and two testers used for the same measurement disagree significantly?

Why calibrate your cable certifier?

Field certification of structured datacomm cable installations is a high-stakes game. Payment for a job is usually contingent on successful certification of all the links, which often number in the thousands. A faulty certifier can wreak havoc in several ways. Suppose, for instance, that the certifier yields false passes of bad links. In that case, future users of the system could experience networking problems traceable to the cable plant. Those problems could result in legal action against the installer, who would also be responsible for rework and repair. On the other hand, suppose the certifier fails good links. Then the installer expends needless time and money on repair and rework.

The Fluke Networks design team focuses on creating robust certifiers whose design intrinsically guarantees accuracy and reliability. Our production team strives to bring to zero the chance of shipping a defective instrument. However, once the instrument is in service, various unavoidable factors come into play that can affect performance.

One such factor is simply the passage of time and the associated environmental stresses. The component parts of our measurement systems are highly stable devices such as resistors, capacitors, and integrated circuits. Even so, these components inevitably exhibit slight variations over time, due largely to the routine temperature and humidity swings that occur in operation as well as during storage and transport. An instrument could spend the night in a sub-freezing car trunk, followed by rapid warmup to a normal office environment for the day’s testing. Even in a controlled environment, the circuit assemblies warm up and cool down as the measurement engine cycles on and off during the work day.

A more insidious factor is a defect induced by an extreme event. Suppose an instrument is dropped onto concrete from a tall ladder. Very likely the instrument will survive, since we design for impact and perform rigorous qualification testing. Still, a component might be loosened or otherwise damaged, causing subtle accuracy degradation that results in false fails or passes. Or suppose an instrument becomes contaminated with a material that compromises the clean surface of a printed circuit board. This could cause current leakage, adversely affecting precision resistance measurements.

Clearly, we can envision factors both expected and unexpected that reduce one’s confidence in accuracy over time.

What is calibration?

All of these uncertainties can be greatly mitigated through routine instrument calibration, which has several benefits. The first step in calibration is essentially a measurement evaluation and correction process, during which the instrument is connected to a comprehensive series of reference calibration standards. The instrument measures each standard and stores internal correction data so that subsequent measurements of the standard are precisely centered. A very useful side benefit of this process is that a comprehensive self-test is performed at the same time: for each standard, the instrument’s internal calibration data is compared to a pass/fail template, and a failure indicates a faulty circuit. These templates were created through rigorous statistical analysis of a large population of instruments, and they serve as a highly sensitive test of instrument health.
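To make the mechanism concrete, here is a minimal sketch in Python of how such a correction-and-self-test pass might work. The standards, template limits, and numbers below are invented for illustration; they are not Fluke Networks’ actual procedure or data.

    # Sketch of the correction-and-self-test idea. All standards,
    # readings, and limits here are invented for illustration.

    # Reference standards: nominal value vs. what the instrument reads.
    reference_standards = {
        "resistance_100ohm": {"nominal": 100.0, "measured": 100.4},
        "capacitance_1nF":   {"nominal": 1.000, "measured": 0.997},
    }

    # Pass/fail template: how far a correction factor may stray
    # before it indicates a faulty circuit rather than normal drift.
    template_limits = {
        "resistance_100ohm": 0.02,   # +/- 2% correction allowed
        "capacitance_1nF":   0.02,
    }

    for name, std in reference_standards.items():
        # Correction factor that re-centers future measurements.
        correction = std["nominal"] / std["measured"]
        # Self-test: a correction outside the template flags a defect.
        drift = abs(correction - 1.0)
        verdict = "PASS" if drift <= template_limits[name] else "FAIL: faulty circuit"
        print(f"{name}: correction={correction:.4f}, drift={drift:.2%}, {verdict}")

In this toy version, a small correction factor is simply stored and applied; a correction far outside the statistically derived template is treated as evidence of a hardware fault rather than normal drift.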

The second step involves measuring a set of Verification Artifacts. From the instrument’s point of view, these artifacts behave much like a cable link; the Insertion Loss artifact, for instance, yields a measurement similar to that of a 100 meter cable link. The artifacts are used as transfer standards: each one has been measured with a laboratory system that employs highly accurate, NIST-traceable bench equipment. The resulting data is archived and compared with the instrument’s test result. The difference is the observed measurement accuracy, which is compared to a pass/fail limit calculated from the instrument’s uncertainty specifications.
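As an illustration only, the pass/fail logic of this verification step might look like the following Python sketch, with invented numbers standing in for the archived NIST-traceable data and the uncertainty specification.

    # Verification-step sketch; all values are invented for illustration.
    archived_reference_dB = 20.15   # lab-measured (NIST-traceable) artifact value
    instrument_reading_dB = 20.32   # what the instrument under test reports
    uncertainty_limit_dB = 0.30     # pass/fail limit from the accuracy spec

    # The observed measurement accuracy is the difference between the two.
    observed_error_dB = instrument_reading_dB - archived_reference_dB
    verdict = "PASS" if abs(observed_error_dB) <= uncertainty_limit_dB else "FAIL"
    print(f"Observed accuracy: {observed_error_dB:+.2f} dB -> {verdict}")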

To summarize, the process centers the instrument’s measurements, performs a highly sensitive self-test, and verifies accuracy against NIST-traceable transfer standards. The owner receives the instrument back in the highest state of operational readiness.

Calibration frequency

The question isn’t whether to calibrate – we can see that’s a given. The question is when to calibrate. We can’t predict with certainty the accuracy drift of any single instrument. However, with decades of production history and tens of thousands of instruments in service worldwide, we have a solid empirical base for our calibration-frequency recommendations. As a general rule, we recommend calibration at least yearly, as a reasonable balance between cost and downtime on the one hand and high operational confidence on the other.

Other circumstances may call for additional, off-schedule calibration. For instance, calibration should be considered before undertaking a very large job. Likewise, a successful calibration immediately after a job provides the very highest confidence of accurate results for that job. And if a potentially damaging event has occurred, such as a hard impact or an extreme temperature cycle, a calibration is in order.

While this article focuses on calibrating testers, the same reasoning applies to your modules.

Calibration isn’t a matter of “fine-tuning” your test instrument. Rather, it ensures you can safely and reliably use instruments to get the accurate test results you need. It’s a form of quality assurance. You know the value of testing cables, or you wouldn’t have test instrumentation to begin with. Just as cables need testing, so do your test instruments.

Get Annual Calibration for Free with Gold Support

Gold Support includes one calibration and factory refresh per year at no charge. Your cable analyzer or fiber modules will be precisely calibrated to factory specifications (a calibration certificate is provided; calibration traceable with data is available for an extra charge) using the full battery of proprietary Fluke Networks test procedures. The instrument is adjusted or repaired as necessary with genuine repair parts, software and firmware updates are applied, all accessories are tested and replaced if defective, and the unit is then cleaned and performance verified. Typical calibration turnaround time is 5 business days, but loaner units are available to Gold members during calibration (in most regions). Learn more about Gold Support at www.flukenetworks.com/goldsupport, or contact your local representative or Gold Sales at 888-283-5853.

Non-Gold members can still receive outstanding calibration and repair service from any of the Authorized Fluke Networks Service Centers. Simply find the Service Center closest to you, and they will help with your Fluke Networks repair and calibration needs; or contact us at 1-888-993-5853.