Testing advanced cabling plants today is so complex that a single Category 5e test report can include more than 400 data fields. With typical jobs comprising hundreds or even thousands of links, even if network engineers examine all the results and determine that they all pass, critical questions remain: How can you tell whether an incorrect test specification or the wrong adapter was used? Were all testers running the right software? Did you get the margins you expected, and were they consistent?
Anyone who’s ever tested a multimode fiber optic link with light sources from different equipment vendors knows that the loss measurement can vary by as much as 50 percent. Without proper controls, multimode light sources inject light into multimode fibers differently. Even light sources from the same manufacturer, operating under different launch conditions, will produce diverse link-loss measurements, leading to different—and often confusing—test results.
With the introduction of low-loss fiber optic components such as LC/MPO cassettes, loss budgets (test limits) are becoming increasingly tight. As a result, installers are finding that previous methods and assumptions about fiber testing no longer hold true.
While the Telecommunications Industry Association (TIA) still allows 0.75 dB of loss per mated connector pair, factory-polished connectors typically achieve closer to 0.2 dB. So when testing to TIA limits, installers are afforded quite a bit of headroom for measurement uncertainty. In other words, their testing practices need to be reasonable, but not perfect.
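To see how component loss allowances translate into a test limit, the standard link loss budget is simply fiber attenuation times length, plus a per-connector-pair allowance, plus a per-splice allowance. The sketch below uses the TIA default values cited above (3.5 dB/km multimode attenuation at 850 nm, 0.75 dB per mated pair, 0.3 dB per splice); the function name and structure are illustrative, not from any particular tester's firmware.

```python
def loss_budget_db(length_km, connector_pairs, splices,
                   fiber_atten_db_per_km=3.5,  # TIA multimode max at 850 nm
                   connector_loss_db=0.75,     # TIA allowance per mated pair
                   splice_loss_db=0.3):        # TIA allowance per splice
    """Maximum allowed link loss (the test limit) in dB."""
    return (length_km * fiber_atten_db_per_km
            + connector_pairs * connector_loss_db
            + splices * splice_loss_db)

# A 100 m link with two mated connector pairs and no splices:
tia_limit = loss_budget_db(0.1, 2, 0)  # 0.35 + 1.5 = 1.85 dB

# The same link budgeted with 0.2 dB low-loss factory-polished connectors:
low_loss_limit = loss_budget_db(0.1, 2, 0, connector_loss_db=0.2)
# 0.35 + 0.4 = 0.75 dB
```

The gap between the two results is the point: a link engineered with low-loss components leaves far less room for sloppy reference-setting or launch-condition errors than the generic TIA limit suggests.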
As companies address rising IP traffic demand to support a growing array of business applications, many are designing their data centers with fiber optic networks that support 10G or 40G Ethernet connections between servers, switches, and storage area networks, and 100G Ethernet for core switching and routing backbone connections.