How to avoid packet duplication through a monitoring system

Let’s take a look at another common issue that dogs unintelligent monitoring systems and reduces the effectiveness of the analysis tools connected to them.

What is packet duplication, and what causes packets to be duplicated?

Packet duplication is where the same packet appears on the wire multiple times. Duplicates arise in a multitude of ways, but the result is the same: analysis tools end up spending more time than they should removing the duplicated packets before any actual analysis can take place, tool results can become skewed, and network issues can become deeply hidden.

With a router plugged in to a single network segment, if multiple SPAN ports are connected to a single aggregator, identical traffic will be present from each of the SPAN ports, and your monitoring system will have collected multiple copies of the same packet. Similarly, if an aggregator is monitoring traffic at several points along a series of point-to-point links, it will capture the same packet as it traverses each link, again producing multiple copies.
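The copies captured on different links are rarely byte-identical: layer-2 headers and the IP TTL change at every hop, while fields such as the IP identification, addresses, and payload normally do not. A deduplication key therefore has to be built from the invariant fields. The sketch below illustrates the idea; the exact field choice is an assumption for illustration, not any particular product's algorithm.

```python
import hashlib

def dedup_key(ip_id: int, src: str, dst: str, payload: bytes) -> str:
    """Build a key from fields that stay constant as a packet crosses links.

    L2 headers and the IP TTL change hop by hop, so they are excluded.
    (Illustrative field selection, not a specific vendor's scheme.)
    """
    h = hashlib.sha256()
    h.update(f"{ip_id}|{src}|{dst}|".encode())
    h.update(payload)
    return h.hexdigest()

# Two copies of one packet, captured on different links, hash identically.
a = dedup_key(4711, "10.0.0.1", "10.0.0.2", b"GET / HTTP/1.1")
b = dedup_key(4711, "10.0.0.1", "10.0.0.2", b"GET / HTTP/1.1")
assert a == b
```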

Why is this an issue?

If the monitoring system / traffic-visibility / network-monitoring infrastructure delivers multiple identical packets, the tool has to do more work. The tool becomes less effective and, in effect, costs more, because you have to buy more tools to analyse the same amount of data. On top of that, the analysis results can suffer: the duplicate copies of a packet arrive with different delays relative to one another, which can produce inconsistent, unrepeatable results and a general lack of precision. Finally, genuine network issues can stay hidden, because the tool cannot correctly analyse data in which the underlying problem is buried under duplicated packets.

How do you avoid this issue?

When you select a monitoring system, make sure it can remove duplicated packets as a base feature, before the traffic ever reaches your analysis tools.
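Conceptually, that deduplication feature boils down to remembering each packet key for a short time window and dropping any repeat seen inside it. Here is a minimal sketch of that windowed logic in Python; real packet brokers implement the same idea in hardware, and the 50 ms window is an assumed example value, not a standard.

```python
from collections import OrderedDict

class DedupFilter:
    """Drop repeats of a packet key seen within the last `window` seconds.

    A sketch of windowed deduplication; keys must arrive in time order,
    which lets us expire old entries from the front of the OrderedDict.
    """

    def __init__(self, window=0.05):
        self.window = window
        self.seen = OrderedDict()  # key -> timestamp of first sighting

    def accept(self, key, now):
        # Expire entries older than the window (oldest are at the front).
        while self.seen:
            k, t = next(iter(self.seen.items()))
            if now - t > self.window:
                self.seen.popitem(last=False)
            else:
                break
        if key in self.seen:
            return False           # duplicate within the window: drop it
        self.seen[key] = now
        return True                # first sighting: forward it

f = DedupFilter(window=0.05)
assert f.accept("pkt-1", now=0.00) is True   # first copy forwarded
assert f.accept("pkt-1", now=0.01) is False  # duplicate dropped
assert f.accept("pkt-1", now=0.10) is True   # outside the window: kept again
```

The window matters: too short and duplicates arriving with different delays slip through; too long and legitimate retransmissions (which are real traffic the tool should see) get dropped.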