Verizon makes nice with P2P

From an ISP’s point of view, P2P traffic can appear exceptionally daunting. If ISPs choose to block it, as almost all of the major US ISPs have been accused of doing, their networks become ghost networks, with virtually no traffic in sight. But if they embrace it, their networks become frantic, fast-moving places, and operators have to sprint just to keep them alive.

So what’s it to be? Well, Verizon at least appears to be considering a middle road: instead of working against P2P, or simply putting up with its traffic costs, it will offer protocols that help it co-operate with P2P networks to deliver entertainment, by giving them a better understanding of the condition of the network their traffic travels over. That really IS open.

The initiative began last July and operates under the auspices of a Distributed Computing Industry Association (DCIA) working group called P4P, which stands for Proactive network Provider Participation for P2P. Its two founder members and chairs come from Pando Networks and Verizon Communications. Pando is one of the new breed of P2P companies trying to eke out a living in legal P2P file delivery.

This is really a club in which ISPs and P2P suppliers can work out their differences, and it is a far more positive approach than whining about network traffic and investing purely in “traffic shaping”.

Statements from this workgroup claim that software already being tested can improve download speeds by between 200 per cent and 600 per cent, purely by offering up a set of network APIs. These let a P2P application know which parts of a network are busy, so it can intelligently decide which P2P nodes should be uploading in support of a file or stream delivery. It’s not rocket science, and any CompSci grad student given the problem could have come up with the same answer, but how the question is phrased is what is interesting.
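The core idea can be sketched in a few lines. This is a purely illustrative model, not the actual P4P API: the names `NETWORK_COST` and `pick_peers` are assumptions, standing in for whatever network-condition data an ISP might expose and however a client might consume it.

```python
# Hypothetical sketch of the P4P idea: the ISP publishes per-segment
# "busyness" data, and a P2P client uses it to rank candidate uploaders.
# Costs run from 0.0 (idle) to 1.0 (saturated); all names are illustrative.

NETWORK_COST = {
    "segment-a": 0.9,  # congested backbone link
    "segment-b": 0.2,  # lightly loaded local loop
    "segment-c": 0.5,
}

# Candidate peers holding pieces of the file, tagged with the network
# segment each one sits on.
peers = [
    {"id": "peer1", "segment": "segment-a"},
    {"id": "peer2", "segment": "segment-b"},
    {"id": "peer3", "segment": "segment-c"},
    {"id": "peer4", "segment": "segment-b"},
]

def pick_peers(peers, cost_map, n=2):
    """Prefer uploaders sitting on the least-busy network segments."""
    return sorted(peers, key=lambda p: cost_map[p["segment"]])[:n]

best = pick_peers(peers, NETWORK_COST)
print([p["id"] for p in best])  # the two peers on quiet segment-b
```

Without the ISP’s cost map the client would pick uploaders blindly; with it, traffic naturally routes around the congested segments, which is where the claimed speed gains come from.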

If the question was “How do we get traffic zinging around the internet, for nothing, without the help of the ISP and despite its best efforts to stop us,” then that is definitely the wrong question. If it were simply “you have a network and multiple copies of large files distributed around that network; how do you build a rapid file delivery mechanism,” then naturally you reach the DCIA answer.

It is the history of ISPs and P2P suppliers being at each other’s throats for so long that makes it hard to see how this might ever have come about.

In fact what needed to happen was for the livelihood of ISPs to be threatened: the average customer expecting more and more from the ISP, while the average monthly price for ISP service went down and down, and traffic on their networks went up and up, forcing more and more investment. At that point, P2P traffic is taken as a fact of life, not something the ISP asks the US Supreme Court to make illegal.

ISPs cannot block all P2P activity, because much of it is breaking no laws: VeriSign’s Kontiki P2P client is now used to deliver millions of hours of TV services around the world for respectable broadcasters, and the same goes for Skype, Joost and Babelgum. Even Kazaa and BitTorrent may now be carrying more legal than illegal traffic, and if not yet, they should lean that way over time.

If we look beyond this simple set of proposals, we can see more and more that might be done. By bringing ISPs and P2P suppliers closer, the handshakes for this type of co-operative routing might also include some form of legitimate traffic audit. We could then reach a point where, if the P2P traffic from your software passes some kind of “threshold” test of mostly sending legitimate files (something deep packet inspection might still be needed for), the APIs for sensing the condition of the network are opened to your client software, and it is pushed higher up the food chain in terms of the priority attached to its traffic.

If mostly copyrighted material appears to be traveling across the network, then perhaps that API co-operation is refused by the network nodes and the resulting traffic packets are treated as low priority. That would create an underclass and an upperclass of P2P clients, each with a signature that would trigger the appropriate treatment by ISPs.
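The two-tier scheme described above amounts to a simple policy decision per client signature. The sketch below is our own illustration under stated assumptions, not any real ISP implementation: the threshold value and the `classify_client` function are hypothetical, standing in for whatever audit result the “threshold” test would produce.

```python
# Illustrative two-tier policy: clients whose audited traffic is mostly
# legitimate get the network-condition APIs and high priority; the rest
# are demoted. The 0.8 threshold is an assumed figure for the sketch.

LEGIT_THRESHOLD = 0.8  # assumed fraction of traffic that must be legitimate

def classify_client(legit_fraction):
    """Return (api_access, traffic_priority) for a P2P client signature,
    given the audited fraction of legitimate files it has sent."""
    if legit_fraction >= LEGIT_THRESHOLD:
        return (True, "high")   # the co-operating "upperclass"
    return (False, "low")       # the demoted "underclass"

print(classify_client(0.95))  # (True, 'high')
print(classify_client(0.40))  # (False, 'low')
```

The interesting design choice is that the carrot and stick are bundled: losing the audit doesn’t just cost a client its priority, it also cuts off the very APIs that make its transfers fast.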

Now that all sounds fine and dandy, except that much of this traffic is encrypted, one P2P client can be made to emulate any other, and a client can re-establish itself on different ports once it is identified and slowed. So there would be technical hurdles, but we believe there would remain one class of P2P players working with the ISPs, and another class that is not. What that creates politically is an accelerated acceptance of P2P activity by the average ISP customer.

Regardless, it is encouraging to know that Verizon at least is looking to a future when the FCC might make it illegal to indiscriminately block or slow P2P traffic, and is instead thinking about how to turn the internet into one huge TV set, sending Gigabyte-class video files to every home.

While little of Verizon’s revenue currently relies on video delivery compared to, say, Comcast, its sworn enemies, the major cable operators, may not feel able to embrace this approach, and that would accelerate a drift towards using the RBOCs as ISPs rather than the more expensive and more restrictive cable cos.

In the end we would expect the protocols and APIs that come out of this work to become a standard, one that EVERY ISP, regardless of whether its main business is cable or telco, will be forced to offer.

Either customers will begin to leave in droves as word gets out that P2P goes faster on other networks, or those networks that don’t wish to co-operate will face a day-by-day war trying to keep P2P at bay, still suffering the traffic consequences of badly saturated networks. Comcast is supposedly supporting this new P4P activity, despite the accusations about its “traffic shaping” activities.

There is a feeling that it is the ISPs that need the technical help to make this happen, rather than the P2P software suppliers begging them, because the P2P players seem to be winning the technology war here. Perhaps it is the ISPs that need the P2P players’ help, not the other way around.

Faultline is published by Rethink Research, a London-based publishing and consulting firm. This weekly newsletter is an assessment of the impact of the week's events in the world of digital media. Faultline is where media meets technology.