Subjective standards, limited scope and technology limitations have made public comparisons of VoIP services less than useful.
As consumer-level Voice over IP (VoIP) becomes more firmly entrenched in the mainstream of telecommunications, VoIP services have become fertile ground for the type of comparison reviews — complete with "Editor's choices" and star ratings — more typical of computer hardware devices such as monitors, printers and routers.
Service comparisons have come from the usual suspects — Cnet, PC Magazine (in August 2004 and again in January 2005), PC World and the highly respected Tom's Networking — and traditional media such as the New York Times (subscription required) and BusinessWeek. There have been so many VoIP reviews that Consumer Search has even come up with a review of the VoIP reviewers, assigning a “credibility rating” to each of the comparisons (PC World was deemed most “credible”).
Comparing VoIP services, however, is not easy — some would say impossible. The results are too tainted by subjective standards and by factors that VoIP service providers have no control over — for example, differing levels of network latency caused by Internet service providers.
Predictably, there is no consensus — not even a clear leader — among the reviewers as to which service provider is best. VoicePulse, AT&T CallVantage, Vonage, Packet8, Skype, take your pick: Each has won a coveted top choice from at least one reviewer.
To make matters worse, these evaluations are too limited in scope to be considered authoritative. Usually only four or five of the several hundred service providers operating in the US are covered, and even established providers such as iConnectHere and BroadVoice are often ignored.
Even grading service providers based on calling plans and on the features they offer is a bit of a crapshoot. VoIP services are offered with dozens of features — such as online call and voice mail management “dashboards,” “point-and-click” dialing, geographically dispersed virtual numbers, and many others — that are not available with traditional landline telephone service. Still, a feature is meaningless if not used by a customer, and some VoIP calling niceties are too new or too esoteric to get much use. Grading providers on the calling plans they offer is equally difficult simply because there is no calling plan that fits every consumer's needs.
Some question whether VoIP services can even be compared in the kind of side-by-side tests used for servers or routers. While it's easy to check off laundry lists of features and pricing, judgments of service quality are shaped by a variety of factors that service providers have no control over — one possible explanation for the scattershot ratings.
First, results are affected by the underlying network, says Laura Holly, Brix Networks market development manager and author of “Measuring Voice Quality in VoIP Networks: Going Beyond 'Can You Hear Me Now?'”
“It's hard to compare two services without knowing what kind of access network you're riding over,” says Holly. “For example, you and I both subscribe to Vonage but you are on a satellite connection and I'm on a 3Mb cable connection. We're unlikely to see the same quality of performance.”
Brix Networks, which supplies commercial network test products to service providers, also provides a free portal that evaluates the performance of a VoIP call going through a residential broadband connection.
“We routinely get email from people who have used our service and say 'I'm having trouble with service provider X,'” Holly reports. “We have to tell them that it's not the service provider, it's the broadband connection.”
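Holly's point about the access network can be made concrete with the E-model (ITU-T G.107), the standard framework that commercial VoIP test tools approximate. The Python sketch below is illustrative only: the constants are textbook defaults for a G.711-style codec, not a calibrated implementation, but it shows how one-way delay and packet loss alone move a call's estimated MOS score.

```python
# Simplified E-model (ITU-T G.107) sketch: how access-network delay and
# packet loss alone change a call's estimated MOS score.
# Constants are illustrative textbook defaults, not a calibrated tool.

def r_factor(one_way_delay_ms: float, packet_loss_pct: float) -> float:
    """Estimate the E-model R-factor from delay and loss (G.711-like codec)."""
    r0 = 93.2                          # default base signal-to-noise term
    # Delay impairment Id: delay beyond ~177.3 ms is penalized more steeply.
    d = one_way_delay_ms
    i_d = 0.024 * d
    if d > 177.3:
        i_d += 0.11 * (d - 177.3)
    # Effective equipment impairment for loss; Bpl is the codec's
    # packet-loss robustness (25.1 is an illustrative value for G.711
    # with loss concealment).
    ie, bpl = 0.0, 25.1
    ie_eff = ie + (95 - ie) * packet_loss_pct / (packet_loss_pct + bpl)
    return r0 - i_d - ie_eff

def mos(r: float) -> float:
    """Map an R-factor to a 1.0-4.5 MOS estimate (G.107 Annex B formula)."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1.0 + 0.035 * r + r * (r - 100) * (r - 60) * 7e-6

# Same provider, two access networks: a low-latency cable connection
# versus a satellite-class link with heavier delay and loss.
print(round(mos(r_factor(40, 0.5)), 2))    # cable: near toll quality
print(round(mos(r_factor(600, 2.0)), 2))   # satellite: well below it
```

Nothing about the service provider changes between the two calls above; the score gap comes entirely from the subscriber's broadband connection, which is Holly's point.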
Second, because calls take different routes to the same point, changing network conditions affect test results.
“Very few network providers use the same topology,” says BroadVoice CEO David Epstein. “A tester from location A will get a different result from testers at locations B and C.”
And third, unless these reviews are conducted over an extended period of time, it's impossible to measure performance fairly.
“Vonage might sound great on Monday at noon,” says Epstein. “But that doesn't mean it will sound great on Tuesday at 1:00 pm from exactly the same place.”
Validating Epstein's concern are the conflicting results PC Magazine obtained when it compared VoIP providers in two issues five months apart. In August 2004, the magazine compared AT&T CallVantage, VoicePulse and Vonage, with VoicePulse coming out on top. In January 2005, Broadvox, Lingo and Verizon VoiceWing were added to the field. This time, AT&T won the magazine's “Editor's Choice.”
Still, some think these reviews have value, despite their limitations.
“Services can for the most part be compared,” says VoicePulse CEO Ravi Sakaria. “What we found after two years of experience in this area is that the part that is beyond our control very seldom has a negative impact on the customer's experience.”
“When we first started, I thought we would have a high rate of cancellations — 25 percent — because the Internet connection couldn't handle the calls,” Sakaria continues. “In fact, fewer than 1 percent of our cancellations are because of the connection.”
Sakaria says that there are tools out there that can accurately evaluate customer experience and cites Minacom's VoIP monitoring tool, used in a recent review by PC Magazine. “That kind of test is very accurate as far as what the customer's perception is,” according to Sakaria.
But can these reviews have merit if they evaluate only a tiny fraction of the rapidly growing number of VoIP providers in the world? Sakaria says yes.
“Even if we're not included in the review,” says Sakaria, “we can still learn from the results and use that information to improve our service.”
Following is a list of recent published comparisons of VoIP services.
New York Times (subscription required), April 8, 2004.
PC World, May 2004.
PC Magazine, August 2004.
Business Week, November 2004.
Cnet, November 2004.
PC Magazine, January 2005.
Tom's Networking, April 2005.