LeaderBoard Part 1: Why Vendors Bomb
By Joe Skorupa
Every year I get asked by low-scoring software vendors why they bombed in the LeaderBoard rankings. The answers are sometimes straightforward, but they are not received as such by the vendors asking the question. Here’s a sneak peek into the 2015 RIS Software LeaderBoard ranking data, which will be announced in two weeks, and the reasons why many vendors rank so low.
I have been managing and writing the RIS Software LeaderBoard for 13 of its 15 years and I have the gray hairs to prove it. I get an equal measure of happy phone calls and deeply distressed messages when the results are announced, depending on the vendor’s placement in the rankings.
The reasons for the deeply distressed messages can be boiled down to two simple facts: 1. The vendor did not get enough evaluations by retailers to qualify for ranking, and 2. The Customer Satisfaction score, an average of retailer evaluations in 10 areas, is not high enough to crack into the top-10 lists.
But that is just the beginning of the difficult conversation.
Retailers and E-Mail Expectations
The answer to #1 – not enough votes – sounds simple enough, but many vendors still ask why. They often say things like, “I spoke to my retailers and they said they were going to vote or they said they voted.” Or they say something like, “I spoke to my retailers and they said they didn’t receive the invitation to vote.”
Having tracked this for years, I can report with confidence that if a vendor has 15 fully deployed retail clients, the best conversion rate it can hope for is one third, or five votes. That means that for every 15 fully committed retail clients who know in advance that a LeaderBoard invitation is coming their way on a specific date and who have agreed to fill it out, only five will do so. And that is a best-case scenario. Often the conversion rate is lower, in the 20% range.
One other factor comes into play – corporate-level e-mail blocking. Many retail executives never receive messages sent as part of a mass e-mail, which is how the LeaderBoard invitations are delivered. Blocking of this type does not involve individual spam filters or folders; it takes place before the message ever reaches the corporate e-mail server and software. This means there is no location for an executive to check for the blocked message. Basically, the message never entered the executive’s personal or corporate e-mail system.
Horrible Customer Satisfaction Scores and Reviews
The answer to #2 – low Customer Satisfaction score – is also straightforward, but it is often not easy for a vendor to accept, which is understandable. After all, a vendor’s entire professional career, salary, incentives, brand image and business plan are dependent on the satisfaction of customers.
In the 2015 LeaderBoard results, the average Customer Satisfaction score is 38.2 (out of a maximum of 50). There are 27 vendors who scored higher than this figure and 23 lower. Those who scored lower rarely make it into any of the LeaderBoard’s 51 charts (50 top-10 charts and 1 top-20).
Naturally, most of the distressed messages come from those on the low end of the Customer Satisfaction list, usually very low. The lowest Customer Satisfaction score recorded in the 2015 LeaderBoard is 25.9. A total of four vendors scored below 30, which is well below the average score of 38.2. Two of these bottom dwellers are among the best-known names in retail technology.
Often a vendor will try to take issue with the LeaderBoard scoring methodology, which has been made virtually bullet-proof over the years. But one thing a vendor can’t argue with is the words of retailers themselves. Here are a few comments about bottom-dwelling vendors on the Customer Satisfaction list (with names removed to protect the guilty):
· What should be intuitive upgrades are not provided. We have to push hard to get results and manage the process and then they charge us for it! Also maintenance seems a mystery to them. When we want to clean up files they falter on the process. It always becomes a big deal to research the procedure as the original instructions do not work.
· They have poor customer service and long wait times for support. They were not able to provide us with what they had promised initially.
· Very disappointed with product and company. They have not spent the money and time required to integrate its various purchased and outsourced vendors and products. A Frankenstein of a product. Company moves on to the next release before stabilizing the current one. Lack of strong implementation and integration managers.
· Too many promises about out-of-the-box capabilities.
· The implementation was very difficult. The vendor’s implementation team was not familiar enough with the product and we had many issues with the software. The software continues to have issues requiring vendor support involvement, as well as the need for internal IT.
· This system is a joke. They experienced a catastrophic systems failure in August impacting thousands of retailers and millions of transactions. They have yet to fix the underlying issues and their systems have not worked correctly since. We have spent thousands of hours trying to make customers whole.
These are just a few of the hundreds of insightful comments/reviews collected during the LeaderBoard process. If anyone – vendor or retailer – wishes to book a customer LeaderBoard data review that includes deep insight that goes beyond the published report, feel free to contact me.
In Part 2 of the LeaderBoard series I will reveal more data about the top-10 lists, present a top-10 list not published in the final report, and offer a sneak peek into some of the top-performing vendors in retail technology.