Most of the time, today’s buyer journey begins online. That’s been true of B2C transactions for some time, and it’s increasingly true of B2B. Sure, if you’re acquiring a heavyweight tech solution, you’re going to want to sit down with sales and go over every detail of performance. But when you’re just looking…
TrustRadius, with over 20 thousand software reviews and ratings from around 50 thousand community members, is a leading source of peer insights into a wide range of solutions–including, of course, digital marketing tools. But how independent are the reviews?
Researching possible bias among vendor-driven reviews, TrustRadius came up with some disturbing statistics:
- No fewer than 74 percent of such reviews delivered the highest ratings for the product (9 or 10 out of 10).
- Only 5 percent were “detractors,” scoring the product 6 or below.
This contrasts with the figures for independent reviews:
- 40 percent garner highest ratings.
- 23 percent are “detractors.”
Even granting that users may be more likely to review a solution they’re pleased with, there’s clearly something going on here–although it’s worth pointing out that vendor-driven reviews are clearly marked as “Invited by vendor.”
The problem is hardly confined to crowd-sourced review sites like TrustRadius. Any technology journalist knows how difficult it can be to find a customer interested in discussing a solution independently of the vendor. It’s very much to the credit of TrustRadius that it’s chosen to highlight this issue; and even more so that it’s chosen to address it.
Last week, it announced the introduction of what it’s calling a “trScore.” This is “a weighted average for aggregate ratings designed to ensure that reviews, ratings and product comparisons on TrustRadius are representative of legitimate and accurate customer sentiment.” The intention is to remove vendor-driven bias from a product’s overall rating, while retaining vendor-driven customer reviews, which can be detailed and informative.
I spoke with Vinay Bhagat, founder and CEO of TrustRadius, about the motives driving the new strategy. “The truth is,” he said, “vendor-driven reviews are necessary to get coverage of some products.” Even for popular products, vendors want to see current representation on the site–evergreen content. What’s more, vendor-driven reviews can be valuable because the questions posed and answered are pointed and relevant. Indeed, TrustRadius is monetized in part by its vendor subscription model–although the “real power” of TrustRadius reviews for vendors lies less in their availability on the TrustRadius website than in the ways they can be incorporated into the vendor’s own sales processes, for example by being syndicated to a vendor’s own site via widget.
At the same time, said Bhagat, the statistics couldn’t be denied. Selection bias clearly existed. TrustRadius, he said, is committed to the idea that “authenticity and transparency get rewarded,” and a vendor with a good product should have nothing to fear from the greater weight now given to independent reviews. But how is it possible to be sure that independent reviews are truly independent?
TrustRadius has four main sources for reviews:
- Vendor contributions (currently around 30 percent of reviews on the site);
- Direct outreach (incentivized reviews solicited via social media–typically LinkedIn–with contributors screened for vendor affiliations);
- Contributions from the community (over 50 thousand registered members), authenticated via LinkedIn profiles. This is now the biggest source of review content; and
- The reviews-as-a-service subscription model for vendors.
Facing the same challenges as Yelp–fake competitor reviews, as well as reviews misrepresented as independent–TrustRadius bears the cost of human screening for every review posted. As for the trScore, it’s now displayed at the top of each review section, replacing the previous simple average rating. It’s based not only on the extra weighting for independent reviews, but also on other factors, such as review date, and whether a rating reflects an in-depth review of the product.
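To make the idea concrete, here is a minimal sketch of a weighted-average rating along the lines TrustRadius describes. The factors (down-weighting vendor-invited reviews, discounting older reviews, boosting in-depth ones) come from the description above, but the specific weights and the function itself are illustrative assumptions, not the actual trScore algorithm:

```python
from datetime import date

def weighted_score(reviews, today=date(2016, 1, 1)):
    """Hypothetical weighted average in the spirit of trScore.

    Each review is a dict with: rating (1-10), vendor_driven (bool),
    review_date (date), in_depth (bool). All weights are assumptions.
    """
    total, total_weight = 0.0, 0.0
    for r in reviews:
        weight = 1.0
        if r["vendor_driven"]:
            weight *= 0.5  # assumed down-weight for vendor-invited reviews
        age_years = (today - r["review_date"]).days / 365.0
        weight *= max(0.25, 1.0 - 0.25 * age_years)  # older reviews count less
        if r["in_depth"]:
            weight *= 1.2  # assumed bonus for detailed reviews
        total += weight * r["rating"]
        total_weight += weight
    return round(total / total_weight, 1) if total_weight else None
```

With a glowing vendor-invited review and a more critical independent one, the weighted score lands below the simple average, which is the point of the exercise: independent sentiment pulls more weight.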
TrustRadius, of course, isn’t the only player in this space. G2 Crowd, which currently hosts almost 50 thousand software reviews, estimates that 25 percent are driven by vendors, 25 percent by their own proactive outreach, and 50 percent organically by community members. Tim Handorf, co-founder and CEO, said by email: “G2 Crowd believes that the source of the review is only one of many components that are important to determining the quality of the review and its impact on the satisfaction score. Therefore, discounting reviews based solely on the source would discount many quality, balanced reviews. The G2 Crowd algorithm that calculates customer satisfaction in our reports includes numerous quality factors designed to ensure accuracy and minimize bias of software provider review solicitation.”
Handorf agrees that “A simple average is not an accurate reflection of the customer satisfaction.” G2 Crowd “enforces rules of engagement for software providers to ensure authenticity/accuracy, [and] validates all reviews manually.”
The trScore does, however, seem to be unique in the space, and may set a benchmark in transparency for others to follow.