Semrush: Increase your CTR with smart review data
What?
Rich snippets containing product reviews have long been a key factor in search visibility and click-through rate (CTR). They make search results stand out, add credibility, and help users make faster decisions. But what happens when those reviews are negative? Can displaying low ratings actually harm performance?
For one of our clients, we decided to put this question to the test. Using data-driven experimentation, we investigated how low review scores influence organic clicks and whether suppressing these negative signals could improve click behavior.
The goal: to understand if structured data should always be shown — or if sometimes, less visibility leads to more clicks.
How?
We set up an experiment to test the effect of removing aggregateRating schema data for products with an average score below 3 stars. The hypothesis was straightforward: low ratings might erode user trust before visitors even reach the site, discouraging clicks.
By hiding review data for underperforming products, we expected to shift focus away from negative perception and encourage users to visit the page — where they could explore product details, read verified reviews, or discover alternative items.
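In practice, this kind of change usually means generating the JSON-LD markup conditionally. The sketch below shows one minimal way this could be done, assuming a hypothetical product record; the threshold, function name, and fields are illustrative, not the client's actual implementation.

```python
import json

# Hypothetical threshold from the test: hide review markup below 3 stars.
RATING_THRESHOLD = 3.0

def product_jsonld(name, sku, avg_rating, review_count):
    """Build schema.org Product JSON-LD, omitting the aggregateRating
    block when the average score falls below the threshold."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "sku": sku,
    }
    if avg_rating >= RATING_THRESHOLD:
        data["aggregateRating"] = {
            "@type": "AggregateRating",
            "ratingValue": round(avg_rating, 1),
            "reviewCount": review_count,
        }
    return json.dumps(data)

# A low-rated product gets no aggregateRating block,
# so search engines have no rating to display:
print(product_jsonld("Example Widget", "SKU-123", 2.4, 87))
```

Note that the reviews themselves stay on the page; only the structured-data hint to search engines is withheld.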
The test design included:
- A control group displaying all review data (including low ratings).
- A variant group where review schema was removed for low-rated items.
- Predictive modeling to account for seasonality and external ranking factors.
We measured organic impressions, clicks, and CTR daily until the difference between the groups reached statistical significance.
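The core CTR comparison behind such a test can be sketched as a two-proportion z-test on clicks over impressions. This is a generic illustration of the statistical idea, not SplitSignal's internal model, and the numbers below are made up for the example.

```python
import math

def ctr_z_test(clicks_a, impr_a, clicks_b, impr_b):
    """Two-proportion z-test comparing CTR between a control group (a)
    and a variant group (b). Returns (z, two-sided p-value)."""
    p_a = clicks_a / impr_a
    p_b = clicks_b / impr_b
    # Pooled click-through rate under the null hypothesis of no difference.
    p_pool = (clicks_a + clicks_b) / (impr_a + impr_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / impr_a + 1 / impr_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers only (not the client's data):
z, p = ctr_z_test(clicks_a=1000, impr_a=50000, clicks_b=1210, impr_b=50000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at 95% when p < 0.05
```

A production setup would additionally correct for seasonality and ranking noise, which is what the predictive-modeling step in the test design is for.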
Result
Within just six days, the results were clear and statistically significant at a 95% confidence level. Pages that excluded low-rated review data saw a 21% increase in clicks compared to the control group.
The test validated the hypothesis: prominently displaying poor ratings in search results can deter users before they even engage with the brand. By strategically controlling which review data is exposed, businesses can shape perception and improve interaction without manipulating the actual reviews themselves.
Thanks to SplitSignal’s controlled testing environment, we could isolate the true impact of this change. External variables such as ranking fluctuations or algorithm updates were ruled out, confirming that structured data presentation alone was the decisive factor.
This case demonstrates that smart data management, not just content optimization, can directly influence SEO outcomes.
Test, measure, and prioritize what matters most.
The findings show that rich results aren’t always beneficial. While they often boost visibility, negative reviews can have the opposite effect by highlighting user dissatisfaction. For this client, removing low review data increased CTR significantly for the affected product pages, even though the total traffic uplift was limited by the relatively small share of low-rated items.
For SEO teams, the takeaway is clear: structured data should not be treated as a static asset. Every element — from ratings to price, availability, and schema type — can and should be tested for its behavioral impact.
A/B testing with tools like SplitSignal enables SEO specialists to build strong business cases grounded in real user data. Instead of relying on best practices, you can prove what truly drives performance for your domain and audience.
Continuous testing of structured data helps refine both user experience and brand perception in search results. In a competitive environment, this ability to experiment, learn, and adapt quickly is a defining advantage.