A recent study has revealed a surprising difference between the answers given by Google’s Search Generative Experience (SGE) and the top organic search results. The study, done over two days in December 2023, looked at 1,000 commercial keywords and found that 93.8% of the time, the AI-generated answers in SGE did not match any links from the top 10 organic search results. This raises concerns about the reliability and accuracy of AI-generated content.
The study shows a wide divergence between the URLs cited in SGE content and those in organic search results. For a large share of queries, specifically 86.8% of the keywords analyzed, users are presented with AI-generated answers ahead of organic listings. This shift carries a cost for website owners, as SGE is expected to decrease organic traffic for many keywords. At the same time, websites that don’t rank in Google’s top 10 organic results may still surface in SGE, while established top-ranking pages can be bypassed entirely, which means missed exposure and potential loss of revenue.
One of the study’s most concerning findings is that most generative content URLs are completely different from those in organic search results: only 1.6% of the time did the generative content match an organic result even at the domain level. This suggests that SGE answers often draw on less well-known sources rather than established ones, which could compromise the reliability and credibility of the information.
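The kind of exact-URL versus domain-level comparison the study describes can be sketched in a few lines of Python. This is an illustrative sketch, not the study’s methodology or data; the URLs, function names, and the simple `www.`-stripping heuristic are all assumptions.

```python
from urllib.parse import urlparse

def domain(url: str) -> str:
    # Crude host extraction; ignores subdomain and public-suffix nuances.
    return urlparse(url).netloc.lower().removeprefix("www.")

def overlap_rates(sge_urls, organic_urls):
    """Return (exact-URL match rate, domain-level match rate) for SGE links
    measured against a list of top organic result URLs."""
    organic_set = set(organic_urls)
    organic_domains = {domain(u) for u in organic_urls}
    exact = sum(u in organic_set for u in sge_urls)
    by_domain = sum(domain(u) in organic_domains for u in sge_urls)
    n = len(sge_urls)
    return exact / n, by_domain / n

# Hypothetical example data, not from the study:
sge = ["https://example-blog.com/review", "https://www.bigstore.com/item"]
organic = ["https://bigstore.com/item", "https://news-site.com/article"]

exact_rate, domain_rate = overlap_rates(sge, organic)
```

In this toy example the second SGE link matches an organic result at the domain level but not as an exact URL, illustrating why the study’s exact-match and domain-match percentages can differ.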
Additionally, the study found that only a small percentage of generative URLs matched a page 1 organic URL exactly. This raises questions about the credibility of AI-generated content, as users may unknowingly rely on information that hasn’t passed through the rigorous ranking process applied to organic search results.
The divergence between SGE and organic search results has significant implications for website rankings. With 93.8% of SGE links differing from organic search results, it’s clear that the two sources surface very different information. This raises the question of how Google’s AI algorithms decide what content to show and how that selection affects the overall user experience.
Interestingly, the study also looked at different formats of SGE presentation. The format with a “Show more” button appeared less often compared to the version with a “Generate” button, which appeared 65.9% of the time. This suggests that Google is trying out different ways to present AI-generated answers, possibly aiming to improve user engagement and satisfaction.
While SGE offers the convenience of getting answers directly within Google’s interface, it’s important to be cautious with this information. The study highlights the need to critically evaluate the sources and validity of AI-generated answers, especially when they differ significantly from the top organic search results.
It’s worth noting that the study focused on commercial keywords and used data from a logged-in Google Chrome account holder in New York. While this gives valuable insights into a specific set of search queries, more research is needed to fully understand the impact of AI-generated answers across different industries and user demographics.
In conclusion, the limited overlap between AI-generated answers in Google’s SGE and top organic search results raises valid concerns about the reliability and accuracy of the information presented to users. With SGE expected to significantly reduce organic traffic for many keywords, website owners will need to adjust their strategies to maintain visibility in an increasingly AI-driven search landscape. Users, for their part, should approach AI-generated answers with a critical mindset, mindful of the potential biases and limitations of these automated systems. The findings call for further research to fully grasp the impact of AI-generated content on search results and user experience.