The Impact of Generative AI on Search Traffic Referrals
Companies that develop generative AI often claim their chatbots include links to the websites cited in their answers. However, Cloudflare CEO Matthew Prince told Axios that search traffic referrals have been steadily declining. This downward trend poses an existential threat to publishers, as users increasingly rely on AI-generated summaries rather than clicking through to the underlying sites.
The Plunge in Search Traffic Referrals
Prince noted that ten years ago, Google sent a publisher one visitor for every two pages it crawled. That ratio has deteriorated sharply: six months ago it was one visitor for every six pages, and it now stands at one visitor for every 18 pages. The decline is even steeper for AI companies. OpenAI's ratio has gone from one visitor for every 250 pages crawled to one for every 1,500, while Anthropic's has dropped from one visitor for every 6,000 pages to one for every 60,000.
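To make the scale of that shift concrete, the following back-of-the-envelope sketch computes, from the ratios cited above (the figures themselves come from Prince's account), how many times fewer referrals each crawled page now yields:

```python
# Crawl-to-referral ratios cited above: pages crawled per visitor referred,
# earlier figure vs. current figure, as reported by Prince.
ratios = {
    "Google": (2, 18),          # ten years ago vs. now
    "OpenAI": (250, 1_500),
    "Anthropic": (6_000, 60_000),
}

for company, (before, now) in ratios.items():
    decline = now / before  # how many times fewer visitors per crawled page
    print(f"{company}: {before} -> {now} pages per visitor "
          f"({decline:.0f}x fewer referrals per page crawled)")
```

By this arithmetic, each page Google crawls is worth roughly nine times fewer referrals than a decade ago, with OpenAI and Anthropic at six and ten times fewer, respectively, over a far shorter period.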
The Consequences for Publishers
The growing trust in AI chatbots has led to fewer clicks on links and, in turn, reduced advertising revenue for publishers. To address this, Prince is urging publishers to take action to ensure fair compensation. Cloudflare is developing a tool to block bots that scrape content for large language models, including those that ignore a website's "no crawl" instructions. The move follows reports that AI companies have been disregarding websites' Robots Exclusion Protocol (robots.txt) files and scraping content the protocol was meant to keep off-limits.
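For context, the Robots Exclusion Protocol is just a plain-text robots.txt file that crawlers are expected to honor voluntarily. A minimal sketch of how a publisher might opt out of known AI crawlers, and how such a file is evaluated, is shown below using Python's standard-library parser. The example.com URL is illustrative; GPTBot, ClaudeBot, CCBot, and Google-Extended are publicly documented crawler tokens, but a bot that ignores the file entirely, which is the behavior Prince describes, is unaffected by any of this:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt opting known AI training crawlers out of the whole site
# while leaving ordinary crawlers (e.g., Googlebot for search) unrestricted.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for agent in ("GPTBot", "ClaudeBot", "Googlebot"):
    allowed = parser.can_fetch(agent, "https://example.com/article")
    print(f"{agent}: {'allowed' if allowed else 'disallowed'}")
```

The crux is that can_fetch only reports what the file asks for; enforcement depends entirely on the crawler's goodwill, which is why Cloudflare is building blocking into the network layer instead.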
Cloudflare’s Efforts to Block Scrapers
Cloudflare has been exploring ways to block scrapers since last year. In March, the company introduced AI Labyrinth, a tool that slows down, confuses, and wastes the resources of AI crawlers and other bots that disregard "no crawl" directives. AI Labyrinth works by steering unauthorized crawlers into a series of AI-generated pages that look convincing but contain none of the protected site's actual content, squandering the crawler's time and compute.
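Cloudflare has not published AI Labyrinth's internals, so the following is only a rough sketch of the general pattern, endless, deterministically linked decoy pages served to suspect crawlers. The Flask routes, the user-agent heuristic, and the decoy_page generator are all hypothetical names for illustration, and real AI-generated filler would replace the placeholder text:

```python
import hashlib
from flask import Flask, request

app = Flask(__name__)

# Illustrative tokens for crawlers that ignored robots.txt; a production
# system would rely on behavioral signals, not just user-agent strings.
SUSPECT_AGENTS = ("badbot", "scraper")

def is_unauthorized_crawler(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in SUSPECT_AGENTS)

def decoy_page(seed: str) -> str:
    # Derive the "next" links deterministically from the current page's
    # token, so the maze is endless but stable across revisits.
    digest = hashlib.sha256(seed.encode()).hexdigest()
    links = "".join(
        f'<li><a href="/maze/{digest[i:i + 8]}">further reading</a></li>'
        for i in range(0, 24, 8)
    )
    return (f"<html><body><p>[plausible generated filler text]</p>"
            f"<ul>{links}</ul></body></html>")

@app.route("/maze/<token>")
def maze(token):
    # Every decoy page links only to more decoy pages.
    return decoy_page(token)

@app.route("/article/<slug>")
def article(slug):
    if is_unauthorized_crawler(request.headers.get("User-Agent", "")):
        # Divert the crawler into the decoy graph instead of real content.
        return decoy_page(slug)
    return f"<html><body><h1>{slug}</h1><p>Real article content.</p></body></html>"

if __name__ == "__main__":
    app.run(port=8080)
```

The deterministic hashing is one simple way to make the maze infinite yet consistent: a crawler revisiting a decoy URL sees the same outgoing links and keeps descending without ever reaching protected content.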
The Challenge of Protecting Customer Sites
Prince emphasized the challenges of protecting customer sites from hackers and AI companies that disregard "no crawl" directives. "I go to war every single day with governments and entities trying to hack into our customer sites," Prince said. "It’s unacceptable that I can’t stop some companies from scraping content despite our best efforts to block them." Cloudflare’s efforts to develop tools like AI Labyrinth aim to address this issue and protect the interests of publishers and website owners.