BREAKINGON

The Internet's Oldest Bargain is Breaking: How AI is Reshaping Web Traffic

7/20/2025
The rise of AI is changing the web landscape. With bots now outnumbering human visitors, traditional SEO strategies are losing their effectiveness. Discover how businesses are adapting to this new reality and what it means for the future of online content.

The internet has long operated on an implicit bargain: websites welcomed web crawlers because being indexed by search engines like Google meant visibility, traffic, and ultimately business growth. For every two crawler visits Google made to a site, it historically sent back roughly one human visitor, a symbiotic exchange. As generative artificial intelligence (AI) tools rapidly evolve, however, that dynamic is shifting dramatically.

According to data from Cloudflare, for every user OpenAI's ChatGPT refers to a site, its crawlers make roughly 1,500 visits; for Anthropic's crawlers, the ratio is about 60,000 visits per referred user. This surge in automated traffic is flattening human visit counts, with bots on some sites now outnumbering human users. And unlike in the Google era, these new bots rarely link back to the original source: AI models like ChatGPT summarize and present answers directly, sidelining the websites and content creators that supplied them.

A Profound Shift in the Digital Landscape

Linda Tong, CEO of Webflow, a leading web design and hosting company, describes this transformation as one of the most significant changes she has witnessed in her 20 years in the online business sector. “It’s fundamentally changing how people find and interact with brands,” she stated. “For some businesses, it’s an existential threat.” Her point captures the heart of the issue: while public discourse often centers on fears of AI replacing human jobs, the more immediate disruption is to the internet's economic structure.

The End of the Search Era

Since the early 1990s, bots have crawled the web to map and index it, fueling the rise of search engines. Early crawlers like the World Wide Web Wanderer paved the way for Google, which began in 1996 as the Stanford research project BackRub; its crawler later evolved into Googlebot. The prevailing logic of that era was clear: allow search engines to scan your content, and they will direct traffic to your site. This principle defined the open web for nearly three decades, during which visibility in Google search results could determine a website’s success or failure.

However, this established logic is becoming obsolete: AI models like OpenAI’s ChatGPT and Anthropic’s Claude no longer adhere to the old rules. Instead of linking back to source content, these systems read and recontextualize information, answering queries directly with minimal or no attribution to the original sources, which renders traditional SEO efforts largely ineffective.

Introducing AEO: AI Engine Optimization

In response to this seismic shift, a new term has emerged: AEO (AI Engine Optimization). This concept represents strategies aimed at enhancing content visibility to AI systems, even if those AI responses do not lead to direct clicks. As Linda Tong highlights, AI bots synthesize available information and present it in a way that diminishes the original site's value proposition, resulting in no click-through traffic and no credit for the original content. In just six months, Webflow has experienced over a 125% increase in AI crawler traffic, a trend mirrored across the broader internet, with more than 50% of all web traffic now attributed to bots.
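In practice, one tactic commonly grouped under AEO overlaps with classic structured-data markup: exposing a page's key facts in a machine-readable form such as schema.org JSON-LD, so a crawler can extract the headline, author, and date without parsing the visual page. A minimal, hypothetical example (the field values are placeholders, not from this article's actual markup):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "The Internet's Oldest Bargain is Breaking",
  "datePublished": "2025-07-20",
  "author": { "@type": "Organization", "name": "BreakingOn" }
}
</script>
```

Whether AI crawlers reward this markup with attribution remains an open question; it simply makes the extraction unambiguous.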

As bot traffic continues to rise, some companies are adapting by creating dual versions of their websites: one tailored for human users, featuring rich visuals and interactive storytelling, and another optimized for machine readability, designed to feed AI without relinquishing proprietary content. “For a human, your site should be rich, interactive, delightful,” Tong explains. “For a bot? You want clear structure, easy crawlability, but maybe not your full content.”
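The dual-version approach Tong describes can be reduced to content negotiation on the User-Agent header. The sketch below is a hypothetical illustration, not Webflow's implementation; the crawler tokens (GPTBot, ClaudeBot, CCBot, PerplexityBot) are ones those companies publicly advertise in their user-agent strings:

```python
# Known self-identifying AI crawler tokens (publicly documented substrings).
AI_CRAWLER_TOKENS = ("GPTBot", "ClaudeBot", "CCBot", "PerplexityBot")

def is_ai_crawler(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known AI crawler."""
    return any(token in user_agent for token in AI_CRAWLER_TOKENS)

def render_page(user_agent: str, full_html: str, bot_summary: str) -> str:
    """Serve the rich page to humans, a structured summary to AI bots.

    The summary keeps the site crawlable without exposing full content.
    """
    return bot_summary if is_ai_crawler(user_agent) else full_html
```

Note the obvious limitation: this only catches bots that identify themselves honestly, which is why the enforcement problem discussed below the fold remains hard.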

The Challenges for Content Creators

This evolving landscape poses significant challenges for businesses reliant on traditional web traffic, particularly media outlets and content creators. If an AI chatbot summarizes an article or extracts essential facts from a guide, users may never need to click through to the original website, resulting in a loss of ad impressions, email signups, and revenue. Adam Singolda, CEO of Taboola, a prominent ad-tech platform, warns that this scenario mirrors past experiences with platforms like Facebook's Instant Articles, which ultimately failed to deliver meaningful traffic or revenue to publishers.

AI tools like ChatGPT now claim over 400 million weekly active users, and the implications for traditional publishers are stark: many report a 20-30% decline in search traffic over the past year. “That’s just from the first wave,” Singolda notes, raising concerns about what could happen as AI adoption continues to grow.

Strategies to Combat AI Scraping

In light of these challenges, publishers and platforms are exploring various strategies to protect their content. Some have entered into licensing agreements, allowing specific AI companies to access their content for a fee, though such deals remain exceptions rather than the norm. “There isn’t enough money in the world to pay every publisher whose content is being scraped,” Singolda explains. “You can’t offer $100 million to a thousand outlets.”

Looking ahead, Tong envisions a future where publishers have more control over access to their content. Through partnerships, such as Webflow’s collaboration with Cloudflare, businesses can distinguish beneficial bots from harmful ones and decide whether to share full content, summaries, or nothing at all. Enforcement remains difficult, however: compliance with standard crawl policies such as robots.txt is voluntary, and bots that ignore them continue scraping content.
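The crawl policies in question are typically declared in a site's robots.txt file. The user-agent tokens below are the ones OpenAI (GPTBot), Anthropic (ClaudeBot), and Common Crawl (CCBot) publicly document for their crawlers; the key caveat, as the quote above suggests, is that honoring these directives is entirely voluntary on the bot's part:

```
# robots.txt — keep traditional search crawlers, opt out of AI crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Googlebot
Allow: /
```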

The Future of Content in an AI-Driven World

The stakes are high in an environment where bots supply answers first. The distinction between being properly credited and being overlooked could determine the fate of entire industries. Currently, some AI-generated pages are crafted not for human consumption but solely for other AI systems to scrape, creating a closed loop of machine-generated content. To address this, companies like Taboola are innovating new models, such as the Deeper Dive feature that integrates AI experiences within publishers’ websites, ensuring that audiences can still engage with the original content while retaining control over their traffic and trust.

© Copyright 2025 BreakingOn. All rights reserved.