All-powerful, ever-pervasive AI is running out of internet
There is no such thing as unlimited data


Artificial intelligence (AI) has relied on high-quality language data to train its models, but the supply is running low. That looming shortage is forcing companies to hunt for new sources of data and to redesign their algorithms to use data more efficiently.
What is the scope of AI's data problem?
Artificial intelligence needs to be trained, and data and information are used to accomplish that. The trouble is that the data is running out. A paper by Epoch, an AI research organization, found that AI could exhaust all the current high-quality language data available on the internet as soon as 2026. This could pose a problem as AI continues to grow. "The issue stems from the fact that, as researchers build more powerful models with greater capabilities, they have to find ever more texts to train them on," said the MIT Technology Review. The quality of the data used in training AI is important. "The [data shortage] issue stems partly from the fact that language AI researchers filter the data they use to train models into two categories: high-quality and low-quality," said the Review. "The line between the two categories can be fuzzy," but "text from [high-quality data] is viewed as better-written and is often produced by professional writers."
AI models require vast amounts of data to be functional. For example, "the algorithm powering ChatGPT was originally trained on 570 gigabytes of text data, or about 300 billion words," said Singularity Hub. In addition, "low-quality data such as social media posts or blurry photographs are easy to source but aren't sufficient to train high-performing AI models," and could even be "biased or prejudiced or may include disinformation or illegal content which could be replicated by the model." Much of the data on the internet is considered useless for AI modeling. Instead, "AI companies are hunting for untapped information sources and rethinking how they train these systems," said The Wall Street Journal. "Companies also are experimenting with using AI-generated, or synthetic, data as training material — an approach many researchers say could actually cause crippling malfunctions."
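To make the distinction between high- and low-quality text concrete, here is a minimal sketch in Python of the kind of heuristic filtering the Review describes. The specific rules and thresholds are illustrative assumptions for this article, not any company's actual training-data pipeline.

```python
# Illustrative sketch of heuristic "quality" filtering for web text.
# The rules and thresholds below are assumptions for demonstration only;
# they are not any specific company's actual pipeline.

def looks_high_quality(text: str) -> bool:
    words = text.split()
    if len(words) < 50:                      # too short to be useful prose
        return False
    alpha_ratio = sum(c.isalpha() for c in text) / max(len(text), 1)
    if alpha_ratio < 0.6:                    # mostly symbols, markup or numbers
        return False
    avg_word_len = sum(len(w) for w in words) / len(words)
    if not 3 <= avg_word_len <= 10:          # gibberish or run-together tokens
        return False
    return True

long_passage = (
    "The committee reviewed the proposal in detail and recommended three "
    "changes to the funding model before the next quarterly meeting. "
) * 5
documents = [
    "Click here!!! Buy now $$$ http://spam.example",
    long_passage,
]
high_quality = [d for d in documents if looks_high_quality(d)]
print(f"Kept {len(high_quality)} of {len(documents)} documents")
```

Real pipelines are far more elaborate, but the principle is the same: most raw web text never makes it into a training set, which is why the pool of usable data is so much smaller than the internet itself.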
What are AI companies doing to combat the imminent data scarcity?
The ticking clock on high-quality data has forced AI developers to think more creatively. For instance, Google has considered using user data from Google Docs, Google Sheets and similar company products. Other companies are "searching for content outside the free online space, such as that held by large publishers and offline repositories," like those published before the internet existed, said Singularity Hub. Meta has considered purchasing Simon & Schuster publishing house to gain access to all its literary works. More broadly, many companies have looked to synthetic data, which is generated by AI itself. "As long as you can get over the synthetic data event horizon, where the model is smart enough to make good synthetic data, everything will be fine," OpenAI CEO Sam Altman said at a tech conference in 2023. However, using synthetic data can present other problems. "Feeding a model text that is itself generated by AI is considered the computer-science version of inbreeding," said the Journal. "Such a model tends to produce nonsense, which some researchers call 'model collapse.'"
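The "inbreeding" problem can be illustrated with a toy experiment: fit a very simple statistical model to some data, generate synthetic data from that fit, train the next "generation" only on the synthetic data, and repeat. The sketch below is a deliberately simplified, assumption-laden stand-in for real model training, meant only to show how estimation errors compound from one generation to the next.

```python
# Toy illustration of the intuition behind "model collapse": each generation
# is fit only to synthetic data sampled from the previous generation's fit.
# This is a simplified stand-in for training an AI on its own output, not a
# simulation of any real system.

import random
import statistics

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(50)]   # the original "real" data

for generation in range(1, 16):
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    # The next generation trains only on synthetic samples from the current fit.
    data = [random.gauss(mu, sigma) for _ in range(50)]
    print(f"generation {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")
```

Run it and the fitted mean and spread drift away from the original distribution rather than staying put, because each generation's estimation error is baked into the data the next generation sees. Real language models degrade in analogous, if far more complicated, ways.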
The other option is to rework AI algorithms to make better, more efficient use of the existing high-quality data. One strategy being explored is called curriculum learning, in which "data is fed to language models in a specific order in hopes that the AI will form smarter connections between concepts," said the Journal. If successful, the method could cut the amount of data required to train an AI model by half. Companies may also diversify the data sets used in AI models to include some lower-quality sources or instead opt to create smaller models that require less data altogether. "We've seen how smaller models that are trained on higher-quality data can outperform larger models trained on lower-quality data," Percy Liang, a computer science professor at Stanford University, said to the MIT Technology Review.
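Curriculum learning is easier to picture with a small sketch. The example below orders text snippets from "easy" to "hard" before handing them to a stub training loop; the difficulty heuristic (sentence length) and the tiny corpus are hypothetical, chosen only to illustrate the ordering idea the Journal describes.

```python
# Minimal sketch of curriculum learning: present training examples in order
# of increasing difficulty. The difficulty measure and corpus here are
# illustrative assumptions, not any lab's actual method.

def difficulty(example: str) -> int:
    return len(example.split())          # proxy: longer sentences are "harder"

corpus = [
    "Gravity bends light around massive objects, a prediction of general relativity.",
    "The cat sat.",
    "Water boils at 100 degrees Celsius at sea level.",
]

curriculum = sorted(corpus, key=difficulty)   # easiest examples first

for step, example in enumerate(curriculum, start=1):
    # A real pipeline would call something like train_step(model, example) here.
    print(f"step {step}: {example}")
```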
Devika Rao has worked as a staff writer at The Week since 2022, covering science, the environment, climate and business. She previously worked as a policy associate for a nonprofit organization advocating for environmental action from a business perspective.