
Fascinating piece about the dependencies between Wikipedia and large language models. LLMs scrape Wikipedia data and flood the internet with AI-generated content that is sometimes inaccurate or outdated. This leads to less exposure for Wikipedia, fewer contributions, and potentially a decline in Wikipedia's quality. Yet large language models rely on Wikipedia to verify facts and to cover recent events. A healthy Wikipedia is crucial for companies building products on large language models, even as those same companies work to diminish its relevance.
