Generative artificial intelligence (AI) is being incorporated into Wikipedia's editing process to reduce tedious tasks, freeing human editors to focus on quality control and spend more time on deliberation.
Chris Albon, Director of Machine Learning at the Wikimedia Foundation, has emphasised that the organisation prioritises human agency, transparency, and a nuanced approach to multilinguality.
AI is already used on the site to detect vandalism, translate content, and predict readability, and the organisation has recently partnered with Kaggle to create an open access dataset of structured Wikipedia content optimised for machine learning.
Meanwhile, rising bot traffic has strained Wikimedia servers, increasing bandwidth consumption by 50 percent.
The new strategy aims to strike a balance between preserving human agency and optimising the site for AI.