POST: AI Will Lead Us to Need More Garbage-subtraction

Todd Carpenter, Executive Director of NISO, writes “AI Will Lead Us to Need More Garbage-subtraction” for The Scholarly Kitchen. Amid a flurry of recent articles in LIS journals and higher education blogs raising concerns about generative AI and large language models (LLMs) being trained on non-transparent, highly biased swaths of data culled from across the internet, Carpenter speculates on another unintended consequence of these technologies: they are adding to the growing volume of low-quality content shared online; in other words, more “garbage” for researchers to sift through in the search for valuable information.

From the article:

In a world of ubiquitous information, curation becomes the most coveted service. Reduction, selection, and curation become the highest value an organization can provide. We need to subtract from the flow of information, by “deleting the garbage….”

Into this environment, generative AI systems will only exacerbate that problem. In the same way that robotics have made manufacturing processes more exact, more efficient, faster, and cheaper, AI tools will help everyone generate ever more content. As large language models and generative text creation AI systems make the authorship of content easier, ultimately this will only generate more and more content.

dh+lib Review

This post was produced through a cooperation between Rebekah Walker, Chelsea Wells, Olivia Staciwa, Elena Hoffenberg, Molly McGuire, Monica Maher, John Knox, and Abbie Norris-Davidson (Editors-at-large for the week), Rachel Starry and Linsey Ford (Editors for the week), and Claudia Berger, Nickoal Eichmann-Kalwara, Pamela Lach, Hillary Richardson, and John Russell (dh+lib Review Editors).