A new paper by British and Canadian researchers published in @Nature warns that today’s machine learning models are fundamentally vulnerable to a failure mode they call “model collapse,” in which AI is trained on data generated by itself or by other AI systems.
Reports @Techcrunch: “If the models continue eating each other’s data, perhaps without even knowing it, they’ll progressively get weirder and dumber until they collapse. The researchers provide numerous examples and mitigation methods, but they go so far as to call model collapse ‘inevitable,’ at least in theory.” Here’s more.
Read the full paper here: https://flip.it/pqfPZ7
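For a feel of the mechanism (a toy sketch, not the paper’s actual experiments, which involve large language models), here’s a minimal Python simulation: the “model” is just a Gaussian re-fitted each generation only to samples drawn from the previous generation’s fit. Finite-sample estimation errors compound from one generation to the next, and the fitted distribution drifts away from the original data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: the "real" data, drawn from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=100)

for gen in range(20):
    # "Train" a trivially simple generative model: fit a Gaussian to the data.
    mu, sigma = data.mean(), data.std()
    print(f"generation {gen:2d}: mu={mu:+.3f}, sigma={sigma:.3f}")
    # The next generation never sees the real data -- it trains only on
    # samples produced by the previous generation's model.
    data = rng.normal(loc=mu, scale=sigma, size=100)
```

Run it and the fitted mean and scale typically wander away from the true values within a couple dozen generations; with fewer samples per generation the drift is faster, which matches the broad intuition that keeping genuine, non-synthetic data in the training mix slows the degradation.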
I’m glad someone ‘did the math,’ but this is a ‘water has been determined to be moist’ kind of finding.
Dogfood fight.
They will eat each other’s shit. Sick dogs are known to do this.
@SpaceLifeForm @TechDesk @Nature @Techcrunch
Coming soon: "AI and Silent Bob Strike Back"
"...we're gonna make 'em eat our shit, then shit out our shit, then eat their shit which is made up of our shit that we made 'em eat."
@TechDesk @Nature @Techcrunch Yes. I find this funny. And yes, I got caught in that trap in the 1990s.
@ralph058 @TechDesk @Nature @Techcrunch 'Progressively weirder and dumber' sounds great, though!