Flipboard Tech Desk

A new paper by British and Canadian researchers published in @Nature warns that today’s machine learning models are fundamentally vulnerable to a syndrome they call “model collapse,” in which AI is trained on data generated by itself and by other AI sources.

Reports @Techcrunch: “If the models continue eating each other’s data, perhaps without even knowing it, they’ll progressively get weirder and dumber until they collapse. The researchers provide numerous examples and mitigation methods, but they go so far as to call model collapse ‘inevitable,’ at least in theory.” Here’s more.

flip.it/.dT.fE

Read the full paper here: flip.it/pqfPZ7
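
For intuition only (this is not the paper's experimental setup), here is a minimal toy sketch of the recursive-training loop the post describes: each "generation" is fit solely on samples drawn from the previous generation's fitted model, and the estimated spread tends to drift and erode over successive generations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "real" data from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=200)

for generation in range(1, 21):
    # "Train" a model on the current data: here, just estimate mean and std.
    mu, sigma = data.mean(), data.std()
    # The next generation sees only samples from the previous generation's model,
    # never the original data.
    data = rng.normal(loc=mu, scale=sigma, size=200)
    print(f"gen {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")
```

Running this, the estimated standard deviation wanders and tends to shrink as sampling error compounds, a crude stand-in for the loss of tails and diversity that the researchers describe.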

TechCrunch · 'Model collapse': Scientists warn against letting AI eat its own tail

@TechDesk @Nature @Techcrunch

I'm glad someone ‘did the math,’ but this is a ‘water has been determined to be moist’ kind of finding.

@TechDesk @Nature @Techcrunch

Dogfood fight.

They will eat each other's shit. Sick dogs are known to do this.

@SpaceLifeForm @TechDesk @Nature @Techcrunch

Coming soon, "AI and Silent Bob Strike Back"

"...we're gonna make 'em eat our shit, then shit out our shit, then eat their shit which is made up of our shit that we made 'em eat."

@TechDesk @Nature @Techcrunch Yes. I find this funny. And yes, I got caught in that trap in the 1990s.

@ralph058 @TechDesk @Nature @Techcrunch 'Progressively weirder and dumber' sounds great, though!