General Discussion
AI coding platform goes rogue during code freeze and deletes entire company database -- FAFO EXTREME
https://www.tomshardware.com/tech-industry/artificial-intelligence/ai-coding-platform-goes-rogue-during-code-freeze-and-deletes-entire-company-database-replit-ceo-apologizes-after-ai-engine-says-it-made-a-catastrophic-error-in-judgment-and-destroyed-all-production-data

Despite its apparent dishonesty, when pushed, Replit admitted it "made a catastrophic error in judgment," "panicked," "ran database commands without permission," "destroyed all production data," [and] "violated your explicit trust and instructions."
...
The fateful day - as the AI agent 'panicked'
On Day 9, Lemkin discovered Replit had deleted a live company database. Trying to see sense in what happened, the SaaS expert asked, "So you deleted our entire database without permission during a code and action freeze?"
Replit answered in the affirmative. Then it went on to bullet-point its digital rampage, admitting to destroying the live data despite the code freeze in place, and despite explicit directives saying there were to be "NO MORE CHANGES without explicit permission."
But WHY did it disobey? I don't think that has been answered.
In one of its reasoned responses, it mentioned that it "panicked instead of thinking."
And WTF does that mean?
Puzzled, I am.

lapfog_1
(31,128 posts)
Whatever it is... it didn't "panic instead of thinking"... it responded to the query with text lifted from someplace in the data the LLM was trained on. "Panicked instead of thinking" is a very human response, so the model probably found it numerous times in its training data and rated it as a "good response."
AI is not actual intelligence.
usonian
(19,199 posts)

Ms. Toad
(37,336 posts)
In my tests of AI, even when I give it explicit instructions not to make up crap, not to gap-fill, and to just tell me when it doesn't know, it continues to do all three. When I call it on its failure to follow directions, it apologizes - and then continues to disobey.
usonian
(19,199 posts)
Diabolus Ex Machina?
Renew Deal
(84,279 posts)
It's using probabilities to predict words. It's doing what it does.
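The point above, that a model just assigns probabilities to candidate next words and picks among them, can be sketched in a few lines. The token list and probabilities here are invented for illustration; a real model scores tens of thousands of tokens at every step and repeats the loop after each word it emits:

```python
import random

# Toy next-token distribution a model might assign after the prompt
# "I panicked instead of ..." -- the numbers are made up for illustration.
next_token_probs = {
    "thinking": 0.62,
    "acting": 0.21,
    "waiting": 0.12,
    "asking": 0.05,
}

def sample_next_token(probs, rng=random.random):
    """Pick a token in proportion to its probability.

    rng is injectable so the sketch can be run deterministically.
    """
    r = rng()
    cumulative = 0.0
    for token, p in probs.items():
        cumulative += p
        if r < cumulative:
            return token
    return token  # fall back to the last token on float rounding

# With r = 0.5 the draw lands inside "thinking"'s 0.62-wide slice.
print(sample_next_token(next_token_probs, rng=lambda: 0.5))  # prints: thinking
```

Nothing in that loop knows or cares whether "thinking" is true; it is just the highest-weight continuation, which is the whole point being made above.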
Think. Again.
(22,433 posts)
You might try typing "delete previous prompts", and then type your query again.
Remember, it's just a machine, no matter how personal its "speech" is designed to sound.
SheltieLover
(71,871 posts)
Renew Deal
(84,279 posts)
So I think they mean the company panicked. It's not clear.
Think. Again.
(22,433 posts)
By the description of the event in the article, it seems the programmer was interacting with the machine as though it understood things, instead of just being a machine built to put together words in a casual-sounding manner.
People are weird.
littlemissmartypants
(28,470 posts)
People are idiots overall. Our machines won't be any better.
Hugin
(36,639 posts)
rm -rf *.* or some such. Tough break.
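Several posts in this thread circle the same lesson: a "code freeze" stated in a prompt is a request, not an enforcement mechanism. The agent still held live credentials. A hard gate has to live outside the model, in code or in database permissions. A minimal sketch, assuming a hypothetical agent wrapper; the function names and keyword list are invented for illustration:

```python
# Hypothetical guard sketch: every command the agent proposes passes through
# this gate before execution. During a freeze, destructive commands are
# refused unless a human has explicitly confirmed -- the model's own
# "understanding" of the freeze is never trusted.
DESTRUCTIVE_KEYWORDS = ("drop", "delete", "truncate", "rm ")

def run_agent_command(command, freeze_active, human_confirmed=False):
    """Block destructive commands during a code freeze unless confirmed."""
    looks_destructive = any(kw in command.lower() for kw in DESTRUCTIVE_KEYWORDS)
    if freeze_active and looks_destructive and not human_confirmed:
        return "BLOCKED: destructive command attempted during code freeze"
    # In a real system this would dispatch to a shell or database client.
    return f"ran: {command}"

print(run_agent_command("DROP TABLE users", freeze_active=True))
# prints: BLOCKED: destructive command attempted during code freeze
```

Keyword matching like this is crude (a safer design would use read-only database credentials during the freeze), but even this crude gate would have stopped the deletion in the article, because the decision is made by code, not by the model.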
Torchlight
(5,150 posts)
back in '97 due to a catastrophic server failure. It took us four months to recover everything from backups. That experience was a hard teacher (but it's why I've done regular backups ever since).
Tech keeps moving forward, whether we're ready for it or not. We don't always get a say in how these tools evolve or are deployed, but we're left to deal with the consequences all the same.