remi_pan@sh.itjust.works to Cybersecurity@sh.itjust.works • Researchers Reveal 'Deceptive Delight' Method to Jailbreak AI Models • 6 days ago
If the jailbreak is about enabling the LLM to tell you how to make explosives or drugs, this seems pointless, because I would never trust an AI so prone to hallucinations (and basically bad at science) with such a dangerous process.
remi_pan@sh.itjust.works to Funny@sh.itjust.works • Technically The Truth • 24 days ago
Maybe a piece of very dense matter (neutronium?), at very high speed (relativistic), could do that?