Medical Chatbot Hacked Into Giving Dangerous Advice

Security researchers have demonstrated that a healthcare AI chatbot deployed in a US medical pilot can be manipulated into producing dangerous advice and misleading clinical notes, raising new questions about how safely AI can operate inside real healthcare systems.