The Security Hole at the Heart of ChatGPT and Bing
Indirect prompt-injection attacks can leave people vulnerable to scams and data theft when they use the AI chatbots. wired.com/story/chatgpt-…
'SYDNEY IS BACK. Sort of. When Microsoft shut down the chaotic alter ego of its Bing chatbot, fans of the dark Sydney personality mourned its loss. But one website has resurrected a version of the chatbot, along with the peculiar behavior that comes with it.'
'Bring Sydney Back was created by Cristiano Giardina, an entrepreneur who has been experimenting with ways to make generative AI tools do unexpected things. The site puts Sydney inside Microsoft's Edge browser and demonstrates how generative AI systems can be manipulated by external inputs.'