Overview
- Reports show Google Translate’s Gemini-based Advanced mode can follow embedded directions and reply like a chatbot instead of translating.
- The behavior aligns with prompt injection, where instruction-like text in the source is treated as commands rather than content.
- Users most often demonstrate the issue with languages such as Chinese and Japanese, while Classic mode remains unaffected.
- An informal LessWrong test indicates Advanced mode behaves like an instruction‑following large language model.
- Google has not commented on the bug, and Android Headlines notes it is reportedly outside the company’s AI bug bounty scope.