New Jailbreaks Allow Users to Manipulate GitHub Copilot

Whether by intercepting its traffic or just giving it a little nudge, GitHub’s AI assistant can be made to do malicious things it isn’t supposed to.
Source: darkreading.com