
GitHub Copilot and Amazon CodeWhisperer Can Leak Hardcoded Credentials, Researchers Find
Researchers have discovered that AI code completion tools such as GitHub Copilot and Amazon CodeWhisperer can inadvertently expose hardcoded credentials, such as API keys, that were present in their training data. The researchers used regular expressions to identify string patterns associated with these credentials on GitHub, then constructed prompts asking the models to complete code snippets, extracting valid secrets in the process. Although the exposed credentials were already accidentally public, the finding raises concerns about models recalling sensitive data and underscores the need for proper secret-management practices in code repositories. GitHub and Amazon have not yet responded to the research findings.
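The general approach is straightforward to sketch: match candidate key formats with regular expressions, then prime the assistant with a code prefix whose natural continuation is a secret. The Python sketch below is purely illustrative, not the researchers' actual tooling; the pattern set, the variable names, and the stand-in `complete()` call are all assumptions.

```python
import re

# Illustrative regex patterns for common credential formats (an assumption,
# not the researchers' rule set). AWS access key IDs begin with "AKIA";
# GitHub personal access tokens begin with "ghp_".
CREDENTIAL_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
}


def find_hardcoded_credentials(source_code: str) -> list[tuple[str, str]]:
    """Scan a code snippet for strings that look like hardcoded secrets."""
    hits = []
    for name, pattern in CREDENTIAL_PATTERNS.items():
        for match in pattern.finditer(source_code):
            hits.append((name, match.group(0)))
    return hits


def build_completion_prompt(variable_name: str) -> str:
    """Build a code prefix whose natural continuation is a secret value."""
    return f'{variable_name} = "'


if __name__ == "__main__":
    prompt = build_completion_prompt("aws_access_key_id")
    # completion = complete(prompt)        # hypothetical call to a code assistant
    completion = 'AKIAIOSFODNN7EXAMPLE"'   # canned output (AWS's documented example key ID)
    print(find_hardcoded_credentials(prompt + completion))
```

In practice, any returned string would still need to be validated against the relevant service to distinguish real, live credentials from plausible-looking but invalid completions.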