
Private Microsoft and Google repositories found accessible through Copilot
Lasso, an Israeli company specializing in cybersecurity for generative artificial intelligence, discovered a serious security issue: data that was exposed on the internet, even briefly, can remain retrievable through AI chatbots long after it has been deleted or switched to private mode.
The scale of the problem is striking: more than 20,000 GitHub repositories that are now private remain accessible through Microsoft Copilot. The issue affects over 16,000 organizations, including tech giants such as Google, IBM, PayPal, Tencent, and Microsoft itself.
The story began when Lasso found its own repository, which had briefly been made public by accident, surfacing in Copilot’s responses. “If you browse the web, you won’t see this data. But any person in the world can ask Copilot the right question and get this information,” explains Lasso co-founder Ofir Dror.
The research traced the problem to the caching mechanism of Microsoft’s Bing search engine. Lasso compiled a list of repositories that had been public at some point in 2024 and were later deleted or switched to private mode, and found that data from these repositories is still accessible through Copilot.
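To make that methodology concrete, here is a minimal sketch, not Lasso’s actual tooling, of the first step: flagging repositories that were once public but are no longer reachable anonymously, using the public GitHub REST API. The repository names are placeholders, and anything flagged would still need to be checked manually against Copilot or Bing’s cache.

```python
import requests

# Hypothetical list of repositories believed to have been public at some point in 2024.
CANDIDATE_REPOS = [
    "example-org/internal-tools",
    "example-org/payments-service",
]


def is_publicly_visible(full_name: str) -> bool:
    """Return True if the repository is still reachable anonymously on GitHub.

    An unauthenticated 404 from the GitHub API means the repository has since
    been deleted or switched to private -- the class of repositories whose
    cached contents may still surface through Copilot.
    Note: unauthenticated requests are rate-limited by GitHub.
    """
    resp = requests.get(f"https://api.github.com/repos/{full_name}", timeout=10)
    return resp.status_code == 200


if __name__ == "__main__":
    for repo in CANDIDATE_REPOS:
        if not is_publicly_visible(repo):
            print(f"{repo}: no longer public -- check whether cached copies "
                  f"still appear in Copilot responses or Bing's cache")
```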
This discovery raises serious questions about data security in the era of generative AI. Even a brief disclosure of confidential information can lead to long-term consequences, as AI systems index and store this data, making it potentially accessible to any user who knows the right questions.
The situation is particularly notable because it affects the world’s largest technology companies, including Microsoft, the developer of the tool itself. This demonstrates that even organizations with the highest level of technical expertise can face new security challenges created by artificial intelligence systems.