Microsoft Security Copilot is a new GPT-4 AI assistant for cybersecurity

After announcing an AI-powered Copilot assistant for Office apps, Microsoft is now turning its attention to cybersecurity. Microsoft Security Copilot is a new assistant for cybersecurity professionals, designed to help defenders identify breaches and better understand the huge amounts of signals and data available to them daily.
Powered by OpenAI’s GPT-4 generative AI and Microsoft’s own security-specific model, Security Copilot presents a simple prompt box, much like any other chatbot. You can ask “what are all the security incidents in my enterprise?” and it will summarize them. Behind the scenes, though, it draws on the 65 trillion daily signals Microsoft collects through its threat intelligence gathering, along with security-specific skills, to help security professionals hunt down threats.
Microsoft Security Copilot is designed to assist security analysts rather than replace them, and it even includes a pinboard section where co-workers can collaborate and share information. Security professionals can use Security Copilot to help with incident investigations, or to quickly summarize events and assist with reporting.
Security Copilot accepts natural language inputs, so security professionals can ask for a summary of a particular vulnerability; feed in files, URLs, or code snippets for analysis; or request incident and alert information from other security tools. All prompts and responses are saved, creating a full audit trail for investigators.
Results can be pinned and summarized into a shared workspace, so colleagues can all work on the same threat analysis and investigations. “This is like having individual workspaces for investigators and a shared notebook with the ability to promote things you’re working on,” says Chang Kawaguchi, an AI security architect at Microsoft, in an interview with The Verge.
One of the most interesting aspects of Security Copilot is a prompt book feature. It’s essentially a set of steps or automations that people can bundle into a single easy-to-use button or prompt. That could involve having a shared prompt to reverse engineer a script so security researchers don’t have to wait for someone on their team to perform this type of analysis. You can even use Security Copilot to create a PowerPoint slide that outlines incidents and the attack vectors.
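To illustrate the prompt book idea, here is a minimal sketch of bundling a fixed sequence of prompt steps into a single reusable action. All of the names here (`PromptBook`, `ask`, the step templates) are hypothetical and invented for illustration; they are not Microsoft's API, just one way to model "a set of steps people can bundle into a single button."

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: these names do not come from Security Copilot;
# they illustrate bundling prompt steps into one shared, reusable action.

@dataclass
class PromptBook:
    """A named bundle of prompt steps that runs as a single action."""
    name: str
    steps: list[str] = field(default_factory=list)

    def run(self, ask, target: str) -> list[str]:
        # Feed each templated step to the assistant in order,
        # collecting the responses as a single transcript.
        return [ask(step.format(target=target)) for step in self.steps]

# A shared "reverse engineer a script" book, like the example in the article,
# so no one has to wait for a specialist teammate to run the same analysis.
reverse_engineer = PromptBook(
    name="Reverse engineer script",
    steps=[
        "Deobfuscate and annotate this script: {target}",
        "List any network indicators (domains, IPs) in {target}",
        "Summarize the script's likely intent for a report",
    ],
)

# Stand-in for the real assistant call; it just echoes the prompt it got.
fake_ask = lambda prompt: f"[response to: {prompt}]"
transcript = reverse_engineer.run(fake_ask, target="payload.ps1")
```

The point of the shape is that the steps, not the answers, are what get shared: anyone on the team can trigger the same analysis with one call, and every response lands in the saved transcript.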
Much like Bing, Security Copilot clearly cites its sources when security researchers ask for information on the latest vulnerabilities. Microsoft draws on information from the Cybersecurity and Infrastructure Security Agency, the National Institute of Standards and Technology’s vulnerability database, and its own threat intelligence database.
That doesn’t mean Microsoft’s Security Copilot will always get things right, though. “We know sometimes these models get things wrong, so we’re offering the ability to make sure we have feedback,” says Kawaguchi. The feedback loop is a lot more involved than just the thumbs-up or thumbs-down found on Bing. “It’s a little more complicated than that, because there are a lot of ways it could be wrong,” explains Kawaguchi. Microsoft will let users respond with exactly what’s wrong to get a better understanding of any hallucinations.
“I don’t think anyone can guarantee zero hallucinations, but what we are trying to do through things like exposing sources, providing feedback, and grounding this in the data from your own context is ensuring that it’s possible for folks to understand and validate the data they’re seeing,” says Kawaguchi. “In some of these examples there’s no correct answer, so having a probabilistic answer is significantly better for the organization and the individual doing the investigation.”
While Microsoft’s Security Copilot looks like a prompt and chatbot interface like Bing, the company has limited it to just security-related queries. You won’t be able to grab the latest weather information here or ask the Security Copilot what its favorite color is. “This is very intentionally not Bing,” says Kawaguchi. “We don’t think of this as a chat experience. We really think of it as more of a notebook experience than a freeform chat or general purpose chatbot.”
Security Copilot is the latest example of Microsoft’s big push with AI. The Microsoft 365 Copilot feels like it will forever change Office documents, and Microsoft-owned GitHub is supercharging its own Copilot into more of a chatty assistant to help developers create code. Microsoft doesn’t appear to be slowing down with its Copilot ambitions, so we’re likely to see this AI assistant technology appear throughout the company’s software and services.
Microsoft is starting to preview this new Security Copilot with “a few customers” today, and the company doesn’t have a date in mind for rolling this out more broadly. “We’re not yet talking about timeline for general availability,” says Kawaguchi. “So much of this is about learning and learning responsibly, so we think it’s important to get it to a small group of folks and start that process of learning and to make this the best possible product and make sure we’re delivering it responsibly.”