Microsoft announces Security Copilot, an AI assistant for cybersecurity professionals

After announcing an AI-powered Copilot assistant for Office applications, Microsoft is now turning its attention to cybersecurity.

Microsoft has announced a new artificial intelligence assistant called Security Copilot, designed specifically for cybersecurity professionals. The assistant can help defenders identify network intrusions and make sense of the massive volume of signals and data collected every day.

At the heart of Security Copilot are OpenAI’s GPT-4 generative AI and Microsoft’s own security-specific models. Security Copilot presents a simple chat-style prompt box, just like any other chatbot. You can ask “What security incidents are there in my organization?” and it will give you a summary. But behind the scenes, it draws on the 65 trillion signals Microsoft collects every day, along with Microsoft’s security expertise, to help security professionals track down threats.

Security Copilot is designed to complement, not replace, the work of security analysts, and it includes a notepad feature that lets colleagues collaborate and share information. Security professionals can use it to investigate incidents or to quickly summarize them and assist with reporting.

Security Copilot accepts natural language input, so security professionals can ask for a summary of a specific vulnerability, submit a file, URL, or code snippet for analysis, or pull event and alert information from other security tools. Every question and answer is saved, giving investigators a complete audit trail.
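Microsoft has not published API details in this announcement, so the sketch below is purely illustrative: it models the workflow described above, with every question or submitted artifact and its answer appended to an audit trail. All names here (InvestigationSession, AuditEntry, ask) are hypothetical and are not part of any real Security Copilot interface.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class AuditEntry:
    """One question/answer pair kept for the investigation's audit trail."""
    timestamp: str
    prompt: str
    response: str


@dataclass
class InvestigationSession:
    """Illustrative model of a Copilot-style session that records every exchange."""
    analyst: str
    audit_trail: List[AuditEntry] = field(default_factory=list)

    def ask(self, prompt: str) -> str:
        # Placeholder for the assistant call; a real deployment would send the
        # prompt (question, file, URL, or code snippet) to the service.
        response = f"[summary generated for: {prompt!r}]"
        self.audit_trail.append(
            AuditEntry(datetime.now(timezone.utc).isoformat(), prompt, response)
        )
        return response


# Usage: every question and answer lands in the audit trail automatically.
session = InvestigationSession(analyst="alice")
session.ask("Summarize the security incidents in my organization this week")
session.ask("Analyze this URL: https://example.com/suspicious")
for entry in session.audit_trail:
    print(entry.timestamp, entry.prompt)
```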

Results can be pinned and summarized into a shared workspace so colleagues can conduct threat analysis and investigation together.

One of the most interesting aspects of Security Copilot is the prompt book feature: essentially a set of steps or automations that can be packaged into a single, easy-to-use button or prompt. For example, a team could share a prompt that reverse engineers a script, so security researchers don’t have to wait for a specialist on their team to perform that kind of analysis. Security Copilot can even generate a PowerPoint slide outlining incidents and attack vectors.
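To make the idea concrete, here is a minimal sketch of what “packaging steps behind one prompt” could look like, assuming a hypothetical PromptBook structure and a stand-in ask function; none of this reflects Microsoft’s actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class PromptBook:
    """Illustrative packaging of several analysis steps behind one action."""
    name: str
    steps: List[str]

    def run(self, ask: Callable[[str], str], artifact: str) -> List[str]:
        # Feed each canned prompt to the assistant in order, substituting the
        # artifact (e.g. a suspicious script) into the step template.
        return [ask(step.format(artifact=artifact)) for step in self.steps]


# A shared "reverse engineer a script" book, as the article describes:
# one button runs the whole sequence instead of waiting for a specialist.
reverse_engineer = PromptBook(
    name="Reverse engineer script",
    steps=[
        "Deobfuscate the following script: {artifact}",
        "List the commands and network indicators it executes",
        "Summarize its likely intent for an incident report",
    ],
)

fake_ask = lambda prompt: f"[answer to: {prompt[:40]}]"
for answer in reverse_engineer.run(fake_ask, artifact="suspicious.ps1"):
    print(answer)
```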

As with Bing, when security researchers ask for the latest vulnerability information, Microsoft clearly cites the sources of the results. The information comes from vulnerability databases maintained by the Cybersecurity and Infrastructure Security Agency and the National Institute of Standards and Technology, as well as Microsoft’s own threat intelligence database.

But that doesn’t mean Microsoft’s Security Copilot is always right. “We know that sometimes these models can go wrong, so we provide the ability to make sure we have feedback,” said Chang Kawaguchi, Microsoft’s AI security architect. The feedback loop is more involved than the simple like or dislike buttons on Bing. “Because it can go wrong in so many ways,” Kawaguchi explained, Microsoft will ask users to describe exactly what went wrong so it can better understand any errors.
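The article does not specify how that richer feedback is structured, but a simple sketch of feedback that captures what went wrong, rather than a bare thumbs up or down, might look like the following; the FeedbackKind categories and field names are assumptions for illustration only.

```python
from dataclasses import dataclass
from enum import Enum


class FeedbackKind(Enum):
    """Coarse categories for what went wrong; richer than a like/dislike."""
    OFF_TARGET = "off-target"            # answered a different question
    FACTUALLY_WRONG = "factually wrong"  # the answer contained an error
    INCOMPLETE = "incomplete"            # missed part of the incident
    CONFIRMED_CORRECT = "confirmed correct"


@dataclass
class Feedback:
    """One structured report tying an exchange to a description of the error."""
    exchange_id: str
    kind: FeedbackKind
    details: str


# An analyst flags a wrong answer and explains exactly what was wrong,
# which is the kind of signal the article says Microsoft wants to collect.
report = Feedback(
    exchange_id="incident-42-q3",
    kind=FeedbackKind.FACTUALLY_WRONG,
    details="The summary attributed the alert to the wrong host.",
)
print(report.kind.value, "-", report.details)
```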

While Security Copilot resembles a Bing-style chatbot, Microsoft limits it to security-related queries. You can’t get the latest weather here, or ask it what its favorite color is.

Microsoft today began offering a preview version of Security Copilot to “several customers,” and the company has not announced a date for a wider rollout of the technology.