Microsoft’s new offering, Copilot for Security, announced earlier this year, is now generally available as a tool that helps security teams act on security tasks quickly using data within the Microsoft ecosystem. As one of the newer Copilots in Microsoft’s arsenal, it brings plenty of promise with its capabilities, but not without its share of constraints. It’s good to remember that, like all AI companions, its features are still being fine-tuned and improved over time. In this post, we’ll discuss what Copilot for Security is, how it intends to help security professionals, and who can benefit from using the tool. We will also discuss limitations users should be aware of and briefly cover its potential to become a useful tool in a world of ever-changing security threats.
1: Understanding Copilot for Security
What is Copilot for Security?
Copilot for Security is Microsoft’s generative AI-powered, cloud-based security analysis tool, built specifically to apply Microsoft Copilot in a security context. It accepts natural-language questions and generates actionable analysis using data from various Microsoft security offerings. It’s built on OpenAI models, including GPT-4, and grounds its responses in up-to-date security context. When you ask Copilot for Security a question about an incident, it presents a summary with actionable steps, real-time insights, and recommendations [2].
Copilot for Security integrates well with a wide range of services. From Entra to Sentinel, and even third-party tools like JAMF and CrowdSec, Copilot pulls data from security offerings such as Defender XDR to tailor its responses. It can even pull from the web or from custom plugins based on OpenAI. By integrating with existing security infrastructure, Copilot for Security offers a seamless experience that enhances the overall security posture without the disruption of switching between products.
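To make that flow a little more concrete, here is a minimal, purely illustrative Python sketch of the pattern described above: a natural-language question is routed to whichever integrated sources are relevant, and the combined context is summarized by the model. None of the names below come from a real Copilot for Security SDK; they are placeholders for the architecture, not an actual API.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical plugin registry: each "plugin" stands in for an integrated
# source such as Defender XDR, Sentinel, Entra, or a third-party tool.
@dataclass
class Plugin:
    name: str
    fetch: Callable[[str], str]  # returns raw findings for a question

def call_language_model(prompt: str) -> str:
    # Placeholder: in the real product this step is handled by the hosted service.
    return f"Summary and recommended actions based on:\n{prompt}"

def answer_security_question(question: str, plugins: list[Plugin]) -> str:
    """Illustrative flow only: gather context from each integration,
    then hand everything to a language model for a summarized answer."""
    context = [f"[{p.name}] {p.fetch(question)}" for p in plugins]
    prompt = f"Question: {question}\n\nEvidence:\n" + "\n".join(context)
    return call_language_model(prompt)

if __name__ == "__main__":
    demo = [Plugin("Defender XDR", lambda q: "3 related alerts found"),
            Plugin("Entra", lambda q: "1 risky sign-in for the affected user")]
    print(answer_security_question("Summarize incident 1234", demo))
```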
Key Features and Capabilities
Copilot for Security is meant to help seasoned professionals respond to threats quickly and to support junior resources alongside their security mentors. Think of Copilot as another seat on the security team, providing additional insight and coaching wherever needed.
Copilot for Security’s main capabilities include:
- Incident summaries – condense security alerts into summaries that shorten typical threat investigation work.
- Impact analysis – analyze which areas are affected by an incident.
- Reverse engineering of scripts – translate detected scripts into natural language.
- Guided responses – actionable, step-by-step guidance with relevant links.
Copilot for Security also offers a way to automate routine tasks using promptbooks. Promptbooks differ from traditional automation in that prompts sent to Copilot can be templatized for future use. This structure lets common prompts be grouped together into a streamlined process for a particular security task, allowing members of the security team to focus on more complex issues. At the same time, junior staff can use these promptbooks for step-by-step guidance, gaining experience by following the prompts left behind by more senior resources. Everyone on the security team can perform tasks faster and work through more data without needing to pivot to other security products for analysis.
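The following minimal sketch shows the idea of a promptbook as a reusable, parameterized sequence of prompts. The structure, field names, and sample incident values are illustrative only, not Security Copilot’s internal format.

```python
from string import Template

# Illustrative only: a "promptbook" modeled as an ordered list of prompt
# templates that share input parameters, so a routine task can be rerun
# consistently by anyone on the team.
suspicious_signin_promptbook = [
    Template("Summarize incident $incident_id, including affected users and devices."),
    Template("List sign-in attempts for $user in the last $days days and flag anomalies."),
    Template("Recommend step-by-step containment actions for incident $incident_id."),
]

def run_promptbook(promptbook, **params):
    """Fill in each template and return the prompts in execution order."""
    return [t.substitute(**params) for t in promptbook]

for prompt in run_promptbook(suspicious_signin_promptbook,
                             incident_id="INC-1042", user="jdoe", days=7):
    print(prompt)  # each prompt would be submitted to Copilot in sequence
```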
2: Benefits and Limitations
While Copilot for Security brings various capabilities with the use of AI, it’s not without its share of considerations:
Pricing
Since Copilot for Security is intended to serve as another member of the security team, organizations must pay an additional cost to stand up an instance. Copilot for Security operates on a pay-as-you-go basis, where organizations purchase Security Compute Units (SCUs) at $4 per hour each. You can scale the number of SCUs from 1 to 100, depending on your needs, with the flexibility to adjust usage during off-peak hours to manage costs. This can become pricey the more units are provisioned at a time.
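Using the published $4-per-hour SCU rate, a quick back-of-the-envelope calculation shows how the provisioning level drives monthly cost. The figures are estimates only; check current pricing for your region before budgeting.

```python
SCU_RATE_PER_HOUR = 4.00  # published pay-as-you-go rate per SCU, in USD

def monthly_cost(scus: int, hours: float = 730) -> float:
    """Estimated monthly cost for a constant number of provisioned SCUs
    (730 is roughly the average number of hours in a month)."""
    return scus * SCU_RATE_PER_HOUR * hours

for scus in (1, 3, 10):
    print(f"{scus} SCU(s): ~${monthly_cost(scus):,.0f} per month")
# 1 SCU: ~$2,920; 3 SCUs: ~$8,760; 10 SCUs: ~$29,200
```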
Prompting Limits
The more SCUs you provision, the more prompts you can issue in an hour. Microsoft recommends at least three compute units to avoid hitting the limit after only a few prompts, and our testing bears this out: with only one or two compute units, the number of continuous responses you can get within an hour is limited. Depending on the organization, the cost of adding SCUs may be outweighed by the benefits Copilot for Security can offer. It’s possible to increase and decrease the number of provisioned SCUs as needed, so organizations can drop to the minimum outside of business hours and scale back up to the original capacity during business hours. Still, organizations should take this price into account, as the resource needs to remain provisioned to retain prompting history. It is up to each organization to determine the best budget for Copilot for Security and to monitor pricing regularly to ensure they are getting the best value out of the service.
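To illustrate the effect of scaling down outside business hours, here is a small sketch comparing an always-on provisioning level with a scaled schedule. The 50-hour business week and the SCU counts are assumptions for the example, not recommendations.

```python
SCU_RATE_PER_HOUR = 4.00  # USD per SCU per hour

def weekly_cost(peak_scus: int, offpeak_scus: int,
                business_hours_per_week: float = 50) -> float:
    """Estimated weekly cost when SCUs are scaled down outside business hours.
    Assumes 50 business hours per week (10 h x 5 days); adjust for your schedule."""
    offpeak_hours = 7 * 24 - business_hours_per_week
    return (peak_scus * business_hours_per_week +
            offpeak_scus * offpeak_hours) * SCU_RATE_PER_HOUR

always_on = weekly_cost(peak_scus=3, offpeak_scus=3)
scaled = weekly_cost(peak_scus=3, offpeak_scus=1)
print(f"3 SCUs always on: ${always_on:,.0f}/week")
print(f"3 SCUs peak, 1 off-peak: ${scaled:,.0f}/week "
      f"(saves ${always_on - scaled:,.0f})")
```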
Proper Prompting and Verification
Detailed prompts are crucial for Copilot to provide meaningful responses. Microsoft’s guide on effective prompting [3] can help you craft prompts that yield the best results, and those prompts can be saved as promptbooks for recurring use. Human judgment is still a necessity: review every response before taking it as fact. As your organization uses Copilot for Security, be sure to provide feedback on responses so that the AI model can be trained on real-world use cases.
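As a simple illustration of what "detailed" means in practice, the snippet below contrasts a vague prompt with a more specific one in the spirit of Microsoft's guidance. The wording and the incident identifier are ours, not from the guide.

```python
# Illustrative only: the same question asked two ways. The specific version
# gives Copilot a goal, a scope, and an expected output format, which tends
# to produce a far more actionable response.
vague_prompt = "Tell me about this incident."

detailed_prompt = (
    "Summarize incident INC-1042 from Defender XDR: list the affected "
    "devices and users, the initial access vector if known, and the top "
    "three recommended containment steps as a numbered list."
)
```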
3: Closing
In conclusion, Copilot for Security is a promising addition to Microsoft’s suite, especially for larger organizations dealing with numerous security incidents. It’s not a replacement for human experts but a valuable assistant in the evolving landscape of cybersecurity threats. While Copilot for Security boasts advanced AI capabilities like incident summaries and guided responses, it’s important to weigh its cost against the benefits for your organization. Additionally, remember that effective use requires precise prompting and regular feedback to the AI model.
How can Copilot for Security help improve your security processes? Share your thoughts on improvements or how AI can transform security management in the comments below. Feel free to reach out to me on LinkedIn to discuss more about Copilot for Security and its features!
[1] Microsoft. (n.d.). Microsoft Copilot security. Retrieved June 17, 2024, from https://www.microsoft.com/en-us/security/business/ai-machine-learning/microsoft-copilot-security
[2] Microsoft. (2024, June 1). Microsoft Security Copilot. Retrieved June 17, 2024, from https://learn.microsoft.com/en-us/copilot/security/microsoft-security-copilot
[3] Microsoft. (n.d.). Create effective prompts. Retrieved June 17, 2024, from https://learn.microsoft.com/en-us/training/modules/security-copilot-getting-started/5-create-effective-prompts