Using Microsoft Copilot for Security and AI with Human-in-the-Loop

Microsoft have officially launched Microsoft Copilot for Security, making it generally available from 1st April 2024. We were proud to be part of the private preview, watching this innovative technology evolve, and it is now available to any organisation. In this article we share an overview of Copilot for Security and how we’re using and benefitting from it at Chorus, whilst ensuring that we adopt a human-in-the-loop approach for responsible and effective AI.

An overview of Microsoft Copilot for Security

Copilot for Security uses generative AI to help organisations and MSSPs better defend and protect themselves against cyber-attacks. AI represents a massive opportunity for the cybersecurity industry – however, it’s also a threat. Adversaries are increasingly using AI in novel ways as everyone works out its strengths and weaknesses (see: Crescendo (crescendo-the-multiturn-jailbreak.github.io)). Right now, this ranges from more realistic phishing emails at the simple end of the spectrum, through voice simulation for voice phishing over telephone calls, to at least one report of an AI-generated fake video call used to carry out a modern-day ‘CEO fraud’ attack. As a result, defenders must counter AI threats by exploring the power of AI themselves.

Copilot for Security is starting to enable organisations to use the power of AI to keep ahead of the pace and evolution of threats. Its benefits include machine speed and scale, the ability to rapidly add contextual information to incidents, and natural-language querying that simplifies working with security data. All of this helps security analysts work more efficiently and effectively. However, in its early form it shares the challenges found across AI more broadly, such as accuracy.

Using Human-in-the-Loop with AI

Using machine learning and automation is vital to help Cyber Security Operations Centres (CSOCs) detect and respond to threats. Automation (and, undoubtedly at some point, AI) is necessary to keep pace with the rising volume and frequency of security telemetry. However, the critical component of a CSOC is its people, which is why Human-in-the-Loop (HITL) is vital.

“Human in the Loop (HITL) is a design strategy that involves human expertise and intervention at various stages of an AI system’s operation.” (Source) We cannot think of AI as a means to replace people. AI must be used to empower people, taking away time-consuming tasks and feeding them more enriched and contextual data. This allows people to better use their skills, expertise, intuition and problem solving to detect patterns, recognise where further investigation is needed and spot what technology might miss.

How Chorus are using Microsoft Copilot for Security and AI

As an MSSP that helps MSPs rapidly deliver advanced MDR & MXDR services, our top priority is to reduce customer risk and to respond rapidly and effectively to cyber threats.

We use Copilot for Security to enhance – not replace – the work that our CSOC analysts do, using automation to augment and empower our people. Microsoft Copilot for Security supports our team by:

  • Saving valuable analyst time – By enriching incidents and enabling natural language prompts, our CSOC analysts can get the contextual information they need for an incident, saving valuable time. In December 2023, our Mean Time to Acknowledge (MTTA) was 2.67 minutes and our Mean Time to Close (MTTC) was 18.38 minutes. We are continually working to reduce these further with Copilot for Security and our own bespoke automation engine.
  • Increasing efficiency – By reducing repetitive and manual tasks, our analysts can focus on more complex work, increasing efficiency and making better use of both people and automation.
  • Helping with resourcing constraints – Many people are rightly keen to get into the cybersecurity space and levels of interest are at an all-time high. Copilot for Security can reduce the barrier to entry for junior security analysts by presenting relevant information efficiently, supporting their learning and ensuring consistent outcomes.

“Microsoft Copilot for Security is built into the tools that our CSOC analysts use daily. By surfacing contextual information quickly, our teams can access relevant information quickly and intuitively, removing context switching and saving valuable time.”

Mark Jones, Head of Cyber Security, Chorus

Microsoft Copilot for Security Use Cases

There are many recognised benefits to Microsoft Copilot for Security, but how is it actually used in practice? Here are a few use cases and scenarios showing how we are benefitting from Copilot for Security within our MDR & MXDR services:

  • Integrated and contextual information – By integrating our in-house KQL functions and libraries with Copilot, our analysts can use natural language to execute complex KQL queries (a hypothetical sketch of this kind of function follows this list). This gives our team access to relevant information in a more intuitive and efficient manner, without the need for context switching, as Copilot for Security is built into the tools that we use daily.
  • Consistent security outcomes – Custom Chorus promptbooks, aligned with our Standard Operating Procedures (SOPs), guide analysts through our processes and procedures for our content. This ensures consistency and accuracy in incident handling and makes it easier to onboard new analysts to our team.
  • Enriching Threat & Vulnerability Management – Copilot for Security can also enrich Defender Threat and Vulnerability Management with patch availability information, allowing us to align easily with our customers’ various compliance requirements. This provides greater context and insight into threat and vulnerability information, enabling us to deliver more targeted and relevant services to meet the unique needs of each customer.
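To make the first of these use cases more concrete, below is a minimal, hypothetical sketch of the kind of parameterised in-house KQL function described above. It is not Chorus’s actual library: the function name, parameters and thresholds are illustrative only, based on the standard SigninLogs table. The idea is that a natural-language prompt such as “show failed sign-ins for this user over the last day” can be mapped onto a tested, repeatable query rather than KQL written ad hoc under pressure.

  // Hypothetical example only – a parameterised in-house KQL function that a
  // natural-language prompt could be mapped onto, rather than hand-writing KQL.
  let FailedSignInsForUser = (userUpn: string, lookback: timespan) {
      SigninLogs
      | where TimeGenerated > ago(lookback)
      | where UserPrincipalName =~ userUpn
      | where ResultType != "0"               // non-zero ResultType = failed sign-in
      | summarize Attempts = count(),
                  DistinctIPs = dcount(IPAddress),
                  FirstSeen = min(TimeGenerated),
                  LastSeen = max(TimeGenerated)
                by UserPrincipalName, ResultDescription
      | order by Attempts desc
  };
  // A prompt such as "show failed sign-ins for jo.bloggs@example.com in the last
  // 24 hours" would then resolve to:
  FailedSignInsForUser("jo.bloggs@example.com", 24h)

Wrapping queries like this in named, parameterised functions is what makes a natural-language layer reliable: the prompt only needs to supply the parameters, while the query logic itself stays tested and consistent across analysts.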

Conclusion

There is no doubt that AI will play a vital role in cybersecurity operations and will continue to grow in importance. Because Microsoft Copilot for Security also covers Intune, Entra, Priva and Purview, it shows great promise for other areas of information security, and many use cases remain to be discovered. Data Protection Officers may find it helps with Data Subject Requests and eDiscovery searches; compliance analysts might use it to quickly understand DLP alerts and policy violations, as well as to gather evidence for audits; and IT administrators may find its insights into Microsoft Intune useful when troubleshooting device configuration and compliance issues.

Any Questions?

Microsoft Copilot for Security is the first solution of its kind to bring generative AI natively into the Microsoft Security stack to enhance CSOC capabilities. If you’d like to find out more about Microsoft Copilot for Security, AI or automation – whether within our MDR and MXDR services, or how you could use these technologies within your organisation – please get in touch.