Sharing Lessons Learned from Microsoft’s Joint Surveillance Audit
Published Apr 14 2023 11:00 AM
Microsoft

Microsoft recently announced the successful completion of its Joint Surveillance audit. As part of our regular cadence of sharing best practices across Microsoft, we have multiple CMMC communities. The Q&A below is an actual conversation from one of those information-sharing sessions, with any sensitive information removed.

 

Justin Orcutt: Microsoft, Aerospace and Commercial Defense Team 

Matt McKenna: Microsoft, Principal PM, Digital Security and Resiliency Services 

 

Justin: Let’s start from the beginning. How did we even end up doing a DIBCAC Assessment?  

 

Matt: We had known for a while that we wanted to be part of the voluntary, or pilot, CMMC assessments, so we had been preparing for that since version 1 of CMMC. We also knew that we could be selected at any time for a DIBCAC High Assessment, so when we received notification of the audit, they gave us the choice to perform a Joint Surveillance CMMC audit. Since we had been tracking the direction the DoD was moving, we knew what a Joint Surveillance audit was, had already engaged with a C3PAO, and had previously discussed the tradeoffs of performing one with our internal legal team and others. When given the opportunity, we took it.

 

Justin: So how did the audit start?  

 

Matt: After being notified and selecting a Joint Surveillance audit, we had a kickoff call at the beginning of the month. We then had until the end of the month to provide evidence. For any company, this tight turnaround for full evidence on every control is difficult. The good news, and what really helped us, is that we had just performed our own self-assessment a few months prior (currently an annual requirement), and we perform quarterly reviews of our SSP. As a result, previously collected evidence and documentation was mostly up to date. On top of that, and really the saving grace in this process, is that we take the self-assessment extremely seriously and have a third party conduct it instead of performing it on our own. We want to treat it with the same level of rigor we expect when being audited by a C3PAO and/or the DIBCAC. The harder and more honest you are with yourself during the self-assessment, the more prepared you will be; I believe that was clear to the assessors.

 

Ultimately, we followed the DoD methodology, performed our own due diligence and that made our life easier with the audit.  

 

Justin: Within CMMC you have Assessment Objectives, so did you map to that in the evidence package? In the evidence package did you call out any specific language that is contained within the practices and/or assessment objectives?  

 

Matt: 100%, we did it by assessment objective (AO). In addition to making sure the SSP details covered the AOs, we created what we call audit packages. Each package describes the processes we follow, maps them back to the controls and assessment objectives, and embeds evidence inline. It is like the SSP, but much simpler. As you know, cloud and zero-trust architectures can be challenging, so these audit packages helped. In a way, they give auditors the ability to double-click on implementation language from the SSP and see how it is implemented. Because evidence is embedded inline, an assessor can read the process description and see the associated evidence right next to it rather than flipping back and forth through multiple documents. They can of course still do that, but this is an easier way.

 

So, within each audit package the assessment objectives are listed by control. For each assessment objective, we describe how it is met using language from the SSP along with supporting evidence. We even highlighted portions of screenshots and artifacts and annotated which AO(s) they correspond to. Mapping artifacts is a very time-consuming but critical process.
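To make that structure concrete, here is a minimal sketch of how such an audit package might be organized in code. This is purely illustrative: the class names, AO identifiers, SSP text, and evidence entries are assumptions for the example, not Microsoft's actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class AssessmentObjective:
    ao_id: str                 # e.g., "3.1.1[a]" (hypothetical identifier)
    ssp_language: str          # implementation language pulled from the SSP
    evidence: list = field(default_factory=list)  # annotated artifacts

@dataclass
class AuditPackage:
    control_id: str            # e.g., "AC.L2-3.1.1" (hypothetical)
    objectives: list = field(default_factory=list)

    def unmet(self):
        """Flag objectives that have no supporting evidence mapped yet."""
        return [ao.ao_id for ao in self.objectives if not ao.evidence]

# Example package: one control, two objectives, one still missing evidence.
package = AuditPackage("AC.L2-3.1.1", [
    AssessmentObjective("3.1.1[a]",
                        "Authorized users are defined in directory groups.",
                        ["screenshot: group-membership.png (AO highlighted)"]),
    AssessmentObjective("3.1.1[b]",
                        "Processes acting on behalf of users are defined."),
])
print(package.unmet())  # -> ['3.1.1[b]']
```

A gap report like `unmet()` is one way to keep the AO-to-evidence mapping honest while assembling a package.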

 

Justin: A Joint Surveillance Audit is performed by the C3PAO and DIBCAC so which group seemed to be the one doing the bulk of the assessment? 

 

Matt: We had to submit evidence to both, and both were present for every meeting, but in our case the C3PAO primarily drove the interviews. Our auditor was Redspin, and it is our understanding that they had already performed multiple assessments through this process, so they had a good rhythm with DIBCAC and knew what DIBCAC wanted to see.

 

Justin: Since DIBCAC was involved, when is the next time Microsoft has to go through a DIBCAC audit?  

Matt: We operate under the assumption that it could be tomorrow. Given the ongoing requirement to perform yearly self-assessments, we don't see a material change in our posture regardless of how often they want to assess us.

 

Justin: I am curious how much of our own technology is used within our own CMMC boundary. Having met with Microsoft Security Response Center (MSRC) in the past I know Microsoft does not require our teams to use our own tools.  

 

Matt: We have a few custom tools we built, but it's predominantly first-party products. We knew when we started that we wanted it to be the north star for what an enterprise-class enclaved environment in this space would look like, so we embodied cloud-first and zero-trust principles. We like to eat our own dog food, and particularly for the Azure/M365 Gov products, our feedback to the product teams as we progress on our own journey has helped identify key features that were (or are still) needed to make things easy for customers.

 

Azure Active Directory (AAD) is fundamental to our design, and it is referenced in most of our control implementation summaries -- I cannot stress focusing on that enough. Endpoint Manager, Azure Virtual Desktop, and the Defender suite are other big ones. We use Purview Information Protection too, but it is a good example of where, in the context of implementing CMMC controls, it is okay for a solution to only help users in performing an otherwise administrative control or in detecting violations. Risk management is a huge part of CMMC, and how you tell the holistic compliance story, as I call it, is often overlooked. As an example, I have talked to some people outside Microsoft who ask whether Information Protection meets encryption/FIPS requirements but overlook that TLS traffic only has to have FIPS-validated modules on the server side, and that at-rest information in the Microsoft Cloud is protected first and foremost by physical controls in the FedRAMP High accredited datacenters. So really, on the customer side of the responsibility model, we are just looking at data at rest on physical end-user devices (e.g., laptops). This is met through BitLocker configuration (in FIPS mode), enforced via Endpoint Manager, with AAD Conditional Access policies permitting access only from healthy, compliant devices.

 

A long way of saying: look closely at where something sits in the line of defense. You might be compliant at the first or second layer, in which case the third layer is not relied upon to meet the compliance requirement but rather serves a security/risk goal.
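That layered reasoning can be sketched as a simple check: walk the ordered layers and report the first one that satisfies the compliance requirement; anything after it serves security goals rather than the compliance basis. The layer names below are illustrative examples drawn from the data-at-rest discussion above, not an authoritative mapping.

```python
def first_compliant_layer(layers):
    """layers: ordered list of (name, meets_requirement) tuples,
    outermost line of defense first. Returns the first layer that
    satisfies the requirement, or None if no layer does."""
    for name, meets in layers:
        if meets:
            return name
    return None

# Hypothetical data-at-rest example: the requirement is already met at
# the first layer, so later layers are security/risk goals, not the
# compliance basis.
data_at_rest = [
    ("FedRAMP High datacenter physical controls", True),
    ("BitLocker in FIPS mode on end-user devices", True),
    ("Purview Information Protection labeling", False),
]
print(first_compliant_layer(data_at_rest))
# -> FedRAMP High datacenter physical controls
```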

 

Another thing that was helpful, but not necessarily a Product, was the Microsoft CMMC technical reference guide.  

 

 

Justin: You named a few different areas that I think map to different teams. You have some that I would traditionally say are identity, data governance, endpoint, collaboration, and network teams. Sounds like many people were involved in the audit, and it took a lot of time to prepare. I could only imagine the time it took for those evidence packages when you have all those different teams and technologies.  

 

Matt: Let’s just say our teaming structure is set up in a way where it made perfect sense for Microsoft to dedicate a person or two just to coordinate. Like any company, working across teams is a challenge, and gathering the evidence that auditors need or will want is difficult; you must have a lead who oversees the effort and can communicate that end-to-end compliance story. This is one reason companies simply can’t afford to wait and should start now if they haven’t already.

 

This can be considered a three-year journey. I wouldn’t say we spent three years preparing; rather, we were prepared the whole time and had been maintaining that posture for a while. In that three-year period, we had a target of 95% readiness to go into an audit within 60 days, and that requires a lot of effort. We look at audit readiness differently from compliance. As a DIB company you always need to be compliant, but readiness is our ability to easily collect artifacts and clearly demonstrate operational practice on short notice, and at our scale that is a whole other thing.
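One assumed interpretation of a readiness target like this: the fraction of assessment objectives whose artifacts are current and demonstrable on short notice. The metric, threshold, and sample data below are illustrative, not how Microsoft actually computes readiness.

```python
def readiness(objectives, target=0.95):
    """objectives: list of dicts with 'evidence_current' and 'demonstrable'
    flags. Returns (score, meets_target)."""
    ready = sum(1 for ao in objectives
                if ao["evidence_current"] and ao["demonstrable"])
    score = ready / len(objectives)
    return score, score >= target

# Hypothetical sample: 19 of 20 objectives fully ready.
aos = ([{"evidence_current": True, "demonstrable": True}] * 19
       + [{"evidence_current": True, "demonstrable": False}])
score, audit_ready = readiness(aos)
print(f"{score:.0%}", audit_ready)  # -> 95% True
```

Tracking a metric like this per quarter is one way to operationalize "readiness" separately from "compliance."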

 

Justin: Wait, you just said something that I want to make sure I understand. 95% readiness to get audited within 60 days? Can you expand on that? I work with customers that are still struggling to define their CUI boundary, where CUI resides, what a CUI asset is, or how their business interacts with CUI.

 

Matt: Oh gosh. We knew we could be audited at any time; as defense contractors, the DFARS -7012 clause required compliance with NIST 800-171 and allowed an audit at any point. We took this seriously from the beginning to protect CUI, and we have been keeping this level of readiness for a while. This is not something you can prepare for overnight. They want to see these controls operational, and many of the controls require ongoing work; they are not set-once-and-forget. If you don’t know how something is implemented, can’t explain it, can’t demonstrate it, and don’t have artifacts to show, you are in trouble; you can’t exactly ask the DoD for extra time to gather your artifacts. The requirements are not new, and we knew it was a matter of time before we would be audited. Like I said earlier, we review our SSP quarterly, so we are constantly reviewing controls, talking to control owners, and keeping that readiness for an audit. It’s not helpful to have a control, policy, or procedure that no one knows about.

 

Again, it all goes back to having that strong overall compliance story, one with a solid foundation based on your architecture and security strategy, with layers built from there. The foundation and first layer of FedRAMP capabilities inherited from M365 and Azure Gov don’t change often. That approach enabled us to focus on just a small subset.

 

 

Justin: That level of readiness is interesting because it is how the military prepares and of course we are talking about DoD here. You never know when you have to go, when you will be tested or when there will be an audit.  

 

Matt: To steal a pandemic phrase, I think this is the new normal for security programs given the increasing regulatory compliance landscape. For a while, our teams had been preparing for a government audit while the goalposts kept moving (CMMC version 1 to 2, etc.). We worry about our teams burning out maintaining this level of readiness, but again, it is the new normal, so we must adapt. We are thankful that we got the audit out of the way because the assessment methodology is relatively new, and it was hard to know what to expect, so we didn’t want to make any major changes. Now we know, and we can take those lessons learned and look at ways to reduce the burden through automation and by refining compliance performance metrics and risk indicators. For instance, Azure Policy and Compliance Manager can help to a certain extent because they can avoid some of the manual evidence collection we have today. There are a variety of ways to make compliance easier by load-balancing preventative vs. detective controls and administrative vs. technical controls, so we look forward to investigating that.

 

 

Justin: That makes me think of Sentinel. Was that part of our scope?  

 

Matt: Yup.  

 

 

Justin: Do we use the CMMC workbook for some of the monitoring or metrics you mentioned earlier?  

 

Matt: No, not the CMMC workbook, since at the enterprise level we have well-established monitoring and response processes that we centralize on. Even so, we use many of Sentinel’s built-in connectors and detection rules, and having the data available there really helped during the audit itself. One thing the auditors wanted live was log pulls for assorted items, and being able to do that quickly was great. We needed to be quick on our toes, and that level of preparedness, being able to quickly address what the auditors were looking for or asking, is probably what helped us the most.

 

 

Justin: You have me thinking about another area that we frequently discuss with customers: MFA. I assume this was part of the audit? Customers have asked me about Windows Hello for Business being used as MFA, and of course it cannot be the only form of MFA. Did we use it to help with our CUI protections?

 

Matt: Yes, we do use Windows Hello for Business. You are likely to have discussions with customers about local sessions versus remote sessions for non-privileged and privileged functions. For remote access (e.g., M365), Windows Hello for Business requires the device’s certificate, which makes the physical laptop something you have. There is a nuanced debate that can take place for the MFA control, but we knew the control, what it required, how we implemented it, why we implemented it the way we did, and so on. Windows Hello for Business is not our only MFA; Microsoft Authenticator plays a key role here.

 

Justin: Knowing the MFA control can be nuanced what would you tell others to be prepared with?  

 

Matt: Start by focusing on how you define privileged and non-privileged functions, and know how privileged and non-privileged activities are performed within your boundary. Is local admin on a device a privileged account? Or is a privileged account something more like AAD Security Administrator or Global Administrator? How you define it matters.
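One way to make that definition explicit and testable is to enumerate which roles your SSP treats as privileged and check accounts against that set. This is an illustrative sketch only: which roles count as "privileged" is an organizational decision, and the role names below are just examples echoing the ones mentioned above.

```python
# Example definition -- your SSP decides what belongs in this set.
PRIVILEGED_ROLES = {
    "Global Administrator",     # AAD directory-wide admin
    "Security Administrator",   # AAD security admin
    "Local Administrator",      # local admin on a device (if you count it)
}

def is_privileged(assigned_roles):
    """An account is privileged if it holds any role the SSP
    defines as privileged."""
    return bool(set(assigned_roles) & PRIVILEGED_ROLES)

print(is_privileged(["User"]))                    # -> False
print(is_privileged(["Security Administrator"]))  # -> True
```

Writing the definition down this way makes the answer to "is local admin privileged?" an explicit, reviewable choice rather than an implicit one.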

 

 

Justin: This is really helpful and I appreciate you sharing all this information.  

 

Matt: Sure. There can be a lot of nuance in these controls, especially depending on your implementation; that can be an advantage or a disadvantage. FIPS encryption is another example that comes to mind. Customers need to be prepared for that, and I think organizations make it harder than it needs to be. This is where the cloud-native architecture and the FedRAMP-inherited controls we have really helped us.

 

 

Justin: It is amazing that we scored a perfect 110. I think it speaks to how you prepared and the entire team that worked on this. Congratulations. Looking back was there any control you were worried about given all these potential nuanced areas?  

 

Matt: Yes, and thanks! Lots of areas, but again it came down to preparation. We were able to demonstrate what we needed to and quote specific authoritative documents to defend what we did or didn’t do and why. Without that, I am not sure we would have passed. It is not enough to say you are or aren’t doing something a certain way without explaining why, how that meets the requirement, and, if needed, how it maps to other NIST publications or government guidance.

 

 

Justin: Thank you again for sharing your experience and practices. Any closing thoughts on what would be helpful for our customers?   

 

Matt: Take the time to get to know the controls, assessment objectives, and the intent behind them. The time to learn them is not during an audit. Know your compliance story, including the what, who, how, and why. Know how you have built your environment and why; your decisions should all be intentional. For us, Azure AD was that keystone piece because it is the boundary to everything, and everything keeps coming back to it. Having that identity in Azure Gov is significant for us, was intentional, and has downstream impacts on the protection of CUI. There is a lot involved with CMMC, and you must have layers of compliance defense; relying on just one thing for protection is a bad strategy in both security and compliance.

 

 

About the Authors  

 

 

Justin Orcutt is part of Microsoft's Aerospace and Commercial Defense Team helping Defense Industrial Base customers with Cybersecurity. Prior to joining Microsoft, Justin helped enterprise companies with achieving and demonstrating compliance with a variety of frameworks and standards like FedRAMP, HITRUST, PCI, NIST 800-171 and more. 

 

Matt McKenna is a Principal PM managing the strategy and delivery of Digital Security & Resiliency services for Microsoft Federal's operating environment. Matt directly oversaw the preparation and delivery of Microsoft Federal's Joint Surveillance Assessment earlier this year. Matt is extremely knowledgeable about CMMC and has first-hand experience in what it takes to achieve a perfect score.

 

