
Security, Compliance, and Identity Blog

Secure and Govern Your Custom-Built AI Apps with Microsoft Purview

Liz_Willets
Microsoft
May 21, 2024

The rise of generative AI unlocks new opportunities for developers to create groundbreaking applications. Studies show that 75% of organizations are more likely to adopt AI apps when they come with assurance mechanisms for secure and compliant use. This underscores the importance of building apps that can handle and govern sensitive data appropriately. Yet despite this growing demand for secure and compliant AI applications, developers often lack the security expertise and tooling to build these controls into custom-built applications. What developers need are easy-to-use APIs that let them build data security and compliance controls into their applications by design.

 

As consumers of AI applications, enterprises are concerned about data oversharing, data leakage, and non-compliant use of AI apps. Ensuring that your application meets enterprise needs for safeguarding against data risks is critical for enterprise adoption. Once an app is deployed, security teams want visibility into which GenAI applications are being used, how often, by whom, and what kind of sensitive data is being shared with them.

 

On top of that, end users want clear visibility into the confidentiality of data referenced by AI applications. Ensuring that end users can clearly see the sensitivity label of any files referenced by your GenAI app is imperative. This visual cue tells the user that the application is interacting with a sensitive document, which is critical to maintaining data integrity and complying with their organization’s data handling obligations.

 

Today, we are excited to announce new innovations from Microsoft Purview to help developers build enterprise-grade security and compliance controls into their custom-built AI apps: 

  • Microsoft Purview integration in Copilot Studio (public preview) and Azure AI Studio (coming soon) brings data security and compliance features to developers building in those studios. This integration provides visibility into when an application accesses sensitive data by recognizing and honoring the sensitivity labels of the data being accessed. It also protects sensitive data generated by the app through label inheritance and honors label permissions, limiting data access to authorized users only. Additionally, it facilitates governance of app development by providing audit logging for developer activities.
  • Build enterprise-grade data security and compliance controls with the Purview SDK (coming soon), a set of easy-to-integrate APIs for pro-code developers that enable data security, compliance, and governance controls with just a few lines of code.

 

Microsoft Purview integration in Copilot Studio (public preview) and Azure AI Studio (coming soon) 

For developers looking to get started today, we are thrilled to announce the integration of Microsoft Purview capabilities in Copilot Studio (public preview) and Azure AI Studio (coming soon). With this integration, Microsoft Purview capabilities come built-in so that when you build your custom apps in Copilot Studio or Azure AI Studio, your enterprise customers and end users get best-in-class security and governance features, including:

  • Discover data risks in AI interactions: Enhance end-user confidence by providing visibility into the sensitivity label of the data referenced from SharePoint in responses from your custom-built Copilots and GenAI apps.
  • Protect sensitive data with encryption: Ensure that app-generated responses inherit the sensitivity label of the files referenced and are encrypted accordingly. Additionally, ensure that your AI applications respect user permissions and sensitivity labels, limiting access to sensitive data to authorized users only. This builds trust with your customers, as they know their data is handled according to their security policies.
  • Capture AI activities: Log developer activities during the creation of custom-built applications to understand which data sources were enabled, whether GenAI answers were enabled on those sources, and more. This ensures comprehensive oversight and transparency for enterprises purchasing your application, so they can maintain control over their data and the applications interacting with it. An illustrative sketch of what such an activity record might contain follows Figure 1 below.

Figure 1: Copilot Studio can inherit labels from the referenced files, honor the permission controls associated with the label, and enhance users’ awareness of the sensitivity of the content.
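To make the auditing bullet above more concrete, here is a minimal sketch of what a captured developer-activity record might look like once it lands in an audit review. The field names (operation, data_source, genai_answers_enabled, and so on) are illustrative assumptions for this post, not the actual Microsoft Purview audit schema.

```python
# Illustrative only: a hypothetical developer-activity record, not the real
# Microsoft Purview audit schema. Every field name here is an assumption.
from datetime import datetime, timezone

developer_activity_record = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "developer": "alex@contoso.com",            # who made the change
    "application": "ContosoHRAssistant",        # the custom-built app being edited
    "operation": "DataSourceAdded",             # e.g. a SharePoint site was connected
    "data_source": "https://contoso.sharepoint.com/sites/HR",
    "genai_answers_enabled": True,              # were GenAI answers turned on for it?
}

def summarize(record: dict) -> str:
    """Render a one-line summary an admin could scan during an audit review."""
    return (
        f"{record['timestamp']} {record['developer']} performed {record['operation']} "
        f"on {record['application']} (source={record['data_source']}, "
        f"genai_answers={record['genai_answers_enabled']})"
    )

print(summarize(developer_activity_record))
```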

 

Try out these capabilities in Copilot Studio today! These same capabilities will also be released in Azure AI Studio in the coming months. 

 

Leverage Purview SDK for enterprise-grade security and compliance (coming soon) 

Our vision is to equip developers with tools to secure and govern custom-built AI applications so that they can provide the same level of enterprise-grade security and compliance to their customers as we do with our own Microsoft Copilots. The Purview SDK (coming soon) will enable developers to seamlessly integrate data security and compliance features into their custom-built apps. With just a few lines of code, developers can provide their customers with insights into the custom-built application’s usage, protective controls to reduce data oversharing and data leakage risks, and compliance capabilities to detect and govern non-compliant usage effectively.

 

From the outset of development, the SDK will provide all the necessary integration and connectivity with Microsoft Purview services, such as Information Protection and Communication Compliance. This includes granting visibility into sensitive data shared with custom-built apps, implementing protection controls such as label inheritance on responses, and integrating compliance capabilities to flag and investigate potential compliance breaches. With a few lines of code, developers can access the full range of security and compliance capabilities, ensuring that their custom-built AI applications are safeguarded from inception.
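Because the Purview SDK has not shipped yet, the sketch below only models the integration pattern described above. The PurviewClient stub, its methods (get_label, apply_label, log_interaction), and the SensitivityLabel shape are placeholders standing in for whatever the released APIs provide, not the real API surface.

```python
# Hypothetical sketch only: the Purview SDK is not yet released, so the
# PurviewClient below is a local stand-in that models the pattern described
# in this post (discover labels, protect via label inheritance, govern via audit).
from dataclasses import dataclass

@dataclass
class SensitivityLabel:
    name: str
    priority: int          # higher = more restrictive
    encrypt: bool

class PurviewClient:
    """Placeholder for the forthcoming SDK client; names are assumptions."""

    def get_label(self, file_path: str) -> SensitivityLabel:
        # Real SDK: look up the file's sensitivity label from Purview.
        return SensitivityLabel(name="Confidential", priority=3, encrypt=True)

    def apply_label(self, text: str, label: SensitivityLabel) -> str:
        # Real SDK: inherit the label (and its encryption/permissions) on the output.
        return f"[{label.name}] {text}"

    def log_interaction(self, user: str, prompt: str, label: SensitivityLabel) -> None:
        # Real SDK: emit an audit record so admins can monitor sensitive-data usage.
        print(f"audit: {user} received {label.name}-labeled content")

def answer_with_protection(client: PurviewClient, user: str,
                           prompt: str, referenced_files: list[str]) -> str:
    labels = [client.get_label(f) for f in referenced_files]      # discover
    response = "...model output..."                               # your app's generation step
    strictest = max(labels, key=lambda lbl: lbl.priority)         # most restrictive label wins
    protected = client.apply_label(response, strictest)           # protect via inheritance
    client.log_interaction(user, prompt, strictest)               # govern via auditing
    return protected

client = PurviewClient()
print(answer_with_protection(client, "maria@contoso.com",
                             "Summarize the latest HR policy",
                             ["/hr/policy_2024.docx"]))
```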

 

The SDK will enable developers to evaluate, improve, and validate that the data security and compliance controls in their apps are working during both development and deployment. Testing and validation are critical for ensuring that your app is ready to operate in production environments, facilitating a smoother transition to deployment. And once deployed, the SDK will provide ongoing visibility so developers can proactively monitor that data security controls continue to work as intended.
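To make that validation step concrete, here is a minimal pre-deployment check in the same hypothetical style, reusing the placeholder PurviewClient and answer_with_protection sketch above; the test name and sample data are invented for illustration.

```python
# Hypothetical validation sketch (assumes the PurviewClient and
# answer_with_protection placeholders from the previous example are in scope).
def test_response_inherits_strictest_label():
    client = PurviewClient()
    result = answer_with_protection(
        client,
        user="tester@contoso.com",
        prompt="Summarize the Q4 compensation review",
        referenced_files=["/hr/comp_review_q4.docx"],
    )
    # The stub labels every file "Confidential", so the answer must carry that label.
    assert result.startswith("[Confidential]")

test_response_inherits_strictest_label()
print("label-inheritance check passed")
```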

 

With the Purview SDK, developers can apply Purview capabilities across the entire application lifecycle, instilling confidence in enterprise customers when purchasing and adopting custom-built AI applications. Stay tuned for additional details coming soon!

 

Figure 2: Simply add a few lines of code to leverage Microsoft Purview capabilities in your custom-built application.

 

Get started 

  • Try out the Microsoft Purview integration in Copilot Studio today! Aka.ms/CopilotStudio 
  • Join us at Microsoft Build this week where we will show these capabilities live in action!
  • Look out for announcements coming soon on Purview SDK and Azure AI Studio! 