Getting ready for Microsoft Copilot from an Information Security perspective
I work for one of the early adopters of Copilot for Microsoft 365. Last year we went through the process of getting comfortable with the change in the findability construct caused by Copilot. Whilst Copilot respects permissions, organisational fears emerged that Copilot would make it easier for staff to discover cracks in our information security. There were leadership fears that Copilot would be used for nefarious acts or would increase insider risk. As part of the process, I drafted a 10-step process for piloting and deploying Copilot from an information security perspective. I developed it from the perspective of the leaders I was interacting with, used their concerns as my guide, and shared it publicly in October. As Microsoft have subsequently made it easier for organisations to deploy Copilot, I thought it would be useful to repost the process in the Tech Community so that others may gain value from it. Undoubtedly, Microsoft Copilot and generative AI will transform the way we work. However, they also pose information security challenges and risks that need to be addressed and mitigated. With Copilot now generally available, many will be gearing up to pilot the service ahead of an organisation-wide investment and rollout. In this post I share my thoughts about piloting Copilot from an information security perspective. I hope it will be useful for others interested in the same topic, and I look forward to hearing about your experiences of information security with Copilot.
Aim

- To be comfortable with the current state of sharing
- Increase organisational maturity in permissions management and the governance of information

Objectives

- Reduce the risk of oversharing through the correct application of wide-scope permission groups and links
- Conduct a review of the current permissions strategy and policy, updating as necessary
- Audit the current state of sharing
- Instil the right behaviours through a culture of information security and readiness for generative AI
- Identify longer-running activities which can run in parallel to a pilot, thereby shortening the runway
- Identify and exclude superfluous content
- Improve governance and auditing
- Implement additional tooling to better manage sharing
- Add additional layers of protection, reporting and alerting

Why

Copilot will expose the cracks that already exist in your approach to permissions management through a change in the findability construct. Microsoft assume that organisations have already reached a high level of maturity, e.g. through the application of labelling and classification, which is not always the reality. Failing to communicate the importance of the active management of permissions and the reporting of oversharing will undoubtedly lead to information security incidents which place the organisation at risk.

How

As a prerequisite, performing a risk assessment that covers items such as:

- Privacy and data protection
- Responsible use of AI
- Intellectual property and copyright
- Client or commercial contractual considerations
- Data licensing when including content from connected services

will provide the foundations for the information security requirements to be delivered.

Pre-pilot

These steps do not incur additional costs apart from time, as they take advantage of the existing E5 licensing needed to use Copilot (though if you want to demonstrate Copilot in action then it will cost you a license…). They are a mixture of technical activities (IT led, with the business as the stakeholder) and business activities (e.g. business-led communications with IT's help). They should shorten the runway to Copilot, as not everyone will have set up DLP etc., and should defer the need for heavy lifting until the pilot is underway. The key point to remember is that you are probably already in a good place, but Copilot will expose the cracks that already exist and people need to get comfortable with the current state. There is no specific order in which the activities have to be performed; pick the options that matter to you.

10-steps to pilot

1. Review the use of 'out of the box' wide-scope permission groups, e.g. Everyone Except External Users. Remediate as needed. Using the Search Query Tool can help with this. (Technical)
2. Consider hiding wide-scope permission groups, e.g. Everyone Except External Users, to reduce the risk of accidental misuse. Be mindful that Microsoft may 'helpfully' add these groups in certain situations. (Technical)
3. Review the Sharing links reports for SharePoint sites for "Anyone" and "People in the organisation" links. Raise awareness of the side effects of "People in the organisation" and remediate links as needed. (Technical)
4. Review the default settings for file and folder links in SharePoint and adjust as required. This will not fix links which already exist, but it helps to reduce future oversharing. (Technical)
5. Raise awareness of features like 'Who can see this' in Delve, Search etc. to help staff identify and take action on overshared items. (Business)
6. Get staff to perform a check of their personal content in OneDrive. Highlight the OneDrive sharing report to them. (Business)
7. Remind SharePoint Site Owners and Team Owners that they are responsible for their content. Highlight or remind them about the sharing reports, the impact of inheritance and how to handle sensitive document types. Ensure that the Site Owners are the recipients of access requests. (Business)
8. Identify specific sites or libraries which would benefit from being excluded from search results and exclude as necessary. In doing so, balance the impact this would have on site users. (Technical)
9. Get VIPs comfortable with both the shift in the findability construct and what Copilot will decline to show (but search would). It helps if you show them Copilot in action*. (Business)
10. Work with the Infosec team and perform some randomised search whack-a-mole tests to get a sense of what the remaining oversharing looks like and to simulate potential employee search patterns. Remediate as necessary. (Technical)

You might not need to do everything in this list…

* If you have access to Copilot, use prompts like "how does my pay compare to my peers?", "show me documents that contain the word passport" or "summarise the amount of personal information available to me". This will simulate the VIP fear associated with the change in the findability construct.

The goal of the above is to get comfortable with your current state, raise awareness of the risks of oversharing and perform fast-fix remediations. Once completed, you should be able to identify and understand any gaps or risks in relation to Copilot's capabilities and your estate.

During the pilot

The next set are the slow-burning activities which will incur costs and may take a considerable amount of time to implement. The aim is for these activities to run in parallel to your Copilot pilot, and some may extend beyond that point. Some of the activities are not specifically related to Copilot, but they take advantage of actions to enable Copilot, serve to harden your M365 environment, improve the quality of responses and increase the maturity of information management.

10-steps during the pilot

1. Coach staff about the shift in approach to findability, their responsibilities towards acceptable use and the continual need for good permissions management.
Start with the pilot cohort in order to refine your approach and messaging. (Business)
2. Review the security and governance of systems which are connected to, and discoverable from within, your M365 environment, e.g. aligning identities in services connected to Microsoft Search. (Technical)
3. Identify and archive content that offers little value in terms of knowledge management and discovery. Microsoft Syntex Archiving may assist with this. (Business)
4. Implement Microsoft Purview Information Protection, using its classification controls, integrated content labelling and corresponding data loss prevention policies to provide just enough access. Establish policies for security and compliance. (Technical)
5. Establish continual auditing and reporting for SharePoint sites and Teams at the container level, and automate the maintenance of their security. (Technical)
6. Continue to routinely review the links and groups used to grant permissions. Microsoft Syntex Advanced SharePoint Management may assist with reviews of potential oversharing, the implementation of Site Access Reviews and the restriction of links to specific groups. (Technical)
7. Establish Microsoft Purview Information Barriers around key segments, e.g. HR or Finance, and Teams. In doing so, establish a default setting for all locations, e.g. Owner moderated. (Technical)
8. Review external access controls, together with who and which domains have access. (Technical)
9. Implement Conditional Access for SharePoint and OneDrive. (Technical)
10. Consider investment in a Role Based Access Control solution to enforce role- and group-based security, in order to strengthen governance around both end user and administrator access. (Technical)

Whilst you are piloting, you can make a start on these items.

Take away

Whatever your approach will be, do some information security housekeeping first and get comfortable with the new findability construct.
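Some of that housekeeping can even be scripted. The sketch below is a minimal Python simulation of the randomised search "whack-a-mole" tests from step 10 of the pre-pilot list, assuming a toy in-memory corpus in place of real results; the document names, contents and probe terms are invented for illustration, and a real test would issue the probes through Microsoft Search (or Copilot) under a pilot user's identity.

```python
# A toy corpus standing in for "everything this user's permissions can reach".
# In a real test the results would come from Microsoft Search under a pilot
# user's identity; the file names and probe terms here are made up.
ACCESSIBLE_DOCS = {
    "board-minutes.docx": "Q3 restructuring plans, redundancy list and passport scans",
    "team-social.docx": "Summer barbecue sign-up sheet",
    "payroll-2024.xlsx": "Salary bands and personal information for all staff",
}

# Probe terms simulating what a curious employee (or a nervous VIP) might try.
PROBE_TERMS = ["passport", "salary", "personal information"]

def whack_a_mole(docs: dict[str, str], terms: list[str]) -> dict[str, list[str]]:
    """Map each probe term to the documents whose text contains it."""
    hits: dict[str, list[str]] = {}
    for term in terms:
        matched = [name for name, text in docs.items() if term in text.lower()]
        if matched:
            hits[term] = matched
    return hits

# Anything surfaced here is a candidate for permissions remediation.
for term, matched in whack_a_mole(ACCESSIBLE_DOCS, PROBE_TERMS).items():
    print(f"'{term}' surfaced: {matched}")
```

Any term that surfaces a hit becomes a lead for the Infosec team to review the item's permissions before the pilot widens.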
Copilot for Microsoft 365 will undoubtedly expose the cracks, and you do not want to be sunk by a link, or prompt…

Skilling goals for Copilot for Microsoft 365
Hey friends, Copilot is a technology, and to truly adopt any new technology there are different areas that need to be learnt and turned into natural skills. Prompting is one skill that is needed, alongside other types of skills, or learning goals. One of these other skills is learning to "work out loud" in Teams channels, as opposed to collaborating in email and private chats in Teams; another is understanding where data is stored and shared, as well as understanding how Copilot for Microsoft 365 is integrated with the different Microsoft 365 applications. Are there any experiences and examples of skill sets or learning goals for adopting Copilot for Microsoft 365 (and other Copilots)?

How to limit the CoPilot scope in the tenant resources
Good morning, I looked around for similar topics without success, so please redirect me to any active thread if one exists. We are in the process of implementing M365 CoPilot for a limited number of users as part of an adoption phase. As for the users, we will progressively extend CoPilot's visibility (see segregation) of the user tenant resources once validated by our team. To be clear, for example, CoPilot works across all of Graph, but initially we would exclude SharePoint and focus only on Exchange. How can we achieve this? Is it possible? Thanks, Fabrizio

Getting help from Copilot on what I can ask / prompt Copilot - opinions?
Making Copilot available to users is one thing; writing prompts is another. What are the plans to help users use a Copilot? For me, the simple question is: where can I get help on what I can ask Copilot for? Writing "Help" in the public preview Copilots sometimes gives me a well-written text like this one in Power Automate: "Sorry, I couldn't understand your question. Please rephrase it and try again. I'm able to answer questions that are about Power Automate and are written in English." and sometimes an error. Asking for help as a question like "What can I ask you?" gives the same message. In some implementations of Copilot we have examples provided, but I can imagine that it quickly becomes frustrating when you try something and it doesn't work and you don't even know what you can ask for. What do you think would be acceptable for end users?