Copilot for Microsoft 365
Getting ready for Microsoft Copilot from an Information Security perspective
I work for one of the early adopters of Copilot for M365. Last year we went through the process of getting comfortable with the change in the findability construct caused by Copilot. Whilst Copilot respects permissions, organisational fears emerged that Copilot would make it easier for staff to discover cracks in our information security, and there were leadership fears that Copilot would be used for nefarious acts or would increase insider risk. As part of the process, I drafted a 10-step process for piloting and deploying Copilot from an information security perspective. I developed it from the perspective of the leaders I was interacting with, used their concerns as the guide, and shared it publicly in October. As Microsoft have subsequently made it easier for organisations to deploy Copilot, I thought it would be useful to repost my process in the Tech Community so that others may gain value from it. Undoubtedly, Microsoft Copilot and generative AI will transform the way we work. However, they also pose some information security challenges and risks that need to be addressed and mitigated. With Copilot now generally available, many will be gearing up to pilot the service ahead of an organisation-wide investment and rollout. In this post I share my thoughts about piloting Copilot from an information security perspective. I hope it will be useful for others who are interested in the same topic, and I look forward to hearing about your experiences of information security with Copilot.
Aim

- To be comfortable with the current state of sharing
- To increase organisational maturity in permissions management and the governance of information

Objectives

- Reduce risks of oversharing through the correct application of wide scope permission groups and links
- Conduct a review of the current permissions strategy and policy, updating as necessary
- Audit the current state of sharing
- Instil the right behaviours through a culture of information security and readiness for generative AI
- Identify longer-running activities which can run in parallel to a pilot, thereby shortening the runway
- Identify and exclude superfluous content
- Improve governance and auditing
- Implement additional tooling to better manage sharing
- Add additional layers of protection, reporting and alerting

Why

Copilot will expose the cracks that already exist in your approach to permissions management through a change in the findability construct. Microsoft assume that organisations have already reached a high level of maturity, e.g. through the application of labelling and classification, which is not always the reality. Failing to communicate the importance of the active management of permissions and the reporting of oversharing will undoubtedly lead to information security incidents which place the organisation at risk.

How

As a prerequisite, perform a risk assessment that covers items such as:

- Privacy and data protection
- Responsible use of AI
- Intellectual property and copyright
- Client or commercial contractual considerations
- Data licensing when including content from connected services

This will provide the foundations for the information security requirements to be delivered.

Pre-pilot

These steps do not incur additional costs apart from time, as they take advantage of the existing E5 licensing needed to use Copilot (though if you want to demonstrate Copilot in action then it will cost you a license…). They are a mixture of technical activities (IT led, with the business as the stakeholder) and business activities (e.g. business-led communications with IT's help). They should shorten the runway to Copilot, as not everyone will have set up DLP etc., and should defer the need for heavy lifting until the pilot is underway. The key point to remember is that you are probably already in a good place, but Copilot will expose the cracks that already exist and people need to get comfortable with the current state. There is no specific order in which the activities have to be performed; pick the options that matter to you.

10-steps to pilot

1. Review the use of 'out of the box' wide scope permission groups, e.g. Everyone Except External Users. Remediate as needed. Using the Search Query Tool can help with this. (Technical)
2. Consider hiding wide scope permission groups, e.g. Everyone Except External Users, to reduce risks around accidental misuse. Be mindful that Microsoft may 'helpfully' add these groups in certain situations. (Technical)
3. Review the Sharing links reports for SharePoint sites for "Anyone" and "People in the organisation" links. Raise awareness of the side effects of "People in the organisation" and remediate links as needed. (Technical)
4. Review the default settings for file and folder links in SharePoint and adjust as required. This will not fix links which already exist, but it helps to reduce future oversharing. (Technical)
5. Raise awareness of features like 'Who can see this' in Delve, Search etc. to help staff identify and take action on overshared items. (Business)
6. Get staff to perform a check of their personal content in OneDrive. Highlight the OneDrive sharing report to them. (Business)
7. Remind SharePoint Site Owners and Team Owners that they are responsible for their content, and highlight or remind them about the sharing reports, the impact of inheritance, and how to handle sensitive document types. Ensure that the Site Owners are the recipients of access requests. (Business)
8. Identify specific sites or libraries which would benefit from being excluded from search results and exclude as necessary. In doing so, balance the impact this would have on site users. (Technical)
9. Get VIPs comfortable with both the shift in the findability construct and what Copilot will decline to show (but search would). It helps if you show them Copilot in action*. (Business)
10. Work with the Infosec team and perform some randomised search whack-a-mole tests to get a sense of what the remaining oversharing looks like and to simulate potential employee search patterns. Remediate as necessary. (Technical)

You might not need to do everything in this list…

* If you have access to Copilot, use prompts like "how does my pay compare to my peers?", "show me documents that contain the word passport" or "summarise the amount of personal information available to me". This will simulate the VIP fear associated with the change in the findability construct.

The goal of the above is to get comfortable with your current state, raise awareness of the risks of oversharing and perform fast-fix remediations. Once completed, you should be able to identify and understand any gaps or risks in relation to Copilot's capabilities and your estate.

During the pilot

The next set are the slow-burning activities which will incur costs and may take a considerable amount of time to implement. The aim is for these activities to run in parallel to your Copilot pilot, and some may extend beyond that point. Some of the activities are not specifically related to Copilot, but they take advantage of actions to enable Copilot, serve to harden your M365 environment, and improve both the quality of responses and the maturity of information management.

10-steps during the pilot

1. Coach staff about the shift in approach to findability, their responsibilities towards acceptable use and the continual need for good permissions management. Start with the pilot cohort in order to refine your approach and messaging. (Business)
2. Review the security and governance of systems which are connected to, and discoverable from within, your M365 environment, e.g. aligning identities in services connected to Microsoft Search. (Technical)
3. Identify and archive content that offers little value in terms of knowledge management and discovery. Microsoft Syntex Archiving may assist with this. (Business)
4. Implement Microsoft Purview Information Protection, using its classification controls, integrated content labelling and corresponding data loss prevention policies to provide just enough access. Establish policies for security and compliance. (Technical)
5. Establish continual auditing and reporting for SharePoint sites and Teams at the container level, and automate the maintenance of their security. (Technical)
6. Continue to routinely review the links and groups used to grant permissions. Microsoft Syntex Advanced SharePoint Management may assist with the reviews of potential oversharing, the implementation of Site Access Reviews and the restriction of links to specific groups. (Technical)
7. Establish Microsoft Purview Information Barriers around key segments, e.g. HR or Finance, and Teams. In doing so, establish a default setting for all locations, e.g. Owner moderated. (Technical)
8. Review external access controls, together with who and which domains have access. (Technical)
9. Implement Conditional Access for SharePoint and OneDrive. (Technical)
10. Consider investment in a Role Based Access Control solution to enforce role- and group-based security, in order to strengthen governance around both end user and administrator access. (Technical)

Whilst you are piloting, you can make a start on these items.

Take away

Whatever your approach will be, do some information security housekeeping first and get comfortable with the new findability construct.
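The sharing-link review in pre-pilot steps 3 and 4 boils down to triaging links by scope so that the widest-reaching ones are remediated first. Here is a minimal sketch, assuming permission objects shaped like Microsoft Graph permission resources with a link facet (scope values anonymous, organization, users); the sample data below is invented for illustration:

```python
# Sketch: triage sharing links by scope, riskiest first.
# Assumes permission objects shaped like Microsoft Graph "permission"
# resources carrying a "link" facet; the sample entries are invented.

RISK_ORDER = {"anonymous": 0, "organization": 1, "users": 2}

def triage_links(permissions):
    """Return link permissions sorted riskiest-first.

    Keeps only entries that carry a sharing link and sorts them so
    "Anyone" (anonymous) links surface before "People in the
    organisation" links, which surface before direct-people links.
    """
    links = [p for p in permissions if "link" in p]
    return sorted(links, key=lambda p: RISK_ORDER.get(p["link"]["scope"], 99))

sample = [
    {"id": "1", "link": {"scope": "users", "type": "view"}},
    {"id": "2", "link": {"scope": "anonymous", "type": "edit"}},
    {"id": "3", "roles": ["owner"]},  # direct permission, not a link
    {"id": "4", "link": {"scope": "organization", "type": "view"}},
]

for p in triage_links(sample):
    print(p["id"], p["link"]["scope"])
```

Fed with the permissions pulled for each drive item, a report like this surfaces "Anyone" links for remediation ahead of organisation-wide ones.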
Copilot for Microsoft 365 will undoubtedly expose the cracks and you do not want to be sunk by a link, or prompt…

Copilot for Word
Hi there, I've been playing around with Copilot for Microsoft 365 for quite some time and I have a quick question. Let's say I'm using Copilot in Word, I ask it to create a document for me, I review the result and click Regenerate because I'm not happy with it. My question is: how many times can I ask Copilot to create a new document for me? Is there a limit? For instance, can I click it 5 times and get different content every time, and beyond that will it return to the content it created the first time? Please let me know the limit here.

Drafting in Microsoft Word - Images
I am using Copilot to draft an end-user guide in Microsoft Word. The step-by-step instructions are great, and explanations of some of the application's features (Teams Audio Conferencing) are included! Issue: after the first draft, I ask Copilot to add screenshots. It creates the next draft, but each image it tries to add simply shows the text "undefined". It looks like it is attempting to add images, and in the correct spots, but the images are not being added or coming through. Any recommendations for resolving this issue? Draft snippet showing the issue:

Sharing your screen or content in a Teams meeting
You can share your screen or content in a Teams meeting to collaborate with others or present your ideas. Here are the steps to share your screen or content in a Teams meeting:
1. Click or tap on the share icon on the meeting toolbar. undefined
2. You will see a menu with different options to share your screen or content. undefined
3. You can choose to share your entire screen, a specific window, a PowerPoint presentation, a whiteboard, or a file from your computer or Teams. undefined
4. Click or tap on the option you want to share and wait for it to appear on the meeting screen. undefined
5. You can stop sharing by clicking or tapping on the stop sharing icon on the meeting toolbar. undefined

Questions on Restricted SharePoint Search
Hello, I have questions on Restricted SharePoint Search, which limits Copilot to an allowed list of OneDrive and SharePoint sites. 1. If a user's OneDrive is excluded from the allowed site list, can that user still refer to documents in their own OneDrive? 2. If yes to question 1, can the user use Copilot on the documents they are working on in that excluded OneDrive? Thank you.

Security Copilot
We are interested in the Security Copilot solution. It appears to be available only in Early Access for now. We currently hold ME3 licences; what are the conditions for eligibility? What is the cost of this solution? Are the security commitments the same as for the Copilot we have already deployed?

Data Security Features in Copilot Pro Similar to M365 Copilot
Hello, I've been exploring the capabilities of Copilot Pro and I'm curious about its data security features. Specifically, I'm interested in understanding whether Copilot Pro offers data security measures similar to those found in M365 Copilot. My concerns revolve around the handling of sensitive information: are the data inputs used to train the AI, and if so, how is the privacy of this data ensured? I've searched through the documentation but haven't found clear information on this matter. Any insights or guidance on where I can find more detailed information about data security within Copilot Pro would be greatly appreciated. Thank you!

Embedding a Microsoft 365 Chat (Copilot) Window on a SharePoint Site
We are looking to embed Microsoft 365 Chat on a SharePoint site as a web part, or to have it appear as a modal/dialog window when the site is launched or a button is clicked. I have seen instructions for doing this with custom Copilot chatbot experiences created in Copilot Studio. Has anyone done this with Microsoft 365 Chat?

Iterating over a list of files and applying similar actions to each, particularly extraction
In my company tenant, we have Microsoft Copilot for Microsoft 365 (E5) mostly deployed. I have had success extracting from individual PDF form-letters with the copilot.microsoft.com interface, with a prompt something like this:

The abbreviations ABC, DEF, NA respectively stand for "All Business Codes", "Define Extras Form", "Not/Applicable". Please take note that the "filing type" is a special classification that can only be "All Business Codes", "Define Extras Form", or "Not/Applicable". Usually, it is found immediately after the phrase "Declaration of filing of". Please remember this when you extract information from PDF files. In a single table row, please list the "Declaration of filing of" date, doc name, applicant name, form number, filing type, header date at the document top, and code number.

And I use the attach feature to specify a specific, single PDF file in OneDrive; this works well and I get the result I expect. However, I would like Copilot to perform that same exact activity across a list of files, but it seems to get confused if I attempt to attach multiple files. I even get "I'm sorry, but I am an AI language model and do not have the capability to read or extract information from PDF files", which is kind of true, I guess, but the act of attaching the files explicitly to the prompt should have redirected it to the index, just as in the single-attachment case, I would think. I have tried a couple of times to slightly modify the prompt to "For each of the attached files, read the file and generate a table row listing the..." or "...In a single table row, please list the [columns part] ... But please combine all those rows into one table." etc. But maybe my whole approach is wrong.
I am sure multiple attachments are good for context focus for a series of questions, or perhaps for combining some general freeform texts, but I specifically want to iterate over a list of files, or a folder full of them, and have repetitive actions taken at each iteration. How can I do that?
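One way to sidestep the multi-attachment confusion is to script the iteration yourself: generate one fully formed prompt per file and submit them one at a time with a single attachment each. Below is a minimal sketch of the prompt-templating half only; the file names are invented and no real Copilot API is assumed (enumerating the OneDrive folder and submitting each prompt would still be manual or handled by whatever automation is available to you):

```python
# Sketch: build one prompt per file so each can be submitted with a
# single attachment. The PROMPT text paraphrases the poster's wording;
# the file names below are placeholders, and nothing here calls Copilot.

PROMPT = (
    'In a single table row, please list the "Declaration of filing of" '
    "date, doc name, applicant name, form number, filing type, header "
    "date at the document top, and code number for the attached file {name}."
)

def build_prompts(file_names, template=PROMPT):
    """Yield one fully-formed prompt per file, preserving input order."""
    for name in file_names:
        yield template.format(name=name)

for prompt in build_prompts(["letter-01.pdf", "letter-02.pdf"]):
    print(prompt)
```

Running one prompt per file keeps each request within the pattern that already works for a single attachment, and the resulting table rows can then be combined outside Copilot.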