Demystifying GitHub Copilot Security Controls: easing concerns for organizational adoption
At a recent developer conference, I delivered a session on Legacy Code Rescue using GitHub Copilot App Modernization. Throughout the day, conversations with developers revealed a clear divide: some have fully embraced agentic AI in their daily coding, while others remain cautious. Often this hesitation isn't reluctance; it stems from organizational concerns around security and regulatory compliance. Having witnessed similar patterns during past technology shifts, I understand how these barriers can slow adoption. In this blog, I'll demystify the most common security concerns about GitHub Copilot and explain how its built-in features address them, empowering organizations to confidently modernize their development workflows.

GitHub Copilot Model Training

A common question I received at the conference was whether GitHub uses your code as training data for GitHub Copilot. I always direct customers to the GitHub Copilot Trust Center for clarity, but the answer is straightforward: "No. GitHub uses neither Copilot Business nor Enterprise data to train the GitHub model." Note that this restriction applies to third-party models as well (e.g., Anthropic, Google).

GitHub Copilot Intellectual Property indemnification policy

A frequent concern I hear is that, since GitHub Copilot's underlying models are trained on sources that include public code, it might simply "copy and paste" code from those sources. Let's clarify how this actually works. Does GitHub Copilot "copy/paste"? Per GitHub: "The AI models that create Copilot's suggestions may be trained on public code, but do not contain any code. When they generate a suggestion, they are not 'copying and pasting' from any codebase."

To provide an additional layer of protection, GitHub Copilot includes a duplicate detection filter. This feature helps prevent suggestions that closely match public code from being surfaced. (Note: duplicate detection does not currently apply to the Copilot coding agent.) More importantly, customers are protected by an Intellectual Property indemnification policy: if you receive an unmodified suggestion from GitHub Copilot and face a copyright claim as a result, Microsoft will defend you in court.

GitHub Copilot Data Retention

Another frequent question concerns GitHub Copilot's data retention policies. For organizations on GitHub Copilot Business and Enterprise plans, retention practices depend on how the service is accessed:

Access through the IDE for Chat and code completions:
- Prompts and suggestions: not retained.
- User engagement data: kept for two years.
- Feedback data: stored for as long as needed for its intended purpose.

Other GitHub Copilot access and use:
- Prompts and suggestions: retained for 28 days.
- User engagement data: kept for two years.
- Feedback data: stored for as long as needed for its intended purpose.

For the Copilot coding agent, session logs are retained for the life of the account in order to provide the service.

Excluding content from GitHub Copilot

To prevent GitHub Copilot from indexing sensitive files, you can configure content exclusions at the repository or organization level. In VS Code, you can additionally use a .copilotignore file to exclude files client-side. Files listed in .gitignore are not indexed by default, but they may still be used as context if they are open in the editor or explicitly referenced, unless they are covered by .copilotignore or content exclusions.
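As a minimal sketch of what these exclusions can look like, the snippets below show a repository-level content exclusion list and a client-side .copilotignore. The paths are hypothetical, and the exact syntax is defined in GitHub's content exclusion documentation, so treat this as an outline rather than a definitive reference.

```yaml
# Repository-level content exclusion (repository settings > Copilot).
# Hypothetical paths; adjust to your own layout.
- "/config/secrets.yaml"   # a specific file
- "/internal/**"           # everything under a folder
- "**/*.pem"               # a pattern matched anywhere in the repository
```

```
# .copilotignore (VS Code, client-side; gitignore-style patterns, hypothetical entries)
.env
credentials/
**/*.key
```

The repository-level list keeps matching files out of Copilot's context for everyone working in that repository, while .copilotignore applies only on the machine where it is configured.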
The life cycle of a GitHub Copilot code suggestion

Here are the key protections at each stage of the life cycle of a GitHub Copilot code suggestion:

- In the IDE: content exclusions prevent files, folders, or patterns from being included.
- GitHub proxy (pre-model safety): prompts go through a GitHub proxy hosted in Microsoft Azure for pre-inference checks, screening for toxic or inappropriate language, relevance, and hacking attempts or jailbreak-style prompts before reaching the model.
- Model response: with the public code filter enabled, suggestions that closely match public code are suppressed. The vulnerability protection feature blocks insecure coding patterns such as hardcoded credentials or SQL injection in real time.

Disable access to GitHub Copilot Free

Because GitHub Copilot Free is governed by different policies, organizations should ensure it is disabled both in the IDE and on GitHub.com. Since not all IDEs currently offer a built-in option to disable Copilot Free, the most reliable way to prevent both accidental and intentional access is to implement firewall rule changes, as outlined in the official documentation.

Agent Mode Allow List

Accidental file-system deletion by agentic AI assistants can happen. With GitHub Copilot agent mode, the "Terminal auto approve" setting in VS Code can be used to prevent this, and the setting can be managed centrally using a VS Code policy. A sketch of the configuration follows.
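Here is a minimal sketch of what this can look like in VS Code's settings.json. The setting identifier for agent-mode terminal approval has changed across VS Code releases, so the key shown here is an assumption; check the current VS Code documentation for the authoritative name and schema.

```jsonc
// settings.json (user, workspace, or centrally managed via VS Code policy).
// Assumed key name; earlier builds used different identifiers.
{
  "chat.tools.terminal.autoApprove": {
    // Safe, read-only commands may run without prompting.
    "git status": true,
    "npm test": true,
    // Destructive commands always require explicit approval.
    "rm": false,
    "rmdir": false,
    // Regular expressions are supported for broader matches.
    "/^Remove-Item\\b/i": false
  }
}
```

The effect is that the agent can run explicitly allowed commands unattended, while anything matching a deny entry, or not listed at all, falls back to manual confirmation.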
MCP registry

Organizations often want to restrict access to allow only trusted MCP servers. GitHub now offers an MCP registry feature for this purpose. It isn't available in all IDEs and clients yet, but support is being developed.

Compliance Certifications

The GitHub Copilot Trust Center lists GitHub Copilot's broad compliance credentials, surpassing many competitors in financial, security, privacy, cloud, and industry coverage:

- SOC 1 Type 2: assurance over internal controls for financial reporting.
- SOC 2 Type 2: in-depth report covering security, availability, processing integrity, confidentiality, and privacy over time.
- SOC 3: general-use version of SOC 2 with broad executive-level assurance.
- ISO/IEC 27001:2013: certification for a formal Information Security Management System (ISMS), based on risk-management controls.
- CSA STAR Level 2: includes a third-party attestation combining ISO 27001 or SOC 2 with additional Cloud Controls Matrix (CCM) requirements.
- TISAX: Trusted Information Security Assessment Exchange, covering automotive-sector security standards.

In summary, while the adoption of AI tools like GitHub Copilot in software development raises important questions around security, privacy, and compliance, the safeguards already in place address these concerns. By understanding the configurable controls and robust compliance certifications on offer, organizations and developers alike can feel confident embracing GitHub Copilot to accelerate innovation while maintaining trust and peace of mind.

SharePoint and OneDrive at Microsoft Ignite 2025: What to Expect

Next week Microsoft Ignite 2025 lands in San Francisco's Moscone Center for the first time, bringing a wave of innovation that's reshaping the industry for AI-driven content management, secure collaboration, and seamless digital experiences.
OpenAI's open-source model: gpt-oss on Azure AI Foundry and Windows AI Foundry

Open-weight models give customer decision makers control and flexibility: no black boxes, fewer trade-offs, and more options across deployment, compliance, and cost. With OpenAI's gpt-oss open-weight models on Azure AI Foundry, you can:

- Fine-tune and distill the models using your own data and deploy with confidence.
- Mix open and proprietary models to match task-specific needs.
- Spin up inference endpoints using gpt-oss in the cloud with just a few CLI commands.

And Foundry Local makes gpt-oss-20b usable on a high-performance Windows PC, enabling use cases in offline settings, building in a secure network, or running at the edge. Check out more here!
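As a rough sketch of the local path, the Foundry Local CLI can pull and serve the model in a couple of commands. The winget package ID and model alias below are assumptions based on Microsoft's announcements; check the Foundry Local documentation for the current names.

```sh
# Install Foundry Local on Windows (assumed winget package ID).
winget install Microsoft.FoundryLocal

# Download the model if needed and start an interactive session.
# The alias is an assumption; list what is available with: foundry model list
foundry model run gpt-oss-20b
```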
Effortless Time Tracking in Teams, Outlook and M365 Copilot

How do you stay in the flow of work when tasks move across Teams, Outlook and now M365 Copilot? Many of us already collaborate and manage our day in these Microsoft 365 tools, but logging time often feels like something separate that interrupts our focus. With https://www.klynke.com/ time tracking stays right where your work happens: it runs inside Teams, Outlook and M365 Copilot, creating one consistent and natural experience for logging hours without leaving your workflow. We shared more in our blog: https://www.klynke.com/post/log-time-in-teams-outlook-copilot, and we're grateful that Microsoft featured our story in a Tech Community interview: Building Secure SaaS on Microsoft Cloud.

A quick look under the hood:
- Microsoft 365 SSO (Entra ID): employees sign in with their existing credentials.
- Tenant-based storage and security: data stays within your Microsoft 365 tenant, under IT control.
- Native experience: the same workflow in Teams, Outlook and M365 Copilot.
- Simple reporting: export to Excel, Power BI or dashboards.

How do you currently manage time tracking in Microsoft 365? Would having it built directly into Teams, Outlook and M365 Copilot make a difference in your day?

CTO at Klynke
M365 Copilot Android Application Not Working

I am using a Samsung S24 FE phone, and the Microsoft Copilot app is not working. When I open the app, I only see a black screen. I cannot open any menus, and I cannot create or search for anything; there is only an empty home screen with dead buttons. I cannot sign out of my current account, nor can I figure out which account I am signed into (it logs in automatically). Here's what I have already tried, but the issue persists:

- Enabled all permissions in the app settings.
- Deleted and reinstalled the app.
- Cleared the app cache and reopened it.
- Cleared the app data and reopened it.
Can't see Copilot icon anymore!

Hey everyone. Today, after updating Microsoft Edge on my Windows 11 machine, I can't see the Copilot icon at the top right of the window anymore (even in Edge Dev and Edge Canary). I also can't access Bing Chat via the sidebar shortcut I created. Why? The previous version of Edge worked great and had no problems; I could work with Copilot fully, even in PDF files, but not anymore. What should I do? 😑😭 Thanks for your support.