Endpoint Management
GIA - Get Intune Assignments Application
Hello Everyone, Some time ago I was struggling to get all Intune assignments for a specific Azure AD group. This option does not exist in the console, and you need to run a lot of queries against MS Graph and/or use PowerShell to retrieve them. So, to help the community, I started creating PowerShell scripts to query some of the assignments, but I still had a lot of scripts, each one retrieving a specific type of item (profiles, conditional access, apps, etc.). After a while I decided to develop a C# .NET application to simplify the process. Today I want to share my GIA app (Get Intune Assignments) with all of you. It's available on my GitHub page: https://github.com/sibranda/GetIntuneAssignments I hope this app helps you the same way it is helping me and my customers. Regards
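For readers who want to script a similar lookup themselves, here is a minimal Python sketch of the kind of Graph query GIA wraps. This is my own illustration, not code from the GIA repository: token acquisition is omitted, and whether $expand=assignments is honored can vary by Graph version, so treat the per-app /deviceAppManagement/mobileApps/{id}/assignments call as the documented fallback.

```python
# Sketch: list Intune mobile app assignments via Microsoft Graph.
# Assumes an Azure AD app with DeviceManagementApps.Read.All and a valid token.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
token = "<access-token>"  # hypothetical; acquire via MSAL in real use

resp = requests.get(
    f"{GRAPH}/deviceAppManagement/mobileApps?$expand=assignments",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

for app in resp.json().get("value", []):
    for assignment in app.get("assignments", []):
        target = assignment.get("target", {})
        # Group assignments carry a groupId in their target object.
        print(app.get("displayName"), target.get("groupId"))
```

Filtering the printed groupId values against a specific Azure AD group id reproduces the per-group view the post describes.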
Windows office hours: June 17, 2021

Post your questions for our next office hours session, which will take place here in the Windows servicing community on Thursday, June 17th, from 9:00-10:00 a.m. Pacific Time. Join us to get answers to any questions you may have around managing updates for the remote and onsite devices in your organization, help with specific issues, and tips on how to increase update velocity. We'll have members of the Windows and Microsoft Endpoint Manager product and engineering teams on hand, as well as the FastTrack team. Save the date and see the Windows IT Pro Blog for full details. Let's get started!
Securing Windows devices away from the corporate network

During the current public health situation, ensuring that devices can still be effectively managed and secured in what can be called "the new normal" is of utmost priority. As a result, I wanted to share with you the first chapter in a new web series where we will discuss what you, as an IT professional, can do immediately, in the next few weeks, and over the next few months to properly maintain the security of your organization's devices while users are working away from your corporate networks. We will look at sample timelines for accelerated approaches, including ways to optimize the impact of virtual private networks (VPNs) and minimize overall workflow disruption.

Learn more
Here are links to the resources mentioned in this session. We've also included a list of frequently asked questions below.

OSHA COVID-19 guidance
Configure and Deploy Security Baselines
Setup/Configure Azure AD Connect
Set up a Cloud Management Gateway
Enable OneDrive for Business
Switch to Split-Tunnel VPN Policies
Enable ConfigMgr Co-Management
Shift update and servicing workloads to the cloud (Windows Update for Business, Office 365 CDN)
Begin OneDrive for Business Known Folder Migration
Configure and Enable Azure AD Conditional Access
Set up Azure App Proxy
Replace Perimeter trust with Zero Trust
Enhance MFA by issuing FIDO2 Keys
Consider Further Advanced Cloud Security Solutions
Leverage the power of Analytics: User Experience & Productivity Score
Shift line of business (LOB) application workloads
Configure and Deploy Security Baselines
Begin piloting and shifting Policy, Compliance, and EP to the cloud
Enable asset protection through Office ATP and MCAS
Managing remote machines with cloud management gateway in Microsoft Endpoint Configuration Manager
Azure Multi-Factor Authentication
Conditional Access
Data Leak Prevention
Intune Migration Guide
Zero Trust strategy—what good looks like
How to implement Multi-Factor Authentication (MFA)
Microsoft Cloud Security solutions provide comprehensive cross-cloud protection
Blog: Brad Anderson
Blog: Jared Spataro

While not mentioned specifically in this session, here are some additional resources you might find helpful:

Microsoft COVID-19 response site
Enabling Remote Work
Microsoft Endpoint Manager remote work blog
Work remotely, stay secure
2 weeks in: what we've learned about remote work

Frequently asked questions

Q: How are others offloading patching traffic to Microsoft sources for full-VPN clients, like split tunneling (since Windows Update IPs aren't clearly published)?
A: We are seeing customers move all Internet traffic away from VPN, and that's what we do internally as well. There are a couple of resources on this for WSUS (see 2.1.1) and Windows Update.

Q: Are there instructions to shift Office updates from Configuration Manager to the cloud?
A: Yes. Here's guidance on how to Manage Office 365 ProPlus with Configuration Manager.

Q: Regarding disabling password expirations, do you have any formal documentation that can be provided to our security team?
A: Here are some resources that are available on the topic:
https://www.microsoft.com/security/blog/2019/07/11/preparing-your-enterprise-to-eliminate-passwords/
https://techcommunity.microsoft.com/t5/azure-active-directory-identity/your-pa-word-doesn-t-matter/ba-p/731984
https://www.microsoft.com/en-us/security/business/identity/passwordless

Q: Do you have any formal statements endorsing split-tunnel VPN?
A: Here is a statement from https://www.microsoft.com/en-us/itshowcase/enhancing-remote-access-in-windows-10-with-an-automatic-vpn-profile:
"Split tunneling allows only the traffic destined for the Microsoft corporate network to be routed through the VPN tunnel, and all Internet traffic goes directly through the Internet without traversing the VPN tunnel. In the VPN connection profile, split tunneling is enabled by default."

Q: How can we evaluate the potential cost of the cloud management gateway (CMG)?
A: Refer to the Configuration Manager documentation here: https://docs.microsoft.com/en-us/configmgr/core/clients/manage/cmg/plan-cloud-management-gateway#cost

Q: For split tunneling all Internet traffic out, how do you perform URL filtering for compliance?
A: We use Microsoft Threat Protection across Office ATP and Microsoft Defender ATP, specifically the Endpoint Detection and Response (EDR) component.

Feedback
We hope you find this first session useful. We'd love your feedback and ideas for future sessions, so please fill out this short survey. Thank you!
Office hours are closed: December 17, 2020

Office hours are now closed. We hope we were able to answer your questions and provide tips and resources to help you more easily manage Windows 10 updates and your Windows device estate. The experts and engineers who supported today's session were:

Windows as a service strategies, tactics, best practices: Dave Backman and James Bell
Windows 10 deployment: Steve Thomas
Cloud-based update management, Windows Update for Business: Aria Carley
Microsoft Endpoint Manager: Joe Lurie
Microsoft Endpoint Manager (public sector, CMG, etc.): Danny Guillory
Configuration Manager: Rob York, Bruno Yoshioka
Product feedback: Kevin Mineweaser
FastTrack: Sean McLaren

Save the date for future events
We'll be back in 2021 every third Thursday. Save the date for our next office hours event -- Thursday, January 21st, 9:00-10:00 a.m. Pacific Time -- and see the Windows IT Pro Blog for an up-to-date list of future events. See you next time!
Building a Custom Continuous Export Pipeline for Azure Application Insights

1. Introduction
MonitorLiftApp is a modular Python application designed to export telemetry data from Azure Application Insights to custom sinks such as Azure Data Lake Storage Gen2 (ADLS). This solution is ideal when Event Hub integration is not feasible, providing a flexible, maintainable, and scalable alternative for organizations needing to move telemetry data for analytics, compliance, or integration scenarios.

2. Problem Statement
Exporting telemetry from Azure Application Insights is commonly achieved via Event Hub streaming. However, there are scenarios where Event Hub integration is not possible due to cost, complexity, or architectural constraints. In such cases, organizations need a reliable, incremental, and customizable way to extract telemetry data and move it to a destination of their choice, such as ADLS, SQL, or a REST API.

3. Investigation
Why not Event Hub?
Event Hub integration may not be available in all environments.
Some organizations have security or cost concerns.
Custom sinks (like ADLS) may be required for downstream analytics or compliance.

Alternative approach
Use the Azure Monitor Query SDK to periodically export data.
Build a modular, pluggable Python app for maintainability and extensibility.

4. Solution
4.1 Architecture Overview
MonitorLiftApp is structured into four main components:
Configuration: Centralizes all settings and credentials.
Main Application: Orchestrates the export process.
Query Execution: Runs KQL queries and serializes results.
Sink Abstraction: Allows easy swapping of data targets (e.g., ADLS, SQL).

4.2 Configuration (app_config.py)
All configuration is centralized in app_config.py, making it easy to adapt the app to different environments.

```python
CONFIG = {
    "APPINSIGHTS_APP_ID": "<your-app-id>",
    "APPINSIGHTS_WORKSPACE_ID": "<your-workspace-id>",
    "STORAGE_ACCOUNT_URL": "<your-adls-url>",
    "CONTAINER_NAME": "<your-container>",
    "Dependencies_KQL": "dependencies | limit 10000",
    "Exceptions_KQL": "exceptions | limit 10000",
    "Pages_KQL": "pageViews | limit 10000",
    "Requests_KQL": "requests | limit 10000",
    "Traces_KQL": "traces | limit 10000",
    "START_STOP_MINUTES": 5,
    "TIMER_MINUTES": 5,
    "CLIENT_ID": "<your-client-id>",
    "CLIENT_SECRET": "<your-client-secret>",
    "TENANT_ID": "<your-tenant-id>"
}
```

Explanation: This configuration file contains all the necessary parameters for connecting to Azure resources, defining the KQL queries, and scheduling the export job. Note that each KQL template needs the pipe before limit, since the query runner appends it to an app('<app-id>'). prefix to form a valid cross-resource query. By centralizing these settings, the app becomes easy to maintain and adapt.
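One hedged refinement to this configuration approach (my own suggestion, not part of the original app): secrets such as CLIENT_SECRET are better supplied via environment variables than hardcoded in source. The MONITORLIFT_* variable names below are hypothetical.

```python
# Sketch: override sensitive CONFIG entries from environment variables.
# The MONITORLIFT_* variable names are assumptions, not part of MonitorLiftApp.
import os

from app_config import CONFIG

for key in ("CLIENT_ID", "CLIENT_SECRET", "TENANT_ID"):
    # Fall back to the hardcoded value when no environment override exists.
    CONFIG[key] = os.environ.get(f"MONITORLIFT_{key}", CONFIG[key])
```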
4.3 Main Application (main.py)
The main application is the entry point and can be run as a Python console app. It loads the configuration, sets up credentials, and runs the export job on a schedule.

```python
from app_config import CONFIG
from azure.identity import ClientSecretCredential
from monitorlift.query_runner import run_all_queries
from monitorlift.target_repository import ADLSTargetRepository


def build_env():
    env = {}
    keys = [
        "APPINSIGHTS_WORKSPACE_ID", "APPINSIGHTS_APP_ID",
        "STORAGE_ACCOUNT_URL", "CONTAINER_NAME"
    ]
    for k in keys:
        env[k] = CONFIG[k]
    for k, v in CONFIG.items():
        if k.endswith("KQL"):
            env[k] = v
    return env


class MonitorLiftApp:
    def __init__(self, client_id, client_secret, tenant_id):
        self.env = build_env()
        self.credential = ClientSecretCredential(tenant_id, client_id, client_secret)
        self.target_repo = ADLSTargetRepository(
            account_url=self.env["STORAGE_ACCOUNT_URL"],
            container_name=self.env["CONTAINER_NAME"],
            cred=self.credential
        )

    def run(self):
        run_all_queries(self.target_repo, self.credential, self.env)


if __name__ == "__main__":
    import time

    client_id = CONFIG["CLIENT_ID"]
    client_secret = CONFIG["CLIENT_SECRET"]
    tenant_id = CONFIG["TENANT_ID"]
    app = MonitorLiftApp(client_id, client_secret, tenant_id)
    timer_interval = app.env.get("TIMER_MINUTES", 5)
    print(f"Starting continuous export job. Interval: {timer_interval} minutes.")
    while True:
        print("\n[INFO] Running export job at", time.strftime('%Y-%m-%d %H:%M:%S'))
        try:
            app.run()
            print("[INFO] Export complete.")
        except Exception as e:
            print(f"[ERROR] Export failed: {e}")
        print(f"[INFO] Sleeping for {timer_interval} minutes...")
        time.sleep(timer_interval * 60)
```

Explanation: The app can be run from any machine with Python and the required libraries installed, locally or in the cloud (VM, container, etc.). No compilation is needed; just run it as a Python script. Optionally, you can package it as an executable using tools like PyInstaller. The main loop schedules the export job at regular intervals.

4.4 Query Execution (query_runner.py)
This module orchestrates the KQL queries, runs them in parallel, and serializes the results.

```python
import datetime
import json
from concurrent.futures import ThreadPoolExecutor, as_completed

from azure.monitor.query import LogsQueryClient


def run_query_for_kql_var(kql_var, target_repo, credential, env):
    query_name = kql_var[:-4]  # "Requests_KQL" -> "Requests"
    print(f"[START] run_query_for_kql_var: {kql_var}")
    query_template = env[kql_var]
    app_id = env["APPINSIGHTS_APP_ID"]
    workspace_id = env["APPINSIGHTS_WORKSPACE_ID"]
    try:
        latest_ts = target_repo.get_latest_timestamp(query_name)
        print(f"Latest timestamp for {query_name}: {latest_ts}")
    except Exception as e:
        print(f"Error getting latest timestamp for {query_name}: {e}")
        return
    start = latest_ts
    time_window = env.get("START_STOP_MINUTES", 5)
    end = start + datetime.timedelta(minutes=time_window)
    query = f"app('{app_id}')." + query_template
    logs_client = LogsQueryClient(credential)
    try:
        response = logs_client.query_workspace(workspace_id, query, timespan=(start, end))
        if response.tables and len(response.tables[0].rows) > 0:
            print(f"Query for {query_name} returned {len(response.tables[0].rows)} rows.")
            table = response.tables[0]
            rows = [
                [v.isoformat() if isinstance(v, datetime.datetime) else v for v in row]
                for row in table.rows
            ]
            result_json = json.dumps({"columns": table.columns, "rows": rows})
            target_repo.save_results(query_name, result_json, start, end)
            print(f"Saved results for {query_name}")
    except Exception as e:
        print(f"Error running query or saving results for {query_name}: {e}")


def run_all_queries(target_repo, credential, env):
    print("[INFO] run_all_queries triggered.")
    kql_vars = [k for k in env if k.endswith('KQL') and not k.startswith('APPSETTING_')]
    print(f"Number of KQL queries to run: {len(kql_vars)}. KQL vars: {kql_vars}")
    with ThreadPoolExecutor(max_workers=len(kql_vars)) as executor:
        futures = {
            executor.submit(run_query_for_kql_var, kql_var, target_repo, credential, env): kql_var
            for kql_var in kql_vars
        }
        for future in as_completed(futures):
            kql_var = futures[future]
            try:
                future.result()
            except Exception as exc:
                print(f"[ERROR] Exception in query {kql_var}: {exc}")
```

Explanation: Queries are executed in parallel for efficiency. Results are serialized and saved to the configured sink. Incremental export is achieved by tracking the latest timestamp for each query.
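To sanity-check credentials and a single query outside the scheduler, here is a minimal standalone sketch using the same SDK calls as query_runner.py. This is my own addition, assuming the CONFIG values from section 4.2.

```python
# Sketch: validate one KQL query in isolation before running the full app.
# Assumes the CONFIG dictionary from app_config.py (section 4.2).
import datetime

from azure.identity import ClientSecretCredential
from azure.monitor.query import LogsQueryClient

from app_config import CONFIG

credential = ClientSecretCredential(
    CONFIG["TENANT_ID"], CONFIG["CLIENT_ID"], CONFIG["CLIENT_SECRET"]
)
client = LogsQueryClient(credential)

# Same cross-resource query shape that query_runner.py builds.
query = f"app('{CONFIG['APPINSIGHTS_APP_ID']}')." + CONFIG["Requests_KQL"]
response = client.query_workspace(
    CONFIG["APPINSIGHTS_WORKSPACE_ID"],
    query,
    timespan=datetime.timedelta(minutes=CONFIG["START_STOP_MINUTES"]),
)
for table in response.tables:
    print(f"{len(table.rows)} rows returned")
```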
4.5 Sink Abstraction (target_repository.py)
This module abstracts the sink implementation, allowing you to swap out ADLS for SQL, a REST API, or other targets.

```python
import datetime
from abc import ABC, abstractmethod

from azure.storage.blob import BlobServiceClient


class TargetRepository(ABC):
    @abstractmethod
    def get_latest_timestamp(self, query_name):
        pass

    @abstractmethod
    def save_results(self, query_name, data, start, end):
        pass


class ADLSTargetRepository(TargetRepository):
    def __init__(self, account_url, container_name, cred):
        self.account_url = account_url
        self.container_name = container_name
        self.credential = cred
        self.blob_service_client = BlobServiceClient(account_url=account_url, credential=cred)

    def get_latest_timestamp(self, query_name, fallback_hours=3):
        blob_client = self.blob_service_client.get_blob_client(
            self.container_name, f"{query_name}/latest_timestamp.txt"
        )
        try:
            timestamp_str = blob_client.download_blob().readall().decode()
            return datetime.datetime.fromisoformat(timestamp_str)
        except Exception as e:
            if hasattr(e, 'error_code') and e.error_code == 'BlobNotFound':
                print(f"[INFO] No timestamp blob for {query_name}, starting from {fallback_hours} hours ago.")
            else:
                print(f"[WARNING] Could not get latest timestamp for {query_name}: {type(e).__name__}: {e}")
            return datetime.datetime.utcnow() - datetime.timedelta(hours=fallback_hours)

    def save_results(self, query_name, data, start, end):
        filename = f"{query_name}/{start:%Y%m%d%H%M}_{end:%Y%m%d%H%M}.json"
        blob_client = self.blob_service_client.get_blob_client(self.container_name, filename)
        try:
            blob_client.upload_blob(data, overwrite=True)
            print(f"[SUCCESS] Saved results to blob for {query_name} from {start} to {end}")
        except Exception as e:
            print(f"[ERROR] Failed to save results to blob for {query_name}: {type(e).__name__}: {e}")
        ts_blob_client = self.blob_service_client.get_blob_client(
            self.container_name, f"{query_name}/latest_timestamp.txt"
        )
        try:
            ts_blob_client.upload_blob(end.isoformat(), overwrite=True)
        except Exception as e:
            print(f"[ERROR] Failed to update latest timestamp for {query_name}: {type(e).__name__}: {e}")
```

Explanation: The sink abstraction allows you to easily switch between different storage backends. The ADLS implementation saves both the results and the latest timestamp for incremental exports.
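To illustrate how pluggable the sink really is, here is a hedged sketch of a local-filesystem variant. LocalFileTargetRepository is my own hypothetical example, not part of MonitorLiftApp; it mirrors the ADLS implementation's contract, so an instance of it could be passed to run_all_queries in place of ADLSTargetRepository.

```python
# Hypothetical sink: store results and the timestamp marker on local disk.
# A sketch only; LocalFileTargetRepository is not part of MonitorLiftApp.
import datetime
import pathlib

from monitorlift.target_repository import TargetRepository


class LocalFileTargetRepository(TargetRepository):
    def __init__(self, root_dir):
        self.root = pathlib.Path(root_dir)

    def get_latest_timestamp(self, query_name, fallback_hours=3):
        ts_file = self.root / query_name / "latest_timestamp.txt"
        try:
            return datetime.datetime.fromisoformat(ts_file.read_text().strip())
        except FileNotFoundError:
            # No marker yet: fall back a few hours, like the ADLS sink does.
            return datetime.datetime.utcnow() - datetime.timedelta(hours=fallback_hours)

    def save_results(self, query_name, data, start, end):
        out_dir = self.root / query_name
        out_dir.mkdir(parents=True, exist_ok=True)
        (out_dir / f"{start:%Y%m%d%H%M}_{end:%Y%m%d%H%M}.json").write_text(data)
        (out_dir / "latest_timestamp.txt").write_text(end.isoformat())
```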
5. End-to-End Setup Guide

Prerequisites
Python 3.8+
Azure SDKs: azure-identity, azure-monitor-query, azure-storage-blob
Access to Azure Application Insights and ADLS
Service principal credentials with appropriate permissions

Steps
1. Clone or download the repository: place all the code files in a working directory.
2. Configure app_config.py: fill in your Azure resource IDs, credentials, and KQL queries.
3. Install the dependencies: pip install azure-identity azure-monitor-query azure-storage-blob
4. Run the application locally (python monitorliftapp/main.py), or deploy it to a VM, an Azure Container Instance, or as a scheduled job.
5. (Optional) Package as an executable using PyInstaller or similar tools if you want a standalone binary.
6. Verify data movement: check your ADLS container for exported JSON files (a quick verification sketch follows at the end of this article).

6. Screenshots
App Insights logs (screenshot)
Exported data in ADLS (screenshot)

7. Final Thoughts
MonitorLiftApp provides a robust, modular, and extensible solution for exporting telemetry from Azure Monitor to custom sinks. Its design supports parallel query execution, pluggable storage backends, and incremental data movement, making it suitable for a wide range of enterprise scenarios.
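As a closing usage note for step 6 of the setup guide, here is a small sketch for confirming that exports are landing in the container. It is my own addition, reusing the CONFIG values from section 4.2 with standard azure-storage-blob calls; the example blob name in the comment is illustrative.

```python
# Sketch: list exported blobs to confirm the pipeline is writing data.
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

from app_config import CONFIG

credential = ClientSecretCredential(
    CONFIG["TENANT_ID"], CONFIG["CLIENT_ID"], CONFIG["CLIENT_SECRET"]
)
service = BlobServiceClient(account_url=CONFIG["STORAGE_ACCOUNT_URL"], credential=credential)
container = service.get_container_client(CONFIG["CONTAINER_NAME"])

for blob in container.list_blobs():
    print(blob.name)  # e.g. Requests/202401011200_202401011205.json
```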
Compliance licenses at tenant level

Hi, We are a small organization of about 200 employees, and we have the following requirements:

DLP policy configuration for Exchange, OneDrive, and SharePoint
BYOD security
Users should not be able to send files outside the org
And so on, as we evaluate

We already have M365 Business Premium. However, after researching, we figured out that M365 Business Premium alone will not meet our requirements; maybe a compliance license will. We want to apply security policies at the tenant level in our organization, but we definitely do not want every user to get a license, as this would be expensive for us and there is no requirement at all for our users. The question is: is there a way to solve the above scenario?
Microsoft 365 networking - Proxy Endpoints

[New Blog Post] In my latest article, I have summarized the endpoints for #Microsoft365. These endpoints are relevant for proxy settings and for routing with direct breakout. #M365 #EXO #SPO #Azure #MSIntune #MVPbuzz https://www.msb365.blog/?p=5549
Defender for macOS onboarding issue

I am trying to onboard macOS devices in my organization with Microsoft Defender via Intune and am facing multiple issues: the configuration profiles are applied successfully on only a few devices, only the first (manually installed) macOS device is properly onboarded in Defender, and all of the others complain about a missing license. Could someone answer a few questions and maybe give some tips on how I can troubleshoot and resolve this?

1. We have a Microsoft 365 Business Premium license and, according to the Defender documentation, this is sufficient to use Defender on any endpoint device. However, the error message on the macOS devices states that a Microsoft Enterprise license is missing. Is a special license needed, or is this just a payload configuration profile issue?
2. The kernel extension and onboarding profiles are generated in the Microsoft Defender Admin Center; however, I noticed that the OrgID in the onboarding profile file does not match my TenantID. Does that mean those files are premade and I should adjust them to my organization's details, or is it simply a different ID being assigned?
3. The onboarding profile is successfully applied on all devices, but the kernel extension profile fails on almost every device, and the successful applications do not follow any pattern of macOS version. I can't really find any suggestions on the possible root cause of this issue. Has anyone had similar problems with the kext profile?
4. The Microsoft Defender Admin Center provides an installation package (PKG file). However, according to the Defender documentation, I should use the Microsoft Defender for Endpoint (macOS) application that is ready to be deployed directly from the Intune Management Portal. Which is it? Or maybe both?

Thank you in advance for any tips and/or answers 🙂