Latest Discussions
Reliable Interactive Resources for the DP-300 Exam
Hello everyone, I hope you're all having a great day! I wanted to reach out and start a discussion about preparing for the DP-300 (Azure Database Administrator) certification exam. I've been researching various resources, but I'm struggling to find reliable and interactive materials that truly help with exam prep. For those who have already passed the DP-300, could you share any interactive and trustworthy resources you used during your study? Whether it's courses, hands-on labs, or practice exams, I'd really appreciate your recommendations. Any advice on how to effectively prepare would be incredibly helpful! Thank you so much for your time reading this discussion and for sharing your experiences!
Davidcastanoe · Oct 18, 2024 · Copper Contributor

The Role of Microsoft Azure in the Solution
Microsoft Azure plays a pivotal role in the new Backup-as-a-Service solution. As one of the leading cloud platforms globally, Azure offers unparalleled infrastructure, security, and scalability. The integration of the Backup-as-a-Service solution with Azure ensures that businesses can leverage the full power of the cloud while maintaining control over their data. Azure's global network of data centers provides a reliable and secure environment for data storage and backup. This geographical distribution also enhances disaster recovery capabilities, allowing businesses to recover data from multiple locations if needed. Additionally, Azure's advanced security features, such as encryption and identity management, further enhance the overall security of the Backup-as-a-Service solution.

Addressing the Challenges of Data Protection

The introduction of this new Backup-as-a-Service solution addresses several key challenges that businesses face in the realm of data protection:
- Complexity of data management: Managing large volumes of data across multiple platforms can be complex and time-consuming. The Backup-as-a-Service solution simplifies this process by centralizing data protection efforts within the Microsoft 365 ecosystem, allowing IT teams to manage backups more efficiently. Not every solution can handle this, but NAKIVO's backup to Azure Storage can manage large data volumes by centralizing protection efforts across platforms.
- Data compliance requirements: With data protection regulations becoming more stringent, businesses must ensure that their data protection strategies comply with local and international laws. The solution offers built-in compliance tools that help organizations meet these requirements, reducing the risk of costly fines and penalties.
- Increasing cybersecurity threats: The rise in cyberattacks, particularly ransomware, has made data protection a top priority for businesses. The solution's advanced security features, including ransomware protection and disaster recovery, provide a robust defense against these threats.
- Scalability needs: As businesses grow, so does their data. The solution's scalable architecture ensures that businesses can continue to protect their data as they expand, without needing to invest in new infrastructure or tools.
jaypans29 · Sep 18, 2024 · Copper Contributor

How to migrate SQL MI to different region
We created our Azure SQL MI in the US West region, which does not support zone redundancy. While we don't require it now, we may in the future. What options do we have now vs. later to enable this? We have not gone live yet, so we have the option to rebuild it entirely in a supported region, but if we decide not to do that, are there other options? I'm thinking we would set up a new Azure SQL MI in a new region and use Availability Groups to replicate the data.
gyvkoff · Aug 19, 2024 · Brass Contributor

Ihor Zahorodnii DataOps for the modern data warehouse
This article describes how a fictional city planning office could use this solution. The solution provides an end-to-end data pipeline that follows the MDW architectural pattern, along with corresponding DevOps and DataOps processes, to assess parking use and make more informed business decisions.

Architecture

The following diagram shows the overall architecture of the solution.

Dataflow

Azure Data Factory (ADF) orchestrates and Azure Data Lake Storage (ADLS) Gen2 stores the data:
- The Contoso city parking web service API is available to transfer data from the parking spots. An ADF copy job transfers the data into the Landing schema.
- Next, Azure Databricks cleanses and standardizes the data. It takes the raw data and conditions it so data scientists can use it. If validation reveals any bad data, it gets dumped into the Malformed schema.

Important: People have asked why the data isn't validated before it's stored in ADLS. The reason is that the validation might introduce a bug that could corrupt the dataset. If you introduce a bug at this step, you can fix the bug and replay your pipeline. If you dumped the bad data before you added it to ADLS, the corrupted data would be useless because you couldn't replay your pipeline.

- There's a second Azure Databricks transform step that converts the data into a format that you can store in the data warehouse.
- Finally, the pipeline serves the data in two different ways: Databricks makes the data available to data scientists so they can train models, and PolyBase moves the data from the data lake to Azure Synapse Analytics, where Power BI accesses the data and presents it to the business user.
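The validate-and-route step above (routing bad rows into the Malformed schema) can be sketched in plain Python. The record fields and validation rules here are assumptions for illustration; in the real solution this logic runs inside Azure Databricks against the Landing schema.

```python
# Hypothetical sketch of the Databricks validation step: raw parking-sensor
# records are split into standardized rows and malformed rows, mirroring the
# "dump bad data to the Malformed schema" rule so the pipeline stays replayable.

def validate_and_route(raw_records):
    """Split raw records into (standardized, malformed) lists."""
    standardized, malformed = [], []
    for rec in raw_records:
        # Assumed minimal validation: a bay id and a known status are required.
        bay = rec.get("bay_id")
        status = (rec.get("status") or "").strip().lower()
        if bay is None or status not in {"occupied", "unoccupied"}:
            malformed.append(rec)  # kept, not dropped, so a fixed pipeline can replay it
        else:
            standardized.append({"bay_id": int(bay), "status": status})
    return standardized, malformed

raw = [
    {"bay_id": "101", "status": "Occupied"},
    {"bay_id": None, "status": "occupied"},  # missing bay id -> Malformed
    {"bay_id": "102", "status": "???"},      # unknown status -> Malformed
]
good, bad = validate_and_route(raw)
```

Note that malformed rows are retained rather than discarded, which is exactly why the article argues for validating after landing the raw data: a buggy rule can be fixed and the pipeline replayed.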
Components

The solution uses these components:
- Azure Data Factory (ADF)
- Azure Databricks
- Azure Data Lake Storage (ADLS) Gen2
- Azure Synapse Analytics
- Azure Key Vault
- Azure DevOps
- Power BI

Scenario details

A modern data warehouse (MDW) lets you easily bring all of your data together at any scale, whether it's structured, unstructured, or semi-structured. You can gain insights into an MDW through analytical dashboards, operational reports, or advanced analytics for all your users.

Setting up an MDW environment for both development (dev) and production (prod) environments is complex. Automating the process is key: it helps increase productivity while minimizing the risk of errors.

Solution requirements
- Ability to collect data from different sources or systems.
- Infrastructure as code: deploy new dev and staging (stg) environments in an automated manner.
- Deploy application changes across different environments in an automated manner: implement continuous integration and continuous delivery (CI/CD) pipelines, and use deployment gates for manual approvals.
- Pipeline as code: ensure the CI/CD pipeline definitions are in source control.
- Carry out integration tests on changes using a sample data set.
- Run pipelines on a scheduled basis.
- Support future agile development, including the addition of data science workloads.
- Support for both row-level and object-level security: this feature is available in SQL Database, Azure Synapse Analytics, Azure Analysis Services (AAS), and Power BI.
- Support for 10 concurrent dashboard users and 20 concurrent power users.
- The data pipeline should carry out data validation and filter out malformed records to a specified store.
- Support monitoring.
- Centralized configuration in secure storage such as Azure Key Vault.

More details here: https://learn.microsoft.com/en-us/azure/architecture/databases/architecture/dataops-mdw
ihorzahorodnii · Aug 05, 2024 · Copper Contributor

Unable to Create Batch Pool in Batch Account with Batch Explorer Tool
Description: I am encountering an issue while creating a Batch pool using Batch Explorer, as described in the Microsoft Learn tutorial: Run a Batch job through Azure Data Factory - Azure Batch | Microsoft Learn. Despite following the standard procedure, I am unable to successfully create the pool. I have detailed the steps and configuration below.

Issue details:
- Service: Batch account
- Tool: Batch Explorer
- Error message:
  "resizeErrors": [
    {
      "code": "OperationsRestrictedByPolicy",
      "message": "Assigned policy on resource has blocked pool operation.",
      "values": [
        { "name": "Code", "value": "OperationsOnNetworkSecurityGroupRestrictedByPolicy" },
        { "name": "Reason", "value": "Policy is forbidding operations on the Network Security Group resource, which has blocked resize/create/delete of the pool. Check policy assignment." }
      ]
    }
  ]

Steps to reproduce (from Tutorial: Run a Batch job through Azure Data Factory - Azure Batch | Microsoft Learn):
- Sign in to Batch Explorer with Azure credentials.
- Select the Batch account.
- Navigate to Pools on the left sidebar, and select the + icon to add a pool.
- Complete the Add a pool to the account form as follows:
  - Under ID, enter custom-activity-pool.
  - Under Dedicated nodes, enter 2.
  - For Select an operating system configuration, select the Data science tab, and then select Dsvm Win 2019.
  - For Choose a virtual machine size, select Standard_F2s_v2.
  - For Start Task, select Add a start task. On the start task screen, under Command line, enter cmd /c "pip install azure-storage-blob pandas", and then select Select.
- Enable network configuration with a virtual network to use private endpoints. (The error occurs when the network configuration with a private virtual network is enabled.)
- Select Save and close.
aniskhan8521 · Jul 03, 2024 · Copper Contributor
Extracting tenantSettings via REST API from Microsoft Fabric to SharePoint directly using PowerShell
Microsoft Fabric is a unified platform that integrates data and services, encompassing data science and data lakes, to enhance your team's data utilization. Discover how to leverage Fabric's features like OneLake, Data Factory, Synapse, Data Activator, Power BI, and Microsoft Purview to usher your data into the AI era.

To extract tenantSettings via the REST API from Microsoft Fabric directly to SharePoint using PowerShell/CloudShell, you can use the following steps:
- Obtain an access token from Microsoft Entra (formerly Azure Active Directory) by following this guide: https://docs.microsoft.com/azure/active-directory/develop/v2-oauth2-client-creds-grant-flow
- Use the access token to call the Fabric REST API endpoint for tenant settings: https://learn.microsoft.com/rest/api/fabric/admin/tenants/get-tenant-settings
- Authenticate to the SharePoint Teams site using a service principal and certificate/thumbprint, and stream the REST API response directly to a SharePoint file.
- Parse the JSON response and extract the relevant information for your report.

Here's a sample PowerShell/CloudShell script that can produce a report on Microsoft Fabric (including Power BI) tenant settings using the REST API. Download the complete code snippet: fabric/admin/tenant_settings_SP.ps1 (github)
nsakthi · Feb 05, 2024 · Microsoft

Microsoft Fabric Tenant Settings v2 - Current & Snapshot Comparison
Preserve the API responses as JSON files either daily or weekly, and use Power BI to analyze these snapshot files. This approach also enables you to subscribe to email notifications or alerts based on certain metrics as needed.
- Loading JSON files: To import your local JSON files, select the JSON option in the Get Data menu and use the JSON connector in Power Query. This prompts a local file browser for you to choose your JSON files.
- Transforming JSON data: Power Query identifies tables within the JSON data and automatically converts them into a tabular format. You can then use the editor to further modify the data as required, or simply close and apply.
- Data analysis: Once all the JSON files have been imported and transformed into tables, Power BI's data modeling and visualization tools can be used to analyze the data. You can create calculated columns, measures, and visuals to compare the data across different JSON files.
- Email subscriptions: Power BI lets you regularly receive email updates for reports and dashboards. You can set the frequency of these emails to daily, weekly, or triggered by data changes, so you stay informed of the most recent insights without manually checking Power BI.
- Alert creation: Power BI also lets you set up data-driven alerts, so you receive notifications when your data crosses predefined limits.

Here are several methods for retrieving Microsoft Fabric tenant settings via the REST API using Python, PowerShell, and cURL:
- Extracting tenant settings from Microsoft Fabric using PowerShell and the REST API
- Extracting tenant settings from Microsoft Fabric using Python and the REST API

You can obtain the Power BI template files from these locations:
- v1 Live/Current Tenant Settings: Microsoft Fabric Tenant Settings.pbit (github)
- v2 Live/Current Tenant Settings + Snapshot Settings: Microsoft Fabric Tenant Settings v2.pbit (github)
- v3 Snapshot Settings: Microsoft Fabric Tenant Settings v3.pbit (github)

Provide the necessary parameters to the template file, save it as a PBIX file, and then publish it to the Power BI service. Provide the Tenant ID, Client ID, and Client Secret for the service principal that can access the tenant settings. Also, specify the folder path where the exported snapshot files are stored, so they can be read and compared. Please refer to the following documents to understand how to set up a service principal: Power BI REST APIs and Enable service principal authentication for read-only admin APIs.

Later, these reports can be shared with others, and alerts can be created or subscribed to as required. The article below is version 1 (part 1), which you can use as a reference: Microsoft Fabric Tenant Settings - Reports - Microsoft Community Hub
nsakthi · Jan 22, 2024 · Microsoft

Microsoft Fabric Tenant Settings - Reports
Microsoft Fabric is a unified platform that combines data and services, encompassing data science, data engineering, data lakehouses, data warehouses, and visualizations, to enhance your team's data utilization. Discover how to leverage Fabric's features like OneLake, Data Factory, Synapse, Data Activator, Power BI, and Microsoft Purview to usher your data into the AI era.

Tenant settings provide a detailed level of control over the features accessible to your organization. If you're worried about sensitive data, some features may not be suitable for your organization, or you may want to limit certain features to specific groups. While tenant settings that govern the availability of features in the Power BI user interface can aid in setting up governance policies, they do not serve as a security measure. For instance, the 'Export data' setting does not limit a Power BI user's permissions on a semantic model. Power BI users with read access to a semantic model have the right to query this model and may be able to save the results without using the 'Export data' feature in the Power BI user interface.

By extracting and externally visualizing tenant settings in Power BI reports, stakeholders can view, archive, and compare these settings with historical data. This approach negates the need for higher privileges and access to the Microsoft Fabric (previously Power BI) admin portal. Learn more about the tenant settings: About tenant settings - Microsoft Fabric and Tenant settings index - Microsoft Fabric

To generate a tenant settings report using the REST API, you can use the following steps:
- Obtain an access token from Microsoft Entra (formerly Azure Active Directory) by following this guide: OAuth 2.0 client credentials flow on the Microsoft identity platform
- Use the access token to call the Fabric REST API endpoint for tenant settings: Tenants - Get Tenant Settings - REST API
- Parse the JSON response and extract the relevant information for your report.
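The three steps above can be sketched in Python using only the standard library. Treat this as an unverified sketch rather than the author's script: the endpoint URL, scope, and response field names are assumptions based on the linked references, and paging and error handling are omitted. Credentials come from environment variables.

```python
import json
import os
import urllib.parse
import urllib.request

# Assumed endpoints, following the linked Entra and Fabric REST references.
TOKEN_URL = "https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token"
SETTINGS_URL = "https://api.fabric.microsoft.com/v1/admin/tenantsettings"

def get_token(tenant_id, client_id, client_secret):
    """Step 1: client-credentials token from Microsoft Entra ID."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://api.fabric.microsoft.com/.default",
    }).encode()
    req = urllib.request.Request(TOKEN_URL.format(tenant=tenant_id), data=body)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def fetch_tenant_settings(token):
    """Step 2: call the admin tenant-settings endpoint with the bearer token."""
    req = urllib.request.Request(
        SETTINGS_URL, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def flatten_settings(payload):
    """Step 3: pull out the fields a report needs from the JSON response.
    The 'tenantSettings' key and per-setting field names are assumptions."""
    return [
        {"name": s.get("settingName"), "enabled": s.get("enabled"), "title": s.get("title")}
        for s in payload.get("tenantSettings", [])
    ]

if __name__ == "__main__":
    tok = get_token(
        os.environ["TENANT_ID"], os.environ["CLIENT_ID"], os.environ["CLIENT_SECRET"]
    )
    for row in flatten_settings(fetch_tenant_settings(tok)):
        print(row)
```

The flattened rows map naturally onto the table that the Power BI template visualizes; the same output can also be dumped to a JSON file to build the snapshot history described in the v2 article.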
You can obtain the Power BI template file from this location: Microsoft Fabric Tenant Settings.pbit (github)

Enter the Tenant ID, Client ID, and Client Secret associated with the service principal that has the necessary permissions to retrieve the tenant settings. Please refer to the following documents to understand how to set up a service principal: Power BI REST APIs and Enable service principal authentication for read-only admin APIs.

Save the file as a PBIX and upload it to the relevant workspace for sharing with stakeholders or co-administrators. You can then:
- Capture snapshots of the settings for audit purposes and historical comparison
- Set up alerts for any new additions or modifications
- Set up automatic email exports for record-keeping

Here are several methods for retrieving Microsoft Fabric tenant settings via the REST API using M query, Python, and PowerShell:
- Extracting tenant settings from Microsoft Fabric using PowerShell and the REST API
- Extracting tenant settings from Microsoft Fabric using Python and the REST API
nsakthi · Jan 04, 2024 · Microsoft

Microsoft Fabric (including Power BI) Workspace Deployment Pipeline with Customized Stages
Deployment Pipelines - Customized Stages

Previously: Development => Testing => Production (only 3 stages)
Now: POC => Dev => SIT => UAT => Preprod => Prod (up to 10 stages)

The pipeline is initially set up with three default stages: Development, Test, and Production. You can either keep these default stages or modify them according to your preferences. You can customize the number of stages and their names within a range of 2-10 stages. To add another stage, click the "+ Add" button. If needed, you can delete stages or rename them by entering a new name in the designated box. Once you have made the desired changes, click "Create" to finalize the configuration.

- Create a pipeline
- Name the pipeline
- Customize the stages
- Assign the workspaces

Some sample customized stages 🙂
- General
- Practical Data Science
- Data Platform
- Data Application
- Quality Assurance and Quality Control

Reference: Get started using deployment pipelines, the Fabric Application lifecycle management (ALM) tool - Microsoft Fabric | Microsoft Learn
nsakthi · Jan 04, 2024 · Microsoft
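The 2-10 stage limit described in the post can be captured as a small helper when scripting pipeline setup. The function name and the rules beyond the stage count (non-empty, unique names) are assumptions for illustration, not part of the Fabric product documentation.

```python
def validate_custom_stages(stages):
    """Check a proposed list of deployment-pipeline stage names against the
    documented limit of 2-10 stages; uniqueness and non-emptiness are assumed
    extra rules for this sketch."""
    if not (2 <= len(stages) <= 10):
        raise ValueError("A deployment pipeline must have between 2 and 10 stages.")
    names = [s.strip() for s in stages]
    if any(not n for n in names):
        raise ValueError("Every stage needs a non-empty name.")
    if len(set(names)) != len(names):
        raise ValueError("Stage names must be unique.")
    return names

# Example: the six-stage layout from the post.
stages = validate_custom_stages(["POC", "Dev", "SIT", "UAT", "Preprod", "Prod"])
```

A single-stage list such as ["Prod"] would raise a ValueError, matching the UI's refusal to create a pipeline with fewer than two stages.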
Tags
- Azure Active Directory (1 topic)
- Azure AD Connect (1 topic)
- Connection Problem (1 topic)
- azure sql (1 topic)
- Spark (1 topic)
- azure db monitoring (1 topic)
- Azure Synapse Analytics (1 topic)
- Azure Data Studio (1 topic)
- Uptime (1 topic)
- Uptime monitoring (1 topic)