<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Analytics on Azure topics</title>
    <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/bd-p/AnalyticsonAzureDiscussion</link>
    <description>Analytics on Azure topics</description>
    <pubDate>Thu, 30 Apr 2026 22:33:34 GMT</pubDate>
    <dc:creator>AnalyticsonAzureDiscussion</dc:creator>
    <dc:date>2026-04-30T22:33:34Z</dc:date>
    <item>
      <title>Hi everyone!</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/hi-everyone/m-p/4502058#M2208</link>
      <description>&lt;P&gt;Hi everyone! 👋&lt;/P&gt;&lt;P&gt;I’m new to this community and currently learning &lt;STRONG&gt;Azure Analytics&lt;/STRONG&gt;. I’m really excited to be here and connect with people who have experience in this field.&lt;/P&gt;&lt;P&gt;I believe the discussions and knowledge shared by members here are very valuable, and I’m looking forward to learning from all of you. If you have any advice, resources, or tips for someone starting with Azure Analytics, I’d really appreciate it.&lt;/P&gt;&lt;P&gt;Happy to be part of this community! 😊&lt;/P&gt;</description>
      <pubDate>Fri, 13 Mar 2026 17:34:04 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/hi-everyone/m-p/4502058#M2208</guid>
      <dc:creator>Thinuka99</dc:creator>
      <dc:date>2026-03-13T17:34:04Z</dc:date>
    </item>
    <item>
      <title>How can I get the Sign-In logs activated?</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/how-can-i-get-the-sign-in-logs-activated/m-p/4487302#M2191</link>
      <description>&lt;P&gt;Hi&lt;BR /&gt;&lt;BR /&gt;How can I get the sign-in logs activated, and can I use them to audit Power BI &amp;amp; Dynamics 365 sign-in attempts?&lt;BR /&gt;&lt;BR /&gt;Thank you in advance.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 20 Jan 2026 06:22:01 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/how-can-i-get-the-sign-in-logs-activated/m-p/4487302#M2191</guid>
      <dc:creator>yayaAmir</dc:creator>
      <dc:date>2026-01-20T06:22:01Z</dc:date>
    </item>
    <item>
      <title>Azure Functions vs. Azure Container Apps Choosing Your Serverless Compute</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-functions-vs-azure-container-apps-choosing-your-serverless/m-p/4459314#M2162</link>
      <description>&lt;P&gt;As organizations continue to embrace cloud-native architectures, the demand for&amp;nbsp;&lt;STRONG&gt;serverless computing&lt;/STRONG&gt;&amp;nbsp;has skyrocketed. Microsoft Azure offers multiple options for deploying applications without worrying about managing infrastructure. Two of the most popular choices are&amp;nbsp;&lt;STRONG&gt;Azure Functions&lt;/STRONG&gt;&amp;nbsp;and&amp;nbsp;&lt;STRONG&gt;Azure Container Apps&lt;/STRONG&gt;.&lt;/P&gt;
&lt;P&gt;While both enable developers to focus on code rather than servers, their use cases, scaling behavior, and operational models differ significantly. Let’s break down the key distinctions to help you choose the right tool for your next project.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A class="lia-external-url" href="https://dellenny.com/azure-functions-vs-azure-container-apps-choosing-your-serverless-compute/" target="_blank"&gt;https://dellenny.com/azure-functions-vs-azure-container-apps-choosing-your-serverless-compute/&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 06 Oct 2025 14:09:02 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-functions-vs-azure-container-apps-choosing-your-serverless/m-p/4459314#M2162</guid>
      <dc:creator>JohnNaguib</dc:creator>
      <dc:date>2025-10-06T14:09:02Z</dc:date>
    </item>
    <item>
      <title>Cost-effective alternatives to control table for processed files in Azure Synapse</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/cost-effective-alternatives-to-control-table-for-processed-files/m-p/4427066#M2151</link>
      <description>&lt;P&gt;Hello, good morning. In Azure Synapse Analytics, I want to maintain a control table for the files that have already been processed by the bronze or silver layers. For this, I wanted to create a dedicated pool, but I see that at the minimum performance level it costs 1.51 USD per hour (as I show in the image), so I would like to know what more economical alternatives I have, since I will need to run inserts and updates against this control table, and that is not possible with a serverless option.&lt;/P&gt;&lt;img /&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 25 Jun 2025 13:48:15 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/cost-effective-alternatives-to-control-table-for-processed-files/m-p/4427066#M2151</guid>
      <dc:creator>JuanMahecha</dc:creator>
      <dc:date>2025-06-25T13:48:15Z</dc:date>
    </item>
    <item>
      <title>Should I ingest AADNonInteractiveUserSignInLogs from Entra ID to a LAW</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/should-i-ingest-aadnoninteractiveusersigninlogs-from-entra-id-to/m-p/4395182#M2142</link>
      <description>&lt;P&gt;As the title says, I am interested in expert opinions on whether I should include the AADNonInteractiveUserSignInLogs from Entra ID in a LAW, as this table dwarfs the SignInLogs in terms of the amount of data (by a factor of 8x) and therefore creates higher costs.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Secondly, I am curious if there are ways to reduce the amount of non-interactive SignInLogs that are generated in the first place.&lt;/P&gt;</description>
      <pubDate>Thu, 20 Mar 2025 07:13:41 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/should-i-ingest-aadnoninteractiveusersigninlogs-from-entra-id-to/m-p/4395182#M2142</guid>
      <dc:creator>CSI</dc:creator>
      <dc:date>2025-03-20T07:13:41Z</dc:date>
    </item>
    <item>
      <title>Specify which Entra ID Sign-in logs are sent to Log Analytics Workspace</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/specify-which-entra-id-sign-in-logs-are-sent-to-log-analytics/m-p/4393391#M2139</link>
      <description>&lt;P&gt;Hi, as the title says, I am curious whether it is possible to limit which sign-in logs are sent to a Log Analytics Workspace. We currently have a couple of service accounts in use that generate a high amount of traffic (an issue being worked on separately), and we would like to exclude the logs from these specific users from being sent to the LAW.&lt;/P&gt;</description>
      <pubDate>Fri, 14 Mar 2025 13:29:15 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/specify-which-entra-id-sign-in-logs-are-sent-to-log-analytics/m-p/4393391#M2139</guid>
      <dc:creator>CSI</dc:creator>
      <dc:date>2025-03-14T13:29:15Z</dc:date>
    </item>
    <item>
      <title>Data archiving of delta table in Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/data-archiving-of-delta-table-in-azure-databricks/m-p/4362429#M2124</link>
      <description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am currently researching data archiving for Delta table data on the Azure platform, as my company has a data retention policy.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have studied the official Databricks documentation (https://docs.databricks.com/en/optimizations/archive-delta.html) on archival support in Databricks. It states:&amp;nbsp;&lt;EM&gt;"If you enable this setting without having lifecycle policies set for your cloud object storage, Databricks still ignores files based on this specified threshold, but no data is archived."&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Therefore, I am wondering how to configure the lifecycle policy on the Azure storage account. I have read the official Microsoft documentation (https://learn.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Say the Delta table data is stored in "test-container/sales" and there are many "part-xxxx.snappy.parquet" data files in that folder. Should I simply specify "tierToArchive", "daysAfterCreationGreaterThan: 1825", and "prefixMatch: ["test-container/sales"]"?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;However, I am worried: will this archive mechanism impact normal Delta table operations?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am also worried about what happens if a Parquet data file moved to the archive tier contains both data created more than five years ago and newer data. Is that possible? Could it end up moving data to the archive tier before it is five years old?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I would highly appreciate it if someone could help me with the questions above. Thanks in advance.&lt;/P&gt;</description>
      <pubDate>Fri, 03 Jan 2025 09:40:40 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/data-archiving-of-delta-table-in-azure-databricks/m-p/4362429#M2124</guid>
      <dc:creator>Brian_169</dc:creator>
      <dc:date>2025-01-03T09:40:40Z</dc:date>
    </item>
    <item>
      <title>Analytic Rules for Log Forwarder</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/analytic-rules-for-log-forwarder/m-p/4356725#M2102</link>
      <description>&lt;P&gt;Good day,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you kindly assist with KQL queries to create these four analytic rules in our environment?&lt;/P&gt;&lt;P&gt;Log Rate-Insufficient&lt;/P&gt;&lt;P&gt;Agent Heartbeat Latency&lt;/P&gt;&lt;P&gt;Agent Heartbeat Monitor&lt;/P&gt;&lt;P&gt;Agent-Health-Alert&lt;/P&gt;</description>
      <pubDate>Thu, 12 Dec 2024 06:43:57 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/analytic-rules-for-log-forwarder/m-p/4356725#M2102</guid>
      <dc:creator>Lumka1122</dc:creator>
      <dc:date>2024-12-12T06:43:57Z</dc:date>
    </item>
    <item>
      <title>azure VM process monitor</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-vm-process-monitor/m-p/4338489#M2086</link>
      <description>&lt;P&gt;Is it possible to ingest data about processes from a Windows VM, such as file activity and the command line used by each process, similar to the data Process Monitor provides?&lt;/P&gt;</description>
      <pubDate>Tue, 26 Nov 2024 13:22:48 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-vm-process-monitor/m-p/4338489#M2086</guid>
      <dc:creator>RajkumarRamasamy</dc:creator>
      <dc:date>2024-11-26T13:22:48Z</dc:date>
    </item>
    <item>
      <title>How are you able to prepare for Azure certification or any other exams with full time job.</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/how-are-you-able-to-prepare-for-azure-certification-or-any-other/m-p/4299513#M543</link>
      <description>&lt;P&gt;I am 29 (M) and have been working in cloud for four years now, mostly on Azure, but I think it is time for me to look for a job at another organization, as my salary has been flat for a long time. I feel that getting certified will open up more opportunities and improve the chances of my resume getting shortlisted. Please share any hacks or tips you have.&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 11:48:28 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/how-are-you-able-to-prepare-for-azure-certification-or-any-other/m-p/4299513#M543</guid>
      <dc:creator>Nithin_khanna</dc:creator>
      <dc:date>2024-11-16T11:48:28Z</dc:date>
    </item>
    <item>
      <title>Azure Log Analytics workspace</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-log-analytics-workspace/m-p/4287437#M501</link>
      <description>&lt;P&gt;Hi All,&lt;BR /&gt;I have a requirement to keep a Log Analytics workspace in standalone mode. I want to remove or break the communication between Azure resources and the Log Analytics workspace and keep it in standalone mode to protect the logs that were collected previously. I want to retain those logs for one year. Is there any way to achieve this requirement? Please advise.&lt;BR /&gt;&lt;BR /&gt;Note: this workspace is integrated with Sentinel as well.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Nov 2024 20:40:17 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-log-analytics-workspace/m-p/4287437#M501</guid>
      <dc:creator>veerakumare22</dc:creator>
      <dc:date>2024-11-06T20:40:17Z</dc:date>
    </item>
    <item>
      <title>Azure Data Factory Question</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-data-factory-question/m-p/4274663#M490</link>
      <description>&lt;P&gt;Hi, I want to extract data from an API, and I'm confused about which activity in ADF I should use. Should I use the Copy activity or the Web activity?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Can anyone please help?&lt;/P&gt;</description>
      <pubDate>Sat, 19 Oct 2024 13:51:42 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-data-factory-question/m-p/4274663#M490</guid>
      <dc:creator>ayaansk99</dc:creator>
      <dc:date>2024-10-19T13:51:42Z</dc:date>
    </item>
    <item>
      <title>Understanding Data Ingestion for Log Analytics and Sentinel Workspace</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/understanding-data-ingestion-for-log-analytics-and-sentinel/m-p/4268636#M489</link>
      <description>&lt;P&gt;I'm trying to understand how data ingestion works for both Log Analytics and Microsoft Sentinel. Every time we notice a spike in data ingestion costs for Log Analytics, we see a similar increase in Sentinel costs as well. It seems like data is being ingested into both workspaces, potentially doubling the ingestion and driving up our costs.&lt;/P&gt;&lt;P&gt;Can someone explain if this is expected behavior, or if there's a way to optimize and avoid duplicate data ingestion between Log Analytics and Sentinel?&lt;/P&gt;</description>
      <pubDate>Fri, 11 Oct 2024 17:28:33 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/understanding-data-ingestion-for-log-analytics-and-sentinel/m-p/4268636#M489</guid>
      <dc:creator>ram512</dc:creator>
      <dc:date>2024-10-11T17:28:33Z</dc:date>
    </item>
    <item>
      <title>Power BI Failing with the following error -</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/power-bi-failing-with-the-following-error/m-p/4257136#M480</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Microsoft SQL: Cannot bulk load because the file "&lt;A href="https://rbssynapselinkstorage.dfs.core.windows.net/000000611.json" target="_blank"&gt;https://rbssynapselinkstorage.dfs.core.windows.net/000000611.json&lt;/A&gt;" could not be opened.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 27 Sep 2024 17:38:49 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/power-bi-failing-with-the-following-error/m-p/4257136#M480</guid>
      <dc:creator>lohitdacha</dc:creator>
      <dc:date>2024-09-27T17:38:49Z</dc:date>
    </item>
    <item>
      <title>Integrate logic apps rules engine with Microsoft fabric</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/integrate-logic-apps-rules-engine-with-microsoft-fabric/m-p/4256707#M477</link>
      <description>&lt;P&gt;Hi, I have a requirement to integrate the Logic Apps rules engine with Fabric data pipelines.&lt;/P&gt;&lt;P&gt;When I run an ETL pipeline, it has to pick up the business logic from the rules engine and transform the data in the Fabric data pipeline. Is this achievable? Any suggestions would be appreciated.&lt;/P&gt;</description>
      <pubDate>Fri, 27 Sep 2024 08:28:16 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/integrate-logic-apps-rules-engine-with-microsoft-fabric/m-p/4256707#M477</guid>
      <dc:creator>kc_analytics</dc:creator>
      <dc:date>2024-09-27T08:28:16Z</dc:date>
    </item>
    <item>
      <title>Charged for Synapse Analytics SQL Storage</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/charged-for-synapse-analytics-sql-storage/m-p/4233777#M457</link>
      <description>&lt;P&gt;Hi, everyone. I paused a dedicated SQL pool in Azure, and the charges for the pool stopped, but I'm still being charged for Synapse Analytics SQL Storage. How can I stop this? Do I need to delete the pool?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;img /&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sun, 01 Sep 2024 01:49:48 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/charged-for-synapse-analytics-sql-storage/m-p/4233777#M457</guid>
      <dc:creator>eduardoazzolin</dc:creator>
      <dc:date>2024-09-01T01:49:48Z</dc:date>
    </item>
    <item>
      <title>Azure Data Factory Multiple Copy Activity problem with pgSql</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-data-factory-multiple-copy-activity-problem-with-pgsql/m-p/4218441#M455</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have been trying to copy multiple tables from a PostgreSQL database outside Azure using the Copy activity and write the data into Azure PostgreSQL.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I have two copy activities for copying data into two tables, but I am getting this error:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;FONT color="#DF0000"&gt;Operation on target Copy data1 failed: 'Type=Npgsql.PostgresException,Message=XX000: Tenant or user not found,Source=Npgsql,&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Can anybody guide me on this?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;SPAN&gt;Your guidance in this matter will be highly appreciated.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 14 Aug 2024 06:31:04 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/azure-data-factory-multiple-copy-activity-problem-with-pgsql/m-p/4218441#M455</guid>
      <dc:creator>L1F19BSSE0033</dc:creator>
      <dc:date>2024-08-14T06:31:04Z</dc:date>
    </item>
    <item>
      <title>Use Azure Screenshot for my thesis</title>
      <link>https://techcommunity.microsoft.com/t5/analytics-on-azure/use-azure-screenshot-for-my-thesis/m-p/4191478#M452</link>
      <description>&lt;P&gt;Dear Microsoft Team,&lt;/P&gt;&lt;P&gt;In my bachelor's thesis, I plan to implement a BI process and would like to use screenshots from the development environments of Microsoft Fabric, Azure Data Factory, Azure Data Lake Storage Gen2, Azure Synapse Analytics, and PowerBI. Therefore, I would like to ask if I am allowed to use these types of screenshots in my thesis free of charge.&lt;/P&gt;&lt;P&gt;Best regards&lt;/P&gt;</description>
      <pubDate>Mon, 15 Jul 2024 09:31:40 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/analytics-on-azure/use-azure-screenshot-for-my-thesis/m-p/4191478#M452</guid>
      <dc:creator>marius1106</dc:creator>
      <dc:date>2024-07-15T09:31:40Z</dc:date>
    </item>
  </channel>
</rss>

