Latest Discussions
Clarifying false assertions by Oracle sales about Oracle licensing on Azure constrained VMs
Recently, I received the following question from a customer: how much of a challenge would it be to defend against Oracle's claim that, for a constrained Standard_E96-24ds_v5 VM, we owe them licensing for 96 vCPUs instead of 24 vCPUs? I've been receiving questions of this sort more frequently these days, so I wanted to share advice on dealing with them.

Oracle's own documentation on public cloud licensing (HERE) states: "For the purposes of licensing Oracle programs in an Authorized Cloud Environment, customers are required to count the maximum available vCPUs of an instance type as follows: Microsoft Azure – count two vCPUs as equivalent to one Oracle Processor license if multi-threading of processor cores is enabled, and one vCPU as equivalent to one Oracle Processor license if multi-threading of processor cores is not enabled." Please note the word available, which means "able to be used or obtained; at someone's disposal" according to the Oxford dictionary.

Azure constrained VMs are explained HERE, including the following description: "Azure offers certain VM sizes where you can constrain the VM vCPU count to reduce the cost of software licensing, while maintaining the same memory, storage, and I/O bandwidth. The vCPU count can be constrained to one half or one quarter of the original VM size. These new VM sizes have a suffix that specifies the number of active vCPUs to make them easier for you to identify."

So constrained VMs in Azure offer the memory, storage limits, and I/O bandwidth associated with the larger vCPU count in the name, but the number of vCPUs actually available is the lower number in the name. For example, in the above-mentioned Standard_E96-24ds_v5 instance type, the "96" represents the memory, I/O, and network resources normally associated with a 96-vCPU virtual machine; it does not indicate that 96 vCPUs are available. Only 24 vCPUs are available with this instance type, and that is the count to be used when licensing Oracle. Per the Oracle guidance quoted above, these 24 vCPUs, each hyperthreaded by 2, represent 12 CPU cores, so the number of Oracle Processor licenses for this VM is 12.

As an interesting side note, the same Oracle documentation on licensing in public clouds (HERE) states: "When counting Oracle Processor license requirements in Authorized Cloud Environments, the Oracle Processor Core Factor Table is not applicable." Thus, the popular Oracle Processor Core Factor Table discount is available only on-premises and in Oracle Cloud, but not in Azure. This is the basis of another myth from Oracle sales teams: that Oracle Database is half as expensive in Oracle Cloud as in Azure. It has nothing to do with technology, performance, or the cost of resources; it is merely a discount that Oracle has reserved for itself.

Of course, for a basic technical question such as counting CPUs, there must be an empirical way to settle the matter. Oracle is welcome to recommend any Linux or Oracle utility they prefer to count the number of vCPUs presented by a VM, but one good suggestion is the Linux lscpu command. Whatever count such a utility returns should determine the licensing count, as sketched below.
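To make the arithmetic concrete, here is a minimal Python sketch of the counting rule quoted above. The function name and the rounding-up of odd vCPU counts are my own assumptions, and the vCPU count itself should come from a utility such as lscpu or nproc run on the VM in question:

```python
import subprocess

def azure_oracle_licenses(vcpus: int, multithreading: bool = True) -> int:
    """Oracle Processor licenses per the Authorized Cloud Environment policy
    quoted above: 2 vCPUs = 1 license when multi-threading is enabled,
    1 vCPU = 1 license when it is not. Rounding up odd counts is an assumption."""
    return (vcpus + 1) // 2 if multithreading else vcpus

# Constrained Standard_E96-24ds_v5: only 24 vCPUs are presented to the OS.
assert azure_oracle_licenses(24) == 12   # not the 48 that 96 vCPUs would imply

# Empirical check, run on the VM itself: count the vCPUs the OS can use.
vcpus = int(subprocess.check_output(["nproc"]).decode().strip())
print(f"{vcpus} vCPUs -> {azure_oracle_licenses(vcpus)} Oracle Processor licenses")
```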
In summary, beware of Oracle sales personnel attempting to freelance with their own perspectives on licensing. Given the obvious conflict of interest, they are not the most reliable source of such information. Oracle's License Management Services (LMS) team provides authoritative decisions on licensing. When anyone spreads misinformation about Oracle licensing, please click the Contact Oracle LMS button on the LMS home page to get the word from the folks who can provide the real answer.

TimGormanTech | Jul 14, 2022 | Microsoft

Data Skills Transitions are like Sunsets and Sunrises
I was inspired by BuckWoodyMSFT's post on Monday, but it was not until I was walking the dog (Hudson) this morning that I got the hook I was looking for. My Mom always said, when it came to sunrises and sunsets, "Orange sky in morning, sailors take warning; orange sky at night, sailors delight." We can argue over the color of this sunrise (it is winter in Seattle, so any sun is good), but when it comes to data skills transitions, we data professionals (Data Architects, DBAs, Data Engineers, Data Scientists) need to heed the warnings. Cloud Data, AI, and BI offerings like Azure Synapse Analytics, Azure Machine Learning, Azure Data Factory, Azure Databricks, Azure Data Lake Storage, and Power BI have changed the game.

I started my data warehousing and decision support career back in 1993 and have seen many a product sunset: Metaphor Computer Systems, Sagent (a great ETL tool that is now somewhere in Pitney Bowes data integration), and Brio Technology. I moved to Seattle 23 years ago and used SQL Server 6.5, Sagent, and Brio to enable data analysts at University of Washington Physicians. I quickly migrated my data mart to SQL Server 7 in 1998. SQL Server had a major impact on the BI/DW industry (OLAP Services changed the market) and on my career. It was important for my career when I joined Business Objects as an OLAP champion back in 2000, and at WaMu, where a team I managed deployed SQL Server and BObj like crazy. I joined Microsoft in 2005, just before SQL Server 2005 launched. In 2005 I also attended my first TechReady (Microsoft's internal tech readiness event) at the Washington State Convention Center. It was an awesome readiness event all the way to TechReady 24 (that is 12 years, because these happen twice a year). We are at the sunrise of Winter Ready in Seattle this Monday, Feb 3rd. I can't wait to learn and skill up, because I can see the new data skills needed on the horizon.

So why this history lesson? If you are seasoned like me and nearing the sunset of your career, I implore you to batten down the hatches and get ready for the storm by upskilling. Businesses need your energy and experience. Many of you are well on your way. Don't let the newbies get all the opportunities on the new projects that are out there; the new generation of data pros needs us. For people early in their data careers, IMO you have an awesome future. I think SQL Server 2019 Big Data Clusters and the things it "spawned," like Azure Arc and Azure Synapse Analytics, will provide you with decades of learning, challenges, and employment. I have had a love/hate relationship with certifications but would recommend Microsoft Certified: Azure Data Engineer Associate; IMO it will make us all better, and the more people who take it, the better the certification will be. But get your hours in using the Azure Portal and Power BI portal. Let's be ready and certified. May there be many sunrises and sunsets in your work and personal lives.

Thanks for joining the Data Architecture Community!

Darwin Schweitzer | Education Cloud Solution Architect | US Education | darsch@microsoft.com | Twitter @DataSnowman | GitHub DataSnowman | Data and AI resources at https://github.com/Azure/carprice

DarwinSchweitzer | Jan 30, 2020 | Microsoft

Data Warehousing using Apache Spark on Azure HDInsight
Hi team, hope all are safe! This is my first project in Azure, and we are looking at developing a DW using Apache Spark on Azure HDInsight. In simple terms, we are currently trying to pick up files from SharePoint, do transformations using PySpark, and then load the data into an Azure SQL DB. Can someone help me with the queries below?

1) Can we connect Apache Spark or PySpark on Azure HDInsight to SharePoint to pick up files?
2) Can we implement the usual SCD1 or SCD2 logic using PySpark? (See the sketch after this post.)

Thanks in advance!

Aishwar04 | Jul 12, 2020 | Copper Contributor
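On question 2: yes, SCD logic is commonly implemented in PySpark. Below is a minimal SCD Type 2 sketch; the dimension layout, business key, tracked column, and dates are illustrative assumptions, not details from the original post. SCD1 is simpler still: overwrite the changed attribute in place instead of closing out rows.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2-sketch").getOrCreate()

# Dimension with full history: one open row (is_current=True) per key.
dim = spark.createDataFrame(
    [(1, "Alice", "Spokane", "2019-01-01", "2020-01-01", False),
     (1, "Alice", "Seattle", "2020-01-01", None, True)],
    ["id", "name", "city", "valid_from", "valid_to", "is_current"],
)
# Incoming extract: id=1 changed city, id=2 is brand new.
src = spark.createDataFrame(
    [(1, "Alice", "Tacoma"), (2, "Bob", "Olympia")],
    ["id", "name", "city"],
)
load_date = "2020-07-12"

history = dim.filter(~F.col("is_current"))
current = dim.filter(F.col("is_current"))

# Business keys whose tracked attribute changed in this load.
changed = (current.alias("d").join(src.alias("s"), "id")
           .filter(F.col("d.city") != F.col("s.city"))
           .select("id"))

# Close the old version of each changed key...
closed = (current.join(changed, "id", "left_semi")
          .withColumn("valid_to", F.lit(load_date))
          .withColumn("is_current", F.lit(False)))
# ...keep current rows that did not change...
untouched = current.join(changed, "id", "left_anti")
# ...and open a new version for changed keys plus never-seen keys.
inserts = (src.join(changed, "id", "left_semi")
           .unionByName(src.join(dim, "id", "left_anti"))
           .withColumn("valid_from", F.lit(load_date))
           .withColumn("valid_to", F.lit(None).cast("string"))
           .withColumn("is_current", F.lit(True)))

dim = history.unionByName(untouched).unionByName(closed).unionByName(inserts)
dim.show()
```

On question 1, reading directly from SharePoint depends on SharePoint's APIs rather than on Spark itself; a common pattern is to land the files in Azure Data Lake Storage first and let Spark read from there.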
Proposal for a new data structure that extremely reduces data sizes for data in which two item types have many-to-many relations

I propose a new data structure that reduces data sizes for data in which two item types have many-to-many relations. The proposed data structure introduces container variables related to many values of both item types, and these container variables record the many-to-many relations between them. The proposed data structure maintains data normalization and integrity and is independent of the indexing methods conventionally used for relational databases, allowing simultaneous use of both. When one item type has N items, the other item type has M items, and every item of N is related to every item of M, a conventional RDB requires N×M rows, whereas the proposed data structure requires N+M rows. When N = 100,000 and M = 10,000,000, the conventional RDB requires 1,000,000,000,000 rows, whereas the proposed data structure requires only 10,100,000 rows (see the toy sketch after this post). For details, please see the journal article at https://www.iaiai.org/journals/index.php/IEE/article/view/589 or US Patent No. 11294961. In the patent, "upper item or item group index" is used as another name for container variable.

KuwabaraT | Aug 05, 2022 | Copper Contributor
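To see the row-count arithmetic in miniature, here is a toy Python sketch of the idea as I read it from the abstract above; the representation and names are my own illustration, not the data structure defined in the article or the patent:

```python
from itertools import product

A = [f"a{i}" for i in range(3)]  # stands in for the N items of one type
B = [f"b{j}" for j in range(4)]  # stands in for the M items of the other type

# Conventional junction table: one row per related pair -> N*M rows.
junction = set(product(A, B))
assert len(junction) == len(A) * len(B)

# Container-variable style: one membership row per item -> N+M rows.
container_id = "c1"
memberships = {(a, container_id) for a in A} | {(container_id, b) for b in B}
assert len(memberships) == len(A) + len(B)

def related(a: str, b: str) -> bool:
    """(a, b) are related iff both reference the same container."""
    return (a, container_id) in memberships and (container_id, b) in memberships

assert related("a0", "b3")  # same relation, recorded in far fewer rows
```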
Microsoft Azure: Routing manufacturing IoT Edge data between on-premise Purdue model levels via MQTT

Microsoft Azure IoT Hub provides out-of-the-box capabilities to send device-to-cloud messages directly into Azure for advanced logging/routing and for generating actions based on events occurring on the edge. However, many customers, for example in the manufacturing domain, adopt the Purdue Enterprise Reference Architecture (PERA) in their plant IoT implementations, and one frequent requirement is to allow Azure IoT Hub to send data to their internal MQTT brokers, especially to allow communication between Purdue Level 2 (Control Systems) and Level 4 (Business Planning). This scenario is not limited to the manufacturing domain, though. Although Azure IoT Hub itself supports MQTT endpoints for direct communication, it does not provide an out-of-the-box capability to post messages to customer-managed local MQTT brokers. The Azure IoT product group is working on BYOMB (Bring Your Own [MQTT] Broker), but it may take some time for this capability to be fully baked into the out-of-the-box experience.

It is very interesting to note that routing IoT device messages to local ecosystems (on premises) without reaching out to the Azure cloud is becoming an increasingly popular data architecture pattern in manufacturing and many other industries. Most customers want this capability so they can generate actions and alerts locally; for example, a manufacturing plant wants to send an alert to SCADA (Supervisory Control and Data Acquisition) / HMI (Human Machine Interface) systems for immediate action without making a round trip to the Azure cloud. Provisioning an MQTT broker like eclipse-mosquitto is a very common way to fulfill this kind of need, so that a single alert can be fanned out to many subscribed systems, if necessary, to continuously serve an event-driven data architecture for improved decisions and business outcomes.

Recently, one of our manufacturing customers was looking to address this exact gap in Azure IoT data architecture solutions. While designing the solution, the customer wanted to leverage only Azure PaaS (Platform-as-a-Service) offerings available on the edge, which makes a lot of sense. Hence, the solution was developed using the Azure Functions PaaS service, which already supports deployment on the edge, and we chose Python as the language, the most widely adopted scripting language in recent days. However, Azure Functions on the edge also supports C#/.NET, if you are a .NET shop! The step-by-step instructions and some learnings from the solution we created are already documented here on this GitHub repository.
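For illustration, here is a minimal sketch of what the local-publish step of such a solution could look like, assuming the paho-mqtt client library and a local eclipse-mosquitto broker. The trigger binding, broker host, and topic name are my assumptions, not the contents of the linked repository:

```python
import json

import azure.functions as func
import paho.mqtt.publish as mqtt_publish

LOCAL_BROKER = "mosquitto.local"    # assumed on-premises eclipse-mosquitto host
ALERT_TOPIC = "plant/line1/alerts"  # assumed topic the SCADA/HMI side subscribes to

def main(event: func.EventHubEvent) -> None:
    """Azure Function on IoT Edge: fan an edge alert out to the local MQTT
    broker so SCADA/HMI systems can act without a cloud round trip."""
    alert = json.loads(event.get_body().decode("utf-8"))
    mqtt_publish.single(
        ALERT_TOPIC,
        payload=json.dumps(alert),
        hostname=LOCAL_BROKER,
        port=1883,  # default unencrypted MQTT port
    )
```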
Exam DP-300: Administering Relational Databases on Microsoft Azure

Hello guys, where can I get study materials for the above exam?

MVPromise | Apr 25, 2020 | Brass Contributor
Tags
- data architecture (1 topic)
- Manufacturing IoT (1 topic)
- learning (1 topic)
- AZURE AMA (1 topic)
- Machine Learning Service (1 topic)
- DP-100 (1 topic)
- docker (1 topic)