Live AMA: Demystifying Azure pricing (AM session)
⏱️ This live AMA takes place on January 22nd, 2026 at 9:00 AM PT. The same session is also scheduled for 5:00 PM PT on January 22nd.

SESSION DETAILS
This session breaks down the complexity of Azure pricing to help you make informed decisions. We'll cover how to estimate costs accurately using tools like the Azure Pricing Calculator, explore strategic pricing offers such as Reservations, Savings Plans, and Azure Hybrid Benefit, and share best practices for optimizing workloads for cost efficiency. Whether you're planning migrations or managing ongoing cloud spend, you'll learn actionable strategies to forecast, control, and reduce costs without compromising performance. This session includes a live chat-based AMA. Submit your questions below as a comment to be answered by the product team!

Automating Microsoft Sentinel: A blog series on enabling Smart Security
This entry guides readers through building custom Playbooks in Microsoft Sentinel, highlighting best practices for trigger selection, managed identities, and integrating built-in tools and external APIs. It offers practical steps and insights to help security teams automate incident response and streamline operations within Sentinel.

Live AMA: Demystifying Azure pricing (PM session)
⏱️ This live AMA takes place on January 22nd, 2026 at 5:00 PM PT. The same session is also scheduled for 9:00 AM PT on January 22nd. The session details are the same as for the AM session above, and this session likewise includes a live chat-based AMA. Submit your questions below as a comment to be answered by the product team!

Moving an app from AWS to Azure
Moving an application from AWS to Azure is more than a simple lift-and-shift. Successful replication requires selecting the right Azure services to support scalability, security, and long-term growth. This article outlines the key migration considerations software development teams must address to successfully replicate AWS-based applications on Azure.

Walk through how to:
- Map core AWS services to their Azure equivalents
- Design Azure-ready architectures with the right identity and security foundations
- Avoid common pitfalls that slow migrations and increase rework

Discover the key Azure resources that help teams:
- Move faster with confidence
- Reduce rework during development and deployment
- Set a strong foundation for long-term success on the Azure platform

Read the full blog here: Replicating your AWS application to Azure: key resources for software development companies | Microsoft Community Hub

Expanding the multicloud advantage: Database migration paths into Azure
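The AWS-to-Azure service-mapping exercise these posts describe can be sketched as a simple lookup over your service inventory. The pairs below are illustrative examples of common equivalences, not an authoritative list, so validate each choice against your workload:

```python
# Illustrative AWS -> Azure service mapping used when planning a replication.
# The pairs are examples only; validate each choice against your workload.
AWS_TO_AZURE = {
    "Amazon RDS (PostgreSQL)": "Azure Database for PostgreSQL",
    "Amazon RDS (SQL Server)": "Azure SQL Database",
    "Amazon DynamoDB": "Azure Cosmos DB (NoSQL API)",
    "Amazon Redshift": "Azure Synapse Analytics",
    "Amazon ElastiCache (Redis)": "Azure Cache for Redis",
}

def map_services(aws_services):
    """Return (mapped, unmapped) for an inventory of AWS service names."""
    mapped = {s: AWS_TO_AZURE[s] for s in aws_services if s in AWS_TO_AZURE}
    unmapped = [s for s in aws_services if s not in AWS_TO_AZURE]
    return mapped, unmapped

mapped, unmapped = map_services(["Amazon DynamoDB", "Amazon Redshift", "Amazon SQS"])
# mapped   -> DynamoDB and Redshift resolve to their Azure equivalents
# unmapped -> ["Amazon SQS"]: services needing a manual architecture decision
```

Keeping the unmapped list explicit is the useful part: every service without a direct equivalent becomes an architecture decision to document before migration.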
Broaden your customer base and enhance your app's exposure by bringing your AWS-based solution to Azure and listing it on Microsoft Marketplace. This guide walks you through how Azure database services compare to those on AWS, spotlighting differences in architecture, scalability, and feature sets, so you can make confident choices when replicating your app's data layer to Azure.

This post is part of a series on replicating apps from AWS to Azure. View all posts in this series.

For software development companies looking to expand or replicate their marketplace offerings from AWS to Microsoft Azure, one of the most critical steps is selecting the right Azure database services. While both AWS and Azure provide robust managed database options, their architecture, service availability, and design approaches vary. To deliver reliable performance, scale globally, and meet operational requirements, it's essential to understand how Azure databases work, and how they compare to AWS, before you replicate your app.

AWS to Azure database mapping

When replicating your app from AWS to Azure, start by mapping your existing database services to the closest Azure equivalents. Both clouds offer relational, NoSQL, and analytics databases, but they differ in architecture, features, and integration points. Choosing the right Azure service helps keep your app performant, secure, and manageable, and aligns with Azure Marketplace requirements for an Azure-native deployment.

| AWS Service | Azure Equivalent | Recommended Use Cases & Key Differences |
| --- | --- | --- |
| Amazon RDS (MySQL/PostgreSQL) | Azure Database for MySQL / PostgreSQL | Fully managed relational DB with built-in HA, scaling, and security. Building generative AI apps. |
| Amazon RDS (SQL Server) | Azure SQL Database or Azure SQL Managed Instance | Use Azure SQL Database for modern apps; choose Managed Instance for near-100% compatibility with on-prem SQL Server. |
| SQL Server on EC2 | SQL Server on Azure VMs | Best for lift-and-shift scenarios requiring full OS-level control. |
| Amazon RDS (Oracle) | Oracle Database@Azure | Managed Oracle workloads with Azure integration. |
| Amazon Aurora (PostgreSQL/MySQL) | Azure Database for PostgreSQL (Flexible Server) or Azure Database for MySQL | Similar managed experience for large workloads; also consider Azure HorizonDB (public preview), built on PostgreSQL to compete with Aurora and AlloyDB. Learn more. |
| Amazon DynamoDB | Azure Cosmos DB (NoSQL API) | Global distribution, multi-model support, and guaranteed SLAs for latency and throughput. |
| Amazon Keyspaces (Cassandra) | Azure Managed Instance for Apache Cassandra | Managed Cassandra with elastic scaling and Azure-native security. |
| Cassandra on EC2 | Azure Managed Instance for Apache Cassandra | Same as above; ideal for lift-and-shift Cassandra clusters. |
| Amazon DocumentDB / MongoDB Atlas / MongoDB on EC2 | Azure DocumentDB | Drop-in compatibility for MongoDB workloads with global replication and vCore-based pricing. |
| Amazon Redshift | Azure Synapse Analytics | Enterprise analytics with integrated data lake and Power BI connectivity. |
| Amazon ElastiCache (Redis) | Azure Cache for Redis | Low-latency caching with clustering and persistence options. |

Match your use case

After mapping AWS services to Azure equivalents, the next step is selecting the right service for your workload. Start by considering the data model (relational, document, key-value), then factor in performance, consistency, and global reach.

- Building AI apps: Generative AI, vector search, advanced analytics.
- Relational workloads: Use Azure SQL Database, Azure SQL Managed Instance, or Azure Database for MySQL/PostgreSQL for transactional apps; enable zone redundancy for HA. Review schema compatibility, stored procedures, triggers, and extensions. Inventory all databases, tables, indexes, users, and dependencies before migration. Document any required refactoring for Azure.
- NoSQL workloads: Choose Azure Cosmos DB for globally distributed apps; select the API (NoSQL, MongoDB, Cassandra) that matches your existing schema. Validate data model mapping and test the migration in a sandbox environment to ensure data integrity and application connectivity.
- Analytics: For large-scale queries and BI integration, Azure Synapse Analytics offers an MPP architecture and tight integration with Azure Data Lake. Inventory all analytics assets, ETL pipelines, and dependencies. Plan the migration using Azure Data Factory or Synapse pipelines. Test performance benchmarks and optimize query plans post-migration.
- Caching: Azure Cache for Redis accelerates app performance with in-memory data and clustering. Update application connection strings and drivers to use Azure endpoints. Implement retry logic and connection pooling for reliability. Validate cache warm-up and failover strategies.
- Hybrid scenarios: Combine Cosmos DB with Synapse Link (for Synapse as target) or Fabric Mirroring (for Fabric as target) for real-time analytics without ETL overhead. Assess network isolation, security, and compliance requirements. Deploy Private Endpoints and configure RBAC as needed. Document integration points and monitor hybrid data flows.

Factor in security and compliance

- Encryption: Confirm default encryption meets compliance requirements; enable customer-managed keys (CMK) if needed. Enable Transparent Data Encryption (TDE) and review encryption for backups and in-transit data.
- Access control: Apply Azure RBAC and database-level roles for granular permissions. Audit user roles and permissions regularly to ensure least privilege.
- Network isolation: Use Private Endpoints within a virtual network to keep traffic off the public internet. Configure Network Security Groups (NSGs) and firewalls for additional protection.
- Identity integration: Prefer Managed Identities for secure access to databases. Integrate with Microsoft Entra ID for centralized identity management.
- Compliance checks: Verify certifications like GDPR, HIPAA, or industry-specific standards. Use Azure Policy and Compliance Manager to automate compliance validation.
- Audit logging and threat detection: Enable audit logging and advanced threat detection with Microsoft Defender for all database services. Review logs and alerts regularly.

Optimize for cost

- Compute tiers: Choose General Purpose for balanced workloads; Business Critical for low latency and high IOPS. Review workload sizing and adjust tiers as needed for cost efficiency.
- Autoscaling: Enable autoscale for Cosmos DB and flexible servers to avoid overprovisioning. Monitor scaling events and set thresholds to control spend.
- Reserved capacity: Commit to 1–3 years for predictable workloads to unlock discounts. Evaluate usage patterns before committing to reservations.
- Serverless: Use serverless compute for workloads with completely ad hoc usage and a low frequency of access. This eliminates the need for pre-provisioned resources and reduces costs for unpredictable workloads.
- Monitoring: Use Azure Cost Management and query performance insights to optimize spend. Set up budget alerts and analyze cost trends monthly. Include basic resource monitoring to detect adverse usage patterns early.
- Storage and backup costs: Review storage costs and backup retention policies, and configure lifecycle management for backups and archives.

Data migration from AWS to Azure

Migrating your data from AWS to Azure is a key step in replicating your app's database layer for Azure Marketplace. The goal is a one-time transfer: after migration, your app runs fully on Azure.

- Azure Database Migration Service (DMS): Automates migration from RDS, Aurora, or on-prem to Azure SQL Database, Azure SQL Managed Instance, Azure Database for MySQL/PostgreSQL, and SQL Server on Azure VMs (for MySQL/PostgreSQL/SQL Server). Supports online and offline migrations; run pre-migration assessments and schema validation.
- Azure Data Factory: Orchestrates data movement from DynamoDB, Redshift, or S3 to Azure Cosmos DB or Synapse. Use mapping data flows for transformations and data cleansing.
- MongoDB migrations: Use the online migration utility, designed for medium to large-scale migrations, to move to Azure DocumentDB. Ensure schema compatibility and validate performance benchmarks before cutover.
- Cassandra migrations: Use a Cassandra hybrid cluster or a dual-write proxy for Azure Managed Instance for Apache Cassandra. Validate schema compatibility and test the migration in a sandbox environment.
- Offline transfers: For very large datasets, use Azure Data Box for secure physical migration. Plan logistics and security for device handling.
- Migration best practices: Schedule the migration during a maintenance window, validate data integrity post-migration, and perform the cutover only after successful data validation and verification.

Final readiness before marketplace listing

- Validate performance: Benchmark with real data and confirm the chosen SKUs deliver the required throughput and latency. Test application functionality under expected load and validate query performance for all critical scenarios.
- Lock down security: Ensure RBAC roles, Private Endpoints, and encryption meet compliance requirements. Review audit logs, enable threat detection, and verify access controls for all database and storage resources.
- Control costs: Verify autoscaling, reserved capacity, and cost alerts are active. Review storage and backup policies, and set up budget alerts for ongoing cost control.
- Enable monitoring: Set up dashboards for query performance, latency, and capacity. Configure alerts for failures, anomalies, and capacity thresholds. Monitor with Azure Monitor and Log Analytics for real-time operational insights.
- Documentation and support: Update migration runbooks, operational guides, troubleshooting documentation, and escalation contacts for post-migration support.
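The checklists above repeatedly call for retry logic when applications connect to Azure database endpoints. As a minimal, hypothetical sketch (the `connect` callable and the delay values are placeholders, not an Azure SDK API), exponential backoff might look like this:

```python
import time

def with_retries(connect, attempts=5, base_delay=0.5, sleep=time.sleep):
    """Call `connect` with exponential backoff, re-raising the last error.

    `connect` is any zero-argument callable that raises on transient failure;
    `sleep` is injectable so tests can avoid real waiting.
    """
    for attempt in range(attempts):
        try:
            return connect()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure
            # wait 0.5s, 1s, 2s, ... between attempts
            sleep(base_delay * (2 ** attempt))

# Example: a fake connection that fails twice before succeeding.
state = {"calls": 0}
def flaky_connect():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient failure")
    return "connected"

print(with_retries(flaky_connect, sleep=lambda s: None))  # connected
```

In production code you would typically catch only transient error types and add jitter to the delay; many Azure client libraries also ship their own configurable retry policies, which are preferable when available.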
Key Resources

- SaaS Workloads - Microsoft Azure Well-Architected Framework | Microsoft Learn
- Metered billing for SaaS offers in Partner Center
- Create plans for a SaaS offer in Microsoft Marketplace
- Get over $126K USD in benefits and technical consultations to help you replicate and publish your app with ISV Success
- Maximize your momentum with step-by-step guidance to publish and grow your app with App Advisor

Unleashing New Business Opportunities for Microsoft Partners with PostgreSQL & MySQL on Azure
The latest innovations announced at Microsoft Ignite 2025 for PostgreSQL and MySQL running on Azure are more than just technical upgrades: they're a launchpad for new business growth, deeper customer engagement, and accelerated digital transformation. Here's how these advancements can help you deliver greater value and unlock new opportunities for your clients.

1. Introducing Azure HorizonDB: Built for Performance and AI Workloads

We're excited to unveil Azure HorizonDB in private preview: a new, fully managed PostgreSQL service engineered for businesses and developers alike. HorizonDB is designed for ultra-low latency, high read scale, and built-in AI capabilities, offering seamless scaling up to 192 virtual cores and 128 TB of storage. Deep integration with developer tools, including GitHub Copilot, delivers performance, resilience, and simplicity at any scale. With HorizonDB, teams can:

- Build AI apps at scale using advanced DiskANN vector indexing, pre-provisioned AI models, semantic search, and unified support for both relational and graph data.
- Accelerate app development with built-in extensions, including the PostgreSQL extension for Visual Studio Code integrated with GitHub Copilot. Copilot in VS Code is context-aware for PostgreSQL and enables one-click performance debugging.
- Unlock data insights through deep integrations with Microsoft Fabric and Microsoft Foundry.
- Expect reliability with enterprise-ready features from day one, including Entra ID integration, Private Link networking, and Microsoft Defender for Cloud.

Business Opportunity: Position your practice as an early adopter and expert in next-generation database solutions by introducing customers to Azure HorizonDB. Use this conversation to offer migration, modernization, and AI-powered application development services leveraging Azure Database for PostgreSQL, with future migrations to HorizonDB.
Help clients build resilient, high-performance, and intelligent data platforms, driving new revenue streams and deeper customer engagement.

2. Modernize Data Infrastructure with Limitless Scale and Performance

Azure's new Elastic Clusters in Azure Database for PostgreSQL enable organizations to scale their databases horizontally across multiple nodes, supporting virtually unlimited throughput and storage. This means you can help clients build and grow multi-tenant SaaS applications and large-scale analytics solutions without the complexity of manual sharding or the limitations of legacy infrastructure. Azure's managed service automates shard management, tenant isolation, and cross-node query coordination, freeing up your teams to focus on innovation instead of administration.

Business Opportunity: Position your practice as the go-to partner for scalable, future-proof data platforms. Offer migration services, architecture consulting, and managed solutions that leverage Azure's unique scale-out capabilities.

3. Accelerate Innovation with AI-Ready Databases

Azure is leading the way in AI integration for open-source databases. With the PostgreSQL extension for Visual Studio Code and native Microsoft Foundry support, developers can build smarter apps and AI agents leveraging advanced AI capabilities directly in the database. Features like natural language querying, vector search, and seamless Copilot integration mean your clients can unlock new insights and automate processes faster than ever.

Business Opportunity: Expand your offerings to include AI-powered analytics, intelligent agent development, and custom Copilot solutions. Help organizations harness their data for real-time decision-making and enhanced customer experiences.

4. Simplify and Accelerate Migrations from Legacy Systems

The new AI-assisted Oracle to PostgreSQL migration tool dramatically reduces the effort and risk of moving off expensive, proprietary databases.
Integrated into the PostgreSQL extension for VS Code, it automates schema and code conversion, provides inline AI explanations, and ensures secure, context-aware migrations.

Business Opportunity: Lead migration projects that deliver rapid ROI. Offer assessment, planning, and execution services to help clients escape legacy costs and embrace open-source flexibility on Azure.

5. Enable Seamless Analytics and Real-Time Insights

With support for Parquet in the Azure storage extension for PostgreSQL and Fabric zero-ETL mirroring for Azure Database for MySQL and Azure Database for PostgreSQL, Azure is bridging operational databases and analytics platforms.

Business Opportunity: Build solutions that unify data estates, streamline analytics workflows, and deliver actionable intelligence. Position your team as experts in data integration and real-time analytics.

6. Drive Industry-Specific Transformation

Ignite 2025 showcases real-world success stories from industries like healthcare (Apollo Hospitals), automotive (GM), and finance (Nasdaq), demonstrating how Azure's open-source databases power resilient, scalable, and AI-driven solutions.

Business Opportunity: Use these case studies to inspire clients in regulated or complex sectors. Offer tailored solutions that meet strict compliance, security, and performance requirements.

Why Partners Win with Azure's Latest Innovations

- Faster time-to-value: Help clients adopt the latest tech with minimal downtime and risk.
- Expanded service portfolio: From migration to AI, analytics to managed services, the new capabilities open doors to new revenue streams.
- Trusted platform: Azure's enterprise-grade security, compliance, and high availability mean you can deliver solutions with confidence.

Ready to help your customers achieve more? Dive deeper into the Ignite 2025 announcements and start building the next generation of intelligent, scalable, and AI-powered solutions on Microsoft Azure.
Learn more here: https://ignite.microsoft.com/en-US/home

Run a SQL Query with Azure Arc
Hi All,

In this article you will find a way to retrieve database permissions from all your onboarded databases through Azure Arc. The idea was born from a customer request about maintaining a standard permission set in a very wide environment (about 1,000 SQL Servers). The solution is based on Azure Arc, so you first need to onboard your SQL Servers to Azure Arc and enable the SQL Server extension. If you want to try Azure Arc in a test environment, you can use Azure Arc Jumpstart; that repo contains ready-to-deploy ARM templates that deploy demo environments. The other solution components are an Automation Account, a Log Analytics workspace, and a Data Collection Rule/Endpoint. Here is a short recap of the purpose of each component:

- Automation Account: lets you run and schedule a PowerShell script, and store credentials securely
- Log Analytics workspace: where you create a custom table and store all the data that comes from the script
- Data Collection Endpoint / Data Collection Rule: expose a public endpoint that allows you to ingest the collected data into the Log Analytics workspace

In this section you will discover how I composed the six phases of the script.

Obtain the bearer token and authenticate: first of all, you need to authenticate against Azure to enumerate the SQL instances and to obtain the token used to send your assessment data to Log Analytics.

$tenantId = "XXXXXXXXXXXXXXXXXXXXXXXXXXX"
$cred = Get-AutomationPSCredential -Name 'appreg'
Connect-AzAccount -ServicePrincipal -Tenant $tenantId -Credential $cred
$appId = $cred.UserName
$appSecret = $cred.GetNetworkCredential().Password
$endpoint_uri = "https://sampleazuremonitorworkspace-weu-a5x6.westeurope-1.ingest.monitor.azure.com" # logs ingestion URI for the DCE
$dcrImmutableId = "dcr-sample2b9f0b27caf54b73bdbd8fa15908238799" # the immutableId property of the DCR object
$streamName = "Custom-MyTable"
$scope = [System.Web.HttpUtility]::UrlEncode("https://monitor.azure.com//.default")
$body = "client_id=$appId&scope=$scope&client_secret=$appSecret&grant_type=client_credentials"
$headers = @{ "Content-Type" = "application/x-www-form-urlencoded" }
$uri = "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token"
$bearerToken = (Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers).access_token

Get all the SQL instances: in my example I take all the instances; you can also use a tag to filter resources. For example, if you want to assess only the production environment, you can use a tag as a filter.

$servers = Get-AzResource -ResourceType "Microsoft.AzureArcData/SQLServerInstances"

Once you have all the SQL instances, you can run your T-SQL query to obtain the permissions. Here we are looking at permissions, but you can use the same approach for any query you want, or in any other situation where you need to run a command on a generic server.

$SQLCmd = @'
Invoke-SQLcmd -ServerInstance . `
-Query "USE master;
BEGIN
IF LEFT(CAST(Serverproperty('ProductVersion') AS VARCHAR(1)),1) = '8'
begin
  IF EXISTS (SELECT TOP 1 * FROM tempdb.dbo.sysobjects (nolock) WHERE name LIKE '#TUser%')
  begin
    DROP TABLE #TUser
  end
end
ELSE
begin
  IF EXISTS (SELECT TOP 1 * FROM tempdb.sys.objects (nolock) WHERE name LIKE '#TUser%')
  begin
    DROP TABLE #TUser
  end
end
CREATE TABLE #TUser (DBName SYSNAME, [Name] SYSNAME, GroupName SYSNAME NULL, LoginName SYSNAME NULL, default_database_name SYSNAME NULL, default_schema_name VARCHAR(256) NULL, Principal_id INT);
IF LEFT(CAST(Serverproperty('ProductVersion') AS VARCHAR(1)),1) = '8'
  INSERT INTO #TUser EXEC sp_MSForEachdb '
    SELECT ''?'' as DBName, u.name As UserName,
      CASE WHEN (r.uid IS NULL) THEN ''public'' ELSE r.name END AS GroupName,
      l.name AS LoginName, NULL AS Default_db_Name, NULL as default_Schema_name, u.uid
    FROM [?].dbo.sysUsers u
    LEFT JOIN ([?].dbo.sysMembers m JOIN [?].dbo.sysUsers r ON m.groupuid = r.uid) ON m.memberuid = u.uid
    LEFT JOIN dbo.sysLogins l ON u.sid = l.sid
    WHERE (u.islogin = 1 OR u.isntname = 1 OR u.isntgroup = 1) and u.name not in (''public'',''dbo'',''guest'')
    ORDER BY u.name '
ELSE
  INSERT INTO #TUser EXEC sp_MSforeachdb '
    SELECT ''?'', u.name,
      CASE WHEN (r.principal_id IS NULL) THEN ''public'' ELSE r.name END GroupName,
      l.name LoginName, l.default_database_name, u.default_schema_name, u.principal_id
    FROM [?].sys.database_principals u
    LEFT JOIN ([?].sys.database_role_members m JOIN [?].sys.database_principals r ON m.role_principal_id = r.principal_id) ON m.member_principal_id = u.principal_id
    LEFT JOIN [?].sys.server_principals l ON u.sid = l.sid
    WHERE u.TYPE <> ''R'' and u.TYPE <> ''S'' and u.name not in (''public'',''dbo'',''guest'')
    order by u.name ';
SELECT DBName, Name, GroupName, LoginName
FROM #TUser
WHERE Name not in ('information_schema') and GroupName not in ('public')
ORDER BY DBName, [Name], GroupName;
DROP TABLE #TUser;
END"
'@
$command = New-AzConnectedMachineRunCommand -ResourceGroupName "test_query" `
  -MachineName $server1 -Location "westeurope" -RunCommandName "RunCommandName" -SourceScript $SQLCmd

Shortly afterwards, you will receive the output of the command, and you must send it to the Log Analytics workspace (aka LAW). In this phase, you can also review the output before sending it to LAW, for example removing some text or filtering some results. In my case, I add the information about the server where the script ran to each record.

$array = ($command.InstanceViewOutput -split "\r?\n" | Where-Object { $_.Trim() }) | ForEach-Object { $_.Replace('\', '\\') } # escape backslashes for the JSON payload
$array = $array | Where-Object { $_ -notmatch "DBName,Name,GroupName,LoginName" } | Where-Object { $_ -notmatch "------" } # drop the header and separator lines

The last phase sends the output to the Log Analytics workspace through the DCE/DCR.

$staticData = @"
[{
  "TimeGenerated": "$currentTime",
  "RawData": "$raw"
}]
"@
$body = $staticData
$headers = @{ "Authorization" = "Bearer $bearerToken"; "Content-Type" = "application/json" }
$uri = "$endpoint_uri/dataCollectionRules/$dcrImmutableId/streams/$($streamName)?api-version=2023-01-01"
$rest = Invoke-RestMethod -Uri $uri -Method "Post" -Body $body -Headers $headers

When the data arrives in the Log Analytics workspace, you can query it, build a dashboard, or even create an alert.

Now you will see how to implement this solution. For the Log Analytics workspace, the DCE, and the DCR, you can follow the official docs: Tutorial: Send data to Azure Monitor Logs with Logs ingestion API (Resource Manager templates) - Azure Monitor | Microsoft Learn

After you create the DCR and the Log Analytics workspace with its custom table, you can proceed with the Automation Account. Create an Automation Account using the creation wizard; you can proceed with the default parameters. When the Automation Account creation is complete, create a credential in the Automation Account.
This allows you to avoid exposing the credential used to connect to Azure. Insert the enterprise application ID and its key here.

Now you are ready to create the runbook (basically the script that we will schedule). Give it any name you want and click Create. Then, in the Automation Account, go to Runbooks and then Edit in Portal, where you can paste your script or the script in this link. Remember to replace the tenant ID (you will find it in the Entra ID section) and the enterprise application details.

You can test the runbook using the Test pane, and when you are ready you can publish it and link a schedule, for example daily at 5 AM.

Remember, today we talked about database permissions, but the scenarios are endless: checking a requirement, deploying a small fix, or removing/adding a configuration, at scale. In the end, as you can see, Azure Arc is not just another agent: it is a chance to empower every environment (and every other cloud provider 😉) with Azure technology.

See you in the next techie adventure.

**Disclaimer** The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Microsoft Finland - Software Developing Companies monthly community series.
Welcome back to Microsoft's webinar series for technology companies! The Software Development monthly Community series, organized by Microsoft Finland, is a webinar series that offers software companies timely information, concrete examples, and strategic insights into how collaboration with Microsoft can accelerate growth and open new business opportunities. The series is aimed at technology companies of all sizes and stages of development, from startups to global players. Each episode takes a practical look at how software companies can leverage the Microsoft ecosystem, technologies, and partner programs in their own business.

Note: the Microsoft Software Developing Companies monthly community webinars series is hosted on the Cloud Champion site, where the webinars are conveniently available as recordings a couple of hours after the live broadcast. Remember to register on the Cloud Champion platform the first time; after that you will always have access to the content and the recordings. You can register via "Register now". Fill in your details and, under Distributor, select "Other" if you do not know your Microsoft distributor.

Webinars:

30.1.2026 at 09:00-09:30 - Model Context Protocol (MCP): an open standard that is revolutionizing AI integrations

In this webinar we cover what the Model Context Protocol (MCP) is, how it enables secure and scalable connections between AI models and external systems without custom code, what Microsoft's approach to leveraging the MCP protocol is, and how software companies can take advantage of the business opportunities the MCP standard offers. We will go through:
- What MCP is and why it matters in modern AI processes
- How MCP reduces integration complexity and speeds up development
- Practical examples
The technical part of the webinar will be held in English.

Registration link: 30.1.2026 at 09:00-09:30 - Model Context Protocol (MCP): an open standard that is revolutionizing AI integrations - Finland Cloud Champion

Experts:
- Massimo Caterino, Partner Technology Strategist, Microsoft Europe North
- Mikko Marttinen, Sr Partner Development Manager, Microsoft
- Eetu Roponen, Sr Partner Development Manager, Microsoft

12.12. at 09:00-09:30 - What does the Azure region in Finland mean for software companies?

Microsoft's new datacenter region in Finland brings cloud services closer to Finnish software companies, whether you are a startup, a scaleup, or a global player. In this webinar we dig into the opportunities the new Azure region opens up from the perspectives of data residency, performance, regulation, and customer requirements. We discuss, among other things:
- How does local data residency support customer requirements and regulation?
- What do software companies gain from lower latency and better performance?
- How does the Azure region support co-selling and scaling in Finland?
- How to prepare technically and commercially for the opening of the new region?

Speakers:
- Fama Doumbouya, Sales Director, Cloud Infra and Security, Microsoft
- Mikko Marttinen, Sr Partner Development Manager, Microsoft
- Eetu Roponen, Sr Partner Development Manager, Microsoft

Watch the recording: Microsoft Finland – Software Developing Companies Monthly Community Series – What does the Azure region in Finland mean for software companies? – Finland Cloud Champion

28.11. at 09:00-09:30 - Cloud services on your own terms: what does Microsoft's Sovereign Cloud mean for software companies?

More and more software companies face requirements around data residency, regulatory compliance, and operational control, especially in the public sector and in regulated industries. In this webinar we dig into how Microsoft's new Sovereign Cloud offering answers these needs and what opportunities it opens up for Finnish software companies.
Topics include:
- How do Sovereign Public and Private Cloud differ, and what do they enable?
- How are data governance, encryption, and operational sovereignty realized in a European context?
- What does this mean for software companies building solutions for the public sector or regulated industries?

Speakers:
- Juha Karppinen, National Security Officer, Microsoft
- Mikko Marttinen, Sr Partner Development Manager, Microsoft
- Eetu Roponen, Sr Partner Development Manager, Microsoft

Watch the recording: Microsoft Finland – Software Developing Companies Monthly Community Series – Pilvipalvelut omilla ehdoilla – mitä Microsoftin Sovereign Cloud tarkoittaa ohjelmistotaloille? – Finland Cloud Champion

October 31, 09:00–09:30 – Growth and visibility for software companies: make the most of the ISV Success and Azure Marketplace Rewards programs

In this webinar we dig into Microsoft's key accelerator programs for software companies, which support growth, scalability, and international visibility. We cover how the ISV Success program provides technical and commercial support to software companies at different stages of development, and how Azure Marketplace serves as an effective sales channel for reaching new customers. We also present the Marketplace Rewards benefits, which support marketing, co-selling, and customer acquisition in the Microsoft ecosystem. The webinar offers:
- Concrete examples of the programs' benefits
- Practical tips for joining and leveraging the programs
- Insights into how software companies can align their strategy with the opportunities Microsoft offers

Speakers:
- Mikko Marttinen, Sr Partner Development Manager, Microsoft
- Eetu Roponen, Sr Partner Development Manager, Microsoft

Recording: Microsoft Finland – Software Developing Companies Monthly Community Series – Kasvua ja näkyvyyttä ohjelmistotaloille – hyödynnä ISV Success ja Azure Marketplace rewards -ohjelmia – Finland Cloud Champion

October 3, 09:00–09:30 – Autonomous solutions for software companies: Azure AI Foundry and the new possibilities of agent technologies

Agent technologies are transforming the way software companies can build intelligent, scalable solutions. In this webinar we explore how Azure AI Foundry gives developers and product owners the tools to build autonomous agents – enabling the automation of complex processes and the creation of new kinds of customer value. You will hear, among other things:
- How agent technologies are changing software development and business
- How Azure AI Foundry supports the design, development, and deployment of agents
- How software companies can use agents as a competitive advantage

Speakers:
- Juha Karvonen, Sr Partner Tech Strategist
- Mikko Marttinen, Sr Partner Development Manager, Microsoft
- Eetu Roponen, Sr Partner Development Manager, Microsoft

Watch the recording here: Microsoft Finland – Software Developing Companies Monthly Community Series – Autonomiset ratkaisut ohjelmistotaloille – Azure AI Foundry ja agenttiteknologioiden uudet mahdollisuudet – Finland Cloud Champion

September 5, 2025, 09:00–09:30 – Technology companies' and Microsoft's priorities for autumn 2025

Welcome back to Microsoft's webinar series for technology companies! Each month the series continues to explore how collaboration with Microsoft can accelerate growth and open up new opportunities for software companies at different stages – whether the company is a start-up, a scale-up, or a global operator. In every episode we share concrete examples, insights, and strategies that support business development and innovation in technology companies. In this episode we focus on the priorities for autumn 2025 and on the new opportunities that support software companies in planning, developing, and accelerating their own business.
We go through Microsoft's strategic priorities for the coming fiscal year – and, above all, how software companies can leverage them in their own business. The goal is to give listeners a clear understanding of how their product, service, or go-to-market strategy can be aligned with the development of the ecosystem, and how Microsoft can support that journey in concrete ways.

Speakers:
- Mikko Marttinen, Sr Partner Development Manager, Microsoft
- Eetu Roponen, Sr Partner Development Manager, Microsoft

Watch the recording here: Teknologiayritysten ja Microsoftin prioriteetit syksylle 2025. – Finland Cloud Champion
The collaboration reflects growing interest from major tech and investment firms in capitalizing on AI's potential. The fund will focus on improving AI infrastructure, making it more scalable and robust for future innovations. How will Microsoft's cloud services, such as Azure, contribute to supporting this AI infrastructure?691Views1like1Comment