Latest Discussions
Managing number of messages delivered by Service Bus Queue
Hi all, I would like to react to Blob Storage events through a webhook by using Event Grid. To manage delivery of all these events, I would like to introduce a Service Bus queue between the event publisher (Blob Storage) and the event handler (the webhook). The event handler can only process up to 80 messages in parallel. I am thinking about two options:
1. Would it be possible to notify the Service Bus queue once the process completes successfully on the event handler side, so that it delivers the next message? (See the sketch below this post.)
2. Deliver the queue in batches of up to 80 messages and wait for the average time the event handler needs to complete the process before delivering the next batch (it won't be ideal).
Would it be possible? Any help or another approach is very much appreciated 🙂 Thanks.
salvamo · Mar 23, 2024 · Copper Contributor · 414 Views · 0 likes · 1 Comment
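[Editor's sketch] Option 1 maps fairly directly onto the message-pump settings of the @azure/service-bus SDK: maxConcurrentCalls caps how many handler invocations run in parallel, and (with auto-complete, the default) the next message is only dispatched after a running handler settles its message. A minimal sketch, not the poster's code; callWebhook and the queue name are hypothetical:

    const { ServiceBusClient } = require("@azure/service-bus");

    const sbClient = new ServiceBusClient(process.env["ServiceBusConnectionString"]);
    const receiver = sbClient.createReceiver("events"); // illustrative queue name

    receiver.subscribe(
      {
        // With autoCompleteMessages (the default), the message is completed when
        // this handler resolves, which frees one of the 80 slots for the next message.
        processMessage: async (message) => {
          await callWebhook(message.body); // hypothetical helper that invokes the event handler
        },
        processError: async (args) => {
          console.error("Receive error:", args.error);
        },
      },
      { maxConcurrentCalls: 80 } // never more than 80 messages in flight at once
    );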
JMS 2.0 based apps on Azure Service Bus Standard Tier
Hello everyone, I have an app hosted on Azure which uses the Azure Service Bus Premium tier. I need to downgrade the subscription to the Standard tier to cut down on expenses, as cost is a major concern; the Standard tier also fits my application load. My concern is that my app is based on the JMS 2.0 API. Can I migrate from Premium to Standard? Is the JMS 2.0 API supported by the Standard tier? If not, please suggest any solutions or workarounds. Thanks
madanmr · Mar 13, 2024 · Copper Contributor · 241 Views · 0 likes · 0 Comments
Easiest way to handle ordering on Event Hub on Blob Containers
I am looking for an easy solution for ordering events. I have an Azure Blob Storage account where an external service pushes files into a container with different subfolders. I want to respond to the Create Blob event for those files with a handler on Event Hub, but the files should be processed in a specific order: files in folder-1 should be processed before folder-2, and then folder-3. The problem is that the external service uploads files in whatever order, and the files might be big. So the file in folder-3 might land first, then the file in folder-1 (which might be big, taking 40 seconds to upload), and then the file in folder-2. What is the easiest way to handle this, ideally without writing code? Is there a way in Event Hubs, or any other Azure service (again, without writing code if possible), to send events to the subscriber in a custom order?
gkarwchan · Nov 07, 2023 · Copper Contributor · 322 Views · 0 likes · 1 Comment
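[Editor's sketch] For context on why this usually ends up needing some orchestration: neither Event Grid nor Event Hubs will reorder events for you (Event Hubs only preserves arrival order within a partition). A low-code route is Logic Apps; if a little code is acceptable, the gist is to buffer the Blob Created events and release them in folder order once the prerequisites have arrived. A hypothetical sketch; processBlob and the event shape assume the Event Grid blob event schema:

    const order = ["folder-1", "folder-2", "folder-3"];
    const arrived = new Map(); // folder -> blob URL
    let next = 0; // index of the next folder we are allowed to process

    function onBlobCreated(event) {
      const folder = order.find((f) => event.subject.includes(`/${f}/`));
      if (!folder) return;
      arrived.set(folder, event.data.url);
      // Drain in strict folder order; stalls until the missing folder shows up.
      while (next < order.length && arrived.has(order[next])) {
        processBlob(arrived.get(order[next])); // hypothetical processing helper
        next += 1;
      }
    }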
Creating Dashboards using KQL in Grafana
I want to create a dashboard in Grafana using KQL to see the number of incoming and outgoing messages on each topic of Event Hubs. I tried using the KubePodInventory and ContainerLog tables in Log Analytics, but I could not find anything related to messages.
HimanshuShekhar · Sep 01, 2023 · Copper Contributor · 340 Views · 0 likes · 0 Comments
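[Editor's sketch] Assuming a diagnostic setting on the Event Hubs namespace exports metrics to the Log Analytics workspace, the counts land in the AzureMetrics table, not in the container tables (which is why KubePodInventory and ContainerLog show nothing message-related). A hedged starting point for a Grafana panel; note AzureMetrics reports at namespace-resource granularity, so a true per-event-hub split may need Grafana's Azure Monitor metrics data source instead:

    AzureMetrics
    | where ResourceProvider == "MICROSOFT.EVENTHUB"
    | where MetricName in ("IncomingMessages", "OutgoingMessages")
    | summarize Messages = sum(Total) by MetricName, Resource, bin(TimeGenerated, 5m)
    | order by TimeGenerated asc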
Service Bus: Entra ID User Registration with Service Bus and MS Graph Parts 1 & 2
Azure Service Bus is a powerful messaging solution with a lot of applications for us to deploy! Let's explore how we can integrate Service Bus with MS Graph and import our users into Entra ID with lightning-fast speed!

Part 1: The Terraform files

What we need:
- An Azure subscription
- VSCode or your favorite editor
- Terraform
- Docker
- A Let's Encrypt certificate

First things first, create a Service Principal so Terraform can authenticate to Azure:

    az ad sp create-for-rbac --name tform --role Contributor --scopes /subscriptions/00000000-0000-0000-0000-000000000000

We are going to need four files: providers.tf, variables.tf, terraform.tfvars, and main.tf.

providers.tf

    terraform {
      required_providers {
        azurerm = {
          source  = "hashicorp/azurerm"
          version = "3.48.0"
        }
      }
    }

    provider "azurerm" {
      features {
        key_vault {
          purge_soft_delete_on_destroy    = true
          recover_soft_deleted_key_vaults = true
        }
      }
      subscription_id = var.azure_subid
      tenant_id       = var.azure_tenantid
      client_id       = var.azure_clientid
      client_secret   = var.azure_spsecret
    }

variables.tf

    variable "azure_spsecret" {
      description = "SP secret"
      type        = string
      sensitive   = true
    }

    variable "azure_clientid" {
      description = "App ID"
      type        = string
    }

    variable "azure_subid" {
      description = "Subscription ID"
      type        = string
    }

    variable "azure_tenantid" {
      description = "Tenant ID"
      type        = string
    }

    variable "azure_user" {
      description = "Azure Portal user object ID"
      type        = string
    }

terraform.tfvars

    # Azure SP secret
    azure_spsecret = "xxxxx"
    # Azure client ID
    azure_clientid = "xxxxxx-xxxx"
    # Azure subscription ID
    azure_subid = "xxxxxxxxxx-xxx"
    # Azure tenant ID
    azure_tenantid = "xxxxx-xxxxx"
    # Azure user (object ID)
    azure_user = "xxx-xxx-xxx"

main.tf

    resource "azurerm_resource_group" "rgroup" {
      name     = "rg-app"
      location = "West Europe"
    }

    resource "random_string" "str-name" {
      length  = 5
      upper   = false
      numeric = false
      lower   = true
      special = false
    }

    resource "azurerm_storage_account" "storage" {
      name                     = "st${random_string.str-name.result}01"
      resource_group_name      = azurerm_resource_group.rgroup.name
      location                 = azurerm_resource_group.rgroup.location
      account_tier             = "Standard"
      account_replication_type = "LRS"
    }

    resource "azurerm_log_analytics_workspace" "logs" {
      name                = "Logskp"
      location            = azurerm_resource_group.rgroup.location
      resource_group_name = azurerm_resource_group.rgroup.name
      sku                 = "PerGB2018"
      retention_in_days   = 30
    }

    resource "azurerm_application_insights" "appinsights" {
      name                = "funcinsights"
      location            = azurerm_resource_group.rgroup.location
      resource_group_name = azurerm_resource_group.rgroup.name
      workspace_id        = azurerm_log_analytics_workspace.logs.id
      application_type    = "Node.JS"
    }

    output "instrumentation_key" {
      value     = azurerm_application_insights.appinsights.instrumentation_key
      sensitive = true
    }

    output "connection_string" {
      value     = azurerm_application_insights.appinsights.connection_string
      sensitive = true
    }

    resource "azurerm_service_plan" "appsrv" {
      name                = "aplan-${random_string.str-name.result}"
      location            = azurerm_resource_group.rgroup.location
      resource_group_name = azurerm_resource_group.rgroup.name
      os_type             = "Linux"
      sku_name            = "B1"
    }

    resource "azurerm_linux_function_app" "funcapp" {
      name                        = "fnc${random_string.str-name.result}"
      location                    = azurerm_resource_group.rgroup.location
      resource_group_name         = azurerm_resource_group.rgroup.name
      service_plan_id             = azurerm_service_plan.appsrv.id
      storage_account_name        = azurerm_storage_account.storage.name
      storage_account_access_key  = azurerm_storage_account.storage.primary_access_key
      functions_extension_version = "~4"

      app_settings = {
        "WEBSITE_RUN_FROM_PACKAGE"              = "1"
        "FUNCTIONS_WORKER_RUNTIME"              = "node"
        "APPLICATIONINSIGHTS_CONNECTION_STRING" = azurerm_application_insights.appinsights.connection_string
        "APPINSIGHTS_INSTRUMENTATIONKEY"        = azurerm_application_insights.appinsights.instrumentation_key
      }

      identity {
        type = "SystemAssigned"
      }

      site_config {
        application_stack {
          node_version = "18"
        }
        cors {
          allowed_origins = ["*"]
        }
      }
    }

    resource "azurerm_resource_group" "rgsbus" {
      name     = "rg-sbus"
      location = "West Europe"
    }

    resource "azurerm_servicebus_namespace" "sbus" {
      name                = "kpsbus01"
      location            = azurerm_resource_group.rgsbus.location
      resource_group_name = azurerm_resource_group.rgsbus.name
      sku                 = "Standard"
      identity {
        type = "SystemAssigned"
      }
    }

    resource "azurerm_servicebus_queue" "squeue" {
      name                = "sbusqueue"
      namespace_id        = azurerm_servicebus_namespace.sbus.id
      enable_partitioning = true
    }

    # Create a Key Vault
    data "azurerm_client_config" "current" {}

    resource "azurerm_key_vault" "kv1" {
      name                = "kvk${random_string.str-name.result}2"
      location            = azurerm_resource_group.rgsbus.location
      resource_group_name = azurerm_resource_group.rgsbus.name
      tenant_id           = data.azurerm_client_config.current.tenant_id
      sku_name            = "standard"
    }

    resource "azurerm_key_vault_access_policy" "kvpolicy" {
      key_vault_id = azurerm_key_vault.kv1.id
      tenant_id    = data.azurerm_client_config.current.tenant_id
      object_id    = azurerm_linux_function_app.funcapp.identity[0].principal_id
      secret_permissions = [
        "Get",
      ]
    }

    resource "azurerm_key_vault_access_policy" "worker_access_policy" {
      key_vault_id = azurerm_key_vault.kv1.id
      tenant_id    = data.azurerm_client_config.current.tenant_id
      object_id    = data.azurerm_client_config.current.object_id
      key_permissions = [
        "Create",
        "Get"
      ]
      secret_permissions = [
        "Set",
        "Get",
        "Delete",
        "Purge",
        "Recover"
      ]
    }

    resource "azurerm_key_vault_access_policy" "user_access_policy" {
      key_vault_id = azurerm_key_vault.kv1.id
      tenant_id    = data.azurerm_client_config.current.tenant_id
      object_id    = var.azure_user
      secret_permissions = [
        "List",
        "Set",
        "Get",
        "Purge",
        "Recover",
        "Delete",
        "Backup",
        "Restore"
      ]
    }

    resource "azurerm_container_app_environment" "cappenv" {
      name                       = "contEnvironment"
      location                   = azurerm_resource_group.rgsbus.location
      resource_group_name        = azurerm_resource_group.rgsbus.name
      log_analytics_workspace_id = azurerm_log_analytics_workspace.logs.id
    }

    resource "azurerm_container_app" "capp" {
      name                         = "c${random_string.str-name.result}001"
      container_app_environment_id = azurerm_container_app_environment.cappenv.id
      resource_group_name          = azurerm_resource_group.rgsbus.name
      revision_mode                = "Single"

      template {
        max_replicas = 5
        min_replicas = 1
        container {
          name   = "webreg01"
          image  = "docker.io/kpassadis/webreg01:v2"
          cpu    = 1.0
          memory = "2Gi"
        }
      }

      ingress {
        allow_insecure_connections = false
        external_enabled           = true
        target_port                = 80
        traffic_weight {
          percentage      = 100
          latest_revision = true
        }
      }
    }

At this point we have created two resource groups with all the required resources:
- Log Analytics workspace with Application Insights
- Function App with an App Service plan (Linux) and a storage account
- Service Bus queue
- Key Vault
- Container App running a Docker app (a simple HTML form that POSTs to the HTTP trigger)
Now, I will make a quick reference to the Docker application. All we need is a Dockerfile, and we can push the image to Docker Hub and later call it directly from Container Apps. So here are the HTML container app elements (index.html, style.css, Dockerfile):

index.html

    <!DOCTYPE html>
    <html>
    <head>
      <link rel="stylesheet" type="text/css" href="style.css">
      <style>
        #message {
          margin-top: 20px;
          padding: 10px;
          background-color: lightgray;
          border-radius: 5px;
          text-align: center;
          font-weight: bold;
          display: none;
        }
        /* added CSS */
        .center {
          position: absolute;
          top: 50%;
          left: 50%;
          transform: translate(-50%, -50%);
        }
      </style>
      <script>
        async function sendMessage(event) {
          event.preventDefault();
          const firstname = document.getElementById("firstname").value;
          const lastname = document.getElementById("lastname").value;
          const nickname = document.getElementById("nickname").value;
          const userData = { firstname: firstname, lastname: lastname, nickname: nickname };
          const response = await fetch("https://xxxxx.azurewebsites.net/api/xxxxx", {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify(userData)
          });
          const result = await response.json();
          document.getElementById('message').innerText = result.message;
        }
      </script>
    </head>
    <body>
      <div class="container">
        <form method="post" action="https://xxxxx.azurewebsites.net/api/submit-form" onsubmit="sendMessage(event)">
          <label for="firstname">First Name</label>
          <input type="text" id="firstname" name="firstname" required>
          <label for="lastname">Last Name</label>
          <input type="text" id="lastname" name="lastname" required>
          <label for="nickname">Nickname</label>
          <input type="text" id="nickname" name="nickname" required>
          <button type="submit">Submit</button>
        </form>
      </div>
      <div id="message"></div>
    </body>
    </html>

style.css

    body {
      font-family: Arial, sans-serif;
      background-color: #cce6ff;
      display: flex;
      justify-content: center;
      align-items: center;
      height: 100vh;
      margin: 0;
    }

    .container {
      background-color: #0073e6;
      padding: 2rem;
      border-radius: 5px;
      box-shadow: 0 0 10px rgba(0, 0, 0, 0.15);
      width: 400px;
    }

    form {
      display: flex;
      flex-direction: column;
    }

    label {
      font-weight: bold;
      margin-bottom: 0.5rem;
    }

    input {
      margin-bottom: 1rem;
      padding: 0.5rem;
      border: 1px solid #ccc;
      border-radius: 3px;
    }

    button {
      padding: 0.5rem 1rem;
      background-color: #4CAF50;
      color: white;
      border: none;
      border-radius: 3px;
      cursor: pointer;
    }

    button:hover {
      background-color: #45a049;
    }

And here is our simple Dockerfile:

    FROM nginx:stable-alpine
    COPY . /usr/share/nginx/html

Build and push the image:

    docker build -t myusername/myapp:v1 .
    docker login
    docker push myusername/myapp:v1

I won't dive deeper into Docker, but it is that simple! You can also validate with a local run:

    docker run --name test-container -p 8080:80 -d myusername/myapp:v1

All documentation is available at Docker Get Started. Now prepare a Let's Encrypt certificate, and let's go to Part 2!

Part 2: The Configuration

We have two resource groups with Log Analytics, Application Insights, Key Vault, Service Bus, and the Function App. It is time to deploy our Function App triggers and take the URL to add to our Docker image. From VSCode, make sure you have the latest version of azure-functions-core-tools.
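[Editor's note] If the Core Tools are missing or outdated, one supported way to install them is through npm; func --version confirms the result:

    npm install -g azure-functions-core-tools@4
    func --version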
Install the required packages:

    npm install @azure/service-bus
    npm install @microsoft/microsoft-graph-client @azure/msal-node

Create a new HTTP trigger with the following details:

index.js

    const { ServiceBusClient } = require("@azure/service-bus");

    module.exports = async function (context, req) {
        context.log('Sending message to Azure Service Bus');
        const userData = req.body;
        if (userData) {
            const connectionString = process.env["ServiceBusConnectionString"];
            const queueName = "sbusqueue";
            const sbClient = new ServiceBusClient(connectionString);
            const sender = sbClient.createSender(queueName);
            const message = {
                body: JSON.stringify(userData),
                contentType: "application/json"
            };
            await sender.sendMessages(message);
            await sender.close();
            await sbClient.close();
            context.res = {
                status: 200,
                body: { message: 'User registration submitted successfully!' }
            };
        } else {
            context.res = {
                status: 400,
                body: { message: 'User registration was NOT submitted successfully!' }
            };
        }
    };

and function.json for our bindings:

    {
      "bindings": [
        {
          "authLevel": "anonymous",
          "type": "httpTrigger",
          "direction": "in",
          "name": "req",
          "methods": [ "post", "get" ],
          "route": "submit-form"
        },
        {
          "type": "http",
          "direction": "out",
          "name": "$return"
        },
        {
          "type": "serviceBus",
          "direction": "out",
          "connection": "ServiceBusConnectionString",
          "name": "outputSbMsg",
          "queueName": "sbusqueue"
        }
      ]
    }

Set the Function App's system-assigned managed identity as a Service Bus Data Owner on the Service Bus resource. Then take a connection string from the namespace's Shared Access Policies (you can use the root policy or create a new one) and store it in a Function App configuration setting named ServiceBusConnectionString.

Now update the Docker container: put the trigger's URL into index.html (remember to save your files in VSCode!) and push the updated image:

    docker build -t myusername/myapp:v3 .
    docker push myusername/myapp:v3

Next we need to create the Function trigger that imports the user into Azure Active Directory with MS Graph. First of all, authentication! We will create a Service Principal with MS Graph API permissions that allow it to write to Azure AD. The procedure is simple: use the az ad sp create-for-rbac command, or add a new App Registration and create a secret. Add the MS Graph API permissions:
- Directory.ReadWrite.All (application)
- User.ReadWrite.All (application)

We will store the application (Service Principal) details in our Key Vault, so let's do this: create three new Key Vault secrets holding the application's client ID, the tenant ID, and the client secret. Keep note of the secret names, because we need them in our code. Observe that our Function App already has the Get-secrets access policy in place from the Terraform configuration.
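[Editor's note] For reference, the three secrets can also be created from the CLI instead of the portal. A sketch: the secret names below are the ones the trigger code in the next step reads, and <your_key_vault_name> is the vault from the Terraform run:

    az keyvault secret set --vault-name <your_key_vault_name> --name appRegistrationClientId --value "<application-client-id>"
    az keyvault secret set --vault-name <your_key_vault_name> --name appRegistrationTenantId --value "<tenant-id>"
    az keyvault secret set --vault-name <your_key_vault_name> --name appRegistrationClientSecret --value "<client-secret>"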
Install:

    npm install @azure/identity @azure/keyvault-secrets

Now create a new trigger for our Function App, a Service Bus trigger with the following code:

index.js

    const { ConfidentialClientApplication } = require('@azure/msal-node');
    const { Client } = require('@microsoft/microsoft-graph-client');
    const { DefaultAzureCredential } = require('@azure/identity');
    const { SecretClient } = require('@azure/keyvault-secrets');

    async function getKeyVaultSecret(keyVaultUrl, secretName) {
        const credential = new DefaultAzureCredential();
        const secretClient = new SecretClient(keyVaultUrl, credential);
        const secret = await secretClient.getSecret(secretName);
        return secret.value;
    }

    module.exports = async function (context, myQueueItem) {
        context.log('JavaScript ServiceBus queue trigger function processed message', myQueueItem);

        const userData = JSON.parse(myQueueItem);
        const firstName = userData.firstname;
        const lastName = userData.lastname;
        const nickname = userData.nickname;
        const domainSuffix = 'example.com'; // Replace 'example.com' with your desired domain suffix

        // Set up authentication. In case you don't want Key Vault, hardcode
        // the values instead (not recommended):
        // const clientId = 'xxxxxxxxxxx';
        // const clientSecret = 'xxxxxxxxxxxxxxx';
        // const tenantId = 'xxxxxxxxxxxxxxxx';

        // Replace with your Key Vault URL
        const keyVaultUrl = 'https://<your_key_vault_name>.vault.azure.net/';
        const clientId = await getKeyVaultSecret(keyVaultUrl, 'appRegistrationClientId');
        const clientSecret = await getKeyVaultSecret(keyVaultUrl, 'appRegistrationClientSecret');
        const tenantId = await getKeyVaultSecret(keyVaultUrl, 'appRegistrationTenantId');

        const config = {
            auth: {
                clientId: clientId,
                authority: `https://login.microsoftonline.com/${tenantId}`,
                clientSecret: clientSecret
            }
        };
        const app = new ConfidentialClientApplication(config);

        // Acquire token
        const tokenRequest = { scopes: ['https://graph.microsoft.com/.default'] };
        const authResult = await app.acquireTokenByClientCredential(tokenRequest);
        const accessToken = authResult.accessToken;

        // Set up Graph client
        const client = Client.init({
            authProvider: (done) => {
                done(null, accessToken);
            }
        });

        // Create a new user in Azure AD
        const newUser = {
            accountEnabled: true,
            displayName: `${firstName} ${lastName}`,
            mailNickname: nickname,
            userPrincipalName: `${nickname}@${domainSuffix}`,
            passwordProfile: {
                forceChangePasswordNextSignIn: true,
                password: 'mYComp@2022!@'
            }
        };

        const createdUser = await client.api('/users').post(newUser);
        context.log(`User created with ID: ${createdUser.id}`);
    };

and function.json:

    {
      "bindings": [
        {
          "name": "myQueueItem",
          "type": "serviceBusTrigger",
          "direction": "in",
          "queueName": "sbusqueue",
          "connection": "ServiceBusConnectionString"
        }
      ]
    }

The final touch is to add our own custom domain! From the Azure Portal, under Container Apps > Settings, add a custom domain. Enter the domain and you will be presented with the option to upload the Certbot certificate (or any PFX certificate for this domain). Proceed with the validation steps by adding the TXT and CNAME records, and that's it! Watch how fast the users are created in Entra ID (Azure AD)!
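[Editor's note] To verify the whole pipeline without the web form, you can also post straight to the HTTP trigger; a sketch, substituting your own Function App host name:

    curl -X POST "https://fncxxxxx.azurewebsites.net/api/submit-form" \
      -H "Content-Type: application/json" \
      -d '{"firstname":"Ada","lastname":"Lovelace","nickname":"ada"}'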
The solution is a sample and needs additional features, but it is a good example of Service Bus and of how Azure integration can help us create literally anything we want!

Final Thoughts

Integration is key for cloud services, and Azure is a perfect example of how integration can help us create unique solutions with ease. Azure Service Bus, along with Container Apps and Function Apps, helped us create a web app that can register users into Azure AD quickly and reliably.
KonstantinosPassadis · Aug 27, 2023 · Learn Expert · 749 Views · 0 likes · 0 Comments
How to add a complete Azure SQL Database while creating NEW Azure Cognitive Search Service Resource?
Hello all, I am trying to create an MIS (Management Information Systems) chatbot which will answer natural-language questions about business-driven KPIs. These KPIs are formed from aggregated/summary/grouped data to drive business-critical decision making, and it is expected that much of this KPI data, asked for in natural language (instead of developer-friendly, complicated SQL), will be derived by joining multiple tables. As per the present process, to create a semantic-search-capable chatbot using Azure OpenAI, I first need to create an Azure Cognitive Search service, and there, at the "Test Connection" button, I have to choose just ONE database object, be it a table, view, etc. Is there a way to choose all the database objects in the connected database? I am also thinking of using multiple custom views, each addressing a specific aggregation for specific KPI definitions. The point is, my custom join SQL will be saved as views, and I want the chatbot to deliver the answers from these views when someone asks the question in natural language in the Azure OpenAI chat box (as if someone had asked the connected database the said SQL query whose answers are saved as the view). Important: I don't want to create just one custom view to join tables, as that would NOT solve my MIS KPI use case. Please assist!
Arya_Dey · Aug 09, 2023 · Copper Contributor · 266 Views · 0 likes · 0 Comments
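[Editor's sketch] One way to approach "many views, one service": an Azure SQL data source for a Cognitive Search indexer targets exactly one table or view, so the multiple-views idea usually means scripting a data source + indexer pair per KPI view rather than clicking through the portal wizard once. A hedged sketch with the @azure/search-documents SDK; the view, index, and service names are illustrative:

    const { SearchIndexerClient, AzureKeyCredential } = require("@azure/search-documents");

    // Hypothetical: one data source + indexer pair per KPI view, since an
    // Azure SQL data source can target only a single table or view.
    const client = new SearchIndexerClient(
      "https://<your-search-service>.search.windows.net",
      new AzureKeyCredential(process.env["SEARCH_ADMIN_KEY"])
    );

    const kpiViews = ["vw_SalesByRegion", "vw_ChurnByMonth"]; // illustrative view names

    async function createIndexersForViews() {
      for (const view of kpiViews) {
        const slug = view.toLowerCase().replace(/_/g, "-"); // names allow lowercase letters, digits, dashes
        await client.createDataSourceConnection({
          name: `ds-${slug}`,
          type: "azuresql",
          connectionString: process.env["SQL_CONNECTION_STRING"],
          container: { name: `dbo.${view}` },
        });
        await client.createIndexer({
          name: `ixr-${slug}`,
          dataSourceName: `ds-${slug}`,
          targetIndexName: `idx-${slug}`, // each target index must already exist
        });
      }
    }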
Notification Hub design for a single application
I have an Azure Notification Hub in a namespace for sending notifications to a mobile app. If I go with the Basic tier initially and the active device count later increases, is it better to upgrade to a higher plan, or can we create another namespace for the same mobile app? If we create two namespaces for a single app, do we need to explicitly manage the sending of notifications, or will the notification hub handle it automatically? Given that some users would be in one namespace and other users in another namespace for the same mobile app, would this be a good approach?
Krishnas12 · Nov 04, 2022 · Copper Contributor · 649 Views · 0 likes · 1 Comment
Failed to remove group membership
I am using a group in AAD to assign licenses to my conference room mailbox account. For the last few days, when I try to remove the particular group from a room account, I get the error below. Please suggest.
Failed to remove group membership: Unable to complete due to service connection error.
Ashka123 · Apr 03, 2022 · Brass Contributor · 1.7K Views · 1 like · 1 Comment
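[Editor's sketch] If the portal keeps failing with that transient error, one possible workaround is to remove the membership directly through Microsoft Graph (this only applies to assigned groups, not dynamic ones). A sketch; getAccessToken is a hypothetical helper returning a token with GroupMember.ReadWrite.All permission:

    const { Client } = require("@microsoft/microsoft-graph-client");

    const client = Client.init({
      authProvider: (done) => done(null, getAccessToken()), // hypothetical token helper
    });

    async function removeMember(groupId, userId) {
      // DELETE /groups/{group-id}/members/{directory-object-id}/$ref
      await client.api(`/groups/${groupId}/members/${userId}/$ref`).delete();
    }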
How to publish user-defined topic to IoT hub?
I am new to the Azure ecosystem. I followed some examples and documents and am able to connect to the Azure IoT Hub and send data. The data is being sent on the devices/<deviceID>/messages/events topic (telemetry messages). From the same application, I am not able to publish messages on a custom (user-defined) topic. Can you please help me if I am missing any additional configuration or settings for Azure IoT?
Inderveer1060 · Feb 18, 2022 · Copper Contributor · 802 Views · 0 likes · 0 Comments
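[Editor's note] For background: IoT Hub is not a general-purpose MQTT broker, so devices can only publish to the fixed devices/{deviceId}/messages/events/ topic and arbitrary user-defined topics are rejected. The usual workaround is to append a URL-encoded property bag to that fixed topic and route on the properties. A hedged sketch with the mqtt package; the device, host name, and SAS details are illustrative:

    const mqtt = require("mqtt");

    const deviceId = "myDevice"; // illustrative
    const hub = "myhub.azure-devices.net"; // illustrative host name

    // IoT Hub's MQTT endpoint requires this username shape and a SAS token as
    // the password; see the IoT Hub MQTT support docs for the full details.
    const client = mqtt.connect(`mqtts://${hub}:8883`, {
      clientId: deviceId,
      username: `${hub}/${deviceId}/?api-version=2021-04-12`,
      password: "<SAS token>", // e.g. generated with az iot hub generate-sas-token
    });

    // Only the fixed telemetry topic is accepted; "custom topic" data travels
    // in the property bag appended to it and can drive message routing.
    const topic = `devices/${deviceId}/messages/events/messageType=sensorA`;
    client.on("connect", () => client.publish(topic, JSON.stringify({ temperature: 21.5 })));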
Best practices for Azure Event Hubs - multiple subscriptions
We have a customer who has 36 subscriptions, and only one is being sent to an on-prem SIEM (LogRhythm). The question was asked: do you need a separate Event Hub for each subscription, or what is the best method to get all subscriptions sent to LogRhythm? Cost is a major factor in our customer's decision. Thank you for this opportunity. Cheers, Sergio
snteran · Apr 21, 2021 · Copper Contributor · 3.8K Views · 0 likes · 2 Comments
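[Editor's sketch] For what it's worth: a diagnostic setting can point a resource in any subscription of the same tenant at a single Event Hub, so one namespace (or one per region, since regional resources generally need a namespace in their own region) is usually enough rather than one per subscription. Each subscription's resources just get a diagnostic setting targeting it; a hedged CLI sketch for one resource:

    az monitor diagnostic-settings create \
      --name to-logrhythm \
      --resource <resource-id-in-any-of-the-36-subscriptions> \
      --event-hub <event-hub-name> \
      --event-hub-rule <event-hub-authorization-rule-resource-id> \
      --logs '[{"categoryGroup":"allLogs","enabled":true}]'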