User Profile
LukeJMadden
Brass Contributor
Joined 3 years ago
Recent Discussions
Re: Selecting complex fields conditionally
Hello dipeshnepal,

To conditionally select only those collection fields that match a certain criteria, you can use the "filter" parameter in the search request to filter the documents based on your criteria, and then use the "select" parameter to specify the fields to return.

In your example, you are filtering the documents based on a Network ID of 5. To return only the Networks collection that matches this criteria, you can use the following "select" parameter:

"select": "First-name, Last Name, Networks/Network ID, Networks/Network Name"

This will return only the first name, last name, and the Networks collection fields that match the filter criteria, which is Network ID of 5.

I hope this helps! Let me know if you have any further questions.

Kind Regards,
Luke Madden
"Simplifying Tech, Empowering You."
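For reference, here is a minimal sketch of that filter/select combination using the azure-search-documents Python SDK. The endpoint, key and field names (FirstName, Networks/NetworkId and so on) are placeholders rather than your actual schema, since OData field names cannot contain spaces; substitute the real field names from your index.

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

# Placeholder endpoint, key, index and field names -- adjust to your own index schema.
client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="people-index",
    credential=AzureKeyCredential("<your-query-key>"),
)

# Filter documents whose Networks collection contains NetworkId 5,
# then select only the fields you want returned.
results = client.search(
    search_text="*",
    filter="Networks/any(n: n/NetworkId eq 5)",
    select=["FirstName", "LastName", "Networks/NetworkId", "Networks/NetworkName"],
)

for doc in results:
    print(doc["FirstName"], doc["LastName"], doc["Networks"])
```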
Re: Azure Synapse Serverless pool connection issues/ query timeout with DataLake in BR SOUTH

Hello ismaelhenzel, let's see if we can help you.

It sounds like you are experiencing issues with the connection between your serverless SQL built-in pool and your data lake in BR SOUTH. Specifically, you are encountering query timeouts and cancellations when attempting to query a file stored within your data lake. This issue only seems to occur when using a private connection, and you have noticed similar problems among other teams that use Synapse in BR SOUTH. One thing to note is that this is not a performance issue, as you have experienced it even with small tables using a select top 1 while no other queries were running.

Based on the information provided, it is difficult to determine the root cause of the issue. However, you may want to consider the following steps:

- Verify that your network connectivity is stable and not experiencing any issues that may affect the connection between your Synapse and data lake instances.
- Check if there are any known issues or outages with Synapse in your region. You can do this by checking the Azure Status page.
- Consider reaching out to Azure Support for further assistance with troubleshooting this issue.

I hope this helps!

Kind Regards,
Luke Madden
"Simplifying Tech, Empowering You."
Re: Azure Synapse - Serverless Sql Pool block querys when start to sync with hive metastore using delta

Hi ismaelhenzel,

It sounds like you are experiencing an issue with Azure Synapse Serverless SQL Pool query performance when syncing with the Hive metastore using Delta. Here are some suggestions that might help:

- Check the resource usage of your Synapse Serverless SQL Pool to ensure that it has enough resources to handle the load. You can scale up the pool if necessary.
- Check if there are any specific tables that are causing the issue. If so, try optimizing the schema or partitioning the table.
- Check if there are any slow-running queries that might be causing the issue, and identify and optimize them (see the sketch below).
- Ensure that your Delta Lake version is compatible with Synapse Serverless SQL Pool.
- Consider using a dedicated SQL Pool instead of a Serverless SQL Pool if you require consistent performance.
- Contact Microsoft Azure support for further assistance if the issue persists.

I hope this helps. Let me know if you have any other questions.

Kind regards,
Luke Madden
"Simplifying Tech, Empowering You."
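If you want to dig into slow or failed statements yourself, here is a rough sketch (not from the original thread) that pulls recent request history over the serverless endpoint with pyodbc. It assumes the sys.dm_exec_requests_history DMV is available on your serverless SQL pool; the server name and credentials are placeholders.

```python
import pyodbc

# Placeholder connection details for the serverless (on-demand) endpoint.
conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-workspace>-ondemand.sql.azuresynapse.net,1433;"
    "Database=master;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Pull the most recent requests so long-running or failed statements stand out.
    cursor.execute(
        "SELECT TOP 20 * FROM sys.dm_exec_requests_history ORDER BY start_time DESC"
    )
    for row in cursor.fetchall():
        print(row)
```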
Re: HP ALM to Azure DevOps Migration

Hello Surabhi,

There are multiple ways to migrate data from HP ALM to Azure DevOps, depending on the complexity and size of the data. Here are a few options:

- Azure DevOps Data Migration Tool: Microsoft provides a data migration tool for migrating data from various sources to Azure DevOps, including HP ALM. The tool supports migrating requirements, test cases, test runs, defects, and attachments. You can download the tool from the Azure DevOps Marketplace and follow the instructions provided in the documentation.
- Custom scripts: You can also create custom scripts to extract data from HP ALM and load it into Azure DevOps. This method requires a good understanding of both HP ALM and Azure DevOps data structures and APIs. You can use the HP ALM REST API or OTA API to extract data from HP ALM, and the Azure DevOps REST API or SDK to load data into Azure DevOps (see the sketch below).
- Third-party migration tools: There are multiple third-party tools available in the market that specialize in data migration from HP ALM to Azure DevOps. These tools provide a more comprehensive and automated approach to data migration and can handle complex data structures and relationships. Some popular tools are OpsHub, Tasktop, and Kovair.

Regardless of the migration method, it is important to perform thorough testing and validation of the migrated data to ensure that all data has been migrated correctly and without any data loss. It is also recommended to involve all stakeholders, including developers, testers, and project managers, in the migration process to ensure a smooth transition.

Hope this helps, mate!

Kind regards,
Luke Madden
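To illustrate the custom-scripts option, here is a hedged sketch that creates a single work item through the Azure DevOps REST API with the requests library. The organization, project, personal access token and field values are placeholders, and the extraction and mapping from HP ALM are left out of scope.

```python
import base64
import requests

# Placeholders -- substitute your own organization, project and PAT.
org, project, pat = "<your-org>", "<your-project>", "<personal-access-token>"
url = f"https://dev.azure.com/{org}/{project}/_apis/wit/workitems/$Bug?api-version=7.0"

# Azure DevOps expects a JSON Patch document when creating a work item.
payload = [
    {"op": "add", "path": "/fields/System.Title", "value": "Defect migrated from HP ALM"},
    {"op": "add", "path": "/fields/System.Description", "value": "Original defect text goes here."},
]

# PAT authentication uses Basic auth with an empty username.
auth = base64.b64encode(f":{pat}".encode()).decode()
response = requests.post(
    url,
    json=payload,
    headers={
        "Content-Type": "application/json-patch+json",
        "Authorization": f"Basic {auth}",
    },
)
response.raise_for_status()
print("Created work item", response.json()["id"])
```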
Re: MS Access Front End and Azure DB Backend

Hello Arun_Jagarlapudi, great question! Hopefully I can help.

Yes, it is possible to maintain a front-end MS Access application while keeping the database in Azure SQL. To point an MS Access database to an Azure SQL database, you will need to create a linked table in Access. To do this, you will need the appropriate permissions in the Azure SQL database. The steps to create a linked table in Access are as follows:

1- Open the Access database that you want to link to Azure SQL.
2- Click on the External Data tab and then click on the ODBC Database button.
3- In the Get External Data wizard, choose the option to link to the data source by creating a linked table.
4- Choose the option to connect to a data source by creating a new data source and click on the Next button.
5- Choose the driver that corresponds to your version of Azure SQL and click on the Next button.
6- Enter the server name and database name for your Azure SQL database, as well as your login credentials. You may also need to specify additional connection options depending on your specific configuration.
7- Click on the Test Connection button to ensure that the connection is working properly.
8- Once the connection is established, you will be presented with a list of tables that you can link to. Choose the tables that you want to link to and click on the Finish button.

Yes, it is also possible to change the connection setting in MS Access to point to another database instead of the MS Access inbuilt DB. To do this, you will need to modify the connection string for the linked table in Access. The steps are as follows:

1- Open the Access database that contains the linked table.
2- Click on the Linked Table Manager button in the External Data tab.
3- Select the linked table that you want to modify and click on the Edit button.
4- In the Connection tab, modify the connection string to point to the new database. The format of the connection string will depend on the specific database that you are linking to.
5- Click on the Test Connection button to ensure that the new connection string is working properly.
6- Once you have verified that the connection is working, click on the OK button to save your changes.

Note that if you have multiple linked tables in your Access database, you will need to repeat these steps for each table.

I have not used MS Access for a while, but this should work; let me know if you have any problems. There is also a quick connection-check sketch below.

Kind regards,
Luke Madden
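As a quick sanity check before you link the tables, here is a small Python sketch (using pyodbc) that verifies the same Azure SQL connection details you would put into the ODBC data source for Access. The server, database and credentials are placeholders.

```python
import pyodbc

# Placeholder connection string -- the same driver/server/database values go into
# the ODBC data source that Access will use for the linked tables.
conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-database>;"
    "Uid=<your-user>;Pwd=<your-password>;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # List a few tables that would be available to link in Access.
    cursor.execute("SELECT TOP 5 name FROM sys.tables")
    for (table_name,) in cursor.fetchall():
        print(table_name)
```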
Re: Fetch data into app(online site) from Office 365 client Active directories and vice versa.

Hello Naviwalia,

To fetch data from a client's Office 365 Active Directory into your site, you can use the Microsoft Graph API, which allows you to access data from multiple Microsoft cloud services, including Office 365. You can use Graph API to retrieve user information, groups, and other directory-related data. To use Graph API, you will need to register your application in Azure Active Directory (Azure AD) and obtain an access token. You can then use the access token to make API requests to Graph API.

Here are the high-level steps you can follow to fetch data from Office 365 Active Directory (a sketch of the first two steps follows below):

1- Register your application in Azure AD and obtain an access token.
2- Use Graph API to retrieve user information, groups, or other directory-related data.
3- Map the retrieved data to your application's data model.
4- Display the data on your application's pages.

To push data from your site to the client's Office 365 Active Directory, you can use the Microsoft Graph API or the Azure AD Graph API. These APIs provide methods to create, update, or delete directory data, such as users, groups, and applications. Here are the high-level steps you can follow to push data to Office 365 Active Directory:

1- Register your application in Azure AD and obtain an access token with the necessary permissions to create, update, or delete directory data.
2- Use Graph API or Azure AD Graph API to create, update, or delete directory data.
3- Map your application's data to the directory data model.
4- Push the data to the client's Office 365 Active Directory.

Overall, the approach will depend on the specific requirements of your application and the data you need to fetch or push. You can refer to the Microsoft Graph API documentation (https://learn.microsoft.com/en-us/graph/) or the Azure AD identity and access management overview (https://learn.microsoft.com/en-us/graph/azuread-identity-access-management-concept-overview) for more information on the available APIs and methods.

Hope that helps,
Luke
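For steps 1 and 2 of the fetch scenario, here is a minimal sketch using the MSAL and requests Python packages with the client-credentials flow. The tenant ID, client ID and secret are placeholders from your own app registration, and the app needs the appropriate Graph application permission (for example User.Read.All) granted.

```python
import msal
import requests

# Placeholders from your Azure AD app registration.
TENANT_ID, CLIENT_ID, CLIENT_SECRET = "<tenant-id>", "<client-id>", "<client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    client_credential=CLIENT_SECRET,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
)

# Client-credentials flow: acquire an app-only token for Microsoft Graph.
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/users?$select=displayName,userPrincipalName",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()
for user in resp.json()["value"]:
    print(user["displayName"], user["userPrincipalName"])
```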
Re: Upgrade from S2 to P2V2 - Downtime IP change

Hi Ben,

This is definitely not my subject of expertise, so take my comments with a tablespoon of salt.

When upgrading from S2 to P2V2 in Azure, the IP address of your instance will indeed change. This will not necessarily kill your websites immediately, but there may be some lead time before propagation occurs. It is difficult to predict exactly how long the IP address change will take to propagate, as this depends on a variety of factors such as DNS caching by ISPs. However, it is generally recommended to plan for some downtime during an IP address change.

You may want to consider setting a short TTL on your DNS records ahead of time, which can help to minimize the propagation delay. Additionally, it may be helpful to communicate the planned downtime to your users in advance, to minimize any potential impact.

I hope this helps! This might also be something worth talking to a Microsoft SME about.

Kind regards,
Luke Madden
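If it helps, here is a quick sketch (using the dnspython package) for checking what TTL your records currently carry before you plan the cutover; the hostname is a placeholder.

```python
import dns.resolver

# Placeholder hostname -- use the record that points at your App Service.
answer = dns.resolver.resolve("www.example.com", "A")
print("Current TTL (seconds):", answer.rrset.ttl)
for record in answer:
    print("A record:", record.address)
```

Note that a public resolver may report the remaining cached TTL rather than the value configured in your zone, so check against the authoritative name servers if the number looks low.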
Re: Azure landing Zone With Application GW and Azure Firewall

Hello Ganesh,

In your proposed landing zone, if the web server traffic needs to go to the internet, it would be routed through the Azure Firewall in the hub. You can use Azure Firewall's network rules to allow outbound traffic from the web servers in the spoke to the internet via the Azure Firewall in the hub. In addition, you can use application rules to allow or deny specific traffic based on FQDN, IP address range, port, and protocol.

As for best practices, it depends on your specific requirements and the services you are using. However, some general best practices for landing zones in Azure include:

- Using a hub-and-spoke topology to isolate critical services and resources from each other.
- Deploying infrastructure as code using tools such as Azure Resource Manager (ARM) templates or Terraform.
- Implementing security controls such as network security groups (NSGs), Azure Firewall, Azure DDoS Protection, and Azure Security Center.
- Using Azure Monitor and Azure Log Analytics to monitor and analyze resource health and performance.
- Implementing identity and access management (IAM) controls such as RBAC and Azure AD.

I hope this helps! Let me know if you have any further questions.

Cheers,
Luke Madden
Re: Azure Analysis Services Memory Management

Hi Kelly,

When you clear and rebuild an Azure Analysis Services instance, all of the data is loaded into memory. The amount of memory used by the instance will depend on the size of the data, the complexity of the model, and the number of queries being executed. Pausing the instance releases all of the memory it uses, so it is not surprising that restarting the instance after a clear and full rebuild results in lower memory usage.

However, it is not generally recommended to pause and restart an Azure Analysis Services instance, as doing so can impact user performance and data. Instead, you can consider using the scale-up or scale-out options in Azure Analysis Services to manage memory usage. Scale-up allows you to increase the resources allocated to the instance, while scale-out allows you to add more instances to the server. Additionally, you can monitor memory usage with Azure Monitor to determine if additional resources are needed.

I hope this helps! Let me know if you have any further questions.

Kind regards,
Luke Madden
Re: Hybrid IAM with O365 and AWS

Hey mate,

You have an interesting scenario. I am not an SME on the subject, but here are some suggestions:

- Yes, you can sync a single AD source through Azure AD Connect to multiple O365 Azure AD tenants. However, each user must have a unique User Principal Name (UPN) across all the tenants. You can achieve this by using a different domain suffix for each tenant in the UPN (the example addresses were removed for privacy reasons).
- If you create new IDs in the local AD and sync them, existing O365 IDs for users in Azure AD will remain as they are. The new IDs will be created in Azure AD as new user accounts. If you want to merge the existing O365 IDs with the new IDs, you can use a tool like Azure AD Connect to match them based on a common attribute, such as the email address.
- Yes, you can set up a federation between AWS SaaS services and local AD using ADFS. You can also use Azure AD as the identity provider for AWS SaaS applications. To achieve this, you need to configure the AWS SaaS applications to trust Azure AD as the identity provider and configure Azure AD to federate with your local AD using ADFS.

I hope this helps. Let me know if you have any further questions.

Kind regards,
Luke Madden
Re: azure ad B2C + multi tenancy

Hi wode,

I'll try my best to answer your questions regarding multi-tenancy using Azure AD B2C:

- As for the best strategy for multi-tenancy on the application level, it depends on your specific requirements. Using a separate subdomain per tenant is a common approach, but it is also possible to host on a single domain and use other mechanisms, such as custom claims, to separate tenants.
- Yes, when using Azure AD B2C in multi-tenant mode, any tenant can use your application. You will need to filter out tenants based on their IDs in your application code.
- Azure AD B2C supports federated authentication, which allows you to delegate authentication to other identity systems such as SAML or OpenID Connect.
- You can inspect the token and distinguish the tenant from the token sent in every request using the issuer claim. The issuer claim specifies the issuer of the token, which includes the tenant ID (see the sketch below).
- You can store accounts in Azure AD B2C locally and spread them out over multiple tenants inside B2C. As far as I know, there is no limitation on the number of tenants that can be configured in Azure AD B2C.

I hope this helps! Let me know if you have any further questions.

Cheers,
Luke
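On the issuer point, here is a rough sketch of pulling the issuer and tenant ID claims out of an incoming token with the PyJWT package. The token value is a placeholder, and in a real API you would validate the signature against the published signing keys rather than skipping verification as this sketch does.

```python
import jwt  # PyJWT

token = "<incoming bearer token>"  # placeholder -- the JWT sent with each request

# Decode the payload without verifying the signature, purely to inspect the claims.
claims = jwt.decode(token, options={"verify_signature": False})

issuer = claims.get("iss")     # the issuer URL, which embeds the tenant identifier
tenant_id = claims.get("tid")  # tenant ID claim, when present

print("Issuer:", issuer)
print("Tenant:", tenant_id)
```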
Re: How do I select or change time In MSBookings when I am not available?

Hey Damian,

Sorry to hear you are still having issues. One possibility is that there is some misconfiguration in your scheduling settings that is causing this. I would recommend reviewing your settings and double-checking that everything is set up correctly. Take your time and check each setting. If there is still no joy, log a ticket with Microsoft.

Kind regards,
Luke
Re: AKS error

Hey dannybhar, second question I have seen from you; having a rough time?

It looks like you are encountering an error pulling a Docker image in AKS. The error message indicates that the image pull failed due to authentication issues. Here are some steps you can try to resolve the issue:

- Verify that the image name and tag are correct: double-check that the image name and tag you are trying to pull are correct. Make sure that the image is available in your container registry and that you have permissions to access it.
- Check your image pull secret: ensure that you have created a valid image pull secret and that it is correctly specified in your YAML file. Verify that the secret has the correct credentials to access your container registry (see the sketch below).
- Check your cluster network configuration: ensure that your AKS cluster has the necessary network configuration to pull images from your container registry. You may need to configure a private endpoint or allow traffic to your container registry in your network security groups.
- Verify that your cluster is up to date: ensure that your AKS cluster is running the latest version of Kubernetes and AKS.
- Check your Azure subscription permissions: ensure that you have the necessary permissions in your Azure subscription to pull images from your container registry.

I hope this helps! Let me know if you have any further questions.

Kind regards,
Luke Madden
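For the image pull secret step, here is a hedged sketch using the official kubernetes Python client; the registry, credentials and namespace are placeholders (the equivalent kubectl command is kubectl create secret docker-registry).

```python
import base64
import json

from kubernetes import client, config

config.load_kube_config()  # uses your current kubeconfig context (e.g. from az aks get-credentials)

# Placeholder registry and credentials.
registry, username, password = "<myregistry>.azurecr.io", "<username>", "<password>"
docker_config = {
    "auths": {
        registry: {
            "username": username,
            "password": password,
            "auth": base64.b64encode(f"{username}:{password}".encode()).decode(),
        }
    }
}

# Build a kubernetes.io/dockerconfigjson secret the kubelet can use to pull images.
secret = client.V1Secret(
    metadata=client.V1ObjectMeta(name="acr-pull-secret", namespace="default"),
    type="kubernetes.io/dockerconfigjson",
    data={".dockerconfigjson": base64.b64encode(json.dumps(docker_config).encode()).decode()},
)

client.CoreV1Api().create_namespaced_secret(namespace="default", body=secret)
print("Created acr-pull-secret; reference it under imagePullSecrets in your pod spec.")
```

If the registry is an Azure Container Registry, attaching it to the cluster with az aks update --attach-acr is usually simpler than managing pull secrets by hand.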
Re: Geospatial application

Dear dannybhar,

Not an expert, but it was an interesting question, so I did some digging.

To store the contents of .gpkg files in a database, you can use PostGIS, an open-source spatial database extender for PostgreSQL. Azure provides a managed PostgreSQL service called Azure Database for PostgreSQL, which you can use to host your PostGIS database.

To migrate your data from .gpkg files to PostGIS, you can use a data integration tool like Azure Data Factory (ADF). ADF is a cloud-based data integration service that allows you to create data pipelines to move and transform data between various sources and destinations, including databases. To use ADF for this scenario, you would need to create a pipeline that reads the .gpkg files from their source location and then writes the data to your PostGIS database. You can use the Copy Data activity in ADF to copy the data from the .gpkg files to PostGIS.

Once your data is in PostGIS, you can build a web application to serve the data to your users. You can use Azure App Service to host your web application and Azure Maps to display your geospatial data.

It seems like a rather complicated process, but I hope this helps.

Kind regards,
Luke Madden
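As a lighter-weight alternative to the ADF pipeline for smaller datasets, here is a rough sketch that loads a GeoPackage straight into Azure Database for PostgreSQL with GeoPandas. The file name, table name and connection details are placeholders, and it assumes the PostGIS extension is enabled on the database (requires the geopandas, sqlalchemy and geoalchemy2 packages).

```python
import geopandas as gpd
from sqlalchemy import create_engine

# Placeholder connection string for Azure Database for PostgreSQL.
engine = create_engine(
    "postgresql+psycopg2://<user>:<password>@<server>.postgres.database.azure.com:5432/<database>"
)

# Read the GeoPackage and push its features into a PostGIS-enabled table.
gdf = gpd.read_file("parcels.gpkg")          # optionally pass layer="<layer name>"
gdf.to_postgis("parcels", engine, if_exists="replace")
print(f"Loaded {len(gdf)} features into the parcels table.")
```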
Re: Tibco Architecture Reference Model

Hello ld2022,

Migrating from Tibco to Azure can be a complex process, but fortunately there are resources available to help guide you through it. One place to start is the Azure Integration Services Migration Playbook, which provides guidance on how to plan and execute a migration from on-premises integration platforms like Tibco to Azure Integration Services.

In terms of Tibco alternative services in Azure, there are several options depending on your specific needs:

- Azure Logic Apps: a serverless platform for building and running workflows and integrating systems and services.
- Azure Service Bus: a cloud messaging service for connecting applications and services across different environments.
- Azure Event Grid: a fully managed event routing service that simplifies the development of event-driven applications.
- Azure API Management: a scalable, multi-cloud API management platform for securely exposing and managing APIs.

These are just a few examples of the Tibco alternative services available in Azure. The specific services you choose will depend on your organization's needs and requirements.

I hope this helps! Let me know if you have any further questions.

Kind regards,
Luke Madden
Re: API Push in Azure

Hey Id2022,

Is this what you are after? https://learn.microsoft.com/en-us/azure/architecture/example-scenario/apps/publish-internal-apis-externally

If not, search for a reference architecture here: https://learn.microsoft.com/en-us/azure/architecture/browse/?terms=API%20Push

Hope that helps; let me know if you cannot find what you are looking for.

Kind regards,
Luke Madden
Re: How to start Azure queue processing by multiple WebJob instances when time constant expired or queue

Hello Andrei_R,

To ensure safe processing of BatchQueue when you have multiple instances of your WebJob, you can use Azure Functions to schedule the processing. Here are the steps you can follow (a sketch follows below):

1- Create an Azure Function that is triggered by a timer. Set the interval of the timer to the value of the TimeInterval you mentioned.
2- In the Azure Function, check the length of the BatchQueue. If the length is greater than MaxLength, trigger the processing of the BatchQueue.
3- To ensure safe processing when you have multiple instances of your WebJob, use a distributed lock service like Azure Cache for Redis. Acquire a lock before processing the BatchQueue so that only one instance processes it at any given time.
4- After processing the BatchQueue, release the lock.

By using Azure Functions and a distributed lock service like Azure Cache for Redis, you can ensure safe processing of the BatchQueue when you have multiple instances of your WebJob.

I hope this helps! Let me know if you have any further questions.

Kind regards,
Luke Madden
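Here is a minimal sketch of that pattern: a timer-triggered Azure Function (Python), an Azure Storage queue standing in for BatchQueue, and Azure Cache for Redis as the distributed lock. The connection strings, queue name and MAX_LENGTH value are placeholders, and the TimeInterval is whatever CRON schedule you put on the timer trigger in function.json.

```python
import os

import azure.functions as func
import redis
from azure.storage.queue import QueueClient

MAX_LENGTH = 100  # placeholder for your MaxLength threshold


def main(mytimer: func.TimerRequest) -> None:
    queue = QueueClient.from_connection_string(
        os.environ["STORAGE_CONNECTION_STRING"], queue_name="batchqueue"
    )
    length = queue.get_queue_properties().approximate_message_count

    if length < MAX_LENGTH:
        return  # nothing to do yet; wait for the next timer tick

    cache = redis.Redis.from_url(os.environ["REDIS_CONNECTION_STRING"])
    lock = cache.lock("batchqueue-processing", timeout=300)  # auto-expires if we crash

    if not lock.acquire(blocking=False):
        return  # another instance is already processing the batch

    try:
        for message in queue.receive_messages(messages_per_page=32):
            process(message.content)        # your existing batch logic goes here
            queue.delete_message(message)
    finally:
        lock.release()


def process(payload: str) -> None:
    print("processing", payload)
```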
Re: Hub and Spoke Low level design (including Azure firewall, APP Gateway and Load balancer)

Hello Ganesh,

Microsoft already has a really good resource page and diagram that cover everything you are asking for. I would recommend starting here: https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/hub-spoke-network-topology and then replying with any follow-up questions.

I hope that helps!

Kind regards,
Luke Madden
Re: Trigger pipeline on Approval

Hello Venkata,

Yes, it is definitely possible to trigger a pipeline from a button click in Power BI or SharePoint using a Logic App. Here's a high-level overview of how you can achieve this:

- In Power BI or SharePoint, create a button and set up an action to be triggered when the button is clicked. For example, you can use Power Automate (previously known as Flow) to create a flow that runs when the button is clicked.
- In the Power Automate flow, use the "HTTP" action to call the Logic App's trigger URL and pass in the necessary parameters. You can use the "Response" action to check the result of the Logic App trigger and perform any additional actions if needed.
- In the Logic App, use the "HTTP Request" trigger to receive the HTTP request from Power Automate. You can then use the necessary actions and connectors to trigger the pipeline with the required parameters (see the sketch below).

I hope this helps! Let me know if you have any further questions.

Kind regards,
Luke Madden
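For the call into the Logic App, here is a tiny sketch of the HTTP request against the Logic App's "When a HTTP request is received" trigger, written in Python for illustration (Power Automate's HTTP action sends the same kind of request). The callback URL with its SAS signature and the payload fields are placeholders; copy the real URL from the saved Logic App trigger.

```python
import requests

# Placeholder callback URL -- copy the real one from the Logic App trigger after saving it.
LOGIC_APP_TRIGGER_URL = (
    "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>"
    "/triggers/manual/paths/invoke?api-version=2016-10-01&sig=<sas-signature>"
)

# Example parameters your pipeline might expect -- adjust to your own schema.
payload = {
    "approvedBy": "user@contoso.com",
    "pipelineName": "nightly-load",
}

response = requests.post(LOGIC_APP_TRIGGER_URL, json=payload, timeout=30)
response.raise_for_status()
print("Logic App accepted the request:", response.status_code)
```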
Recent Blog Articles
No content to show