Azure VMware Solution (AVS) offers a fully managed software-defined data center based on VMware vSphere technologies, delivered as an Azure PaaS service. Functionally, the service is equivalent to the on-premises VMware deployment you may have been using for years, with some specific "restrictions": because Microsoft provides a service level agreement, Microsoft is responsible for ensuring a robust and resilient platform in deployment and operation. Due to the PaaS nature of the service, there are therefore significant technical differences to properly consider.
As a consumer of Azure VMware Solution, you are granted limited administrative privileges, aligned with the PaaS nature of the service. You are granted access to the CloudAdmin role, which holds a subset of the full administrator role. As it is at times needed to allow for "privilege escalation", Microsoft has enabled you to do so using a feature in the Azure AVS portal blades called "Run-Commands". Run-Commands allow you to perform a selected number of high-privilege actions without having to submit a service request through Azure Support. More details on Run-Commands can be found on Microsoft Learn.
To integrate your Azure VMware Solution SDDC with a directory services provider, in our case Active Directory, we will use a Run-Command: New-LDAPSIdentitySource.
To make this article easier to follow, we use "understandable" values for the prerequisites and the parameters required to successfully implement the LDAPS identity integration. For this we have created a lab environment, shown in the picture below. Using this picture as a reference, you can implement LDAPS integration by simply replacing the prerequisite values and parameters (displayed in bold throughout this document) with the values applicable to your environment, comparing with our lab and translating this to what is applicable for your deployment.
Our lab consists of:
| Required resources | Name |
|---|---|
| One Azure VMware Solution Private Cloud called | avs-fta-gwc |
| One Active Directory forest and domain called | avsemea.com |
| Two Active Directory domain controllers called | avs-gwc-dc001 and avs-gwc-dc002 |
| One Active Directory hosted enterprise root certificate authority called | avsemea-avs-gwc-rootca |
| One Azure Virtual WAN called | avs-germanywestcentral-vwan1 |
| One jumpbox virtual machine used for management activities called | avs-gwc-jumpbox |
| One Azure Virtual Network called | avs-gwc-172_16_11_0-24 |
| This VNet holds three separate subnets called | SN-172_16_11_0-26-ADDS, AzureBastionSubnet and SN-172_16_11_128-26 |
All virtual machines used in the lab are joined to the above mentioned avsemea.com domain. Our lab environment also holds some additional resources that are not required for the LDAPS integration, but they will be used in additional articles on different Azure VMware related topics just like this one.
| Optional resources for future use | Name |
|---|---|
| Two NSX-T segments inside our AVS SDDC called | NSX-SN-192-168-200-0-24 and NSX-SN-192-168-201-0-24 |
| One virtual machine for forwarding additional metrics from our AVS SDDC to the Azure Metrics infrastructure called | avs-gwc-metrics001 |
| One Azure NetApp Files account that will be used as extensible storage for our AVS SDDC in a later article called | avs-gwc-anfaccount001 |
A graphical representation of our lab environment is shown below:
While guiding you through the process of gathering all required details and artifacts and using them to complete the LDAPS integration for AVS, we assume that all resources mentioned in the required resources table are available and that sufficient access permissions are in place.
Before we can configure integration with an external identity store (e.g. Active Directory Domain Services) we need to make sure that the AVS platform components have the ability to resolve customer DNS zones hosting the LDAPS domain records. This configuration must be made through the Azure Portal blades for Azure VMware Solution.
The first step in configuring DNS name resolution from the Azure VMware Solution networks (management and workload segments) is to add a DNS zone for Azure VMware Solution. Log in to the Azure Portal and select the Azure Active Directory tenant and Azure subscription where you have deployed your Azure VMware Solution Private Cloud.
As depicted in the image above:
The Azure VMware Solution DNS configuration pane opens on the section where "conditional DNS forwarder" zones are configured.
As shown in the image above:
After a few minutes, the DNS FQDN zone, avsemea.com, will be listed in the DNS blade for your Azure VMware Solution Private Cloud as shown below:
Now that we have created the DNS conditional forwarder zone, we need to attach it to the NSX-T DNS service running in Azure VMware Solution so that NSX-T actually uses it.
In the Azure VMware Solution DNS blade:
When the change to the NSX-T DNS service is applied, the avsemea.com DNS zone will be listed in the DNS service configuration:
After this step the configuration of the Azure VMware Solution DNS service is complete. The next part of this article describes the detailed steps for configuring LDAPS integration through the Azure Portal or, where possible, through automation.
The following sections will guide you through the required process step by step.
It is important to validate that the identity provider (in most cases Active Directory) is configured to support LDAPS based authentication requests before continuing with the LDAPS integration setup.
For each domain controller that is designated for use with Azure VMware Solution, check if the required certificate(s) with the “server authentication” intended purpose are present in the computer certificate store. The “Certificates” snap-in for the Microsoft Management Console (mmc.exe) offers a simple way to do this.
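This check can also be scripted. As a minimal sketch (run locally on each designated domain controller; it relies on the `EnhancedKeyUsageList` property that Windows PowerShell adds to certificate objects), list the computer certificates carrying the "Server Authentication" enhanced key usage:

```powershell
## List computer certificates with the "Server Authentication" EKU (OID 1.3.6.1.5.5.7.3.1)
Get-ChildItem -Path Cert:\LocalMachine\My |
    Where-Object { $_.EnhancedKeyUsageList.ObjectId -contains "1.3.6.1.5.5.7.3.1" } |
    Select-Object Subject, NotAfter, Thumbprint
```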
After locally or remotely opening the computer certificate store:
Available certificates for avs-gwc-dc001.avsemea.com:
Available certificates for avs-gwc-dc002.avsemea.com:
Next we need to verify whether the Active Directory domain controllers are configured to offer LDAPS services:
LDAPS listener for avs-gwc-dc001.avsemea.com:
LDAPS listener for avs-gwc-dc002.avsemea.com:
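The listener check can also be performed from any machine with network access to the domain controllers. As a sketch (assuming Test-NetConnection, available on Windows Server 2012 R2 and later), verify that TCP port 636 is reachable on each domain controller:

```powershell
## Confirm that the LDAPS port (TCP 636) answers on each designated domain controller
"avs-gwc-dc001.avsemea.com","avs-gwc-dc002.avsemea.com" | ForEach-Object {
    Test-NetConnection -ComputerName $_ -Port 636 |
        Select-Object ComputerName, RemotePort, TcpTestSucceeded
}
```

Note that a successful test (TcpTestSucceeded = True) only proves the port is open; the certificate checks described above remain necessary.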
As a next step, the certificates used by the domain controllers for LDAPS services need to be extracted from the domain controllers. This step is always performed by executing the script shown below. The reason for this is to ensure we extract the correct certificate, namely the one "attached" to the LDAPS service: when multiple certificates with the "Server Authentication" intended purpose are available, the domain controller selects one of them to be used for LDAPS.
With the following commands we will connect to the required domain controllers, in our scenario avs-gwc-dc001.avsemea.com and avs-gwc-dc002.avsemea.com, and then use OpenSSL to extract the required certificates. You will need to specify the path to your openssl.exe; the currently tested version is 3.0, installed with Chocolatey, and our install path is C:\Program Files\OpenSSL-Win64\bin\openssl.exe. You will also need to update the export path to suit your needs; we chose to use c:\certTemp.
Notes
$openSSLFilePath may need to be changed, was installed using chocolatey in our example.
$remoteComputers would need to be changed to suit your environment.
$exportFolder can be changed to something more suitable for your environment.
## Get certs
$openSSLFilePath = "C:\Program Files\OpenSSL-Win64\bin\openssl.exe"
$remoteComputers = "avs-gwc-dc001.avsemea.com","avs-gwc-dc002.avsemea.com"
$port = "636"
$exportFolder = "c:\certTemp\"
## Make sure the export folder exists
if (-not (Test-Path $exportFolder)) { New-Item -ItemType Directory -Path $exportFolder | Out-Null }
foreach ($computer in $remoteComputers)
{
    ## Connect to the LDAPS port and capture the certificate chain presented by the server
    $output = echo "1" | & $openSSLFilePath "s_client" "-connect" "$computer`:$port" "-showcerts" | Out-String
    ## Extract the first (leaf) certificate, i.e. the one bound to the LDAPS listener
    $certs = Select-String -InputObject $output -Pattern "(?s)(?<cert>-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----)" -AllMatches
    $cert = $certs.Matches[0]
    ## Write the PEM-encoded certificate to <hostname>.cer in the export folder
    $exportPath = $exportFolder + ($computer.Split(".")[0]) + ".cer"
    $cert.Value | Out-File $exportPath -Encoding ascii
}
The next step is to ensure that the certificate extraction was performed successfully. This will always be a manual step in the process.
As displayed in the image above:
For each of the certificates:
To complete the certificate verification process, for each certificate:
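If you prefer to script this verification, a minimal sketch (assuming the exported .cer files are Base64-encoded and that $exportFolder is still set from the extraction script) loads each file and prints its subject and expiry date:

```powershell
## Inspect each exported certificate: file name, subject and expiry date
Get-ChildItem -Path $exportFolder -Filter *.cer | ForEach-Object {
    $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($_.FullName)
    [PSCustomObject]@{ File = $_.Name; Subject = $cert.Subject; Expires = $cert.NotAfter }
}
```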
The following content is divided into two sub-sections. One section will describe how the required procedure is performed manually through the Azure Portal and the other section describes a way to perform the required steps through automation.
As part of this manual process a storage account will be created that is used to store the domain controller certificates for later use by the “New-LDAPSIdentitySource” run-command.
In the Azure Portal:
On the “basics” tab:
On the review screen:
With these commands, we will check whether the Azure module is installed, install it if missing, and then continue the script. We will create the required storage account, or use an existing storage account. The $storageAccountName and $resourceGroupLocation variables can be updated or replaced as needed. These scripts are designed to be run in sections, one after another, so that the variable names are correctly referenced.
Notes
$resourceGroupLocation will need to be updated to your desired location.
$storageRgName will need to be updated.
$storageAccountName will need to be updated.
## Do you have the Azure Module installed?
if (Get-Module -ListAvailable -Name Az.Storage)
{
    Write-Output "Module exists"
}
else
{
    Write-Output "Module does not exist"
    Write-Output "Installing module"
    Install-Module -Name Az.Storage -Scope CurrentUser -Force -AllowClobber
}
## create storage account
## Note: the resource group referenced below must already exist
$resourceGroupLocation = "germanywestcentral"
$storageRgName = "avs-$resourceGroupLocation-operational_rg"
## Storage account variables
$guid = New-Guid
$storageAccountName = "avsgwcsa"
$storageAccountNameSuffix = $guid.ToString().Split("-")[0]
$storageAccountName = (($storageAccountName.replace("-",""))+$storageAccountNameSuffix )
## Define tags to be used if needed
## tags can be modified to suit your needs, another example below.
#$tags = @{"Environment"="Development";"Owner"="FTA";"CostCenter"="123456"}
$tags = @{"deploymentMethod"="PowerShell"; "Technology"="AVS"}
## create storage account
$saCheck = Get-AzStorageAccount -ResourceGroupName $storageRgName -Name $storageAccountName -ErrorAction SilentlyContinue
if ($null -eq $saCheck)
{
New-AzStorageAccount -ResourceGroupName $storageRgName -Name $storageAccountName -Location $resourceGroupLocation -SkuName Standard_LRS -Kind StorageV2 -EnableHttpsTrafficOnly $true -Tags $tags
Write-Output "Storage account created: $storageAccountName"
} else {
write-output "Storage Account already exists"
}
The next step is to create a blob container to help structure/organize the resources required for the LDAPS identity integration. The following content is divided into two sub-sections. One section will describe how the required procedure is performed manually through the Azure Portal and the other section describes a way to perform the required steps through automation.
In the Azure Portal:
In the “New container” dialog:
With these commands, we will create the container into which the previously exported certificates are uploaded. This will be important when creating the SAS tokens for the AVS LDAPS Run-Command.
Notes
$containerName will need to be updated
## create container
$containerName = "ldaps-blog-post"
$storageContext = (Get-AzStorageAccount -ResourceGroupName $storageRgName -Name $storageAccountName).Context
$containerCheck = Get-AzStorageContainer -Name $containerName -Context $storageContext -ErrorAction SilentlyContinue
if ($null -eq $containerCheck)
{
    New-AzStorageContainer -Name $containerName -Context $storageContext
    Write-Output "Storage container created: $containerName"
} else {
    Write-Output "Container already exists"
}
The next step is to upload the domain controller certificates from the temporary folder where they were extracted into the newly created ldaps-blog-post container in the avsgwcsa14a2c2da storage account we created. The following content is also divided into two sub-sections. One section will describe how the required procedure is performed manually through the Azure Portal and the other section describes a way to perform the required steps through automation.
As the first step, open the ldaps-blog-post container in the storage account:
We will now upload the certificates into the container:
Navigate to the folder where the certificates have been extracted to (in our scenario c:\certTemp) and:
The “Open” screen will close and return to the Azure Portal “Upload blob” panel again, click “Upload” at the bottom of the screen:
The certificates will now be uploaded into the blob container. The process is complete when green checkmarks are shown for each certificate uploaded:
With these commands, we will upload the actual certificates into the previously created container. In this example we are using "ldaps-blog-post".
## upload certs to container
$certs = Get-ChildItem -Path $exportFolder -Filter *.cer
$storageContext = (Get-AzStorageAccount -Name $storageAccountName -ResourceGroupName $storageRgName).Context
foreach ($item in $certs)
{
    ## Use the local file name as the blob name
    Set-AzStorageBlobContent -File $item.FullName -Blob $item.Name -Container $containerName -Context $storageContext
}
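To confirm that the upload succeeded from PowerShell as well, you can list the blobs in the container. A minimal sketch, reusing $containerName and $storageContext from the snippets above:

```powershell
## List the uploaded certificate blobs and their sizes
Get-AzStorageBlob -Container $containerName -Context $storageContext |
    Select-Object Name, Length, LastModified
```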
As part of this step we will generate “shared access tokens” for all the certificates uploaded into the blob container so they can be accessed through the run-command for implementing the LDAPS integration for vCenter. One section describes how the required procedure is performed manually through the Azure Portal and the other section describes a way to perform the required steps through automation.
The manual deployment continues within the same blade in the Azure Portal where the previous step left off. For each of the uploaded certificates, generate a “SAS token”:
For each certificate separately:
In the following screen:
Be sure to temporarily copy the "Blob SAS URL" generated for each certificate into a text file, as the URLs will need to be concatenated into a single comma-separated string for use during the execution of the Run-Command, as explained in the step "Execute Run-Command".
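If you collected the URLs in PowerShell instead, the concatenation is a one-liner. A minimal sketch (the two $sasUrl... variables are placeholders for your copied "Blob SAS URL" values):

```powershell
## Join the individual Blob SAS URLs into the single comma-separated string
## expected by the New-LDAPSIdentitySource Run-Command
$sasUrlDc001 = "https://<storageaccount>.blob.core.windows.net/<container>/avs-gwc-dc001.cer?<sas-token>"
$sasUrlDc002 = "https://<storageaccount>.blob.core.windows.net/<container>/avs-gwc-dc002.cer?<sas-token>"
$sslCertificateSasUrl = @($sasUrlDc001, $sasUrlDc002) -join ","
```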
With these commands, we will generate the SAS tokens needed for the next steps. Please note down BOTH tokens. In this script, the tokens are valid for 24 hours; this can be modified to suit your needs.
Notes
$containerName will need to be updated
## create SAS tokens (reuses $containerName and $storageContext from the previous steps)
$blobs = Get-AzStorageBlob -Container $containerName -Context $storageContext | Where-Object {$_.Name -match "\.cer$"}
foreach ($blob in $blobs)
{
$StartTime = Get-Date
$EndTime = $startTime.AddHours(24.0)
$sasToken = New-AzStorageBlobSASToken -Container $containerName -Blob $blob.name -Permission rwd -StartTime $StartTime -ExpiryTime $EndTime -Context $storageContext -FullUri
#$sasToken
write-host "SASToken created: $sasToken"
}
These steps are, for now, run manually from the Azure Portal. Navigate to the Azure Portal and ensure you are on the AVS Private Cloud blade; under Operations, select "Run command", then select "New-LDAPSIdentitySource". An automated way of executing the Run-Command is in the making; please check back soon, as this article will be updated as soon as it is available.
Populate the information as needed.
The SSLCertificateSasUrl is a single string consisting of the SAS tokens separated by a ",", pasted as one long string. For example:
https://avsgwcsa14a2c2da.blob.core.windows.net/ldaps-blog-post/avs-gwc-dc001.cer?sv=2021-10-04&st=2023-01-12T13%3A46%3A45Z&se=2023-01-13T13%3A46%3A45Z&sr=b&sp=rwd&[Removed],https://avsgwcsa14a2c2da.blob.core.windows.net/ldaps-blog-post/avs-gwc-dc002.cer?sv=2021-10-04&st=2023-01-12T13%3A46%3A45Z&se=2023-01-13T13%3A46%3A45Z&sr=b&sp=rwd&[Removed]
The other values would need to be updated as per your environment. The BaseDNGroups and BaseDNUsers parameters are the Distinguished Names for the groups and users. These can be found using the ADSI Edit tool by navigating to the required groups and users within Active Directory and noting down the values. More information about Distinguished Names is available in the Microsoft documentation.
For BaseDNGroups and BaseDNUsers, watch the values used, as both should be under the same tree. In this example, "OU=Corp,DC=avsemea,DC=com".
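As an alternative to ADSI Edit, the Distinguished Names can be retrieved with the ActiveDirectory PowerShell module. A minimal sketch (the OU name "Corp" is our lab value; replace it with yours):

```powershell
## Requires the ActiveDirectory module (RSAT) on a domain-joined machine
Import-Module ActiveDirectory

## Distinguished Name of the domain itself
(Get-ADDomain).DistinguishedName

## Distinguished Name of the OU that holds the users and groups
Get-ADOrganizationalUnit -Filter 'Name -eq "Corp"' |
    Select-Object -ExpandProperty DistinguishedName
```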
Once all information is entered, run the command. You can check the status of the Run command within the portal, on the AVS Private Cloud blade.
You can now check the status of the job; wait for all tasks to complete and confirm that your selected group was added to CloudAdmins. Once you have clicked the Execution name link, you can also check the output and ensure that the correct name was added.
You can now log on with LDAP based credentials. In the AVS Private Cloud blade of the Azure Portal, navigate to "VMware credentials" and use the vCenter Server "Web client URL" link found there.
Navigate to the URL with your browser of choice and use your directory services based credentials to ensure that login and authentication are working as expected.
Logon to Azure VMware Solution based components is now possible and should be working as expected.
All the PowerShell code can be found here - LDAPS code snippets