Automating Active Directory Domain Join in Azure
The journey to Azure is an exciting milestone for any organization, and our customer is no exception. With Microsoft assisting in migrating all application servers to the Azure IaaS platform, the goal was clear: make the migration seamless, error-free, efficient, and fast. While the customer had already laid some groundwork with Bicep scripts, we took it a step further—refactoring and enhancing these scripts to not only streamline the current process but also create a robust, scalable framework for their future in Azure. In this blog, we’ll dive into one of the critical pieces of this automation puzzle: Active Directory Domain Join. We'll explore how we crafted a PowerShell script to automate this essential step, ensuring that every migrated server is seamlessly integrated into the Azure environment. Let’s get started!

Step 1: List all the tasks or functionalities we want the AD domain join process in this script to achieve:

1. Verify Local Administrative Rights: Ensure the current user has the local admin rights required for installation and configuration.
2. Check for the Active Directory PowerShell Module: Confirm whether the module is already installed. If not, install the module.
3. Check Domain Join Status: Determine the current domain join status of the server.
4. Validate Active Directory Ports Availability: Ensure the necessary AD ports are open and accessible.
5. Verify Domain Controller (DC) Availability: Confirm the availability of a domain controller.
6. Test Network Connectivity: Check connectivity between the server and the domain controller.
7. Retrieve Domain Admin Credentials: Securely prompt for and retrieve credentials for a domain administrator account.
8. Perform AD Join: Execute the Active Directory domain join operation.
9. Create Log Files: Capture progress and errors in detailed log files for troubleshooting.
10. Update Event Logs: Record key milestones in the Windows Event Log for auditing and monitoring.
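Taken together, these tasks map naturally onto the functions we build in the rest of this post. As a rough sketch of what the finished script's entry point could look like (the function names match those defined later; the exact parameters shown here are illustrative, not the final implementation):

```powershell
# High-level flow of the domain-join script (illustrative sketch only;
# each function is built out step by step in the sections that follow).
Check-AdminRights                     # 1. verify local admin rights
AD-RSAT-Module                        # 2. ensure the AD PowerShell module is present
VM-Checks                             # 3-4. domain-join status + AD port availability
DC-Discovery                          # 5. locate an available domain controller
Check-DC-Connectivity                 # 6. network and DNS checks against the DC
$Creds = Get-ServiceAccount-Creds -KeyVaultName $KeyVaultName `
    -SrvUsernameSecretName $SrvUsernameSecretName `
    -SrvPasswordSecretName $SrvPasswordSecretName   # 7. credentials from Key Vault
Join-Domain -DomainName $DomainName -Creds $Creds `
    -DomainController $DomainController             # 8. join and reboot
```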
Step 2: In PowerShell scripting, functions play a crucial role in creating efficient, modular, and reusable code. By making scripts flexible and customizable, functions help streamline processes within a global scope. To simplify the AD domain-join process, I grouped related tasks into functions that achieve specific functionalities. For instance, tasks like checking the server's domain join status (point 3) and validating AD ports (point 4) can be combined into a single function, VM-Checks, as they both focus on verifying the local server's readiness. Similarly, we can define other functions such as AD-RSAT-Module, DC-Discovery, Check-DC-Connectivity, and Application-Log.

For better organization, we’ll divide all functions into two categories:

- Operation Functions: functions responsible for executing the domain join process.
- Logging Functions: functions dedicated to robust logging for progress tracking and error handling.

Let’s start by building the operation functions.

Step 1: Define the variables that we will be using in this script, such as $DomainName, $SrvUsernameSecretName, $SrvPasswordSecretName, $Creds, and so on.

Step 2: We need a function to validate that the current user has local administrative rights, ensuring the script can perform privileged operations seamlessly.

```powershell
function Check-AdminRights {
    # Check if the current user is a member of the local Administrators group
    $isAdmin = [Security.Principal.WindowsPrincipal][Security.Principal.WindowsIdentity]::GetCurrent()
    $isAdminRole = [Security.Principal.WindowsBuiltInRole]::Administrator
    if ($isAdmin.IsInRole($isAdminRole)) {
        Add-Content $progressLogFile "The current user has administrative privileges."
        return $true
    } else {
        $errorMessage = "Exiting script due to lack of administrative privileges."
        Add-Content $progressLogFile $errorMessage
        Write-ErrorLog $errorMessage
        Log-Failure -functionName "Check-AdminRights" -message $errorMessage
        exit
    }
}
```

Step 3: Validate the status of the Active Directory module.
If it's not already installed, install it using the below logic:

```powershell
if (Get-Module -ListAvailable -Name ActiveDirectory) {
    Add-Content $progressLogFile "Active Directory Module is already installed."
    Log-Success -functionName "InstallADModule" -message "Active Directory Module is already installed."
} else {
    Add-Content $progressLogFile "Active Directory Module not found. Initializing installation."
    Install-WindowsFeature RSAT-AD-PowerShell -ErrorAction Stop
    Import-Module ActiveDirectory -ErrorAction Stop
    Add-Content $progressLogFile "Active Directory Module imported successfully."
    Log-Success -functionName "InstallADModule" -message "Active Directory Module imported successfully."
}
```

Step 4: Next, we need to perform multiple checks on the local server; if desired, these can be grouped into a single function.

Check the current domain-join status. If the server is already joined to a domain, there's no need to join again, so use the below logic to exit the script:

```powershell
$computerSystem = Get-WmiObject Win32_ComputerSystem
if ($computerSystem.PartOfDomain) {
    Add-Content $progressLogFile "This machine is already joined to: $($computerSystem.Domain)."
    Log-Success -functionName "VM-Checks" -message "Machine is already joined to: $($computerSystem.Domain)."
    exit 0
} else {
    Add-Content $progressLogFile "This machine is part of the workgroup: $($computerSystem.Workgroup)."
}
```

Check Active Directory ports availability. Define a parameter listing all the ports that need to be available for the domain join:

```powershell
param (
    $ports = @(
        @{Port = 88;  Protocol = "TCP"},
        @{Port = 389; Protocol = "TCP"},
        @{Port = 445; Protocol = "TCP"}
    )
)
```

Once you have the parameters defined, check the status of each port using the below sample code.
```powershell
foreach ($port in $ports) {
    try {
        $checkPort = Test-NetConnection -ComputerName $DomainController -Port $port.Port
        if ($checkPort.TcpTestSucceeded) {
            Add-Content $progressLogFile "Port $($port.Port) ($($port.Protocol)) is open."
        } else {
            throw "Port $($port.Port) ($($port.Protocol)) is closed."
        }
    } catch {
        $errorMessage = "$($_.Exception.Message) Please check firewall settings."
        Write-ErrorLog $errorMessage
        Log-Failure -functionName "VM-Checks" -message $errorMessage
        exit
    }
}
```

Step 5: Now we need to find an available domain controller in the domain to process the domain join request.

```powershell
try {
    $domainController = (Get-ADDomainController -DomainName $DomainName -Discover -ErrorAction Stop).HostName
    Add-Content $progressLogFile "Discovered domain controller: $domainController"
    Log-Success -functionName "Dc-Discovery" -message "Discovered domain controller $domainController."
} catch {
    $errorMessage = "Failed to discover domain controller for $DomainName."
    Write-ErrorLog $errorMessage
    Log-Failure -functionName "Dc-Discovery" -message $errorMessage
    exit
}
```

Step 6: We need to perform connectivity and name resolution checks between the local server and the previously identified domain controller.

For the network connectivity check, you can use this logic:

```powershell
if (Test-Connection -ComputerName $DomainController -Count 2 -Quiet) {
    Write-Host "Domain Controller $DomainController is reachable." -ForegroundColor Green
    Add-Content $progressLogFile "Domain Controller $DomainController is reachable."
} else {
    $errorMessage = "Domain Controller $DomainController is not reachable."
    Write-ErrorLog $errorMessage
    exit
}
```

For the DNS check, you can use the below logic:

```powershell
try {
    Resolve-DnsName -Name $DomainController -ErrorAction Stop
    Write-Host "DNS resolution for $DomainController successful." -ForegroundColor Green
    Add-Content $progressLogFile "DNS resolution for $DomainController successful."
} catch {
    $errorMessage = "DNS resolution for $DomainController failed."
    Write-Host $errorMessage -ForegroundColor Red
    Write-ErrorLog $errorMessage
    Log-Failure -functionName "Dc-ConnectivityCheck" -message $errorMessage
    exit
}
```

To fully automate the domain-join process, it’s essential to retrieve and pass service account credentials within the script without any manual intervention. However, this comes with a critical responsibility—ensuring the security of the service account, as it holds privileged rights. Any compromise here could have serious repercussions for the entire environment. To address this, we leverage Azure Key Vault for secure storage and retrieval of credentials. By using Key Vault, we ensure that sensitive information remains protected while enabling seamless automation.

P.S.: In this blog, we’ll focus on utilizing Azure Key Vault for this purpose. In the next post, we’ll explore how to retrieve domain credentials from the CyberArk Password Vault with the same level of security and automation. Stay tuned!

Step 7: We need to declare a variable providing the name of the key vault where the service account credentials are stored. This should be done in Step 1:

```powershell
$KeyVaultName = "MTest-KV"
```

The below code ensures that the Azure Key Vault PowerShell module is installed on the local server, and installs it if not present:

```powershell
# Check if the Az.KeyVault module is installed
if (-not (Get-Module -ListAvailable -Name Az.KeyVault)) {
    Add-Content $progressLogFile "Az.KeyVault module not found. Installing..."
    # Install the Az.KeyVault module if not found
    Install-Module -Name Az.KeyVault -Force -AllowClobber -Scope CurrentUser
} else {
    Add-Content $progressLogFile "Az.KeyVault module is already installed."
}
```

Now we'll create a function to retrieve the service account credentials from Azure Key Vault, assuming the logged-in user already has the necessary permissions to access the secrets stored in the Key Vault.
```powershell
function Get-ServiceAccount-Creds {
    param (
        [string]$KeyVaultName,
        [string]$SrvUsernameSecretName,
        [string]$SrvPasswordSecretName
    )
    Add-Content $progressLogFile "Initiating retrieval of credentials from vault."
    try {
        Add-Content $progressLogFile "Retrieving service account credentials from Azure Key Vault."
        # Authenticate to Azure Key Vault using the VM's managed identity
        Connect-AzAccount -Identity
        # Retrieve the service account's username and password from Azure Key Vault
        $SrvUsername = Get-AzKeyVaultSecret -VaultName $KeyVaultName -Name $SrvUsernameSecretName -AsPlainText
        $SecurePassword = (Get-AzKeyVaultSecret -VaultName $KeyVaultName -Name $SrvPasswordSecretName).SecretValue
        # Create a PSCredential object (the password is already a SecureString)
        $Creds = New-Object System.Management.Automation.PSCredential($SrvUsername, $SecurePassword)
        Add-Content $progressLogFile "Successfully retrieved service account credentials."
        Log-Success -functionName "AD-DomainJoin" -message "Successfully retrieved credentials."
        return $Creds
    } catch {
        $errorMessage = "Error retrieving credentials from Azure Key Vault: $($_.Exception.Message)"
        Add-Content $errorLogFile $errorMessage
        Log-Failure -functionName "AD-DomainJoin" -message $errorMessage
        exit
    }
}
```

Step 8: We will now use the retrieved service account credentials to send a domain join request to the identified domain controller.

```powershell
function Join-Domain {
    param (
        [string]$DomainName,
        [PSCredential]$Creds,
        [string]$DomainController
    )
    try {
        Add-Content $progressLogFile "Joining machine to domain: $DomainName via domain controller: $DomainController."
        # Perform the domain join, specifying the domain controller
        Add-Computer -DomainName $DomainName -Credential $Creds -Server $DomainController -ErrorAction Stop
        Add-Content $progressLogFile "Successfully joined the machine to the domain via domain controller: $DomainController."
        Log-Success -functionName "AD-DomainJoin" -message "$ComputerName successfully joined $DomainName."
        # Restart last, so the success messages above are written before the reboot
        Restart-Computer -Force -ErrorAction Stop
    } catch {
        $errorMessage = "Error joining machine to domain via domain controller ${DomainController}: $($_.Exception.Message)"
        Write-ErrorLog $errorMessage
        Log-Failure -functionName "AD-DomainJoin" -message $errorMessage
        Add-Content $progressLogFile "Domain join to $DomainName for $ComputerName failed. Check error log."
        exit
    }
}
```

Now that we've done all the heavy lifting with operational functions, let's talk about logging functions. During technical activities, especially complex ones, real-time progress monitoring and quick issue identification are essential. Robust logging improves visibility, simplifies troubleshooting, and ensures efficiency when something goes wrong. To achieve this, we’ll implement two types of logs: a detailed progress log to track each step and an error log to capture issues separately. This approach provides a clear audit trail and makes post-execution analysis much easier. Let's see how we can implement this.

Step 1: Create the log files, including the current timestamp in the variable declaration holding the log file path:

```powershell
# Global Log Files
$progressLogFile = "C:\Logs\ProgressLog" + (Get-Date -Format yyyy-MM-dd_HH-mm) + ".log"
$errorLogFile = "C:\Logs\ErrorLog" + (Get-Date -Format yyyy-MM-dd_HH-mm) + ".log"
```

Note: Including the timestamp in the file name allows us to capture the logs in separate files in case of multiple attempts. If you do not want multiple files and would rather overwrite the existing file, remove `+ (Get-Date -Format yyyy-MM-dd_HH-mm) +` and a single file named ProgressLog.log will be created.

Step 2: How to write events in the log files. To capture the occurrence of any event in the log file while building the PowerShell script, you can use the following code. For capturing progress in the ProgressLog.log file, use:

```powershell
Add-Content $progressLogFile "This machine is part of the workgroup: $($computerSystem.Workgroup)."
```
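If you find yourself repeating that Add-Content pattern, you could also wrap it in a small helper so every progress line gets a timestamp automatically. This is an optional sketch (the Write-ProgressLog name is my own, not part of the original script), mirroring the error logger built next:

```powershell
# Optional helper (hypothetical name): timestamped progress logging,
# mirroring the Add-Content calls used throughout the script.
function Write-ProgressLog {
    param (
        [string]$message
    )
    Add-Content $progressLogFile "$message at $(Get-Date -Format 'HH:mm, dd-MMM-yyyy')."
}

# Usage:
Write-ProgressLog "This machine is part of the workgroup: $($computerSystem.Workgroup)."
```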
For capturing error occurrences in the ErrorLog.log file, we need to create a function:

```powershell
# Function to Write Error Log
function Write-ErrorLog {
    param (
        [string]$message
    )
    Add-Content $errorLogFile "$message at $(Get-Date -Format 'HH:mm, dd-MMM-yyyy')."
}
```

We will call this function to capture a failure occurrence in the log file:

```powershell
$errorMessage = "Error while checking the domain: $($_.Exception.Message)"
Write-ErrorLog $errorMessage
```

Step 3: As we also want to capture the milestones locally in the server's Application event log, we create another function:

```powershell
# Function to Write to the Application Event Log
function Write-ApplicationLog {
    param (
        [string]$functionName,
        [string]$message,
        [int]$eventID,
        [string]$entryType
    )
    # Ensure the event source exists
    if (-not [System.Diagnostics.EventLog]::SourceExists("BuildDomainJoin")) {
        New-EventLog -LogName Application -Source "BuildDomainJoin" -ErrorAction Stop
    }
    $formattedMessage = "$functionName : $message at $(Get-Date -Format 'HH:mm, dd-MMM-yyyy')."
    Write-EventLog -LogName Application -Source "BuildDomainJoin" -EventID $eventID -EntryType $entryType -Message $formattedMessage
}
```

To capture the success and failure events in the Application event log, we can create separate functions for each case. These functions can be called from other functions to capture the results.

Step 4: Function for success events. We will use event ID 3011 to capture successes, via a separate function. You can use any event ID of your choice, but do your due diligence to ensure it does not conflict with any existing event IDs.

```powershell
# Function to Log Success
function Log-Success {
    param (
        [string]$functionName,
        [string]$message
    )
    Write-ApplicationLog -functionName $functionName -message "Success: $message" -eventID 3011 -entryType "Information"
}
```

Step 5: To capture failure events, we’ll create a separate function that uses event ID 3010. Again, ensure the chosen event ID does not conflict with any existing event IDs.

```powershell
# Function to Log Failure
function Log-Failure {
    param (
        [string]$functionName,
        [string]$message
    )
    Write-ApplicationLog -functionName $functionName -message "Failed: $message" -eventID 3010 -entryType "Error"
}
```

Step 6: How to call and use the Log-Success function in the script. On successful completion of any task, call the function to write a success event to the Application log. For example, I used the below code in the "AD-RSAT-Module" function to report the successful completion of the module installation:

```powershell
Log-Success -functionName "AD-RSAT-Module" -message "RSAT-AD-PowerShell feature and Active Directory Module imported successfully."
```

Step 7: How to call and use the Log-Failure function in the script. On a failure in any task, call the function to write a failure event to the Application log. For example, I used the below code in the "AD-RSAT-Module" function to report the failure along with the error it failed with; it also stops further processing of the PowerShell script:

```powershell
$errorMessage = "Error during RSAT-AD-PowerShell feature installation or AD module import: $($_.Exception.Message)"
Log-Failure -functionName "RSAT-ADModule-Installation" -message $errorMessage
Exit
```

With the implementation of an automated domain join solution, the process of integrating servers into Azure becomes more efficient and error-free. By leveraging PowerShell and Azure services, we’ve laid the foundation for future-proof scalability, reducing manual intervention and increasing reliability. This approach sets the stage for further automation in the migration process, providing a seamless experience as the organization continues to grow in the cloud.

Add authentication to your Azure App Service or Function app using Microsoft Entra External ID
Azure App Service and Azure Functions offer built-in authentication and authorization features, allowing you to sign in users by writing minimal or no code in your web app, RESTful API, or mobile back end. It’s built directly into the platform and doesn’t require any particular language, library, security expertise, or even any code to use. The built-in authentication feature for App Service and Functions can save you time and effort by providing out-of-the-box authentication with federated identity providers, allowing you to focus on the rest of your application.

This built-in authentication includes:

- Easy activation and configuration via the Azure portal and app settings.
- No need for SDKs, specific languages, or changes to your application code.
- Support for multiple identity providers: Microsoft, GitHub, Facebook, Google, Sign in with Apple, X, and any OpenID Connect provider.

When the authentication/authorization module is enabled, every incoming HTTP request is processed through it before reaching your app code. For more details, see Authentication and Authorization in Azure App Service.

This blog shows you how to configure authentication for Azure App Service and Azure Functions so that your app signs in external users with the Microsoft identity platform (Microsoft Entra External ID) as the authentication provider.

How to enable External ID on your Azure App Service or Function app

Prerequisites

- An external tenant on Microsoft Entra Admin Center. If you don’t have one, create an external tenant with an Azure subscription.
- The Application Administrator role and External ID User Flow Administrator role on Microsoft Entra.
- A Contributor role on Azure to create Function apps.
- An existing Function app or Azure App Service. If you don’t have one, follow this guide to create your first function app or this training to host a web application with Azure App Service.

1. Choose a tenant for your applications and its users

Now that you have your Function app or Azure App Service, let’s set up sign-in for your users. Since we want our app to be available to consumers and business customers, we first need to register the app in an external tenant.

- Sign in to the Azure portal and navigate to your function app or Azure App Service.
- On your app's left menu, under Settings, select Authentication, and then select Add identity provider.
- In the Add an identity provider page, select Microsoft as the identity provider to sign in Microsoft and Microsoft Entra identities.
- For Tenant type, select External configuration for consumers and business customers (external users).

2. Choose the app registration

The Authentication feature can automatically create an app registration for you, or you can use a registration that you or a directory admin created separately.

To create a new app registration, select the Create new app registration option, then select an existing tenant from the drop-down, or select Create new to create a new external tenant.

The second option is to use an existing app registration: select Provide the details of an existing app registration, then provide the application (client) ID, client secret, and issuer URL, which you can find under App registrations > All applications > your app. The most common cases for using an existing app registration are:

- Your account doesn't have permissions to create app registrations in your Microsoft Entra tenant.
- You want to use an app registration from a different Microsoft Entra tenant than the one your app is in.
- The option to create a new registration isn't available for government clouds.

3. Configure external authentication

Follow these steps to set up sign-in and customize branding. Select Configure to configure external authentication for the new tenant. The browser opens Configure external authentication.
- Select a user flow from the drop-down, or select Create new. The user flow defines the sign-in methods your external users can use. Each app can have only one user flow, but you can reuse the same user flow for multiple apps. Then click Next.
- On the Customize Branding tab, add your logo and background color, choose whether to center-align or right-align your sign-in page, and click Next.
- Review your configurations and click Configure.

4. Configure additional checks

Additional checks determine which requests are allowed to access your application. You can customize this behavior now or adjust these settings later from the main Authentication screen by choosing Edit next to Authentication settings.

- For Client application requirement, choose whether to:
  - Allow requests only from this application itself
  - Allow requests from specific client applications
  - Allow requests from any application (not recommended)
- For Identity requirement, choose whether to:
  - Allow requests from any identity
  - Allow requests from specific identities
- For Tenant requirement, choose whether to:
  - Allow requests only from the issuer tenant
  - Allow requests from specific tenants
  - Use default restrictions based on issuer

5. Configure authentication settings

These options determine how your application responds to unauthenticated requests; the default selections redirect all requests to sign in with this new provider. You can customize this behavior now or adjust these settings later from the main Authentication screen by choosing Edit next to Authentication settings. To learn more about these options, see Authentication flow.

- For Restrict access, decide whether to:
  - Require authentication
  - Allow unauthenticated access
- For Unauthenticated requests, choose:
  - HTTP 302 Found redirect: recommended for websites
  - HTTP 401 Unauthorized: recommended for APIs
  - HTTP 403 Forbidden
  - HTTP 404 Not found
- Select Token store (recommended). The token store collects, stores, and refreshes tokens for your application.
You can disable this later if your app doesn't need tokens or if you need to optimize performance.

6. Test your app

After following the above steps, External ID should now be added as an identity provider for your app. To verify that it's working, navigate to your Function App or Azure App Service and click Overview > Browse. This takes you straight to the sign-in page. Follow the sign-up process for a new user; on successful sign-up, you should land in your app.

Next steps

Continue exploring Microsoft Entra External ID on Azure App Service by checking out the documentation. We have a YouTube playlist on ‘Identity for developers’ that shows you other developer tools integrating External ID. You can also explore other features in the Microsoft Entra portfolio by visiting our:

- Developer center
- Identity blog
- YouTube for tutorials, deep dives, and the latest news.

Accelerate AI adoption with next-gen security and governance capabilities
Generative AI adoption is accelerating across industries, and organizations are looking for secure ways to harness its potential. Today, we are excited to introduce new capabilities designed to drive AI transformation with strong security and governance tools.

Simplifying signing integration for Trusted Signing
One of the biggest challenges for developers is integrating signing into their workflows, and with Trusted Signing we aim to make that easier by bringing integration into developer tool chains and CI/CD pipelines. We started by targeting the common tools used today for code signing Windows apps, with the Windows SDK SignTool.exe integration, and covered CI/CD pipelines with the Trusted Signing Azure DevOps Extension and the Trusted Signing GitHub Action. These integrations are great options for developers familiar with signing, but developers are still required to understand what to sign in the production of their app and the package they use to distribute their application. Developers must orchestrate the signing of loose binaries before the packaging project kicks off and packages the binaries into an installer package.

During the Trusted Signing Preview, we observed not only many solution patterns being used by the early subscribers, but also subscribers sharing their solutions and supporting each other. This was a microcosm of the broader developer community, which is always supportive of one another in the ecosystem. Sharing these simplifications reduces the time spent setting up and configuring signing solutions, which in turn gives developers more time for work they couldn’t previously prioritize. We witnessed some of those opportunities, where the time and effort saved on signing solutions turned into new features and changes that had been on the proverbial back burner, in some cases for many years. We are happy to see that some of those simplification solutions for signing with Trusted Signing are now released for all subscribers to leverage, and we want to highlight some of them with you now.
Simplified SignTool Plugin (Dlib) Setup

During the preview stages of Trusted Signing, we recognized that setting up our Windows SDK SignTool.exe plugin, with all of its dependencies, was time consuming for customers. Even though the setup is typically a one-time exercise, we wanted to make it easier. Today we are happy to announce that we’ve released the Trusted Signing Client Tools Installer: an MSI package that installs the latest version of the Microsoft.Trusted.Signing.Client, the latest compatible Windows SDK SignTool.exe, the latest .NET runtime, and all necessary Visual C++ dlib dependencies.

We took it a step further and listed the Trusted Signing Client Tools in the Windows Package Manager (WinGet), making it easy to download and install with a single PowerShell command that strongly enforces the package ID (using the -e parameter):

```powershell
winget install -e --id Microsoft.Azure.TrustedSigningClientTools
```

For anyone not using WinGet, a slightly more complex PowerShell method may be used to quietly download and invoke the MSI, as shown in this example (run as administrator):

```powershell
$ProgressPreference = 'SilentlyContinue'
Invoke-WebRequest -Uri "https://download.microsoft.com/download/6d9cb638-4d5f-438d-9f21-23f0f4405944/TrustedSigningClientTools.msi" -OutFile .\TrustedSigningClientTools.msi
Start-Process msiexec.exe -Wait -ArgumentList '/I TrustedSigningClientTools.msi /quiet'
Remove-Item .\TrustedSigningClientTools.msi
```

For all the details on how to download and install the Trusted Signing Client Tools, visit our Set up signing integrations to use Trusted Signing | Microsoft Learn documentation.
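With the client tools installed, signing with SignTool.exe uses the Trusted Signing dlib plus a small JSON metadata file describing your account. The following is a sketch only; the endpoint URI, account name, certificate profile name, and all file paths are placeholders you would replace with your own values:

```powershell
# metadata.json (placeholder values) describes your Trusted Signing account:
# {
#   "Endpoint": "https://eus.codesigning.azure.net/",
#   "CodeSigningAccountName": "myaccount",
#   "CertificateProfileName": "myprofile"
# }

# Sign a binary with SignTool.exe via the Trusted Signing dlib:
signtool.exe sign /v /fd SHA256 `
  /tr "http://timestamp.acs.microsoft.com" /td SHA256 `
  /dlib "C:\Tools\bin\x64\Azure.CodeSigning.Dlib.dll" `
  /dmdf "C:\Tools\metadata.json" `
  "C:\Build\MyApp.exe"
```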
Trusted Signing Integration into dotnet/Sign

The dotnet/sign: Code Signing CLI tool supporting Authenticode, NuGet, VSIX, and ClickOnce (github.com) has growing popularity in the developer community for its ease of use, bringing the signing operations of many tools under one CLI experience. Integration with Trusted Signing has been part of the service roadmap, and we piloted a proof-of-concept integration during our Private Preview with a few customers. With the Public Preview launch of Trusted Signing, demand for integrating the service into the dotnet/Sign project grew (Support for Azure Trusted Signing · Issue #683), and being an OSS project, we’ve collaborated with OSS developers who wanted to contribute to delivering this integration. Most notable was the contribution by Dirk Lemstra, a well-recognized OSS contributor and one of the maintainers of ImageMagick and other related projects. Dirk's collaboration and contributions were instrumental in delivering support for Trusted Signing in dotnet/Sign. Dirk published his story in the post Signing NuGet packages with Trusted Signing, and he continues to pursue support in NuGet.org with a pull request on the NuGet project here for all to review and share their perspective.

The results of the integration in the dotnet/Sign CLI were first made available in pre-release version sign 0.9.1-beta.24325.5, and we’ve started working on integrating the dotnet/Sign CLI into our Trusted Signing GitHub Action and Azure DevOps Pipeline Extension to make it even simpler to integrate signing of NuGet, VSIX, and ClickOnce into these CI/CD pipelines.

To install dotnet/Sign, run the .NET tool command, specifying the version of the Sign CLI as shown in this example:

```powershell
dotnet tool install --global sign --version 0.9.1-beta.24325.5
```

Once installed, executing the Sign CLI is just a matter of running a command.
Here is an example script that can be used to measure how much time it takes to sign N files; note the concurrency parameter, which can be adjusted to your needs:

```powershell
[CmdletBinding()]
param (
    [string] $endpoint = "https://region.codesigning.azure.net/", # Your Trusted Signing Account URI
    [string] $signaccount = "account",                            # Your Trusted Signing Account Name
    [string] $signprofile = "profile",                            # Your Trusted Signing Certificate Profile Name
    [string] $basePath = "C:\temp",
    [string] $filter = "**/*.exe",
    [int] $maxConcurrency = 5
)
$start = Get-Date
Write-Host $start
sign.exe code trusted-signing -tse $endpoint -tsa $signaccount -tscp $signprofile -b $basePath $filter -m $maxConcurrency
$end = Get-Date
$duration = New-TimeSpan -Start $start -End $end
Write-Host $duration
```

For more information on dotnet/Sign and to engage with the community on the project, visit the GitHub project: dotnet/sign: Code Signing CLI tool supporting Authenticode, NuGet, VSIX, and ClickOnce.

Advanced Installer makes packaging with signing built in

For any developer, packaging their application into an installer media type includes the common steps of building the binaries and packaging them. The packaging steps can be solved with packaging-project solutions such as Caphyon's Advanced Installer product. The Advanced Installer team recognized that having signing built into the packaging project tooling was a great way to simplify the signing process and remove the need for custom orchestration. The team announced “Trusted Signing Integration: Advanced Installer's Newest Standard”, where Trusted Signing is integrated into the packaging process: binaries are signed before packaging, and the package itself is signed automatically as well, turning a multiple-step process into a single step.
We, on the Trusted Signing team, recognized this integration into Advanced Installer as a powerful option for app developers, and during our Private Preview we saw an opportunity to share the Advanced Installer solution with the popular OSS project ImageMagick. ImageMagick's signing challenges were well documented in late October 2023, when Dirk Lemstra, a maintainer of ImageMagick, shared on the project's GitHub Discussions that they would no longer be signing their Windows installer. The day of that Discussions post, a number of existing Trusted Signing Private Preview subscribers reached out to see if we could help the ImageMagick project, and within a week we had ImageMagick up and running with Trusted Signing.

One of the challenges Dirk noted when working with us was that the installer story for ImageMagick on Windows included 17 separate InnoSetup EXE installers, each around 20 MB in size. We connected Dirk with the folks at Advanced Installer, and they worked together to not only simplify the packaging and signing process for ImageMagick, but also modernize the installer by producing an MSIXBundle for ImageMagick on Windows that is less than 19 MB in size; ImageMagick is now available on WinGet. The full story from Dirk and ImageMagick can be found here: ImageMagick MSIX installer now uses Trusted Signing.

More options and looking forward

The Advanced Installer feature with Trusted Signing integration wasn't the only development where Trusted Signing subscribers simplified signing and shared their solution. Some of the others we'd like to call out include:

- Jsign - Authenticode signing in Java (ebourg.github.io)
- KoalaDocs/azure-code-signing-for-plugin-developers
- SignToolGUI – A new tool is here for your digital signing experience – Now public! – Blog - Sonne´s Cloud

There may be more out there, and we applaud the community engagement and interest.
The Trusted Signing team provides a few components that we hope invite and encourage more developer-community-driven solutions. These components include:

- Trusted Signing Crypto Provider (.NET) - simplifies the integration into existing tooling
- Trusted Signing SDKs:
  - Trusted Signing Developer SDK
  - Trusted Signing Resource Manager SDK

Going forward, Trusted Signing will be launching more integration options and enhancing the existing integrations we support, delivering on our goal to make signing easier and simpler for all. If you want to discuss opportunities for collaboration on integrations with Trusted Signing, please reach out to us.

Learn More

Learn more about Trusted Signing here.