User Profile
GTRekter
MCT
Joined 9 years ago
Recent Discussions
Changes in Azure DevOps free grants explained
During the past few days, I received several emails from people asking why their CI/CD pipelines had suddenly started receiving errors. Since 2018, Azure DevOps Services has provided free CI/CD pipelines for its users. By default, it allowed developers to run arbitrary code on Microsoft-hosted agents for up to 1,800 free minutes per month, with ten free parallel jobs for public projects. However, the market capitalization of cryptocurrency has skyrocketed in recent years, and as a result, it has become very profitable for some folks to violate the terms of service of PaaS providers' free tiers, like Azure DevOps, by running cryptocurrency miners as a step in their build pipelines.

"…cryptocurrency surged from $190 billion in January of 2020 to $2 trillion in April of 2021…" — Colin Chartier, CEO of DevOps platform LayerCI

Without getting into details, cryptocurrency mining is the process that allows cryptocurrencies to work as a peer-to-peer decentralized network by verifying and adding transactions to the blockchain's public ledger, and it can even generate new coins. This process requires high-performance machines, and miners often hijack hardware resources like the CPU, GPU, and network bandwidth, affecting the entire host. As a result, Azure DevOps users saw a degradation in the service, with build times increasing by 20 to 50 percent.

To guarantee a certain level of performance for its customers, Microsoft marked those activities as abusive and, after seeing an increase in the number of projects used for this purpose on Azure DevOps, announced on February 18th, 2021, that it would stop granting free pipelines to new public projects. Subsequently, crypto miners shifted to private projects, forcing Microsoft to do the same for private projects on March 16th, 2021.

IMPORTANT: This change does not impact any existing organizations. It only affects new public/private projects that you create in new Azure DevOps organizations.

You can still enjoy the free tier, but to do so, you must send an email to azpipelines-ossgrant@microsoft.com and provide the following details:
- Your name
- The Azure DevOps organization for which you are requesting the free grant

For public projects, you must also include:
- Links to the repositories that you plan to build
- A brief description of your project

Once the request is received, the team will review it and respond within 2–3 business days.

References:
- Announcement of the change in Azure Pipelines grant for public projects: https://devblogs.microsoft.com/devops/change-in-azure-pipelines-grant-for-public-projects/
- Announcement of the change in Azure Pipelines grant for private projects: https://devblogs.microsoft.com/devops/change-in-azure-pipelines-grant-for-private-projects/
- Azure DevOps status: https://status.dev.azure.com/_event/235501110
Load Testing with Azure DevOps and k6

In today's article, I guide you through setting up Azure DevOps to perform automated load tests using k6. Before we begin, I want to take a minute to explain what load tests are and why they are essential.

What is a load test?

There are many different types of testing in software development. For example, some tests check that different modules of an application work together as expected (integration testing), some focus on the business requirements of an application by verifying the output of an action without considering the intermediate state (functional testing), and so on. Load testing is a type of performance testing, closely related to stress testing and capacity testing. It focuses on verifying the application's stability and reliability under both normal and peak load conditions.

Important: While load testing exercises your application under realistic average or peak loads, stress testing exercises it under conditions that far exceed realistic estimates.

How does load testing work?

During load testing, the testing tool simulates concurrent requests to your application through multiple virtual users (VUs) and measures insights like response times, throughput rates, resource utilization levels, and more.

Why is it important?

In today's world, both enterprises and consumers rely on digital applications for crucial functions. For this reason, even a small failure can be costly, both in terms of reputation and money. For example, imagine if Amazon did not know the amount of traffic its servers could sustain; it would fail to serve requests from its customers during peak seasons like Black Friday. You might think such an event is unlikely. However, according to a survey taken by the global research and advisory firm Gartner, in 2020, 25% of respondents reported that the average hourly downtime cost of their application was between $301,000 and $400,000. Furthermore, 17% said it cost them $5M per hour.

What is k6?

k6 is an open-source load testing tool written in Go that embeds a JavaScript runtime to allow developers to write performance tests in JavaScript. Each script must have at least one default function, which represents the entry point of the virtual user. The structure of each script has two main areas:

- Init code: Code outside the default function that is run only once per VU
- VU code: Code inside the default function that runs continuously as long as the test is running

```javascript
// init code

export default function () {
  // vu code
}
```

If you want to define characteristics like duration or DNS behavior, or if you want to increase or decrease the number of VUs during the test, you can use the options object. Executor-based configurations like the ones below are nested under the scenarios key:

```javascript
export let options = {
  scenarios: {
    test_1: {
      executor: 'constant-arrival-rate',
      rate: 90,
      timeUnit: '1m',
      duration: '5m',
      preAllocatedVUs: 10,
      tags: { test_type: 'api' },
      env: { API_PROTOCOL: 'http' },
      exec: 'api',
    },
    test_2: {
      executor: 'ramping-arrival-rate',
      stages: [
        { duration: '30s', target: 600 },
        { duration: '6m30s', target: 200 },
        { duration: '90s', target: 15 },
      ],
      startTime: '90s',
      startRate: 15,
      timeUnit: '10s',
      preAllocatedVUs: 50,
      maxVUs: 1000,
      tags: { test_type: 'api' },
      env: { API_PROTOCOL: 'https' },
      exec: 'api',
    },
  },
};
```
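Before wiring anything into a pipeline, you can exercise a script locally with the k6 CLI. A quick sketch, assuming the script was saved as script.js; the --vus and --duration flags override the corresponding options for ad-hoc runs:

```
k6 run script.js
k6 run --vus 10 --duration 30s script.js
```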
Azure DevOps and k6

Processes like continuous integration promote shift-left testing, giving you the advantage of discovering and addressing issues in the early stages of application development. However, load tests are designed to determine whether your application can handle requests in an environment as close as possible to production. While the cost of maintaining an environment identical to production may be prohibitive, you should make it as similar as possible. If you have the resources, a solution might be to create a staging environment that is a copy of your production environment. Otherwise, you can consider carefully running tests against your production environment.

In this demo, I will show you how to set up Azure DevOps to perform load testing of a .NET 5 API written in C# using k6. Now that you have a general understanding of this article's leading players, let us dig into the demonstration.

Create the API

First, you need to create a new ASP.NET 5 API. You can do this easily with the following commands:

```
dotnet new sln
dotnet new webapi -o Training -f net5.0 --no-https
dotnet sln add Training/Training.csproj
```

To make this API more realistic, add the Entity Framework Core in-memory database provider (the Microsoft.EntityFrameworkCore.InMemory NuGet package) and create a DbContext, which represents a session with the database that you will use to query and save instances of your entities:

```csharp
public class ApplicationDbContext : DbContext
{
    public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options)
        : base(options)
    {
    }

    public DbSet<Product> Products { get; set; }
}
```

Then, register the context as a service in the IServiceCollection by copying and pasting the following instruction into your Startup.cs file:

```csharp
services.AddDbContext<ApplicationDbContext>(opt =>
    opt.UseInMemoryDatabase("ApplicationDbContext"));
```

Create the Product class with its attributes and methods:

```csharp
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }

    public static void AddProduct(ApplicationDbContext context, Product product)
    {
        context.Products.Add(product);
        context.SaveChanges();
    }

    public static Product GetProductById(ApplicationDbContext context, int id)
    {
        return context.Products.FirstOrDefault((p) => p.Id == id);
    }

    public static List<Product> GetAllProduct(ApplicationDbContext context)
    {
        return context.Products.ToList();
    }

    public static void RemoveProductById(ApplicationDbContext context, int id)
    {
        var productToRemove = context.Products.FirstOrDefault((p) => p.Id == id);
        context.Products.Remove(productToRemove);
        context.SaveChanges();
    }
}
```

To complete your API, create a controller to define its methods:

```csharp
[ApiController]
[Route("[controller]")]
public class ProductController : ControllerBase
{
    private readonly ApplicationDbContext _context;

    public ProductController(ApplicationDbContext context)
    {
        _context = context;
    }

    [HttpPost]
    [Route("AddProduct")]
    public void AddProduct(Product product)
    {
        Product.AddProduct(_context, product);
    }

    [HttpPost]
    [Route("RemoveProduct")]
    public void RemoveProduct(int id)
    {
        Product.RemoveProductById(_context, id);
    }

    [HttpGet]
    [Route("GetAllProducts")]
    public IEnumerable<Product> GetAllProducts()
    {
        return Product.GetAllProduct(_context);
    }

    [HttpGet]
    [Route("GetProduct")]
    public Product GetProduct(int id)
    {
        return Product.GetProductById(_context, id);
    }
}
```
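As a quick sanity check before automating anything, you can run the API locally and hit one of its endpoints. A sketch; the port depends on your launch profile, and 5000 is only the usual default for --no-https templates:

```
dotnet run --project Training
# in a second terminal:
curl http://localhost:5000/Product/GetAllProducts
```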
Create the load test

Now that your API is ready, you can move forward with the load tests. To do so, create a new folder and save the following code in it as a script.js file:

```javascript
import http from 'k6/http';
import { check } from 'k6';
import { jUnit, textSummary } from 'https://jslib.k6.io/k6-summary/0.0.1/index.js';

export const options = {
  stages: [
    { duration: '10s', target: 10 },
    { duration: '20s', target: 10 },
    { duration: '10s', target: 5 },
  ],
  thresholds: {
    http_req_duration: ['p(95)<250'],
  },
};

export default function () {
  let res = http.get(`${__ENV.API_PROTOCOL}://${__ENV.API_BASEURL}/Product/GetAllProducts`);
  check(res, {
    'is status 200': (r) => r.status === 200,
  });
}

export function handleSummary(data) {
  let filepath = `./${__ENV.TESTRESULT_FILENAME}-result.xml`;
  return {
    'stdout': textSummary(data, { indent: ' ', enableColors: true }),
    [filepath]: jUnit(data),
  };
}
```

Let us take a moment to explain what this code does. First, the options object configures the test to ramp the number of virtual users up from 1 to 10 over the first 10 seconds. The number of VUs then stays constant at 10 for 20 seconds. Finally, it ramps down to 5 VUs in the final 10 seconds of the test. The thresholds object defines that, for the test to succeed, at least 95% of requests must complete in under 250ms.

Now, let us talk about functions. This script has two. As I mentioned earlier in this article, the default function is the entry point of each virtual user (VU). I used the k6/http module to perform an HTTP GET request against the GetAllProducts method of my API, and I added two environment variables to change the protocol and base URL dynamically. The second function is handleSummary(), which is invoked at the end of the test and receives the summary data as its parameter. There, I use a JS helper function to generate a JUnit file from the summary data; the file name is driven by a third environment variable, TESTRESULT_FILENAME.
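You can try the script locally before automating it by supplying the environment variables with k6's -e flag. A sketch; the host and port are placeholders for wherever the API is listening:

```
k6 run -e API_PROTOCOL=http -e API_BASEURL=localhost:5000 -e TESTRESULT_FILENAME=loadtest script.js
```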
Install the k6 extension

To use the k6 functionalities, you can either manually run a script that installs the tool on the virtual machine where the agent is running or install the k6 extension in your organization. To install the extension, follow these steps:

1. Sign in to your Azure DevOps organization.
2. Go to Organization Settings.
3. Select Extensions.
4. Click on Browse Marketplace at the top right.
5. When the Marketplace opens, search for k6.
6. Click on the correct k6 result.
7. Click on the Get it Free button.
8. Select your target organization from the dropdown, then click Install to complete the procedure.

Where should the load test run in the pipelines?

In Azure DevOps, you can develop both build and release pipelines. You might be tempted to perform your load test in your build pipeline to spot any degradation early; however, that goes beyond the build pipeline's duties. Also, to be reliable, load testing should be performed in an environment that better represents production, and from an economic perspective, the development environment typically uses far fewer resources than production for computing, storage, and networking. Therefore, I recommend executing the load test in the release pipeline.

Since the focus of this demo is load testing, I will skip the steps necessary to create a service connection to your cloud provider and your build pipeline, and move directly to the release pipeline:

1. From the dashboard, select Pipelines and then Releases.
2. Click the New Pipeline button.
3. Select Add an Artifact.
4. Select Build as the source type, then select your build pipeline from the Source (Build Pipeline) dropdown.
5. Click on Add a Stage.
6. Select Empty Job.
7. Click the + icon, then select the Azure App Service Deploy task. Select the service connection to the Azure Resource Manager that you created previously.
8. Go back to the Pipeline tab.
9. Click the + icon again, then select the k6 task. Specify both the location of your load test script and the values of the environment variables defined in the script.
10. To conclude your pipeline, click the + icon and select the Publish Test Results task. Select the JUnit format, specify the path where your report is located, and enable the option that fails the pipeline if there are test failures.

Run the test

To run your pipeline, click the Create New Release button at the top right of the screen. If you have enabled the Continuous Deployment trigger, simply push a commit to your repository. You will be able to see the load test's results both in the console and in Azure Test Plans. In this case, the test was successful and the release pipeline continued. If the application had not met the criteria configured in the options object of the load test, the entire pipeline would have failed, and you would know something was wrong before it reached the production environment.
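As an aside, if you prefer YAML pipelines over the classic editor, the same two steps can be sketched roughly as follows. Treat this as an assumption to verify against the k6 extension's documentation: the k6 task name and inputs depend on the extension version you installed, while PublishTestResults@2 is the standard publishing task:

```yaml
steps:
  - task: k6-load-test@0              # task name assumed from the k6 marketplace extension
    inputs:
      filename: 'script.js'
  - task: PublishTestResults@2
    condition: always()               # publish results even when thresholds fail the k6 step
    inputs:
      testResultsFormat: 'JUnit'
      testResultsFiles: '**/*-result.xml'
      failTaskOnFailedTests: true
```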
References

- Azure DevOps public project: https://dev.azure.com/GTRekter/Training
- k6 official documentation: https://k6.io/

How Azure can help your company expand in multiple regions (2 of 5)

In my previous article, we created a proposal based on the company's overview, goals, and technology, highlighting the benefits of an architecture based on Azure resources and giving an overview of its flow. In today's article, I am going to start guiding you through the implementation of the proposed architecture: creating an Azure SQL Database resource for the company's main database in Montana, replicating it in Europe, and then synchronizing it with the on-premises database.

Create an Azure SQL Database

1. Sign in to the Azure portal.
2. Select + Create a resource, filter the results for Databases, and click on SQL Database.
3. On the creation tab, select a resource group, insert a database name, create a new server in Central US (the company's primary location), and select a database configuration.
4. Click Review + create to create the Azure SQL Database resource.

Replicate the database

1. Once the creation process is complete, select your SQL server, and click your SQL database in the list of available databases.
2. Select the Geo-Replication tab and click North Europe (the closest region to the company's target audience).
3. In the creation form, create a new server in North Europe, select the same configuration as the primary location, and click OK.

Azure will then initialize the new instance and seed the data from the primary database to your replica.
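If you prefer scripting to the portal, the same secondary can be created with the Az PowerShell module. A sketch, assuming the Az.Sql module is installed and you are signed in with Connect-AzAccount; all resource names are placeholders:

```powershell
# Create a readable geo-secondary of the primary database on a server in North Europe
New-AzSqlDatabaseSecondary -ResourceGroupName "ContosoRG" `
    -ServerName "contoso-sql-centralus" `
    -DatabaseName "ContosoDB" `
    -PartnerResourceGroupName "ContosoRG" `
    -PartnerServerName "contoso-sql-northeurope" `
    -AllowConnections "All"
```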
Synchronize your on-premises database to the Azure SQL database

1. Select your primary SQL database from the list of available databases.
2. Click the Sync to other databases tab and click New Sync Group.
3. On the creation tab, insert a group name, select the existing database, and enable automatic synchronization. By doing so, you set the frequency with which the primary instance of your Azure SQL database will synchronize with the group, as well as how to resolve possible conflicts. The possible options are:
   - Hub win: When conflicts occur, data in the hub database overwrites conflicting data in the member database.
   - Member win: When conflicts occur, data in the member database overwrites conflicting data in the hub database.
4. Once the synchronization group is created, you will see Not Ready in its status column. Select the synchronization group, click Databases, and then click the Add an On-Premises Database button.

To complete the configuration of an on-premises database, we must set up the SQL Azure Data Sync agent on the machine where the SQL Server instance is located and then select which database to sync.

Set up the SQL Azure Data Sync agent

1. Click Choose the Sync Agent Gateway. If you don't have an agent already installed on your machine, select the Create a new agent radio button.
2. Download the Data Sync agent from the following link: https://www.microsoft.com/en-us/download/details.aspx?id=27693.
3. Open the installer and start the installation wizard.
4. Read and accept the License Agreement and Privacy Information, select Accept, and click Next.
5. Enter the credentials of the account with network access that will run the Windows service (for example, contoso\gtrekter) and click Next.
6. Enter the location where you want to install the agent and click Next to start the installation.
7. Once the installation is complete, launch the Microsoft SQL Data Sync Agent.
8. Before continuing, go back to Azure, fill in the agent name textbox, and click the Create and Generate Key button. Once the key is generated, copy it to the clipboard.
9. In the Microsoft SQL Data Sync Agent, click the Submit Agent Key button.
10. In the configuration window, paste the agent key generated by Azure and fill in the Login and Password fields with the credentials of the Azure SQL Database server where the hub database is located. Test that everything is working properly by clicking the Test Connection button, then click OK.
11. Click the Register button, and in the SQL Server Configuration window, select the authentication type, the server, and the database to synchronize. Once again, check that everything is fine by clicking the Test Connection button, then click Save.

Select the database

To complete the configuration of your on-premises database, follow these steps:

1. Click Select the database.
2. In the tab that appears, add a sync member name, select the on-premises database that you previously registered in the Microsoft SQL Data Sync Agent, and select the sync direction. The available options are:
   - Bi-directional Sync: Data changes on either the on-premises SQL Server database or the hub database are written to the other database.
   - To the Hub: Changes in the on-premises SQL Server database are written to the hub database, but not vice versa.
   - From the Hub: Changes in the hub are written to the on-premises SQL Server database, but not vice versa.
3. Click OK.

Congratulations! You have successfully created an Azure SQL Database resource, geo-replicated it to a secondary location, and synchronized it with your on-premises database!
How Azure can help your company expand in multiple regions (1 of 5)

This month, I want to do something different. Typically, I discuss standalone topics related to an overall theme without really connecting them. This time, I will guide you through a real-life scenario of how Azure can help your organization, starting with the analysis of the deliverables and moving through to their implementation.

Company overview

Contoso is a privately-owned construction company that does business in Montana and North Dakota. Contoso's business model focuses on on-site developments, municipal work, asphalt and concrete construction, and structural pier foundations. Its customer base includes real-estate developers and government agencies.

Business environment and goals

Even though the United States' economy is doing well, Contoso is looking to expand its presence into the European market by acquiring new customers through heavy investments in marketing campaigns. Contoso's president has set a goal of 35% growth in both the company's revenue and size in the next year. Contoso wants to modernize its business model and allow its website to be easily deployed using standard tools, be scalable and able to grow and shrink with new marketing campaigns, and perform well across regions.

Technology

Currently, both the website and enterprise data are managed by IT on a server in the company's data center in Montana. The data is stored in a SQL database that is periodically backed up to different servers within the same datacenter. The website has just been updated to .NET 5 and is constantly updated by a team of 20 engineers spread across the United States. The development team is not following any software development framework. System releases are scheduled with the IT department through an open-ticket system, with each new version accompanied by a documentation package containing instructions on how to deploy it.

Proposal

Starting from the company overview, I have created the following architecture to help Contoso achieve its goals using Microsoft Azure's services. The resources used in this architecture are the following:

- Azure DevOps: A Software as a Service (SaaS) platform that provides the tools for agile planning, work item tracking, cloud hosting of your Git repositories, integrated package management, and cloud-agnostic CI/CD pipelines.
- Azure Front Door: An integral part of the Azure SDN stack that provides a global, scalable entry point to your web applications. It also provides high-performance, low-latency load balancing for HTTP traffic, and high availability by redirecting traffic to the secondary region if the primary is unavailable.
- Azure App Service: An HTTP-based service for hosting web applications, mobile back ends, and REST APIs.
- Azure SQL Database: A fully managed Platform as a Service (PaaS) database engine that provides dynamic scalability with no downtime and global availability, and automatically handles upgrading, patching, and monitoring of your database.
- Azure Monitor: Collects, analyzes, and acts on telemetry from both cloud and on-premises resources.
- Power BI: A cloud-based analytics service that provides tools for aggregating, analyzing, and visualizing data.

Advantages

By adopting the proposed architecture, Contoso will benefit from the following:

- Geo-distribution: Both the application and its data can be deployed and replicated in data centers worldwide with no upfront cost. This will make it easier to reach new markets like Europe and Asia while maintaining low latency and high availability.
- Disaster recovery: In the current architecture, both application and data live in the same datacenter in Montana. A natural disaster or power outage could cause unstable connectivity to the data center or even knock out the company's entire datacenter. If the company does not have a disaster recovery policy in place, this may result in data loss or service disruption. To minimize this damage, Azure offers multiple services able to back up, replicate, and geo-distribute your application and data.
- High availability: Azure will provide a continuous user experience with no apparent downtime, even during partial server failure.
- Scalability: Scaling refers to the ability to increase or decrease a machine's power (vertical scaling) or to add or remove replicas of a resource (horizontal scaling). In our scenario, this functionality will be critical during marketing campaigns, when the usage of specific resources will rise and consequently require more power.
- Elasticity: Azure takes this a step further by making it possible to scale your resources automatically according to demand and usage. This characteristic is called elasticity and is beneficial in case of unexpected peaks in usage.
- Pricing: Azure delivers its services using a pay-as-you-go pricing model. This means that you have no upfront commitment and are charged only for the services you use. In our scenario, Contoso will be able to run resources in different parts of the world without paying to construct a data center, recruit specialized IT personnel, or maintain the resources. And if Contoso chooses to relocate its resources to another region, it can do so with just a few clicks.

Application flow

The flow is as follows:

1. The development team uses Azure Boards for task management, sprint planning, bug tracking, visualization, and reporting.
2. The source code is managed by one of the source control management systems supported by Azure Repos, which helps developers collaborate, resolve conflicts, manage branches, and keep a history of code changes.
3. When a new release candidate is ready, a trigger starts the CI/CD pipelines defined in Azure Pipelines. These pipelines double-check that no vulnerabilities are introduced in the new version and publish it to the App Services in both Central US and North Europe.
4. The on-premises databases are synchronized to the Azure SQL database regularly, which is then geo-replicated in both Central US and North Europe.
5. Azure Front Door acts as a middleman between the user and the backend, forwarding each HTTP request to the closest healthy App Service while decreasing latency.
6. The Azure Monitor instance collects monitoring telemetry from the multiple Azure resources used in this architecture.
7. Managers consult the insights collected by Azure Monitor directly in Power BI, where they can create charts, dashboards, and more.
SharePoint Site Templates

If you want to create a new SharePoint site using the Client Object Model, you can use the WebCreationInformation class to specify the site's properties and then create the site by adding it to the Webs collection. However, you are required to pass a template type. This value is a string and must match one of the names of the available templates. To get the list of all available templates, you can run the Get-SPOWebTemplate command in PowerShell.
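A minimal sketch of listing the templates, assuming the SharePoint Online Management Shell module is installed and the tenant admin URL below is replaced with your own:

```powershell
# Connect to the tenant admin site and list the available web templates
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
Get-SPOWebTemplate | Select-Object Name, Title
```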
As the libraries contained in the Microsoft.SharePointOnline.CSOM package don't provide an enumerator for the available templates, I have created the following snippet:

```csharp
using System.Runtime.Serialization;

namespace SharePoint.Models
{
    public enum WebTemplate
    {
        [EnumMember(Value = "GLOBAL#0")] GlobalTemplate,
        [EnumMember(Value = "STS#0")] TeamSite,
        [EnumMember(Value = "STS#1")] BlankSite,
        [EnumMember(Value = "STS#2")] DocumentWorkspace,
        [EnumMember(Value = "MPS#0")] BasicMeetingWorkspace,
        [EnumMember(Value = "MPS#1")] BlankMeetingWorkspace,
        [EnumMember(Value = "MPS#2")] DecisionMeetingWorkspace,
        [EnumMember(Value = "MPS#3")] SocialMeetingWorkspace,
        [EnumMember(Value = "MPS#4")] MultipageMeetingWorkspace,
        [EnumMember(Value = "CENTRALADMIN#0")] CentralAdminSite,
        [EnumMember(Value = "WIKI#0")] WikiSite,
        [EnumMember(Value = "BLOG#0")] Blog,
        [EnumMember(Value = "SGS#0")] GroupWorkSite,
        [EnumMember(Value = "TENANTADMIN#0")] TenantAdminSite,
        [EnumMember(Value = "ACCSRV#0")] AccessServicesSite,
        [EnumMember(Value = "ACCSRV#1")] AssetsWebDatabase,
        [EnumMember(Value = "ACCSRV#3")] CharitableContributionsWebDatabase,
        [EnumMember(Value = "ACCSRV#4")] ContactsWebDatabase,
        [EnumMember(Value = "ACCSRV#6")] IssuesWebDatabase,
        [EnumMember(Value = "ACCSRV#5")] ProjectsWebDatabase,
        [EnumMember(Value = "BDR#0")] DocumentCenter,
        [EnumMember(Value = "EXPRESS#0")] ExpressTeamSite,
        [EnumMember(Value = "OFFILE#1")] RecordsCenter,
        [EnumMember(Value = "EHS#0")] ExpressHostedSite,
        [EnumMember(Value = "OSRV#0")] SharedServicesAdministrationSite,
        [EnumMember(Value = "PowerPointBroadcast#0")] PowerPointBroadcastSite,
        [EnumMember(Value = "PPSMASite#0")] BusinessIntelligenceCenter,
        [EnumMember(Value = "SPS#0")] SharePointPortalServerSite,
        [EnumMember(Value = "SPSPERS#0")] SharePointPortalServerPersonalSpace,
        [EnumMember(Value = "SPSMSITE#0")] PersonalizationSite,
        [EnumMember(Value = "SPSTOC#0")] ContentsAreaTemplate,
        [EnumMember(Value = "SPSTOPIC#0")] TopicAreaTemplate,
        [EnumMember(Value = "SPSNEWS#0")] NewsSite,
        [EnumMember(Value = "CMSPUBLISHING#0")] PublishingSite,
        [EnumMember(Value = "BLANKINTERNET#0")] PublishingSiteBlank,
        [EnumMember(Value = "BLANKINTERNET#1")] PressReleasesSite,
        [EnumMember(Value = "BLANKINTERNET#2")] PublishingSiteWithWorkflow,
        [EnumMember(Value = "SPSNHOME#0")] NewsHomeSite,
        [EnumMember(Value = "SPSSITES#0")] SiteDirectory,
        [EnumMember(Value = "SPSCOMMU#0")] CommunityAreaTemplate,
        [EnumMember(Value = "SPSREPORTCENTER#0")] ReportCenter,
        [EnumMember(Value = "SPSPORTAL#0")] CollaborationPortal,
        [EnumMember(Value = "SRCHCEN#0")] EnterpriseSearchCenter,
        [EnumMember(Value = "PROFILES#0")] Profiles,
        [EnumMember(Value = "BLANKINTERNETCONTAINER#0")] PublishingPortal,
        [EnumMember(Value = "SPSMSITEHOST#0")] MySiteHost,
        [EnumMember(Value = "ENTERWIKI#0")] EnterpriseWiki,
        [EnumMember(Value = "SRCHCENTERLITE#0")] BasicSearchCenter,
        [EnumMember(Value = "SRCHCENTERFAST#0")] FastSearchCenter,
        [EnumMember(Value = "TenantAdminSpo#0")] SharePointOnlineTenantAdmin,
        [EnumMember(Value = "visprus#0")] VisioProcessRepository,
    }
}
```

To access the EnumMember attribute value, we use the following Enum extension method:

```csharp
using System;
using System.Linq;
using System.Runtime.Serialization;

namespace SharePoint.Extensions
{
    public static class EnumExtensions
    {
        public static string GetMemberAttributeValue(this Enum source)
        {
            Type enumType = source.GetType();
            if (!enumType.IsEnum)
            {
                throw new ArgumentException("source must be an enumerated type");
            }
            var memInfo = enumType.GetMember(source.ToString());
            var attr = memInfo.FirstOrDefault()?
                .GetCustomAttributes(false)
                .OfType<EnumMemberAttribute>()
                .FirstOrDefault();
            return attr?.Value;
        }
    }
}
```

Now, creating a new SharePoint site takes just the following code:

```csharp
// ctx is assumed to be an authenticated Microsoft.SharePoint.Client.ClientContext
// instance pointing at the parent site collection
WebCreationInformation webCreationInfo = new WebCreationInformation
{
    Title = "Title",
    Url = "url",
    WebTemplate = WebTemplate.ExpressTeamSite.GetMemberAttributeValue(),
    Description = "Description",
    UseSamePermissionsAsParentSite = true
};

ctx.Site.RootWeb.Webs.Add(webCreationInfo);
ctx.ExecuteQuery();
```
Automating your Microsoft Teams creation process using PowerShell

Microsoft Teams' daily users are skyrocketing. From April 2020 to October 2020, their number rose more than 50 percent, from 75 million to 115 million. Suppose your company decides to adopt this software for daily internal or external communications. In that case, you might face manually creating public and private teams and channels, assigning Office 365 users to them, and so on. This procedure can become very time-consuming as the complexity of your company increases. To solve this problem, I have created a script that takes care of all of this for you by automating the entire process. The only thing you must do is create a JSON file as described below and pass a few parameters to the script.

Configuration file

This JSON file contains all the teams and channels that you want to create in your organization. You can also specify each object's visibility (private or standard), the users you wish to assign to it, and their roles. The schema of the JSON is the following:

```json
{
  "definitions": {},
  "$schema": "http://json-schema.org/draft-07/schema#",
  "$id": "https://example.com/object1610484731.json",
  "title": "root",
  "type": "object",
  "required": ["teams"],
  "properties": {
    "teams": {
      "$id": "#root/teams",
      "title": "teams",
      "type": "array",
      "default": [],
      "items": {
        "$id": "#root/teams/items",
        "title": "items",
        "type": "object",
        "required": ["displayname", "visibility", "users", "channels"],
        "properties": {
          "displayname": {
            "$id": "#root/teams/items/displayname",
            "title": "displayname",
            "description": "Team display name",
            "type": "string",
            "default": "",
            "examples": ["public team1"],
            "pattern": "^.*$"
          },
          "visibility": {
            "$id": "#root/teams/items/visibility",
            "title": "visibility",
            "description": "Team visibility",
            "type": "string",
            "default": "public",
            "enum": ["public", "private"],
            "examples": ["public", "private"]
          },
          "users": {
            "$id": "#root/teams/items/users",
            "title": "users",
            "type": "array",
            "default": [],
            "items": {
              "$id": "#root/teams/items/users/items",
              "title": "items",
              "type": "object",
              "required": ["email", "role"],
              "properties": {
                "email": {
                  "$id": "#root/teams/items/users/items/email",
                  "title": "email",
                  "description": "User email",
                  "type": "string",
                  "default": "",
                  "format": "email",
                  "examples": ["user1@domain.com"]
                },
                "role": {
                  "$id": "#root/teams/items/users/items/role",
                  "title": "role",
                  "description": "User role",
                  "type": "string",
                  "default": "owner",
                  "enum": ["owner", "member"],
                  "examples": ["owner", "member"]
                }
              }
            }
          },
          "channels": {
            "$id": "#root/teams/items/channels",
            "title": "channels",
            "type": "array",
            "default": [],
            "items": {
              "$id": "#root/teams/items/channels/items",
              "title": "items",
              "description": "Team channels",
              "type": "object",
              "required": ["displayname", "membershiptype", "users"],
              "properties": {
                "displayname": {
                  "$id": "#root/teams/items/channels/items/displayname",
                  "title": "displayname",
                  "description": "Channel display name",
                  "type": "string",
                  "default": "",
                  "examples": ["public channel"],
                  "pattern": "^.*$"
                },
                "membershiptype": {
                  "$id": "#root/teams/items/channels/items/membershiptype",
                  "title": "membershiptype",
                  "description": "Channel membership type",
                  "type": "string",
                  "default": "standard",
                  "enum": ["standard", "private"],
                  "examples": ["standard", "private"]
                },
                "users": {
                  "$id": "#root/teams/items/channels/items/users",
                  "title": "users",
                  "description": "Channel users",
                  "type": "array",
                  "default": [],
                  "items": {
                    "$id": "#root/teams/items/channels/items/users/items",
                    "title": "items",
                    "type": "object",
                    "required": ["email", "role"],
                    "properties": {
                      "email": {
                        "$id": "#root/teams/items/channels/items/users/items/email",
                        "title": "email",
                        "description": "User email",
                        "type": "string",
                        "default": "",
                        "format": "email",
                        "examples": ["user1@domain.com"]
                      },
                      "role": {
                        "$id": "#root/teams/items/channels/items/users/items/role",
                        "title": "role",
                        "description": "User role",
                        "type": "string",
                        "default": "member",
                        "enum": ["owner", "member"],
                        "examples": ["owner", "member"]
                      }
                    }
                  }
                }
              }
            }
          }
        }
      }
    }
  }
}
```
"properties": { "email": { "$id": "#root/teams/items/channels/items/users/items/email", "title": "email", "description": "User email", "type": "string", "default": "", "format": "email", "examples": [ "user1@domain.com" ] }, "role": { "$id": "#root/teams/items/channels/items/users/items/role", "title": "role", "description": "User role", "type": "string", "default": "member", "enum": ["owner", "member"], "examples": [ "owner", "member" ] } } } } } } } } } } } } The following example shows how to create different teams and channels with multiple users and permissions: { "teams":[ { "displayName": "Public team1", "visibility": "Public", "users": [ { "email":"User1@domain.com", "role":"Owner" }, { "email":"User2@domain.com", "role":"Member" }, { "email":"User3@domain.com", "role":"Member" } ], "channels": [ { "displayName": "Public channel", "membershipType": "Public", "users":[] }, { "displayName": "Private channel", "membershipType": "Private", "users":[ { "email":"User1@domain.com", "role":"Owner" }, { "email":"User2@domain.com", "role":"Member" } ] } ] }, { "displayName": "Public team2", "visibility": "Public", "users": [ { "email":"User1@domain.com", "role":"Owner" }, { "email":"User2@domain.com", "role":"Member" }, { "email":"User3@domain.com", "role":"Member" } ], "channels": [ { "displayName": "Public channel", "membershipType": "Public", "users":[] }, { "displayName": "Private channel", "membershipType": "Private", "users":[ { "email":"User1@domain.com", "role":"Owner" }, { "email":"User2@domain.com", "role":"Member" } ] } ] } ] } Script This script is based on the official Microsoft Teams PowerShell module. If you are using a version of PowerShell prior the 1.0.18, you might face the following error: New-TeamChannel : A parameter cannot be found that matches parameter name 'MembershipType'. At line:1 char:101 + ... upId -DisplayName PrivateChannelDisplayName -MembershipType Private + ~~~~~~~~~~~~~~~ + CategoryInfo : InvalidArgument: (:) [New-TeamChannel], ParameterBindingException + FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.TeamsCmdlets.PowerShell.Custom.NewTeamChannel The problem is that the creation of private channels is not supported by versions prior to 1.0.18. At the time of this writing, I am using version 1.1.10-preview. The logic of this script is straightforward. However, there are few keynotes: Because each team and private channel has its dedicated SharePoint communication site, channels are created asynchronously. To solve possible issues related to assigning users to an incomplete instance, the function Add-UserToPrivateChannel will perform several attempts to set users to a private channel. If you want to set a user as the owner of a private channel, the user must first be added as a standard user and then changed to the owner. 
Script

This script is based on the official Microsoft Teams PowerShell module. If you are using a version of the module prior to 1.0.18, you might face the following error:

```
New-TeamChannel : A parameter cannot be found that matches parameter name 'MembershipType'.
At line:1 char:101
+ ... upId -DisplayName PrivateChannelDisplayName -MembershipType Private
+                                                 ~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidArgument: (:) [New-TeamChannel], ParameterBindingException
    + FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.TeamsCmdlets.PowerShell.Custom.NewTeamChannel
```

The problem is that the creation of private channels is not supported by versions prior to 1.0.18. At the time of this writing, I am using version 1.1.10-preview. The logic of this script is straightforward. However, there are a few key notes:

- Because each team and private channel has its own dedicated SharePoint site, channels are created asynchronously. To avoid issues when assigning users to a not-yet-complete instance, the Add-UserToPrivateChannel function performs several attempts to add users to a private channel.
- If you want to set a user as the owner of a private channel, the user must first be added as a regular member and then promoted to owner.

```powershell
Param(
    [Parameter(Position=0)]
    [string]$Office365Username,
    [Parameter(Position=1)]
    [string]$Office365Password,
    [Parameter(Position=2)]
    [string]$TeamsFilePath
)

Write-Verbose "Importing modules"
$Module = Get-Module -Name MicrosoftTeams -ListAvailable
if ($Module.Count -eq 0) {
    Write-Verbose "Installing MicrosoftTeams module"
    Install-Module -Name MicrosoftTeams -AllowPrerelease -AllowClobber -Force
}

Function New-MicrosoftTeam([object]$Team) {
    Try {
        Write-Verbose "Creating $($Team.DisplayName) $($Team.Visibility) team"
        $NewTeam = New-Team -DisplayName $Team.DisplayName -Visibility $Team.Visibility

        Write-Verbose "Adding $($Team.Users.Length) users to $($Team.DisplayName) team"
        $Team.Users | ForEach-Object -Begin {
            $Index = 0
        } -Process {
            $Index = $Index + 1
            Write-Progress -Id 1 -ParentId 0 -Activity "Add user to the team" -Status "$($Index) of $($Team.Users.Length) - User: $($_.Email), Role: $($_.Role)" -PercentComplete ($Index/$Team.Users.Length*100)
            Write-Verbose "Adding $($_.Email) to $($Team.DisplayName) team as $($_.Role)"
            Add-TeamUser -User $_.Email -Role $_.Role -GroupId $NewTeam.GroupId
        } -End {
            Write-Verbose "Users successfully added to the $($Team.DisplayName) team"
        }

        Write-Verbose "Add $($Team.Channels.Length) channels to $($Team.DisplayName) team"
        $Team.Channels | ForEach-Object -Begin {
            $Index = 0
        } -Process {
            $Index = $Index + 1
            Write-Progress -Id 2 -ParentId 0 -Activity "Creation of a new channel" -Status "$($Index) of $($Team.Channels.Length) - Display Name: $($_.DisplayName), Membership Type: $($_.MembershipType)" -PercentComplete ($Index/$Team.Channels.Length*100)
            New-TeamChannel -DisplayName $_.DisplayName -MembershipType $_.MembershipType -GroupId $NewTeam.GroupId
            Write-Verbose "Check channel membership type"
            if ('Private' -eq $_.MembershipType -And $_.Users.Length -gt 0) {
                Write-Verbose "Add $($_.Users.Length) users to $($_.DisplayName) private channel"
                $_.Users | ForEach-Object -Begin {
                    $IndexUsers = 0
                    $UsersLength = $_.Users.Length
                    $DisplayName = $_.DisplayName
                } -Process {
                    $IndexUsers = $IndexUsers + 1
                    Write-Progress -Id 3 -ParentId 2 -Activity "Add user to the private channel" -Status "$($IndexUsers) of $($UsersLength) - User: $($_.Email), Role: $($_.Role)" -PercentComplete ($IndexUsers/$UsersLength*100)
                    Write-Verbose "Adding $($_.Email) to $($DisplayName) private channel as $($_.Role)"
                    Add-UserToPrivateChannel -DisplayName $DisplayName -Email $_.Email -Role $_.Role -GroupId $NewTeam.GroupId
                } -End {
                    Write-Verbose "Users successfully added to the $($_.DisplayName) channel"
                }
            }
        } -End {
            Write-Verbose "Channels successfully created"
        }
    }
    Catch {
        Write-Error "Message: [$($_.Exception.Message)]" -ErrorId B1
    }
}

Function Add-UserToPrivateChannel([string]$DisplayName, [string]$Email, [string]$Role, [string]$GroupId) {
    $MaxNumberOfAttemps = 5
    $Attemps = 0
    do {
        try {
            Write-Verbose "$($Attemps) attempt/s"
            Write-Verbose "Waiting $(60*$Attemps) seconds"
            Start-Sleep -s (60*$Attemps)
            Write-Verbose "Adding $($Email) to $($DisplayName) private channel"
            Add-TeamChannelUser -DisplayName $DisplayName -User $Email -GroupId $GroupId
            Write-Verbose "Check user role"
            if ("Owner" -eq $Role) {
                Write-Verbose "Set $($Email) as owner of the $($DisplayName) private channel"
                Add-TeamChannelUser -DisplayName $DisplayName -User $Email -Role "Owner" -GroupId $GroupId
            }
            break;
        }
        catch {
            $Attemps = $Attemps + 1
            if ($_.Exception.ErrorCode -ne 404 -And $Attemps -eq $MaxNumberOfAttemps) {
                throw
            }
        }
    } while ($Attemps -lt $MaxNumberOfAttemps)
}
Write-Verbose "Generating secure password"
$SecurePassword = ConvertTo-SecureString -AsPlainText $Office365Password -Force

Write-Verbose "Generating PSCredential object"
$Credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $Office365Username, $SecurePassword

Write-Verbose "Connecting to Microsoft Teams"
Connect-MicrosoftTeams -Credential $Credentials

Write-Verbose "Read JSON file from $($TeamsFilePath)"
$Json = Get-Content -Raw -Path $TeamsFilePath | ConvertFrom-Json
$Json.Teams | ForEach-Object -Begin {
    $Index = 0
} -Process {
    $Index = $Index + 1
    Write-Progress -Id 0 -Activity "Creation of the teams" -Status "$($Index) of $($Json.Teams.Length) - Display Name: $($_.DisplayName), Visibility: $($_.Visibility)" -PercentComplete ($Index/$Json.Teams.Length*100)
    New-MicrosoftTeam -Team $_
} -End {
    Write-Host "Update completed" -ForegroundColor Green
}
```
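An example invocation, assuming the script was saved as New-TeamsFromConfig.ps1 (the post does not name the file) next to the configuration file:

```powershell
# Credentials are illustrative; for interactive use, prefer prompting with Get-Credential
.\New-TeamsFromConfig.ps1 -Office365Username "admin@contoso.com" `
    -Office365Password "P@ssw0rd!" `
    -TeamsFilePath ".\teams.json" -Verbose
```

Because the parameters use [Parameter()] attributes, the script supports common parameters such as -Verbose, which surfaces the Write-Verbose tracing during the run.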
References

- PowerShell module: https://www.powershellgallery.com/packages/MicrosoftTeams/1.1.10-preview
- GitHub repository: https://github.com/GTRekter/Apollo