Data Protection
How Microsoft 365 Backup works and how to set it up
Protect your Microsoft 365 data and stay in control with Microsoft 365 Backup — whether managing email, documents, or sites across Exchange, OneDrive, and SharePoint. Define exactly what you want to back up and restore precisely what you need, with speeds reaching 2 TB per hour at scale. With flexible policies, dynamic rules, and recovery points going back up to 365 days, you can stay resilient and ready. In this introduction, I'll show you how to minimize disruption and keep your organization moving forward with Microsoft 365 Backup, even in the event of a disaster.

Fine-tune what gets backed up: back up by user, site, group, or file type to meet your exact needs. Restore data in place or to a new location, and compare versions before committing. Restore content from months ago using fast weekly snapshots, even when the issue went unnoticed for weeks.

QUICK LINKS:
00:00 — Automate recovery process
00:37 — How to use Microsoft 365 Backup
01:49 — Compare with migration-based solutions
02:30 — How to set it up
03:33 — Exchange policy for email backup
05:00 — View and manage backups
05:24 — Recover from a restore point
07:45 — Restore from OneDrive & SharePoint
08:33 — Bulk restore
09:41 — Wrap up

Link References:
Check out https://aka.ms/M365Backup
Additional backup and restore considerations at https://aka.ms/M365BackupNotes
Video Transcript:

-If something bad happens, like someone accidentally performing a bulk file deletion, or files being corrupted by a malicious user or ransomware, the first question is, can we recover from our backup? And the second question is, how long until we're back online? To help you automate a targeted recovery process, Microsoft 365 Backup has a self-service solution that helps you scope the data that you want to recover. Your data remains inside your Microsoft 365 trust boundary, providing bulk restore recovery speeds of up to 2 terabytes per hour at scale.

-Now, you might be wondering, do I even need to back up Microsoft 365 data? Let's look at where it makes sense. First, if there's a natural disaster, Microsoft 365 already natively offers high availability and disaster recovery with built-in service resiliency. That said, if you experience a data breach, or unexpected data corruption from a process or person on your end, or because of ransomware, your Microsoft support options depend on the workload in Microsoft 365. For example, for SharePoint, if you do nothing additional at all, when you contact Microsoft Support and the event happened up to 14 days prior, Microsoft will recover OneDrive and SharePoint to a previous state within that timeframe.
That said, if you want to get more specific about what gets restored, or want to go back further than 14 days to recover your data, this is where the Microsoft 365 Backup service comes in. It's self-service by design for SharePoint, Exchange, and OneDrive, giving more targeted control to scope exactly what you need to restore for up to 365 days. We'll be adding Microsoft 365 Backup coverage for other Microsoft 365 workloads over time.

-Let's compare this with migration-based solutions that you may be familiar with. These solutions work by moving your data and transforming it for storage in their service. Then, for recovery, the backup has to be restored to its original form, then migrated back to your Microsoft 365 tenant, adding significant recovery time. Instead, Microsoft 365 Backup takes incremental snapshots of your data. The data stays in your Microsoft 365 service boundary in its native encrypted form, so when you need to recover your data, the recovery process is accelerated. Microsoft 365 Backup is a consumption-based service, with billing based on the amount of data protected.

-Next, let's walk through the setup steps and controls to manage backups and restores. Starting with setting up a billing plan: in advance, you'll need an Azure subscription as well as a defined resource group. From the Microsoft 365 admin center, under Setup, you'll activate pay-as-you-go services and select Get started. Here, I'll choose my Azure subscription, resource group, and region. Note that this region is only used for billing; your data will remain in the location that it's currently in. Still on this page, from the Settings tab, under Storage location, you'll choose Backup. Then turn it on and save to confirm.

-Now, with the service running, the rest of the steps are performed from the Microsoft 365 Backup page in the admin center. Here, I can configure backup policies to initiate automated backup processes.
I have navigated within Settings to Microsoft 365 Backup. From there, each workload, SharePoint, Exchange, and OneDrive, can have its own individual policies. I'm going to walk through an Exchange policy for email backup, but all three follow similar steps. After hitting Set up policy, the overview page displays policy attributes like the backup frequency, in this case every 10 minutes, and the backup retention of up to one year.

-Now, the backup frequency does not impact your costs. Here, I can choose the selection method. The first option is to upload a CSV file with mailboxes; for SharePoint policies this would be sites, and for OneDrive we'd target user accounts. You can also use a dynamic rule, which allows the mailboxes in scope to update dynamically as group membership changes. Or you can define specific filters, where you can select up to three distribution lists or security groups, or both. These are the same filters as for OneDrive policies, and for SharePoint, you can use filters for site names, URL contains values, or site last modified dates. The final option is to select mailboxes individually, where you manually pick the mailboxes that you want to back up. In my case, I'll choose the dynamic rule, use distribution lists, and select Project Falcon and Northwind Traders.

-Now, I just need to review, and from there, I can create the policy. The policy will typically be active within an hour of creation, depending on the size of your group, and you can edit policy attributes at any time. With the policy created, let's move on to viewing and managing backups. I'm back on the Microsoft 365 Backup page, and now I have active policies set up for each workload. As mentioned, I can make edits and changes to these policies from here; for example, you can pause backups or add or remove sites from the SharePoint policy.

-So, at this point, all of our services are running automated backups.
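To make the dynamic-rule behavior concrete, here is a minimal Python sketch of how a rule scoped to distribution lists could resolve its mailboxes at evaluation time, so that scope follows group membership automatically. The class, directory structure, and addresses are hypothetical illustrations, not the actual Backup API:

```python
# Toy model of a dynamic backup rule: the set of protected mailboxes is
# recomputed from current distribution-list membership, so scope follows
# group changes automatically. All names here are illustrative.

class DynamicBackupRule:
    def __init__(self, groups):
        self.groups = groups  # selected distribution lists / security groups

    def mailboxes_in_scope(self, directory):
        # Union of members across the selected groups, resolved at read time
        scope = set()
        for group in self.groups:
            scope.update(directory.get(group, set()))
        return scope

directory = {
    "Project Falcon": {"adele@contoso.com", "alex@contoso.com"},
    "Northwind Traders": {"megan@contoso.com"},
}
rule = DynamicBackupRule(["Project Falcon", "Northwind Traders"])
print(sorted(rule.mailboxes_in_scope(directory)))

# A member added to a group is picked up on the next evaluation:
directory["Project Falcon"].add("lee@contoso.com")
print(sorted(rule.mailboxes_in_scope(directory)))
```

The point of the sketch is that membership is resolved when the policy runs, not frozen at policy creation, which matches the "dynamically update as group membership changes" behavior described above.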
Now, let’s assume that something happened to our Exchange mailboxes that were backed up, and we want to recover from a restore point. To simulate that, I’m logged in as Adele, deleting email from the last month and even removing it from the Deleted Items folder. One thing to note: a restore in Exchange will only impact items that were modified, hard-deleted, or purged during the recovery window. So, let’s recover those deleted emails. I can start that for Exchange by hitting Restore mailboxes.

-For the choose selection method option, I can upload a CSV list of mailboxes or select them individually. I’ll choose the latter. I’ll search for Adele, and there she is. Now, I’ll add her mailbox and hit Next. Then, in content scope, I can select all emails including notes, contacts, calendars, and tasks, or I can choose a specific timeframe as well as apply filters, as you can see here. I’m going to keep the default of all items. Then, I can choose a time before the event happened to restore to. From there, I’ll be presented with available restore points. Email restore points are created every 10 minutes from when the policy is active, for up to 365 days. I’ll choose this one for April 4th at 8:40 AM.

-Then, for the destination of restored items, I have two primary options. I can replace mailbox items with backups, where the current version of the items will be overwritten by the items recovered from the restore point. Or I can create new mailbox items from backups within the user’s mailbox, in a folder named Recovered Items with the year, month, day, and time. I’ll keep replace mailbox items. Note that only affected items, as mentioned, will be overwritten. Any items received after the restore point, or unmodified items, will not be reverted, and will also not get copied over if you decide to create a new folder.
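The restore-point selection described above, where points are captured every 10 minutes and you pick one just before the event, comes down to choosing the latest restore point at or before a target time. A minimal sketch of that logic (function name and timestamps are illustrative):

```python
from datetime import datetime, timedelta

def closest_restore_point(points, target):
    """Return the latest restore point captured at or before `target`,
    or None if no point precedes it."""
    candidates = [p for p in points if p <= target]
    return max(candidates, default=None)

# Restore points every 10 minutes, as for an active Exchange backup policy
start = datetime(2024, 4, 4, 8, 0)
points = [start + timedelta(minutes=10 * i) for i in range(6)]  # 08:00..08:50

print(closest_restore_point(points, datetime(2024, 4, 4, 8, 44)))
# 2024-04-04 08:40:00
```

Choosing a target time just before the bad event, as in the April 4th 8:40 AM example, selects the last clean snapshot without losing more history than necessary.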
Once I confirm and commit to the restore, I can track progress from the Restoration tasks tab on the Microsoft 365 Backup page and see how things are going. I’m going to fast forward a little in time. And just to prove it, I’m back in Adele’s mailbox, and you can see that all of the emails that I deleted before have returned. That’s Exchange.

-There are also a few differences worth pointing out when restoring from OneDrive and SharePoint. I’ll start with SharePoint. Here, I can upload a CSV file of site addresses or select them individually. I’ll do that. Now, I can select exactly which sites I want. There we go. Then, in Search for backups, you’ll see that things are a little different compared to Exchange. Again, I need to choose a date closest to the restore event, as well as a time of day. For the previous two weeks, there are standard restore points captured every 10 minutes. For a small-scale restore where you want to prioritize speed over the exact restore time, the prioritized backup options shown here are faster and recommended. These faster restore points are taken roughly every 24 hours.

-One other thing to note here: if you’re doing a bulk restore, for example to thousands of sites, then the fast restore points are not relevant. If you want to restore beyond two weeks, these are weekly snapshots. If I choose the most recent date where I know that my content is safe, the tool will automatically select the closest restore point captured prior to my selected time. These weekly restore points are also fast restore points. The other options are similar to what I showed in Exchange, where you can use in-place restore or create new sites. Note that content restored to a new location will have an address suffix of R, followed by the restore number in a numeric sequence for each restore, starting with R0, as you can see with this site’s URL.
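The new-location naming just described, a suffix of R plus a restore number starting at R0, might be modeled like this. The exact URL shape and separator are illustrative assumptions; check the restored site's actual address in your tenant:

```python
# Illustrative only: each restore to a new location gets the next suffix
# in the sequence R0, R1, R2, ... appended to the site address.
def restored_site_url(original_url: str, restore_number: int) -> str:
    return f"{original_url}R{restore_number}"

base = "https://contoso.sharepoint.com/sites/ProjectFalcon"
for n in range(3):
    print(restored_site_url(base, n))
```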
In this case, you can copy restored items manually from the restored location to the prior location as needed. An in-place restore means users’ recent edits made to sites, files, and metadata since the time of the restore point will be lost. You can find additional backup and restore considerations at https://aka.ms/M365BackupNotes.

-As you saw today, Microsoft 365 Backup doesn’t just let you self-manage your backups, it helps you recover faster. To find out more, check out https://aka.ms/M365Backup. And keep watching Mechanics for the latest tech updates, subscribe to our channel, and thanks for watching.

Using MCAS to block file upload to SharePoint Online based on (external) file property?
Hi, with MCAS (by file policy or by Conditional Access App Control), would it be possible to act on a single file if a specific file property matches search criteria? E.g., if any value in the multivalued property "Tags" in an Office file matches "testtag01", or if any value in the multivalued property "Keywords" in a PDF file matches "testtag01". I've tried with Office 365 DLP, but the issue with traditional Office 365 DLP is that those properties are not indexed in the SharePoint search index by default, and therefore DLP won't detect them.

[On demand] Data protection with hardware-based security and Windows 11
Do you know how Windows 11 security features like Personal Data Encryption and BitLocker integrate with hardware features like TPM 2.0, Microsoft Pluton, and VBS to keep users and data protected? Watch Data protection with hardware-based security and Windows 11 – now on demand – and join the conversation at https://aka.ms/HardwareBasedSecurity. For more free technical skilling on the latest in Windows, Windows in the cloud, and Microsoft Intune, view the full Microsoft Technical Takeoff session list.

Robust data protection features of Azure Synapse
Introduction

Data serves as the vital essence of any organization. Whether you're dealing with sensitive customer information or financial records, safeguarding your data is non-negotiable. Many organizations face challenges such as: How do you protect data if you don't know where it is? What level of protection is needed? After all, some datasets require more protection than others. Azure Synapse Analytics offers powerful features to help you achieve this, ensuring confidentiality, integrity, and availability. In this blog, we'll explore the data encryption capabilities integrated into Azure Synapse Analytics, discussing encryption techniques for data at rest and in transit, as well as approaches for detecting and categorizing sensitive data in your Synapse workspace.

What is Data Discovery and Classification?

Imagine a company with massive amounts of information stored in its databases, where some columns need extra protection, like Social Security numbers or financial records. Manually finding this sensitive data is a time-consuming nightmare. Here's the good news: there's a better way. Azure Synapse offers a feature called Data Discovery that automates this process.

How does Data Discovery work?

Think of Data Discovery as a super-powered scanner. It automatically goes through every row and column of your data lake or databases, looking for patterns that might indicate sensitive information. Just like a smart assistant, it can identify potentially sensitive data and classify those columns for you. Once the data discovery process is complete, it provides classification recommendations based on a predefined set of patterns, keywords, and rules. These recommendations can then be reviewed, and sensitivity-classification labels can be applied to the appropriate columns. This process is known as Classification.

What happens after classifying sensitivity labels on columns?
Sensitivity-classification labels are new metadata attributes that have been added to the SQL Server database engine. So, after classifying sensitivity labels on columns, the organization can leverage these labels to:

Implement fine-grained access controls, so only authorized persons with the necessary clearance can access sensitive data.
Mask the sensitive data when accessed by users who do not have the necessary permissions, allowing them to see only anonymized versions of the data.
Monitor access and modification activities on sensitive data (auditing access to sensitive data), so any unusual or unauthorized activities can be flagged for investigation.

Steps for discovering, classifying, or labelling columns that contain sensitive data in your database

The classification includes two metadata attributes:

Labels: The main classification attributes, used to define the sensitivity level of the data stored in the column.
Information types: Attributes that provide more granular information about the type of data stored in the column.

Step 1 -> Choose an Information Protection policy based on your requirements

The SQL Information Protection policy is a built-in set of sensitivity labels and information types with discovery logic, native to the SQL logical server. You can also customize the policy according to your organization's needs; for more information, see Customize the SQL information protection policy in Microsoft Defender for Cloud (Preview).

Step 2 -> View and apply classification recommendations

The classification engine automatically scans your database for columns containing potentially sensitive data and provides a list of recommended column classifications. Accept recommendations for columns by selecting the check box in the left column, then select Accept selected recommendations to apply them. You can also classify columns manually, as an alternative or in addition to the recommendation-based classification.
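To make the discovery and label-driven protection ideas above concrete, here is a toy Python sketch of pattern-based column discovery plus masking for readers without clearance. The regexes, label names, and data are purely illustrative; the real engine uses the information types and policies described above:

```python
import re

# Toy discovery: flag columns whose values match patterns that suggest
# sensitive data, mimicking the recommendation step described above.
PATTERNS = {"SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b")}

def discover(columns):
    findings = {}
    for name, values in columns.items():
        for info_type, pattern in PATTERNS.items():
            if any(pattern.search(str(v)) for v in values):
                findings.setdefault(name, []).append(info_type)
    return findings

# Toy label-driven masking: readers without clearance for a column's
# label see an anonymized value instead of the raw data.
LABELS = {"tax_id": "Highly Confidential"}

def read_value(column, value, clearances):
    label = LABELS.get(column)
    if label and label not in clearances:
        return "XXX-XX-" + value[-4:]  # masked form for an SSN-like value
    return value

table = {"customer_name": ["Adele Vance"], "tax_id": ["123-45-6789"]}
print(discover(table))                             # {'tax_id': ['SSN']}
print(read_value("tax_id", "123-45-6789", set()))  # XXX-XX-6789
print(read_value("tax_id", "123-45-6789", {"Highly Confidential"}))
```

The same classification metadata drives all three benefits listed above: the label is checked for access control and masking, and can be logged for auditing.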
To complete your classification, select Save on the Classification page.

Note: Another option for data discovery and classification is Microsoft Purview, a unified data governance solution that helps manage and govern on-premises, multi-cloud, and software-as-a-service (SaaS) data. It can automate data discovery, lineage identification, and data classification. By producing a unified map of data assets and their relationships, it makes data easily discoverable.

Data Encryption

Data encryption is a fundamental component of data security, ensuring that information is safeguarded both at rest and in transit. Azure Synapse takes care of this responsibility for us, leveraging robust encryption technologies to protect data.

Data at Rest

Azure offers various methods of encryption across its different services.

Azure Storage Encryption

By default, Azure Storage encrypts all data at rest using server-side encryption (SSE). It's enabled for all storage types (including ADLS Gen2) and cannot be disabled. SSE uses AES 256 to encrypt and decrypt data transparently. AES 256 stands for 256-bit Advanced Encryption Standard, one of the strongest block ciphers available, and it is FIPS 140-2 compliant. I know these sound like hacking terms😅, but the platform itself manages the encryption key, so we don't need to worry about them. This forms the first layer of data encryption, and it applies to both user and system databases, including the master database.

Note: For additional security, Azure offers the option of double encryption. Infrastructure encryption uses a platform-managed key in conjunction with the SSE key, encrypting data twice with two different encryption algorithms and keys. This provides an extra layer of protection, ensuring that data at rest is highly secure.
Double the Protection with Transparent Data Encryption (TDE)

TDE is an industry-standard technology that encrypts the underlying files of the database rather than the data itself, adding a second layer of data encryption. TDE performs real-time I/O encryption and decryption of the data at the page level. Each page is decrypted when it's read into memory and then encrypted before being written to disk. TDE encrypts the storage of an entire database by using a symmetric key called the Database Encryption Key (DEK). This means that when data is written to the database, it is organized into pages, and TDE encrypts each page using the DEK before it is written to disk, making it unreadable without the key. When a page is read from disk into memory, TDE decrypts it using the DEK, making the data readable for normal database operations.

Why do we call it transparent? Because the encryption and decryption processes are transparent to applications and users: they have no idea whether the data is encrypted or not, and the only way they would notice is if they lacked access to it. This is because encryption and decryption happen at the database engine level, without requiring application awareness or involvement. By default, TDE protects the database encryption key (DEK) with a built-in server certificate managed by Azure. However, organizations can opt for Bring Your Own Key (BYOK); that key can be securely stored in Azure Key Vault, offering enhanced control over encryption keys.

Data in Transit

Data encryption in transit is equally crucial to protect sensitive information as it moves between clients and servers. Azure Synapse utilizes Transport Layer Security (TLS) to secure data in motion. Azure Synapse, dedicated SQL pool, and serverless SQL pool use the Tabular Data Stream (TDS) protocol to communicate between the SQL pool endpoint and a client machine.
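To illustrate the page-level transparency just described, here is a toy sketch: pages are encrypted with a DEK before hitting "disk" and decrypted when read back, so callers never see ciphertext. The keyed-XOR "cipher" here is a deliberately simple stand-in for illustration only, not the AES used by real TDE:

```python
import hashlib
from itertools import cycle

class TdeStorage:
    """Toy model of TDE: pages are encrypted with a DEK on write and
    decrypted on read, transparently to the caller."""

    def __init__(self, dek: bytes):
        self._keystream = hashlib.sha256(dek).digest()
        self._disk = {}  # page number -> encrypted page bytes

    def _crypt(self, data: bytes) -> bytes:
        # Keyed XOR is its own inverse; a stand-in for a real cipher.
        return bytes(b ^ k for b, k in zip(data, cycle(self._keystream)))

    def write_page(self, page_no: int, plaintext: bytes):
        self._disk[page_no] = self._crypt(plaintext)  # encrypted before disk

    def read_page(self, page_no: int) -> bytes:
        return self._crypt(self._disk[page_no])  # decrypted into memory

db = TdeStorage(dek=b"database-encryption-key")
db.write_page(1, b"row: Adele Vance, 123-45-6789")
assert db._disk[1] != b"row: Adele Vance, 123-45-6789"  # unreadable at rest
print(db.read_page(1))  # b'row: Adele Vance, 123-45-6789'
```

The caller of `write_page` and `read_page` never handles the DEK or ciphertext, which is the "transparent" property: the engine does the work below the application.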
TDS depends on Transport Layer Security (TLS) for channel encryption, ensuring all data packets are secured and encrypted between the endpoint and the client machine. It uses a server certificate signed by a Certificate Authority (CA) for TLS encryption, managed by Microsoft. Azure Synapse supports data encryption in transit with TLS v1.2, using AES 256 encryption.

SAP System Refresh and Cloning operations on Azure NetApp Files with SnapCenter
Discover the power of SAP HANA on Azure NetApp Files with seamless system refresh and cloning operations using SnapCenter. This innovative solution leverages Azure NetApp Files snapshot and volume cloning capabilities to provide end-to-end workflows for data protection and SAP system refresh operations. Whether you need to create system copies for testing, address logical corruption, or perform disaster recovery failover tests, SnapCenter ensures quick and efficient processes, saving you time and resources.

Data Protection for SAP Solutions
Data protection is key for all (SAP) customers. We must find an optimal way to protect data against data corruption caused by hardware or software defects, accidental deletion of data, and external and internal data fraud. Also important is how we set up HA (high availability) and DR (disaster recovery).

File Policy: Change stale externally shared files from modified to created with same parameters
Hello, I applied the "Stale externally shared files" file policy, which works great for our organization. This file policy detects any files shared externally that have not been modified for X days. My question is: can I change this modified parameter so that instead of modified, it's created? Here's a screenshot of what I mean. When I add the Created parameter, it only gives me date ranges instead of a number of days like the last modified parameter. Is this a customized parameter that comes with the policy? Can I replicate it with Created? How can I make it detect any files that were created more than X days ago, to apply governance actions? Thank you!

Microsoft Purview: Comprehensive solutions for data governance, protection, compliance & management
Microsoft Purview provides a unified data governance solution to help manage and govern your on-premises, multicloud, and software-as-a-service (SaaS) data, Office apps, Microsoft 365 services, devices, and cloud apps. Easily create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Enable data consumers to access valuable, trustworthy data.
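As a rough mental model of that "unified map" idea, the sketch below shows data assets connected by lineage edges with a transitive upstream query. The class, asset names, and structure are invented for illustration and are not the Purview API:

```python
from collections import defaultdict

class DataMap:
    """Toy data map: assets plus lineage edges, queryable for the
    upstream sources an asset ultimately derives from."""

    def __init__(self):
        self.lineage = defaultdict(set)  # asset -> direct upstream sources

    def add_lineage(self, source, target):
        self.lineage[target].add(source)

    def upstream(self, asset):
        # Walk lineage edges transitively to find all origins of an asset
        seen, stack = set(), [asset]
        while stack:
            for src in self.lineage[stack.pop()]:
                if src not in seen:
                    seen.add(src)
                    stack.append(src)
        return seen

m = DataMap()
m.add_lineage("sales_db.orders", "lake/curated/orders.parquet")
m.add_lineage("lake/curated/orders.parquet", "powerbi/RevenueReport")
print(sorted(m.upstream("powerbi/RevenueReport")))
# ['lake/curated/orders.parquet', 'sales_db.orders']
```

A map like this is what makes impact analysis possible: given a report, you can trace every dataset it depends on, which is the lineage capability the paragraph above describes.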