Using Copy-SPSite with a 200 GB database and migration options in SharePoint 2013


Hello,

 

I want to split a 200 GB web application that only has the root site collection into several databases. I ran a test in my dev environment with success using the Copy-SPSite PowerShell cmdlet (https://www.toddklindt.com/blog/Lists/Posts/Post.aspx?ID=372). My content database in the development environment is 1.2 GB in size and the cmdlet executed successfully.
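For context, the basic usage I tested was along these lines (a minimal sketch; the database and URL names below are placeholders, not my real ones):

        # Create a new content database in the same web application to receive the copy
        New-SPContentDatabase -Name "WSS_Content_NewSite" -WebApplication "http://sharepoint"

        # Copy the root site collection into that database under a new URL
        Copy-SPSite -Identity "http://sharepoint/" -DestinationDatabase "WSS_Content_NewSite" -TargetUrl "http://sharepoint/sites/newsite"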

My concern is using Copy-SPSite with a database of 200 GB or more. Is there any limit on the database size for the Copy-SPSite cmdlet?

 

I know we could use Sharegate or another third party tool to do the job, but I believe it would take longer than this approach.

 

My scenario is that I want to split the content database into several content databases (one content database for each of the new site collections I will create). I considered Copy-SPSite because it would allow the following approach:

  • Create a duplicate of my site collection faster than migrating it with a 3rd party tool
  • In the destination site collection, delete everything I don't need (everything except what I want to migrate)
  • In the origin site collection, delete the data I no longer need there (basically the information I migrated to the new site collection); see the sketch after this list for the two delete steps
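For the two delete steps, I am thinking of something along these lines after the copy completes (a rough sketch only; the URLs and library names are examples, not my real ones):

        # In the destination copy, remove what should stay only in the origin
        $destWeb = Get-SPWeb "http://sharepoint/sites/newsite"
        $destWeb.Lists["Library I am NOT moving"].Delete()   # example library name
        $destWeb.Dispose()

        # In the origin, remove what now lives in the new site collection
        $srcWeb = Get-SPWeb "http://sharepoint/"
        $srcWeb.Lists["Library I AM moving"].Delete()        # example library name
        $srcWeb.Dispose()

I assume I would also have to empty the recycle bins afterwards (if the deletions go there) so the space is actually reclaimed in the content databases.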

Although this looks like overkill, it would allow:

  • Fast creation of the new site collection: users could start working in it much sooner than with Sharegate or a similar 3rd party migration tool, which in my experience can take a long time to complete a migration (depending on the CPU and memory of the machine where it is installed, network conditions, etc.). I have several document libraries to migrate, some of them around 50 GB, and I want users to be able to start working in the new site collection without waiting many hours or days for a 3rd party migration to finish (which should be much slower)
  • Using Copy-SPSite, workflow history is maintained. If I use a migration tool, I can't migrate workflow history

From what I read in some articles on the web (e.g. https://blogs.technet.microsoft.com/christianheim/2017/03/09/carefully-consider-using-the-copy-spsit...), there are a few problems that can arise from using Copy-SPSite, essentially duplicate GUIDs even across different site collections.

 

My questions are:

  • Using a 3rd party migration tool on-premises, how fast is migrating large amounts of data (I know it depends on the conditions I mentioned above)?
  • Assuming it is slow (it can take days to migrate 50 GB), what alternatives do I have? A couple that come to mind:

           1. Using Copy-SPSite - the approach I tried; doubts about the supported database size and about duplicate GUIDs even in different content databases (not sure whether this is a problem when different content databases are used, which is my case)

           2. Using Backup-SPSite and Restore-SPSite - same concerns as with the Copy-SPSite approach (see the PowerShell sketch after this list)

           3. Using DB Backup and Restore - I believe this could work, but the SharePoint content database ID itself must be changed to be unique, and the problem with duplicate GUIDs can also happen here

           4. Using Export-SPWeb - this allows a more granular copy or move, but has the disadvantage that workflow history is not preserved (new GUIDs are created during import and the workflow history is probably lost)
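For reference, this is roughly what options 2, 3 and 4 would look like in PowerShell (a sketch only, assuming SharePoint 2013 on-premises; all paths, database names and URLs are placeholders):

        # Option 2: Backup-SPSite / Restore-SPSite, restoring into a specific content database
        Backup-SPSite -Identity "http://sharepoint/" -Path "D:\Backups\root.bak"
        Restore-SPSite -Identity "http://sharepoint/sites/newsite" -Path "D:\Backups\root.bak" -ContentDatabase "WSS_Content_NewSite"

        # Option 3: SQL backup/restore of the content database, then attach the restored copy
        # with a new database ID so the farm does not see a duplicate content database ID
        Mount-SPContentDatabase -Name "WSS_Content_Restored" -WebApplication "http://sharepoint" -AssignNewDatabaseId

        # Option 4: Export-SPWeb / Import-SPWeb (granular, but new GUIDs are assigned on import)
        Export-SPWeb -Identity "http://sharepoint/" -Path "D:\Backups\rootweb.cmp" -IncludeVersions All
        Import-SPWeb -Identity "http://sharepoint/sites/newsite" -Path "D:\Backups\rootweb.cmp" -UpdateVersions Overwrite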

 

In general, in this scenario, what would be the best approach to:

        1 - Quickly copy large amounts of data (50 GB in a single document library) to a new site collection

        2 - Preserve workflow history

        3 - Avoid duplicate GUIDs in content databases. I'm not sure this is a problem across different content databases (it definitely is within the same content database).

 

What is your recommendation?

1 Reply

@Miguel Lopes Isidoro 

Hi! We are facing similar problems here, and we are thinking about splitting a content database into two by using Copy-SPSite to copy the content into a new database. We will then delete webs so that each web exists in only one of the databases. One of the databases will serve more as an archive.

 

How did you proceed?

 

Regards

Jesper