I've got a SQL Server 2017 database I wrote for a client that has a handful of file tables in it. Full backups of the entire database are usually in the 7 GB range. We added a new file table recently, but after working with it for a few weeks we realized it was a lot larger than expected, pushing full database backups into the 70 GB range.
The main database is very actively used, so regular backups are critical. However, the file table I mentioned doesn't get updated nearly as often. We decided to pull it out of the main database and put it in a separate SQL Server database so that backups of the main database run faster and are back in the 7 GB range. The problem is that after dropping the file table from the main database, the space taken up by the filestream folder is still huge, and backups of the main database are still huge as well.
Before the addition of this new file table, the filestream folders totaled only about 2 GB. After adding the new file table they grew to over 40 GB, and even after dropping the file table the filestream folders are still around 40 GB. The database size and backups are still much larger than expected as well.
Right-clicking and viewing the properties of a recent copy of the database from before the large file table was added shows it at about 15 GB with 5 GB of free space. Doing the same on the newer copy with the same file tables (but after removing the huge new one) shows roughly 50 GB with very little free space.
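In case it helps, I've also been double-checking those numbers with a query against the system catalog rather than relying on the properties dialog (this is a standard query, nothing specific to my setup):

```sql
-- Size and free space per database file, including the FILESTREAM container.
-- size is reported in 8 KB pages; FILEPROPERTY('SpaceUsed') does not apply
-- to FILESTREAM files, so free space is only computed for data/log files.
SELECT
    name,
    type_desc,                                    -- ROWS, LOG, or FILESTREAM
    size * 8 / 1024 AS size_mb,
    CASE WHEN type_desc <> 'FILESTREAM'
         THEN (size - FILEPROPERTY(name, 'SpaceUsed')) * 8 / 1024
    END AS free_mb
FROM sys.database_files;
```

The FILESTREAM row confirms what the folder on disk shows: around 40 GB.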
How do I get rid of this wasted filestream space? I'm at the point where I'm thinking I'll need to drop all the filestream objects from this system and recreate them.
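For completeness, before resorting to that, I've been considering the documented garbage-collection route. My understanding (please correct me if I have this wrong) is that filestream deletes are lazy: the files on disk aren't removed until the log records covering the delete are truncated, which under the FULL recovery model means taking log backups first. A sketch of what I'd try, with the database name and backup path as placeholders:

```sql
USE MyMainDB;  -- placeholder database name
GO
CHECKPOINT;
GO
-- Under FULL recovery, filestream GC can't reclaim space until the
-- relevant log records have been backed up.
BACKUP LOG MyMainDB TO DISK = N'C:\Backups\MyMainDB.trn';  -- placeholder path
GO
-- Force the filestream garbage collector to run now (available since
-- SQL Server 2012). Because deletion is a two-phase (tombstone) process,
-- it may need to be run more than once, with a log backup between runs.
EXEC sp_filestream_force_garbage_collection @dbname = N'MyMainDB';
GO
```

Is this the right approach, or is recreating the filestream objects really the only way to get the space back?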