Latest Discussions
- Indexing Ideas Needed - Part I.D. Order in Libraries
  Example: parts R1 through R10 sort as R1, R10, R2...
  GOAL: To resolve this sort-order issue, which is labour-intensive, and eliminate the following workaround.
  WORKAROUND: A sequential number in a separate index column formatted as a "Number". Many times I must export the library, add the numbers in MS Excel, and paste them back into the library's index number column.
  IDEAS ON RESOLUTION: Your thoughts on how to properly index the part I.D.s without a separate index number column are welcome, including: alternative formatting to apply to the alphanumeric Part I.D. value in the Part I.D. column to promote correct sorting, or any programmed solutions.
  Best regards, Clint Hill
  Clint_E_Hill, Oct 30, 2025
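One programmed direction (a reply-style sketch, not something from the thread): the "R1, R10, R2" problem is plain lexicographic sorting of alphanumeric IDs. It can be handled either by sorting with a natural-sort key, or by zero-padding the numeric part so that ordinary text sorting works on its own. A minimal Python sketch, with hypothetical part IDs:

```python
import re

def natural_key(part_id: str):
    """Split a part ID into text and number chunks so 'R2' sorts before 'R10'."""
    return [int(chunk) if chunk.isdigit() else chunk.lower()
            for chunk in re.split(r"(\d+)", part_id)]

def zero_pad(part_id: str, width: int = 4) -> str:
    """Rewrite numeric runs with leading zeros so plain text sorting also works."""
    return re.sub(r"(\d+)", lambda m: m.group(1).zfill(width), part_id)

parts = ["R10", "R1", "R2"]
print(sorted(parts, key=natural_key))   # ['R1', 'R2', 'R10']
print(sorted(zero_pad(p) for p in parts))   # ['R0001', 'R0002', 'R0010']
```

The zero-padding variant maps onto the SharePoint case: a calculated or reformatted Part I.D. value like "R0002" sorts correctly in any column, with no separate index number column needed.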
- Advice on what method of List to implement
  Hello. I'm trying to implement a control for water meter readings. At first I created one list for the equipment and one list for the readings, with a lookup column linking the two lists through the equipment column in the first. But then I remembered the delegation limit. I have more than five hundred pieces of equipment, and the readings must be registered on a daily basis, so that data would surpass the delegation limit of five thousand rows in about two weeks. Because of that, I decided to create a list for each piece of equipment, which would result in more than five hundred lists but would give much more time (about 13 years) before each list hit the limit. But to avoid human error in selecting the equipment, I need a default value for each readings list, and I'm failing to set one, which made me rethink whether that is really a clever idea. My question: what method would work best in this case? Or is there a better way to do it that I'm not seeing? Thank you in advance!
  williamazevedo2210, Oct 30, 2025
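For what it's worth, the poster's numbers are easy to check: with one reading per meter per day, a single shared readings list crosses the 5,000-row threshold in 10 days, while a per-meter list takes roughly 13.7 years. A quick sketch of that arithmetic (reading frequency is the poster's stated assumption):

```python
meters = 500          # water meters (equipment items)
readings_per_day = 1  # one reading per meter per day, as described in the post

# Single shared readings list: rows added per day vs the 5,000-row limit.
rows_per_day = meters * readings_per_day
days_to_threshold = 5000 / rows_per_day
print(days_to_threshold)              # 10.0

# One list per meter: each list grows by only one row per day.
years_to_threshold = 5000 / 365
print(round(years_to_threshold, 1))   # 13.7
```

The math confirms why a single flat list is uncomfortable here, though note that indexed columns and server-side filtering can often keep a large list workable well past 5,000 rows.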
- PNP Search Results sort by title
  Hi. I have a PnP Search Results web part that brings back all the document libraries in a SharePoint site (excluding the standard ones like Site Assets, Site Pages, etc.). I want the results sorted in alphabetical order, but I tried the sort column and it does not work. If I use "edit sort order" and select Title, it gives me an error. I added a RefinableString mapped to the title, but it is still not sorting. How can I get this list sorted without having to add another column for the title (without the link)? Thank you.
  MVCuser, Oct 30, 2025
- CSOM: My “one query to rule them all” plan backfired — got “Request uses too many resources” 😅
  Hey folks, I've been playing around with CSOM (Client-Side Object Model), feeling fancy about making one super-efficient query that loads everything I need at once. Something like this:

```csharp
clientContext.Load(
    clientContext.Web,
    w => w.SiteUsers.Include(...),
    w => w.Lists.Include(...),
    w => w.SiteGroups.Include(...)
);
```

  Basically, my thought was: "Why make multiple calls when I can just get everything in one go?" But CSOM had other plans. Instead of being impressed, it hit me with the dreaded: "Request uses too many resources." Even when I tried to be nice and limit the properties, it still said "Nope." 🙃 So now I'm wondering: is it actually more efficient (and safer) to create a new ClientContext for each object I want to query (like Web, SiteUsers, Lists, etc.)? Or am I just thinking about this the wrong way and missing some batching trick? Would love to hear how others handle this — or if there's a secret sauce to making CSOM not freak out when you ask for too much.
  AdminSha, Oct 30, 2025
- CSOM batch request: “Request uses too many resources” when loading multiple sites with GetSiteByUrl
  I'm currently exploring CSOM (Client-Side Object Model) and analyzing how property values for client objects such as Web and Site are requested and returned in XML format. I used Fiddler to inspect the request and response bodies. I'm now trying to implement a batch-based query where I load multiple sites with a limited set of properties using the GetSiteByUrl method. However, I'm running into this issue: when I load 10 sites, I get the error “Request uses too many resources.” When I reduce the number to 8 sites, it works fine. I also compared the bytes sent and bytes received for different batch sizes (screenshot attached). So my question is: is there a specific limit on the total request size (bytes sent/received) or the number of operations in a single CSOM batch request? If so, is there any official documentation or guidance on how to determine these limits? Thanks in advance!
  AdminSha, Oct 30, 2025
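One generic way to cope with a batch-size ceiling like the ones in the two CSOM posts above (a sketch of the idea, not CSOM-specific code): issue the work in batches and halve the batch size whenever the server rejects a request, rather than hard-coding a limit the service does not document. In Python, with a toy executor standing in for something like ClientContext.ExecuteQuery:

```python
def run_in_batches(items, execute, batch_size=10, min_batch=1):
    """Process items in batches; on a 'too many resources' style failure,
    halve the batch size and retry instead of giving up."""
    i = 0
    while i < len(items):
        size = batch_size
        while True:
            batch = items[i:i + size]
            try:
                execute(batch)
                break
            except RuntimeError:
                if size <= min_batch:
                    raise  # even a single item is rejected; nothing more to try
                size = max(min_batch, size // 2)
        i += len(batch)

# Toy executor that fails for batches larger than 8 items, mimicking the
# observed behaviour (10 sites fail, 8 succeed).
done = []
def fake_execute(batch):
    if len(batch) > 8:
        raise RuntimeError("Request uses too many resources.")
    done.extend(batch)

run_in_batches(list(range(20)), fake_execute, batch_size=10)
print(done == list(range(20)))  # True
```

The same shape works whether the real limit turns out to be operation count or payload bytes: the client adapts to whatever the server accepts instead of encoding an undocumented number.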
- Disable ready-made agents with SharePoint Agents
  Is there any way of disabling the ready-made agents when using pay-as-you-go with SharePoint Agents? I want to restrict these ready-made agents in my organization's SharePoint sites and deploy Copilot Studio agents in SharePoint for some sites. Is there any way of achieving this without using Copilot licenses? There is a way using "Restricted content discovery", but it needs at least one Copilot license, and I think that feature also disables use of the SharePoint site as a knowledge source for future Copilot Studio agents while it is active.
  jfer, Oct 30, 2025
- This List is Empty - but it isn't
  Hey all, I have a bug that I've never encountered before. I have a SharePoint list that was working perfectly fine until around 3 PM today. The list has about 14 items in it, but when I try to view it, it says "this list is empty". My first thought was that someone accidentally deleted the records, so I checked the recycle bin, but there was nothing in it related to the list. Under "Site contents" the list shows about 14 items (which is what I expect), but when I open it, it still says "this list is empty".

  I have a Power Automate flow that triggers when a Microsoft Form is submitted; it gathers the response details and creates a row in this list with the content from the form. This process was perfect for a solid week, and I've used the same pattern for dozens of forms over 4 years with no issue. Where this gets really odd: when I submit a new form response, my flow runs perfectly and appears to create a list item. Indeed, in Site contents the item count for the list goes up by 1 after submitting a new response. However, the list still says "this list is empty". I even made a new view that shows only a single column (no filters applied) that all 14 records have filled out, and the list still refuses to show any data. I also can't add columns, edit columns, or do anything I would normally do with a SharePoint list. I am an owner of the list, and the form has permission inheritance on.

  What's even more odd is that I can open Power BI Desktop with the same account credentials, use the SharePoint list as a data source, and all 14 rows show up, which leads me to believe the data does actually still exist in the list. No other SharePoint list, including other lists on the same site, has this problem. If possible, I don't want to have to make another list, since it's used by other Power Platform items. This is baffling me.
  WBADAM03, Oct 30, 2025
- SharePoint: Your storage is almost full - one approach
  I've been looking for low-impact ways to minimize storage usage across our SharePoint tenant. This post outlines the approach I took, the outcome, and who it might work for.

  My environment:
  - only 350 GB remaining in the tenancy; 8.7 TB used
  - 265 SharePoint sites in total (team sites, communication sites, etc.)

  I decided to focus on sites using more than 100 GB of storage. In my case this turned up 16 sites, of which I had to exclude 2. That left 14 sites using 5.194553 TB. My goal was to have low or no impact on users while reducing the volume of content in our M365 tenant's SharePoint environment. My approach was the following:
  - get a list of all SharePoint site collections and find the size of each
  - for those larger than 100 GB:
    - delete all recycle bin items deleted more than 30 days ago (current retention setting of 93 days unchanged)
    - delete all items from the 2nd-stage recycle bin (current setting of 30 days unchanged)
    - change all user libraries from 500 to 100 major versions

  In my environment the recycle bins are rarely used, and when I ran a script to work out the average number of versions per file, it was in the tens at most; for most sites it was 3 or 4. NOTE: no sites have minor versioning turned on. I wanted to measure the change, so I exported a usage report before and after doing the above. To achieve all this I used the following code, then the following day checked that the SharePoint admin center reported the same "Storage used" numbers.

  BE AWARE: the code below DELETES data you cannot get back (file versions and files in the recycle bin). DO NOT RUN THIS just to try it out. Test it on one or two sites first. Even if you have backups, getting version history back would be very challenging.
```powershell
<# Goal is to reduce the size sites take up:
   1. change version limit from 500 to 100
   2. move recycle bin items deleted more than 30 days ago to 2nd-stage recycle bin
   3. empty the 2nd-stage recycle bin
#>

function reportTenantSiteCollectioninfo {
    Try {
        #Export file path
        $dateStamp = Get-Date -Format "yyyyMMdd-hhmm"
        $CSVPath = "c:\Temp\SiteUsageRpt-" + $dateStamp + ".csv"

        #Get all site usage details
        $Sites = Get-PnPTenantSite -Detailed | Select *
        $SiteUsageData = @()
        ForEach ($Site in $Sites) {
            #Collect site data
            $SiteUsageData += New-Object PSObject -Property ([ordered]@{
                'Title'                  = $Site.Title
                'URL'                    = $Site.Url
                'Description'            = $Site.Description
                'Owner'                  = $Site.OwnerName
                'Storage Quota'          = $Site.StorageQuota
                'Storage MaximumLevel'   = $Site.StorageMaximumLevel
                'Storage Usage Current'  = $Site.StorageUsageCurrent
                'Resource Quota'         = $Site.ResourceQuota
                'Resource Quota Warning' = $Site.ResourceQuotaWarningLevel
                'Resource Usage Average' = $Site.ResourceUsageAverage
                'Resource Usage Current' = $Site.ResourceUsageCurrent
                'Template'               = $Site.Template
                'Sharing Capability'     = $Site.SharingCapability
                'Lock Status'            = $Site.LockState
                'Last Modified Date'     = $Site.LastContentModifiedDate
                'Subsites Count'         = $Site.WebsCount
            })
        }
        #Export site usage data to CSV, and return it to the caller
        $SiteUsageData | Export-Csv $CSVPath -NoTypeInformation
        Write-Host "Site Usage Report Generated Successfully!" -ForegroundColor Green
        $SiteUsageData
    }
    Catch {
        Write-Host -ForegroundColor Red "Error generating site usage report:" $_.Exception.Message
    }
}

Function Set-PnPVersionHistoryLimit {
    param (
        [Parameter(Mandatory=$true)] $Web,
        [Parameter(Mandatory=$false)][int]$VersioningLimit = 100
    )
    Try {
        Write-Host "Processing Web:" $Web.URL -f Yellow
        Connect-PnPOnline -Url $Web.URL -Interactive

        #Array to exclude system libraries
        $SystemLibraries = @("Form Templates", "Pages", "Preservation Hold Library", "Site Assets",
                             "Site Pages", "Images", "Site Collection Documents",
                             "Site Collection Images", "Style Library", "Teams Wiki Data")

        $Lists = Get-PnPList -Includes BaseType, Hidden, EnableVersioning
        #Get all document libraries
        $DocumentLibraries = $Lists | Where {$_.BaseType -eq "DocumentLibrary" -and $_.Hidden -eq $False -and $_.Title -notin $SystemLibraries}

        #Set versioning limits
        ForEach ($Library in $DocumentLibraries) {
            If ($Library.EnableVersioning) {
                #Set limit on major version history
                Set-PnPList -Identity $Library -MajorVersions $VersioningLimit
                Write-Host -f Green "`tVersion History settings have been updated on '$($Library.Title)'"
            }
            Else {
                Write-Host -f Yellow "`tVersion History is turned off at '$($Library.Title)'"
            }
        }
    }
    Catch {
        Write-Host -f Red "Error:" $_.Exception.Message
    }
}

#Connect to the admin center and get current stats
$AdminCenterURL = "https://MyCompanyName-admin.sharepoint.com"
Connect-PnPOnline -Url $AdminCenterURL -Interactive
$SiteUsageData = reportTenantSiteCollectioninfo

#Get a list of big sites (StorageUsageCurrent is reported in MB, so 100000 is roughly 100 GB)
$bigSites = $SiteUsageData | Where "Storage Usage Current" -gt 100000
$bigSites = $bigSites | Where Title -notin ("excluded site 1", "excluded site 2")
$bigSites.Count

#Loop through the big sites
ForEach ($bigSite in $bigSites) {
    Connect-PnPOnline -Url $bigSite.URL -Interactive

    #Get the items that were deleted more than 30 days ago
    $date30daysago = (Get-Date).AddDays(-30)
    $DeletedItemsOlder = Get-PnPRecycleBinItem | Where { $_.DeletedDate -le $date30daysago } | Sort-Object -Property DeletedDate -Descending
    $DeletedItemsOlder.Count

    #Move all of these to the 2nd-stage recycle bin
    $DeletedItemsOlder | Move-PnPRecycleBinItem -Force

    #Empty the 2nd-stage recycle bin
    Clear-PnPRecycleBinItem -SecondStageOnly -Force

    #Trim version history in every web of the site
    $Webs = Get-PnPSubWeb -Recurse -IncludeRootWeb
    ForEach ($Web in $Webs) {
        Set-PnPVersionHistoryLimit -Web $Web
    }
}

#Get site info again
reportTenantSiteCollectioninfo
```

  Outcome [Updated 7 Aug 2023]

  On the day, the outcome was spectacularly unsuccessful: I saved 24 GB across 5.194553 terabytes, or less than 0.5%. So worth trying, but not valuable in my environment, I thought. But when I looked at the SharePoint admin screen's "Storage used" in August, there was a big drop two days after running this script: from 8.8 TB down to 8.6 TB, so approximately 200 GB saved. Unfortunately this may not have all been due to this script (other things were going on in my tenant), but a lot of it was. Again, not massive at 2%, but useful.

  When would this be valuable for reducing storage volume? I think the key thing isn't the removal of files from the recycle bins, but the removal of older versions of files. So if you have an environment where most files have hundreds of versions, and each version is a large change to the file (e.g. a daily report where new images replace existing ones each day), then this could save a lot of space.

  What about you? How do you reduce SharePoint storage? What have you found that works?
  Dorje-McKinnon, Oct 29, 2025
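The closing point of the storage post above can be made concrete with a rough estimate: trimming version history only saves space in proportion to how many versions per file exceed the new cap, and how large each version is. A back-of-envelope sketch (all numbers hypothetical):

```python
def version_savings_gb(files, avg_versions_now, version_cap, avg_version_size_mb):
    """Rough upper bound on space reclaimed by trimming major-version history."""
    trimmed_per_file = max(0, avg_versions_now - version_cap)
    return files * trimmed_per_file * avg_version_size_mb / 1024  # MB -> GB

# An environment like the post's (3-4 versions per file on average):
# a cap of 100 trims nothing, consistent with the small 24 GB outcome.
print(version_savings_gb(100_000, 4, 100, 5))    # 0.0

# A heavy-versioning environment: 1,000 files with ~300 versions of ~2 MB each.
print(version_savings_gb(1_000, 300, 100, 2))    # 390.625 (GB)
```

This matches the author's conclusion: the technique pays off only where average version counts sit well above the new cap.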
- Active Site Usage Not Matching Storage Used?
  At my new position our SharePoint storage is getting used up quickly. As of right now we have 69 GB available out of 11.75 TB. When I look at the active sites, the total Storage Used only adds up to 2 TB. Where is the other 8.75 TB coming from? Version history was changed to 100 major versions and to delete versions after 1 year. Is it possible that the versions still being retained could add up to 8.75 TB? I also archived a site that was 13 GB, but it's still showing 69 GB available. How long does it take to reclaim that storage?
  vis-mes, Oct 29, 2025