SharePoint Support
27 Topics

Awareness of temporary adjustments in SharePoint Online
On March 24th we shared an announcement in the M365 message center (MC207439) with details about temporary adjustments we are making to select capabilities in SharePoint Online and OneDrive. During these unprecedented times, we are taking steps to ensure that the SharePoint Online and OneDrive services remain available and reliable for your users, who depend on them more than ever in remote work scenarios.

SharePoint fails to create Configuration Database for a new farm
SharePoint fails to create the Configuration Database for a new farm, throwing the exception "Microsoft.SharePoint.Upgrade.SPUpgradeException: One or more types failed to load. Please refer to the upgrade log for more details."

How to disable the modern experience in SharePoint 2019
SharePoint 2019 delivers an updated, modern look and feel for lists and libraries, enabled by default. However, if the classic experience is required for your farm, administrators can disable the modern experience.

Coaching your guest users through the External Sharing Experience
Here is a resource to which you can point the users you collaborate with through the guest user experience on SharePoint Online. There are three possible experiences a user can encounter when being invited to SharePoint Online, and we will deal with each of them in turn. Feel free to copy and paste this content, or provide the individual link for each invitation type, depending on which method of invitation you are using.

Script to copy specified IIS logs from multiple Servers
Summary

Need to copy a set of IIS logs from multiple servers for data analysis? Are you doing it manually? If so, please check out this script, as it will help expedite the process.

The Script

# Modify inputs

# Host and drive to store the IIS logs
$Hostdir = "\\HOSTSERVER\c$"

# Actual location of the IIS logs on each server
$iislogsfolder = "c$\inetpub\logs\LogFiles\W3SVC1892304237"

# A wildcard parameter to determine a file range
$thefiles = "*ex1906*"

# Create a target folder on the host if it does not exist
$TARGETROOT = "$Hostdir\logs"
if(!(Test-Path -Path $TARGETROOT)){
    New-Item -ItemType directory -Path $TARGETROOT
}

# Create an export folder if it does not exist
$target = "$Hostdir\logs\export"
if(!(Test-Path -Path $target)){
    New-Item -ItemType directory -Path $target
}

# Simple server list
$servers = Get-Content C:\servers.txt

# Loop to do the work
foreach ($server in $servers) {
    # Make a new folder named for the server if it does not exist
    $TARGETDIR = "$target\$server"
    if(!(Test-Path -Path $TARGETDIR)){
        New-Item -ItemType directory -Path $TARGETDIR
    }

    # Get the files
    $iislogLogDir = "\\$server\$iislogsfolder"
    $iislogName = Get-ChildItem "$iislogLogDir" -Recurse -Include "$thefiles"

    # Copy all the matching files to the host location
    foreach ($log in $iislogName) {
        Copy-Item -Path $log.FullName -Destination $TARGETDIR
    }
}

What the Script Does

You only need to modify the inputs and create a server list. The script will then do the following:

1. Use a list of servers to collect the data from.
2. Create a "logs" folder on the specified drive.
3. Create an "export" subfolder under "logs".
4. Create a subfolder under "export" named for each server.
5. Use the wildcard file name you specified to target specific files.
6. Copy each targeted file to the correct server-name folder.

Notes:

- If you want to collect one file from each server, just use the full date in the search input (ie 190606).
- You can run this more than once if another file is needed, and the new files will be added to the existing folders.

The "export" folder will contain all the files needed. Just compress the folder and you're ready to share.

SharePoint throws 500 or 403 and remains inaccessible until IISRESET
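Rounding out the IIS log collection script above: once the export folder is populated, the final "compress the folder" step can be done in the same PowerShell session. A minimal sketch, assuming PowerShell 5.0 or later (for Compress-Archive) and the same $Hostdir value used in the collection script:

```powershell
# Compress the export folder produced by the IIS log collection script.
# $Hostdir is assumed to match the value used in the script above.
$Hostdir = "\\HOSTSERVER\c$"
$source  = "$Hostdir\logs\export"
$zipFile = "$Hostdir\logs\IISLogs-$(Get-Date -Format yyyyMMdd).zip"

# Requires PowerShell 5.0+ (Microsoft.PowerShell.Archive module)
Compress-Archive -Path $source -DestinationPath $zipFile -Force
```

The resulting zip file can then be attached to a case or shared directly.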
Summary

I have seen this very obscure SharePoint issue a few times, and it is almost impossible to identify and resolve without extensive debugging. I wanted to get this blog out there to give the next SharePoint admin who experiences this situation a quick resolution.

Symptom:

While trying to access the site http://sharepoint, users are intermittently presented with 403 and 500 errors, and the site remains inaccessible until a manual IISRESET is performed. However, after resetting IIS the site may remain operational for only a very short time before the issue reoccurs. When this issue occurs, you will find the following COMException recorded in the ULS logs:

12/14/2018 14:48:11.11 w3wp.exe (0x22A0) 0x270C SharePoint Foundation Runtime tkau Unexpected
System.Runtime.InteropServices.COMException: Cannot complete this action. Please try again.
   at Microsoft.SharePoint.Library.SPRequestInternalClass.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Int32 iRequestVersion, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument, Boolean& pbDefaultToPersonal, Boolean& pbIsWebWelcomePage, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, Byte& pVerGhostedSetupPath, UInt32& pdwPartCount, Object& pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocid, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder)
   at Microsoft.SharePoint.Library.SPRequest.GetFileAndMetaInfo(String bstrUrl, Byte bPageView, Byte bPageMode, Byte bGetBuildDependencySet, String bstrCurrentFolderUrl, Int32 iRequestVersion, Boolean& pbCanCustomizePages, Boolean& pbCanPersonalizeWebParts, Boolean& pbCanAddDeleteWebParts, Boolean& pbGhostedDocument, Boolean& pbDefaultToPersonal, Boolean& pbIsWebWelcomePage, String& pbstrSiteRoot, Guid& pgSiteId, UInt32& pdwVersion, String& pbstrTimeLastModified, String& pbstrContent, Byte& pVerGhostedSetupPath, UInt32& pdwPartCount, Object& pvarMetaData, Object& pvarMultipleMeetingDoclibRootFolders, String& pbstrRedirectUrl, Boolean& pbObjectIsList, Guid& pgListId, UInt32& pdwItemId, Int64& pllListFlags, Boolean& pbAccessDenied, Guid& pgDocid, Byte& piLevel, UInt64& ppermMask, Object& pvarBuildDependencySet, UInt32& pdwNumBuildDependencies, Object& pvarBuildDependencies, String& pbstrFolderUrl, String& pbstrContentTypeOrder)

Cause:

This issue is caused by an excessive number of AD / SharePoint groups or user permissions being added to site collections, lists and libraries, or pages, which fully consumes the in-memory security cache (owssvr!VsecCacheManager), limited to 2 MB by default. Once this limit has been exceeded, SharePoint is unable to verify user security and responds with a 500/403 error for all users at the server level.

Resolution:

To resolve this problem, increase the "SecurityCacheMemoryAllowed" setting from the default of 2 MB to 20 MB.

Registry DISCLAIMER: Modifying registry settings incorrectly can cause serious problems that may prevent your computer from booting properly. Microsoft cannot guarantee that any problems resulting from configuring registry settings can be solved. Modify these settings at your own risk.

Steps:

1. Click Start, click Run, type regedit, and then click OK.
2. In Registry Editor, locate and then click the following registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0
3. Right-click 14.0, point to New, and then click Key.
   Note: If you are using SharePoint 2013 the key will be under 15.0, and under 16.0 with SharePoint 2016.
4. Type SecurityCacheOptions, and then press ENTER.
5. Right-click SecurityCacheOptions, point to New, and then click DWORD value.
6. Type SecurityCacheMemoryAllowed, and then press ENTER.
7. Right-click SecurityCacheMemoryAllowed, and then click Modify.
8. In the Value data box, change the Base to decimal, type the value 20, and then click OK.
9. Execute an IISRESET and manually restart the "IIS Administration service".

Script to check if a reg key is set on multiple servers
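The manual SecurityCacheMemoryAllowed registry steps from the 500/403 post above can also be scripted across a farm, in the same server-list style as the other scripts in this collection. This is a hedged sketch, not a tested tool: it assumes the 14.0 hive (SharePoint 2010; substitute 15.0 for 2013 or 16.0 for 2016), the Remote Registry service running on each server, and a C:\servers.txt list:

```powershell
# Simple server list
$servers = Get-Content C:\servers.txt

# 14.0 = SharePoint 2010; use 15.0 for 2013 and 16.0 for 2016
$subKey = "SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0\SecurityCacheOptions"

foreach ($server in $servers) {
    $reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $server)

    # CreateSubKey creates the key, or opens it writable if it already exists
    $key = $reg.CreateSubKey($subKey)

    # 20 (decimal) = 20 MB security cache
    $key.SetValue("SecurityCacheMemoryAllowed", 20, [Microsoft.Win32.RegistryValueKind]::DWord)
    $key.Close()

    Write-Host "$server : SecurityCacheMemoryAllowed set to 20" -ForegroundColor Green
}
```

Remember that, as the steps above note, the new value only takes effect after an IISRESET and a restart of the "IIS Administration service" on each server.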
Summary

If your servers require a registry key and you are unsure whether it is properly configured, you can use this script to check whether the key is present and set to the correct value across multiple servers.

The Script

# Simple server list
$servers = Get-Content C:\servers.txt

# Loop through all servers and check the key
foreach ($server in $servers) {
    $REG = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $server)
    $REGKEY = $REG.OpenSubKey("SYSTEM\CurrentControlSet\Control\Lsa")
    $val = $REGKEY.GetValue("DisableLoopBackCheck")

    # If the key is missing, report it in red and move to the next server
    if (!$val){
        Write-Host $server "is not set" -ForegroundColor Red
        continue
    }

    # Specify the expected value here
    # If the key is set correctly it is displayed in green, otherwise in red
    if ($val -ne 1){
        Write-Host $server "is not set properly" -ForegroundColor Red
    }
    else{
        Write-Host $server "is set" -ForegroundColor Green
    }
}

What the Script Does

The sample above checks the "DisableLoopBackCheck" key across a list of servers; if the key does not exist or is not set to "1", that is reported in the output. You can simply change which key to check and the desired value to make this work for your scenario. I hope this helps make your job easier in the future.

Script to backup event logs using a server list
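An aside on the registry check script above ("Script to check if a reg key is set on multiple servers"): if WinRM / PowerShell remoting is enabled in your farm, the same check can be done with Invoke-Command instead of the remote registry API. A sketch under that assumption, using the same DisableLoopBackCheck example:

```powershell
# Simple server list
$servers = Get-Content C:\servers.txt

foreach ($server in $servers) {
    # Read the value on the remote server itself via PowerShell remoting
    $val = Invoke-Command -ComputerName $server -ScriptBlock {
        (Get-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa' `
            -Name DisableLoopBackCheck -ErrorAction SilentlyContinue).DisableLoopBackCheck
    }

    if ($val -eq 1) {
        Write-Host $server "is set" -ForegroundColor Green
    }
    else {
        Write-Host $server "is not set properly" -ForegroundColor Red
    }
}
```

This avoids a dependency on the Remote Registry service, at the cost of requiring WinRM.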
Summary

Have you ever needed to back up event logs for root cause analysis or auditing? Did you access each server and manually export the requested log file? If yes, I hope you find this script handy.

The Script

# Specify which log file
$EventLogName = "Application"

# Specify the drive to store the event logs
$drive = "c$"

# Specify the server to store the event logs
$dest = "SERVERNAME"

# Simple server list
$servers = Get-Content C:\servers.txt

# Loop to do the work
foreach ($server in $servers) {
    # Create a target folder on the source server if it does not exist
    $TARGETROOT = "\\$server\$drive\logs"
    if(!(Test-Path -Path $TARGETROOT)){
        New-Item -ItemType directory -Path $TARGETROOT
    }

    # This is the WMI call to select the specified log from each server
    $logFile = Get-WmiObject -EnableAllPrivileges -ComputerName $server Win32_NTEventlogFile |
        Where-Object {$_.logfilename -eq $EventLogName}

    # Create a file name based on server, log and date
    $exportFileName = $server + "_" + $EventLogName + "_" + (Get-Date -Format yyyyMMdd) + ".evt"

    # Perform the backup
    $logFile.BackupEventlog($TARGETROOT + "\" + $exportFileName)

    # Create an export folder if it does not exist
    $target = "\\$dest\$drive\logs\export"
    if(!(Test-Path -Path $target)){
        New-Item -ItemType directory -Path $target
    }

    # Since WMI does the work on the remote machine, it can't back up directly to a file share.
    # This is a workaround to move the files to a single location after the backup.
    Move-Item $TARGETROOT\$exportFileName $target
}

What does the script do?

This script reads a list of servers and backs up the specified event log to a local folder on each source server. After the backup is complete, it moves the event log files to the network share specified as the destination, so all backed-up files are stored in a single location.

I hope you find this useful the next time you need to back up event logs from multiple servers.

SharePoint / How is mail deleted from the "DROP" Folder?
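A quick aside on the event log backup script above ("Script to backup event logs using a server list"): on newer systems the same export can be done with the built-in wevtutil tool, which writes .evtx files and accepts a remote computer name. A sketch, assuming the same C:\servers.txt list and a C:\logs folder on each server:

```powershell
# Simple server list
$servers = Get-Content C:\servers.txt
$EventLogName = "Application"

foreach ($server in $servers) {
    # Note: with /r: the export path is interpreted on the REMOTE computer,
    # so the file lands on each server's own C:\logs, just like the WMI approach.
    $out = "C:\logs\" + $server + "_" + $EventLogName + "_" + (Get-Date -Format yyyyMMdd) + ".evtx"
    wevtutil epl $EventLogName $out /r:$server /ow:true
}
```

As with the WMI approach, a follow-up Move-Item over the admin shares is still needed to centralize the exported files.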
Summary

I recently worked on a case that involved slow delivery of incoming mail destined for SharePoint libraries. We found the problem was caused by a buildup of mail in the "drop" folders across multiple servers. Looking closer at the issue, we found that most of the messages were intended for aliases that no longer existed.

Example:

03/11/2019 12:19:09.61 OWSTIMER.EXE (0x2DB0) 0x2598 SharePoint Foundation E-Mail 6871 Information The Incoming E-Mail service has completed a batch. The elapsed time was 00:00:00. The service processed 3 message(s) in total. Errors occurred processing 3 message(s): Message ID: Message ID: Message ID: The following aliases were unknown: docs72 docs74 docs73 9dfbc79e-e62a-a083-7f7d-713af476a4e7

Once the drop folder accumulates about 400 files, it prolongs the "job-email-delivery" timer job and can cause delivery delays.

What was done?

To resolve this issue, we provided a few scripts to keep the drop folder clean by moving messages destined for unknown aliases to a holding folder, where they are managed outside of the drop folder. The first script uses a search string to match the unknown aliases found in the ULS events:

######################
# Move mail based on name
######################
$SrcDir = "c:\inetpub\mailroot\drop"
$DestDir = "c:\inetpub\mailroot\badmail\spmail"

if(!(Test-Path -Path $DestDir)){
    New-Item -ItemType directory -Path $DestDir
}

$SearchString = "To: <docs72@sp2016.com>"

Get-ChildItem $SrcDir -Filter *.eml |
    ? {Select-String $SearchString $_ -Quiet} |
    Move-Item -Destination $DestDir

The second script simply moves any message that has not been processed after 10 minutes. After cleaning up all the known bad mail, nothing should last longer than 10 minutes; if a message does, assume it is bad and move it:

#######################
# Move mail after 10 minutes
#######################
$SrcDir = "c:\inetpub\mailroot\drop"
$DestDir = "c:\inetpub\mailroot\badmail\spmail"

if(!(Test-Path -Path $DestDir)){
    New-Item -ItemType directory -Path $DestDir
}

foreach($file in (Get-ChildItem $SrcDir)) {
    # Move files whose last write time is more than 10 minutes in the past
    if($file.LastWriteTime -lt (Get-Date).AddMinutes(-10)) {
        Move-Item -Path $file.FullName -Destination $DestDir
    }
}

How does SharePoint handle it?

SharePoint does delete mail directly from the drop folder, via the OWSTIMER process within the timer job, if it has not been processed after 24 hours. As you can see from PROCMON, OWSTIMER deletes the "eml" file directly from the DROP folder. However, since each bad mail item takes 24 hours to be deleted, expect a large queue of messages in the drop folder on busy farms.

Conclusion

If you are in the process of a migration, chances are you will have e-mail destined for folders that no longer exist. Once a large buildup of mail accumulates in the drop folder, expect delays. I hope this blog creates awareness and helps you manage the issue more efficiently if you find yourself in this situation in the future.
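To keep the drop folder clean without manual intervention, the cleanup scripts above can be run on a schedule. A sketch using the ScheduledTasks module (Windows Server 2012 and later); C:\scripts\Clean-DropFolder.ps1 is a hypothetical path where you would save the cleanup script:

```powershell
# Run the drop-folder cleanup script every 10 minutes.
# C:\scripts\Clean-DropFolder.ps1 is a hypothetical path to the script above.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\scripts\Clean-DropFolder.ps1"

$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 10)

Register-ScheduledTask -TaskName "SPDropFolderCleanup" -Action $action `
    -Trigger $trigger -User "SYSTEM" -RunLevel Highest
```

Register this on each server that hosts a drop folder, so stale messages are swept into the holding folder before the queue builds up.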