Crawling
My website not indexing on Bing, Core Web Vitals went to 0 after changing the web host
I am facing some issues with Core Web Vitals as well as Bing indexing. I have tried to get Bing to index the site, but for no apparent reason it keeps picking up nothing. I checked that the sitemap is in place, and I also checked robots.txt for any directive blocking crawling; there is nothing of the sort. On the other hand, the site's host changed from Hostinger to WordPress. After that, the Core Web Vitals section went to 0; it used to score very well. How can we fix that? Any recommendations would be greatly appreciated!

Robots.txt tester failed
Hi, I would like to understand why my robots.txt tester keeps failing. I have emailed tech support, who told me to review my sitemap XML file; however, I'm running 3 blogs with identical sitemap and robots structures, and this problem occurs in only 1 of them. I have run robots testing on Bing Webmaster Tools, and the results stated that no crawling is allowed. Some of my pages are already indexed by Bing as well. Could someone walk me through, step by step, what I could do to remedy this issue? Thanks in advance!

Group Site Collections Columns not Crawled
Hi everyone, I added some custom columns in a group site collection, but I can't find these columns as crawled properties in the search schema. All columns have content, and the content is searchable. If I do the same with a classic team site, the columns are there as usual and I can map the properties in the search schema. Has anyone seen the same behavior?

SharePoint 2013 - Crawl Error - The content processing pipeline failed to process the item
SharePoint 2013 was recently updated to the May 2017 CU. I am receiving the error below when performing a full crawl: "The content processing pipeline failed to process the item. ( Object reference not set to an instance of an object.; ; SearchID = XXXX....." I have also tried resetting the index and starting a new full crawl, but the issue persists. Does anyone know a resolution for this error? Thanks for your help :)

Schedule multiple content source crawls: SharePoint 2013
The maximum content source boundary for a SharePoint 2013 Search service application is 500. That is a lot of individual content source objects. Still, if you are crawling file shares and want to show specific units as content sources in a refiner, you may quickly grow a large set of content sources.

One shortcoming that has always bothered me is the limitation of scheduling crawls. Each schedule is independent, relevant only to the content source it is scheduled to crawl. This means you can easily end up doing some complex mapping to figure out what is crawling when. Without unlimited resources, with 20 million or more items to crawl and more than a handful of content source locations, you will likely run into some frustrations.

One solution that keeps your crawling better managed is a schedule that takes the entire farm SSA into account. You can do this through a PowerShell script and a Windows scheduled task. In this example we will assume we don't want more than 10 crawls running simultaneously. We'll exclude "Local SharePoint sites" from being crawled by this process; it has continuous crawl enabled and its own aggressive full and incremental crawl schedules. We're also going to exclude a few other content sources from the schedule: maybe they are extremely large or sit on very slow disk, so we want complete control over when we crawl them. Our maximum limit of ten will include all non-idle content sources. Yes, this code can be cleaned up, made into a function, etc. The point is to provide a functional tool to review and use if desired.
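Before the script itself, the selection rule just described (skip excluded sources, start idle sources oldest-completed-crawl-first, and never let the total number of non-idle sources exceed the cap) can be sketched in plain Python. This is an illustration of the logic only, not something that talks to SharePoint; all source names and dates below are made up.

```python
from datetime import datetime

MAX_NON_IDLE = 10
# Mirrors the wildcard exclusions used in the script below
EXCLUDED_PREFIXES = ("Promo", "Blue", "Local")

def sources_to_start(sources, cap=MAX_NON_IDLE):
    """sources: list of (name, crawl_status, crawl_completed) tuples.
    Returns the names of idle, non-excluded sources to start, oldest
    completed crawl first, keeping the non-idle total within cap."""
    non_idle = sum(1 for _, status, _ in sources if status != "Idle")
    candidates = sorted(
        (completed, name)
        for name, status, completed in sources
        if status == "Idle" and not name.startswith(EXCLUDED_PREFIXES)
    )
    free_slots = max(0, cap - non_idle)
    return [name for _, name in candidates[:free_slots]]

sources = [
    ("FileShare-HR",    "Idle",     datetime(2017, 5, 1)),
    ("Promo-Archive",   "Idle",     datetime(2017, 1, 1)),  # excluded by prefix
    ("FileShare-Legal", "Crawling", datetime(2017, 4, 1)),  # already running
    ("FileShare-Eng",   "Idle",     datetime(2017, 3, 1)),
]
print(sources_to_start(sources))  # ['FileShare-Eng', 'FileShare-HR']
```

The PowerShell version below does the same thing against the real SSA, using `CrawlStatus`, `CrawlCompleted`, and `StartIncrementalCrawl()` on each content source.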
=== START ===
<#
Purpose: Start an incremental crawl on the content source with the oldest
completed crawl, up to $MaxNonIdle running instances.
  1. Count how many non-idle content sources there are.
  2. If fewer than $MaxNonIdle, start the remainder as incremental crawls,
     oldest completed crawl first.
#>
$ErrorActionPreference = "Stop"

# Load the SharePoint snap-in if it isn't loaded already
If ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin -Name Microsoft.SharePoint.PowerShell
}

$NameSSA = "Your Search Service Application Name"
$MaxNonIdle = 10
$NonIdleCount = 0

# Count content sources that are currently crawling
$sources = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $NameSSA
ForEach ($source in $sources) {
    if ($source.CrawlStatus -ne "Idle") {
        $NonIdleCount++
    }
}

if ($NonIdleCount -lt $MaxNonIdle) {
    # Oldest completed crawl first
    $sources = $sources | Sort-Object -Property CrawlCompleted
    ForEach ($source in $sources) {
        if ($NonIdleCount -lt $MaxNonIdle) {
            # Skip excluded content sources; -notlike is required for the
            # wildcard patterns to match (-ne would compare literally)
            if ($source.CrawlStatus -eq "Idle" -and
                $source.Name -notlike "Promo*" -and
                $source.Name -notlike "Blue*" -and
                $source.Name -notlike "Local*") {
                $source.StartIncrementalCrawl()
                $NonIdleCount++
            }
        }
        if ($NonIdleCount -ge $MaxNonIdle) {
            Exit
        }
    }
}
=== END ===