SharePoint 2013
Best JS framework (Angular/React/Vue) to use in SharePoint 2013
Hi, we have tools built with jQuery and DataTables.js (basically CRUD operations against multiple lists across sites/site collections; they handle large amounts of data and contain a lot of business logic). We are planning to rebuild them in SharePoint 2013. Visual Studio and server-side coding are not allowed. Can anyone suggest which JavaScript framework (AngularJS 1.x, Angular 2+, React, Vue, or others) is best to use in SharePoint 2013? It is difficult to get Node.js/npm/CLI onto our DEV environment. We have tried a PoC in Angular 4 using SystemJS with manual module mapping, but we are not sure how bundling and future upgrades will work if the CLI and npm are not used. Any suggestions please?

Thanks, Puli
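Whichever framework is chosen, the data layer can stay on the SharePoint 2013 REST API and needs no Node.js/npm toolchain. Below is a minimal, framework-agnostic TypeScript sketch of a list read and an item create, not the poster's actual tooling: the site URL, the "Tasks" list, its SP.Data.TasksListItem type name, and the Title field are placeholders, and it assumes the code runs on a page inside the site so the request digest control is available. On the browsers typical for SharePoint 2013, fetch would need a polyfill or a swap to XMLHttpRequest/jQuery.ajax; the point is that this layer is independent of the UI framework choice.

=== START ===
// Minimal, framework-agnostic CRUD sketch against the SharePoint 2013 REST API.
// Placeholders: siteUrl, the "Tasks" list, and the SP.Data.TasksListItem type name.

const siteUrl = "http://yourserver/sites/yoursite"; // placeholder site URL

// Read the first 100 items of the list.
async function getItems(): Promise<any[]> {
  const res = await fetch(
    `${siteUrl}/_api/web/lists/getbytitle('Tasks')/items?$top=100`,
    { headers: { Accept: "application/json;odata=verbose" }, credentials: "include" }
  );
  if (!res.ok) throw new Error(`GET failed: ${res.status}`);
  const data = await res.json();
  return data.d.results;
}

// Create an item. Writes need the request digest from the page's hidden field.
async function addItem(title: string): Promise<void> {
  const digest = (document.getElementById("__REQUESTDIGEST") as HTMLInputElement).value;
  const res = await fetch(`${siteUrl}/_api/web/lists/getbytitle('Tasks')/items`, {
    method: "POST",
    credentials: "include",
    headers: {
      Accept: "application/json;odata=verbose",
      "Content-Type": "application/json;odata=verbose",
      "X-RequestDigest": digest,
    },
    body: JSON.stringify({
      __metadata: { type: "SP.Data.TasksListItem" }, // type name follows the list name
      Title: title,
    }),
  });
  if (!res.ok) throw new Error(`POST failed: ${res.status}`);
}
=== END ===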
Schedule multiple content source crawls: SharePoint 2013

The maximum content source boundary for a SharePoint 2013 Search service application is 500. That is a lot of individual content source objects. Still, if you are crawling file shares and want to surface specific units as content sources in a refiner, your set of content sources can grow quickly. One shortcoming that has always bothered me is the limitation of crawl scheduling: each schedule is independent and applies only to the content source it belongs to. This means you can easily end up doing some complex mapping to figure out what is crawling when. Without unlimited resources, with 20 million or more items to crawl and more than a handful of content source locations, you will likely run into some frustration.

One way to keep your crawling better managed is to build a schedule that takes the entire farm's SSA into account. You can do this with a PowerShell script and a Windows scheduled task. In this example we will assume we don't want more than 10 crawls running simultaneously. We'll exclude "Local SharePoint sites" from this process; it has continuous crawl enabled and its own aggressive full and incremental crawl schedule. We'll also exclude a few other content sources from the schedule, perhaps because they are extremely large or sit on very slow disk, and we want complete control over when they crawl. Our limit of ten counts every content source that is not idle, regardless of what started its crawl. Yes, this code can be cleaned up, made into a function, etc. The point is to provide a functional tool to review and use if desired.

=== START ===
<#
 Purpose: Start an incremental crawl on the oldest idle content sources,
          up to $MaxNonIdle simultaneous crawls.
 Logic:   Count how many content sources are not idle. If fewer than
          $MaxNonIdle, start incremental crawls on the remaining idle
          sources, oldest completed crawl first.
#>
$ErrorActionPreference = "Stop"

# Load the SharePoint snap-in if it is not already loaded
If ((Get-PSSnapIn -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapIn -Name Microsoft.SharePoint.PowerShell
}

# Settings
$NameSSA      = "Your Search Service Application Name"
$MaxNonIdle   = 10
$NonIdleCount = 0

$sources = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $NameSSA

# Count content sources that are currently crawling
ForEach ($source in $sources) {
    if ($source.CrawlStatus -ne "Idle") {
        $NonIdleCount++
    }
}

if ($NonIdleCount -lt $MaxNonIdle) {
    # Oldest completed crawl first
    $sources = $sources | Sort-Object -Property CrawlCompleted

    ForEach ($source in $sources) {
        if ($NonIdleCount -lt $MaxNonIdle) {
            # Skip excluded sources; "Local SharePoint sites" keeps its own schedule
            if ($source.CrawlStatus -eq "Idle" -and
                $source.Name -notlike "Promo*" -and
                $source.Name -notlike "Blue*" -and
                $source.Name -notlike "Local*") {
                $source.StartIncrementalCrawl()
                $NonIdleCount++
            }
        }
        if ($NonIdleCount -ge $MaxNonIdle) {
            Exit
        }
    }
}
=== END ===
Issues getting SharePoint List Image to my Mobile Image Preview

I am facing issues getting a SharePoint list image into my mobile app's image preview. When I pass the image URL as in the code below, I get a 401 error. I would appreciate it if anybody could suggest a solution for this issue.

Image Url = "Tenant name/NewsImagesHF/allergens-banner_0.jpg";

async getImageUrl(url: any) {
  let options = this._apiHeader();
  this._apiGetData(url, options);
}

async _apiGetData(apiURI: any, options: any) {
  return new Promise((resolve, reject) => {
    this._http.get(apiURI, options)
      .subscribe(res => {
        resolve(res.json());
        console.log(res.json());
        console.log(res);
      }, (err) => {
        reject(err);
      });
  });
}

_apiHeader() {
  let headers = new Headers();
  headers.append('Authorization', 'Bearer ' + localStorage.getItem("token"));
  headers.append('Content-Type', 'application/json');
  let options = new RequestOptions({ headers: headers });
  return options;
}
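One common cause of a 401 here is that the image URL ends up being requested without the Authorization header (for example, when it is bound straight to an image control), so the bearer token never reaches SharePoint. A possible workaround, sketched below under assumptions, is to download the file through the REST GetFileByServerRelativeUrl(...)/$value endpoint with the token attached and bind the resulting object URL to the preview. The site URL and server-relative path are placeholders, and the sketch assumes the stored token is actually valid for the SharePoint resource.

=== START ===
// Hypothetical sketch: fetch the image bytes with the bearer token attached,
// then hand the preview control an object URL instead of the raw SharePoint URL.
// Placeholders: siteUrl and filePath; the token in localStorage is assumed
// to be valid for the SharePoint resource.

async function loadImageAsObjectUrl(): Promise<string> {
  const siteUrl = "https://yourtenant.sharepoint.com/sites/yoursite"; // placeholder
  const filePath = "/sites/yoursite/NewsImagesHF/allergens-banner_0.jpg"; // placeholder

  const res = await fetch(
    `${siteUrl}/_api/web/GetFileByServerRelativeUrl('${filePath}')/$value`,
    { headers: { Authorization: "Bearer " + localStorage.getItem("token") } }
  );
  // A 401 here means the token itself is not accepted for this resource.
  if (!res.ok) throw new Error(`Image request failed: ${res.status}`);

  const blob = await res.blob();
  return URL.createObjectURL(blob); // bind this to the image preview's src
}
=== END ===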
Cannot change content types for MS OOXML docs (.docx, .xlsx, .pptx, etc.)

In order to create Nintex workflows usable across the site collection, required fields needed to be added to the 'Item' content type. These changes were pushed down to all child content types, including 'Document'. The 'Document' content type was renamed to 'Non-record'. Workflows were applied to the 'Non-record' (previously 'Document') content type and to all other custom content types inheriting from 'Document'. At first this worked, but it suddenly stopped without any major change. All other file types (even .doc, .xls, and .ppt) can still be switched between the default CT and the other CTs assigned to the document library; only OOXML files fail to change CTs. Not sure it matters, but list content types change without issue. The issue seems to be confined to the library side of SharePoint.
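As a diagnostic, it may help to try changing a .docx item's content type outside the edit form, for example via the REST API; if that works, the problem likely sits in the library's edit form or document information panel rather than in the content type definitions themselves. The sketch below is hypothetical: the site URL, the 'Documents' library, the item id, the target content type id, and the SP.Data.DocumentsItem type name are all placeholders, and it assumes it runs on a page inside the site so the request digest is available.

=== START ===
// Hypothetical diagnostic sketch: set a document's content type directly via the
// SharePoint REST API, bypassing the edit form. Placeholders: siteUrl, the
// "Documents" library, itemId, contentTypeId, and the SP.Data.DocumentsItem type name.

async function setContentType(itemId: number, contentTypeId: string): Promise<void> {
  const siteUrl = "http://yourserver/sites/yoursite"; // placeholder
  const digest = (document.getElementById("__REQUESTDIGEST") as HTMLInputElement).value;

  const res = await fetch(
    `${siteUrl}/_api/web/lists/getbytitle('Documents')/items(${itemId})`,
    {
      method: "POST",
      credentials: "include",
      headers: {
        Accept: "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
        "X-RequestDigest": digest,
        "IF-MATCH": "*",
        "X-HTTP-Method": "MERGE", // update the existing item in place
      },
      body: JSON.stringify({
        __metadata: { type: "SP.Data.DocumentsItem" }, // assumed type name for the library
        ContentTypeId: contentTypeId, // e.g. the id of the 'Non-record' CT in this library
      }),
    }
  );
  if (!res.ok) throw new Error(`MERGE failed: ${res.status}`);
}
=== END ===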