This blog post is the second in a series spotlighting the local AI samples contained in the AI Dev Gallery. The Gallery is a preview project that aims to showcase local AI scenarios on Windows and to give developers the guidance they need to enable those scenarios themselves. The Gallery is open-source and contains a wide selection of different models and samples, including text, image, audio, and video use cases. The first blog post in the series, on Pose Estimation, can be found here.
This second post will cover how to implement a semantic search with a locally running text embedding model. The Gallery currently includes the all-MiniLM-L6-v2 and all-MiniLM-L12-v2 models for text embedding tasks, and these models are run in the app via ONNX Runtime.
Semantic Search at a High Level
Note: If you're familiar with embedding models already, and just want to see code, skip to the Code Walkthrough section.
At its core, a semantic search is a search that matches on meaning rather than on literal text or surface-level text similarity. For example, if a user searched with the query term "sports," and the source text only contained the word "athletics," a traditional search would find no matches, while a semantic search would surface the relevant text. Semantic matching enables a more robust kind of search that more closely parallels a human view of information, taking into account meaning, context, and nuance.
Semantic searches are almost always enabled by some sort of text embedding model.
How do text embedding models work?
Text embedding models convert source text to a numerical representation of the meaning of that text. Instead of a string, you will have a vector of numerical values that captures as much of the original meaning of the text as possible.
It's important to note that text embedding models usually create vectors for each token in a sequence, where a token is usually a single word but may also be a portion of a word, punctuation, or any other piece of text. For example, if the input sequence is "I love basketball", the model may output embedding vectors for "I", "love", "basket", and "ball", depending on how the input is broken into tokens. Each of these tokens then gets its own embedding vector, all of the same length.
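As a toy illustration (these are made-up numbers, not real model output; the all-MiniLM models actually produce 384 values per token), the vectors might look something like this:
// Toy 4-dimensional embeddings purely for illustration; the real models
// output 384-dimensional vectors per token.
float[] tokenI      = {  0.10f,  0.80f, -0.20f,  0.05f };
float[] tokenLove   = {  0.60f,  0.10f,  0.30f, -0.40f };
float[] tokenBasket = { -0.30f,  0.20f,  0.70f,  0.50f };
float[] tokenBall   = { -0.25f,  0.15f,  0.65f,  0.55f };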
Another key thing to know about each of these vectors is that embedding models are usually designed to account for context when generating output. This means that the model also considers the tokens before and after a particular token, to make sure that meaning is captured in context. In the example above, this is how the model would know that "basket" refers to the goal in the sport of basketball, rather than a basket for holding fruit, even though it was separated into its own token.
Comparing Text Embeddings
Because text embeddings from the same model have the same length, comparing them is relatively easy. Calculate the numerical distance between two embedding vectors, and that will correspond with the semantic distance. The larger the distance between two vector representations, the farther apart in embedded meaning the two texts are. And the inverse is true as well: smaller distance between embedding vectors means closer meaning.
If we were comparing the tokens above to find the most similar token to "ball":
The distances (found using the Euclidean distance formula) between "ball" and each other token in the sequence are calculated and compared. The token with the smallest distance (in this case "basket") is the most semantically similar token.
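Here's a minimal sketch of that comparison using the toy vectors from above (the actual sample delegates this math to the vector store, shown later in the walkthrough):
// Euclidean distance between two embedding vectors of equal length.
static float Distance(float[] a, float[] b)
{
    float sumOfSquaredDiffs = 0f;
    for (int d = 0; d < a.Length; d++)
    {
        float diff = a[d] - b[d];
        sumOfSquaredDiffs += diff * diff;
    }
    return MathF.Sqrt(sumOfSquaredDiffs);
}

// Distance(tokenBall, tokenBasket) comes out smaller than Distance(tokenBall, tokenLove)
// or Distance(tokenBall, tokenI), so "basket" is the closest match to "ball".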
But what if we wanted to compare the entire sequence "I love basketball" with another sequence, like "I love football"? Longer sequences can be compared using mean pooling: averaging out all the token embeddings to get one for the entire sequence. Once the token embeddings have been averaged, the comparison is identical to the one described above.
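Here's a simplified sketch of mean pooling (the sample's real MeanPooling function also uses the attention mask so that padding tokens don't skew the average):
// Average the token embeddings element-wise to get one embedding
// for the whole sequence.
static float[] MeanPool(IReadOnlyList<float[]> tokenEmbeddings)
{
    int dims = tokenEmbeddings[0].Length;
    var pooled = new float[dims];
    foreach (float[] token in tokenEmbeddings)
    {
        for (int d = 0; d < dims; d++)
        {
            pooled[d] += token[d];
        }
    }
    for (int d = 0; d < dims; d++)
    {
        pooled[d] /= tokenEmbeddings.Count;
    }
    return pooled;
}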
Enabling Semantic Search with Text Embedding
Given that we have a model that can produce embeddings for input text, the algorithm for semantic search is fairly straightforward. Let's assume we already have a sourceText string and a searchQuery string:
Assuming sourceText is on the longer side, it must be broken up into pieces small enough for the embedding model to handle.
Once the input has been chunked, both sourceText and searchQuery must first be tokenized before they can be passed to the embeddings model. In this case, the BERT tokenizer is used.
The tokenized input is passed to an ONNX inference session that runs the embeddings model with the desired inputs.
The text embeddings are output by the inference session.
These are token-by-token embeddings, so the embeddings for each sequence must be averaged to get an embedding for the entire string.
Now that sourceText and searchQuery have been embedded, a search can be performed to find the chunks of text from sourceText that most closely match the meaning of the searchQuery. This is done by calculating the distance between two embeddings.
The closest matches are output as semantic matches for the searchQuery.
The code walkthrough below will cover key steps like how to instantiate the inference session and tokenizer, generate embeddings, and search for the closest match. The full code can also be viewed alongside the sample within the AI Dev Gallery, or in the GitHub repository.
Code Walkthrough
This code walkthrough will cover the key logic for generating text embeddings and using them to execute a semantic search. Let's start with setting up the inference session and tokenizer.
Instantiating the inference session and tokenizer
The ONNX InferenceSession and the BertTokenizer are instantiated in the constructor for the EmbeddingGenerator class:
// Create default session options
_sessionOptions = new SessionOptions();
// Create a new ONNX inference session
_inferenceSession = new InferenceSession(Path.Join(modelPath, "onnx", "model.onnx"), _sessionOptions);
// Create a new BERT tokenizer
_tokenizer = BertTokenizer.Create(Path.Join(modelPath, "vocab.txt"));
In this code:
A new SessionOptions object is created that will be passed to the inference session.
A new InferenceSession is instantiated that is passed the model path and the session options.
A new BertTokenizer is instantiated that is passed the vocabulary file that shipped with the embedding model.
The InferenceSession and BertTokenizer will be referenced in the following code to handle embedding generation.
Generating Embeddings
Embedding generation and inference are handled in the GetVectorsAsync function, which takes an IEnumerable<string> of values and a RunOptions object:
private async Task<float[][]> GetVectorsAsync(IEnumerable<string> values, RunOptions runOptions)
{
// Clean input strings of "unprintable" characters
values = values.Select(s => MyRegex().Replace(s, string.Empty));
// Encode the string enumerable into EmbeddingModelInputs with the BERT tokenizer
IEnumerable<EmbeddingModelInput> encoded = _tokenizer.EncodeBatch(values);
// Store total values count
int count = values.Count();
At the start of the GetVectorsAsync function:
The input is cleaned of any unprintable control characters that could cause issues for the tokenizer. MyRegex matches these characters.
The input is encoded with the BERT tokenizer into an enumerable of EmbeddingModelInput that can be used by the embedding model.
Next, the list of EmbeddingModelInputs is flattened to a single input (a sketch of this step follows the list below):
InputIds: The input IDs represent the tokenized version of the string that was passed to the tokenizer. Each token is assigned an ID that correlates to a particular token in our array. You can think of this as the actual data that is being processed.
AttentionMask: The attention mask is an array of values of either 1 or 0 that tells the embedding model which tokens to pay attention to. Tokens that are assigned 1 are processed while tokens that are assigned 0 are ignored. This is necessary because the tokenizer encodes all strings in a batch to the same length, padding shorter strings to match. The attention mask tells the embedding model to ignore the padding tokens.
TokenTypeIds: The token type IDs array assigns "types" to each token in the input. For certain use cases, this allows separator tokens to be denoted as different from normal input tokens. For now, it can mostly be ignored as it doesn't come up much when dealing with plain strings.
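The flattening step itself isn't shown in the snippet above; a minimal sketch of it, assuming EmbeddingModelInput exposes long[] fields that match the tokenizer's batch output, could look like:
// Concatenate the per-string arrays from the batch into one flat array
// per field, so the whole batch can be passed to the model as one tensor.
EmbeddingModelInput input = new()
{
    InputIds = encoded.SelectMany(e => e.InputIds).ToArray(),
    AttentionMask = encoded.SelectMany(e => e.AttentionMask).ToArray(),
    TokenTypeIds = encoded.SelectMany(e => e.TokenTypeIds).ToArray(),
};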
Now that the input is flattened, it needs to be converted to OrtValues so that it matches ONNX Runtime's expected input format:
// Get the length of each of the input sequences by dividing the total length by the count
int sequenceLength = input.InputIds.Length / count;
// Create input tensors over the input data using count and sequence length to define the shape
using var inputIdsOrtValue = OrtValue.CreateTensorValueFromMemory(
input.InputIds,
[count, sequenceLength]);
using var attMaskOrtValue = OrtValue.CreateTensorValueFromMemory(
input.AttentionMask,
[count, sequenceLength]);
using var typeIdsOrtValue = OrtValue.CreateTensorValueFromMemory(
input.TokenTypeIds,
[count, sequenceLength]);
Each field of the flattened EmbeddingModelInput is passed to CreateTensorValueFromMemory along with a shape defined as [count, sequenceLength]. The input IDs, attention mask, and token type IDs each get their own input tensor with count entries, all of the same length sequenceLength.
Next, these values are added to a list and a corresponding list of input names is created:
// Input names that correspond with input OrtValues, ONNX expects this
var inputNames = new List<string>
{
"input_ids",
"attention_mask",
"token_type_ids"
};
// Move OrtValues that were created to a list
var inputs = new List<OrtValue>
{
{ inputIdsOrtValue },
{ attMaskOrtValue },
{ typeIdsOrtValue }
};
Both of these lists will be passed when inference is called to tell ONNX how to process the input. The last thing left before embeddings are generated is to define an output tensor:
// Allocate an output tensor of the expected dimensionality (Sequence Count x Sequence Length x 384)
using var output = OrtValue.CreateAllocatedTensorValue(OrtAllocator.DefaultInstance, TensorElementType.Float, [count, sequenceLength, 384]);
This tensor has a similar shape to the input, but the third dimension is specified as 384, which is the number of float values that define each individual token embedding. For every sequence we inferred on, there will be 384 output values for every token in that sequence: Number of Sequences x Number of Tokens in each sequence x 384 embedding values.
The last thing left to do is call for inference:
// Run inference using the input and output fields that were defined in the above steps
await _inferenceSession.RunAsync(runOptions, inputNames, inputs, _inferenceSession.OutputNames, [output]);
And then process the output:
// Get the type and shape of the output to help with post-processing operations
var typeAndShape = output.GetTensorTypeAndShape();
// Mean pool token-level embeddings to get sequence-level embeddings
ReadOnlyTensorSpan<float> sentence_embeddings = MeanPooling(output.GetTensorDataAsSpan<float>(), input.AttentionMask, typeAndShape.Shape);
// Normalize final embedding values to smooth out outliers
float[] resultArray = NormalizeSentenceEmbeddings(sentence_embeddings, typeAndShape.Shape);
The output processing for sentence embeddings involves quite a bit of tensor math, so we'll break down what these functions do at a high level and link out to the source code if you want to dive into them deeper.
Mean Pooling: This step is necessary to get from token-level embeddings to sequence-level embeddings. Essentially, the model creates an embedding for every individual token in a sequence. If we want embeddings for each full sequence instead, we need to average out the token embedding values for every single token in each sequence. This is what mean pooling does: it reduces the dimensionality of the output from Sequence Count x Sequence Length x 384 to Sequence Count x 384 through averaging. After mean pooling the embeddings, there is a single embedding for each sequence, rather than an embedding for every token in that sequence. View the code for this function for implementation details.
Normalization: This step smooths the sequence-level embeddings to prevent outliers and data anomalies from affecting embedding comparisons later on in our code. Every embedding is divided by the square root of the sum of all its squared values. This normalizes each embedding relative to its own scale, preventing particularly large or small embeddings from skewing comparisons. A rough sketch of this math follows; view the code for this function for implementation details.
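As a sketch of the normalization math just described (the sample's NormalizeSentenceEmbeddings works over spans and tensor shapes, so its actual signature differs):
// L2-normalize a single sequence embedding: divide every value by the
// square root of the sum of all squared values.
static float[] Normalize(float[] embedding)
{
    float sumOfSquares = 0f;
    foreach (float value in embedding)
    {
        sumOfSquares += value * value;
    }
    float norm = MathF.Sqrt(sumOfSquares);
    return embedding.Select(value => value / norm).ToArray();
}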
Once the output has been pooled and normalized, it just needs to be chunked and then returned, as sketched below.
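Here is a sketch of what that last step might look like, assuming resultArray holds count * 384 floats laid out sequence by sequence:
// Split the flat result into one 384-value embedding per input sequence.
return resultArray.Chunk(384).ToArray();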
The call to Enumerable.Chunk ensures that each float array that we return is the proper size for a single embedding.
Creating GeneratedEmbeddings
Let's take a look at how we build properly formatted embeddings from the float array output of GetVectorsAsync:
// Make the call to GetVectorsAsync to get the embedding data
float[][] vectors = await GetVectorsAsync(values, runOptions).ConfigureAwait(false);
// Instantiate a new GeneratedEmbeddings object to store each float Embedding
var generatedEmbeddings = new GeneratedEmbeddings<Embedding<float>>();
// Add each vector as a new Embedding<float> to the GeneratedEmbeddings
generatedEmbeddings.AddRange(vectors.Select(x => new Embedding<float>(x)));
GetVectorsAsync outputs a two-dimensional float array. Each sub-array represents a single pooled embedding and is added to the GeneratedEmbeddings as an Embedding<float>. It's important to note that the order of these embeddings is preserved, so the text that corresponds with an embedding can be found by indexing into the original source content.
Next, let's look at how we search for semantic matches.
Putting It All Together: Semantic Search
We can use the Microsoft.Extensions.VectorData library to do a lot of the heavy lifting for implementing vector search functionality.
To start, an InMemoryVectorStore is instantiated, which is used to fetch or create an IVectorStoreRecordCollection:
// Create an InMemoryVectorStore
IVectorStore? vectorStore = new InMemoryVectorStore();
// Query the vector store for the IVectorStoreRecordCollection associated with the embeddings
IVectorStoreRecordCollection<int, StringData> embeddingsCollection = vectorStore.GetCollection<int, StringData>("embeddings");
// Double check that the collection exists and create it if not
await embeddingsCollection.CreateCollectionIfNotExistsAsync(ct).ConfigureAwait(false);
The above code sets up an IVectorStoreRecordCollection<int, StringData> that will be used to store the embeddings. This interface defines the VectorizedSearchAsync function, which can be used to perform a semantic search with embeddings.
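The StringData type is defined elsewhere in the sample; a plausible shape for it, assuming the Microsoft.Extensions.VectorData record attributes, is roughly:
// A record the vector store can index: a key, the original text chunk,
// and the 384-dimensional embedding vector to search over.
// The attributes below come from Microsoft.Extensions.VectorData.
internal sealed class StringData
{
    [VectorStoreRecordKey]
    public int Key { get; set; }

    [VectorStoreRecordData]
    public string Text { get; set; } = string.Empty;

    [VectorStoreRecordVector(384)]
    public ReadOnlyMemory<float> Vector { get; set; }
}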
Next, the input text is chunked to meet length requirements and passed to the embeddings generator:
// Chunk the input source text under a maximum length requirement, in this case, 512
List<string> sourceContent = ChunkSourceText(sourceText, 512);
// Generate embeddings for both the search text and the source content
// GenerateAsync will eventually make a call to GetVectorsAsync, where inference and processing was implemented
GeneratedEmbeddings<Embedding<float>> searchVectors = await _embeddings.GenerateAsync([searchText], null, ct).ConfigureAwait(false);
GeneratedEmbeddings<Embedding<float>> sourceVectors = await _embeddings.GenerateAsync(sourceContent, null, ct).ConfigureAwait(false);
After this code executes, searchVectors and sourceVectors contain the embedding vector data for both the search text and source content, respectively.
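The ChunkSourceText helper is part of the sample and not shown here; a simplified sketch of one way to chunk text under a character cap might look like this (the real implementation may split on sentences or paragraphs instead):
// Naive chunking: pack whitespace-separated words into chunks that stay
// under maxLength characters.
static List<string> ChunkText(string text, int maxLength)
{
    var chunks = new List<string>();
    var current = new StringBuilder();

    foreach (string word in text.Split(' ', StringSplitOptions.RemoveEmptyEntries))
    {
        if (current.Length > 0 && current.Length + word.Length + 1 > maxLength)
        {
            chunks.Add(current.ToString().TrimEnd());
            current.Clear();
        }

        current.Append(word).Append(' ');
    }

    if (current.Length > 0)
    {
        chunks.Add(current.ToString().TrimEnd());
    }

    return chunks;
}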
Now, the sourceVectors are upserted into the IVectorStoreRecordCollection:
// Call to UpsertBatchAsync
await foreach (var key in embeddingsCollection.UpsertBatchAsync(
// Map each source vector to a StringData object to add to the collection
sourceVectors.Select((x, i) => new StringData
{
Key = i,
Text = sourceContent[i],
Vector = x.Vector
}),
ct).ConfigureAwait(false))
{
// No action here, just iterating over entire enumerable.
}
Every embedding that was generated from the source content will have a corresponding entry in the record collection.
We can now finally execute our semantic search using the VectorizedSearchAsync function exposed through the IVectorStoreRecordCollection interface:
// Call to the vectorized search via the embeddings collection
VectorSearchResults<StringData> vectorSearchResults = await embeddingsCollection.VectorizedSearchAsync(
searchVectors[0].Vector, // Pass in the search vector
new VectorSearchOptions<StringData> // Pass in search options
{
Top = 5, // Number of results to return
VectorProperty = (str) => str.Vector // The mapping to the vector property to compare on
},
ct).ConfigureAwait(false);
This search will return the 5 closest (that is, most similar in meaning) embedding vectors to the search vector stored in searchVectors[0].Vector. The closest vectors are found using the Euclidean distance formula, but with 384 dimensions per embedding instead of the 2 or 3 dimensions of ordinary space.
The output is returned as a VectorSearchResults<StringData> and the final output strings can be accessed like so:
var resultMessage = string.Join("\n\n", vectorSearchResults.Results.ToBlockingEnumerable().Select(r => r.Record.Text));
That final string will contain the five output strings with newlines in-between.
And that's all it takes to implement an effective semantic search with a local text embedding model!
This walkthrough was focused on generating text embeddings and using them to execute a semantic search, but there are other elements to factor in like error handling and updating the UI. You can see the full sample code and try it out yourself in the AI Dev Gallery.
Next Steps
If you enjoyed this post and want to see more of what's possible with local AI on Windows, check out the AI Dev Gallery.
If you have any feedback, want to open an issue, or want to contribute to the Gallery, please visit the GitHub repository.
Lastly, if you enjoyed this article, check out the others in the series for walkthroughs of other Gallery samples.
"}},"componentScriptGroups({\"componentId\":\"custom.widget.MicrosoftFooter\"})":{"__typename":"ComponentScriptGroups","scriptGroups":{"__typename":"ComponentScriptGroupsDefinition","afterInteractive":{"__typename":"PageScriptGroupDefinition","group":"AFTER_INTERACTIVE","scriptIds":[]},"lazyOnLoad":{"__typename":"PageScriptGroupDefinition","group":"LAZY_ON_LOAD","scriptIds":[]}},"componentScripts":[]},"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"components/community/NavbarDropdownToggle\"]})":[{"__ref":"CachedAsset:text:en_US-components/community/NavbarDropdownToggle-1745505307000"}],"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/users/UserAvatar\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/users/UserAvatar-1745505307000"}],"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/ranks/UserRankLabel\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/ranks/UserRankLabel-1745505307000"}],"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"components/users/UserRegistrationDate\"]})":[{"__ref":"CachedAsset:text:en_US-components/users/UserRegistrationDate-1745505307000"}],"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/nodes/NodeAvatar\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/nodes/NodeAvatar-1745505307000"}],"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/nodes/NodeDescription\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/nodes/NodeDescription-1745505307000"}],"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageListMenu\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageListMenu-1745505307000"}],"message({\"id\":\"message:4402953\"})":{"__ref":"BlogReplyMessage:message:4402953"},"cachedText({\"lastModified\":\"1745505307000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/nodes/NodeIcon\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/nodes/NodeIcon-1745505307000"}]},"Theme:customTheme1":{"__typename":"Theme","id":"customTheme1"},"User:user:-1":{"__typename":"User","id":"user:-1","uid":-1,"login":"Deleted","email":"","avatar":null,"rank":null,"kudosWeight":1,"registrationData":{"__typename":"RegistrationData","status":"ANONYMOUS","registrationTime":null,"confirmEmailStatus":false,"registrationAccessLevel":"VIEW","ssoRegistrationFields":[]},"ssoId":null,"profileSettings":{"__typename":"ProfileSettings","dateDisplayStyle":{"__typename":"InheritableStringSettingWithPossibleValues","key":"layout.friendly_dates_enabled","value":"false","localValue":"true","possibleValues":["true","false"]},"dateDisplayFormat":{"__typename":"InheritableStringSetting","key":"layout.format_pattern_date","value":"MMM dd 
yyyy","localValue":"MM-dd-yyyy"},"language":{"__typename":"InheritableStringSettingWithPossibleValues","key":"profile.language","value":"en-US","localValue":null,"possibleValues":["en-US","es-ES"]},"repliesSortOrder":{"__typename":"InheritableStringSettingWithPossibleValues","key":"config.user_replies_sort_order","value":"DEFAULT","localValue":"DEFAULT","possibleValues":["DEFAULT","LIKES","PUBLISH_TIME","REVERSE_PUBLISH_TIME"]}},"deleted":false},"CachedAsset:pages-1746564001512":{"__typename":"CachedAsset","id":"pages-1746564001512","value":[{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"BlogViewAllPostsPage","type":"BLOG","urlPath":"/category/:categoryId/blog/:boardId/all-posts/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"CasePortalPage","type":"CASE_PORTAL","urlPath":"/caseportal","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"CreateGroupHubPage","type":"GROUP_HUB","urlPath":"/groups/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"CaseViewPage","type":"CASE_DETAILS","urlPath":"/case/:caseId/:caseNumber","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"InboxPage","type":"COMMUNITY","urlPath":"/inbox","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"HelpFAQPage","type":"COMMUNITY","urlPath":"/help","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"IdeaMessagePage","type":"IDEA_POST","urlPath":"/idea/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"IdeaViewAllIdeasPage","type":"IDEA","urlPath":"/category/:categoryId/ideas/:boardId/all-ideas/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"LoginPage","type":"USER","urlPath":"/signin","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"BlogPostPage","type":"BLOG","urlPath":"/category/:categoryId/blogs/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"UserBlogPermissions.Page","type":"COMMUNITY","urlPath":"/c/user-blog-permissions/page","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ThemeEditorPage","type":"COMMUNITY","urlPath":"/designer/themes","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TkbViewAllArticlesPage","type":"TKB","urlPath":"/category/:categoryId/kb/:boardId/all-articles/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1730819800000,"localOverride":null,"page":{"id":"AllEvents","type":"CUSTOM","urlPath":"/Events","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"OccasionEditPage","type":"EVENT","urlPath":"/event/:boardId/:messageSubject/:message
Id/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"OAuthAuthorizationAllowPage","type":"USER","urlPath":"/auth/authorize/allow","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"PageEditorPage","type":"COMMUNITY","urlPath":"/designer/pages","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"PostPage","type":"COMMUNITY","urlPath":"/category/:categoryId/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ForumBoardPage","type":"FORUM","urlPath":"/category/:categoryId/discussions/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TkbBoardPage","type":"TKB","urlPath":"/category/:categoryId/kb/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"EventPostPage","type":"EVENT","urlPath":"/category/:categoryId/events/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"UserBadgesPage","type":"COMMUNITY","urlPath":"/users/:login/:userId/badges","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"GroupHubMembershipAction","type":"GROUP_HUB","urlPath":"/membership/join/:nodeId/:membershipType","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"MaintenancePage","type":"COMMUNITY","urlPath":"/maintenance","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"IdeaReplyPage","type":"IDEA_REPLY","urlPath":"/idea/:boardId/:messageSubject/:messageId/comments/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"UserSettingsPage","type":"USER","urlPath":"/mysettings/:userSettingsTab","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"GroupHubsPage","type":"GROUP_HUB","urlPath":"/groups","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ForumPostPage","type":"FORUM","urlPath":"/category/:categoryId/discussions/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"OccasionRsvpActionPage","type":"OCCASION","urlPath":"/event/:boardId/:messageSubject/:messageId/rsvp/:responseType","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"VerifyUserEmailPage","type":"USER","urlPath":"/verifyemail/:userId/:verifyEmailToken","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"AllOccasionsPage","type":"OCCASION","urlPath":"/category/:categoryId/events/:boardId/all-events/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"EventBoardPage","type":"E
VENT","urlPath":"/category/:categoryId/events/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TkbReplyPage","type":"TKB_REPLY","urlPath":"/kb/:boardId/:messageSubject/:messageId/comments/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"IdeaBoardPage","type":"IDEA","urlPath":"/category/:categoryId/ideas/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"CommunityGuideLinesPage","type":"COMMUNITY","urlPath":"/communityguidelines","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"CaseCreatePage","type":"SALESFORCE_CASE_CREATION","urlPath":"/caseportal/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TkbEditPage","type":"TKB","urlPath":"/kb/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ForgotPasswordPage","type":"USER","urlPath":"/forgotpassword","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"IdeaEditPage","type":"IDEA","urlPath":"/idea/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TagPage","type":"COMMUNITY","urlPath":"/tag/:tagName","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"BlogBoardPage","type":"BLOG","urlPath":"/category/:categoryId/blog/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"OccasionMessagePage","type":"OCCASION_TOPIC","urlPath":"/event/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ManageContentPage","type":"COMMUNITY","urlPath":"/managecontent","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ClosedMembershipNodeNonMembersPage","type":"GROUP_HUB","urlPath":"/closedgroup/:groupHubId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"CommunityPage","type":"COMMUNITY","urlPath":"/","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ForumMessagePage","type":"FORUM_TOPIC","urlPath":"/discussions/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"IdeaPostPage","type":"IDEA","urlPath":"/category/:categoryId/ideas/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1730819800000,"localOverride":null,"page":{"id":"CommunityHub.Page","type":"CUSTOM","urlPath":"/Directory","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"BlogMessagePage","type":"BLOG_ARTICLE","urlPath":"/blog/:boardId/:messageS
ubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"RegistrationPage","type":"USER","urlPath":"/register","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"EditGroupHubPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ForumEditPage","type":"FORUM","urlPath":"/discussions/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ResetPasswordPage","type":"USER","urlPath":"/resetpassword/:userId/:resetPasswordToken","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1730819800000,"localOverride":null,"page":{"id":"AllBlogs.Page","type":"CUSTOM","urlPath":"/blogs","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TkbMessagePage","type":"TKB_ARTICLE","urlPath":"/kb/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"BlogEditPage","type":"BLOG","urlPath":"/blog/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ManageUsersPage","type":"USER","urlPath":"/users/manage/:tab?/:manageUsersTab?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ForumReplyPage","type":"FORUM_REPLY","urlPath":"/discussions/:boardId/:messageSubject/:messageId/replies/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"PrivacyPolicyPage","type":"COMMUNITY","urlPath":"/privacypolicy","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"NotificationPage","type":"COMMUNITY","urlPath":"/notifications","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"UserPage","type":"USER","urlPath":"/users/:login/:userId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"HealthCheckPage","type":"COMMUNITY","urlPath":"/health","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"OccasionReplyPage","type":"OCCASION_REPLY","urlPath":"/event/:boardId/:messageSubject/:messageId/comments/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ManageMembersPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId/manage/:tab?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"SearchResultsPage","type":"COMMUNITY","urlPath":"/search","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"BlogReplyPage","type":"BLOG_REPLY","urlPath":"/blog/:boardId/:messageSubject/:messageId/replies/:replyId","__typename":"PageDescr
iptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"GroupHubPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TermsOfServicePage","type":"COMMUNITY","urlPath":"/termsofservice","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"CategoryPage","type":"CATEGORY","urlPath":"/category/:categoryId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"ForumViewAllTopicsPage","type":"FORUM","urlPath":"/category/:categoryId/discussions/:boardId/all-topics/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"TkbPostPage","type":"TKB","urlPath":"/category/:categoryId/kbs/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1746564001512,"localOverride":null,"page":{"id":"GroupHubPostPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"}],"localOverride":false},"CachedAsset:text:en_US-components/context/AppContext/AppContextProvider-0":{"__typename":"CachedAsset","id":"text:en_US-components/context/AppContext/AppContextProvider-0","value":{"noCommunity":"Cannot find community","noUser":"Cannot find current user","noNode":"Cannot find node with id {nodeId}","noMessage":"Cannot find message with id {messageId}","userBanned":"We're sorry, but you have been banned from using this site.","userBannedReason":"You have been banned for the following reason: 
{reason}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/common/Loading/LoadingDot-0":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/common/Loading/LoadingDot-0","value":{"title":"Loading..."},"localOverride":false},"AssociatedImage:{\"url\":\"https://techcommunity.microsoft.com/t5/s/gxcuf89792/images/cmstNC05WEo0blc\"}":{"__typename":"AssociatedImage","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/images/cmstNC05WEo0blc","height":512,"width":512,"mimeType":"image/png"},"Rank:rank:4":{"__typename":"Rank","id":"rank:4","position":6,"name":"Microsoft","color":"333333","icon":{"__ref":"AssociatedImage:{\"url\":\"https://techcommunity.microsoft.com/t5/s/gxcuf89792/images/cmstNC05WEo0blc\"}"},"rankStyle":"OUTLINE"},"User:user:2504992":{"__typename":"User","id":"user:2504992","uid":2504992,"login":"zteutsch","deleted":false,"avatar":{"__typename":"UserAvatar","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/m_assets/avatars/default/avatar-9.svg?time=0"},"rank":{"__ref":"Rank:rank:4"},"email":"","messagesCount":4,"biography":null,"topicsCount":4,"kudosReceivedCount":9,"kudosGivenCount":0,"kudosWeight":1,"registrationData":{"__typename":"RegistrationData","status":null,"registrationTime":"2024-06-03T10:01:31.313-07:00","confirmEmailStatus":null},"followersCount":null,"solutionsCount":0},"Category:category:Azure":{"__typename":"Category","id":"category:Azure","entityType":"CATEGORY","displayId":"Azure","nodeType":"category","depth":3,"title":"Azure","shortTitle":"Azure","parent":{"__ref":"Category:category:products-services"},"categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:top":{"__typename":"Category","id":"category:top","entityType":"CATEGORY","displayId":"top","nodeType":"category","depth":0,"title":"Top","shortTitle":"Top"},"Category:category:communities":{"__typename":"Category","id":"category:communities","entityType":"CATEGORY","displayId":"communities","nodeType":"category","depth":1,"parent":{"__ref":"Category:category:top"},"title":"Communities","shortTitle":"Communities"},"Category:category:products-services":{"__typename":"Category","id":"category:products-services","entityType":"CATEGORY","displayId":"products-services","nodeType":"category","depth":2,"parent":{"__ref":"Category:category:communities"},"title":"Products","shortTitle":"Products"},"Blog:board:AzureDevCommunityBlog":{"__typename":"Blog","id":"board:AzureDevCommunityBlog","entityType":"BLOG","displayId":"AzureDevCommunityBlog","nodeType":"board","depth":4,"conversationStyle":"BLOG","repliesProperties":{"__typename":"RepliesProperties","sortOrder":"REVERSE_PUBLISH_TIME","repliesFormat":"threaded"},"tagProperties":{"__typename":"TagNodeProperties","tagsEnabled":{"__typename":"PolicyResult","failureReason":null}},"requireTags":false,"tagType":"PRESET_ONLY","description":"","title":"Microsoft Developer Community Blog","shortTitle":"Microsoft Developer Community 
Blog","parent":{"__ref":"Category:category:Azure"},"ancestors":{"__typename":"CoreNodeConnection","edges":[{"__typename":"CoreNodeEdge","node":{"__ref":"Community:community:gxcuf89792"}},{"__typename":"CoreNodeEdge","node":{"__ref":"Category:category:communities"}},{"__typename":"CoreNodeEdge","node":{"__ref":"Category:category:products-services"}},{"__typename":"CoreNodeEdge","node":{"__ref":"Category:category:Azure"}}]},"userContext":{"__typename":"NodeUserContext","canAddAttachments":false,"canUpdateNode":false,"canPostMessages":false,"isSubscribed":false},"theme":{"__ref":"Theme:customTheme1"},"boardPolicies":{"__typename":"BoardPolicies","canViewSpamDashBoard":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.feature.moderation_spam.action.access_spam_quarantine.allowed.accessDenied","key":"error.lithium.policies.feature.moderation_spam.action.access_spam_quarantine.allowed.accessDenied","args":[]}},"canArchiveMessage":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.content_archivals.enable_content_archival_settings.accessDenied","key":"error.lithium.policies.content_archivals.enable_content_archival_settings.accessDenied","args":[]}},"canPublishArticleOnCreate":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.forums.policy_can_publish_on_create_workflow_action.accessDenied","key":"error.lithium.policies.forums.policy_can_publish_on_create_workflow_action.accessDenied","args":[]}}},"eventPath":"category:Azure/category:products-services/category:communities/community:gxcuf89792board:AzureDevCommunityBlog/"},"BlogTopicMessage:message:4400142":{"__typename":"BlogTopicMessage","uid":4400142,"subject":"Semantic Search with the AI Dev Gallery","id":"message:4400142","revisionNum":15,"repliesCount":1,"author":{"__ref":"User:user:2504992"},"depth":0,"hasGivenKudo":false,"board":{"__ref":"Blog:board:AzureDevCommunityBlog"},"conversation":{"__ref":"Conversation:conversation:4400142"},"messagePolicies":{"__typename":"MessagePolicies","canPublishArticleOnEdit":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.forums.policy_can_publish_on_edit_workflow_action.accessDenied","key":"error.lithium.policies.forums.policy_can_publish_on_edit_workflow_action.accessDenied","args":[]}},"canModerateSpamMessage":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.feature.moderation_spam.action.moderate_entity.allowed.accessDenied","key":"error.lithium.policies.feature.moderation_spam.action.moderate_entity.allowed.accessDenied","args":[]}}},"contentWorkflow":{"__typename":"ContentWorkflow","state":"PUBLISH","scheduledPublishTime":null,"scheduledTimezone":null,"userContext":{"__typename":"MessageWorkflowContext","canSubmitForReview":null,"canEdit":false,"canRecall":null,"canSubmitForPublication":null,"canReturnToAuthor":null,"canPublish":null,"canReturnToReview":null,"canSchedule":false},"shortScheduledTimezone":null},"readOnly":false,"editFrozen":false,"moderationData":{"__ref":"ModerationData:moderation_data:4400142"},"teaser":"","body":"
This blog post is the second in a series spotlighting the local AI samples contained in the AI Dev Gallery. The Gallery is a preview project that aims to showcase local AI scenarios on Windows and to give developers the guidance they need to enable those scenarios themselves. The Gallery is open-source and contains a wide selection of different models and samples, including text, image, audio, and video use cases. The first blog post in the series, on Pose Estimation, can be found here.
\n
This second post will cover how to implement a semantic search with a locally running text embedding model. The Gallery currently includes the all-MiniLM-L6-v2 and all-MiniLM-L12-v2 models for text embedding tasks, and these models are run in the app via ONNX Runtime.
\n
Semantic Search at a High Level
\n
Note: If you're familiar with embedding models already, and just want to see code, skip to the Code Walkthrough section.
\n
At its core, a semantic search is a search that matches meaning rather than directly matching text or text similarity. For example, if a user did a search with the query term \"sports,\" and the source text only contained the word \"athletics,\" a traditional search would find no matches, while a semantic search would surface the relevant text results. Semantic matching enables a robust version of searching that more closely parallels a human view of information, taking into account meaning, context, and nuance.
\n
Semantic searches are almost always enabled by some sort of text embedding model.
\n
How do text embedding models work?
\n
Text embedding models convert source text to a numerical representation of the meaning of that text. Instead of a string, you will have a vector of numerical values that captures as much of the original meaning of the text as possible.
\n
It's important to note that text embeddings models usually create vectors for each token in a sequence, where a token is usually a single word but may also be a portion of a word, punctuation, or any other piece of text. For example, if the input sequence is \"I love basketball\", the model may output embeddings vectors for \"I\", \"love\", \"basket\", and \"ball\", depending on how the input is broken into tokens. These tokens will then each have an embedding vector generated for them, each of the same length:
\n\n
Another key thing to know about of each of these vectors is that embedding models are usually designed to account for context when generating output. This means that the model will also consider the tokens before and after a particular token, to make sure that meaning is captured in context. In the example above, this is how the model would know that \"basket\" is referring to the goal in the sport of basketball, rather than a basket for holding fruit, even though it was separated as its own token.
\n
Comparing Text Embeddings
\n
Because text embeddings from the same model have the same length, comparing them is relatively easy. Calculate the numerical distance between two embedding vectors, and that will correspond with the semantic distance. The larger the distance between two vector representations, the farther apart in embedded meaning the two texts are. And the inverse is true as well: smaller distance between embedding vectors means closer meaning.
\n
If we were comparing the tokens above to find the most similar token to \"ball\":
\n\n
The distances (found using the Euclidean distance formula) between \"ball\" and each other token in the sequence is calculated and compared. The token with the smallest distance (in this case \"basket\") is the most semantically similar token.
\n
But what if we wanted to compare the entire sequence \"I love basketball\" with another sequence, like \"I love football\"? Longer sequences can be compared using mean pooling: averaging out all the token embeddings to get one for the entire sequence. Once the token embeddings have been averaged, the comparison is identical to the one described above.
\n
Enabling Semantic Search with Text Embedding
\n
Given that we have a model that can produce embeddings for input text, the algorithm for semantic search is fairly straightforward. Let's assume we already have a sourceText string and a searchQuery string:
\n\n
Assuming sourceText is on the longer side, it must be broken it up into small enough pieces that the embedding model can handle.
\n
Once the input has been chunked, both sourceText and searchQuery must first be tokenized before they can be passed to the embeddings model. In this case, the BERT tokenizer is used.
\n
The tokenized input is passed to an ONNX inference session that runs the embeddings model with the desired inputs.
\n
The text embeddings are output by the inference session.
\n
These are token-by-token embeddings, so the embeddings for each sequence must be averaged to get an embedding for the entire string.
\n
Now that sourceText and searchQuery have been embedded, a search can be performed to find the chunks of text from sourceText that most closely match the meaning of the searchQuery. This is done by calculating the distance between two embeddings.
\n
The closest matches are output as semantic matches for the searchQuery.
\n\n
The code walkthrough below will cover key steps like how to instantiate the inference session and tokenizer, generate embeddings, and search for the closest match. The full code can also be viewed alongside the sample within the AI Dev Gallery, or in the GitHub repository.
\n
Code Walkthrough
\n
This code walkthrough will cover the key logic for generating text embeddings and using them to execute a semantic search. Let's start with setting up the inference session and tokenizer.
\n
Instantiating the inference session and tokenizer
\n
The ONNX InferenceSession and the BertTokenizer instantiated in the constructor for the EmbeddingGenerator class:
\n
// Create default session options\n_sessionOptions = new SessionOptions();\n\n// Create a new ONNX inference session\n_inferenceSession = new InferenceSession(Path.Join(modelPath, \"onnx\", \"model.onnx\"), _sessionOptions);\n\n// Create a new BERT tokenizer\n_tokenizer = BertTokenizer.Create(Path.Join(modelPath, \"vocab.txt\"));
\n
In this code:
\n\n
A new SessionOptions object is created that will be passed to the inference session.
\n
A new InferenceSession is instantiated that is passed the model path and the session options.
\n
A new BertTokenizer is instantiated that is passed the vocabulary file that shipped with the embedding model.
\n\n
The InferenceSession and BertTokenizer will be referenced in the following code to handle embedding generation.
\n
Generating Embeddings
\n
Embedding generation/inference is handled in the GetVectorsAsync function, which takes in an IEnumerable<string> values and a RunOptions run options:
\n
private async Task<float[][]> GetVectorsAsync(IEnumerable<string> values, RunOptions runOptions)\n{\n // Clean input strings of \"unprintable\" characters\n values = values.Select(s => MyRegex().Replace(s, string.Empty));\n\n // Encode string enumerable into a EmbeddingModelInputs with the BERT tokenizer\n IEnumerable<EmbeddingModelInput> encoded = _tokenizer.EncodeBatch(values);\n\n // Store total values count\n int count = values.Count();
\n
At the start of the GetVectorsAsync function:
\n\n
The input is cleaned of any unprintable control characters that could causes issues for the tokenizer. MyRegex matches these characters.
\n
The input is encoded with the BERT tokenizer into an enumerable of EmbeddingModelInput that can be used by the embedding model.
\n\n
Next, the list of EmbeddingModelInputs is flattened to a single input:
InputIds: The input IDs represents the tokenized version of the string that was passed to the tokenizer. Each token is assigned a ID that correlates to a particular token in our array. You can think of this as the actual data that is being processed.
\n
AttentionMask: The attention mask is an array of values of either 1 or 0 that tells the embedding model which tokens to pay attention to. Tokens that are assigned 1 are processed while tokens that are assigned 0 are ignored. This is necessary because the tokenizer encodes all strings in a batch to be of equal length, and will pad shorter strings to match the lengths. The attention mask tells the embedding model to ignore the padding tokens.
\n
TokenTypes: The token types array assigns \"types\" to each token in the input. For certain use cases, this can allow separator tokens to be denoted as different than normal input tokens. For now, this can mostly be ignored as it doesn't come up much when just dealing with plain strings.
\n
\n
Now that the input is flattened, it needs to converted to OrtValues so that it matches ONNX's expected input format:
\n
// Get the length of each of the input sequences by dividing the total length by the count\n int sequenceLength = input.InputIds.Length / count;\n\n // Create input tensors over the input data using count and sequence length to define the shape\n using var inputIdsOrtValue = OrtValue.CreateTensorValueFromMemory(\n input.InputIds,\n [count, sequenceLength]);\n\n using var attMaskOrtValue = OrtValue.CreateTensorValueFromMemory(\n input.AttentionMask,\n [count, sequenceLength]);\n\n using var typeIdsOrtValue = OrtValue.CreateTensorValueFromMemory(\n input.TokenTypeIds,\n [count, sequenceLength]);
\n
Each of the fields of the flattened EmbeddingModelInput is passed to CreateTensorValueFromMemory along with a shape define as [count, sequenceLength]. The input IDs, attention mask, and token types will each get their own input tensor with `count` number of entries, all of the same length sequenceLength.
\n
Next, these values are added to a list and a corresponding list of input names is created:
\n
// Input names that correspond with input OrtValues, ONNX expects this\n var inputNames = new List<string>\n {\n \"input_ids\",\n \"attention_mask\",\n \"token_type_ids\"\n };\n\n // Move OrtValues that were created to a list\n var inputs = new List<OrtValue>\n {\n { inputIdsOrtValue },\n { attMaskOrtValue },\n { typeIdsOrtValue }\n };
\n
Both of these lists will be passed when inference is called to tell ONNX how to process the input. The last thing left before embeddings are generated is to define an output tensor:
\n
// Allocate an output tensor of the expected dimensionality (Sequence Count x Sequence Length x 384)\n using var output = OrtValue.CreateAllocatedTensorValue(OrtAllocator.DefaultInstance, TensorElementType.Float, [count, sequenceLength, 384]);
\n
This tensor has a similar shape to the input, but the third dimension is specified as 384, which is the number of float values that define each individual token embedding. For every sequence we inferred on, there will be 384 output values for every token in that sequence: Number of Sequences x Number of Tokens in each sequence x 384 embedding values.
\n
The last thing left to do is call for inference:
\n
// Run inference using the input and output fields that were defined in the above steps\n await _inferenceSession.RunAsync(runOptions, inputNames, inputs, _inferenceSession.OutputNames, [output]);
\n
And then process the output:
\n
// Get the type and shape of the output to help with post-processing operations\n var typeAndShape = output.GetTensorTypeAndShape();\n\n // Mean pool token-level embeddings to get sequence-level embeddings\n ReadOnlyTensorSpan<float> sentence_embeddings = MeanPooling(output.GetTensorDataAsSpan<float>(), input.AttentionMask, typeAndShape.Shape);\n\n // Normalize final embedding values to smooth out outliers\n float[] resultArray = NormalizeSentenceEmbeddings(sentence_embeddings, typeAndShape.Shape);
\n
The output processing for sentence embedding involves quite a bit of tensor math, so we can break down what these functions do at a high-level and link out to the source code if you want to dive into them deeper.
\n\n
Mean Pooling: This step is necessary to get from token-level embeddings to sequence-level embeddings. Essentially, the model creates an embedding for every individual token in a sequence. If we want embeddings for each full sequence instead, we need to average out the token embedding values for every single token in each sequence. This is what mean pooling does: it reduces the dimensionality of the output from Sequence Count x Sequence Length x 384 to Sequence Count x 384 through averaging. After mean pooling the embeddings, there is a single embedding for each sequence, rather than an embedding for every token in that sequence. View the code for this function for implementation details.
\n
Normalization: This step smooths the sequence-level embeddings to prevent outliers and data anomalies from affecting embedding comparisons later on in our code. Every embedding is divided by the square root of the sum of all its squared values. This normalizes each embedding relative to its own scale, preventing particularly large or small embeddings from shifting comparisons. View the code for this function for implementation details.
\n\n
Once the output has been pooled and normalized, it just needs to be chunked and then returned:
The call to Enumerable.Chunk ensures that each float array that we return is the proper size for a single embedding.
\n
Creating GeneratedEmbeddings
\n
Let's take a look at how we build properly formatted embeddings from the float array output of GetVectorsAsync:
\n
// Make the call to GetVectorsAsync to get the embedding data\nfloat[][] vectors = await GetVectorsAsync(values, runOptions).ConfigureAwait(false);\n\n// Instantiate a new GeneratedEmbeddings object to store each float Embedding\nvar generatedEmbeddings = new GeneratedEmbeddings<Embedding<float>>();\n\n// Add each vector as a new Embedding<float> to the GeneratedEmbeddings\ngeneratedEmbeddings.AddRange(vectors.Select(x => new Embedding<float>(x)));
\n
GetVectorsAsync outputs a two-dimensional float array. Each sub-array represents a single pooled embedding and is added to the GeneratedEmbeddings as an Embedding<float>. It's important to note that the order of these are preserved, so the text that corresponds with the embedding can be found via indexing the original source content.
\n
Next, let's look at how we search for semantic matches.
\n
Putting It All Together: Semantic Search
\n
We can use the Microsoft.Extensions.VectorData library to do a lot of the heavy lifting for implementing a vector search functionality.
\n
To start, an InMemoryVectorStore is instantiated, which is used to fetch/create a IVectorStoreRecordCollection:
\n
// Create an InMemoryVectorStore\nIVectorStore? vectorStore = new InMemoryVectorStore();\n\n// Query the vector store for the VectoreStoreRecordCollection associated with the embeddings\nIVectorStoreRecordCollection<int, StringData> embeddingsCollection = vectorStore.GetCollection<int, StringData>(\"embeddings\");\n\n// Double check that that collection exists and create it if not\nawait embeddingsCollection.CreateCollectionIfNotExistsAsync(ct).ConfigureAwait(false);
\n
The above code sets up a IVectorStoreRecordCollection<int, StringData> that will be used to store the embeddings. This interface includes defines the VectorizedSearch function which can be used to perform a semantic search with embeddings.
\n
Next, the input text is chunked to meet length requirements and passed to the embeddings generator:
\n
// Chunk the input source text under a maximum length requirement, in this case, 512\nList<string> sourceContent = ChunkSourceText(sourceText, 512);\n\n// Generate embeddings for both the search text and the source content\n// GenerateAsync will eventually make a call to GetVectorsAsync, where inference and processing was implemented\nGeneratedEmbeddings<Embedding<float>> searchVectors = await _embeddings.GenerateAsync([searchText], null, ct).ConfigureAwait(false);\nGeneratedEmbeddings<Embedding<float>> sourceVectors = await _embeddings.GenerateAsync(sourceContent, null, ct).ConfigureAwait(false);
\n
After this code executes, searchVectors and sourceVectors contain the embedding vector data for both the search text and source content, respectively.
\n
Now, the sourceVectors are upserted into the IVectorStoreRecordCollection:
\n
// Call to UpsertBatchAsync\nawait foreach (var key in embeddingsCollection.UpsertBatchAsync(\n // Map each source vector to a StringData object to add to the collection\n sourceVectors.Select((x, i) => new StringData\n {\n Key = i,\n Text = sourceContent[i], \n Vector = x.Vector\n }),\n ct).ConfigureAwait(false))\n{\n // No action here, just iterating over entire enumerable.\n}
\n
Every embedding that was generated from the source content will have a corresponding entry in the record collection.
\n
We can now finally execute our semantic search using the VectorizedSearch function exposed through the IVectorStoreRecordCollection interface:
\n
// Call to the vectorized search via the embeddings collection\nVectorSearchResults<StringData> vectorSearchResults = await embeddingsCollection.VectorizedSearchAsync(\n searchVectors[0].Vector, // Pass in the search vector\n new VectorSearchOptions<StringData> // Pass in search options\n {\n Top = 5, // Number of results to return\n VectorProperty = (str) => str.Vector // The mapping to the vector property to compare on\n },\n ct).ConfigureAwait(false);
\n
This search will return the 5 closest (or most similar in meaning) embedding vectors to the search vector stored in searchVectors[0].Vector. The closest vectors are found using the Euclidean distance formula, but with 384 dimensions for each embedding value instead of 2 or 3 as in real space.
\n
The output is returned as a VectorSearchResults<StringData> and the final output strings can be accessed like so:
\n
var resultMessage = string.Join(\"\\n\\n\", vectorSearchResults.Results.ToBlockingEnumerable().Select(r => r.Record.Text));
\n
That final string will contain the five output strings with newlines in-between.
\n
And that's all it takes to implement an effective semantic search with a local text embedding model!
\n
This walkthrough was focused on generating text embeddings and using them to execute a semantic search, but there are other elements to factor in like error handling and updating the UI. You can see the full sample code and try it out yourself in the AI Dev Gallery.
\n
Next Steps
\n
If you enjoyed this post and want to see more of what's possible with local AI on Windows, check out the AI Dev Gallery.
\n
If you have any feedback, want to open an issue, or contribute to the Gallery, please visit the Github Repository.
\n
Lastly, if you enjoyed this article, check out the others in the series for walkthroughs of other Gallery samples:
This blog post is the second in a series spotlighting the local AI samples contained in the AI Dev Gallery. The Gallery is a preview project that aims to showcase local AI scenarios on Windows and to give developers the guidance they need to enable those scenarios themselves. The Gallery is open-source and contains a wide selection of different models and samples, including text, image, audio, and video use cases. The first blog post in the series, on Pose Estimation, can be found here.
\n
This second post will cover how to implement a semantic search with a locally running text embedding model. The Gallery currently includes the all-MiniLM-L6-v2 and all-MiniLM-L12-v2 models for text embedding tasks, and these models are run in the app via ONNX Runtime.
\n
Semantic Search at a High Level
\n
Note: If you're familiar with embedding models already, and just want to see code, skip to the Code Walkthrough section.
\n
At its core, a semantic search is a search that matches meaning rather than directly matching text or text similarity. For example, if a user did a search with the query term \"sports,\" and the source text only contained the word \"athletics,\" a traditional search would find no matches, while a semantic search would surface the relevant text results. Semantic matching enables a robust version of searching that more closely parallels a human view of information, taking into account meaning, context, and nuance.
\n
Semantic searches are almost always enabled by some sort of text embedding model.
\n
How do text embedding models work?
\n
Text embedding models convert source text to a numerical representation of the meaning of that text. Instead of a string, you will have a vector of numerical values that captures as much of the original meaning of the text as possible.
\n
It's important to note that text embeddings models usually create vectors for each token in a sequence, where a token is usually a single word but may also be a portion of a word, punctuation, or any other piece of text. For example, if the input sequence is \"I love basketball\", the model may output embeddings vectors for \"I\", \"love\", \"basket\", and \"ball\", depending on how the input is broken into tokens. These tokens will then each have an embedding vector generated for them, each of the same length:
\n\n
Another key thing to know about of each of these vectors is that embedding models are usually designed to account for context when generating output. This means that the model will also consider the tokens before and after a particular token, to make sure that meaning is captured in context. In the example above, this is how the model would know that \"basket\" is referring to the goal in the sport of basketball, rather than a basket for holding fruit, even though it was separated as its own token.
\n
Comparing Text Embeddings
\n
Because text embeddings from the same model have the same length, comparing them is relatively easy. Calculate the numerical distance between two embedding vectors, and that will correspond with the semantic distance. The larger the distance between two vector representations, the farther apart in embedded meaning the two texts are. And the inverse is true as well: smaller distance between embedding vectors means closer meaning.
\n
If we were comparing the tokens above to find the most similar token to \"ball\":
\n\n
The distances (found using the Euclidean distance formula) between \"ball\" and each other token in the sequence is calculated and compared. The token with the smallest distance (in this case \"basket\") is the most semantically similar token.
\n
But what if we wanted to compare the entire sequence "I love basketball" with another sequence, like "I love football"? Longer sequences can be compared using mean pooling: averaging all of the token embeddings to produce a single embedding for the entire sequence. Once the token embeddings have been averaged, the comparison is identical to the one described above.

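To make that concrete, here is a small, self-contained sketch (an illustration with tiny made-up token embeddings, not output from a real model) that mean-pools two token-level embedding sets and then compares the resulting sequence embeddings with Euclidean distance:

// Mean-pool token embeddings into a single sequence embedding by averaging each dimension.
static float[] MeanPool(float[][] tokenEmbeddings)
{
    int dimensions = tokenEmbeddings[0].Length;
    var pooled = new float[dimensions];

    foreach (float[] token in tokenEmbeddings)
    {
        for (int i = 0; i < dimensions; i++)
        {
            pooled[i] += token[i];
        }
    }

    for (int i = 0; i < dimensions; i++)
    {
        pooled[i] /= tokenEmbeddings.Length;
    }

    return pooled;
}

// Euclidean distance between two equal-length embeddings.
static double Distance(float[] a, float[] b)
{
    double sum = 0;
    for (int i = 0; i < a.Length; i++)
    {
        sum += (a[i] - b[i]) * (a[i] - b[i]);
    }

    return Math.Sqrt(sum);
}

// Made-up 3-dimensional token embeddings for "I", "love", and "basket" + "ball" / "foot" + "ball".
float[][] basketballTokens = { new[] { 0.1f, 0.9f, 0.2f }, new[] { 0.8f, 0.1f, 0.3f }, new[] { 0.4f, 0.5f, 0.9f } };
float[][] footballTokens = { new[] { 0.1f, 0.9f, 0.2f }, new[] { 0.8f, 0.1f, 0.3f }, new[] { 0.5f, 0.4f, 0.8f } };

float[] basketballEmbedding = MeanPool(basketballTokens);
float[] footballEmbedding = MeanPool(footballTokens);

// The smaller the distance, the closer the two sequences are in meaning.
Console.WriteLine($"Distance: {Distance(basketballEmbedding, footballEmbedding):F3}");
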
Enabling Semantic Search with Text Embedding

Given that we have a model that can produce embeddings for input text, the algorithm for semantic search is fairly straightforward. Let's assume we already have a sourceText string and a searchQuery string (a condensed sketch of the full flow follows these steps):

Assuming sourceText is on the longer side, it must be broken up into pieces small enough for the embedding model to handle.

Once the input has been chunked, both sourceText and searchQuery must be tokenized before they can be passed to the embeddings model. In this case, the BERT tokenizer is used.

The tokenized input is passed to an ONNX inference session that runs the embeddings model with the desired inputs.

The text embeddings are output by the inference session.

These are token-by-token embeddings, so the embeddings for each sequence must be averaged to get an embedding for the entire string.

Now that sourceText and searchQuery have been embedded, a search can be performed to find the chunks of text from sourceText that most closely match the meaning of the searchQuery. This is done by calculating the distance between the searchQuery embedding and each chunk's embedding.

The closest matches are output as semantic matches for the searchQuery.

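Put together, the flow looks roughly like the sketch below. The helper names here are placeholders for illustration rather than the sample's actual API; each real step is covered in the code walkthrough that follows.

// Condensed sketch of the semantic search flow (placeholder helper names).

// 1. Break the source text into chunks the embedding model can handle.
List<string> chunks = ChunkSourceText(sourceText, 512);

// 2. Embed the search query and every chunk: tokenize, run the ONNX model, then mean pool.
float[] queryEmbedding = (await GenerateEmbeddingsAsync(new List<string> { searchQuery }))[0];
float[][] chunkEmbeddings = await GenerateEmbeddingsAsync(chunks);

// 3. Rank the chunks by the distance between their embedding and the query embedding,
//    and return the closest ones as semantic matches.
IReadOnlyList<string> matches = FindClosestChunks(queryEmbedding, chunkEmbeddings, chunks, top: 5);
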
The code walkthrough below will cover key steps like how to instantiate the inference session and tokenizer, generate embeddings, and search for the closest match. The full code can also be viewed alongside the sample within the AI Dev Gallery, or in the GitHub repository.

Code Walkthrough

This code walkthrough will cover the key logic for generating text embeddings and using them to execute a semantic search. Let's start with setting up the inference session and tokenizer.

Instantiating the inference session and tokenizer

The ONNX InferenceSession and the BertTokenizer are instantiated in the constructor of the EmbeddingGenerator class:

// Create default session options
_sessionOptions = new SessionOptions();

// Create a new ONNX inference session
_inferenceSession = new InferenceSession(Path.Join(modelPath, "onnx", "model.onnx"), _sessionOptions);

// Create a new BERT tokenizer
_tokenizer = BertTokenizer.Create(Path.Join(modelPath, "vocab.txt"));

In this code:

A new SessionOptions object is created that will be passed to the inference session.

A new InferenceSession is instantiated that is passed the model path and the session options.

A new BertTokenizer is instantiated that is passed the vocabulary file that shipped with the embedding model.

The InferenceSession and BertTokenizer will be referenced in the following code to handle embedding generation.

Generating Embeddings

Embedding generation and inference are handled in the GetVectorsAsync function, which takes an IEnumerable<string> of values and a RunOptions object:

private async Task<float[][]> GetVectorsAsync(IEnumerable<string> values, RunOptions runOptions)
{
    // Clean input strings of "unprintable" characters
    values = values.Select(s => MyRegex().Replace(s, string.Empty));

    // Encode the string enumerable into EmbeddingModelInput objects with the BERT tokenizer
    IEnumerable<EmbeddingModelInput> encoded = _tokenizer.EncodeBatch(values);

    // Store the total values count
    int count = values.Count();

At the start of the GetVectorsAsync function:

The input is cleaned of any unprintable control characters that could cause issues for the tokenizer. MyRegex matches these characters (a sketch of such a regex follows this list).

The input is encoded with the BERT tokenizer into an enumerable of EmbeddingModelInput that can be used by the embedding model.

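The exact pattern behind MyRegex lives in the sample source. As a hypothetical sketch (the pattern below is an assumption, not the sample's actual expression), a source-generated regex that strips control characters could look like this:

// Hypothetical example: a source-generated regex matching Unicode "other" characters
// (control, format, and similar), used to scrub unprintable characters from the input.
// The containing class must be declared partial for the regex source generator to work.
[GeneratedRegex(@"\p{C}")]
private static partial Regex MyRegex();
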
Next, the list of EmbeddingModelInputs is flattened to a single input (a sketch of this step follows the field descriptions below):

InputIds: The input IDs represent the tokenized version of the string that was passed to the tokenizer. Each token is assigned an ID that corresponds to an entry in the tokenizer's vocabulary. You can think of this as the actual data that is being processed.

AttentionMask: The attention mask is an array of values of either 1 or 0 that tells the embedding model which tokens to pay attention to. Tokens that are assigned 1 are processed, while tokens that are assigned 0 are ignored. This is necessary because the tokenizer encodes all strings in a batch to the same length, padding shorter strings to match. The attention mask tells the embedding model to ignore the padding tokens.

TokenTypes: The token types array assigns "types" to each token in the input. For certain use cases, this allows separator tokens to be marked as different from normal input tokens. For now, this can mostly be ignored, as it doesn't come up much when dealing with plain strings.

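The flattening itself is simple concatenation. Assuming EmbeddingModelInput exposes its fields as writable arrays (an assumption for illustration; see the sample source for the real type), the step might look roughly like this:

// Illustrative sketch only: concatenate the per-string encodings into one flat input
// covering the whole batch, so a single set of tensors can be built for inference.
var input = new EmbeddingModelInput
{
    InputIds = encoded.SelectMany(e => e.InputIds).ToArray(),
    AttentionMask = encoded.SelectMany(e => e.AttentionMask).ToArray(),
    TokenTypeIds = encoded.SelectMany(e => e.TokenTypeIds).ToArray()
};
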
Now that the input is flattened, it needs to be converted to OrtValues so that it matches ONNX's expected input format:

    // Get the length of each of the input sequences by dividing the total length by the count
    int sequenceLength = input.InputIds.Length / count;

    // Create input tensors over the input data using count and sequence length to define the shape
    using var inputIdsOrtValue = OrtValue.CreateTensorValueFromMemory(
        input.InputIds,
        [count, sequenceLength]);

    using var attMaskOrtValue = OrtValue.CreateTensorValueFromMemory(
        input.AttentionMask,
        [count, sequenceLength]);

    using var typeIdsOrtValue = OrtValue.CreateTensorValueFromMemory(
        input.TokenTypeIds,
        [count, sequenceLength]);

Each field of the flattened EmbeddingModelInput is passed to CreateTensorValueFromMemory along with a shape defined as [count, sequenceLength]. The input IDs, attention mask, and token types each get their own input tensor with `count` entries, all of the same length sequenceLength.

Next, these values are added to a list and a corresponding list of input names is created:

    // Input names that correspond with the input OrtValues; ONNX expects this
    var inputNames = new List<string>
    {
        "input_ids",
        "attention_mask",
        "token_type_ids"
    };

    // Move the OrtValues that were created to a list
    var inputs = new List<OrtValue>
    {
        { inputIdsOrtValue },
        { attMaskOrtValue },
        { typeIdsOrtValue }
    };

Both of these lists will be passed when inference is called to tell ONNX how to process the input. The last thing left before embeddings are generated is to define an output tensor:

    // Allocate an output tensor of the expected dimensionality (Sequence Count x Sequence Length x 384)
    using var output = OrtValue.CreateAllocatedTensorValue(OrtAllocator.DefaultInstance, TensorElementType.Float, [count, sequenceLength, 384]);

This tensor has a similar shape to the input, but with an additional third dimension of 384, which is the number of float values that define each individual token embedding. For every sequence we ran inference on, there will be 384 output values for every token in that sequence: Number of Sequences x Number of Tokens in each sequence x 384 embedding values.

The last thing left to do is call for inference:

    // Run inference using the input and output fields that were defined in the above steps
    await _inferenceSession.RunAsync(runOptions, inputNames, inputs, _inferenceSession.OutputNames, [output]);

And then process the output:

    // Get the type and shape of the output to help with post-processing operations
    var typeAndShape = output.GetTensorTypeAndShape();

    // Mean pool token-level embeddings to get sequence-level embeddings
    ReadOnlyTensorSpan<float> sentence_embeddings = MeanPooling(output.GetTensorDataAsSpan<float>(), input.AttentionMask, typeAndShape.Shape);

    // Normalize final embedding values to smooth out outliers
    float[] resultArray = NormalizeSentenceEmbeddings(sentence_embeddings, typeAndShape.Shape);

The output processing for sentence embeddings involves quite a bit of tensor math, so we'll break down what these functions do at a high level here (simplified sketches of both follow the list below) and link out to the source code if you want to dive into them deeper.

Mean Pooling: This step is necessary to get from token-level embeddings to sequence-level embeddings. Essentially, the model creates an embedding for every individual token in a sequence. If we want embeddings for each full sequence instead, we need to average out the token embedding values for every single token in each sequence. This is what mean pooling does: it reduces the dimensionality of the output from Sequence Count x Sequence Length x 384 to Sequence Count x 384 through averaging. After mean pooling the embeddings, there is a single embedding for each sequence, rather than an embedding for every token in that sequence. View the code for this function for implementation details.

Normalization: This step smooths the sequence-level embeddings to prevent outliers and data anomalies from affecting embedding comparisons later on in our code. Every embedding is divided by the square root of the sum of all its squared values. This normalizes each embedding relative to its own scale, preventing particularly large or small embeddings from shifting comparisons. View the code for this function for implementation details.

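The Gallery implements both functions over tensor spans; as a rough, combined sketch of the same math over a flat float array (an illustration under simplified assumptions, not the sample's code), mean pooling and normalization look like this:

// Simplified sketch: mean pool and normalize a flat model output shaped
// [sequenceCount, sequenceLength, hiddenSize]. Padding tokens (mask == 0) are
// excluded from the average by consulting the attention mask.
static float[] MeanPoolAndNormalize(ReadOnlySpan<float> modelOutput, long[] attentionMask, int sequenceCount, int sequenceLength, int hiddenSize = 384)
{
    var result = new float[sequenceCount * hiddenSize];

    for (int s = 0; s < sequenceCount; s++)
    {
        // Sum the embeddings of the attended (non-padding) tokens in this sequence.
        float attendedTokens = 0;
        for (int t = 0; t < sequenceLength; t++)
        {
            if (attentionMask[(s * sequenceLength) + t] == 0)
            {
                continue;
            }

            attendedTokens++;
            int tokenOffset = ((s * sequenceLength) + t) * hiddenSize;
            for (int h = 0; h < hiddenSize; h++)
            {
                result[(s * hiddenSize) + h] += modelOutput[tokenOffset + h];
            }
        }

        // Mean pooling: divide by the number of attended tokens.
        double sumOfSquares = 0;
        for (int h = 0; h < hiddenSize; h++)
        {
            result[(s * hiddenSize) + h] /= Math.Max(attendedTokens, 1e-9f);
            sumOfSquares += result[(s * hiddenSize) + h] * result[(s * hiddenSize) + h];
        }

        // Normalization: divide by the square root of the sum of squared values (the L2 norm).
        float norm = (float)Math.Sqrt(Math.Max(sumOfSquares, 1e-12));
        for (int h = 0; h < hiddenSize; h++)
        {
            result[(s * hiddenSize) + h] /= norm;
        }
    }

    return result;
}

This returns one 384-value embedding per sequence, laid out end to end, which is the shape the chunking step below expects.
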
Once the output has been pooled and normalized, it just needs to be chunked and then returned. The call to Enumerable.Chunk ensures that each float array we return is the proper size for a single embedding.

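In the sample this final step is essentially a one-liner; assuming resultArray holds the flat sequenceCount x 384 output from the previous step, a sketch of it might look like this:

// Split the flat, normalized output into one 384-element array per sequence and return it.
return resultArray.Chunk(384).ToArray();
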
Creating GeneratedEmbeddings

Let's take a look at how we build properly formatted embeddings from the float array output of GetVectorsAsync:

// Make the call to GetVectorsAsync to get the embedding data
float[][] vectors = await GetVectorsAsync(values, runOptions).ConfigureAwait(false);

// Instantiate a new GeneratedEmbeddings object to store each float Embedding
var generatedEmbeddings = new GeneratedEmbeddings<Embedding<float>>();

// Add each vector as a new Embedding<float> to the GeneratedEmbeddings
generatedEmbeddings.AddRange(vectors.Select(x => new Embedding<float>(x)));

GetVectorsAsync outputs a two-dimensional float array. Each sub-array represents a single pooled embedding and is added to the GeneratedEmbeddings as an Embedding<float>. It's important to note that the order of these embeddings is preserved, so the text that corresponds to an embedding can be found by indexing into the original source content.

Next, let's look at how we search for semantic matches.

Putting It All Together: Semantic Search

We can use the Microsoft.Extensions.VectorData library to do a lot of the heavy lifting for implementing vector search functionality.

To start, an InMemoryVectorStore is instantiated, which is used to fetch or create an IVectorStoreRecordCollection:

// Create an InMemoryVectorStore
IVectorStore? vectorStore = new InMemoryVectorStore();

// Query the vector store for the IVectorStoreRecordCollection associated with the embeddings
IVectorStoreRecordCollection<int, StringData> embeddingsCollection = vectorStore.GetCollection<int, StringData>("embeddings");

// Double check that the collection exists and create it if not
await embeddingsCollection.CreateCollectionIfNotExistsAsync(ct).ConfigureAwait(false);

The above code sets up an IVectorStoreRecordCollection<int, StringData> that will be used to store the embeddings. This interface defines the VectorizedSearchAsync function, which can be used to perform a semantic search with embeddings.

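StringData is the record type that pairs each chunk of source text with its embedding. Its exact definition ships with the sample; a minimal sketch using the Microsoft.Extensions.VectorData mapping attributes might look like this (treat the details as an assumption):

// Hypothetical sketch of the record stored in the collection: an integer key,
// the original chunk of text, and its 384-dimension embedding vector.
internal sealed class StringData
{
    [VectorStoreRecordKey]
    public int Key { get; set; }

    [VectorStoreRecordData]
    public string Text { get; set; } = string.Empty;

    [VectorStoreRecordVector(384)]
    public ReadOnlyMemory<float> Vector { get; set; }
}
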
Next, the input text is chunked to meet length requirements and passed to the embeddings generator:

// Chunk the input source text under a maximum length requirement, in this case, 512
List<string> sourceContent = ChunkSourceText(sourceText, 512);

// Generate embeddings for both the search text and the source content
// GenerateAsync will eventually make a call to GetVectorsAsync, where inference and processing were implemented
GeneratedEmbeddings<Embedding<float>> searchVectors = await _embeddings.GenerateAsync([searchText], null, ct).ConfigureAwait(false);
GeneratedEmbeddings<Embedding<float>> sourceVectors = await _embeddings.GenerateAsync(sourceContent, null, ct).ConfigureAwait(false);

After this code executes, searchVectors and sourceVectors contain the embedding vector data for the search text and the source content, respectively.

Now, the sourceVectors are upserted into the IVectorStoreRecordCollection:

// Call to UpsertBatchAsync
await foreach (var key in embeddingsCollection.UpsertBatchAsync(
    // Map each source vector to a StringData object to add to the collection
    sourceVectors.Select((x, i) => new StringData
    {
        Key = i,
        Text = sourceContent[i],
        Vector = x.Vector
    }),
    ct).ConfigureAwait(false))
{
    // No action here, just iterating over the entire enumerable.
}

Every embedding that was generated from the source content will have a corresponding entry in the record collection.

We can now finally execute our semantic search using the VectorizedSearchAsync function exposed through the IVectorStoreRecordCollection interface:

// Call to the vectorized search via the embeddings collection
VectorSearchResults<StringData> vectorSearchResults = await embeddingsCollection.VectorizedSearchAsync(
    searchVectors[0].Vector, // Pass in the search vector
    new VectorSearchOptions<StringData> // Pass in search options
    {
        Top = 5, // Number of results to return
        VectorProperty = (str) => str.Vector // The mapping to the vector property to compare on
    },
    ct).ConfigureAwait(false);

This search will return the 5 closest (or most similar in meaning) embedding vectors to the search vector stored in searchVectors[0].Vector. The closest vectors are found using the Euclidean distance formula, but with 384 dimensions per embedding instead of the 2 or 3 dimensions of physical space.

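One detail worth calling out: because every embedding was normalized to unit length earlier, ranking by Euclidean distance and ranking by cosine similarity produce the same ordering. For unit vectors a and b, ||a - b||^2 = 2 - 2(a · b), so a smaller distance always corresponds to a larger cosine similarity.
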
The output is returned as a VectorSearchResults<StringData> and the final output strings can be accessed like so:
var resultMessage = string.Join("\n\n", vectorSearchResults.Results.ToBlockingEnumerable().Select(r => r.Record.Text));

That final string will contain the five output strings with newlines in between.

And that's all it takes to implement an effective semantic search with a local text embedding model!

This walkthrough was focused on generating text embeddings and using them to execute a semantic search, but there are other elements to factor in, like error handling and updating the UI. You can see the full sample code and try it out yourself in the AI Dev Gallery.

Next Steps

If you enjoyed this post and want to see more of what's possible with local AI on Windows, check out the AI Dev Gallery.

If you have any feedback, want to open an issue, or want to contribute to the Gallery, please visit the GitHub repository.

Lastly, if you enjoyed this article, check out the others in the series for walkthroughs of other Gallery samples.
It may have been deleted.","existingGroupHubMember.title":"Already Joined","existingGroupHubMember.message":"You are already a member of this group.","accountLocked.title":"Account Locked","accountLocked.message":"Your account has been locked due to multiple failed attempts. Try again in {lockoutTime} minutes.","editedGroupHub.title":"Changes Saved","editedGroupHub.message":"Your group has been updated.","leftGroupHub.title":"Goodbye","leftGroupHub.message":"You are no longer a member of this group and will not receive future updates.","deletedGroupHub.title":"Deleted","deletedGroupHub.message":"The group has been deleted.","groupHubCreated.title":"Group Created","groupHubCreated.message":"{groupHubName} is ready to use","accountClosed.title":"Account Closed","accountClosed.message":"The account has been closed and you will now be redirected to the homepage","resetTokenExpired.title":"Reset Password Link has Expired","resetTokenExpired.message":"Try resetting your password again","invalidUrl.title":"Invalid URL","invalidUrl.message":"The URL you're using is not recognized. Verify your URL and try again.","accountClosedForUser.title":"Account Closed","accountClosedForUser.message":"{userName}'s account is closed","inviteTokenInvalid.title":"Invitation Invalid","inviteTokenInvalid.message":"Your invitation to the community has been canceled or expired.","inviteTokenError.title":"Invitation Verification Failed","inviteTokenError.message":"The url you are utilizing is not recognized. Verify your URL and try again","pageNotFound.title":"Access Denied","pageNotFound.message":"You do not have access to this area of the community or it doesn't exist","eventAttending.title":"Responded as Attending","eventAttending.message":"You'll be notified when there's new activity and reminded as the event approaches","eventInterested.title":"Responded as Interested","eventInterested.message":"You'll be notified when there's new activity and reminded as the event approaches","eventNotFound.title":"Event Not Found","eventNotFound.message":"The event you tried to respond to does not exist.","redirectToRelatedPage.title":"Showing Related Content","redirectToRelatedPageForBaseUsers.title":"Showing Related Content","redirectToRelatedPageForBaseUsers.message":"The content you are trying to access is archived","redirectToRelatedPage.message":"The content you are trying to access is archived","relatedUrl.archivalLink.flyoutMessage":"The content you are trying to access is archived View Archived 
Content"},"localOverride":false},"QueryVariables:TopicReplyList:message:4400142:15":{"__typename":"QueryVariables","id":"TopicReplyList:message:4400142:15","value":{"id":"message:4400142","first":10,"sorts":{"postTime":{"direction":"DESC"}},"repliesFirst":3,"repliesFirstDepthThree":1,"repliesSorts":{"postTime":{"direction":"DESC"}},"useAvatar":true,"useAuthorLogin":true,"useAuthorRank":true,"useBody":true,"useKudosCount":true,"useTimeToRead":false,"useMedia":false,"useReadOnlyIcon":false,"useRepliesCount":true,"useSearchSnippet":false,"useAcceptedSolutionButton":false,"useSolvedBadge":false,"useAttachments":false,"attachmentsFirst":5,"useTags":true,"useNodeAncestors":false,"useUserHoverCard":false,"useNodeHoverCard":false,"useModerationStatus":true,"usePreviewSubjectModal":false,"useMessageStatus":true}},"ROOT_MUTATION":{"__typename":"Mutation"},"CachedAsset:component:custom.widget.community_banner-en-us-1746740526178":{"__typename":"CachedAsset","id":"component:custom.widget.community_banner-en-us-1746740526178","value":{"component":{"id":"custom.widget.community_banner","template":{"id":"community_banner","markupLanguage":"HANDLEBARS","style":".community-banner {\n a.top-bar.btn {\n top: 0px;\n width: 100%;\n z-index: 999;\n text-align: center;\n left: 0px;\n background: #0068b8;\n color: white;\n padding: 10px 0px;\n display: block;\n box-shadow: none !important;\n border: none !important;\n border-radius: none !important;\n margin: 0px !important;\n font-size: 14px;\n }\n}\n","texts":{},"defaults":{"config":{"applicablePages":[],"description":"community announcement text","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"components":[{"id":"custom.widget.community_banner","form":null,"config":null,"props":[],"__typename":"Component"}],"grouping":"CUSTOM","__typename":"ComponentTemplate"},"properties":{"config":{"applicablePages":[],"description":"community announcement text","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"form":null,"__typename":"Component","localOverride":false},"globalCss":{"css":".custom_widget_community_banner_community-banner_1x9u2_1 {\n a.custom_widget_community_banner_top-bar_1x9u2_2.custom_widget_community_banner_btn_1x9u2_2 {\n top: 0;\n width: 100%;\n z-index: 999;\n text-align: center;\n left: 0;\n background: #0068b8;\n color: white;\n padding: 0.625rem 0;\n display: block;\n box-shadow: none !important;\n border: none !important;\n border-radius: none !important;\n margin: 0 !important;\n font-size: 0.875rem;\n }\n}\n","tokens":{"community-banner":"custom_widget_community_banner_community-banner_1x9u2_1","top-bar":"custom_widget_community_banner_top-bar_1x9u2_2","btn":"custom_widget_community_banner_btn_1x9u2_2"}},"form":null},"localOverride":false},"CachedAsset:component:custom.widget.HeroBanner-en-us-1746740526178":{"__typename":"CachedAsset","id":"component:custom.widget.HeroBanner-en-us-1746740526178","value":{"component":{"id":"custom.widget.HeroBanner","template":{"id":"HeroBanner","markupLanguage":"REACT","style":null,"texts":{"searchPlaceholderText":"Search this community","followActionText":"Follow","unfollowActionText":"Following","searchOnHoverText":"Please enter your search term(s) and then press return key to complete a search.","blogs.sidebar.pagetitle":"Latest Blogs | Microsoft Tech Community","followThisNode":"Follow this node","unfollowThisNode":"Unfollow this 
node"},"defaults":{"config":{"applicablePages":[],"description":null,"fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[{"id":"max_items","dataType":"NUMBER","list":false,"defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the carousel","possibleValues":null,"control":"INPUT","__typename":"PropDefinition"}],"__typename":"ComponentProperties"},"components":[{"id":"custom.widget.HeroBanner","form":{"fields":[{"id":"widgetChooser","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"title","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useTitle","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useBackground","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"widgetVisibility","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"moreOptions","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"cMax_items","validation":null,"noValidation":null,"dataType":"NUMBER","list":false,"control":"INPUT","defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the 
carousel","possibleValues":null,"__typename":"FormField"}],"layout":{"rows":[{"id":"widgetChooserGroup","type":"fieldset","as":null,"items":[{"id":"widgetChooser","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"titleGroup","type":"fieldset","as":null,"items":[{"id":"title","className":null,"__typename":"FormFieldRef"},{"id":"useTitle","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"useBackground","type":"fieldset","as":null,"items":[{"id":"useBackground","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"widgetVisibility","type":"fieldset","as":null,"items":[{"id":"widgetVisibility","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"moreOptionsGroup","type":"fieldset","as":null,"items":[{"id":"moreOptions","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"componentPropsGroup","type":"fieldset","as":null,"items":[{"id":"cMax_items","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"}],"actionButtons":null,"className":"custom_widget_HeroBanner_form","formGroupFieldSeparator":"divider","__typename":"FormLayout"},"__typename":"Form"},"config":null,"props":[],"__typename":"Component"}],"grouping":"CUSTOM","__typename":"ComponentTemplate"},"properties":{"config":{"applicablePages":[],"description":null,"fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[{"id":"max_items","dataType":"NUMBER","list":false,"defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the 
carousel","possibleValues":null,"control":"INPUT","__typename":"PropDefinition"}],"__typename":"ComponentProperties"},"form":{"fields":[{"id":"widgetChooser","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"title","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useTitle","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useBackground","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"widgetVisibility","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"moreOptions","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"cMax_items","validation":null,"noValidation":null,"dataType":"NUMBER","list":false,"control":"INPUT","defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the carousel","possibleValues":null,"__typename":"FormField"}],"layout":{"rows":[{"id":"widgetChooserGroup","type":"fieldset","as":null,"items":[{"id":"widgetChooser","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"titleGroup","type":"fieldset","as":null,"items":[{"id":"title","className":null,"__typename":"FormFieldRef"},{"id":"useTitle","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"useBackground","type":"fieldset","as":null,"items":[{"id":"useBackground","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"widgetVisibility","type":"fieldset","as":null,"items":[{"id":"widgetVisibility","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"moreOptionsGroup","type":"fieldset","as":null,"items":[{"id":"moreOptions","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"componentPropsGroup","type":"fieldset","as":null,"items":[{"id":"cMax_items","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"}],"actionButtons":null,"className":"custom_widget_HeroBanner_form","formGroupFieldSeparator":"divider","__typename":"FormLayout"},"__typename":"Form"},"__typename":"Component","localOverride":false},"globalCss":null,"form":{"fields":[{"id":"widgetChooser","validation":null,"noValidation"
:null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"title","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useTitle","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useBackground","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"widgetVisibility","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"moreOptions","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"cMax_items","validation":null,"noValidation":null,"dataType":"NUMBER","list":false,"control":"INPUT","defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the carousel","possibleValues":null,"__typename":"FormField"}],"layout":{"rows":[{"id":"widgetChooserGroup","type":"fieldset","as":null,"items":[{"id":"widgetChooser","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"titleGroup","type":"fieldset","as":null,"items":[{"id":"title","className":null,"__typename":"FormFieldRef"},{"id":"useTitle","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"useBackground","type":"fieldset","as":null,"items":[{"id":"useBackground","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"widgetVisibility","type":"fieldset","as":null,"items":[{"id":"widgetVisibility","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"moreOptionsGroup","type":"fieldset","as":null,"items":[{"id":"moreOptions","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"componentPropsGroup","type":"fieldset","as":null,"items":[{"id":"cMax_items","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"}],"actionButtons":null,"className":"custom_widget_HeroBanner_form","formGroupFieldSeparator":"divider","__typename":"FormLayout"},"__typename":"Form"}},"localOverride":false},"CachedAsset:component:custom.widget.MicrosoftFooter-en-us-1746740526178":{"__typename":"CachedAsset","id":"component:custom.widget.MicrosoftFooter-en-us-1746740526178","value":{"component":{"id":"custom.widget.MicrosoftFooter","template":{"id":"MicrosoftFooter","markupLanguage":"HANDLEBARS","style":".
context-uhf {\n min-width: 280px;\n font-size: 15px;\n box-sizing: border-box;\n -ms-text-size-adjust: 100%;\n -webkit-text-size-adjust: 100%;\n & *,\n & *:before,\n & *:after {\n box-sizing: inherit;\n }\n a.c-uhff-link {\n color: #616161;\n word-break: break-word;\n text-decoration: none;\n }\n &a:link,\n &a:focus,\n &a:hover,\n &a:active,\n &a:visited {\n text-decoration: none;\n color: inherit;\n }\n & div {\n font-family: 'Segoe UI', SegoeUI, 'Helvetica Neue', Helvetica, Arial, sans-serif;\n }\n}\n.c-uhff {\n background: #f2f2f2;\n margin: -1.5625;\n width: auto;\n height: auto;\n}\n.c-uhff-nav {\n margin: 0 auto;\n max-width: calc(1600px + 10%);\n padding: 0 5%;\n box-sizing: inherit;\n &:before,\n &:after {\n content: ' ';\n display: table;\n clear: left;\n }\n @media only screen and (max-width: 1083px) {\n padding-left: 12px;\n }\n .c-heading-4 {\n color: #616161;\n word-break: break-word;\n font-size: 15px;\n line-height: 20px;\n padding: 36px 0 4px;\n font-weight: 600;\n }\n .c-uhff-nav-row {\n .c-uhff-nav-group {\n display: block;\n float: left;\n min-height: 1px;\n vertical-align: text-top;\n padding: 0 12px;\n width: 100%;\n zoom: 1;\n &:first-child {\n padding-left: 0;\n @media only screen and (max-width: 1083px) {\n padding-left: 12px;\n }\n }\n @media only screen and (min-width: 540px) and (max-width: 1082px) {\n width: 33.33333%;\n }\n @media only screen and (min-width: 1083px) {\n width: 16.6666666667%;\n }\n ul.c-list.f-bare {\n font-size: 11px;\n line-height: 16px;\n margin-top: 0;\n margin-bottom: 0;\n padding-left: 0;\n list-style-type: none;\n li {\n word-break: break-word;\n padding: 8px 0;\n margin: 0;\n }\n }\n }\n }\n}\n.c-uhff-base {\n background: #f2f2f2;\n margin: 0 auto;\n max-width: calc(1600px + 10%);\n padding: 30px 5% 16px;\n &:before,\n &:after {\n content: ' ';\n display: table;\n }\n &:after {\n clear: both;\n }\n a.c-uhff-ccpa {\n font-size: 11px;\n line-height: 16px;\n float: left;\n margin: 3px 0;\n }\n a.c-uhff-ccpa:hover {\n text-decoration: underline;\n }\n ul.c-list {\n font-size: 11px;\n line-height: 16px;\n float: right;\n margin: 3px 0;\n color: #616161;\n li {\n padding: 0 24px 4px 0;\n display: inline-block;\n }\n }\n .c-list.f-bare {\n padding-left: 0;\n list-style-type: none;\n }\n @media only screen and (max-width: 1083px) {\n display: flex;\n flex-wrap: wrap;\n padding: 30px 24px 16px;\n }\n}\n\n.social-share {\n position: fixed;\n top: 60%;\n transform: translateY(-50%);\n left: 0;\n z-index: 1000;\n}\n\n.sharing-options {\n list-style: none;\n padding: 0;\n margin: 0;\n display: block;\n flex-direction: column;\n background-color: white;\n width: 43px;\n border-radius: 0px 7px 7px 0px;\n}\n.linkedin-icon {\n border-top-right-radius: 7px;\n}\n.linkedin-icon:hover {\n border-radius: 0;\n}\n.social-share-rss-image {\n border-bottom-right-radius: 7px;\n}\n.social-share-rss-image:hover {\n border-radius: 0;\n}\n\n.social-link-footer {\n position: relative;\n display: block;\n margin: -2px 0;\n transition: all 0.2s ease;\n}\n.social-link-footer:hover .linkedin-icon {\n border-radius: 0;\n}\n.social-link-footer:hover .social-share-rss-image {\n border-radius: 0;\n}\n\n.social-link-footer img {\n width: 40px;\n height: auto;\n transition: filter 0.3s ease;\n}\n\n.social-share-list {\n width: 40px;\n}\n.social-share-rss-image {\n width: 40px;\n}\n\n.share-icon {\n border: 2px solid transparent;\n display: inline-block;\n position: relative;\n}\n\n.share-icon:hover {\n opacity: 1;\n border: 2px solid white;\n box-sizing: 
border-box;\n}\n\n.share-icon:hover .label {\n opacity: 1;\n visibility: visible;\n border: 2px solid white;\n box-sizing: border-box;\n border-left: none;\n}\n\n.label {\n position: absolute;\n left: 100%;\n white-space: nowrap;\n opacity: 0;\n visibility: hidden;\n transition: all 0.2s ease;\n color: white;\n border-radius: 0 10 0 10px;\n top: 50%;\n transform: translateY(-50%);\n height: 40px;\n border-radius: 0 6px 6px 0;\n display: flex;\n align-items: center;\n justify-content: center;\n padding: 20px 5px 20px 8px;\n margin-left: -1px;\n}\n.linkedin {\n background-color: #0474b4;\n}\n.facebook {\n background-color: #3c5c9c;\n}\n.twitter {\n background-color: white;\n color: black;\n}\n.reddit {\n background-color: #fc4404;\n}\n.mail {\n background-color: #848484;\n}\n.bluesky {\n background-color: white;\n color: black;\n}\n.rss {\n background-color: #ec7b1c;\n}\n#RSS {\n width: 40px;\n height: 40px;\n}\n\n@media (max-width: 991px) {\n .social-share {\n display: none;\n }\n}\n","texts":{"New tab":"What's New","New 1":"Surface Laptop Studio 2","New 2":"Surface Laptop Go 3","New 3":"Surface Pro 9","New 4":"Surface Laptop 5","New 5":"Surface Studio 2+","New 6":"Copilot in Windows","New 7":"Microsoft 365","New 8":"Windows 11 apps","Store tab":"Microsoft Store","Store 1":"Account Profile","Store 2":"Download Center","Store 3":"Microsoft Store Support","Store 4":"Returns","Store 5":"Order tracking","Store 6":"Certified Refurbished","Store 7":"Microsoft Store Promise","Store 8":"Flexible Payments","Education tab":"Education","Edu 1":"Microsoft in education","Edu 2":"Devices for education","Edu 3":"Microsoft Teams for Education","Edu 4":"Microsoft 365 Education","Edu 5":"How to buy for your school","Edu 6":"Educator Training and development","Edu 7":"Deals for students and parents","Edu 8":"Azure for students","Business tab":"Business","Bus 1":"Microsoft Cloud","Bus 2":"Microsoft Security","Bus 3":"Dynamics 365","Bus 4":"Microsoft 365","Bus 5":"Microsoft Power Platform","Bus 6":"Microsoft Teams","Bus 7":"Microsoft Industry","Bus 8":"Small Business","Developer tab":"Developer & IT","Dev 1":"Azure","Dev 2":"Developer Center","Dev 3":"Documentation","Dev 4":"Microsoft Learn","Dev 5":"Microsoft Tech Community","Dev 6":"Azure Marketplace","Dev 7":"AppSource","Dev 8":"Visual Studio","Company tab":"Company","Com 1":"Careers","Com 2":"About Microsoft","Com 3":"Company News","Com 4":"Privacy at Microsoft","Com 5":"Investors","Com 6":"Diversity and inclusion","Com 7":"Accessiblity","Com 8":"Sustainibility"},"defaults":{"config":{"applicablePages":[],"description":"The Microsoft Footer","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"components":[{"id":"custom.widget.MicrosoftFooter","form":null,"config":null,"props":[],"__typename":"Component"}],"grouping":"CUSTOM","__typename":"ComponentTemplate"},"properties":{"config":{"applicablePages":[],"description":"The Microsoft Footer","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"form":null,"__typename":"Component","localOverride":false},"globalCss":{"css":".custom_widget_MicrosoftFooter_context-uhf_105bp_1 {\n min-width: 17.5rem;\n font-size: 0.9375rem;\n box-sizing: border-box;\n -ms-text-size-adjust: 100%;\n -webkit-text-size-adjust: 100%;\n & *,\n & *:before,\n & *:after {\n box-sizing: inherit;\n }\n a.custom_widget_MicrosoftFooter_c-uhff-link_105bp_12 {\n color: #616161;\n word-break: break-word;\n text-decoration: none;\n }\n 
&a:link,\n &a:focus,\n &a:hover,\n &a:active,\n &a:visited {\n text-decoration: none;\n color: inherit;\n }\n & div {\n font-family: 'Segoe UI', SegoeUI, 'Helvetica Neue', Helvetica, Arial, sans-serif;\n }\n}\n.custom_widget_MicrosoftFooter_c-uhff_105bp_12 {\n background: #f2f2f2;\n margin: -1.5625;\n width: auto;\n height: auto;\n}\n.custom_widget_MicrosoftFooter_c-uhff-nav_105bp_35 {\n margin: 0 auto;\n max-width: calc(100rem + 10%);\n padding: 0 5%;\n box-sizing: inherit;\n &:before,\n &:after {\n content: ' ';\n display: table;\n clear: left;\n }\n @media only screen and (max-width: 1083px) {\n padding-left: 0.75rem;\n }\n .custom_widget_MicrosoftFooter_c-heading-4_105bp_49 {\n color: #616161;\n word-break: break-word;\n font-size: 0.9375rem;\n line-height: 1.25rem;\n padding: 2.25rem 0 0.25rem;\n font-weight: 600;\n }\n .custom_widget_MicrosoftFooter_c-uhff-nav-row_105bp_57 {\n .custom_widget_MicrosoftFooter_c-uhff-nav-group_105bp_58 {\n display: block;\n float: left;\n min-height: 0.0625rem;\n vertical-align: text-top;\n padding: 0 0.75rem;\n width: 100%;\n zoom: 1;\n &:first-child {\n padding-left: 0;\n @media only screen and (max-width: 1083px) {\n padding-left: 0.75rem;\n }\n }\n @media only screen and (min-width: 540px) and (max-width: 1082px) {\n width: 33.33333%;\n }\n @media only screen and (min-width: 1083px) {\n width: 16.6666666667%;\n }\n ul.custom_widget_MicrosoftFooter_c-list_105bp_78.custom_widget_MicrosoftFooter_f-bare_105bp_78 {\n font-size: 0.6875rem;\n line-height: 1rem;\n margin-top: 0;\n margin-bottom: 0;\n padding-left: 0;\n list-style-type: none;\n li {\n word-break: break-word;\n padding: 0.5rem 0;\n margin: 0;\n }\n }\n }\n }\n}\n.custom_widget_MicrosoftFooter_c-uhff-base_105bp_94 {\n background: #f2f2f2;\n margin: 0 auto;\n max-width: calc(100rem + 10%);\n padding: 1.875rem 5% 1rem;\n &:before,\n &:after {\n content: ' ';\n display: table;\n }\n &:after {\n clear: both;\n }\n a.custom_widget_MicrosoftFooter_c-uhff-ccpa_105bp_107 {\n font-size: 0.6875rem;\n line-height: 1rem;\n float: left;\n margin: 0.1875rem 0;\n }\n a.custom_widget_MicrosoftFooter_c-uhff-ccpa_105bp_107:hover {\n text-decoration: underline;\n }\n ul.custom_widget_MicrosoftFooter_c-list_105bp_78 {\n font-size: 0.6875rem;\n line-height: 1rem;\n float: right;\n margin: 0.1875rem 0;\n color: #616161;\n li {\n padding: 0 1.5rem 0.25rem 0;\n display: inline-block;\n }\n }\n .custom_widget_MicrosoftFooter_c-list_105bp_78.custom_widget_MicrosoftFooter_f-bare_105bp_78 {\n padding-left: 0;\n list-style-type: none;\n }\n @media only screen and (max-width: 1083px) {\n display: flex;\n flex-wrap: wrap;\n padding: 1.875rem 1.5rem 1rem;\n }\n}\n.custom_widget_MicrosoftFooter_social-share_105bp_138 {\n position: fixed;\n top: 60%;\n transform: translateY(-50%);\n left: 0;\n z-index: 1000;\n}\n.custom_widget_MicrosoftFooter_sharing-options_105bp_146 {\n list-style: none;\n padding: 0;\n margin: 0;\n display: block;\n flex-direction: column;\n background-color: white;\n width: 2.6875rem;\n border-radius: 0 0.4375rem 0.4375rem 0;\n}\n.custom_widget_MicrosoftFooter_linkedin-icon_105bp_156 {\n border-top-right-radius: 7px;\n}\n.custom_widget_MicrosoftFooter_linkedin-icon_105bp_156:hover {\n border-radius: 0;\n}\n.custom_widget_MicrosoftFooter_social-share-rss-image_105bp_162 {\n border-bottom-right-radius: 7px;\n}\n.custom_widget_MicrosoftFooter_social-share-rss-image_105bp_162:hover {\n border-radius: 0;\n}\n.custom_widget_MicrosoftFooter_social-link-footer_105bp_169 {\n position: relative;\n display: block;\n 
margin: -0.125rem 0;\n transition: all 0.2s ease;\n}\n.custom_widget_MicrosoftFooter_social-link-footer_105bp_169:hover .custom_widget_MicrosoftFooter_linkedin-icon_105bp_156 {\n border-radius: 0;\n}\n.custom_widget_MicrosoftFooter_social-link-footer_105bp_169:hover .custom_widget_MicrosoftFooter_social-share-rss-image_105bp_162 {\n border-radius: 0;\n}\n.custom_widget_MicrosoftFooter_social-link-footer_105bp_169 img {\n width: 2.5rem;\n height: auto;\n transition: filter 0.3s ease;\n}\n.custom_widget_MicrosoftFooter_social-share-list_105bp_188 {\n width: 2.5rem;\n}\n.custom_widget_MicrosoftFooter_social-share-rss-image_105bp_162 {\n width: 2.5rem;\n}\n.custom_widget_MicrosoftFooter_share-icon_105bp_195 {\n border: 2px solid transparent;\n display: inline-block;\n position: relative;\n}\n.custom_widget_MicrosoftFooter_share-icon_105bp_195:hover {\n opacity: 1;\n border: 2px solid white;\n box-sizing: border-box;\n}\n.custom_widget_MicrosoftFooter_share-icon_105bp_195:hover .custom_widget_MicrosoftFooter_label_105bp_207 {\n opacity: 1;\n visibility: visible;\n border: 2px solid white;\n box-sizing: border-box;\n border-left: none;\n}\n.custom_widget_MicrosoftFooter_label_105bp_207 {\n position: absolute;\n left: 100%;\n white-space: nowrap;\n opacity: 0;\n visibility: hidden;\n transition: all 0.2s ease;\n color: white;\n border-radius: 0 10 0 0.625rem;\n top: 50%;\n transform: translateY(-50%);\n height: 2.5rem;\n border-radius: 0 0.375rem 0.375rem 0;\n display: flex;\n align-items: center;\n justify-content: center;\n padding: 1.25rem 0.3125rem 1.25rem 0.5rem;\n margin-left: -0.0625rem;\n}\n.custom_widget_MicrosoftFooter_linkedin_105bp_156 {\n background-color: #0474b4;\n}\n.custom_widget_MicrosoftFooter_facebook_105bp_237 {\n background-color: #3c5c9c;\n}\n.custom_widget_MicrosoftFooter_twitter_105bp_240 {\n background-color: white;\n color: black;\n}\n.custom_widget_MicrosoftFooter_reddit_105bp_244 {\n background-color: #fc4404;\n}\n.custom_widget_MicrosoftFooter_mail_105bp_247 {\n background-color: #848484;\n}\n.custom_widget_MicrosoftFooter_bluesky_105bp_250 {\n background-color: white;\n color: black;\n}\n.custom_widget_MicrosoftFooter_rss_105bp_254 {\n background-color: #ec7b1c;\n}\n#custom_widget_MicrosoftFooter_RSS_105bp_1 {\n width: 2.5rem;\n height: 2.5rem;\n}\n@media (max-width: 991px) {\n .custom_widget_MicrosoftFooter_social-share_105bp_138 {\n display: none;\n 
}\n}\n","tokens":{"context-uhf":"custom_widget_MicrosoftFooter_context-uhf_105bp_1","c-uhff-link":"custom_widget_MicrosoftFooter_c-uhff-link_105bp_12","c-uhff":"custom_widget_MicrosoftFooter_c-uhff_105bp_12","c-uhff-nav":"custom_widget_MicrosoftFooter_c-uhff-nav_105bp_35","c-heading-4":"custom_widget_MicrosoftFooter_c-heading-4_105bp_49","c-uhff-nav-row":"custom_widget_MicrosoftFooter_c-uhff-nav-row_105bp_57","c-uhff-nav-group":"custom_widget_MicrosoftFooter_c-uhff-nav-group_105bp_58","c-list":"custom_widget_MicrosoftFooter_c-list_105bp_78","f-bare":"custom_widget_MicrosoftFooter_f-bare_105bp_78","c-uhff-base":"custom_widget_MicrosoftFooter_c-uhff-base_105bp_94","c-uhff-ccpa":"custom_widget_MicrosoftFooter_c-uhff-ccpa_105bp_107","social-share":"custom_widget_MicrosoftFooter_social-share_105bp_138","sharing-options":"custom_widget_MicrosoftFooter_sharing-options_105bp_146","linkedin-icon":"custom_widget_MicrosoftFooter_linkedin-icon_105bp_156","social-share-rss-image":"custom_widget_MicrosoftFooter_social-share-rss-image_105bp_162","social-link-footer":"custom_widget_MicrosoftFooter_social-link-footer_105bp_169","social-share-list":"custom_widget_MicrosoftFooter_social-share-list_105bp_188","share-icon":"custom_widget_MicrosoftFooter_share-icon_105bp_195","label":"custom_widget_MicrosoftFooter_label_105bp_207","linkedin":"custom_widget_MicrosoftFooter_linkedin_105bp_156","facebook":"custom_widget_MicrosoftFooter_facebook_105bp_237","twitter":"custom_widget_MicrosoftFooter_twitter_105bp_240","reddit":"custom_widget_MicrosoftFooter_reddit_105bp_244","mail":"custom_widget_MicrosoftFooter_mail_105bp_247","bluesky":"custom_widget_MicrosoftFooter_bluesky_105bp_250","rss":"custom_widget_MicrosoftFooter_rss_105bp_254","RSS":"custom_widget_MicrosoftFooter_RSS_105bp_1"}},"form":null},"localOverride":false},"CachedAsset:text:en_US-components/community/Breadcrumb-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/community/Breadcrumb-1745505307000","value":{"navLabel":"Breadcrumbs","dropdown":"Additional parent page navigation"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageBanner-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageBanner-1745505307000","value":{"messageMarkedAsSpam":"This post has been marked as spam","messageMarkedAsSpam@board:TKB":"This article has been marked as spam","messageMarkedAsSpam@board:BLOG":"This post has been marked as spam","messageMarkedAsSpam@board:FORUM":"This discussion has been marked as spam","messageMarkedAsSpam@board:OCCASION":"This event has been marked as spam","messageMarkedAsSpam@board:IDEA":"This idea has been marked as spam","manageSpam":"Manage Spam","messageMarkedAsAbuse":"This post has been marked as abuse","messageMarkedAsAbuse@board:TKB":"This article has been marked as abuse","messageMarkedAsAbuse@board:BLOG":"This post has been marked as abuse","messageMarkedAsAbuse@board:FORUM":"This discussion has been marked as abuse","messageMarkedAsAbuse@board:OCCASION":"This event has been marked as abuse","messageMarkedAsAbuse@board:IDEA":"This idea has been marked as abuse","preModCommentAuthorText":"This comment will be published as soon as it is approved","preModCommentModeratorText":"This comment is awaiting moderation","messageMarkedAsOther":"This post has been rejected due to other reasons","messageMarkedAsOther@board:TKB":"This article has been rejected due to other reasons","messageMarkedAsOther@board:BLOG":"This post has been rejected due to other 
reasons","messageMarkedAsOther@board:FORUM":"This discussion has been rejected due to other reasons","messageMarkedAsOther@board:OCCASION":"This event has been rejected due to other reasons","messageMarkedAsOther@board:IDEA":"This idea has been rejected due to other reasons","messageArchived":"This post was archived on {date}","relatedUrl":"View Related Content","relatedContentText":"Showing related content","archivedContentLink":"View Archived Content"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageView/MessageViewStandard-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageView/MessageViewStandard-1745505307000","value":{"anonymous":"Anonymous","author":"{messageAuthorLogin}","authorBy":"{messageAuthorLogin}","board":"{messageBoardTitle}","replyToUser":" to {parentAuthor}","showMoreReplies":"Show More","replyText":"Reply","repliesText":"Replies","markedAsSolved":"Marked as Solution","movedMessagePlaceholder.BLOG":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholder.TKB":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholder.FORUM":"{count, plural, =0 {This reply has been} other {These replies have been} }","movedMessagePlaceholder.IDEA":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholder.OCCASION":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholderUrlText":"moved.","messageStatus":"Status: ","statusChanged":"Status changed: {previousStatus} to {currentStatus}","statusAdded":"Status added: {status}","statusRemoved":"Status removed: {status}","labelExpand":"expand replies","labelCollapse":"collapse replies","unhelpfulReason.reason1":"Content is outdated","unhelpfulReason.reason2":"Article is missing information","unhelpfulReason.reason3":"Content is for a different Product","unhelpfulReason.reason4":"Doesn't match what I was searching for"},"localOverride":false},"CachedAsset:text:en_US-components/messages/ThreadedReplyList-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/ThreadedReplyList-1745505307000","value":{"title":"{count, plural, one{# Reply} other{# Replies}}","title@board:BLOG":"{count, plural, one{# Comment} other{# Comments}}","title@board:TKB":"{count, plural, one{# Comment} other{# Comments}}","title@board:IDEA":"{count, plural, one{# Comment} other{# Comments}}","title@board:OCCASION":"{count, plural, one{# Comment} other{# Comments}}","noRepliesTitle":"No Replies","noRepliesTitle@board:BLOG":"No Comments","noRepliesTitle@board:TKB":"No Comments","noRepliesTitle@board:IDEA":"No Comments","noRepliesTitle@board:OCCASION":"No Comments","noRepliesDescription":"Be the first to reply","noRepliesDescription@board:BLOG":"Be the first to comment","noRepliesDescription@board:TKB":"Be the first to comment","noRepliesDescription@board:IDEA":"Be the first to comment","noRepliesDescription@board:OCCASION":"Be the first to comment","messageReadOnlyAlert:BLOG":"Comments have been turned off for this post","messageReadOnlyAlert:TKB":"Comments have been turned off for this article","messageReadOnlyAlert:IDEA":"Comments have been turned off for this idea","messageReadOnlyAlert:FORUM":"Replies have been turned off for this discussion","messageReadOnlyAlert:OCCASION":"Comments have been turned off for this 
event"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageReplyCallToAction-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageReplyCallToAction-1745505307000","value":{"leaveReply":"Leave a reply...","leaveReply@board:BLOG@message:root":"Leave a comment...","leaveReply@board:TKB@message:root":"Leave a comment...","leaveReply@board:IDEA@message:root":"Leave a comment...","leaveReply@board:OCCASION@message:root":"Leave a comment...","repliesTurnedOff.FORUM":"Replies are turned off for this topic","repliesTurnedOff.BLOG":"Comments are turned off for this topic","repliesTurnedOff.TKB":"Comments are turned off for this topic","repliesTurnedOff.IDEA":"Comments are turned off for this topic","repliesTurnedOff.OCCASION":"Comments are turned off for this topic","infoText":"Stop poking me!"},"localOverride":false},"Category:category:Exchange":{"__typename":"Category","id":"category:Exchange","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Outlook":{"__typename":"Category","id":"category:Outlook","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Community-Info-Center":{"__typename":"Category","id":"category:Community-Info-Center","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:EducationSector":{"__typename":"Category","id":"category:EducationSector","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:DrivingAdoption":{"__typename":"Category","id":"category:DrivingAdoption","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Windows-Server":{"__typename":"Category","id":"category:Windows-Server","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:MicrosoftTeams":{"__typename":"Category","id":"category:MicrosoftTeams","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:PublicSector":{"__typename":"Category","id":"category:PublicSector","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:microsoft365":{"__typename":"Category","id":"category:microsoft365","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:IoT":{"__typename":"Category","id":"category:IoT","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:HealthcareAndLifeSciences":{"__typename":"Category","id":"category:HealthcareAndLifeSciences","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:ITOpsTalk":{"__typename":"Category","id":"category:ITOpsTalk","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:MicrosoftLearn":{"__typename":"Category","id":"category:MicrosoftLearn","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult
","failureReason":null}}},"Blog:board:MicrosoftLearnBlog":{"__typename":"Blog","id":"board:MicrosoftLearnBlog","blogPolicies":{"__typename":"BlogPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}},"boardPolicies":{"__typename":"BoardPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:AI":{"__typename":"Category","id":"category:AI","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:MicrosoftMechanics":{"__typename":"Category","id":"category:MicrosoftMechanics","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:MicrosoftforNonprofits":{"__typename":"Category","id":"category:MicrosoftforNonprofits","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:StartupsatMicrosoft":{"__typename":"Category","id":"category:StartupsatMicrosoft","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:PartnerCommunity":{"__typename":"Category","id":"category:PartnerCommunity","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Microsoft365Copilot":{"__typename":"Category","id":"category:Microsoft365Copilot","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Windows":{"__typename":"Category","id":"category:Windows","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Content_Management":{"__typename":"Category","id":"category:Content_Management","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:microsoft-security":{"__typename":"Category","id":"category:microsoft-security","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:microsoftintune":{"__typename":"Category","id":"category:microsoftintune","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Rank:rank:37":{"__typename":"Rank","id":"rank:37","position":18,"name":"Copper 
Contributor","color":"333333","icon":null,"rankStyle":"TEXT"},"User:user:2989616":{"__typename":"User","id":"user:2989616","uid":2989616,"login":"Call-19495920094","biography":null,"registrationData":{"__typename":"RegistrationData","status":null,"registrationTime":"2025-04-09T03:44:08.033-07:00"},"deleted":false,"email":"","avatar":{"__typename":"UserAvatar","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/m_assets/avatars/default/avatar-1.svg?time=0"},"rank":{"__ref":"Rank:rank:37"},"entityType":"USER","eventPath":"community:gxcuf89792/user:2989616"},"ModerationData:moderation_data:4402953":{"__typename":"ModerationData","id":"moderation_data:4402953","status":"UNMODERATED","rejectReason":null,"isReportedAbuse":false,"rejectUser":null,"rejectTime":null,"rejectActorType":null},"BlogReplyMessage:message:4402953":{"__typename":"BlogReplyMessage","author":{"__ref":"User:user:2989616"},"id":"message:4402953","revisionNum":1,"uid":4402953,"depth":1,"hasGivenKudo":false,"subscribed":false,"board":{"__ref":"Blog:board:AzureDevCommunityBlog"},"parent":{"__ref":"BlogTopicMessage:message:4400142"},"conversation":{"__ref":"Conversation:conversation:4400142"},"subject":"Re: Semantic Search with the AI Dev Gallery","moderationData":{"__ref":"ModerationData:moderation_data:4402953"},"body":"
Gallery
","body@stripHtml({\"removeProcessingText\":false,\"removeSpoilerMarkup\":false,\"removeTocMarkup\":false,\"truncateLength\":200})@stringLength":"9","kudosSumWeight":0,"repliesCount":0,"postTime":"2025-04-10T04:47:03.964-07:00","lastPublishTime":"2025-04-10T04:47:03.964-07:00","metrics":{"__typename":"MessageMetrics","views":32},"visibilityScope":"PUBLIC","placeholder":false,"originalMessageForPlaceholder":null,"entityType":"BLOG_REPLY","eventPath":"category:Azure/category:products-services/category:communities/community:gxcuf89792board:AzureDevCommunityBlog/message:4400142/message:4402953","replies":{"__typename":"MessageConnection","pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null},"edges":[]},"customFields":[],"attachments":{"__typename":"AttachmentConnection","edges":[],"pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null}}},"CachedAsset:text:en_US-components/community/Navbar-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/community/Navbar-1745505307000","value":{"community":"Community Home","inbox":"Inbox","manageContent":"Manage Content","tos":"Terms of Service","forgotPassword":"Forgot Password","themeEditor":"Theme Editor","edit":"Edit Navigation Bar","skipContent":"Skip to content","gxcuf89792":"Tech Community","external-1":"Events","s-m-b":"Nonprofit Community","windows-server":"Windows Server","education-sector":"Education Sector","driving-adoption":"Driving Adoption","Common-content_management-link":"Content Management","microsoft-learn":"Microsoft Learn","s-q-l-server":"Content Management","partner-community":"Microsoft Partner Community","microsoft365":"Microsoft 365","external-9":".NET","external-8":"Teams","external-7":"Github","products-services":"Products","external-6":"Power Platform","communities-1":"Topics","external-5":"Microsoft Security","planner":"Outlook","external-4":"Microsoft 365","external-3":"Dynamics 365","azure":"Azure","healthcare-and-life-sciences":"Healthcare and Life Sciences","external-2":"Azure","microsoft-mechanics":"Microsoft Mechanics","microsoft-learn-1":"Community","external-10":"Learning Room Directory","microsoft-learn-blog":"Blog","windows":"Windows","i-t-ops-talk":"ITOps Talk","external-link-1":"View All","microsoft-securityand-compliance":"Microsoft Security","public-sector":"Public Sector","community-info-center":"Lounge","external-link-2":"View All","microsoft-teams":"Microsoft Teams","external":"Blogs","microsoft-endpoint-manager":"Microsoft Intune","startupsat-microsoft":"Startups at Microsoft","exchange":"Exchange","a-i":"AI and Machine Learning","io-t":"Internet of Things (IoT)","Common-microsoft365-copilot-link":"Microsoft 365 Copilot","outlook":"Microsoft 365 Copilot","external-link":"Community Hubs","communities":"Products"},"localOverride":false},"CachedAsset:text:en_US-components/community/NavbarHamburgerDropdown-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/community/NavbarHamburgerDropdown-1745505307000","value":{"hamburgerLabel":"Side Menu"},"localOverride":false},"CachedAsset:text:en_US-components/community/BrandLogo-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/community/BrandLogo-1745505307000","value":{"logoAlt":"Khoros","themeLogoAlt":"Brand 
Logo"},"localOverride":false},"CachedAsset:text:en_US-components/community/NavbarTextLinks-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/community/NavbarTextLinks-1745505307000","value":{"more":"More"},"localOverride":false},"CachedAsset:text:en_US-components/authentication/AuthenticationLink-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/authentication/AuthenticationLink-1745505307000","value":{"title.login":"Sign In","title.registration":"Register","title.forgotPassword":"Forgot Password","title.multiAuthLogin":"Sign In"},"localOverride":false},"CachedAsset:text:en_US-components/nodes/NodeLink-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/nodes/NodeLink-1745505307000","value":{"place":"Place {name}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageCoverImage-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageCoverImage-1745505307000","value":{"coverImageTitle":"Cover Image"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeTitle-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeTitle-1745505307000","value":{"nodeTitle":"{nodeTitle, select, community {Community} other {{nodeTitle}}} "},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageTimeToRead-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageTimeToRead-1745505307000","value":{"minReadText":"{min} MIN READ"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageSubject-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageSubject-1745505307000","value":{"noSubject":"(no subject)"},"localOverride":false},"CachedAsset:text:en_US-components/users/UserLink-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/users/UserLink-1745505307000","value":{"authorName":"View Profile: {author}","anonymous":"Anonymous"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/users/UserRank-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/users/UserRank-1745505307000","value":{"rankName":"{rankName}","userRank":"Author rank {rankName}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageTime-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageTime-1745505307000","value":{"postTime":"Published: {time}","lastPublishTime":"Last Update: {time}","conversation.lastPostingActivityTime":"Last posting activity time: {time}","conversation.lastPostTime":"Last post time: {time}","moderationData.rejectTime":"Rejected time: {time}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageBody-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageBody-1745505307000","value":{"showMessageBody":"Show More","mentionsErrorTitle":"{mentionsType, select, board {Board} user {User} message {Message} other {}} No Longer Available","mentionsErrorMessage":"The {mentionsType} you are trying to view has been removed from the community.","videoProcessing":"Video is being processed. Please try again in a few minutes.","bannerTitle":"Video provider requires cookies to play the video. 
Accept to continue or {url} it directly on the provider's site.","buttonTitle":"Accept","urlText":"watch"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageCustomFields-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageCustomFields-1745505307000","value":{"CustomField.default.label":"Value of {name}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageRevision-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageRevision-1745505307000","value":{"lastUpdatedDatePublished":"{publishCount, plural, one{Published} other{Updated}} {date}","lastUpdatedDateDraft":"Created {date}","version":"Version {major}.{minor}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/common/QueryHandler-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/common/QueryHandler-1745505307000","value":{"title":"Query Handler"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageReplyButton-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageReplyButton-1745505307000","value":{"repliesCount":"{count}","title":"Reply","title@board:BLOG@message:root":"Comment","title@board:TKB@message:root":"Comment","title@board:IDEA@message:root":"Comment","title@board:OCCASION@message:root":"Comment"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageAuthorBio-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageAuthorBio-1745505307000","value":{"sendMessage":"Send Message","actionMessage":"Follow this blog board to get notified when there's new activity","coAuthor":"CO-PUBLISHER","contributor":"CONTRIBUTOR","userProfile":"View Profile","iconlink":"Go to {name} {type}"},"localOverride":false},"CachedAsset:text:en_US-components/community/NavbarDropdownToggle-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/community/NavbarDropdownToggle-1745505307000","value":{"ariaLabelClosed":"Press the down arrow to open the menu"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/users/UserAvatar-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/users/UserAvatar-1745505307000","value":{"altText":"{login}'s avatar","altTextGeneric":"User's avatar"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/ranks/UserRankLabel-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/ranks/UserRankLabel-1745505307000","value":{"altTitle":"Icon for {rankName} rank"},"localOverride":false},"CachedAsset:text:en_US-components/users/UserRegistrationDate-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/users/UserRegistrationDate-1745505307000","value":{"noPrefix":"{date}","withPrefix":"Joined {date}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeAvatar-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeAvatar-1745505307000","value":{"altTitle":"Node avatar for 
{nodeTitle}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeDescription-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeDescription-1745505307000","value":{"description":"{description}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageListMenu-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageListMenu-1745505307000","value":{"postTimeAsc":"Oldest","postTimeDesc":"Newest","kudosSumWeightAsc":"Least Liked","kudosSumWeightDesc":"Most Liked","sortTitle":"Sort By","sortedBy.item":" { itemName, select, postTimeAsc {Oldest} postTimeDesc {Newest} kudosSumWeightAsc {Least Liked} kudosSumWeightDesc {Most Liked} other {}}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeIcon-1745505307000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeIcon-1745505307000","value":{"contentType":"Content Type {style, select, FORUM {Forum} BLOG {Blog} TKB {Knowledge Base} IDEA {Ideas} OCCASION {Events} other {}} icon"},"localOverride":false}}}},"page":"/blogs/BlogMessagePage/BlogMessagePage","query":{"boardId":"azuredevcommunityblog","messageSubject":"semantic-search-with-the-ai-dev-gallery","messageId":"4400142"},"buildId":"-gVUpXaWnPcjlrLJZ92B7","runtimeConfig":{"buildInformationVisible":false,"logLevelApp":"info","logLevelMetrics":"info","openTelemetryClientEnabled":false,"openTelemetryConfigName":"o365","openTelemetryServiceVersion":"25.3.0","openTelemetryUniverse":"prod","openTelemetryCollector":"http://localhost:4318","openTelemetryRouteChangeAllowedTime":"5000","apolloDevToolsEnabled":false,"inboxMuteWipFeatureEnabled":false},"isFallback":false,"isExperimentalCompile":false,"dynamicIds":["./components/community/Navbar/NavbarWidget.tsx","./components/community/Breadcrumb/BreadcrumbWidget.tsx","./components/customComponent/CustomComponent/CustomComponent.tsx","./components/blogs/BlogArticleWidget/BlogArticleWidget.tsx","./components/messages/MessageView/MessageViewStandard/MessageViewStandard.tsx","./components/messages/ThreadedReplyList/ThreadedReplyList.tsx","./components/external/components/ExternalComponent.tsx","./components/customComponent/CustomComponentContent/TemplateContent.tsx","../shared/client/components/common/List/UnstyledList/UnstyledList.tsx","./components/messages/MessageView/MessageView.tsx"],"appGip":true,"scriptLoader":[{"id":"analytics","src":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/pagescripts/1730819800000/analytics.js?page.id=BlogMessagePage&entity.id=board%3Aazuredevcommunityblog&entity.id=message%3A4400142","strategy":"afterInteractive"}]}