Blog Post

AI - Azure AI services Blog
11 MIN READ

Build Intelligent RAG For Multimodality and Complex Document Structure

mrajguru
Microsoft
Apr 19, 2024

The advent of Retrieval-Augmented Generation (RAG) models has been a significant milestone in the field of Natural Language Processing (NLP). These models combine the power of information retrieval with generative language models to produce answers that are not just accurate but also contextually enriched. However, as the digital universe expands beyond textual data, incorporating image understanding and hierarchical document structure analysis into RAG systems is becoming increasingly crucial. This article explores how these two elements can significantly enhance the capabilities of RAG models.

 

Understanding RAG Models

Before diving into the nuances of image understanding and document analysis, let’s briefly touch upon the essence of RAG models. These systems work by first retrieving relevant documents from a vast corpus and then using a generative model to synthesize information into a coherent response. The retrieval component ensures that the model has access to accurate and up-to-date information, while the generative component allows for the creation of human-like text.

 

Image Understanding and Structure Analysis

The Challenge

One of the most significant limitations of traditional RAG models is their inability to understand and interpret visual data. In a world where images accompany textual information ubiquitously, this represents a substantial gap in the model’s comprehension abilities. Documents are not just strings of text; they have structure — sections, subsections, paragraphs, and lists — all of which convey semantic importance. Traditional RAG models often overlook this hierarchical structure, potentially missing out on understanding the document’s full meaning.

The Solution

To bridge this gap, RAG models can be augmented with Computer Vision (CV) capabilities. This involves integrating image recognition and understanding modules that can analyze visual data, extract relevant information, and convert it into a textual format that the RAG model can process. Incorporating hierarchical document structure analysis involves teaching RAG models to recognize and interpret the underlying structure of documents.

 

 

Implementation

  • Visual Feature Extraction: Use pre-trained neural networks to identify objects, scenes, and activities in images.
  • Visual Semantics: Develop algorithms that can understand the context and semantics of the visual content.
  • Multimodal Data Fusion: Combine the extracted visual information with textual data to create a multimodal context for the RAG system.
  • Structure Recognition: Implement algorithms to identify different levels of hierarchy in documents, such as titles, headings, and bullet points.
  • Semantic Role Labeling: Assign semantic roles to different parts of the document, understanding the purpose of each section.
  • Structure-Aware Retrieval: Enhance the retrieval process by considering the hierarchical structure of documents, ensuring that the most relevant sections are used for generation.
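
Tying these pieces together, the end-to-end flow we build in the remainder of this post looks roughly like this (a sketch; the names refer to objects defined in the sections below):

# End-to-end flow built in this post (sketch; names are defined later)
docs = loader.load()  # Document Intelligence -> markdown, with figures replaced by GPT-4V descriptions
chunks = text_splitter.split_text(docs[0].page_content)  # structure-aware chunking on markdown headers
vector_store.add_documents(documents=chunks)  # index the chunks in Azure AI Search
response = rag_chain_with_source.invoke("your question")  # retrieval + grounded generation with citations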

 

In this blog, we will look at how to implement this using Azure Document Intelligence, LangChain, and Azure OpenAI.

Prerequisites

Before we implement this, we will need a few prerequisites:

  • GPT-4-Vision-Preview model deployed
  • GPT-4-1106-Preview model deployed
  • text-embedding-ada-002 model deployed
  • Azure AI Document Intelligence resource deployed

Once we have the above in place, let's get started!
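
The code below reads its configuration from an azure.env file via load_dotenv. A minimal example looks like the following; the values are placeholders, the Azure OpenAI and Document Intelligence variable names match the ones read with os.getenv/os.environ later in this post, and the two Azure AI Search variables are names I introduce for the vector store step further down:

# azure.env (sample values; replace with your own resources)
AZURE_OPENAI_ENDPOINT=https://<your-aoai-resource>.openai.azure.com
AZURE_OPENAI_API_KEY=<your-aoai-key>
AZURE_DOCUMENT_INTELLIGENCE_ENDPOINT=https://<your-doc-intel-resource>.cognitiveservices.azure.com
AZURE_DOCUMENT_INTELLIGENCE_KEY=<your-doc-intel-key>
AZURE_SEARCH_ENDPOINT=https://<your-search-resource>.search.windows.net
AZURE_SEARCH_KEY=<your-search-admin-key>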

Let’s import the required libraries.

 

 

import os
from dotenv import load_dotenv
load_dotenv('azure.env')

from langchain import hub
from langchain_openai import AzureChatOpenAI
#from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader
from doc_intelligence import AzureAIDocumentIntelligenceLoader
from langchain_openai import AzureOpenAIEmbeddings
from langchain.schema import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough
from langchain.text_splitter import MarkdownHeaderTextSplitter
from langchain_community.vectorstores.azuresearch import AzureSearch
from azure.ai.documentintelligence.models import DocumentAnalysisFeature

 

 

Now we are going to write a custom loader on top of the LangChain document loader to help us load the PDF document. The first thing we use is Azure Document Intelligence, which has a powerful feature for converting a document or image into Markdown format. Let's use it.

 

 

import logging
from typing import Any, Iterator, List, Optional
import os
from langchain_core.documents import Document
from langchain_community.document_loaders.base import BaseLoader
from langchain_community.document_loaders.base import BaseBlobParser
from langchain_community.document_loaders.blob_loaders import Blob

logger = logging.getLogger(__name__)

class AzureAIDocumentIntelligenceLoader(BaseLoader):
    """Loads a PDF with Azure Document Intelligence"""

    def __init__(
        self,
        api_endpoint: str,
        api_key: str,
        file_path: Optional[str] = None,
        url_path: Optional[str] = None,
        api_version: Optional[str] = None,
        api_model: str = "prebuilt-layout",
        mode: str = "markdown",
        *,
        analysis_features: Optional[List[str]] = None,
    ) -> None:
        """
        Initialize the object for file processing with Azure Document Intelligence
        (formerly Form Recognizer).

        This constructor initializes an AzureAIDocumentIntelligenceParser object to be
        used for parsing files using the Azure Document Intelligence API. The load
        method generates Documents whose content representations are determined by the
        mode parameter.

        Parameters:
        -----------
        api_endpoint: str
            The API endpoint to use for DocumentIntelligenceClient construction.
        api_key: str
            The API key to use for DocumentIntelligenceClient construction.
        file_path : Optional[str]
            The path to the file that needs to be loaded.
            Either file_path or url_path must be specified.
        url_path : Optional[str]
            The URL to the file that needs to be loaded.
            Either file_path or url_path must be specified.
        api_version: Optional[str]
            The API version for DocumentIntelligenceClient. Setting None to use
            the default value from `azure-ai-documentintelligence` package.
        api_model: str
            Unique document model name. Default value is "prebuilt-layout".
            Note that overriding this default value may result in unsupported
            behavior.
        mode: Optional[str]
            The type of content representation of the generated Documents.
            Use either "single", "page", or "markdown". Default value is "markdown".
        analysis_features: Optional[List[str]]
            List of optional analysis features, each feature should be passed
            as a str that conforms to the enum `DocumentAnalysisFeature` in
            `azure-ai-documentintelligence` package. Default value is None.

        Examples:
        ---------
        >>> obj = AzureAIDocumentIntelligenceLoader(
        ...     file_path="path/to/file",
        ...     api_endpoint="https://endpoint.azure.com",
        ...     api_key="APIKEY",
        ...     api_version="2023-10-31-preview",
        ...     api_model="prebuilt-layout",
        ...     mode="markdown"
        ... )
        """

        assert (
            file_path is not None or url_path is not None
        ), "file_path or url_path must be provided"
        self.file_path = file_path
        self.url_path = url_path

        self.parser = AzureAIDocumentIntelligenceParser(
            api_endpoint=api_endpoint,
            api_key=api_key,
            api_version=api_version,
            api_model=api_model,
            mode=mode,
            analysis_features=analysis_features,
        )

    def lazy_load(
        self,
    ) -> Iterator[Document]:
        """Lazy load given path as pages."""
        if self.file_path is not None:
            yield from self.parser.parse(self.file_path)
        else:
            yield from self.parser.parse_url(self.url_path)

 

 

Now let's define the document parser. This parser is intended to load and parse PDF files using Azure's Document Intelligence service (formerly known as Azure Form Recognizer), which uses machine learning models to extract text, key-value pairs, and tables from documents.

lazy_parse is a method that lazily parses a given file: it starts processing the file and yields results as they become available, rather than waiting for the entire file to be processed.
 
class AzureAIDocumentIntelligenceParser(BaseBlobParser):
    """Loads a PDF with Azure Document Intelligence
    (formerly Forms Recognizer)."""

    def __init__(
        self,
        api_endpoint: str,
        api_key: str,
        api_version: Optional[str] = None,
        api_model: str = "prebuilt-layout",
        mode: str = "markdown",
        analysis_features: Optional[List[str]] = None,
    ):
        from azure.ai.documentintelligence import DocumentIntelligenceClient
        from azure.ai.documentintelligence.models import DocumentAnalysisFeature
        from azure.core.credentials import AzureKeyCredential

        kwargs = {}
        if api_version is not None:
            kwargs["api_version"] = api_version

        if analysis_features is not None:
            _SUPPORTED_FEATURES = [
                DocumentAnalysisFeature.OCR_HIGH_RESOLUTION,
            ]

            analysis_features = [
                DocumentAnalysisFeature(feature) for feature in analysis_features
            ]
            if any(
                [feature not in _SUPPORTED_FEATURES for feature in analysis_features]
            ):
                logger.warning(
                    f"The current supported features are: "
                    f"{[f.value for f in _SUPPORTED_FEATURES]}. "
                    "Using other features may result in unexpected behavior."
                )

        self.client = DocumentIntelligenceClient(
            endpoint=api_endpoint,
            credential=AzureKeyCredential(api_key),
            headers={"x-ms-useragent": "langchain-parser/1.0.0"},
            features=analysis_features,
            **kwargs,
        )
        self.api_model = api_model
        self.mode = mode
        assert self.mode in ["single", "page", "markdown"]

    def _generate_docs_page(self, result: Any) -> Iterator[Document]:
        for p in result.pages:
            content = " ".join([line.content for line in p.lines])

            d = Document(
                page_content=content,
                metadata={
                    "page": p.page_number,
                },
            )
            yield d

    def _generate_docs_single(self, file_path: str, result: Any) -> Iterator[Document]:
        md_content = include_figure_in_md(file_path, result)
        yield Document(page_content=md_content, metadata={})

    def lazy_parse(self, file_path: str) -> Iterator[Document]:
        """Lazily parse the blob."""
        blob = Blob.from_path(file_path)
        with blob.as_bytes_io() as file_obj:
            poller = self.client.begin_analyze_document(
                self.api_model,
                file_obj,
                content_type="application/octet-stream",
                output_content_format="markdown" if self.mode == "markdown" else "text",
            )
            result = poller.result()

            if self.mode in ["single", "markdown"]:
                yield from self._generate_docs_single(file_path, result)
            elif self.mode in ["page"]:
                yield from self._generate_docs_page(result)
            else:
                raise ValueError(f"Invalid mode: {self.mode}")

    def parse_url(self, url: str) -> Iterator[Document]:
        from azure.ai.documentintelligence.models import AnalyzeDocumentRequest

        poller = self.client.begin_analyze_document(
            self.api_model,
            AnalyzeDocumentRequest(url_source=url),
            # content_type="application/octet-stream",
            output_content_format="markdown" if self.mode == "markdown" else "text",
        )
        result = poller.result()

        if self.mode in ["single", "markdown"]:
            # No local file is available for figure cropping when parsing from
            # a URL, so fall back to the raw content returned by the service.
            yield Document(page_content=result.content, metadata={})
        elif self.mode in ["page"]:
            yield from self._generate_docs_page(result)
        else:
            raise ValueError(f"Invalid mode: {self.mode}")

 

If you look at this document parser, I have included a method called include_figure_in_md. This method goes through the markdown content, looks for every figure, and replaces each one with a generated description (a sketch of this method appears after the utilities below).

Before that, let's write some utility methods that can crop an image out of a PDF or image document.

 

 

from PIL import Image
import fitz  # PyMuPDF
import mimetypes

import base64
from mimetypes import guess_type

# Function to encode a local image into data URL 
def local_image_to_data_url(image_path):
    # Guess the MIME type of the image based on the file extension
    mime_type, _ = guess_type(image_path)
    if mime_type is None:
        mime_type = 'application/octet-stream'  # Default MIME type if none is found

    # Read and encode the image file
    with open(image_path, "rb") as image_file:
        base64_encoded_data = base64.b64encode(image_file.read()).decode('utf-8')

    # Construct the data URL
    return f"data:{mime_type};base64,{base64_encoded_data}"

def crop_image_from_image(image_path, page_number, bounding_box):
    """
    Crops an image based on a bounding box.

    :param image_path: Path to the image file.
    :param page_number: The page number of the image to crop (for TIFF format).
    :param bounding_box: A tuple of (left, upper, right, lower) coordinates for the bounding box.
    :return: A cropped image.
    :rtype: PIL.Image.Image
    """
    with Image.open(image_path) as img:
        if img.format == "TIFF":
            # Open the TIFF image
            img.seek(page_number)
            img = img.copy()
            
        # The bounding box is expected to be in the format (left, upper, right, lower).
        cropped_image = img.crop(bounding_box)
        return cropped_image

def crop_image_from_pdf_page(pdf_path, page_number, bounding_box):
    """
    Crops a region from a given page in a PDF and returns it as an image.

    :param pdf_path: Path to the PDF file.
    :param page_number: The page number to crop from (0-indexed).
    :param bounding_box: A tuple of (x0, y0, x1, y1) coordinates for the bounding box.
    :return: A PIL Image of the cropped area.
    """
    doc = fitz.open(pdf_path)
    page = doc.load_page(page_number)
    
    # Cropping the page. The rect requires coordinates in the format (x0, y0, x1, y1).
    # Document Intelligence returns PDF coordinates in inches; PDF points are 72 per
    # inch, so scale the bounding box accordingly, then render the clip at 300 DPI.
    bbx = [x * 72 for x in bounding_box]
    rect = fitz.Rect(bbx)
    pix = page.get_pixmap(matrix=fitz.Matrix(300/72, 300/72), clip=rect)
    
    img = Image.frombytes("RGB", [pix.width, pix.height], pix.samples)
    
    doc.close()

    return img

def crop_image_from_file(file_path, page_number, bounding_box):
    """
    Crop an image from a file.

    Args:
        file_path (str): The path to the file.
        page_number (int): The page number (for PDF and TIFF files, 0-indexed).
        bounding_box (tuple): The bounding box coordinates in the format (x0, y0, x1, y1).

    Returns:
        A PIL Image of the cropped area.
    """
    mime_type = mimetypes.guess_type(file_path)[0]
    
    if mime_type == "application/pdf":
        return crop_image_from_pdf_page(file_path, page_number, bounding_box)
    else:
        return crop_image_from_image(file_path, page_number, bounding_box)

 

 

Next, we write a method that passes an image to the GPT-4-Vision model and returns a description of the image.

 

 

from openai import AzureOpenAI
aoai_api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
aoai_api_key= os.getenv("AZURE_OPENAI_API_KEY")
aoai_deployment_name = 'gpt-4-vision' # your model deployment name for GPT-4V
aoai_api_version = '2024-02-15-preview' # this might change in the future

MAX_TOKENS = 2000

def understand_image_with_gptv(image_path, caption):
    """
    Generates a description for an image using the GPT-4V model.

    The Azure OpenAI endpoint, key, deployment name, and API version are read
    from the module-level variables defined above.

    Parameters:
    - image_path (str): The path to the image file.
    - caption (str): The caption for the image (may be empty).

    Returns:
    - img_description (str): The generated description for the image.
    """
    client = AzureOpenAI(
        api_key=aoai_api_key,  
        api_version=aoai_api_version,
        base_url=f"{aoai_api_base}/openai/deployments/{aoai_deployment_name}"
    )

    data_url = local_image_to_data_url(image_path)
    response = client.chat.completions.create(
                model=aoai_deployment_name,
                messages=[
                    { "role": "system", "content": "You are a helpful assistant." },
                    { "role": "user", "content": [  
                        { 
                            "type": "text", 
                            "text": f"Describe this image (note: it has image caption: {caption}):" if caption else "Describe this image:"
                        },
                        { 
                            "type": "image_url",
                            "image_url": {
                                "url": data_url
                            }
                        }
                    ] } 
                ],
                max_tokens=MAX_TOKENS
            )
    img_description = response.choices[0].message.content
    return img_description
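
The parser above calls include_figure_in_md, which is not part of the stock LangChain loader. Here is a minimal sketch of one way to implement it using the utilities we just wrote; the figure attributes it reads (figures, bounding_regions, spans, caption) come from the azure-ai-documentintelligence result model, while the temporary file naming and the comment format used to splice descriptions into the markdown are illustrative choices:

def include_figure_in_md(file_path, result):
    """Sketch: replace each detected figure with a GPT-4V description."""
    md_content = result.content
    if not result.figures:
        return md_content
    for idx, figure in enumerate(result.figures):
        caption = figure.caption.content if figure.caption else ""
        descriptions = []
        for region in figure.bounding_regions or []:
            # polygon is a flat list [x1, y1, x2, y2, ...]; for PDFs the unit is
            # inches, which crop_image_from_pdf_page converts to points (x 72).
            xs, ys = region.polygon[0::2], region.polygon[1::2]
            bbox = (min(xs), min(ys), max(xs), max(ys))
            # bounding-region page numbers are 1-based; the crop helpers are 0-based
            cropped = crop_image_from_file(file_path, region.page_number - 1, bbox)
            img_path = f"figure_{idx}.png"  # illustrative temp location
            cropped.save(img_path)
            descriptions.append(understand_image_with_gptv(img_path, caption))
        # Splice the description into the markdown in place of the figure's span
        for span in figure.spans:
            original = md_content[span.offset : span.offset + span.length]
            replacement = (
                f"\n{caption}\n<!-- FigureDescription: {' '.join(descriptions)} -->\n"
            )
            md_content = md_content.replace(original, replacement, 1)
    return md_content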

 

 

Now that the utility methods are in place, we can import the Document Intelligence loader and load the document.

 

 

from doc_intelligence import AzureAIDocumentIntelligenceLoader  # the custom loader defined above
loader = AzureAIDocumentIntelligenceLoader(file_path='sample.pdf', 
                                           api_key = os.getenv("AZURE_DOCUMENT_INTELLIGENCE_KEY"), 
                                           api_endpoint = os.getenv("AZURE_DOCUMENT_INTELLIGENCE_ENDPOINT"),
                                           api_model="prebuilt-layout",
                                           api_version="2024-02-29-preview",
                                           mode='markdown',
                                           analysis_features = [DocumentAnalysisFeature.OCR_HIGH_RESOLUTION])
docs = loader.load()
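
At this point you can sanity-check the output: in "markdown" mode the loader returns a single Document whose content is one large markdown string containing the headings, tables, and the injected figure descriptions (output shown is illustrative):

print(len(docs))  # 1 document in "markdown" mode
print(docs[0].page_content[:500])  # preview: markdown headings, tables, figure descriptions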

 

 

Semantic chunking is a powerful technique in natural language processing that breaks large pieces of text into smaller, thematically consistent segments, or “chunks.” The primary goal is to capture and preserve the inherent meaning of the text, so that each chunk holds as much semantically independent information as possible. This is critically important for language model applications such as embedding models and retrieval-augmented generation (RAG), because it helps overcome the limitations of processing long sequences of text. By ensuring that the data fed into the model is thematically and contextually coherent, semantic chunking enhances the model's ability to interpret the input and generate relevant, accurate responses.

Additionally, it improves the efficiency of information retrieval from vector databases by enabling the retrieval of highly relevant information that aligns closely with the user’s intent, thereby reducing noise and maintaining semantic integrity. In essence, semantic chunking serves as a bridge between large volumes of text data and the effective processing capabilities of advanced language models, making it a cornerstone of efficient and meaningful natural language understanding and generation.

 

Let's look at the Markdown header splitter, which splits the document based on its headers.

 

 

# Split the document into chunks based on markdown headers.
headers_to_split_on = [
    ("#", "Header 1"),
    ("##", "Header 2"),
    ("###", "Header 3"),
    ("####", "Header 4"),
    ("#####", "Header 5"),
    ("######", "Header 6"),  # markdown defines at most six header levels
]
text_splitter = MarkdownHeaderTextSplitter(headers_to_split_on=headers_to_split_on)

docs_string = docs[0].page_content
docs_result = text_splitter.split_text(docs_string)

print("Length of splits: " + str(len(docs_result)))

 

 

Let's initialize both the Azure OpenAI GPT model and the Azure OpenAI embeddings model.

 

 

from langchain_openai import AzureOpenAIEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_openai import AzureChatOpenAI
from langchain import hub
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

llm = AzureChatOpenAI(api_key=os.environ["AZURE_OPENAI_API_KEY"],
                      api_version="2023-12-01-preview",
                      azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                      # must match the name of your GPT-4-1106-Preview deployment
                      model="gpt-4-1106-preview",
                      streaming=True)

aoai_embeddings = AzureOpenAIEmbeddings(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    # must match the name of your embedding model deployment
    azure_deployment="text-embedding-ada-002",
    openai_api_version="2023-12-01-preview",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"]
)

 

 

Now let's create an index and store the embeddings in Azure AI Search.

 

 

from langchain_community.vectorstores.azuresearch import AzureSearch

# Endpoint and admin key of your Azure AI Search resource (variable names
# assumed; see the sample azure.env above)
vector_store_address = os.getenv("AZURE_SEARCH_ENDPOINT")
vector_store_password = os.getenv("AZURE_SEARCH_KEY")

index_name: str = "langchain-vector-demo"
vector_store: AzureSearch = AzureSearch(
    azure_search_endpoint=vector_store_address,
    azure_search_key=vector_store_password,
    index_name=index_name,
    embedding_function=aoai_embeddings.embed_query,
)
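
Note that the snippet above only creates the vector store client and index; the chunks still have to be added. A minimal indexing step for the chunks produced by the header splitter would be:

# Embed and upload the markdown-header chunks into the Azure AI Search index
vector_store.add_documents(documents=docs_result)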

 

 

Finally, let's create our RAG chain. Here I have used a simple retriever, but you can build a more complex retriever as well, one that retrieves both images and text.

 

 

from operator import itemgetter
from langchain_core.runnables import RunnableMap

# A standard RAG prompt pulled from the LangChain hub (the `hub` import above
# suggests this; the exact prompt name is an assumption)
prompt = hub.pull("rlm/rag-prompt")

def format_docs(docs):
    return "\n\n".join(doc.page_content for doc in docs)

# Retrieve from the Azure AI Search vector store created above
retriever_base = vector_store.as_retriever(search_type="similarity", search_kwargs={"k": 5})

rag_chain_from_docs = (
    {
        "context": lambda input: format_docs(input["documents"]),
        "question": itemgetter("question"),
    }
    | prompt
    | llm
    | StrOutputParser()
)
rag_chain_with_source = RunnableMap(
    {"documents": retriever_base, "question": RunnablePassthrough()}
) | {
    "documents": lambda input: [doc.metadata for doc in input["documents"]],
    "answer": rag_chain_from_docs,
}
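
Invoking the chain with a plain question string returns both the generated answer and the metadata of the retrieved chunks, which you can surface as citations (the question and output handling here are illustrative):

response = rag_chain_with_source.invoke("What trend does the revenue plot show?")
print(response["answer"])  # answer grounded in the retrieved chunks
print(response["documents"])  # metadata of the retrieved chunks, for citations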

 

 

 

Now let's put this into action. Let's take a sample PDF page that contains a plot and ask a question about it.

Here I ask a question about the plot on this page. As you can see, I get the correct response, along with citations.

 

 

Hope you liked the blog. Please like and follow if you'd like to read more blogs like this, coming soon.

 

Updated Apr 19, 2024
Version 2.0


"}},"componentScriptGroups({\"componentId\":\"custom.widget.Social_Sharing\"})":{"__typename":"ComponentScriptGroups","scriptGroups":{"__typename":"ComponentScriptGroupsDefinition","afterInteractive":{"__typename":"PageScriptGroupDefinition","group":"AFTER_INTERACTIVE","scriptIds":[]},"lazyOnLoad":{"__typename":"PageScriptGroupDefinition","group":"LAZY_ON_LOAD","scriptIds":[]}},"componentScripts":[]},"component({\"componentId\":\"custom.widget.MicrosoftFooter\"})":{"__typename":"Component","render({\"context\":{\"component\":{\"entities\":[],\"props\":{}},\"page\":{\"entities\":[\"board:Azure-AI-Services-blog\",\"message:4118184\"],\"name\":\"BlogMessagePage\",\"props\":{},\"url\":\"https://techcommunity.microsoft.com/blog/azure-ai-services-blog/build-intelligent-rag-for-multimodality-and-complex-document-structure/4118184\"}}})":{"__typename":"ComponentRenderResult","html":""}},"componentScriptGroups({\"componentId\":\"custom.widget.MicrosoftFooter\"})":{"__typename":"ComponentScriptGroups","scriptGroups":{"__typename":"ComponentScriptGroupsDefinition","afterInteractive":{"__typename":"PageScriptGroupDefinition","group":"AFTER_INTERACTIVE","scriptIds":[]},"lazyOnLoad":{"__typename":"PageScriptGroupDefinition","group":"LAZY_ON_LOAD","scriptIds":[]}},"componentScripts":[]},"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/community/NavbarDropdownToggle\"]})":[{"__ref":"CachedAsset:text:en_US-components/community/NavbarDropdownToggle-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/common/QueryHandler\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/common/QueryHandler-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageCoverImage\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageCoverImage-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/nodes/NodeTitle\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/nodes/NodeTitle-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageTimeToRead\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageTimeToRead-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageSubject\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageSubject-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/users/UserLink\"]})":[{"__ref":"CachedAsset:text:en_US-components/users/UserLink-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/users/UserRank\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/users/UserRank-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageTime\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageTime-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageBody\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageBody-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":
[\"components/messages/MessageCustomFields\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageCustomFields-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageRevision\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageRevision-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageReplyButton\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageReplyButton-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/messages/MessageAuthorBio\"]})":[{"__ref":"CachedAsset:text:en_US-components/messages/MessageAuthorBio-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/users/UserAvatar\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/users/UserAvatar-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/ranks/UserRankLabel\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/ranks/UserRankLabel-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/users/UserRegistrationDate\"]})":[{"__ref":"CachedAsset:text:en_US-components/users/UserRegistrationDate-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/nodes/NodeAvatar\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/nodes/NodeAvatar-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/nodes/NodeDescription\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/nodes/NodeDescription-1737571274000"}],"message({\"id\":\"message:4273867\"})":{"__ref":"BlogReplyMessage:message:4273867"},"message({\"id\":\"message:4160306\"})":{"__ref":"BlogReplyMessage:message:4160306"},"message({\"id\":\"message:4138216\"})":{"__ref":"BlogReplyMessage:message:4138216"},"message({\"id\":\"message:4137862\"})":{"__ref":"BlogReplyMessage:message:4137862"},"message({\"id\":\"message:4133480\"})":{"__ref":"BlogReplyMessage:message:4133480"},"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"components/tags/TagView/TagViewChip\"]})":[{"__ref":"CachedAsset:text:en_US-components/tags/TagView/TagViewChip-1737571274000"}],"cachedText({\"lastModified\":\"1737571274000\",\"locale\":\"en-US\",\"namespaces\":[\"shared/client/components/nodes/NodeIcon\"]})":[{"__ref":"CachedAsset:text:en_US-shared/client/components/nodes/NodeIcon-1737571274000"}]},"CachedAsset:pages-1742488897953":{"__typename":"CachedAsset","id":"pages-1742488897953","value":[{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"BlogViewAllPostsPage","type":"BLOG","urlPath":"/category/:categoryId/blog/:boardId/all-posts/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"CasePortalPage","type":"CASE_PORTAL","urlPath":"/caseportal","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"CreateGroupHubPage","type":"GROUP_HUB","urlPath":"/groups/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":174
2488897953,"localOverride":null,"page":{"id":"CaseViewPage","type":"CASE_DETAILS","urlPath":"/case/:caseId/:caseNumber","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"InboxPage","type":"COMMUNITY","urlPath":"/inbox","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"HelpFAQPage","type":"COMMUNITY","urlPath":"/help","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"IdeaMessagePage","type":"IDEA_POST","urlPath":"/idea/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"IdeaViewAllIdeasPage","type":"IDEA","urlPath":"/category/:categoryId/ideas/:boardId/all-ideas/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"LoginPage","type":"USER","urlPath":"/signin","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"BlogPostPage","type":"BLOG","urlPath":"/category/:categoryId/blogs/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"UserBlogPermissions.Page","type":"COMMUNITY","urlPath":"/c/user-blog-permissions/page","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ThemeEditorPage","type":"COMMUNITY","urlPath":"/designer/themes","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TkbViewAllArticlesPage","type":"TKB","urlPath":"/category/:categoryId/kb/:boardId/all-articles/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1730819800000,"localOverride":null,"page":{"id":"AllEvents","type":"CUSTOM","urlPath":"/Events","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"OccasionEditPage","type":"EVENT","urlPath":"/event/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"OAuthAuthorizationAllowPage","type":"USER","urlPath":"/auth/authorize/allow","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"PageEditorPage","type":"COMMUNITY","urlPath":"/designer/pages","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"PostPage","type":"COMMUNITY","urlPath":"/category/:categoryId/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ForumBoardPage","type":"FORUM","urlPath":"/category/:categoryId/discussions/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TkbBoardPage","type":"TKB","urlPath":"/category/:categoryId/kb/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"EventPostPage","type":"EVENT","urlPath":
"/category/:categoryId/events/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"UserBadgesPage","type":"COMMUNITY","urlPath":"/users/:login/:userId/badges","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"GroupHubMembershipAction","type":"GROUP_HUB","urlPath":"/membership/join/:nodeId/:membershipType","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"MaintenancePage","type":"COMMUNITY","urlPath":"/maintenance","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"IdeaReplyPage","type":"IDEA_REPLY","urlPath":"/idea/:boardId/:messageSubject/:messageId/comments/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"UserSettingsPage","type":"USER","urlPath":"/mysettings/:userSettingsTab","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"GroupHubsPage","type":"GROUP_HUB","urlPath":"/groups","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ForumPostPage","type":"FORUM","urlPath":"/category/:categoryId/discussions/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"OccasionRsvpActionPage","type":"OCCASION","urlPath":"/event/:boardId/:messageSubject/:messageId/rsvp/:responseType","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"VerifyUserEmailPage","type":"USER","urlPath":"/verifyemail/:userId/:verifyEmailToken","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"AllOccasionsPage","type":"OCCASION","urlPath":"/category/:categoryId/events/:boardId/all-events/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"EventBoardPage","type":"EVENT","urlPath":"/category/:categoryId/events/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TkbReplyPage","type":"TKB_REPLY","urlPath":"/kb/:boardId/:messageSubject/:messageId/comments/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"IdeaBoardPage","type":"IDEA","urlPath":"/category/:categoryId/ideas/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"CommunityGuideLinesPage","type":"COMMUNITY","urlPath":"/communityguidelines","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"CaseCreatePage","type":"SALESFORCE_CASE_CREATION","urlPath":"/caseportal/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TkbEditPage","type":"TKB","urlPath":"/kb/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime
":1742488897953,"localOverride":null,"page":{"id":"ForgotPasswordPage","type":"USER","urlPath":"/forgotpassword","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"IdeaEditPage","type":"IDEA","urlPath":"/idea/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TagPage","type":"COMMUNITY","urlPath":"/tag/:tagName","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"BlogBoardPage","type":"BLOG","urlPath":"/category/:categoryId/blog/:boardId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"OccasionMessagePage","type":"OCCASION_TOPIC","urlPath":"/event/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ManageContentPage","type":"COMMUNITY","urlPath":"/managecontent","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ClosedMembershipNodeNonMembersPage","type":"GROUP_HUB","urlPath":"/closedgroup/:groupHubId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"CommunityPage","type":"COMMUNITY","urlPath":"/","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ForumMessagePage","type":"FORUM_TOPIC","urlPath":"/discussions/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"IdeaPostPage","type":"IDEA","urlPath":"/category/:categoryId/ideas/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1730819800000,"localOverride":null,"page":{"id":"CommunityHub.Page","type":"CUSTOM","urlPath":"/Directory","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"BlogMessagePage","type":"BLOG_ARTICLE","urlPath":"/blog/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"RegistrationPage","type":"USER","urlPath":"/register","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"EditGroupHubPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ForumEditPage","type":"FORUM","urlPath":"/discussions/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ResetPasswordPage","type":"USER","urlPath":"/resetpassword/:userId/:resetPasswordToken","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1730819800000,"localOverride":null,"page":{"id":"AllBlogs.Page","type":"CUSTOM","urlPath":"/blogs","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TkbMessagePage","type":"TKB_ARTICLE","urlPath":
"/kb/:boardId/:messageSubject/:messageId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"BlogEditPage","type":"BLOG","urlPath":"/blog/:boardId/:messageSubject/:messageId/edit","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ManageUsersPage","type":"USER","urlPath":"/users/manage/:tab?/:manageUsersTab?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ForumReplyPage","type":"FORUM_REPLY","urlPath":"/discussions/:boardId/:messageSubject/:messageId/replies/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"PrivacyPolicyPage","type":"COMMUNITY","urlPath":"/privacypolicy","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"NotificationPage","type":"COMMUNITY","urlPath":"/notifications","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"UserPage","type":"USER","urlPath":"/users/:login/:userId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"OccasionReplyPage","type":"OCCASION_REPLY","urlPath":"/event/:boardId/:messageSubject/:messageId/comments/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ManageMembersPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId/manage/:tab?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"SearchResultsPage","type":"COMMUNITY","urlPath":"/search","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"BlogReplyPage","type":"BLOG_REPLY","urlPath":"/blog/:boardId/:messageSubject/:messageId/replies/:replyId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"GroupHubPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TermsOfServicePage","type":"COMMUNITY","urlPath":"/termsofservice","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"CategoryPage","type":"CATEGORY","urlPath":"/category/:categoryId","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"ForumViewAllTopicsPage","type":"FORUM","urlPath":"/category/:categoryId/discussions/:boardId/all-topics/(/:after|/:before)?","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"TkbPostPage","type":"TKB","urlPath":"/category/:categoryId/kbs/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"},{"lastUpdatedTime":1742488897953,"localOverride":null,"page":{"id":"GroupHubPostPage","type":"GROUP_HUB","urlPath":"/group/:groupHubId/:boardId/create","__typename":"PageDescriptor"},"__typename":"PageResource"}],"localOverride":false},"CachedAsset:text:en_US-components/context/AppContext/AppContext
Provider-0":{"__typename":"CachedAsset","id":"text:en_US-components/context/AppContext/AppContextProvider-0","value":{"noCommunity":"Cannot find community","noUser":"Cannot find current user","noNode":"Cannot find node with id {nodeId}","noMessage":"Cannot find message with id {messageId}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/common/Loading/LoadingDot-0":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/common/Loading/LoadingDot-0","value":{"title":"Loading..."},"localOverride":false},"User:user:-1":{"__typename":"User","id":"user:-1","uid":-1,"login":"Deleted","email":"","avatar":null,"rank":null,"kudosWeight":1,"registrationData":{"__typename":"RegistrationData","status":"ANONYMOUS","registrationTime":null,"confirmEmailStatus":false,"registrationAccessLevel":"VIEW","ssoRegistrationFields":[]},"ssoId":null,"profileSettings":{"__typename":"ProfileSettings","dateDisplayStyle":{"__typename":"InheritableStringSettingWithPossibleValues","key":"layout.friendly_dates_enabled","value":"false","localValue":"true","possibleValues":["true","false"]},"dateDisplayFormat":{"__typename":"InheritableStringSetting","key":"layout.format_pattern_date","value":"MMM dd yyyy","localValue":"MM-dd-yyyy"},"language":{"__typename":"InheritableStringSettingWithPossibleValues","key":"profile.language","value":"en-US","localValue":"en","possibleValues":["en-US"]}},"deleted":false},"Theme:customTheme1":{"__typename":"Theme","id":"customTheme1"},"Category:category:AI":{"__typename":"Category","id":"category:AI","entityType":"CATEGORY","displayId":"AI","nodeType":"category","depth":3,"title":"Artificial Intelligence and Machine Learning","shortTitle":"Artificial Intelligence and Machine Learning","parent":{"__ref":"Category:category:solutions"},"categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:top":{"__typename":"Category","id":"category:top","displayId":"top","nodeType":"category","depth":0,"title":"Top","entityType":"CATEGORY","shortTitle":"Top"},"Category:category:communities":{"__typename":"Category","id":"category:communities","displayId":"communities","nodeType":"category","depth":1,"parent":{"__ref":"Category:category:top"},"title":"Communities","entityType":"CATEGORY","shortTitle":"Communities"},"Category:category:solutions":{"__typename":"Category","id":"category:solutions","displayId":"solutions","nodeType":"category","depth":2,"parent":{"__ref":"Category:category:communities"},"title":"Topics","entityType":"CATEGORY","shortTitle":"Topics"},"Blog:board:Azure-AI-Services-blog":{"__typename":"Blog","id":"board:Azure-AI-Services-blog","entityType":"BLOG","displayId":"Azure-AI-Services-blog","nodeType":"board","depth":4,"conversationStyle":"BLOG","title":"AI - Azure AI services 
Blog","description":"","avatar":null,"profileSettings":{"__typename":"ProfileSettings","language":null},"parent":{"__ref":"Category:category:AI"},"ancestors":{"__typename":"CoreNodeConnection","edges":[{"__typename":"CoreNodeEdge","node":{"__ref":"Community:community:gxcuf89792"}},{"__typename":"CoreNodeEdge","node":{"__ref":"Category:category:communities"}},{"__typename":"CoreNodeEdge","node":{"__ref":"Category:category:solutions"}},{"__typename":"CoreNodeEdge","node":{"__ref":"Category:category:AI"}}]},"userContext":{"__typename":"NodeUserContext","canAddAttachments":false,"canUpdateNode":false,"canPostMessages":false,"isSubscribed":false},"boardPolicies":{"__typename":"BoardPolicies","canPublishArticleOnCreate":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.forums.policy_can_publish_on_create_workflow_action.accessDenied","key":"error.lithium.policies.forums.policy_can_publish_on_create_workflow_action.accessDenied","args":[]}}},"shortTitle":"AI - Azure AI services Blog","repliesProperties":{"__typename":"RepliesProperties","sortOrder":"REVERSE_PUBLISH_TIME","repliesFormat":"threaded"},"eventPath":"category:AI/category:solutions/category:communities/community:gxcuf89792board:Azure-AI-Services-blog/","tagProperties":{"__typename":"TagNodeProperties","tagsEnabled":{"__typename":"PolicyResult","failureReason":null}},"requireTags":true,"tagType":"PRESET_ONLY"},"AssociatedImage:{\"url\":\"https://techcommunity.microsoft.com/t5/s/gxcuf89792/images/cmstNC05WEo0blc\"}":{"__typename":"AssociatedImage","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/images/cmstNC05WEo0blc","height":512,"width":512,"mimeType":"image/png"},"Rank:rank:4":{"__typename":"Rank","id":"rank:4","position":6,"name":"Microsoft","color":"333333","icon":{"__ref":"AssociatedImage:{\"url\":\"https://techcommunity.microsoft.com/t5/s/gxcuf89792/images/cmstNC05WEo0blc\"}"},"rankStyle":"OUTLINE"},"User:user:2080373":{"__typename":"User","id":"user:2080373","uid":2080373,"login":"mrajguru","deleted":false,"avatar":{"__typename":"UserAvatar","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/images/dS0yMDgwMzczLTU2MzI2Nmk2MDUwNkNDRUUxMDhGQjYx"},"rank":{"__ref":"Rank:rank:4"},"email":"","messagesCount":27,"biography":null,"topicsCount":16,"kudosReceivedCount":54,"kudosGivenCount":5,"kudosWeight":1,"registrationData":{"__typename":"RegistrationData","status":null,"registrationTime":"2023-10-12T23:52:15.266-07:00","confirmEmailStatus":null},"followersCount":null,"solutionsCount":0,"entityType":"USER","eventPath":"community:gxcuf89792/user:2080373"},"BlogTopicMessage:message:4118184":{"__typename":"BlogTopicMessage","uid":4118184,"subject":"Build Intelligent RAG For Multimodality and Complex Document 
Structure","id":"message:4118184","revisionNum":2,"repliesCount":5,"author":{"__ref":"User:user:2080373"},"depth":0,"hasGivenKudo":false,"board":{"__ref":"Blog:board:Azure-AI-Services-blog"},"conversation":{"__ref":"Conversation:conversation:4118184"},"messagePolicies":{"__typename":"MessagePolicies","canPublishArticleOnEdit":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.forums.policy_can_publish_on_edit_workflow_action.accessDenied","key":"error.lithium.policies.forums.policy_can_publish_on_edit_workflow_action.accessDenied","args":[]}},"canModerateSpamMessage":{"__typename":"PolicyResult","failureReason":{"__typename":"FailureReason","message":"error.lithium.policies.feature.moderation_spam.action.moderate_entity.allowed.accessDenied","key":"error.lithium.policies.feature.moderation_spam.action.moderate_entity.allowed.accessDenied","args":[]}}},"contentWorkflow":{"__typename":"ContentWorkflow","state":"PUBLISH","scheduledPublishTime":null,"scheduledTimezone":null,"userContext":{"__typename":"MessageWorkflowContext","canSubmitForReview":null,"canEdit":false,"canRecall":null,"canSubmitForPublication":null,"canReturnToAuthor":null,"canPublish":null,"canReturnToReview":null,"canSchedule":false},"shortScheduledTimezone":null},"readOnly":false,"editFrozen":false,"moderationData":{"__ref":"ModerationData:moderation_data:4118184"},"teaser":"

Struggling with implementing RAG for Document with multiples Tables, Figures , Plots with complex structured scan documents, Looking for a solution?

\n

 

\n

In this blog you will learn implementing end to end solution using Azure AI services including Document Intelligence, AI Search, Azure Open AI and our favorite LangChain 🙂

\n

 

\n

I have taken the advantage of Multimodal capability of Azure OpenAI GPT-4-Vision model for Figures and Plots.

","body":"

The advent of Retrieval-Augmented Generation (RAG) models has been a significant milestone in the field of Natural Language Processing (NLP). These models combine the power of information retrieval with generative language models to produce answers that are not just accurate but also contextually enriched. However, as the digital universe expands beyond textual data, incorporating image understanding and hierarchical document structure analysis into RAG systems is becoming increasingly crucial. This article explores how these two elements can significantly enhance the capabilities of RAG models.

\n

 

\n

Understanding RAG Models

\n

Before diving into the nuances of image understanding and document analysis, let’s briefly touch upon the essence of RAG models. These systems work by first retrieving relevant documents from a vast corpus and then using a generative model to synthesize information into a coherent response. The retrieval component ensures that the model has access to accurate and up-to-date information, while the generative component allows for the creation of human-like text.

\n

 

\n

Image Understanding and Structure Analysis

\n

The Challenge

\n

One of the most significant limitations of traditional RAG models is their inability to understand and interpret visual data. In a world where images accompany textual information ubiquitously, this represents a substantial gap in the model’s comprehension abilities. Documents are not just strings of text; they have structure — sections, subsections, paragraphs, and lists — all of which convey semantic importance. Traditional RAG models often overlook this hierarchical structure, potentially missing out on understanding the document’s full meaning.

\n

The Solution

\n

To bridge this gap, RAG models can be augmented with Computer Vision (CV) capabilities. This involves integrating image recognition and understanding modules that can analyze visual data, extract relevant information, and convert it into a textual format that the RAG model can process. Incorporating hierarchical document structure analysis involves teaching RAG models to recognize and interpret the underlying structure of documents.

\n

 

\n

\n

 

\n

Implementation

\n\n

 

\n

In this blog we will look at how do we implement this using Azure Document Intelligence, LangChain and Azure OpenAI.

\n

Prerequisites

\n

Before we implement this we will require some prerequisites

\n\n

Once we have the above information , lets get started !!

\n

Let’s import the required libraries.

\n

 

\n

 

\n
import os\nfrom dotenv import load_dotenv\nload_dotenv('azure.env')\n\nfrom langchain import hub\nfrom langchain_openai import AzureChatOpenAI\n#from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader\nfrom doc_intelligence import AzureAIDocumentIntelligenceLoader\nfrom langchain_openai import AzureOpenAIEmbeddings\nfrom langchain.schema import StrOutputParser\nfrom langchain.schema.runnable import RunnablePassthrough\nfrom langchain.text_splitter import MarkdownHeaderTextSplitter\nfrom langchain.vectorstores.azuresearch import AzureSearch\nfrom azure.ai.documentintelligence.models import DocumentAnalysisFeature
\n

 

\n

 

\n

Now we are going to write some custom function on top of LangChain Document Loader which can help us Load the PDF document. First thing we do is using Azure Document Intelligence which has the this beautiful feature of converting Image to Markdown format. Lets use the same.

\n

 

\n

 

\n
import logging
from typing import Any, Iterator, List, Optional
import os
from langchain_core.documents import Document
from langchain_community.document_loaders.base import BaseLoader
from langchain_community.document_loaders.base import BaseBlobParser
from langchain_community.document_loaders.blob_loaders import Blob

logger = logging.getLogger(__name__)

class AzureAIDocumentIntelligenceLoader(BaseLoader):
    """Loads a PDF with Azure Document Intelligence"""

    def __init__(
        self,
        api_endpoint: str,
        api_key: str,
        file_path: Optional[str] = None,
        url_path: Optional[str] = None,
        api_version: Optional[str] = None,
        api_model: str = "prebuilt-layout",
        mode: str = "markdown",
        *,
        analysis_features: Optional[List[str]] = None,
    ) -> None:
        """
        Initialize the object for file processing with Azure Document Intelligence
        (formerly Form Recognizer).

        This constructor initializes an AzureAIDocumentIntelligenceParser object to be
        used for parsing files using the Azure Document Intelligence API. The load
        method generates Documents whose content representations are determined by the
        mode parameter.

        Parameters:
        -----------
        api_endpoint: str
            The API endpoint to use for DocumentIntelligenceClient construction.
        api_key: str
            The API key to use for DocumentIntelligenceClient construction.
        file_path : Optional[str]
            The path to the file that needs to be loaded.
            Either file_path or url_path must be specified.
        url_path : Optional[str]
            The URL to the file that needs to be loaded.
            Either file_path or url_path must be specified.
        api_version: Optional[str]
            The API version for DocumentIntelligenceClient. Setting None to use
            the default value from `azure-ai-documentintelligence` package.
        api_model: str
            Unique document model name. Default value is "prebuilt-layout".
            Note that overriding this default value may result in unsupported
            behavior.
        mode: Optional[str]
            The type of content representation of the generated Documents.
            Use either "single", "page", or "markdown". Default value is "markdown".
        analysis_features: Optional[List[str]]
            List of optional analysis features, each feature should be passed
            as a str that conforms to the enum `DocumentAnalysisFeature` in
            `azure-ai-documentintelligence` package. Default value is None.

        Examples:
        ---------
        >>> obj = AzureAIDocumentIntelligenceLoader(
        ...     file_path="path/to/file",
        ...     api_endpoint="https://endpoint.azure.com",
        ...     api_key="APIKEY",
        ...     api_version="2023-10-31-preview",
        ...     api_model="prebuilt-layout",
        ...     mode="markdown"
        ... )
        """

        assert (
            file_path is not None or url_path is not None
        ), "file_path or url_path must be provided"
        self.file_path = file_path
        self.url_path = url_path

        self.parser = AzureAIDocumentIntelligenceParser(
            api_endpoint=api_endpoint,
            api_key=api_key,
            api_version=api_version,
            api_model=api_model,
            mode=mode,
            analysis_features=analysis_features,
        )

    def lazy_load(
        self,
    ) -> Iterator[Document]:
        """Lazy load given path as pages."""
        if self.file_path is not None:
            yield from self.parser.parse(self.file_path)
        else:
            yield from self.parser.parse_url(self.url_path)
Now let's define the document parser that the loader calls internally. This parser is intended to load and parse PDF files using Azure's Document Intelligence service (formerly known as Azure Form Recognizer). The service uses machine learning models to extract text, key-value pairs, and tables from documents.
lazy_parse: a method that lazily parses a given file, meaning it starts processing the file and yields results as they become available rather than waiting for the entire file to be processed.
class AzureAIDocumentIntelligenceParser(BaseBlobParser):
    """Loads a PDF with Azure Document Intelligence
    (formerly Forms Recognizer)."""

    def __init__(
        self,
        api_endpoint: str,
        api_key: str,
        api_version: Optional[str] = None,
        api_model: str = "prebuilt-layout",
        mode: str = "markdown",
        analysis_features: Optional[List[str]] = None,
    ):
        from azure.ai.documentintelligence import DocumentIntelligenceClient
        from azure.ai.documentintelligence.models import DocumentAnalysisFeature
        from azure.core.credentials import AzureKeyCredential

        kwargs = {}
        if api_version is not None:
            kwargs["api_version"] = api_version

        if analysis_features is not None:
            _SUPPORTED_FEATURES = [
                DocumentAnalysisFeature.OCR_HIGH_RESOLUTION,
            ]

            analysis_features = [
                DocumentAnalysisFeature(feature) for feature in analysis_features
            ]
            if any(
                [feature not in _SUPPORTED_FEATURES for feature in analysis_features]
            ):
                logger.warning(
                    f"The current supported features are: "
                    f"{[f.value for f in _SUPPORTED_FEATURES]}. "
                    "Using other features may result in unexpected behavior."
                )

        self.client = DocumentIntelligenceClient(
            endpoint=api_endpoint,
            credential=AzureKeyCredential(api_key),
            headers={"x-ms-useragent": "langchain-parser/1.0.0"},
            features=analysis_features,
            **kwargs,
        )
        self.api_model = api_model
        self.mode = mode
        assert self.mode in ["single", "page", "markdown"]

    def _generate_docs_page(self, result: Any) -> Iterator[Document]:
        for p in result.pages:
            content = " ".join([line.content for line in p.lines])

            d = Document(
                page_content=content,
                metadata={
                    "page": p.page_number,
                },
            )
            yield d

    def _generate_docs_single(self, file_path: str, result: Any) -> Iterator[Document]:
        # Replace each detected figure with a GPT-4V description before
        # returning the whole document as a single markdown string.
        md_content = include_figure_in_md(file_path, result)
        yield Document(page_content=md_content, metadata={})

    def lazy_parse(self, file_path: str) -> Iterator[Document]:
        """Lazily parse the blob."""
        blob = Blob.from_path(file_path)
        with blob.as_bytes_io() as file_obj:
            poller = self.client.begin_analyze_document(
                self.api_model,
                file_obj,
                content_type="application/octet-stream",
                output_content_format="markdown" if self.mode == "markdown" else "text",
            )
            result = poller.result()

            if self.mode in ["single", "markdown"]:
                yield from self._generate_docs_single(file_path, result)
            elif self.mode in ["page"]:
                yield from self._generate_docs_page(result)
            else:
                raise ValueError(f"Invalid mode: {self.mode}")

    def parse_url(self, url: str) -> Iterator[Document]:
        from azure.ai.documentintelligence.models import AnalyzeDocumentRequest

        poller = self.client.begin_analyze_document(
            self.api_model,
            AnalyzeDocumentRequest(url_source=url),
            # content_type="application/octet-stream",
            output_content_format="markdown" if self.mode == "markdown" else "text",
        )
        result = poller.result()

        if self.mode in ["single", "markdown"]:
            # Note: figure cropping in include_figure_in_md assumes a local
            # file, so URL sources should be downloaded first if figure
            # descriptions are required.
            yield from self._generate_docs_single(url, result)
        elif self.mode in ["page"]:
            yield from self._generate_docs_page(result)
        else:
            raise ValueError(f"Invalid mode: {self.mode}")
If you look at this LangChain document parser, you will see that I have included a method called include_figure_in_md. This method goes through the markdown content, looks for all figures, and replaces each figure with a description of it.
Before that, let's write some utility methods that can crop an image out of a PDF or image document.
from PIL import Image
import fitz  # PyMuPDF
import mimetypes

import base64
from mimetypes import guess_type

# Function to encode a local image into a data URL
def local_image_to_data_url(image_path):
    # Guess the MIME type of the image based on the file extension
    mime_type, _ = guess_type(image_path)
    if mime_type is None:
        mime_type = 'application/octet-stream'  # Default MIME type if none is found

    # Read and encode the image file
    with open(image_path, "rb") as image_file:
        base64_encoded_data = base64.b64encode(image_file.read()).decode('utf-8')

    # Construct the data URL
    return f"data:{mime_type};base64,{base64_encoded_data}"

def crop_image_from_image(image_path, page_number, bounding_box):
    """
    Crops an image based on a bounding box.

    :param image_path: Path to the image file.
    :param page_number: The page number of the image to crop (for TIFF format).
    :param bounding_box: A tuple of (left, upper, right, lower) coordinates for the bounding box.
    :return: A cropped image.
    :rtype: PIL.Image.Image
    """
    with Image.open(image_path) as img:
        if img.format == "TIFF":
            # Seek to the requested page of the multi-page TIFF
            img.seek(page_number)
            img = img.copy()

        # The bounding box is expected to be in the format (left, upper, right, lower).
        cropped_image = img.crop(bounding_box)
        return cropped_image

def crop_image_from_pdf_page(pdf_path, page_number, bounding_box):
    """
    Crops a region from a given page in a PDF and returns it as an image.

    :param pdf_path: Path to the PDF file.
    :param page_number: The page number to crop from (0-indexed).
    :param bounding_box: A tuple of (x0, y0, x1, y1) coordinates for the bounding box, in inches.
    :return: A PIL Image of the cropped area.
    """
    doc = fitz.open(pdf_path)
    page = doc.load_page(page_number)

    # Document Intelligence returns PDF bounding regions in inches; PDF points
    # are 1/72 inch, so scale the coordinates by 72 to build the clip rect.
    bbx = [x * 72 for x in bounding_box]
    rect = fitz.Rect(bbx)
    # Render the clipped region at 300 DPI for a sharper crop.
    pix = page.get_pixmap(matrix=fitz.Matrix(300/72, 300/72), clip=rect)

    img = Image.frombytes("RGB", [pix.width, pix.height], pix.samples)

    doc.close()

    return img

def crop_image_from_file(file_path, page_number, bounding_box):
    """
    Crop an image from a file.

    Args:
        file_path (str): The path to the file.
        page_number (int): The page number (for PDF and TIFF files, 0-indexed).
        bounding_box (tuple): The bounding box coordinates in the format (x0, y0, x1, y1).

    Returns:
        A PIL Image of the cropped area.
    """
    mime_type = mimetypes.guess_type(file_path)[0]

    if mime_type == "application/pdf":
        return crop_image_from_pdf_page(file_path, page_number, bounding_box)
    else:
        return crop_image_from_image(file_path, page_number, bounding_box)
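
As a quick illustrative usage (the file name and coordinates here are hypothetical), cropping the region between (1, 2) and (5, 6) inches on the first page of a PDF looks like this:

# Hypothetical usage: crop a figure region from the first page of sample.pdf.
figure_img = crop_image_from_file("sample.pdf", 0, (1.0, 2.0, 5.0, 6.0))
figure_img.save("figure_preview.png")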
Next, we write a method that passes an image to the GPT-4-Vision model and returns a description of the image.
from openai import AzureOpenAI

aoai_api_base = os.getenv("AZURE_OPENAI_ENDPOINT")
aoai_api_key = os.getenv("AZURE_OPENAI_API_KEY")
aoai_deployment_name = 'gpt-4-vision'  # your model deployment name for GPT-4V
aoai_api_version = '2024-02-15-preview'  # this might change in the future

MAX_TOKENS = 2000

def understand_image_with_gptv(image_path, caption):
    """
    Generates a description for an image using the GPT-4V model.

    Parameters:
    - image_path (str): The path to the image file.
    - caption (str): The caption for the image (may be empty).

    Returns:
    - img_description (str): The generated description for the image.
    """
    client = AzureOpenAI(
        api_key=aoai_api_key,
        api_version=aoai_api_version,
        base_url=f"{aoai_api_base}/openai/deployments/{aoai_deployment_name}"
    )

    data_url = local_image_to_data_url(image_path)
    response = client.chat.completions.create(
        model=aoai_deployment_name,
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": [
                {
                    "type": "text",
                    "text": f"Describe this image (note: it has image caption: {caption}):" if caption else "Describe this image:"
                },
                {
                    "type": "image_url",
                    "image_url": {
                        "url": data_url
                    }
                }
            ]}
        ],
        max_tokens=MAX_TOKENS  # cap the length of the generated description
    )
    img_description = response.choices[0].message.content
    return img_description
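
The parser above delegates figure handling to include_figure_in_md, which we still need to write. Below is a minimal sketch that ties the utilities together: it crops each figure that Document Intelligence detected, asks GPT-4-Vision to describe it, and splices the description back into the markdown. The field names (figures, bounding_regions, polygon, spans) follow the azure-ai-documentintelligence result model, but treat the span-splicing details as an assumption you may need to adapt to your documents.

def include_figure_in_md(file_path, result):
    """
    Replace each figure detected by Document Intelligence with a GPT-4V
    generated description, spliced into the markdown content.

    A simplified sketch: it assumes figure.spans are offsets into
    result.content and that bounding-region polygons are in inches, as
    returned by the prebuilt-layout model for PDFs.
    """
    md_content = result.content
    if not result.figures:
        return md_content

    # Walk figures from last to first so earlier span offsets stay valid
    # while replacement text is spliced into md_content.
    for idx in range(len(result.figures) - 1, -1, -1):
        figure = result.figures[idx]
        caption = figure.caption.content if figure.caption else ""

        # Crop each region of the figure and accumulate its description.
        img_description = ""
        for region in figure.bounding_regions or []:
            # The polygon is a flat list of corner coordinates; the first and
            # third corners give the (left, top, right, bottom) crop box.
            bbox = (region.polygon[0], region.polygon[1],
                    region.polygon[4], region.polygon[5])
            cropped = crop_image_from_file(file_path, region.page_number - 1, bbox)
            image_path = f"figure_{idx}.png"  # temporary file handed to GPT-4V
            cropped.save(image_path)
            img_description += understand_image_with_gptv(image_path, caption)

        # Replace the figure's span in the markdown with its description.
        span = figure.spans[0]
        replacement = f"\n<!-- FigureContent: {img_description} -->\n"
        md_content = (
            md_content[: span.offset]
            + replacement
            + md_content[span.offset + span.length :]
        )
    return md_content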
Now that the utility methods are in place, we can import the Document Intelligence loader and load the document.
# Use the custom loader defined above (saved in doc_intelligence.py) so that
# figures are replaced with GPT-4V descriptions while loading.
from doc_intelligence import AzureAIDocumentIntelligenceLoader

loader = AzureAIDocumentIntelligenceLoader(file_path='sample.pdf',
                                           api_key=os.getenv("AZURE_DOCUMENT_INTELLIGENCE_KEY"),
                                           api_endpoint=os.getenv("AZURE_DOCUMENT_INTELLIGENCE_ENDPOINT"),
                                           api_model="prebuilt-layout",
                                           api_version="2024-02-29-preview",
                                           mode='markdown',
                                           analysis_features=[DocumentAnalysisFeature.OCR_HIGH_RESOLUTION])
docs = loader.load()
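
Before chunking, it is worth a quick sanity check that the markdown came through with the figure descriptions spliced in:

# In markdown mode the loader yields a single Document; preview its content.
print(len(docs))
print(docs[0].page_content[:500])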
Semantic chunking is a powerful technique in natural language processing that breaks large pieces of text into smaller, thematically consistent segments or “chunks”. The primary goal of semantic chunking is to capture and preserve the inherent meaning within the text, so that each chunk contains as much semantically independent information as possible. This is critically important for language model applications such as embedding models and retrieval-augmented generation (RAG), because it helps overcome the limitations of processing long sequences of text. By ensuring that the data fed into large language models (LLMs) is thematically and contextually coherent, semantic chunking enhances the model’s ability to interpret and generate relevant and accurate responses.
Additionally, it improves the efficiency of information retrieval from vector databases by enabling the retrieval of highly relevant information that aligns closely with the user’s intent, thereby reducing noise and maintaining semantic integrity. In essence, semantic chunking serves as a bridge between large volumes of text data and the effective processing capabilities of advanced language models, making it a cornerstone of efficient and meaningful natural language understanding and generation.
Let's look at the Markdown Header Splitter, which splits the document based on its headers.
# Split the document into chunks based on markdown headers.
# Markdown defines six heading levels, so "#" through "######" cover them all.
headers_to_split_on = [
    ("#", "Header 1"),
    ("##", "Header 2"),
    ("###", "Header 3"),
    ("####", "Header 4"),
    ("#####", "Header 5"),
    ("######", "Header 6"),
]
text_splitter = MarkdownHeaderTextSplitter(headers_to_split_on=headers_to_split_on)

docs_string = docs[0].page_content
docs_result = text_splitter.split_text(docs_string)

print("Length of splits: " + str(len(docs_result)))
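
Each resulting chunk carries the headers it sits under as metadata, which later surfaces as citations. A quick way to inspect this (the header values in the comment are illustrative):

# Each chunk's metadata records its header path, e.g.
# {'Header 1': 'Annual Report', 'Header 2': 'Financial Highlights'}
for chunk in docs_result[:3]:
    print(chunk.metadata)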
Let's initialize both the Azure OpenAI GPT model and the Azure OpenAI embedding model.
from langchain_openai import AzureOpenAIEmbeddings
from langchain_openai import AzureChatOpenAI
from langchain import hub
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

llm = AzureChatOpenAI(api_key=os.environ["AZURE_OPENAI_API_KEY"],
                      api_version="2023-12-01-preview",
                      azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                      model="gpt-4-1106-preview",
                      streaming=True)

aoai_embeddings = AzureOpenAIEmbeddings(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    azure_deployment="text-embedding-ada-002",
    openai_api_version="2023-12-01-preview",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"]
)
Now let's create an index and store the embeddings in Azure AI Search.
from langchain_community.vectorstores.azuresearch import AzureSearch

# Azure AI Search connection details; the environment variable names here are
# placeholders, so use whatever your azure.env defines.
vector_store_address = os.getenv("AZURE_SEARCH_ENDPOINT")
vector_store_password = os.getenv("AZURE_SEARCH_ADMIN_KEY")

index_name: str = "langchain-vector-demo"
vector_store: AzureSearch = AzureSearch(
    azure_search_endpoint=vector_store_address,
    azure_search_key=vector_store_password,
    index_name=index_name,
    embedding_function=aoai_embeddings.embed_query,
)
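
Constructing the AzureSearch object only creates or connects to the index; the chunks from the splitter still have to be embedded and uploaded:

# Embed the markdown chunks and upload them to the Azure AI Search index.
vector_store.add_documents(documents=docs_result)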
Finally, let's create our RAG chain. Here I have used a simple retriever, but you can build a more complex retrieval step that returns both images and text.
from operator import itemgetter
from langchain_core.runnables import RunnableMap

# A RAG prompt with "context" and "question" variables is needed here; the
# standard "rlm/rag-prompt" from the LangChain hub is one option.
prompt = hub.pull("rlm/rag-prompt")

def format_docs(docs):
    return "\n\n".join(doc.page_content for doc in docs)

retriever_base = vector_store.as_retriever(search_type="similarity", search_kwargs={"k": 5})

rag_chain_from_docs = (
    {
        "context": lambda input: format_docs(input["documents"]),
        "question": itemgetter("question"),
    }
    | prompt
    | llm
    | StrOutputParser()
)
rag_chain_with_source = RunnableMap(
    {"documents": retriever_base, "question": RunnablePassthrough()}
) | {
    "documents": lambda input: [doc.metadata for doc in input["documents"]],
    "answer": rag_chain_from_docs,
}
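
Invoking the chain returns the generated answer together with the metadata of the retrieved chunks, which is what surfaces as citations. The question below is only illustrative:

# Example invocation; replace the question with one about your document.
result = rag_chain_with_source.invoke("What trend does the revenue plot show?")
print(result["answer"])
print(result["documents"])  # retrieved-chunk metadata, usable as citations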
Now let's put this into action: we take a sample PDF and ask a question about one of its plots.
Here I ask a question about a plot in the document. As you can see, I get the correct response, along with citations.
Hope you liked the blog. Please like and follow if you would like to read more blogs like this coming soon.
Icons","styles":[{"style":"NORMAL","weight":"400","__typename":"FontStyleData"}],"assetNames":["MWFFluentIcons-normal-400.woff2"],"__typename":"CustomFont"}],"__typename":"TypographyThemeSettings"},"unstyledListItem":{"marginBottomSm":"5px","marginBottomMd":"10px","marginBottomLg":"15px","marginBottomXl":"20px","marginBottomXxl":"25px","__typename":"UnstyledListItemThemeSettings"},"yiq":{"light":"#ffffff","dark":"#000000","__typename":"YiqThemeSettings"},"colorLightness":{"primaryDark":0.36,"primaryLight":0.74,"primaryLighter":0.89,"primaryLightest":0.95,"infoDark":0.39,"infoLight":0.72,"infoLighter":0.85,"infoLightest":0.93,"successDark":0.24,"successLight":0.62,"successLighter":0.8,"successLightest":0.91,"warningDark":0.39,"warningLight":0.68,"warningLighter":0.84,"warningLightest":0.93,"dangerDark":0.41,"dangerLight":0.72,"dangerLighter":0.89,"dangerLightest":0.95,"__typename":"ColorLightnessThemeSettings"},"localOverride":false,"__typename":"Theme"},"localOverride":false},"CachedAsset:text:en_US-components/common/EmailVerification-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/common/EmailVerification-1737571274000","value":{"email.verification.title":"Email Verification Required","email.verification.message.update.email":"To participate in the community, you must first verify your email address. The verification email was sent to {email}. To change your email, visit My Settings.","email.verification.message.resend.email":"To participate in the community, you must first verify your email address. The verification email was sent to {email}. Resend email."},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/common/Loading/LoadingDot-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/common/Loading/LoadingDot-1737571274000","value":{"title":"Loading..."},"localOverride":false},"CachedAsset:quilt:o365.prod:pages/blogs/BlogMessagePage:board:Azure-AI-Services-blog-1742488895474":{"__typename":"CachedAsset","id":"quilt:o365.prod:pages/blogs/BlogMessagePage:board:Azure-AI-Services-blog-1742488895474","value":{"id":"BlogMessagePage","container":{"id":"Common","headerProps":{"backgroundImageProps":null,"backgroundColor":null,"addComponents":null,"removeComponents":["community.widget.bannerWidget"],"componentOrder":null,"__typename":"QuiltContainerSectionProps"},"headerComponentProps":{"community.widget.breadcrumbWidget":{"disableLastCrumbForDesktop":false}},"footerProps":null,"footerComponentProps":null,"items":[{"id":"blog-article","layout":"ONE_COLUMN","bgColor":null,"showTitle":null,"showDescription":null,"textPosition":null,"textColor":null,"sectionEditLevel":"LOCKED","bgImage":null,"disableSpacing":null,"edgeToEdgeDisplay":null,"fullHeight":null,"showBorder":null,"__typename":"OneColumnQuiltSection","columnMap":{"main":[{"id":"blogs.widget.blogArticleWidget","className":"lia-blog-container","props":null,"__typename":"QuiltComponent"}],"__typename":"OneSectionColumns"}},{"id":"section-1729184836777","layout":"MAIN_SIDE","bgColor":"transparent","showTitle":false,"showDescription":false,"textPosition":"CENTER","textColor":"var(--lia-bs-body-color)","sectionEditLevel":null,"bgImage":null,"disableSpacing":null,"edgeToEdgeDisplay":null,"fullHeight":null,"showBorder":null,"__typename":"MainSideQuiltSection","columnMap":{"main":[],"side":[{"id":"custom.widget.Social_Sharing","className":null,"props":{"widgetVisibility":"signedInOrAnonymous","useTitle":true,"useBackground":true,"title":"Share","lazyLoad":false},"__typename
":"QuiltComponent"}],"__typename":"MainSideSectionColumns"}}],"__typename":"QuiltContainer"},"__typename":"Quilt","localOverride":false},"localOverride":false},"CachedAsset:text:en_US-pages/blogs/BlogMessagePage-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-pages/blogs/BlogMessagePage-1737571274000","value":{"title":"{contextMessageSubject} | {communityTitle}","errorMissing":"This blog post cannot be found","name":"Blog Message Page","section.blog-article.title":"Blog Post","archivedMessageTitle":"This Content Has Been Archived","section.section-1729184836777.title":"","section.section-1729184836777.description":"","section.CncIde.title":"Blog Post","section.tifEmD.description":"","section.tifEmD.title":""},"localOverride":false},"CachedAsset:quiltWrapper:o365.prod:Common:1742488699019":{"__typename":"CachedAsset","id":"quiltWrapper:o365.prod:Common:1742488699019","value":{"id":"Common","header":{"backgroundImageProps":{"assetName":null,"backgroundSize":"COVER","backgroundRepeat":"NO_REPEAT","backgroundPosition":"CENTER_CENTER","lastModified":null,"__typename":"BackgroundImageProps"},"backgroundColor":"transparent","items":[{"id":"community.widget.navbarWidget","props":{"showUserName":true,"showRegisterLink":true,"useIconLanguagePicker":true,"useLabelLanguagePicker":true,"className":"QuiltComponent_lia-component-edit-mode__0nCcm","links":{"sideLinks":[],"mainLinks":[{"children":[],"linkType":"INTERNAL","id":"gxcuf89792","params":{},"routeName":"CommunityPage"},{"children":[],"linkType":"EXTERNAL","id":"external-link","url":"/Directory","target":"SELF"},{"children":[{"linkType":"INTERNAL","id":"microsoft365","params":{"categoryId":"microsoft365"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"microsoft-teams","params":{"categoryId":"MicrosoftTeams"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"windows","params":{"categoryId":"Windows"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"microsoft-securityand-compliance","params":{"categoryId":"microsoft-security"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"outlook","params":{"categoryId":"Outlook"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"planner","params":{"categoryId":"Planner"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"windows-server","params":{"categoryId":"Windows-Server"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"azure","params":{"categoryId":"Azure"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"exchange","params":{"categoryId":"Exchange"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"microsoft-endpoint-manager","params":{"categoryId":"microsoft-endpoint-manager"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"s-q-l-server","params":{"categoryId":"SQL-Server"},"routeName":"CategoryPage"},{"linkType":"EXTERNAL","id":"external-link-2","url":"/Directory","target":"SELF"}],"linkType":"EXTERNAL","id":"communities","url":"/","target":"BLANK"},{"children":[{"linkType":"INTERNAL","id":"education-sector","params":{"categoryId":"EducationSector"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"a-i","params":{"categoryId":"AI"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"i-t-ops-talk","params":{"categoryId":"ITOpsTalk"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"partner-community","params":{"categoryId":"PartnerCommunity"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"microsoft-mechanics","params":{"categoryId":"MicrosoftMechanics"},"routeName":"CategoryPage"},{"linkTy
pe":"INTERNAL","id":"healthcare-and-life-sciences","params":{"categoryId":"HealthcareAndLifeSciences"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"public-sector","params":{"categoryId":"PublicSector"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"io-t","params":{"categoryId":"IoT"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"driving-adoption","params":{"categoryId":"DrivingAdoption"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"s-m-b","params":{"categoryId":"SMB"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"startupsat-microsoft","params":{"categoryId":"StartupsatMicrosoft"},"routeName":"CategoryPage"},{"linkType":"EXTERNAL","id":"external-link-1","url":"/Directory","target":"SELF"}],"linkType":"EXTERNAL","id":"communities-1","url":"/","target":"SELF"},{"children":[],"linkType":"EXTERNAL","id":"external","url":"/Blogs","target":"SELF"},{"children":[],"linkType":"EXTERNAL","id":"external-1","url":"/Events","target":"SELF"},{"children":[{"linkType":"INTERNAL","id":"microsoft-learn-1","params":{"categoryId":"MicrosoftLearn"},"routeName":"CategoryPage"},{"linkType":"INTERNAL","id":"microsoft-learn-blog","params":{"boardId":"MicrosoftLearnBlog","categoryId":"MicrosoftLearn"},"routeName":"BlogBoardPage"},{"linkType":"EXTERNAL","id":"external-10","url":"https://learningroomdirectory.microsoft.com/","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-3","url":"https://docs.microsoft.com/learn/dynamics365/?WT.mc_id=techcom_header-webpage-m365","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-4","url":"https://docs.microsoft.com/learn/m365/?wt.mc_id=techcom_header-webpage-m365","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-5","url":"https://docs.microsoft.com/learn/topics/sci/?wt.mc_id=techcom_header-webpage-m365","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-6","url":"https://docs.microsoft.com/learn/powerplatform/?wt.mc_id=techcom_header-webpage-powerplatform","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-7","url":"https://docs.microsoft.com/learn/github/?wt.mc_id=techcom_header-webpage-github","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-8","url":"https://docs.microsoft.com/learn/teams/?wt.mc_id=techcom_header-webpage-teams","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-9","url":"https://docs.microsoft.com/learn/dotnet/?wt.mc_id=techcom_header-webpage-dotnet","target":"BLANK"},{"linkType":"EXTERNAL","id":"external-2","url":"https://docs.microsoft.com/learn/azure/?WT.mc_id=techcom_header-webpage-m365","target":"BLANK"}],"linkType":"INTERNAL","id":"microsoft-learn","params":{"categoryId":"MicrosoftLearn"},"routeName":"CategoryPage"},{"children":[],"linkType":"INTERNAL","id":"community-info-center","params":{"categoryId":"Community-Info-Center"},"routeName":"CategoryPage"}]},"style":{"boxShadow":"var(--lia-bs-box-shadow-sm)","controllerHighlightColor":"hsla(30, 100%, 50%)","linkFontWeight":"400","dropdownDividerMarginBottom":"10px","hamburgerBorderHover":"none","linkBoxShadowHover":"none","linkFontSize":"14px","backgroundOpacity":0.8,"controllerBorderRadius":"var(--lia-border-radius-50)","hamburgerBgColor":"transparent","hamburgerColor":"var(--lia-nav-controller-icon-color)","linkTextBorderBottom":"none","brandLogoHeight":"30px","linkBgHoverColor":"transparent","linkLetterSpacing":"normal","collapseMenuDividerOpacity":0.16,"dropdownPaddingBottom":"15px","paddingBottom":"15px","dropdownMenuOffset":"2px","hamburgerBgHoverColor":"transparent","borderBottom":"1px solid 
var(--lia-bs-border-color)","hamburgerBorder":"none","dropdownPaddingX":"10px","brandMarginRightSm":"10px","linkBoxShadow":"none","collapseMenuDividerBg":"var(--lia-nav-link-color)","linkColor":"var(--lia-bs-body-color)","linkJustifyContent":"flex-start","dropdownPaddingTop":"10px","controllerHighlightTextColor":"var(--lia-yiq-dark)","controllerTextColor":"var(--lia-nav-controller-icon-color)","background":{"imageAssetName":"","color":"var(--lia-bs-white)","size":"COVER","repeat":"NO_REPEAT","position":"CENTER_CENTER","imageLastModified":""},"linkBorderRadius":"var(--lia-bs-border-radius-sm)","linkHoverColor":"var(--lia-bs-body-color)","position":"FIXED","linkBorder":"none","linkTextBorderBottomHover":"2px solid var(--lia-bs-body-color)","brandMarginRight":"30px","hamburgerHoverColor":"var(--lia-nav-controller-icon-color)","linkBorderHover":"none","collapseMenuMarginLeft":"20px","linkFontStyle":"NORMAL","controllerTextHoverColor":"var(--lia-nav-controller-icon-hover-color)","linkPaddingX":"10px","linkPaddingY":"5px","paddingTop":"15px","linkTextTransform":"NONE","dropdownBorderColor":"hsla(var(--lia-bs-black-h), var(--lia-bs-black-s), var(--lia-bs-black-l), 0.08)","controllerBgHoverColor":"hsla(var(--lia-bs-black-h), var(--lia-bs-black-s), var(--lia-bs-black-l), 0.1)","linkBgColor":"transparent","linkDropdownPaddingX":"var(--lia-nav-link-px)","linkDropdownPaddingY":"9px","controllerIconColor":"var(--lia-bs-body-color)","dropdownDividerMarginTop":"10px","linkGap":"10px","controllerIconHoverColor":"var(--lia-bs-body-color)"},"showSearchIcon":false,"languagePickerStyle":"iconAndLabel"},"__typename":"QuiltComponent"},{"id":"community.widget.breadcrumbWidget","props":{"backgroundColor":"transparent","linkHighlightColor":"var(--lia-bs-primary)","visualEffects":{"showBottomBorder":true},"linkTextColor":"var(--lia-bs-gray-700)"},"__typename":"QuiltComponent"},{"id":"custom.widget.community_banner","props":{"widgetVisibility":"signedInOrAnonymous","useTitle":true,"usePageWidth":false,"useBackground":false,"title":"","lazyLoad":false},"__typename":"QuiltComponent"},{"id":"custom.widget.HeroBanner","props":{"widgetVisibility":"signedInOrAnonymous","usePageWidth":false,"useTitle":true,"cMax_items":3,"useBackground":false,"title":"","lazyLoad":false,"widgetChooser":"custom.widget.HeroBanner"},"__typename":"QuiltComponent"}],"__typename":"QuiltWrapperSection"},"footer":{"backgroundImageProps":{"assetName":null,"backgroundSize":"COVER","backgroundRepeat":"NO_REPEAT","backgroundPosition":"CENTER_CENTER","lastModified":null,"__typename":"BackgroundImageProps"},"backgroundColor":"transparent","items":[{"id":"custom.widget.MicrosoftFooter","props":{"widgetVisibility":"signedInOrAnonymous","useTitle":true,"useBackground":false,"title":"","lazyLoad":false},"__typename":"QuiltComponent"}],"__typename":"QuiltWrapperSection"},"__typename":"QuiltWrapper","localOverride":false},"localOverride":false},"CachedAsset:text:en_US-components/common/ActionFeedback-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/common/ActionFeedback-1737571274000","value":{"joinedGroupHub.title":"Welcome","joinedGroupHub.message":"You are now a member of this group and are subscribed to updates.","groupHubInviteNotFound.title":"Invitation Not Found","groupHubInviteNotFound.message":"Sorry, we could not find your invitation to the group. The owner may have canceled the invite.","groupHubNotFound.title":"Group Not Found","groupHubNotFound.message":"The grouphub you tried to join does not exist. 
It may have been deleted.","existingGroupHubMember.title":"Already Joined","existingGroupHubMember.message":"You are already a member of this group.","accountLocked.title":"Account Locked","accountLocked.message":"Your account has been locked due to multiple failed attempts. Try again in {lockoutTime} minutes.","editedGroupHub.title":"Changes Saved","editedGroupHub.message":"Your group has been updated.","leftGroupHub.title":"Goodbye","leftGroupHub.message":"You are no longer a member of this group and will not receive future updates.","deletedGroupHub.title":"Deleted","deletedGroupHub.message":"The group has been deleted.","groupHubCreated.title":"Group Created","groupHubCreated.message":"{groupHubName} is ready to use","accountClosed.title":"Account Closed","accountClosed.message":"The account has been closed and you will now be redirected to the homepage","resetTokenExpired.title":"Reset Password Link has Expired","resetTokenExpired.message":"Try resetting your password again","invalidUrl.title":"Invalid URL","invalidUrl.message":"The URL you're using is not recognized. Verify your URL and try again.","accountClosedForUser.title":"Account Closed","accountClosedForUser.message":"{userName}'s account is closed","inviteTokenInvalid.title":"Invitation Invalid","inviteTokenInvalid.message":"Your invitation to the community has been canceled or expired.","inviteTokenError.title":"Invitation Verification Failed","inviteTokenError.message":"The url you are utilizing is not recognized. Verify your URL and try again","pageNotFound.title":"Access Denied","pageNotFound.message":"You do not have access to this area of the community or it doesn't exist","eventAttending.title":"Responded as Attending","eventAttending.message":"You'll be notified when there's new activity and reminded as the event approaches","eventInterested.title":"Responded as Interested","eventInterested.message":"You'll be notified when there's new activity and reminded as the event approaches","eventNotFound.title":"Event Not Found","eventNotFound.message":"The event you tried to respond to does not exist.","redirectToRelatedPage.title":"Showing Related Content","redirectToRelatedPageForBaseUsers.title":"Showing Related Content","redirectToRelatedPageForBaseUsers.message":"The content you are trying to access is archived","redirectToRelatedPage.message":"The content you are trying to access is archived","relatedUrl.archivalLink.flyoutMessage":"The content you are trying to access is archived View Archived Content"},"localOverride":false},"CachedAsset:component:custom.widget.community_banner-en-1742488931609":{"__typename":"CachedAsset","id":"component:custom.widget.community_banner-en-1742488931609","value":{"component":{"id":"custom.widget.community_banner","template":{"id":"community_banner","markupLanguage":"HANDLEBARS","style":".community-banner {\n a.top-bar.btn {\n top: 0px;\n width: 100%;\n z-index: 999;\n text-align: center;\n left: 0px;\n background: #0068b8;\n color: white;\n padding: 10px 0px;\n display:block;\n box-shadow:none !important;\n border: none !important;\n border-radius: none !important;\n margin: 0px !important;\n font-size:14px;\n }\n}","texts":null,"defaults":{"config":{"applicablePages":[],"description":"community announcement 
text","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"components":[{"id":"custom.widget.community_banner","form":null,"config":null,"props":[],"__typename":"Component"}],"grouping":"CUSTOM","__typename":"ComponentTemplate"},"properties":{"config":{"applicablePages":[],"description":"community announcement text","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"form":null,"__typename":"Component","localOverride":false},"globalCss":{"css":".custom_widget_community_banner_community-banner_1a5zb_1 {\n a.custom_widget_community_banner_top-bar_1a5zb_2.custom_widget_community_banner_btn_1a5zb_2 {\n top: 0;\n width: 100%;\n z-index: 999;\n text-align: center;\n left: 0;\n background: #0068b8;\n color: white;\n padding: 0.625rem 0;\n display:block;\n box-shadow:none !important;\n border: none !important;\n border-radius: none !important;\n margin: 0 !important;\n font-size:0.875rem;\n }\n}","tokens":{"community-banner":"custom_widget_community_banner_community-banner_1a5zb_1","top-bar":"custom_widget_community_banner_top-bar_1a5zb_2","btn":"custom_widget_community_banner_btn_1a5zb_2"}},"form":null},"localOverride":false},"CachedAsset:component:custom.widget.HeroBanner-en-1742488931609":{"__typename":"CachedAsset","id":"component:custom.widget.HeroBanner-en-1742488931609","value":{"component":{"id":"custom.widget.HeroBanner","template":{"id":"HeroBanner","markupLanguage":"REACT","style":null,"texts":{"searchPlaceholderText":"Search this community","followActionText":"Follow","unfollowActionText":"Following","searchOnHoverText":"Please enter your search term(s) and then press return key to complete a search."},"defaults":{"config":{"applicablePages":[],"description":null,"fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[{"id":"max_items","dataType":"NUMBER","list":false,"defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the 
carousel","possibleValues":null,"control":"INPUT","__typename":"PropDefinition"}],"__typename":"ComponentProperties"},"components":[{"id":"custom.widget.HeroBanner","form":{"fields":[{"id":"widgetChooser","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"title","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useTitle","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useBackground","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"widgetVisibility","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"moreOptions","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"cMax_items","validation":null,"noValidation":null,"dataType":"NUMBER","list":false,"control":"INPUT","defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the carousel","possibleValues":null,"__typename":"FormField"}],"layout":{"rows":[{"id":"widgetChooserGroup","type":"fieldset","as":null,"items":[{"id":"widgetChooser","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"titleGroup","type":"fieldset","as":null,"items":[{"id":"title","className":null,"__typename":"FormFieldRef"},{"id":"useTitle","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"useBackground","type":"fieldset","as":null,"items":[{"id":"useBackground","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"widgetVisibility","type":"fieldset","as":null,"items":[{"id":"widgetVisibility","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"moreOptionsGroup","type":"fieldset","as":null,"items":[{"id":"moreOptions","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"componentPropsGroup","type":"fieldset","as":null,"items":[{"id":"cMax_items","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"}],"actionButtons":null,"className":"custom_widget_HeroBanner_form","formGroupFieldSeparator":"divider","__typename":"FormLayout"},"__typename":"Form"},"config":null,"props":[],"__typename":"Component"}],"grouping":"CUSTOM","__typename":"Comp
onentTemplate"},"properties":{"config":{"applicablePages":[],"description":null,"fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[{"id":"max_items","dataType":"NUMBER","list":false,"defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the carousel","possibleValues":null,"control":"INPUT","__typename":"PropDefinition"}],"__typename":"ComponentProperties"},"form":{"fields":[{"id":"widgetChooser","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"title","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useTitle","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useBackground","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"widgetVisibility","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"moreOptions","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"cMax_items","validation":null,"noValidation":null,"dataType":"NUMBER","list":false,"control":"INPUT","defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the 
carousel","possibleValues":null,"__typename":"FormField"}],"layout":{"rows":[{"id":"widgetChooserGroup","type":"fieldset","as":null,"items":[{"id":"widgetChooser","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"titleGroup","type":"fieldset","as":null,"items":[{"id":"title","className":null,"__typename":"FormFieldRef"},{"id":"useTitle","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"useBackground","type":"fieldset","as":null,"items":[{"id":"useBackground","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"widgetVisibility","type":"fieldset","as":null,"items":[{"id":"widgetVisibility","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"moreOptionsGroup","type":"fieldset","as":null,"items":[{"id":"moreOptions","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"componentPropsGroup","type":"fieldset","as":null,"items":[{"id":"cMax_items","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"}],"actionButtons":null,"className":"custom_widget_HeroBanner_form","formGroupFieldSeparator":"divider","__typename":"FormLayout"},"__typename":"Form"},"__typename":"Component","localOverride":false},"globalCss":null,"form":{"fields":[{"id":"widgetChooser","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"title","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useTitle","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"useBackground","validation":null,"noValidation":null,"dataType":"BOOLEAN","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"widgetVisibility","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"moreOptions","validation":null,"noValidation":null,"dataType":"STRING","list":null,"control":null,"defaultValue":null,"label":null,"description":null,"possibleValues":null,"__typename":"FormField"},{"id":"cMax_items","validation":null,"noValidation":null,"dataType":"NUMBER","list":false,"control":"INPUT","defaultValue":"3","label":"Max Items","description":"The maximum number of items to display in the 
carousel","possibleValues":null,"__typename":"FormField"}],"layout":{"rows":[{"id":"widgetChooserGroup","type":"fieldset","as":null,"items":[{"id":"widgetChooser","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"titleGroup","type":"fieldset","as":null,"items":[{"id":"title","className":null,"__typename":"FormFieldRef"},{"id":"useTitle","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"useBackground","type":"fieldset","as":null,"items":[{"id":"useBackground","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"widgetVisibility","type":"fieldset","as":null,"items":[{"id":"widgetVisibility","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"moreOptionsGroup","type":"fieldset","as":null,"items":[{"id":"moreOptions","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"},{"id":"componentPropsGroup","type":"fieldset","as":null,"items":[{"id":"cMax_items","className":null,"__typename":"FormFieldRef"}],"props":null,"legend":null,"description":null,"className":null,"viewVariant":null,"toggleState":null,"__typename":"FormFieldset"}],"actionButtons":null,"className":"custom_widget_HeroBanner_form","formGroupFieldSeparator":"divider","__typename":"FormLayout"},"__typename":"Form"}},"localOverride":false},"CachedAsset:component:custom.widget.Social_Sharing-en-1742488931609":{"__typename":"CachedAsset","id":"component:custom.widget.Social_Sharing-en-1742488931609","value":{"component":{"id":"custom.widget.Social_Sharing","template":{"id":"Social_Sharing","markupLanguage":"HANDLEBARS","style":".social-share {\n .sharing-options {\n position: relative;\n margin: 0;\n padding: 0;\n line-height: 10px;\n display: flex;\n justify-content: left;\n gap: 5px;\n list-style-type: none;\n li {\n text-align: left;\n a {\n min-width: 30px;\n min-height: 30px;\n display: block;\n padding: 1px;\n .social-share-linkedin {\n img {\n background-color: rgb(0, 119, 181);\n }\n }\n .social-share-facebook {\n img {\n background-color: rgb(59, 89, 152);\n }\n }\n .social-share-x {\n img {\n background-color: rgb(0, 0, 0);\n }\n }\n .social-share-rss {\n img {\n background-color: rgb(0, 0, 0);\n }\n }\n .social-share-reddit {\n img {\n background-color: rgb(255, 69, 0);\n }\n }\n .social-share-email {\n img {\n background-color: rgb(132, 132, 132);\n }\n }\n }\n a {\n img {\n height: 2rem;\n }\n }\n }\n }\n}\n","texts":null,"defaults":{"config":{"applicablePages":[],"description":"Adds buttons to share to various social media websites","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"components":[{"id":"custom.widget.Social_Sharing","form":null,"config":null,"props":[],"__typename":"Component"}],"grouping":"CUSTOM","__typename":"ComponentTemplate"},"properties":{"config":{"applicablePages":[],"description":"Adds buttons to share to various social media 
websites","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"form":null,"__typename":"Component","localOverride":false},"globalCss":{"css":".custom_widget_Social_Sharing_social-share_c7xxz_1 {\n .custom_widget_Social_Sharing_sharing-options_c7xxz_2 {\n position: relative;\n margin: 0;\n padding: 0;\n line-height: 0.625rem;\n display: flex;\n justify-content: left;\n gap: 0.3125rem;\n list-style-type: none;\n li {\n text-align: left;\n a {\n min-width: 1.875rem;\n min-height: 1.875rem;\n display: block;\n padding: 0.0625rem;\n .custom_widget_Social_Sharing_social-share-linkedin_c7xxz_18 {\n img {\n background-color: rgb(0, 119, 181);\n }\n }\n .custom_widget_Social_Sharing_social-share-facebook_c7xxz_23 {\n img {\n background-color: rgb(59, 89, 152);\n }\n }\n .custom_widget_Social_Sharing_social-share-x_c7xxz_28 {\n img {\n background-color: rgb(0, 0, 0);\n }\n }\n .custom_widget_Social_Sharing_social-share-rss_c7xxz_33 {\n img {\n background-color: rgb(0, 0, 0);\n }\n }\n .custom_widget_Social_Sharing_social-share-reddit_c7xxz_38 {\n img {\n background-color: rgb(255, 69, 0);\n }\n }\n .custom_widget_Social_Sharing_social-share-email_c7xxz_43 {\n img {\n background-color: rgb(132, 132, 132);\n }\n }\n }\n a {\n img {\n height: 2rem;\n }\n }\n }\n }\n}\n","tokens":{"social-share":"custom_widget_Social_Sharing_social-share_c7xxz_1","sharing-options":"custom_widget_Social_Sharing_sharing-options_c7xxz_2","social-share-linkedin":"custom_widget_Social_Sharing_social-share-linkedin_c7xxz_18","social-share-facebook":"custom_widget_Social_Sharing_social-share-facebook_c7xxz_23","social-share-x":"custom_widget_Social_Sharing_social-share-x_c7xxz_28","social-share-rss":"custom_widget_Social_Sharing_social-share-rss_c7xxz_33","social-share-reddit":"custom_widget_Social_Sharing_social-share-reddit_c7xxz_38","social-share-email":"custom_widget_Social_Sharing_social-share-email_c7xxz_43"}},"form":null},"localOverride":false},"CachedAsset:component:custom.widget.MicrosoftFooter-en-1742488931609":{"__typename":"CachedAsset","id":"component:custom.widget.MicrosoftFooter-en-1742488931609","value":{"component":{"id":"custom.widget.MicrosoftFooter","template":{"id":"MicrosoftFooter","markupLanguage":"HANDLEBARS","style":".context-uhf {\n min-width: 280px;\n font-size: 15px;\n box-sizing: border-box;\n -ms-text-size-adjust: 100%;\n -webkit-text-size-adjust: 100%;\n & *,\n & *:before,\n & *:after {\n box-sizing: inherit;\n }\n a.c-uhff-link {\n color: #616161;\n word-break: break-word;\n text-decoration: none;\n }\n &a:link,\n &a:focus,\n &a:hover,\n &a:active,\n &a:visited {\n text-decoration: none;\n color: inherit;\n }\n & div {\n font-family: 'Segoe UI', SegoeUI, 'Helvetica Neue', Helvetica, Arial, sans-serif;\n }\n}\n.c-uhff {\n background: #f2f2f2;\n margin: -1.5625;\n width: auto;\n height: auto;\n}\n.c-uhff-nav {\n margin: 0 auto;\n max-width: calc(1600px + 10%);\n padding: 0 5%;\n box-sizing: inherit;\n &:before,\n &:after {\n content: ' ';\n display: table;\n clear: left;\n }\n @media only screen and (max-width: 1083px) {\n padding-left: 12px;\n }\n .c-heading-4 {\n color: #616161;\n word-break: break-word;\n font-size: 15px;\n line-height: 20px;\n padding: 36px 0 4px;\n font-weight: 600;\n }\n .c-uhff-nav-row {\n .c-uhff-nav-group {\n display: block;\n float: left;\n min-height: 1px;\n vertical-align: text-top;\n padding: 0 12px;\n width: 100%;\n zoom: 1;\n &:first-child {\n padding-left: 0;\n @media only screen and (max-width: 1083px) {\n 
padding-left: 12px;\n }\n }\n @media only screen and (min-width: 540px) and (max-width: 1082px) {\n width: 33.33333%;\n }\n @media only screen and (min-width: 1083px) {\n width: 16.6666666667%;\n }\n ul.c-list.f-bare {\n font-size: 11px;\n line-height: 16px;\n margin-top: 0;\n margin-bottom: 0;\n padding-left: 0;\n list-style-type: none;\n li {\n word-break: break-word;\n padding: 8px 0;\n margin: 0;\n }\n }\n }\n }\n}\n.c-uhff-base {\n background: #f2f2f2;\n margin: 0 auto;\n max-width: calc(1600px + 10%);\n padding: 30px 5% 16px;\n &:before,\n &:after {\n content: ' ';\n display: table;\n }\n &:after {\n clear: both;\n }\n a.c-uhff-ccpa {\n font-size: 11px;\n line-height: 16px;\n float: left;\n margin: 3px 0;\n }\n a.c-uhff-ccpa:hover {\n text-decoration: underline;\n }\n ul.c-list {\n font-size: 11px;\n line-height: 16px;\n float: right;\n margin: 3px 0;\n color: #616161;\n li {\n padding: 0 24px 4px 0;\n display: inline-block;\n }\n }\n .c-list.f-bare {\n padding-left: 0;\n list-style-type: none;\n }\n @media only screen and (max-width: 1083px) {\n display: flex;\n flex-wrap: wrap;\n padding: 30px 24px 16px;\n }\n}\n","texts":{"New tab":"What's New","New 1":"Surface Laptop Studio 2","New 2":"Surface Laptop Go 3","New 3":"Surface Pro 9","New 4":"Surface Laptop 5","New 5":"Surface Studio 2+","New 6":"Copilot in Windows","New 7":"Microsoft 365","New 8":"Windows 11 apps","Store tab":"Microsoft Store","Store 1":"Account Profile","Store 2":"Download Center","Store 3":"Microsoft Store Support","Store 4":"Returns","Store 5":"Order tracking","Store 6":"Certified Refurbished","Store 7":"Microsoft Store Promise","Store 8":"Flexible Payments","Education tab":"Education","Edu 1":"Microsoft in education","Edu 2":"Devices for education","Edu 3":"Microsoft Teams for Education","Edu 4":"Microsoft 365 Education","Edu 5":"How to buy for your school","Edu 6":"Educator Training and development","Edu 7":"Deals for students and parents","Edu 8":"Azure for students","Business tab":"Business","Bus 1":"Microsoft Cloud","Bus 2":"Microsoft Security","Bus 3":"Dynamics 365","Bus 4":"Microsoft 365","Bus 5":"Microsoft Power Platform","Bus 6":"Microsoft Teams","Bus 7":"Microsoft Industry","Bus 8":"Small Business","Developer tab":"Developer & IT","Dev 1":"Azure","Dev 2":"Developer Center","Dev 3":"Documentation","Dev 4":"Microsoft Learn","Dev 5":"Microsoft Tech Community","Dev 6":"Azure Marketplace","Dev 7":"AppSource","Dev 8":"Visual Studio","Company tab":"Company","Com 1":"Careers","Com 2":"About Microsoft","Com 3":"Company News","Com 4":"Privacy at Microsoft","Com 5":"Investors","Com 6":"Diversity and inclusion","Com 7":"Accessiblity","Com 8":"Sustainibility"},"defaults":{"config":{"applicablePages":[],"description":"The Microsoft Footer","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"components":[{"id":"custom.widget.MicrosoftFooter","form":null,"config":null,"props":[],"__typename":"Component"}],"grouping":"CUSTOM","__typename":"ComponentTemplate"},"properties":{"config":{"applicablePages":[],"description":"The Microsoft Footer","fetchedContent":null,"__typename":"ComponentConfiguration"},"props":[],"__typename":"ComponentProperties"},"form":null,"__typename":"Component","localOverride":false},"globalCss":{"css":".custom_widget_MicrosoftFooter_context-uhf_f95yq_1 {\n min-width: 17.5rem;\n font-size: 0.9375rem;\n box-sizing: border-box;\n -ms-text-size-adjust: 100%;\n -webkit-text-size-adjust: 100%;\n & *,\n & *:before,\n & *:after {\n box-sizing: 
inherit;\n }\n a.custom_widget_MicrosoftFooter_c-uhff-link_f95yq_12 {\n color: #616161;\n word-break: break-word;\n text-decoration: none;\n }\n &a:link,\n &a:focus,\n &a:hover,\n &a:active,\n &a:visited {\n text-decoration: none;\n color: inherit;\n }\n & div {\n font-family: 'Segoe UI', SegoeUI, 'Helvetica Neue', Helvetica, Arial, sans-serif;\n }\n}\n.custom_widget_MicrosoftFooter_c-uhff_f95yq_12 {\n background: #f2f2f2;\n margin: -1.5625;\n width: auto;\n height: auto;\n}\n.custom_widget_MicrosoftFooter_c-uhff-nav_f95yq_35 {\n margin: 0 auto;\n max-width: calc(100rem + 10%);\n padding: 0 5%;\n box-sizing: inherit;\n &:before,\n &:after {\n content: ' ';\n display: table;\n clear: left;\n }\n @media only screen and (max-width: 1083px) {\n padding-left: 0.75rem;\n }\n .custom_widget_MicrosoftFooter_c-heading-4_f95yq_49 {\n color: #616161;\n word-break: break-word;\n font-size: 0.9375rem;\n line-height: 1.25rem;\n padding: 2.25rem 0 0.25rem;\n font-weight: 600;\n }\n .custom_widget_MicrosoftFooter_c-uhff-nav-row_f95yq_57 {\n .custom_widget_MicrosoftFooter_c-uhff-nav-group_f95yq_58 {\n display: block;\n float: left;\n min-height: 0.0625rem;\n vertical-align: text-top;\n padding: 0 0.75rem;\n width: 100%;\n zoom: 1;\n &:first-child {\n padding-left: 0;\n @media only screen and (max-width: 1083px) {\n padding-left: 0.75rem;\n }\n }\n @media only screen and (min-width: 540px) and (max-width: 1082px) {\n width: 33.33333%;\n }\n @media only screen and (min-width: 1083px) {\n width: 16.6666666667%;\n }\n ul.custom_widget_MicrosoftFooter_c-list_f95yq_78.custom_widget_MicrosoftFooter_f-bare_f95yq_78 {\n font-size: 0.6875rem;\n line-height: 1rem;\n margin-top: 0;\n margin-bottom: 0;\n padding-left: 0;\n list-style-type: none;\n li {\n word-break: break-word;\n padding: 0.5rem 0;\n margin: 0;\n }\n }\n }\n }\n}\n.custom_widget_MicrosoftFooter_c-uhff-base_f95yq_94 {\n background: #f2f2f2;\n margin: 0 auto;\n max-width: calc(100rem + 10%);\n padding: 1.875rem 5% 1rem;\n &:before,\n &:after {\n content: ' ';\n display: table;\n }\n &:after {\n clear: both;\n }\n a.custom_widget_MicrosoftFooter_c-uhff-ccpa_f95yq_107 {\n font-size: 0.6875rem;\n line-height: 1rem;\n float: left;\n margin: 0.1875rem 0;\n }\n a.custom_widget_MicrosoftFooter_c-uhff-ccpa_f95yq_107:hover {\n text-decoration: underline;\n }\n ul.custom_widget_MicrosoftFooter_c-list_f95yq_78 {\n font-size: 0.6875rem;\n line-height: 1rem;\n float: right;\n margin: 0.1875rem 0;\n color: #616161;\n li {\n padding: 0 1.5rem 0.25rem 0;\n display: inline-block;\n }\n }\n .custom_widget_MicrosoftFooter_c-list_f95yq_78.custom_widget_MicrosoftFooter_f-bare_f95yq_78 {\n padding-left: 0;\n list-style-type: none;\n }\n @media only screen and (max-width: 1083px) {\n display: flex;\n flex-wrap: wrap;\n padding: 1.875rem 1.5rem 1rem;\n 
}\n}\n","tokens":{"context-uhf":"custom_widget_MicrosoftFooter_context-uhf_f95yq_1","c-uhff-link":"custom_widget_MicrosoftFooter_c-uhff-link_f95yq_12","c-uhff":"custom_widget_MicrosoftFooter_c-uhff_f95yq_12","c-uhff-nav":"custom_widget_MicrosoftFooter_c-uhff-nav_f95yq_35","c-heading-4":"custom_widget_MicrosoftFooter_c-heading-4_f95yq_49","c-uhff-nav-row":"custom_widget_MicrosoftFooter_c-uhff-nav-row_f95yq_57","c-uhff-nav-group":"custom_widget_MicrosoftFooter_c-uhff-nav-group_f95yq_58","c-list":"custom_widget_MicrosoftFooter_c-list_f95yq_78","f-bare":"custom_widget_MicrosoftFooter_f-bare_f95yq_78","c-uhff-base":"custom_widget_MicrosoftFooter_c-uhff-base_f95yq_94","c-uhff-ccpa":"custom_widget_MicrosoftFooter_c-uhff-ccpa_f95yq_107"}},"form":null},"localOverride":false},"CachedAsset:text:en_US-components/community/Breadcrumb-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/community/Breadcrumb-1737571274000","value":{"navLabel":"Breadcrumbs","dropdown":"Additional parent page navigation"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageBanner-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageBanner-1737571274000","value":{"messageMarkedAsSpam":"This post has been marked as spam","messageMarkedAsSpam@board:TKB":"This article has been marked as spam","messageMarkedAsSpam@board:BLOG":"This post has been marked as spam","messageMarkedAsSpam@board:FORUM":"This discussion has been marked as spam","messageMarkedAsSpam@board:OCCASION":"This event has been marked as spam","messageMarkedAsSpam@board:IDEA":"This idea has been marked as spam","manageSpam":"Manage Spam","messageMarkedAsAbuse":"This post has been marked as abuse","messageMarkedAsAbuse@board:TKB":"This article has been marked as abuse","messageMarkedAsAbuse@board:BLOG":"This post has been marked as abuse","messageMarkedAsAbuse@board:FORUM":"This discussion has been marked as abuse","messageMarkedAsAbuse@board:OCCASION":"This event has been marked as abuse","messageMarkedAsAbuse@board:IDEA":"This idea has been marked as abuse","preModCommentAuthorText":"This comment will be published as soon as it is approved","preModCommentModeratorText":"This comment is awaiting moderation","messageMarkedAsOther":"This post has been rejected due to other reasons","messageMarkedAsOther@board:TKB":"This article has been rejected due to other reasons","messageMarkedAsOther@board:BLOG":"This post has been rejected due to other reasons","messageMarkedAsOther@board:FORUM":"This discussion has been rejected due to other reasons","messageMarkedAsOther@board:OCCASION":"This event has been rejected due to other reasons","messageMarkedAsOther@board:IDEA":"This idea has been rejected due to other reasons","messageArchived":"This post was archived on {date}","relatedUrl":"View Related Content","relatedContentText":"Showing related content","archivedContentLink":"View Archived 
Content"},"localOverride":false},"Category:category:Exchange":{"__typename":"Category","id":"category:Exchange","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Planner":{"__typename":"Category","id":"category:Planner","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Outlook":{"__typename":"Category","id":"category:Outlook","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Community-Info-Center":{"__typename":"Category","id":"category:Community-Info-Center","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:EducationSector":{"__typename":"Category","id":"category:EducationSector","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:DrivingAdoption":{"__typename":"Category","id":"category:DrivingAdoption","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Azure":{"__typename":"Category","id":"category:Azure","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Windows-Server":{"__typename":"Category","id":"category:Windows-Server","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:SQL-Server":{"__typename":"Category","id":"category:SQL-Server","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:MicrosoftTeams":{"__typename":"Category","id":"category:MicrosoftTeams","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:PublicSector":{"__typename":"Category","id":"category:PublicSector","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:microsoft365":{"__typename":"Category","id":"category:microsoft365","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:IoT":{"__typename":"Category","id":"category:IoT","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:HealthcareAndLifeSciences":{"__typename":"Category","id":"category:HealthcareAndLifeSciences","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:SMB":{"__typename":"Category","id":"category:SMB","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:ITOpsTalk":{"__typename":"Category","id":"category:ITOpsTalk","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:microsoft-endpoint-manager":{"__typename":"Category","id":"category:microsoft-endpoint-manager","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:catego
ry:MicrosoftLearn":{"__typename":"Category","id":"category:MicrosoftLearn","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Blog:board:MicrosoftLearnBlog":{"__typename":"Blog","id":"board:MicrosoftLearnBlog","blogPolicies":{"__typename":"BlogPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}},"boardPolicies":{"__typename":"BoardPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:MicrosoftMechanics":{"__typename":"Category","id":"category:MicrosoftMechanics","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:StartupsatMicrosoft":{"__typename":"Category","id":"category:StartupsatMicrosoft","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:PartnerCommunity":{"__typename":"Category","id":"category:PartnerCommunity","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:Windows":{"__typename":"Category","id":"category:Windows","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"Category:category:microsoft-security":{"__typename":"Category","id":"category:microsoft-security","categoryPolicies":{"__typename":"CategoryPolicies","canReadNode":{"__typename":"PolicyResult","failureReason":null}}},"QueryVariables:TopicReplyList:message:4118184:2":{"__typename":"QueryVariables","id":"TopicReplyList:message:4118184:2","value":{"id":"message:4118184","first":10,"sorts":{"postTime":{"direction":"DESC"}},"repliesFirst":3,"repliesFirstDepthThree":1,"repliesSorts":{"postTime":{"direction":"DESC"}},"useAvatar":true,"useAuthorLogin":true,"useAuthorRank":true,"useBody":true,"useKudosCount":true,"useTimeToRead":false,"useMedia":false,"useReadOnlyIcon":false,"useRepliesCount":true,"useSearchSnippet":false,"useAcceptedSolutionButton":false,"useSolvedBadge":false,"useAttachments":false,"attachmentsFirst":5,"useTags":true,"useNodeAncestors":false,"useUserHoverCard":false,"useNodeHoverCard":false,"useModerationStatus":true,"usePreviewSubjectModal":false,"useMessageStatus":true}},"ROOT_MUTATION":{"__typename":"Mutation"},"CachedAsset:text:en_US-components/community/Navbar-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/community/Navbar-1737571274000","value":{"community":"Community Home","inbox":"Inbox","manageContent":"Manage Content","tos":"Terms of Service","forgotPassword":"Forgot Password","themeEditor":"Theme Editor","edit":"Edit Navigation Bar","skipContent":"Skip to content","gxcuf89792":"Tech Community","external-1":"Events","s-m-b":"Small and Medium Businesses","windows-server":"Windows Server","education-sector":"Education Sector","driving-adoption":"Driving Adoption","microsoft-learn":"Microsoft Learn","s-q-l-server":"SQL Server","partner-community":"Microsoft Partner Community","microsoft365":"Microsoft 365","external-9":".NET","external-8":"Teams","external-7":"Github","products-services":"Products","external-6":"Power Platform","communities-1":"Topics","external-5":"Microsoft Security","planner":"Planner","external-4":"Microsoft 365","external-3":"Dynamics 365","azure":"Azure","healthcare-and-life-sciences":"Healthcare and Life Sciences","external-2":"Azure","microsoft-mechanics":"Microsoft 
Mechanics","microsoft-learn-1":"Community","external-10":"Learning Room Directory","microsoft-learn-blog":"Blog","windows":"Windows","i-t-ops-talk":"ITOps Talk","external-link-1":"View All","microsoft-securityand-compliance":"Microsoft Security","public-sector":"Public Sector","community-info-center":"Lounge","external-link-2":"View All","microsoft-teams":"Microsoft Teams","external":"Blogs","microsoft-endpoint-manager":"Microsoft Intune and Configuration Manager","startupsat-microsoft":"Startups at Microsoft","exchange":"Exchange","a-i":"AI and Machine Learning","io-t":"Internet of Things (IoT)","outlook":"Outlook","external-link":"Community Hubs","communities":"Products"},"localOverride":false},"CachedAsset:text:en_US-components/community/NavbarHamburgerDropdown-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/community/NavbarHamburgerDropdown-1737571274000","value":{"hamburgerLabel":"Side Menu"},"localOverride":false},"CachedAsset:text:en_US-components/community/BrandLogo-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/community/BrandLogo-1737571274000","value":{"logoAlt":"Khoros","themeLogoAlt":"Brand Logo"},"localOverride":false},"CachedAsset:text:en_US-components/community/NavbarTextLinks-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/community/NavbarTextLinks-1737571274000","value":{"more":"More"},"localOverride":false},"CachedAsset:text:en_US-components/authentication/AuthenticationLink-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/authentication/AuthenticationLink-1737571274000","value":{"title.login":"Sign In","title.registration":"Register","title.forgotPassword":"Forgot Password","title.multiAuthLogin":"Sign In"},"localOverride":false},"CachedAsset:text:en_US-components/nodes/NodeLink-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/nodes/NodeLink-1737571274000","value":{"place":"Place {name}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageView/MessageViewStandard-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageView/MessageViewStandard-1737571274000","value":{"anonymous":"Anonymous","author":"{messageAuthorLogin}","authorBy":"{messageAuthorLogin}","board":"{messageBoardTitle}","replyToUser":" to {parentAuthor}","showMoreReplies":"Show More","replyText":"Reply","repliesText":"Replies","markedAsSolved":"Marked as Solved","movedMessagePlaceholder.BLOG":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholder.TKB":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholder.FORUM":"{count, plural, =0 {This reply has been} other {These replies have been} }","movedMessagePlaceholder.IDEA":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholder.OCCASION":"{count, plural, =0 {This comment has been} other {These comments have been} }","movedMessagePlaceholderUrlText":"moved.","messageStatus":"Status: ","statusChanged":"Status changed: {previousStatus} to {currentStatus}","statusAdded":"Status added: {status}","statusRemoved":"Status removed: {status}","labelExpand":"expand replies","labelCollapse":"collapse replies","unhelpfulReason.reason1":"Content is outdated","unhelpfulReason.reason2":"Article is missing information","unhelpfulReason.reason3":"Content is for a different Product","unhelpfulReason.reason4":"Doesn't match what I was searching 
for"},"localOverride":false},"CachedAsset:text:en_US-components/messages/ThreadedReplyList-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/ThreadedReplyList-1737571274000","value":{"title":"{count, plural, one{# Reply} other{# Replies}}","title@board:BLOG":"{count, plural, one{# Comment} other{# Comments}}","title@board:TKB":"{count, plural, one{# Comment} other{# Comments}}","title@board:IDEA":"{count, plural, one{# Comment} other{# Comments}}","title@board:OCCASION":"{count, plural, one{# Comment} other{# Comments}}","noRepliesTitle":"No Replies","noRepliesTitle@board:BLOG":"No Comments","noRepliesTitle@board:TKB":"No Comments","noRepliesTitle@board:IDEA":"No Comments","noRepliesTitle@board:OCCASION":"No Comments","noRepliesDescription":"Be the first to reply","noRepliesDescription@board:BLOG":"Be the first to comment","noRepliesDescription@board:TKB":"Be the first to comment","noRepliesDescription@board:IDEA":"Be the first to comment","noRepliesDescription@board:OCCASION":"Be the first to comment","messageReadOnlyAlert:BLOG":"Comments have been turned off for this post","messageReadOnlyAlert:TKB":"Comments have been turned off for this article","messageReadOnlyAlert:IDEA":"Comments have been turned off for this idea","messageReadOnlyAlert:FORUM":"Replies have been turned off for this discussion","messageReadOnlyAlert:OCCASION":"Comments have been turned off for this event"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageReplyCallToAction-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageReplyCallToAction-1737571274000","value":{"leaveReply":"Leave a reply...","leaveReply@board:BLOG@message:root":"Leave a comment...","leaveReply@board:TKB@message:root":"Leave a comment...","leaveReply@board:IDEA@message:root":"Leave a comment...","leaveReply@board:OCCASION@message:root":"Leave a comment...","repliesTurnedOff.FORUM":"Replies are turned off for this topic","repliesTurnedOff.BLOG":"Comments are turned off for this topic","repliesTurnedOff.TKB":"Comments are turned off for this topic","repliesTurnedOff.IDEA":"Comments are turned off for this topic","repliesTurnedOff.OCCASION":"Comments are turned off for this topic","infoText":"Stop poking me!"},"localOverride":false},"Rank:rank:37":{"__typename":"Rank","id":"rank:37","position":18,"name":"Copper 
Contributor","color":"333333","icon":null,"rankStyle":"TEXT"},"User:user:2722531":{"__typename":"User","id":"user:2722531","uid":2722531,"login":"jmohren","biography":null,"registrationData":{"__typename":"RegistrationData","status":null,"registrationTime":"2024-10-18T04:25:05.584-07:00"},"deleted":false,"email":"","avatar":{"__typename":"UserAvatar","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/m_assets/avatars/default/avatar-7.svg?time=0"},"rank":{"__ref":"Rank:rank:37"},"entityType":"USER","eventPath":"community:gxcuf89792/user:2722531"},"ModerationData:moderation_data:4273867":{"__typename":"ModerationData","id":"moderation_data:4273867","status":"APPROVED","rejectReason":null,"isReportedAbuse":false,"rejectUser":null,"rejectTime":null,"rejectActorType":null},"BlogReplyMessage:message:4273867":{"__typename":"BlogReplyMessage","author":{"__ref":"User:user:2722531"},"id":"message:4273867","revisionNum":1,"uid":4273867,"depth":1,"hasGivenKudo":false,"subscribed":false,"board":{"__ref":"Blog:board:Azure-AI-Services-blog"},"parent":{"__ref":"BlogTopicMessage:message:4118184"},"conversation":{"__ref":"Conversation:conversation:4118184"},"subject":"Re: Build Intelligent RAG For Multimodality and Complex Document Structure","moderationData":{"__ref":"ModerationData:moderation_data:4273867"},"body":"

Awesome work, mrajguru. Are you aware of any plans to include this functionality of extracting images for multimodal RAG in the azure-ai-documentintelligence Python package soon? Or is creating your own implementation with changes to the code the way to go?
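Until such support ships in the package, one do-it-yourself route is to read the figures collection that the prebuilt-layout model already returns (with a 2024-02-29-preview or later API version) and crop those regions out of the source PDF yourself. The following is a minimal sketch of that idea, not the package's own API: it assumes PyMuPDF for the cropping, and the endpoint, key, and file path are placeholders.

```python
# Minimal sketch (assumptions above): crop the figure regions reported by
# prebuilt-layout out of the source PDF using PyMuPDF.
import fitz  # PyMuPDF
from azure.core.credentials import AzureKeyCredential
from azure.ai.documentintelligence import DocumentIntelligenceClient

ENDPOINT, KEY, FILE_PATH = "<your-endpoint>", "<your-key>", "sample.pdf"  # placeholders

client = DocumentIntelligenceClient(ENDPOINT, AzureKeyCredential(KEY))
with open(FILE_PATH, "rb") as f:
    poller = client.begin_analyze_document(
        "prebuilt-layout", f, content_type="application/octet-stream"
    )
result = poller.result()

pdf = fitz.open(FILE_PATH)
for i, figure in enumerate(result.figures or []):
    region = figure.bounding_regions[0]
    xs, ys = region.polygon[0::2], region.polygon[1::2]
    # For PDF input the polygon coordinates are in inches;
    # PyMuPDF measures in points (72 per inch), hence the scaling.
    clip = fitz.Rect(min(xs) * 72, min(ys) * 72, max(xs) * 72, max(ys) * 72)
    pix = pdf[region.page_number - 1].get_pixmap(clip=clip, dpi=200)
    pix.save(f"figure_{i}.png")
```

Each saved crop can then go through the same GPT-4-Vision captioning step described in the post before indexing.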

","body@stripHtml({\"removeProcessingText\":false,\"removeSpoilerMarkup\":false,\"removeTocMarkup\":false,\"truncateLength\":200})@stringLength":"203","kudosSumWeight":0,"repliesCount":0,"postTime":"2024-10-18T04:29:39.931-07:00","lastPublishTime":"2024-10-18T04:29:39.931-07:00","metrics":{"__typename":"MessageMetrics","views":587},"visibilityScope":"PUBLIC","placeholder":false,"originalMessageForPlaceholder":null,"entityType":"BLOG_REPLY","eventPath":"category:AI/category:solutions/category:communities/community:gxcuf89792board:Azure-AI-Services-blog/message:4118184/message:4273867","replies":{"__typename":"MessageConnection","pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null},"edges":[]},"customFields":[],"attachments":{"__typename":"AttachmentConnection","edges":[],"pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null}}},"User:user:2508126":{"__typename":"User","id":"user:2508126","uid":2508126,"login":"Sri_Kamali_R","biography":null,"registrationData":{"__typename":"RegistrationData","status":null,"registrationTime":"2024-06-05T01:37:00.977-07:00"},"deleted":false,"email":"","avatar":{"__typename":"UserAvatar","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/m_assets/avatars/default/avatar-9.svg?time=0"},"rank":{"__ref":"Rank:rank:37"},"entityType":"USER","eventPath":"community:gxcuf89792/user:2508126"},"ModerationData:moderation_data:4160306":{"__typename":"ModerationData","id":"moderation_data:4160306","status":"APPROVED","rejectReason":null,"isReportedAbuse":false,"rejectUser":null,"rejectTime":null,"rejectActorType":null},"BlogReplyMessage:message:4160306":{"__typename":"BlogReplyMessage","author":{"__ref":"User:user:2508126"},"id":"message:4160306","revisionNum":1,"uid":4160306,"depth":1,"hasGivenKudo":false,"subscribed":false,"board":{"__ref":"Blog:board:Azure-AI-Services-blog"},"parent":{"__ref":"BlogTopicMessage:message:4118184"},"conversation":{"__ref":"Conversation:conversation:4118184"},"subject":"Re: Build Intelligent RAG For Multimodality and Complex Document Structure","moderationData":{"__ref":"ModerationData:moderation_data:4160306"},"body":"

Thank you for your post. It is very helpful. It would be really helpful if it were available in the Australia East region. :smile:

","body@stripHtml({\"removeProcessingText\":false,\"removeSpoilerMarkup\":false,\"removeTocMarkup\":false,\"truncateLength\":200})@stringLength":"132","kudosSumWeight":0,"repliesCount":0,"postTime":"2024-06-05T01:39:09.250-07:00","lastPublishTime":"2024-06-05T01:39:09.250-07:00","metrics":{"__typename":"MessageMetrics","views":4206},"visibilityScope":"PUBLIC","placeholder":false,"originalMessageForPlaceholder":null,"entityType":"BLOG_REPLY","eventPath":"category:AI/category:solutions/category:communities/community:gxcuf89792board:Azure-AI-Services-blog/message:4118184/message:4160306","replies":{"__typename":"MessageConnection","pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null},"edges":[]},"customFields":[],"attachments":{"__typename":"AttachmentConnection","edges":[],"pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null}}},"ModerationData:moderation_data:4138216":{"__typename":"ModerationData","id":"moderation_data:4138216","status":"APPROVED","rejectReason":null,"isReportedAbuse":false,"rejectUser":null,"rejectTime":null,"rejectActorType":null},"BlogReplyMessage:message:4138216":{"__typename":"BlogReplyMessage","author":{"__ref":"User:user:2080373"},"id":"message:4138216","revisionNum":1,"uid":4138216,"depth":1,"hasGivenKudo":false,"subscribed":false,"board":{"__ref":"Blog:board:Azure-AI-Services-blog"},"parent":{"__ref":"BlogTopicMessage:message:4118184"},"conversation":{"__ref":"Conversation:conversation:4118184"},"subject":"Re: Build Intelligent RAG For Multimodality and Complex Document Structure","moderationData":{"__ref":"ModerationData:moderation_data:4138216"},"body":"

Thanks, I have added the same.

","body@stripHtml({\"removeProcessingText\":false,\"removeSpoilerMarkup\":false,\"removeTocMarkup\":false,\"truncateLength\":200})@stringLength":"31","kudosSumWeight":0,"repliesCount":0,"postTime":"2024-05-12T23:44:14.996-07:00","lastPublishTime":"2024-05-12T23:44:14.996-07:00","metrics":{"__typename":"MessageMetrics","views":5481},"visibilityScope":"PUBLIC","placeholder":false,"originalMessageForPlaceholder":null,"entityType":"BLOG_REPLY","eventPath":"category:AI/category:solutions/category:communities/community:gxcuf89792board:Azure-AI-Services-blog/message:4118184/message:4138216","replies":{"__typename":"MessageConnection","pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null},"edges":[]},"customFields":[],"attachments":{"__typename":"AttachmentConnection","edges":[],"pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null}}},"User:user:2467960":{"__typename":"User","id":"user:2467960","uid":2467960,"login":"davidfungf","biography":null,"registrationData":{"__typename":"RegistrationData","status":null,"registrationTime":"2024-05-12T02:14:47.436-07:00"},"deleted":false,"email":"","avatar":{"__typename":"UserAvatar","url":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/m_assets/avatars/default/avatar-5.svg?time=0"},"rank":{"__ref":"Rank:rank:37"},"entityType":"USER","eventPath":"community:gxcuf89792/user:2467960"},"ModerationData:moderation_data:4137862":{"__typename":"ModerationData","id":"moderation_data:4137862","status":"APPROVED","rejectReason":null,"isReportedAbuse":false,"rejectUser":null,"rejectTime":null,"rejectActorType":null},"BlogReplyMessage:message:4137862":{"__typename":"BlogReplyMessage","author":{"__ref":"User:user:2467960"},"id":"message:4137862","revisionNum":1,"uid":4137862,"depth":1,"hasGivenKudo":false,"subscribed":false,"board":{"__ref":"Blog:board:Azure-AI-Services-blog"},"parent":{"__ref":"BlogTopicMessage:message:4118184"},"conversation":{"__ref":"Conversation:conversation:4118184"},"subject":"Re: Build Intelligent RAG For Multimodality and Complex Document Structure","moderationData":{"__ref":"ModerationData:moderation_data:4137862"},"body":"

Thank you for your post. It is very helpful. I am going to try the code, but both the src folder and the build-docker-image.sh file are missing from the GitHub repo:

https://github.com/monuminu/AOAI_Samples/tree/main/multimodal_rag

","body@stripHtml({\"removeProcessingText\":false,\"removeSpoilerMarkup\":false,\"removeTocMarkup\":false,\"truncateLength\":200})@stringLength":"213","kudosSumWeight":1,"repliesCount":0,"postTime":"2024-05-12T03:11:14.423-07:00","lastPublishTime":"2024-05-12T03:11:14.423-07:00","metrics":{"__typename":"MessageMetrics","views":5515},"visibilityScope":"PUBLIC","placeholder":false,"originalMessageForPlaceholder":null,"entityType":"BLOG_REPLY","eventPath":"category:AI/category:solutions/category:communities/community:gxcuf89792board:Azure-AI-Services-blog/message:4118184/message:4137862","replies":{"__typename":"MessageConnection","pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null},"edges":[]},"customFields":[],"attachments":{"__typename":"AttachmentConnection","edges":[],"pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null}}},"ModerationData:moderation_data:4133480":{"__typename":"ModerationData","id":"moderation_data:4133480","status":"APPROVED","rejectReason":null,"isReportedAbuse":false,"rejectUser":null,"rejectTime":null,"rejectActorType":null},"BlogReplyMessage:message:4133480":{"__typename":"BlogReplyMessage","author":{"__ref":"User:user:2080373"},"id":"message:4133480","revisionNum":1,"uid":4133480,"depth":1,"hasGivenKudo":false,"subscribed":false,"board":{"__ref":"Blog:board:Azure-AI-Services-blog"},"parent":{"__ref":"BlogTopicMessage:message:4118184"},"conversation":{"__ref":"Conversation:conversation:4118184"},"subject":"Re: Build Intelligent RAG For Multimodality and Complex Document Structure","moderationData":{"__ref":"ModerationData:moderation_data:4133480"},"body":"

Please see https://github.com/monuminu/AOAI_Samples/tree/main/multimodal_rag for the GitHub link.

","body@stripHtml({\"removeProcessingText\":false,\"removeSpoilerMarkup\":false,\"removeTocMarkup\":false,\"truncateLength\":200})@stringLength":"100","kudosSumWeight":2,"repliesCount":0,"postTime":"2024-05-07T06:59:00.371-07:00","lastPublishTime":"2024-05-07T06:59:00.371-07:00","metrics":{"__typename":"MessageMetrics","views":5808},"visibilityScope":"PUBLIC","placeholder":false,"originalMessageForPlaceholder":null,"entityType":"BLOG_REPLY","eventPath":"category:AI/category:solutions/category:communities/community:gxcuf89792board:Azure-AI-Services-blog/message:4118184/message:4133480","replies":{"__typename":"MessageConnection","pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null},"edges":[]},"customFields":[],"attachments":{"__typename":"AttachmentConnection","edges":[],"pageInfo":{"__typename":"PageInfo","hasNextPage":false,"endCursor":null,"hasPreviousPage":false,"startCursor":null}}},"CachedAsset:text:en_US-components/community/NavbarDropdownToggle-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/community/NavbarDropdownToggle-1737571274000","value":{"ariaLabelClosed":"Press the down arrow to open the menu"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/common/QueryHandler-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/common/QueryHandler-1737571274000","value":{"title":"Query Handler"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageCoverImage-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageCoverImage-1737571274000","value":{"coverImageTitle":"Cover Image"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeTitle-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeTitle-1737571274000","value":{"nodeTitle":"{nodeTitle, select, community {Community} other {{nodeTitle}}} "},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageTimeToRead-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageTimeToRead-1737571274000","value":{"minReadText":"{min} MIN READ"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageSubject-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageSubject-1737571274000","value":{"noSubject":"(no subject)"},"localOverride":false},"CachedAsset:text:en_US-components/users/UserLink-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/users/UserLink-1737571274000","value":{"authorName":"View Profile: {author}","anonymous":"Anonymous"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/users/UserRank-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/users/UserRank-1737571274000","value":{"rankName":"{rankName}","userRank":"Author rank {rankName}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageTime-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageTime-1737571274000","value":{"postTime":"Published: {time}","lastPublishTime":"Last Update: {time}","conversation.lastPostingActivityTime":"Last posting activity time: {time}","conversation.lastPostTime":"Last post time: {time}","moderationData.rejectTime":"Rejected time: 
{time}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageBody-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageBody-1737571274000","value":{"showMessageBody":"Show More","mentionsErrorTitle":"{mentionsType, select, board {Board} user {User} message {Message} other {}} No Longer Available","mentionsErrorMessage":"The {mentionsType} you are trying to view has been removed from the community.","videoProcessing":"Video is being processed. Please try again in a few minutes.","bannerTitle":"Video provider requires cookies to play the video. Accept to continue or {url} it directly on the provider's site.","buttonTitle":"Accept","urlText":"watch"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageCustomFields-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageCustomFields-1737571274000","value":{"CustomField.default.label":"Value of {name}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageRevision-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageRevision-1737571274000","value":{"lastUpdatedDatePublished":"{publishCount, plural, one{Published} other{Updated}} {date}","lastUpdatedDateDraft":"Created {date}","version":"Version {major}.{minor}"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageReplyButton-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageReplyButton-1737571274000","value":{"repliesCount":"{count}","title":"Reply","title@board:BLOG@message:root":"Comment","title@board:TKB@message:root":"Comment","title@board:IDEA@message:root":"Comment","title@board:OCCASION@message:root":"Comment"},"localOverride":false},"CachedAsset:text:en_US-components/messages/MessageAuthorBio-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/messages/MessageAuthorBio-1737571274000","value":{"sendMessage":"Send Message","actionMessage":"Follow this blog board to get notified when there's new activity","coAuthor":"CO-PUBLISHER","contributor":"CONTRIBUTOR","userProfile":"View Profile","iconlink":"Go to {name} {type}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/users/UserAvatar-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/users/UserAvatar-1737571274000","value":{"altText":"{login}'s avatar","altTextGeneric":"User's avatar"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/ranks/UserRankLabel-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/ranks/UserRankLabel-1737571274000","value":{"altTitle":"Icon for {rankName} rank"},"localOverride":false},"CachedAsset:text:en_US-components/users/UserRegistrationDate-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/users/UserRegistrationDate-1737571274000","value":{"noPrefix":"{date}","withPrefix":"Joined {date}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeAvatar-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeAvatar-1737571274000","value":{"altTitle":"Node avatar for 
{nodeTitle}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeDescription-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeDescription-1737571274000","value":{"description":"{description}"},"localOverride":false},"CachedAsset:text:en_US-components/tags/TagView/TagViewChip-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-components/tags/TagView/TagViewChip-1737571274000","value":{"tagLabelName":"Tag name {tagName}"},"localOverride":false},"CachedAsset:text:en_US-shared/client/components/nodes/NodeIcon-1737571274000":{"__typename":"CachedAsset","id":"text:en_US-shared/client/components/nodes/NodeIcon-1737571274000","value":{"contentType":"Content Type {style, select, FORUM {Forum} BLOG {Blog} TKB {Knowledge Base} IDEA {Ideas} OCCASION {Events} other {}} icon"},"localOverride":false}}}},"page":"/blogs/BlogMessagePage/BlogMessagePage","query":{"boardId":"azure-ai-services-blog","messageSubject":"build-intelligent-rag-for-multimodality-and-complex-document-structure","messageId":"4118184"},"buildId":"HEhyUrv5OXNBIbfCLaOrw","runtimeConfig":{"buildInformationVisible":false,"logLevelApp":"info","logLevelMetrics":"info","openTelemetryClientEnabled":false,"openTelemetryConfigName":"o365","openTelemetryServiceVersion":"25.1.0","openTelemetryUniverse":"prod","openTelemetryCollector":"http://localhost:4318","openTelemetryRouteChangeAllowedTime":"5000","apolloDevToolsEnabled":false,"inboxMuteWipFeatureEnabled":false},"isFallback":false,"isExperimentalCompile":false,"dynamicIds":["./components/community/Navbar/NavbarWidget.tsx","./components/community/Breadcrumb/BreadcrumbWidget.tsx","./components/customComponent/CustomComponent/CustomComponent.tsx","./components/blogs/BlogArticleWidget/BlogArticleWidget.tsx","./components/external/components/ExternalComponent.tsx","./components/messages/MessageView/MessageViewStandard/MessageViewStandard.tsx","./components/messages/ThreadedReplyList/ThreadedReplyList.tsx","../shared/client/components/common/List/UnstyledList/UnstyledList.tsx","./components/messages/MessageView/MessageView.tsx","../shared/client/components/common/List/UnwrappedList/UnwrappedList.tsx","./components/tags/TagView/TagView.tsx","./components/tags/TagView/TagViewChip/TagViewChip.tsx"],"appGip":true,"scriptLoader":[{"id":"analytics","src":"https://techcommunity.microsoft.com/t5/s/gxcuf89792/pagescripts/1730819800000/analytics.js?page.id=BlogMessagePage&entity.id=board%3Aazure-ai-services-blog&entity.id=message%3A4118184","strategy":"afterInteractive"}]}