Extracting information from unstructured documents (e.g., contracts) with Azure Form Recognizer

Published Mar 28 2022 01:43 PM
Microsoft


 

Extracting information from unstructured documents such as contracts is usually a manual process: it involves tediously reading and understanding large volumes of documents to find specific information and then transcribing that information by hand to digitize it. The process consumes a significant amount of a lawyer’s billable hours and is prone to human error. With Azure Form Recognizer you can automate this process. Azure Form Recognizer uses deep learning models and enables you to train a custom contract model that extracts the information you need from just a few sample documents.

 

Introduction to the new Azure Form Recognizer Custom Neural (document) model

Organizations today deal with vast quantities of unstructured documents, including contracts, financial or medical reports, and publications. Using AI to process these unstructured documents and extract the right fields based on their semantics improves decision making and time to value.

The neural (custom document) model is a new deep-learning model that extracts fields from structured and unstructured documents. The new model shares the same labeling approach as the existing custom form or template models, and you can start with just 5 labeled documents to train a model. With a common labeling format, it is easy to take your existing template or custom form project and train a neural (custom document) model, or to start from scratch and label new documents. When dealing with variations, simply add a few samples of each variation to the training dataset; custom document models generalize well across variations.

 

When to use this new capability

Custom neural models (also called neural models) are deep-learning models that combine layout and language features to accurately extract labeled fields from documents. The base custom neural model is pretrained on a variety of document types, which makes it suitable for extracting fields from structured, semi-structured, and unstructured documents. Use the new custom neural model to train on unstructured documents such as contracts, scopes of work, or letters, or to train a single model for documents of the same type that come in different formats, such as pay stubs or bank statements.

 

Getting started is simple

Let's take contracts as an example and dive into the following steps:

 

Step 1 - Azure Blob Storage container

You will need a Standard performance Azure Blob Storage account. Within your storage account, you will create containers to store and organize your training documents. If you do not know how to create an Azure storage account with a container, follow these quickstarts:

  1. Create a storage account. When creating your storage account, make sure to select Standard performance in the Instance details → Performance field.
  2. Create a container. When creating your container, set the Public access level field to Container (anonymous read access for containers and blobs) in the New Container window.
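If you prefer to script this setup, the same container can be created with the azure-storage-blob Python SDK. This is a minimal sketch; the account name, key, and container name below are placeholders, not values from this article:

```python
# Sketch: create a blob container with anonymous (container-level) read access,
# matching the "Public access level: Container" portal setting described above.
# Assumes the azure-storage-blob package; account/container names are placeholders.

def container_url(account_name: str, container_name: str) -> str:
    """Build the public URL of a blob container."""
    return f"https://{account_name}.blob.core.windows.net/{container_name}"

def create_training_container(account_name: str, account_key: str,
                              container_name: str = "contracts-training"):
    from azure.storage.blob import BlobServiceClient  # deferred: SDK only needed here
    client = BlobServiceClient(
        account_url=f"https://{account_name}.blob.core.windows.net",
        credential=account_key)
    # public_access="container" corresponds to anonymous read access
    # for containers and blobs in the portal's New Container window.
    return client.create_container(container_name, public_access="container")

print(container_url("mystorageacct", "contracts-training"))
```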

Configure CORS

CORS (Cross Origin Resource Sharing) needs to be configured on your Azure storage account for it to be accessible from the Form Recognizer Studio. To configure CORS in the Azure portal, you will need access to the CORS blade of your storage account.


  1. Select the CORS blade for the storage account.
  2. Start by creating a new CORS entry in the Blob service.
  3. Set the Allowed origins to https://formrecognizer.appliedai.azure.com.
  4. Select all eight of the available options for Allowed methods.
  5. Allow all Allowed headers and Exposed headers by entering an * in each field.
  6. Set the Max Age to 120 seconds (about 2 minutes) or any other acceptable value.
  7. Select Save at the top of the page to save the changes.

CORS should now be configured to use the storage account from Form Recognizer Studio.
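The same CORS rule can also be applied programmatically. Here is a sketch using the azure-storage-blob SDK; the account name and key are placeholders, and the rule mirrors the portal settings listed above:

```python
# Sketch: apply the CORS rule from the steps above via the Blob service API.
# Placeholders: storage account name and key. The rule mirrors the portal
# settings: the Studio origin, all eight methods, * headers, 120s max age.

STUDIO_ORIGIN = "https://formrecognizer.appliedai.azure.com"

def studio_cors_settings() -> dict:
    """The CORS rule parameters matching the portal steps."""
    return {
        "allowed_origins": [STUDIO_ORIGIN],
        "allowed_methods": ["DELETE", "GET", "HEAD", "MERGE",
                            "OPTIONS", "PATCH", "POST", "PUT"],
        "allowed_headers": ["*"],
        "exposed_headers": ["*"],
        "max_age_in_seconds": 120,
    }

def apply_studio_cors(account_name: str, account_key: str) -> None:
    from azure.storage.blob import BlobServiceClient, CorsRule  # deferred import
    settings = studio_cors_settings()
    rule = CorsRule(
        settings["allowed_origins"], settings["allowed_methods"],
        allowed_headers=settings["allowed_headers"],
        exposed_headers=settings["exposed_headers"],
        max_age_in_seconds=settings["max_age_in_seconds"])
    client = BlobServiceClient(
        account_url=f"https://{account_name}.blob.core.windows.net",
        credential=account_key)
    client.set_service_properties(cors=[rule])
```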

Step 2 - Sample contract document set

  1. Go to the Azure portal and navigate as follows: Your storage account → Data storage → Containers


  2. Select a container from the list.
  3. Select Upload from the menu at the top of the page and upload your training documents to the container. You will need 5 documents to get started.


 

  4. The Upload blob window will appear.
  5. Select your file(s) to upload.


To train the model you will need 5 contract documents to get started.
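Uploading the training set can also be scripted. The sketch below assumes the azure-storage-blob SDK; the account credentials, container name, and local folder path are placeholders:

```python
# Sketch: upload local sample contracts to the training container.
# Placeholders: storage account credentials, container name, local folder path.
import os

MIN_TRAINING_DOCS = 5  # custom neural models can start from 5 labeled documents

def pdf_files(folder: str) -> list:
    """List the PDF files in a local folder (sorted for stable ordering)."""
    return sorted(f for f in os.listdir(folder) if f.lower().endswith(".pdf"))

def upload_training_docs(account_name: str, account_key: str, folder: str,
                         container_name: str = "contracts-training") -> None:
    from azure.storage.blob import ContainerClient  # deferred import
    container = ContainerClient(
        account_url=f"https://{account_name}.blob.core.windows.net",
        container_name=container_name, credential=account_key)
    names = pdf_files(folder)
    if len(names) < MIN_TRAINING_DOCS:
        raise ValueError(
            f"Need at least {MIN_TRAINING_DOCS} documents, found {len(names)}")
    for name in names:
        with open(os.path.join(folder, name), "rb") as data:
            container.upload_blob(name, data, overwrite=True)
```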

 

Step 3 - Create a custom contracts model

To create a custom contracts model, start by configuring your project:

  1. Log in to the Azure Form Recognizer Studio: https://formrecognizer.appliedai.azure.com
  2. From the Studio home, select the Custom model card to open the Custom model page.
  3. Use the "Create a project" command to start the new project configuration wizard.
  4. Enter the project details, select the Azure subscription and resource, and select the Azure Blob Storage container that contains your contract data (created in the previous step).
  5. Review and submit your settings to create the project.
  6. Label and tag the data points you would like to extract from the documents.
  7. Select the Train command and:
    • enter a model name
    • select the custom neural (document) model
    • train the model
  8. Once the model is ready, use the Test command to validate it with a test document that you did not use in your training dataset.

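The same training flow can be driven from code with the admin client. Note that this is a sketch: the build method's name and signature changed across the azure-ai-formrecognizer 3.2.0 beta releases, so the call below assumes the 3.2.0b3 shape used elsewhere in this article; check your installed version. The SAS URL and model ID are placeholders:

```python
# Sketch: train ("build") a custom neural model from labeled data in a blob
# container. Assumes the azure-ai-formrecognizer 3.2.0b3 signature
# begin_build_model(source, build_mode, ...) - verify against your SDK version.
import os

def build_contracts_model(container_sas_url: str, model_id: str = "Contracts") -> str:
    from azure.core.credentials import AzureKeyCredential            # deferred import
    from azure.ai.formrecognizer import DocumentModelAdministrationClient
    client = DocumentModelAdministrationClient(
        endpoint=os.environ["AZURE_FORM_RECOGNIZER_ENDPOINT"],
        credential=AzureKeyCredential(os.environ["AZURE_FORM_RECOGNIZER_KEY"]))
    poller = client.begin_build_model(
        container_sas_url, "neural", model_id=model_id)  # "neural" = custom neural model
    model = poller.result()  # neural training can take several minutes
    return model.model_id
```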

 

Step 4 - Start coding to analyze documents using the model

After testing your model with sample documents in the Form Recognizer Studio, you can analyze documents directly from your application using the Form Recognizer REST API or SDK.

 

 

!pip install azure-ai-formrecognizer==3.2.0b3
import os

import pandas as pd
from azure.ai.formrecognizer import DocumentAnalysisClient, DocumentModelAdministrationClient
from azure.core.credentials import AzureKeyCredential

endpoint = os.environ["AZURE_FORM_RECOGNIZER_ENDPOINT"]
key = os.environ["AZURE_FORM_RECOGNIZER_KEY"]

# List the models available on this Form Recognizer resource.
document_admin_client = DocumentModelAdministrationClient(
    endpoint=endpoint, credential=AzureKeyCredential(key))
for model in document_admin_client.list_models():
    print("{} | {}".format(model.model_id, model.description))

document_analysis_client = DocumentAnalysisClient(
    endpoint=endpoint, credential=AzureKeyCredential(key))

path_to_sample_documents = os.path.abspath(
    os.path.join(os.getcwd(), "sample_contract.pdf"))

# Analyze the sample contract with the custom model trained above ("Contracts").
with open(path_to_sample_documents, "rb") as f:
    poller = document_analysis_client.begin_analyze_document(
        "Contracts", document=f)
result = poller.result()

for idx, document in enumerate(result.documents):
    print("--------Analyzing document #{}--------".format(idx + 1))
    print("Document has type {}".format(document.doc_type))
    print("Document has confidence {}".format(document.confidence))
    print("Document was analyzed by model with ID {}".format(result.model_id))
    for name, field in document.fields.items():
        field_value = field.value if field.value else field.content
        if field.value_type == "list":
            # Tabular fields: flatten each row's sub-fields into a DataFrame.
            df_list = []
            for row in field.value:
                a_row = {col: value.content for col, value in row.value.items()}
                df_list.append(a_row)
            display(pd.DataFrame(df_list))  # display() is available in notebooks
        else:
            print("'{}' with value '{}' and with confidence {}".format(
                name, field_value, field.confidence))

 

Now you are ready to analyze all your documents with Form Recognizer. 
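If you would rather call the REST API directly than use the SDK, the analyze call can be sketched with only the standard library. The URL shape and API version string below are assumptions based on the v3 preview API; verify them against the REST reference for your resource:

```python
# Sketch: analyze a document over the raw Form Recognizer v3 preview REST API.
# Assumptions: the documentModels/{id}:analyze URL shape and the api-version
# string below - check the REST reference before relying on them.
import json
import time
import urllib.request

API_VERSION = "2022-01-30-preview"  # assumed v3 preview API version string

def analyze_url(endpoint: str, model_id: str) -> str:
    """Build the :analyze URL for a custom model."""
    return (f"{endpoint.rstrip('/')}/formrecognizer/documentModels/"
            f"{model_id}:analyze?api-version={API_VERSION}")

def analyze_pdf(endpoint: str, key: str, model_id: str, pdf_path: str) -> dict:
    with open(pdf_path, "rb") as f:
        req = urllib.request.Request(
            analyze_url(endpoint, model_id), data=f.read(), method="POST",
            headers={"Ocp-Apim-Subscription-Key": key,
                     "Content-Type": "application/pdf"})
    with urllib.request.urlopen(req) as resp:
        op_url = resp.headers["Operation-Location"]  # poll this for the result
    # The analyze operation is asynchronous: poll until it finishes.
    while True:
        poll = urllib.request.Request(
            op_url, headers={"Ocp-Apim-Subscription-Key": key})
        with urllib.request.urlopen(poll) as resp:
            body = json.load(resp)
        if body["status"] in ("succeeded", "failed"):
            return body
        time.sleep(2)
```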

 

Additional resources

  • Learn more about Form Recognizer here
  • See what is new in the latest Form Recognizer release here
Last update: Mar 28 2022 01:45 PM