🚀 New & Improved Data Mapper UX in Azure Logic Apps – Now in Public Preview!
We’re excited to announce that a UX update for Data Mapper in Azure Logic Apps is now in Public Preview! We have continuously improved Data Mapper, which is already generally available (GA), based on customer feedback. Last year, we conducted a private preview to assess the improvements in the new user experience and confirm that we are on the right track in simplifying complex data transformations, including EDI schemas. With the insights gained, we made significant UI enhancements and added features to streamline the mapping process.

Feedback
We value your feedback to make the Data Mapper even better. Please share your thoughts, suggestions, and overall experience with us through our feedback form.

How feedback shaped the Public Preview
Throughout the evolution of Data Mapper, we gathered valuable feedback from customers and partners. Key themes that emerged include:
- Reliability: ensuring the Data Mapper can handle large schemas and complex transformation logic, including functions.
- Error handling: providing real-time validation by allowing users to test payloads and catch errors while authoring maps.
- Looping: clearly indicating when repeating nodes are mapped and ensuring complex objects are properly represented.
- Drag & drop enhancements: improving how connections between nodes are created for better usability.
- Deserialization & namespace honoring: ensuring XML deserialization correctly loads mappings without data loss, preserving namespace integrity for seamless schema validation.
We’ve incorporated these suggestions into the public preview for a more refined and user-friendly experience.

What’s new in the Data Mapper UX?
1. Easier navigation: docked schema panels keep you oriented within the data map, and you can search for specific nodes to streamline mapping.
2. Side-by-side function panel: search and use 100+ built-in functions, including collection functions (for repeating elements), string manipulations, mathematical operations, and conditional logic.
3. Automatic looping for repeating nodes: when you map repeating nodes, a loop connection is automatically added on the immediate parent nodes at source and destination. Repeating parent nodes are denoted by "A1, A2" notation on hover. Note: if the child node in the source has a deeper nesting level than in the destination, you must manually map the connection from the repeating source node to the destination to ensure proper data transformation. (A minimal schema to try this is sketched after the setup steps below.)
4. Real-time error detection: on saving the map, instantly view warnings and errors for missing mappings or incorrect configurations.
5. Test your map instantly: preview the output before running your workflow.

How to set up and test the new Data Mapper experience
- Enable the preview: go to your Azure Logic Apps (Standard) extension -> Settings -> Data Mapper and select “Version ~2” to try out the new user experience.
- Light theme: enable "Light Theme" in VS Code before creating a new data map. Dark Theme is not supported yet, but it is on the roadmap and will be prioritized soon.
- Create a new data map: navigate to the Azure tab in the left-hand panel of VS Code, select “Create New Data Map”, and name it. Once it has loaded, select the schemas for source and destination.
- Upload schemas: upload your source and destination schemas before creating the map (e.g., .xsd or .json files).
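As a quick way to exercise the automatic-looping behavior described above, any schema with a repeating node will do. Below is a minimal, hypothetical source XSD (the names and namespace are invented for illustration); mapping the repeating Order node to a repeating node in your destination schema should add the loop connection automatically:

<?xml version="1.0" encoding="utf-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/orders"
           xmlns="http://example.com/orders"
           elementFormDefault="qualified">
  <xs:element name="Orders">
    <xs:complexType>
      <xs:sequence>
        <!-- maxOccurs="unbounded" makes Order a repeating node -->
        <xs:element name="Order" maxOccurs="unbounded">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Id" type="xs:string" />
              <xs:element name="Total" type="xs:decimal" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>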
Limitations
While the new Data Mapper UX brings significant improvements, a few limitations remain:
- Filter function: the filter function correctly processes numeric conditions when they are enclosed in quotes (e.g., ">= 10"), but does not behave consistently for string comparisons (e.g., checking whether an item name equals "Pen"). We are actively working on refining this behavior.
- Custom functions: support for custom functions is coming in the next refresh to enhance flexibility in data mapping.
- Usability enhancements: improved tooltips, function labels, error messages, and other UX refinements are on the way to provide clearer guidance and a smoother mapping experience, especially for complex transformations.

Future investments
The product will keep getting better, with more features coming soon. Some immediate investments include:
- Enhanced test map experience: making it easier to validate mappings during development.
- Panel resizing: giving users flexibility to view larger schemas and functions when multiple panels are expanded.

🚀 General Availability: Enhanced Data Mapper Experience in Logic Apps (Standard)
We’re excited to announce the General Availability (GA) of the redesigned Data Mapper UX in the Azure Logic Apps (Standard) extension for Visual Studio Code. This release marks a major milestone in our journey to modernize and streamline data transformation workflows for integration developers.

What's new
The new UX, previously available in public preview, is now the default experience in the Logic Apps Standard extension. This GA release reflects direct feedback from our integration developer community. We’ve resolved blockers reported by customers and usability issues that impacted performance and stability, including:
- Opening V1 maps in V2: seamlessly open and edit maps you have already created, with the latest visual capabilities.
- Loading schemas on Mac: addressed schema-related crashes on macOS for a smoother experience.
- Function documentation updates: improved guidance and examples for built-in collection functions that apply to repeating nodes.

Stay connected
We would love to hear your feedback. Please use this form link to let us know about any gaps or scenarios that are not yet covered.

🔁 Public Preview Refresh: More Power to Data Mapper in Azure Logic Apps
We’re back with a Public Preview refresh for the Data Mapper in Azure Logic Apps (Standard) — bringing forward some long-standing capabilities that are now fully supported in the new UX. In our initial announcement, we introduced a redesigned experience focused on usability, error handling, and improved mapping for complex schemas. As we continue evolving the tool, we’re working to bring feature parity with the classic experience, while layering in modern enhancements along the way. With this update, several existing capabilities from the legacy Data Mapper are now available in the new preview version — so you can bring your advanced scenarios forward with confidence.

🛠️ Run XSLT Inside Your Data Map
The ability to apply XSLT has long been a powerful feature in Logic Apps, and we’re excited to bring Run XSLT support into the new UX. You can now invoke reusable transformation logic from your map, including enterprise-grade XSLT and predefined templates or logic from your BizTalk workflows. (An illustrative sketch appears below, after the Dark Mode section.)
How to try it out:
1. Create a new data map: right-click the MapDefinitions or Maps folder and click Create new data map.
2. Store the XSLT file under Artifacts -> DataMapper/Extension -> InlineXslt.
3. Open the data map and search for Run XSLT in the functions panel.
4. Select the function, then pick the XSLT file you want to run from the dropdown.
5. Connect it to the desired destination node.
In my case, the function simply adds a "Placeholder" value for the Name node at the destination, alongside an "EmployeeType" node. Note that you do not need to connect any source node to the XSLT function, because this custom XSLT logic is applied directly at the destination node. Upon testing the map, the right value is generated in the destination schema.

🔍 Execute XPath to Extract Targeted Values
Execute XPath is now supported in the new experience, giving you control to extract specific values from nested XML structures. This function is particularly useful for accessing attributes and nested elements, and for applying logic based on the structure or content of incoming data.
How to try it out:
1. Search for Execute XPath in the functions panel.
2. Select the function and add the expression you want to extract.
3. Map it to the destination node.
Here is what the map will look like: the test payload correctly creates multiple Address nodes at the destination based on the Address node at the source.

🧩 Use Custom XML Functions
Custom XML functions allow you to define and reuse logic across your map. This helps reduce duplication and supports schema-specific transformations. Now that support is available in the new UX, you can wrap complex logic into manageable components and handle schema-specific edge cases with ease.
How to try it out:
1. Add the .xml function file under Artifacts -> DataMapper/Extension -> Functions.
2. Open the data map and, under the Utility category of functions, select the new function. In our case, the XML function is called Age.
3. Connect the function input to the Date_of_Birth node at the source and the output to the Age node at the destination. The map will look something like this.
Test the map and notice that the age is calculated correctly at the destination node.

🌒 Dark Mode Support in VS Code
The new UX now respects Dark Mode in VS Code, giving you a visually cohesive and low-contrast authoring experience — perfect for long mapping sessions. No extra steps needed — Dark Mode works automatically based on your VS Code theme settings.
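To make the Run XSLT walkthrough above concrete, here is a minimal sketch of what an inline XSLT file stored under Artifacts -> DataMapper/Extension -> InlineXslt could look like. It mirrors the behavior described earlier (emitting a fixed "Placeholder" Name plus an EmployeeType element at the destination), but the element names and structure are illustrative, not taken from the product samples:

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- No source node needs to be connected: this template emits fixed content
       directly at the destination -->
  <xsl:template match="/">
    <Employee>
      <Name>Placeholder</Name>
      <EmployeeType>Contractor</EmployeeType>
    </Employee>
  </xsl:template>
</xsl:stylesheet>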
⚙️ How to Enable the New Experience
If you haven’t yet tried the new UX:
1. Open your Logic Apps (Standard) project in VS Code.
2. Go to Logic Apps (Standard) extension → Settings → Data Mapper.
3. Select Version ~2.
You’ll find detailed walkthroughs in the initial preview announcement blog.

💬 We’d Love Your Feedback
We’re continuously evolving the Data Mapper, and your feedback is key to getting it right — especially as we bring more advanced transformation scenarios into the new experience.
👉 Submit your feedback here
🐛 Found an issue or have a specific feature request? Let us know on GitHub Issues
Thanks again for being part of the journey — more updates coming soon! 🚀

Announcing the General Availability of the XML Parse and Compose Actions in Azure Logic Apps
The XML Operations connector
We have recently added two actions to the XML Operations connector: Parse XML with schema and Compose XML with schema. With this addition, Logic Apps customers can now interact with the token picker at design time. The tokens are generated from the XML schema provided by the customer. As a result, the XML document and its contained properties can easily be accessed, created, and manipulated in the workflow.

XML Parse with schema
Parse XML with schema allows customers to parse XML data using an XSD file (an XML schema file). XSD files need to be uploaded to the Logic App schemas artifacts or an Integration Account. Once they have been uploaded, you enter your XML content, the source of the schema, and the name of the schema file. The XML content may either be provided inline or selected from previous operations in the workflow using the token picker. For instance, the following is a parsed XML:

XML Compose with schema
Compose XML with schema allows customers to generate XML data using an XSD file. XSD files need to be uploaded to the Logic App schemas artifacts or an Integration Account. Once they have been uploaded, you select the XSD file and enter the JSON root element or elements of your input XML schema. The JSON input elements will be dynamically generated based on the selected XML schema. For instance, the following is a composed file:

Learnings from the transition from Public Preview to General Availability
Based on feedback received from multiple customers, we would like to share the following recommendations and considerations that will help you maximize the reliability, flexibility, and internationalization capabilities of the XML Parse and Compose actions in Azure Logic Apps.

Handling array inputs in XML
Switch to array input mode when mapping arrays. By default, the Logic App Designer expects individual array items for XML elements with maxOccurs > 1. If you want to assign an entire array token, use the array input mode icon in the Designer. This avoids unnecessary For Each loops and streamlines your workflow. For instance: click Switch to input entire array, then enter your array token. (A minimal schema illustration appears at the end of this post.)

Managing non-UTF-8 encoded XML
Leverage the encoding parameter in XML Compose. Customers can specify the desired character encoding (e.g., iso-2022-jp for Japanese). This controls both the .NET XML writer settings and the output encoding, allowing for broader internationalization support. Example configuration: use the XML writer settings property to set the encoding as needed.

Safe transport of binary and non-UTF-8 content
Utilize the Logic App content envelope. The XML Compose action outputs content in a safe envelope, enabling transport of binary and non-UTF-8 content within the UTF-8 JSON payload. Downstream actions (e.g., HTTP Request) can consume this envelope directly.

Content-Type header management
XML Compose now specifies the exact character set in the Content-Type header. This ensures downstream systems receive the correct encoding information. For example, application/xml; charset=iso-2022-jp will be set for Japanese-encoded XML.

Consuming XML output in HTTP actions
Reference the XML output property directly in HTTP actions. The envelope’s content-type is promoted to the HTTP header, and the base64-encoded content is decoded and sent as the raw HTTP body. This preserves encoding and binary fidelity.
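To make the array-input guidance concrete, here is a minimal sketch (the element names are invented for illustration): an element declared with maxOccurs greater than 1 is exactly the case where the Designer expects individual items unless you click Switch to input entire array.

XSD fragment:

<xs:element name="Items">
  <xs:complexType>
    <xs:sequence>
      <!-- maxOccurs="unbounded": the Designer offers per-item inputs for Item
           unless you switch to entire-array input mode -->
      <xs:element name="Item" type="xs:string" maxOccurs="unbounded" />
    </xs:sequence>
  </xs:complexType>
</xs:element>

Matching XML instance:

<Items>
  <Item>first</Item>
  <Item>second</Item>
</Items>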
Documentation and External References
Consult official documentation for advanced scenarios:
- Support non-Unicode character encoding in Azure Logic Apps
- Content-Type and Content-Encoding, for clarifying header usage
Do not confuse Content-Type with Content-Encoding. Content-Type specifies character set encoding (e.g., UTF-8, ISO-2022-JP), while Content-Encoding refers to compression (e.g., gzip).

Normalization and prefix and trailing trimming
Here is a sample that shows how XML normalization works for values, and how to achieve prefix and trailing trimming:

XSD:

<?xml version="1.0" encoding="utf-8"?>
<xs:schema id="XmlNormalizationAndWhitespaceCollapsed"
           targetNamespace="http://schemas.contoso.com/XmlNormalizationAndWhitespace"
           elementFormDefault="qualified"
           xmlns="http://schemas.contoso.com/XmlNormalizationAndWhitespace"
           xmlns:mstns="http://schemas.contoso.com/XmlNormalizationAndWhitespace"
           xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <!-- simple type that preserves whitespace (like xs:string) -->
  <xs:simpleType name="PreserveString">
    <xs:restriction base="xs:string" />
  </xs:simpleType>
  <!-- normalizedString collapses CR/LF/TAB to spaces, but preserves leading/trailing and repeated spaces -->
  <xs:simpleType name="NormalizedName">
    <xs:restriction base="xs:normalizedString" />
  </xs:simpleType>
  <!-- token collapses runs of spaces and trims leading/trailing spaces -->
  <xs:simpleType name="TokenName">
    <xs:restriction base="xs:token" />
  </xs:simpleType>
  <!-- explicit whitespace collapse facet (equivalent to xs:token for this purpose) -->
  <xs:simpleType name="CollapsedName">
    <xs:restriction base="xs:string">
      <xs:whiteSpace value="collapse" />
    </xs:restriction>
  </xs:simpleType>
  <xs:element name="root">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="header">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="id" type="xs:int" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
        <xs:element name="row" maxOccurs="unbounded">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="id" type="xs:int" />
              <!-- the three variants on whitespace handling for name and aliases -->
              <xs:element name="name" type="mstns:TokenName" />
              <xs:element name="nameNormalized" type="mstns:NormalizedName" minOccurs="0" />
              <xs:element name="nameCollapsed" type="mstns:CollapsedName" minOccurs="0" />
              <xs:element name="namePreserved" type="mstns:PreserveString" minOccurs="0" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

JSON:

{
  ":Attribute:xmlns": "http://schemas.contoso.com/XmlNormalizationAndWhitespace",
  "header": { "id": 1002 },
  "row": [
    {
      "id": 1,
      "name": "cyckel1",
      "nameNormalized": " cyckel1 end ",
      "nameCollapsed": "cyckel1",
      "namePreserved": "cyckel1\r\n\tend"
    },
    {
      "id": 2,
      "name": "cyckel2",
      "nameNormalized": "line1 line2 ",
      "nameCollapsed": "cyckel2",
      "namePreserved": "line1\nline2\t"
    },
    {
      "id": 3,
      "name": "cyckel12",
      "nameNormalized": " tabs and lines",
      "nameCollapsed": "cyckel12",
      "namePreserved": "\ttabs\tand\r\nlines"
    }
  ]
}

XML:

<?xml version="1.0" encoding="utf-8"?>
<root xmlns="http://schemas.contoso.com/XmlNormalizationAndWhitespace">
  <header>
    <id>1002</id>
  </header>
  <row>
    <id>1</id>
    <name>cyckel1 </name>
    <nameNormalized> cyckel1
	end </nameNormalized>
    <nameCollapsed>cyckel1 </nameCollapsed>
    <namePreserved>cyckel1
	end</namePreserved>
  </row>
  <row>
    <id>2</id>
    <name>cyckel2 </name>
    <nameNormalized>line1
line2	</nameNormalized>
    <nameCollapsed>cyckel2 </nameCollapsed>
    <namePreserved>line1
line2	</namePreserved>
  </row>
  <row>
    <id>3</id>
    <name> cyckel12</name>
    <nameNormalized>	tabs	and
lines</nameNormalized>
    <nameCollapsed> cyckel12</nameCollapsed>
    <namePreserved>	tabs	and
lines</namePreserved>
  </row>
</root>

Check this short video to learn more.

Logic Apps Agentic Workflows with SAP - Part 1: Infrastructure
When you integrate Azure Logic Apps with SAP, the “hello world” part is usually easy. The part that bites you later is data quality. In SAP-heavy flows, validation isn’t a nice-to-have — it’s what makes the downstream results meaningful. If invalid data slips through, it can get expensive fast: you may create incorrect business documents, trigger follow-up processes, and end up in a cleanup path that’s harder (and more manual) than building validation upfront. And in “all-or-nothing” transactional patterns, things get even more interesting: one bad record can force a rollback strategy, compensating actions, or a whole replay/reconciliation story you didn’t want to own. See for instance Handling Errors in SAP BAPI Transactions | Microsoft Community Hub to get an idea of the complexity in a BizTalk context.

That’s the motivation for this post: a practical starter pattern, adaptable to many data shapes and domains, for validating data in a Logic Apps + SAP integration.

Note: For the full set of assets used here, see the companion GitHub repository (workflows, schemas, SAP ABAP code, and sample files).

1. Introduction

Scenario overview
The scenario is intentionally simple, but it mirrors what shows up in real systems:
- A Logic App workflow sends CSV documents to an SAP endpoint.
- SAP forwards the payload to a second Logic App workflow that performs rule-based validation against pre-defined rules and analysis/enrichment (market trends, predictions, recommendations).
- The workflow either returns validated results (or validation errors) to the initiating workflow, or persists outputs for later use.
For illustration, I’m using fictitious retail data. The content is made up, but the mechanics are generic: the same approach works for orders, inventory, pricing, master data feeds, or any “file in → decision out” integration. You’ll see sample inputs and outputs below to keep the transformations concrete.

What this post covers
This walkthrough focuses on the integration building blocks that tend to matter in production:
- Calling SAP RFCs from Logic App workflows, and invoking Logic App workflows from SAP function modules
- Using the Logic Apps SAP built-in trigger
- Receiving and processing IDocs
- Returning responses and exceptions back to SAP in a structured, actionable way
- Data manipulation patterns in Logic Apps, including parsing and formatting, inline scripts, and XPath (where it fits, and where it becomes painful)

Overall Implementation
A high-level view of the implementation is shown below. The source workflow handles end-to-end ingestion—file intake, transformation, SAP integration, error handling, and notifications—using Azure Logic Apps. The destination workflows focus on validation and downstream processing, including AI-assisted analysis and reporting, with robust exception handling across multiple technologies. I’ll cover the AI portion in a follow-up post.

Note on AI-assisted development
Most of the workflow “glue” in this post—XPath, JavaScript snippets, and Logic Apps expressions—was built with help from Copilot and the AI assistant in the designer (see Get AI-assisted help for Standard workflows - Azure Logic Apps | Microsoft Learn). In my experience, this is exactly where AI assistance pays off: generating correct scaffolding quickly, then iterating based on runtime behavior. I’ve also included SAP ABAP snippets for the SAP-side counterpart. You don’t need advanced ABAP knowledge to follow along; the snippets are deliberately narrow and integration-focused.
I include them because it’s hard to design robust integrations if you only understand one side of the contract. When you understand how SAP expects to receive data, how it signals errors, and where transactional boundaries actually are, you end up with cleaner workflows and fewer surprises.

2. Source Workflow
This workflow is a small, end‑to‑end “sender” pipeline: it reads a CSV file from Azure Blob Storage, converts the rows into the SAP table‑of‑lines XML shape expected by an RFC, calls Z_GET_ORDERS_ANALYSIS via the SAP connector, then extracts analysis or error details from the RFC response and emails a single consolidated result. The name of the data file is a parameter of the logic app.

At a high level:
- Input: an HTTP request (used to kick off the run) + a blob name.
- Processing: CSV → array of rows → XML (ZTY_CSV_LINE/LINE rows) → RFC call
- Output: one email containing either the analysis (success path) or a composed error summary (failure path).

The diagram below summarizes the sender pipeline: HTTP trigger → Blob CSV (header included) → rows → SAP RFC → parse response → email.

Two design choices are doing most of the work here. First, the workflow keeps the CSV transport contract stable by sending the file as a verbatim list of lines—including the header—wrapped into ZTY_CSV_LINE/LINE elements under IT_CSV. Second, it treats the RFC response as the source of truth: EXCEPTIONMSG and RETURN/MESSAGE drive a single Has errors gate, which determines whether the email contains the analysis or a consolidated failure summary.

Step-by-step description

Phase 0 — Trigger
- Trigger — When_an_HTTP_request_is_received: the workflow is invoked via an HTTP request trigger (stateful workflow).

Phase 1 — Load and split the CSV
- Read file — Read_CSV_orders_from_blob: reads the CSV from the container onlinestoreorders using the blob name from @parameters('DataFileName').
- Split into rows — Extract_rows: splits the blob content on \r\n, producing an array of CSV lines.
Design note: keeping the header row is useful when downstream validation or analysis wants column names, and it avoids implicit assumptions in the sender workflow.

Phase 2 — Shape the RFC payload
- Convert CSV rows to SAP XML — Transform_CSV_to_XML: uses JavaScript to wrap each CSV row (including the header line) into the SAP line structure and XML‑escape special characters. The output is an XML fragment representing a table of ZTY_CSV_LINE rows.

Phase 3 — Call SAP and extract response fields
- Call the RFC — [RFC]_Call_Z_GET_ORDERS_ANALYSIS: invokes Z_GET_ORDERS_ANALYSIS with an XML body containing the IT_CSV table built from the transformed rows. Note that a <DEST>...</DEST> element can also be provided to override the default value in the function module definition. (A request sketch appears after this walkthrough.)
- Extract error/status — Save_EXCEPTION_message and Save_RETURN_message: uses XPath to pull EXCEPTIONMSG from the RFC response and the structured RETURN/MESSAGE field.

Phase 4 — Decide success vs failure and notify
- Initialize output buffer — Initialize_email_body: creates the EmailBody variable used by both success and failure cases.
- Gate — Has_errors: determines whether to treat the run as failed based on EXCEPTIONMSG being different from "ok", or RETURN/MESSAGE being non‑empty.
- Send result — Send_an_email_(V2): emails either the extracted ANALYSIS (success) or a concatenated error summary including RETURN/MESSAGE plus message details (MESSAGE_V1 … MESSAGE_V4) and EXCEPTIONMSG.
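For orientation, here is a rough sketch of the kind of XML body the Phase 3 RFC action sends, assuming the connector’s RFC XML convention. The namespace below is the one that appears later in this post for Z_CREATE_ONLINEORDER_IDOC; the CSV column names are invented, and the exact element casing and envelope may differ in your environment:

<Z_GET_ORDERS_ANALYSIS xmlns="http://Microsoft.LobServices.Sap/2007/03/Rfc/">
  <!-- Optional: overrides the default destination defined in the function module -->
  <DEST>MY_RFC_DEST</DEST>
  <IT_CSV>
    <!-- The header row travels with the data, per the design note above -->
    <ZTY_CSV_LINE><LINE>OrderId,Item,Quantity,Price</LINE></ZTY_CSV_LINE>
    <ZTY_CSV_LINE><LINE>1001,Pen,3,1.50</LINE></ZTY_CSV_LINE>
  </IT_CSV>
</Z_GET_ORDERS_ANALYSIS>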
Note: Because the header row is included in IT_CSV, the SAP-side parsing/validation treats the first line as column titles (or simply ignores it). The sender workflow stays “schema-agnostic” by design.

Useful snippets

Snippet 1 — Split the CSV into rows

split(string(body('Read_CSV_orders_from_blob')?['content']), '\r\n')

Tip: If your CSV has a header row you don’t want to send to SAP, switch back to:

@skip(split(string(body('Read_CSV_orders_from_blob')?['content']), '\r\n'), 1)

Snippet 2 — JavaScript transform: “rows → SAP table‑of‑lines XML”

const lines = workflowContext.actions.Extract_rows.outputs;

// Escape XML special characters so CSV content cannot break the payload.
function xmlEscape(value) {
  return String(value)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&apos;");
}

// NOTE: we don't want to keep empty lines (which can be produced by reading the blobs),
// the reason being that if the recipient uses a schema to validate the xml,
// it may reject it if it does not allow empty nodes.
const xml = lines
  .filter(line => line && line.trim() !== '') // keep only non-empty lines
  .map(line => `<zty_csv_line><line>${xmlEscape(line)}</line></zty_csv_line>`)
  .join('');

return { xml };

Snippet 3 — XPath extraction of response fields (namespace-robust)

EXCEPTIONMSG:

@xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'],
  'string(/*[local-name()="Z_GET_ORDERS_ANALYSISResponse"]/*[local-name()="EXCEPTIONMSG"])')

RETURN/MESSAGE:

@xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'],
  'string(/*[local-name()="Z_GET_ORDERS_ANALYSISResponse"]/*[local-name()="RETURN"]/*[local-name()="MESSAGE"])')

Snippet 4 — Failure email body composition

concat(
  'Error message: ', outputs('Save_RETURN_message'),
  ', details: ',
  xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V1\"])'),
  xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V2\"])'),
  xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V3\"])'),
  xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V4\"])'),
  '; ',
  'Exception message: ', outputs('Save_EXCEPTION_message'), '.')
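For reference, Snippets 3 and 4 assume a response envelope shaped roughly like the sketch below. The field names (ANALYSIS, EXCEPTIONMSG, RETURN with its BAPIRET2 members) come from the contract described in this post; the sample values and the exact envelope details are illustrative only:

<Z_GET_ORDERS_ANALYSISResponse xmlns="http://Microsoft.LobServices.Sap/2007/03/Rfc/">
  <ANALYSIS>...analysis text produced by the destination workflow...</ANALYSIS>
  <EXCEPTIONMSG>ok</EXCEPTIONMSG>
  <RETURN>
    <TYPE>S</TYPE>
    <MESSAGE>OK</MESSAGE>
    <MESSAGE_V1></MESSAGE_V1>
    <MESSAGE_V2></MESSAGE_V2>
    <MESSAGE_V3></MESSAGE_V3>
    <MESSAGE_V4></MESSAGE_V4>
  </RETURN>
</Z_GET_ORDERS_ANALYSISResponse>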
In Logic Apps, the SAP built-in trigger listens using the same Program ID and gateway settings, while the trigger configuration (for example, IDoc format and degree of parallelism) controls how messages are interpreted and processed. Once these values line up, the rest of the implementation becomes “normal workflow engineering”: validation, predictable error propagation, and response shaping.

Before diving into workflow internals, I make the SAP-side contract explicit. The function module interface below shows the integration boundary: CSV lines come in as IT_CSV, results come back as ANALYSIS, and status/error information is surfaced both as a human-readable EXCEPTIONMSG and as a structured RETURN (BAPIRET2). I also use a dedicated exception (SENDEXCEPTIONTOSAPSERVER) to signal workflow-raised failures cleanly.

Contract (what goes over RFC):
- Input: IT_CSV (CSV lines)
- Outputs: ANALYSIS (analysis payload), EXCEPTIONMSG (human-readable status)
- Return structure: RETURN (BAPIRET2) for structured SAP-style success/error
- Custom exception: SENDEXCEPTIONTOSAPSERVER for workflow-raised failures

Here is the ABAP wrapper that calls the remote implementation and normalizes the result.

FUNCTION z_get_orders_analysis.
*"----------------------------------------------------------------------
*" This module acts as a caller wrapper.
*" Important: the remote execution is determined by DESTINATION.
*" Even though the function name is the same, this is not recursion:
*" the call runs in the remote RFC server registered under DESTINATION "DEST".
*"----------------------------------------------------------------------
*" Contract:
*"   TABLES     it_csv        "CSV lines
*"   IMPORTING  analysis      "Result payload
*"   EXPORTING  exceptionmsg  "Human-readable status / error
*"   CHANGING   return        "BAPIRET2 return structure
*"   EXCEPTIONS sendexceptiontosapserver
*"----------------------------------------------------------------------

  CALL FUNCTION 'Z_GET_ORDERS_ANALYSIS'
    DESTINATION dest
    IMPORTING
      analysis = analysis
    TABLES
      it_csv = it_csv
    CHANGING
      return = return
    EXCEPTIONS
      sendexceptiontosapserver = 1
      system_failure           = 2 MESSAGE exceptionmsg
      communication_failure    = 3 MESSAGE exceptionmsg
      OTHERS                   = 4.

  CASE sy-subrc.
    WHEN 0.
      exceptionmsg = 'ok'.
      "Optional: normalize success into RETURN for callers that ignore EXCEPTIONMSG
      IF return-type IS INITIAL.
        return-type = 'S'.
        return-message = 'OK'.
      ENDIF.
    WHEN 1.
      exceptionmsg = |Exception from workflow: SENDEXCEPTIONTOSAPSERVER { sy-msgv1 }{ sy-msgv2 }{ sy-msgv3 }{ sy-msgv4 }|.
      return-type = 'E'.
      return-message = exceptionmsg.
    WHEN 2 OR 3.
      "system_failure / communication_failure usually already populate exceptionmsg
      IF exceptionmsg IS INITIAL.
        exceptionmsg = |RFC system/communication failure.|.
      ENDIF.
      return-type = 'E'.
      return-message = exceptionmsg.
    WHEN OTHERS.
      exceptionmsg = |Error in workflow: { sy-msgv1 }{ sy-msgv2 }{ sy-msgv3 }{ sy-msgv4 }|.
      return-type = 'E'.
      return-message = exceptionmsg.
  ENDCASE.

ENDFUNCTION.

The wrapper is intentionally small: it forwards the payload to the remote implementation via the RFC destination and then normalizes the outcome into a predictable shape. The point isn’t fancy ABAP — it’s reliability. With a stable contract (IT_CSV, ANALYSIS, RETURN, EXCEPTIONMSG) the Logic Apps side can evolve independently while SAP callers still get consistent success/error semantics.
Important: in CALL FUNCTION 'Z_GET_ORDERS_ANALYSIS' DESTINATION dest, the name of the called function must be the same as the name of the ABAP wrapper function module, because the SAP built-in trigger in the logic app uses the function module signature as the contract (i.e., the metadata). Also note that in the provided companion artifacts, a default value was assigned to dest in the function module definition; it can also be specified in the request (DEST element).

To sum up, the integration is intentionally shaped around three outputs: the raw input table (IT_CSV), a standardized SAP return structure (RETURN/BAPIRET2), and a readable status string (EXCEPTIONMSG). The custom exception (SENDEXCEPTIONTOSAPSERVER) gives me a clean way to surface workflow failures back into SAP without burying them inside connector-specific error payloads. This is depicted in the figure below.

4. Destination Workflow
The diagram below shows the destination workflow at a high level. I designed it as a staged pipeline: guard early, normalize input, validate, and then split the workload into two paths—operational handling of invalid records (notifications and optional IDoc remediation) and analysis of the validated dataset. Importantly, the SAP response is intentionally narrow: SAP receives only the final analysis (or a structured error), while validation details are delivered out-of-band via email.

How to read this diagram
- Guardrail: Validate requested action ensures the workflow only handles the expected request.
- Normalize: Create CSV payload converts the inbound content into a consistent CSV representation.
- Validate: Data Validation Agent identifies invalid records (and produces a summary).
- Operational handling (invalid data): invalid rows are reported by email and may optionally be turned into IDocs (right-hand block).
- Analyze (valid data): Analyze data runs only on the validated dataset (invalid IDs excluded).
- Outputs: users receive Email analysis, while SAP receives only the analysis (or a structured error) via Respond to SAP server.

Figure: Destination workflow with staged validation, optional IDoc remediation, and an SAP response.

Reading the workflow top-to-bottom, the main design choice is separation of concerns. Validation is used to filter and operationalize bad records (notify humans, optionally create IDocs), while the SAP-facing response stays clean and predictable: SAP receives the final analysis for the validated dataset, or an error if the run can’t complete. This keeps the SAP contract stable even as validation rules and reporting details evolve.

Step‑by‑step walkthrough

Phase 0 — Entry and routing
- Trigger — When a message is received: the workflow starts when an inbound SAP message is delivered to the Logic Apps SAP built‑in trigger.
- Guardrail — Validate requested action (2 cases): the workflow immediately checks whether the inbound request is the operation it expects (for example, the function/action name equals Z_GET_ORDERS_ANALYSIS). If the action does not match, the workflow sends an exception back to SAP describing the unexpected action and terminates early (fail fast). If the action matches, processing continues.

Phase 1 — Normalize input into a workflow‑friendly payload
- Prepare input — Create CSV payload: the workflow extracts CSV lines from the inbound (XML) SAP payload and normalizes them into a consistent CSV text payload that downstream steps can process reliably.
- Initialize validation state — Initialize array of invalid order ids: the workflow creates an empty array variable to capture order IDs that fail validation. This becomes the “validation output channel” used later for reporting, filtering, and optional remediation.

Phase 2 — Validate the dataset (AI agent loop)
Validate — Data Validation Agent (3 cases): this stage performs rule‑based validation using an agent pattern (backed by Azure OpenAI). Conceptually, it does three things (as shown in the diagram’s expanded block):
- Get validation rules: retrieves business rules from a SharePoint‑hosted validation document. The location of the validation rules file is a parameter of the logic app.
- Get CSV payload: loads the normalized CSV created earlier.
- Summarize CSV payload review: evaluates the CSV against the rules and produces structured validation outputs.
Outputs produced by validation: a list of invalid order IDs, the corresponding invalid CSV rows, and a human‑readable validation summary.
Note: The detailed AI prompt/agent mechanics are covered in Part 2. In Part 1, the focus is on the integration flow and how data moves.

Phase 3 — Operational handling of invalid records (email + optional SAP remediation)
After validation, the workflow treats invalid records as an operational concern: they are reported to humans and can optionally be routed into an SAP remediation path. This is shown in the right‑hand “Create IDocs” block.
- Notify — Send verification summary: the workflow sends an email report (Office 365) to configured recipients containing the validation summary, the invalid order IDs, and the invalid CSV payload (or the subset of invalid rows).
- Transform — Transform CSV to XML: the workflow converts the invalid CSV lines into an XML format that is suitable for SAP processing.
- Optional remediation — [RFC] Create all IDocs (conditional): if the workflow parameter (for example, CreateIDocs) is enabled, the workflow calls an SAP RFC (e.g., Z_CREATE_ONLINEORDER_IDOC) to create IDocs from the transformed invalid data.
Why this matters: validation results are made visible (email) and optionally actionable (IDocs), without polluting the primary analysis response that SAP receives.

Phase 4 — Analyze only the validated dataset (AI analysis)
The workflow runs AI analysis on the validated dataset, explicitly excluding invalid order IDs discovered during the validation phase. The analysis prompt instructs the model to produce outputs such as trends, predictions, and recommendations.
Note: The AI analysis prompt design and output shaping are covered in Part 2.

Phase 5 — Post‑process the AI response and publish outputs
- Package results — Process analysis results (Scope): the workflow converts the AI response into a format suitable for email and for SAP consumption: it parses the OpenAI JSON response, extracts the analysis content, and converts markdown → HTML using custom JavaScript formatting.
- Outputs: Email analysis sends the formatted analysis to recipients; Respond to SAP server returns only the analysis (and errors) to SAP.
Key design choice: SAP receives a clean, stable contract—analysis on success, structured error on failure. Validation details are handled out‑of‑band via email (and optionally via IDoc creation).
Note: the analysis email sent by the destination workflow is there for testing purposes, to verify that the HTML content remains the same when it is sent back to the source workflow.
Useful snippets

Snippet 1 — Join each CSV line in the XML to make a CSV table:

join(
  xpath(
    xml(triggerBody()?['content']),
    '/*[local-name()=\"Z_GET_ORDERS_ANALYSIS\"]/*[local-name()=\"IT_CSV\"]/*[local-name()=\"ZTY_CSV_LINE\"]/*[local-name()=\"LINE\"]/text()'
  ),
  '\r\n')

Note: For the sake of simplicity, XPath is used here and throughout all places where XML is parsed. In the general case, however, the Parse XML with schema action is the better and recommended way to strictly enforce the data contract between senders and receivers. More information about Parse XML with schema is provided in Appendix 1.

Snippet 2 — Format markdown to HTML (simplified):

const raw = workflowContext.actions.Extract_analysis.outputs;

// Basic HTML escaping for safety (keeps <code> blocks clean)
const escapeHtml = s => s.replace(/[&<>"]/g, c =>
  ({ '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;' }[c]));

// Normalize line endings
let md = raw; // raw.replace(/\r\n/g, '\n').trim();

// Convert code blocks (``` ... ```)
md = md.replace(/```([\s\S]*?)```/g, (m, p1) => `<pre><code>${escapeHtml(p1)}</code></pre>`);

// Horizontal rules --- or ***
md = md.replace(/(?:^|\n)---+(?:\n|$)/g, '<hr/>');

// Headings ###### to #
for (let i = 6; i >= 1; i--) {
  const re = new RegExp(`(?:^|\\n)${'#'.repeat(i)}\\s+(.+?)\\s*(?=\\n|$)`, 'g');
  md = md.replace(re, (m, p1) => `<h${i}>${p1.trim()}</h${i}>`);
}

// Bold and italic
md = md.replace(/\*\*([^*]+)\*\*/g, '<strong>$1</strong>');
md = md.replace(/\*([^*]+)\*/g, '<em>$1</em>');

// Unordered lists (lines starting with -, *, +)
md = md.replace(/(?:^|\n)([-*+]\s.+(?:\n[-*+]\s.+)*)/g, (m) => {
  const items = m.trim().split(/\n/).map(l => l.replace(/^[-*+]\s+/, '').trim());
  return '\n<ul>' + items.map(i => `<li>${i}</li>`).join('\n') + '</ul>';
});

// Paragraphs: wrap remaining text blocks in <p>...</p>
const blocks = md.split(/\n{2,}/).map(b => {
  if (/^<h\d>|^<ul>|^<pre>|^<hr\/>/.test(b.trim())) return b;
  return `<p>${b.replace(/\n/g, '<br/>')}</p>`;
});
const html = blocks.join('');

return { html };
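Phase 5 above also mentions parsing the OpenAI JSON response before the markdown-to-HTML conversion. As a rough sketch, assuming a chat-completions-style response shape and a hypothetical action name Parse_OpenAI_response, the extraction can be a single expression:

body('Parse_OpenAI_response')['choices'][0]['message']['content']

Treat both the action name and the property path as placeholders: the exact shape depends on the model and API version your agent calls.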
The key takeaway is that you get a reliable “out-of-the-box” exception path: ABAP can treat sy-subrc = 1 as the workflow‑raised failure and generate a consistent EXCEPTIONMSG . This is the simplest option and works well when you don’t need multiple exception names—just one clear “workflow failed” signal. 5.2 Message exception If you want more control than the default, you can configure the action to raise a named exception declared in your ABAP function module interface. This makes it easier to route different failure types without parsing free-form text. The picture shows Advanced parameters under the Logic Apps action, including “Exception Name” with helper text indicating it must match an exception declared in the ABAP function module definition. This option is useful when you want to distinguish workflow error categories (e.g., validation vs. routing vs. downstream failures) using exception identity, not just message text. The contract stays explicit: Logic Apps raises a named exception, and ABAP can branch on that name (or on sy-subrc mapping) with minimal ambiguity. 5.3 Message class exception The third approach uses SAP’s built-in message class mechanism so that the exception raised by the workflow can map cleanly into SAP’s message catalog ( T100 ). This is helpful when you want consistent formatting and localization aligned with standard SAP patterns. On the Logic Apps side, the action shows advanced fields including Message Class, Message Number, and an Is ABAP Message toggle, with helper text stating the message class can come from message maintenance ( SE91 ) or be custom. On the ABAP side, the code highlights an error-handling block that calls using sy-msgid , sy-msgno , and variables sy-msgv1 … sy-msgv4 , then stores the resulting text in EXCEPTIONMSG . This pattern is ideal when you want workflow exceptions to look and behave like “native” SAP messages. Instead of hard-coding strings, you rely on the message catalog and let ABAP produce a consistent final message via FORMAT_MESSAGE . The result is easier to standardize across teams and environments—especially if you already manage message classes as part of your SAP development process. Refer to Appendix 2 for further information on FORMAT_MESSAGE . 5.4 Choosing an exception strategy that SAP can act on Across these examples, the goal is consistent: treat workflow failures as first‑class outcomes in SAP, not as connector noise buried in run history. The Logic Apps action Send exception to SAP server gives you three increasingly structured ways to do that, and the “right” choice depends on how much semantics you want SAP to understand. Default exception (lowest ceremony): Use this when you just need a reliable “workflow failed” signal. The connector raises a pre-defined exception name (for example, SENDEXCEPTIONTOSAPSERVER ), and ABAP can handle it with a simple EXCEPTIONS … = 1 mapping and a sy-subrc check. This is the fastest way to make failures visible and deterministic. Named exception(s) (more routing control): Use this when you want SAP to distinguish failure types without parsing message text. By raising an exception name declared in the ABAP function module interface, you can branch cleanly in ABAP (or map to different return handling) and keep the contract explicit and maintainable. Message class + number (most SAP-native): Use this when you want errors to look and behave like standard SAP messages—consistent wording, centralized maintenance, and better alignment with SAP operational practices. 
In this mode, ABAP can render the final localized string using FORMAT_MESSAGE and return it as EXCEPTIONMSG (and optionally BAPIRET2-MESSAGE), which makes the failure both human-friendly and SAP-friendly.

A practical rule of thumb: start with the default exception while you stabilize the integration, move to named exceptions when you need clearer routing semantics, and adopt message classes when you want SAP-native error governance (standardization, maintainability, and localization). Regardless of the option, the key is to end with a predictable SAP-side contract: a clear success path, and a failure path that produces a structured return and a readable message.

6. Response Handling
This section shows how the destination workflow returns either a successful analysis response or a workflow exception back to SAP, and how the source (caller) workflow interprets the RFC response structure to produce a single, human‑readable outcome (an email body). The key idea is to keep the SAP-facing contract stable: SAP always returns a Z_GET_ORDERS_ANALYSISResponse envelope, and the caller workflow decides between success and error using just two fields: EXCEPTIONMSG and RETURN/MESSAGE.

To summarize the steps:
- The destination workflow either sends a normal response via Respond to SAP server, or raises an exception via Send exception to SAP server (with an error message).
- The SAP server exposes those outcomes through the RFC wrapper:
  sy-subrc = 0 → success (EXCEPTIONMSG = 'ok')
  sy-subrc = 1 → workflow exception (SENDEXCEPTIONTOSAPSERVER)
  sy-subrc = 2/3 → system/communication failures
- The source workflow calls the RFC, extracts EXCEPTIONMSG and RETURN/MESSAGE, and uses a Has errors gate to choose between a success email body (analysis) or a failure email body (error summary).

The figure below shows the full return path for results and failures. On the right, the destination workflow either responds normally (Respond to SAP server) or raises a workflow exception (Send exception to SAP server). SAP then maps that into the RFC outcome (sy-subrc and message fields). On the left, the source workflow parses the RFC response structure and populates a single EmailBody variable using two cases: failure (error details) or success (analysis text).

Figure: Response/exception flow

Two things make this pattern easy to operationalize. First, the caller workflow does not need to understand every SAP field—only EXCEPTIONMSG and RETURN/MESSAGE are required to decide success vs failure. Second, the failure path intentionally aggregates details (MESSAGE_V1 … MESSAGE_V4 plus the exception text) into a single readable string so errors don’t get trapped in run history.

Callout: The caller workflow deliberately treats EXCEPTIONMSG != "ok" or a present RETURN/MESSAGE as the single source of truth for failure, which keeps the decision logic stable even if the response schema grows.

Detailed description

Phase 1 — Destination workflow: choose “response” vs “exception”
- Respond to SAP server returns the normal response payload back to SAP.
- Send exception to SAP server raises a workflow failure with an Exception Error Message (the screenshot shows an example beginning with “Unexpected action in request:” and a token for Function Name).
Outcome: SAP receives either a normal response or a raised exception for the RFC call.

Phase 2 — SAP server: map workflow outcomes to RFC results
The SAP-side wrapper code shown in the figure calls:

CALL FUNCTION 'Z_GET_ORDERS_ANALYSIS' DESTINATION dest ...
It declares exception mappings including:

SENDEXCEPTIONTOSAPSERVER = 1
system_failure           = 2 MESSAGE exceptionmsg
communication_failure    = 3 MESSAGE exceptionmsg
OTHERS                   = 4

Then it uses CASE sy-subrc. to normalize outcomes (the figure shows WHEN 0. setting EXCEPTIONMSG = 'ok'., and WHEN 1. building a readable message for the workflow exception).
Outcome: regardless of why it failed, SAP can provide a consistent set of fields back to the caller: a return structure and an exception/status message.

Phase 3 — Source workflow: parse response and build one “email body”
After the RFC action ([RFC] Call Z_GET_ORDERS_ANALYSIS) the source workflow performs:
- Save EXCEPTION message: extracts EXCEPTIONMSG from the response XML using XPath.
- Save RETURN message: extracts RETURN/MESSAGE from the response XML using XPath.
- Initialize email body: creates EmailBody once, then sets it in exactly one of two cases.
- Has errors (two cases): the condition treats the run as “error” if either EXCEPTIONMSG is not equal to "ok", or RETURN/MESSAGE is not empty.
- Set email body (failure) / Set email body (success): the failure case builds a consolidated string containing RETURN/MESSAGE, message details (MESSAGE_V1..V4), and EXCEPTIONMSG; the success case sets EmailBody to the ANALYSIS field extracted from the response.
Outcome: the caller produces a single artifact (EmailBody) that is readable and actionable, without requiring anyone to inspect the raw RFC response.
Note: the email recipient is set as a logic app parameter.

7. Destination Workflow #2: Persisting failed rows as custom IDocs
In this section I zoom in on the optional “IDoc persistence” branch at the end of the destination workflow. After the workflow identifies invalid rows (via the Data Validation Agent) and emails a verification summary, it can optionally call a second SAP RFC to save the failed rows as IDocs for later processing. This is mainly included to showcase another common SAP integration scenario—creating/handling IDocs—and to highlight that you can combine “AI-driven validation” with traditional enterprise workflows. The deeper motivation for invoking this as part of the agent tooling is covered in Part 2; here, the goal is to show the connector pattern and the custom RFC used to create IDocs from CSV input.

The figure below shows the destination workflow at two levels: a high-level overview at the top, and a zoomed view of the post-validation remediation steps at the bottom. The zoom starts from Data Validation Agent → Summarize CSV payload review and then expands the sequence that runs after Send verification summary: Transform CSV to XML followed by an SAP RFC call that creates IDocs from the failed data.

The key point is that this branch is not the main “analysis response” path. It’s a practical remediation option: once invalid rows are identified and reported, the workflow can persist them into SAP using a dedicated RFC (Z_CREATE_ONLINEORDER_IDOC) and a simple IT_CSV payload. This keeps the end-to-end flow modular: analysis can remain focused on validated data, while failed records can be routed to SAP for follow-up processing on their own timeline.

Callout: This branch exists to showcase an IDoc-oriented connector scenario. The “why this is invoked from the agent tooling” context is covered in Part 2; here the focus is the mechanics of calling Z_CREATE_ONLINEORDER_IDOC with IT_CSV and receiving ET_RETURN/ET_DOCNUMS.
The screenshot shows an XML body with the RFC root element and an SAP namespace:

<z_create_onlineorder_idoc xmlns="http://Microsoft.LobServices.Sap/2007/03/Rfc/">
  <iv_direction>...</iv_direction>
  <iv_sndptr>...</iv_sndptr>
  <iv_sndprn>...</iv_sndprn>
  <iv_rcvptr>...</iv_rcvptr>
  <iv_rcvprn>...</iv_rcvprn>
  <it_csv>
    @{ ...Outputs... }
  </it_csv>
  <et_return></et_return>
  <et_docnums></et_docnums>
</z_create_onlineorder_idoc>

What to notice: the workflow passes invalid CSV rows in IT_CSV, and SAP returns a status table (ET_RETURN) and created document numbers (ET_DOCNUMS) for traceability.
- The payload includes standard-looking control fields (IV_DIRECTION, IV_SNDPTR, IV_SNDPRN, IV_RCVPTR, IV_RCVPRN) and the actual failed-row payload as IT_CSV.
- IT_CSV is populated via a Logic Apps expression (shown as @{ ...Outputs... } in the screenshot), which is the bridge between the prior transform step and the RFC call.
- The response side indicates table-like outputs: ET_RETURN and ET_DOCNUMS.

7.1 From CSV to IDocs
I’ll cover the details of Destination workflow #2 in Part 2. In this post (Part 1), I focus on the contract and the end-to-end mechanics: what the RFC expects, what it returns, and how the created IDocs show up in the receiving workflow.

Before looking at the RFC itself, it helps to understand the payload we’re building inside the IDoc. The screenshot below shows the custom segment definition used by the custom IDoc type. This segment is intentionally shaped to mirror the columns of the CSV input so the mapping stays direct and easy to reason about.

Figure: Custom segment ZONLINEORDER000 (segment type ZONLINEORDER)

This segment definition is the contract anchor: it makes the CSV-to-IDoc mapping explicit and stable. Each CSV record becomes one segment instance with the same 14 business fields. That keeps the integration “boringly predictable,” which is exactly what you want when you’re persisting rejected records for later processing.

The figure below shows the full loop for persisting failed rows as IDocs. The source workflow calls the custom RFC and sends the invalid CSV rows as XML. SAP converts each row into the custom segment and creates outbound IDocs. Those outbound IDocs are then received by Destination workflow #2, which processes them asynchronously (one workflow instance per IDoc) and appends results into shared storage for reporting.

This pattern deliberately separates concerns: the first destination workflow identifies invalid rows and decides whether to persist them, SAP encapsulates the mechanics of IDoc creation behind a stable RFC interface, and a second destination workflow processes those IDocs asynchronously (one per IDoc), which is closer to how IDoc-driven integrations typically operate in production.

Destination workflow #2 is included here to show the end-to-end contract and the “receipt” side of the connector scenario. It:
- is triggered by the SAP built-in trigger and checks FunctionName = IDOC_INBOUND_ASYNCHRONOUS
- extracts DOCNUM from the IDoc control record (EDI_DC40/DOCNUM)
- reconstructs a CSV payload from the IDoc data segment (the fields shown match the segment definition)
- appends a “verification info” line to shared storage for reporting
The implementation details of that workflow (including why it is invoked from the agent tooling) are covered in Part 2.

7.2 Z_CREATE_ONLINEORDER_IDOC — Contract overview
The full source code for Z_CREATE_ONLINEORDER_IDOC is included in the supporting material.
It’s too long to reproduce inline, so this post focuses on the contract—the part you need to call the RFC correctly and interpret its results. A quick note on authorship: most of the implementation was generated with Copilot, with manual review and fixes to resolve build errors and align the behavior with the intended integration pattern. The contract is deliberately generic because the goal was to produce an RFC that’s reusable across more than one scenario, rather than tightly coupled to a single workflow.

At a high level, the RFC is designed to support:
- Both inbound and outbound IDoc creation: it can either write IDocs to the SAP database (inbound-style persistence) or create/distribute IDocs outbound.
- Multiple IDoc/message/segment combinations: IDoc type (IDOCTYP), message type (MESTYP), and segment type (SEGTP) are configurable so the same RFC can be reused.
- Explicit partner/port routing control: optional sender/receiver partner/port fields can be supplied when routing matters.
- Traceability of created artifacts: the RFC returns created IDoc numbers so the caller can correlate “these failed rows” to “these IDocs.”

Contract:
- Inputs (import parameters):
  IV_DIRECTION (default: 'O') — 'I' for inbound write-to-db, 'O' for outbound distribute/dispatch
  IV_IDOCTYP (default: ZONLINEORDERIDOC)
  IV_MESTYP (default: ZONLINEORDER)
  IV_SEGTP (default: ZONLINEORDER)
  Optional partner/port routing fields: IV_SNDPRT, IV_SNDPRN, IV_RCVPRT, IV_RCVPRN, IV_RCVPOR
- Tables:
  IT_CSV (structure ZTY_CSV_LINE) — each row is one CSV line (the “table-of-lines” pattern)
  ET_RETURN (structure BAPIRET2) — success/warning/error messages (per-row and/or aggregate)
  ET_DOCNUMS (type ZTY_DOCNUM_TT) — list of created IDoc numbers for correlation/traceability
- Outputs:
  EV_DOCNUM — a convenience “primary / last created” DOCNUM value returned by the RFC

8. Concluding Remarks
Part 1 established a stable SAP ↔ Logic Apps integration baseline: CSV moves end‑to‑end using explicit contracts, and failures are surfaced predictably. The source workflow reads CSV from Blob, wraps rows into the IT_CSV table‑of‑lines payload, calls Z_GET_ORDERS_ANALYSIS, and builds one outcome using two fields from the RFC response: EXCEPTIONMSG and RETURN/MESSAGE. The destination workflow gates requests, validates input, and returns only analysis (or errors) back to SAP while handling invalid rows operationally (notification + optional persistence).

On the error path, we covered three concrete patterns to raise workflow failures back into SAP: the default connector exception (SENDEXCEPTIONTOSAPSERVER), named exceptions (explicit ABAP contract), and message‑class‑based errors (SAP‑native formatting via FORMAT_MESSAGE). On the remediation side, we added a realistic enterprise pattern: persist rejected rows as custom IDocs via Z_CREATE_ONLINEORDER_IDOC (IT_CSV in, ET_RETURN + ET_DOCNUMS out), using the custom segment ZONLINEORDER000 as the schema anchor and enabling downstream receipt in Destination workflow #2 (one run per IDoc, correlated via DOCNUM).

Part 2 is separate because it tackles a different problem: the AI layer. With contracts and error semantics now fixed, Part 2 can focus on the agent/tooling details that tend to iterate—rule retrieval, structured validation outputs, prompt constraints, token/history controls, and how the analysis output is generated and shaped—without muddying the transport story.
Appendix 1: Parse XML with schema

In this appendix I use CSV payload creation as the example, but parsing XML with a schema applies anywhere you receive XML to process, such as SAP responses, exceptions, or requests/responses from other RFCs.

Strong contract

The Create_CSV_payload step in the implementation shown earlier uses an xpath() + join() expression to extract the LINE values from the incoming XML:

```
join(
  xpath(
    xml(triggerBody()?['content']),
    '/*[local-name()="Z_GET_ORDERS_ANALYSIS"]
     /*[local-name()="IT_CSV"]
     /*[local-name()="ZTY_CSV_LINE"]
     /*[local-name()="LINE"]/text()'
  ),
  '\r\n'
)
```

That approach works, but it's essentially a "weak contract": it assumes the message shape stays stable and that your XPath continues to match. By contrast, the Parse XML with schema action turns the XML payload into structured data based on an XSD, which gives you a "strong contract" and lets downstream steps bind to known fields instead of re-parsing XML strings.

The figure below compares two equivalent ways to build the CSV payload from the RFC input. On the left is the direct xpath() compose (labeled "weak contract"). On the right is the schema-based approach (labeled "strong contract"), where the workflow parses the request first and then builds the CSV payload by iterating over typed rows. The diagram makes the key tradeoff visible:

- XPath compose path (left): the workflow creates the CSV payload directly using join(xpath(...), '\r\n'), with the XPath written using local-name() selectors. This is fast to prototype, but the contract is implicit—your workflow "trusts" the XML shape and your XPath accuracy.
- Parse XML with schema path (right): the workflow inserts a Parse XML with schema step ("Parse Z GET ORDERS ANALYSIS request"), initializes variables, loops For each CSV row, and Appends to CSV payload, then performs join(variables('CSVPayload'), '\r\n'). Here, the contract is explicit—your XSD defines what IT_CSV and LINE mean, and downstream steps bind to those fields rather than re-parsing XML.

A good rule of thumb: XPath is great for lightweight extraction, while Parse XML with schema is better when you want contract enforcement and long-term maintainability, especially in enterprise integration and BizTalk migration scenarios where schemas are already part of the integration culture.

Implementation details

The next figure shows the concrete configuration for Parse XML with schema and how its outputs flow into the "For each CSV row" loop. This is the "strong contract" version of the earlier XPath compose. The screenshot highlights three practical implementation details:

1. The Parse action is schema-backed. In the Parameters pane, the action uses:
   - Content: the incoming XML Response
   - Schema source: LogicApp
   - Schema name: Z_GET_ORDERS_ANALYSIS
   The code view shows the same idea: type: "XmlParse" with content: "@triggerBody()?['content']" and schema: { source: "LogicApp", name: "Z_GET_ORDERS_ANALYSIS.xsd" }.
2. The parsed output becomes typed "dynamic content." The loop input is shown as "JSON Schema for element 'Z_GET_ORDERS_ANALYSIS: IT_CSV'". This is the key benefit: you are no longer scraping strings—you are iterating over a structured collection produced by schema-based parsing.
3. The LINE extraction becomes trivial and readable. The "Append to CSV payload" step appends @item()?['LINE'] to the CSVPayload variable (as shown in the code snippet), and the final Create CSV payload becomes a simple join(variables('CSVPayload'), '\r\n').

This is exactly the kind of "workflow readability" benefit you get once XML parsing is schema-backed.
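Tip: whichever route you choose, keep a few representative sample payloads on hand. For the XPath route, the same local-name() selector runs unchanged in .NET, so you can sanity-check it against a saved sample before pasting it into the workflow. A minimal sketch (the sample file name is illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Xml.XPath;

// Run the workflow's local-name() XPath against a saved sample of the RFC
// request to confirm it selects the LINE values you expect.
class XPathSanityCheck
{
    static void Main()
    {
        XPathNavigator nav = new XPathDocument("sample-request.xml").CreateNavigator();

        var lines = new List<string>();
        foreach (XPathNavigator line in nav.Select(
            "/*[local-name()='Z_GET_ORDERS_ANALYSIS']" +
            "/*[local-name()='IT_CSV']" +
            "/*[local-name()='ZTY_CSV_LINE']" +
            "/*[local-name()='LINE']/text()"))
        {
            lines.Add(line.Value);
        }

        // Same join the workflow performs with join(..., '\r\n').
        Console.WriteLine(string.Join("\r\n", lines));
    }
}
```

If the console output matches the CSV you expect, the expression in Create_CSV_payload will behave the same way at runtime (modulo the xml(triggerBody()?['content']) wrapping).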
Schema generation

The Parse action requires XSD schemas, which can be stored in the Logic App (or referenced via a linked Integration Account). The final figure shows a few practical ways to obtain and manage those XSDs:

- Generate Schema (SAP connector): a "Generate Schema" action with Operation Type = RFC and an RFC Name field, which is a practical way to bootstrap schema artifacts when you already know the RFC you're calling.
- Run Diagnostics / Fetch RFC Metadata: a "Run Diagnostics" action showing Operation type = Fetch RFC Metadata and an RFC Name, which is useful to confirm the shape of the RFC interface and reconcile it with your XSD/contract.

If you don't want to rely solely on connector-side schema generation, there are also classic developer-tools approaches: infer an XSD from a sample XML using .NET's XmlSchemaInference (good for quick starting points, as the sketch below illustrates), generate an XSD from an XML instance using xsd.exe (handy when you already have representative sample payloads), or ask your favorite AI assistant.
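Here is a minimal XmlSchemaInference sketch (the sample file name is illustrative; treat the inferred XSD as a starting point to review, not a finished contract):

```csharp
using System;
using System.Xml;
using System.Xml.Schema;

// Infer a starting-point XSD from a representative sample payload.
class InferXsd
{
    static void Main()
    {
        var inference = new XmlSchemaInference();
        XmlSchemaSet schemas;
        using (var reader = XmlReader.Create("sample-request.xml"))
        {
            schemas = inference.InferSchema(reader);
        }

        var settings = new XmlWriterSettings { Indent = true };
        foreach (XmlSchema schema in schemas.Schemas())
        {
            using var writer = XmlWriter.Create(Console.Out, settings);
            schema.Write(writer); // review and refine before treating it as the contract
        }
    }
}
```

Inferred schemas tend to be permissive (optional elements, loose types), so tighten them by hand before relying on them for validation.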
When to choose XPath vs Parse XML with schema (practical guidance)

Generally speaking, choose XPath when:
- You need a quick extraction and you're comfortable maintaining a single XPath.
- You don't want to manage schema artifacts yet (early prototypes).

Choose Parse XML with schema when:
- You want a stronger, explicit contract (the XSD defines what the payload is).
- You want the designer to expose structured outputs ("JSON Schema for element …") so downstream steps are readable and less brittle.
- You expect the message shape to evolve over time and prefer schema-driven changes over XPath surgery.

Appendix 2: Using FORMAT_MESSAGE to produce SAP-native error text

When propagating failures from Logic Apps back into SAP (for example via Send exception to SAP server), I want the SAP side to produce a predictable, human-readable message without forcing callers to parse connector-specific payloads. ABAP's FORMAT_MESSAGE is ideal for this because it converts SAP's message context—message class, message number, and up to four variables—into the final message text that SAP would normally display, but without raising a UI message.

What FORMAT_MESSAGE does

FORMAT_MESSAGE formats a message defined in SAP's message catalog (T100, maintained via SE91) using the values in sy-msgid, sy-msgno, and sy-msgv1 … sy-msgv4. Conceptually, it answers the question: "Given message class + number + variables, what is the rendered message string?" This is particularly useful after an RFC call fails, where ABAP may have message context available even if the exception itself is not a clean string.

Why this matters in an RFC wrapper

In the message class–based exception configuration, the workflow can provide message metadata (class/number/type) so that SAP can behave "natively": ABAP receives a failure (sy-subrc <> 0), formats the message using FORMAT_MESSAGE, and returns the final text in a field like EXCEPTIONMSG (and/or in BAPIRET2-MESSAGE). The result is:
- consistent wording across systems and environments
- easier localization (SAP selects the language-dependent text)
- separation of concerns: the code supplies variables, while the message content lives in message maintenance

A robust pattern

After the RFC call, I use this order of precedence:
1. Use any explicit text already provided (for example via system_failure … MESSAGE exceptionmsg), because it's already formatted.
2. If that's empty but SAP message context exists (sy-msgid/sy-msgno), call FORMAT_MESSAGE to produce the final string.
3. If neither is available, fall back to a generic message that includes sy-subrc.

Here is a compact version of that pattern:

```abap
DATA: lv_text TYPE string.

CALL FUNCTION 'Z_GET_ORDERS_ANALYSIS'
  DESTINATION dest
  IMPORTING
    analysis = analysis
  TABLES
    it_csv   = it_csv
  CHANGING
    return   = return
  EXCEPTIONS
    sendexceptiontosapserver = 1
    system_failure           = 2 MESSAGE exceptionmsg
    communication_failure    = 3 MESSAGE exceptionmsg
    OTHERS                   = 4.

IF sy-subrc <> 0.
  "Prefer explicit message text if it already exists
  IF exceptionmsg IS INITIAL.
    "Otherwise format SAP message context into a string
    IF sy-msgid IS NOT INITIAL AND sy-msgno IS NOT INITIAL.
      CALL FUNCTION 'FORMAT_MESSAGE'
        EXPORTING
          id  = sy-msgid
          no  = sy-msgno
          v1  = sy-msgv1
          v2  = sy-msgv2
          v3  = sy-msgv3
          v4  = sy-msgv4
        IMPORTING
          msg = lv_text.
      exceptionmsg = lv_text.
    ELSE.
      exceptionmsg = |RFC failed (sy-subrc={ sy-subrc }).|.
    ENDIF.
  ENDIF.

  "Optionally normalize into BAPIRET2 for structured consumption
  return-type    = 'E'.
  return-message = exceptionmsg.
ENDIF.
```

Common gotchas

- FORMAT_MESSAGE only helps if sy-msgid and sy-msgno are set. If the failure did not originate from an SAP message (or message mapping is disabled), these fields may be empty, so keep a fallback.
- Message numbers are typically 3-digit strings (e.g., 001, 012), matching how messages are stored in the catalog.
- FORMAT_MESSAGE formats text; it does not raise or display a message. That makes it safe to use in RFC wrappers and background processing.

Bottom line: FORMAT_MESSAGE is a simple tool that helps workflow-originated failures "land" in SAP as clean, SAP-native messages, especially when using message classes to standardize and localize error text.

References

- Logic Apps Agentic Workflows with SAP - Part 2: AI Agents Handling Errors in SAP BAPI Transactions | Microsoft Community Hub
- Access SAP from workflows | Microsoft Learn
- Create common SAP workflows | Microsoft Learn
- Generate Schemas for SAP Artifacts via Workflows | Microsoft Learn
- Parse XML using Schemas in Standard workflows - Azure Logic Apps | Microsoft Learn
- Announcing XML Parse and Compose for Azure Logic Apps GA
- Exception Handling | ABAP Keyword Documentation
- Handling and Propagating Exceptions - ABAP Keyword Documentation
- SAP .NET Connector 3.1 Overview
- SAP .NET Connector 3.1 Programming Guide

All supporting content for this post may be found in the companion GitHub repository.

Data Mapper Test Executor: A New Addition to Logic Apps Standard Test Framework
Why Does It Matter?

Testing complex data transformations has always been a challenge for Logic Apps developers. With workflows relying on XSLT, Liquid templates, and custom maps, validating these transformations often required manual steps or external tools. The Data Mapper Test Executor changes that by bringing first-class support for map testing directly into the Logic Apps Standard Automated Test Framework. This means faster feedback loops, improved reliability, and a more streamlined developer experience.

Key Highlights

- Native support for Data Mapper testing: execute unit tests for XSLT 1.0/2.0/3.0 and Logic Apps Data Mapper (.lml) files without leaving your development environment.
- Integrated with the Automated Test Framework SDK: leverages the latest SDK capabilities for consistent test execution and reporting.
- Enhanced validation: supports schema validation and transformation checks, ensuring your maps behave as expected across scenarios.
- Performance optimizations: built-in caching and resource management for efficient test runs.

How to Get Started

1. Install and reference the latest Automated Test Framework SDK. Make sure you're referencing version 1.0.1 or higher of the framework to access the new executor class. You can find the latest version in the NuGet gallery.

```xml
<PackageReference Include="Microsoft.Azure.Workflows.WebJobs.Tests.Extension" Version="1.0.1" />
```

2. Add the Data Mapper Test Executor to your test project. Include the new class in your unit test suite for Logic Apps workflows.

3. Write your first test. The sample code below contains four different test methods, highlighting the options that can be used with the DataMapTestExecutor:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Workflows.UnitTesting.Definitions;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using la1.Tests.Mocks.Stateful1;
using Microsoft.Azure.Workflows.UnitTesting;
using System.IO;
using System.Text;
using Microsoft.Azure.Workflows.Data.Entities;
using Newtonsoft.Json.Linq;

namespace la1.Tests
{
    /// <summary>
    /// The unit test class.
    /// </summary>
    [TestClass]
    public class DataMapTest
    {
        /// <summary>
        /// The map to test.
        /// </summary>
        public const string MapName = "source_order_to_canonical_order";

        public string PathToMapDefinition { get; private set; }

        public string PathToCompiledXslt { get; private set; }

        public string PathToXsltTestInputs { get; private set; }

        /// <summary>
        /// The data map test executor.
        /// </summary>
        public TestExecutor TestExecutor;

        [TestInitialize]
        public void Setup()
        {
            this.TestExecutor = new TestExecutor("Stateful1/testSettings.config");
            this.PathToMapDefinition = Path.Combine(this.TestExecutor.rootDirectory, this.TestExecutor.logicAppName, "Artifacts", "MapDefinitions", $"{DataMapTest.MapName}.lml");
            this.PathToCompiledXslt = Path.Combine(this.TestExecutor.rootDirectory, this.TestExecutor.logicAppName, "Artifacts", "Maps", $"{DataMapTest.MapName}.xslt");
            this.PathToXsltTestInputs = Path.Combine(this.TestExecutor.rootDirectory, "Tests", "la1", "Stateful1", "DataMapTest", "test-input.json");
        }

        /// <summary>
        /// Sample comparing compiled XSLT to generated XSLT using map name.
        /// </summary>
        [TestMethod]
        public async Task DataMapTest_GenerateXslt()
        {
            var dataMapTestExecutor = this.TestExecutor.CreateMapExecutor();
            var xsltBytes = await dataMapTestExecutor
                .GenerateXslt(mapName: DataMapTest.MapName)
                .ConfigureAwait(continueOnCapturedContext: false);

            Assert.IsNotNull(xsltBytes);
            Assert.AreEqual(expected: File.ReadAllText(this.PathToCompiledXslt), actual: Encoding.UTF8.GetString(xsltBytes));
        }

        /// <summary>
        /// Sample comparing compiled XSLT to generated XSLT using map content.
        /// </summary>
        [TestMethod]
        public async Task DataMapTest_GenerateXsltWithMapContent()
        {
            var mapContent = File.ReadAllText(this.PathToMapDefinition);
            var generateXsltInput = new GenerateXsltInput
            {
                MapContent = mapContent
            };

            var dataMapTestExecutor = this.TestExecutor.CreateMapExecutor();
            var xsltBytes = await dataMapTestExecutor
                .GenerateXslt(generateXsltInput: generateXsltInput)
                .ConfigureAwait(continueOnCapturedContext: false);

            Assert.IsNotNull(xsltBytes);
            Assert.AreEqual(expected: File.ReadAllText(this.PathToCompiledXslt), actual: Encoding.UTF8.GetString(xsltBytes));
        }

        /// <summary>
        /// Sample running data map using map name and test inputs.
        /// </summary>
        [TestMethod]
        public async Task DataMapTest_RunMap()
        {
            var dataMapTestExecutor = this.TestExecutor.CreateMapExecutor();
            var mapOutput = await dataMapTestExecutor
                .RunMapAsync(
                    mapName: DataMapTest.MapName,
                    inputContent: File.ReadAllBytes(this.PathToXsltTestInputs))
                .ConfigureAwait(continueOnCapturedContext: false);

            Assert.IsNotNull(mapOutput);
            Assert.IsTrue(mapOutput.Type == JTokenType.Object);
            Assert.AreEqual(expected: "FC-20250603-001", actual: mapOutput["orderId"]);
            Assert.AreEqual(expected: "VIP-789456", actual: mapOutput["customerId"]);
            Assert.AreEqual(expected: "NEW", actual: mapOutput["status"]);
        }

        /// <summary>
        /// Sample running data map using generated XSLT content and test inputs.
        /// </summary>
        [TestMethod]
        public async Task DataMapTest_RunMapWithXsltContentBytes()
        {
            var dataMapTestExecutor = this.TestExecutor.CreateMapExecutor();
            var mapContent = File.ReadAllText(this.PathToMapDefinition);
            var generateXsltInput = new GenerateXsltInput
            {
                MapContent = mapContent
            };

            var xsltContent = await dataMapTestExecutor
                .GenerateXslt(generateXsltInput: generateXsltInput)
                .ConfigureAwait(continueOnCapturedContext: false);

            var mapOutput = await dataMapTestExecutor
                .RunMapAsync(
                    xsltContent: xsltContent,
                    inputContent: File.ReadAllBytes(this.PathToXsltTestInputs))
                .ConfigureAwait(continueOnCapturedContext: false);

            Assert.IsNotNull(mapOutput);
            Assert.IsTrue(mapOutput.Type == JTokenType.Object);
            Assert.AreEqual(expected: "FC-20250603-001", actual: mapOutput["orderId"]);
            Assert.AreEqual(expected: "VIP-789456", actual: mapOutput["customerId"]);
            Assert.AreEqual(expected: "NEW", actual: mapOutput["status"]);
        }
    }
}
```

The default TestExecutor should be updated with a new method, so you can use the same class to create both UnitTestExecutor and DataMapTestExecutor instances:

```csharp
public DataMapTestExecutor CreateMapExecutor()
{
    // Set the path for workflow-related input files in the workspace and build the full paths to the required JSON files.
    var appDirectoryPath = Path.Combine(this.rootDirectory, this.logicAppName);
    return new DataMapTestExecutor(appDirectoryPath: appDirectoryPath);
}
```

Note: This feature is available starting with the latest SDK release. Update your dependencies before making changes to your code to get full IntelliSense support.
Limitations

- Loop structures: dynamic execution within repeating structures is not yet supported.
- Non-mocked connectors: all actions in the execution path should be mocked.
- Unsupported actions: Integration Account maps, custom code actions, and EDI encode/decode remain out of scope for this release.
- Preview caveats: if you're using private preview components, expect limited testing compared to GA releases.

Learn More

Logic Apps Standard Automated Test Framework SDK

