This two-part series presents an end-to-end Azure Logic Apps implementation that integrates the SAP built-in connector with an AI-assisted validation and analysis pipeline. The intent is twofold: (1) complement existing SAP connector documentation by showing the SAP configuration and contracts in a real integration context, and (2) move beyond basic connectivity by using AI to produce more actionable results from the same data flow. In the scenario, a Logic App sends CSV documents to an SAP system and receives back a structured analysis document (market trends, predictions, and recommendations) computed only from data that passes pre-defined business rules. Data is exchanged between SAP and Logic Apps using SAP RFCs, with results and failures returned to SAP in a consistent shape and summarized to end users via email. Part 1 focuses on the SAP ↔ Logic Apps integration mechanics—how data moves and how errors propagate. Part 2 covers the AI layer—how validation is performed using an agent loop and how insights are generated through an OpenAI-backed API connection. The approach is intentionally generic and reusable and serves as a starter pattern for new implementations or migrations from platforms such as BizTalk.
1. Introduction
When you integrate Azure Logic Apps with SAP, the “hello world” part is usually easy. The part that bites you later is data quality. In SAP-heavy flows, validation isn’t a nice-to-have — it’s what makes the downstream results meaningful. If invalid data slips through, it can get expensive fast: you may create incorrect business documents, trigger follow-up processes, and end up in a cleanup path that’s harder (and more manual) than building validation upfront. And in “all-or-nothing” transactional patterns, things get even more interesting: one bad record can force a rollback strategy, compensating actions, or a whole replay/reconciliation story you didn’t want to own. See for instance Handling Errors in SAP BAPI Transactions | Microsoft Community Hub to get an idea of the complexity in a BizTalk context.
That’s the motivation for this post: a practical starter pattern that you can adapt to many data shapes and domains for validating data in a Logic Apps + SAP integration.
Note: For the full set of assets used here, see the companion GitHub repository (workflows, schemas, SAP ABAP code, and sample files).
Scenario overview
The scenario is intentionally simple, but it mirrors what shows up in real systems:
- A Logic App workflow sends CSV documents to an SAP endpoint.
- SAP forwards the payload to a second Logic App workflow that performs:
  - rule-based validation against pre-defined business rules
  - analysis/enrichment (market trends, predictions, recommendations)
- The workflow either:
  - returns validated results (or validation errors) to the initiating workflow, or
  - persists outputs for later use
For illustration, I’m using fictitious retail data. The content is made up, but the mechanics are generic: the same approach works for orders, inventory, pricing, master data feeds, or any “file in → decision out” integration. You’ll see sample inputs and outputs below to keep the transformations concrete.
Figure: Input CSV data and corresponding rules
Figure: Outputs - Analysis of trends and predictions, validation summary and IDoc information.
What this post covers
This walkthrough focuses on the integration building blocks that tend to matter in production:
- Calling SAP RFCs from Logic App workflows, and invoking Logic App workflows from SAP function modules
- Using the Logic Apps SAP built-in trigger
- Receiving and processing IDocs
- Returning responses and exceptions back to SAP in a structured, actionable way
- Data manipulation patterns in Logic Apps, including:
- parsing and formatting
- inline scripts
- XPath (where it fits, and where it becomes painful).
Overall Implementation
A high-level view of the implementation is shown below. The source workflow handles end-to-end ingestion—file intake, transformation, SAP integration, error handling, and notifications—using Azure Logic Apps. The destination workflows focus on validation and downstream processing, including AI-assisted analysis and reporting, with robust exception handling across multiple technologies. I’ll cover the AI portion in a follow-up post.
Figure: Overall implementation.
Note on AI-assisted development: Most of the workflow “glue” in this post—XPath, JavaScript snippets, and Logic Apps expressions—was built with help from Copilot and the AI assistant in the designer (see Get AI-assisted help for Standard workflows - Azure Logic Apps | Microsoft Learn). In my experience, this is exactly where AI assistance pays off: generating correct scaffolding quickly, then iterating based on runtime behavior. I’ve also included SAP ABAP snippets for the SAP-side counterpart. You don’t need advanced ABAP knowledge to follow along; the snippets are deliberately narrow and integration-focused. I include them because it’s hard to design robust integrations if you only understand one side of the contract. When you understand how SAP expects to receive data, how it signals errors, and where transactional boundaries actually are, you end up with cleaner workflows and fewer surprises.
2. Source Workflow
This workflow is a small, end‑to‑end “sender” pipeline: it reads a CSV file from Azure Blob Storage, converts the rows into the SAP table‑of‑lines XML shape expected by an RFC, calls Z_GET_ORDERS_ANALYSIS via the SAP connector, then extracts analysis or error details from the RFC response and emails a single consolidated result.
At a high level:
- Input: an HTTP request (used to kick off the run) + a blob name.
- Processing: CSV → array of rows → XML (zty_csv_line/line elements under IT_CSV) → RFC call
- Output: one email containing either:
- the analysis (success path), or
- a composed error summary (failure path).
The diagram below summarizes the sender pipeline: HTTP trigger → Blob CSV (header included) → rows → SAP RFC → parse response → email.
Figure: End‑to‑end sender pipeline
Two design choices are doing most of the work here. First, the workflow keeps the CSV transport contract stable by sending the file as a verbatim list of lines—including the header—wrapped into zty_csv_line/line elements under IT_CSV. Second, it treats the RFC response as the source of truth: EXCEPTIONMSG and RETURN/MESSAGE drive a single Has errors gate, which determines whether the email contains the analysis or a consolidated failure summary.
Step-by-step description
Phase 0 — Trigger
Phase 1 — Load and split the CSV
Design note: Keeping the header row is useful when downstream validation or analysis wants column names, and it avoids implicit assumptions in the sender workflow.
Phase 2 — Shape the RFC payload
Phase 3 — Call SAP and extract response fields
Phase 4 — Decide success vs failure and notify
Note: Because the header row is included in IT_CSV, the SAP-side parsing/validation treats the first line as column titles (or simply ignores it). The sender workflow stays “schema-agnostic” by design.
Useful snippets
Snippet 1 — Split the CSV into rows
split(string(body('Read_CSV_orders_from_blob')?['content']), '\r\n')
Tip: If your CSV has a header row you don’t want to send to SAP, switch back to:
@skip(split(string(body('Read_CSV_orders_from_blob')?['content']), '\r\n'), 1)
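If your blobs may arrive with mixed line endings, an inline-script variant of the same split is easy to harden. This is a minimal sketch, not the workflow’s actual action: it assumes the blob action name used in Snippet 1 and tolerates both CRLF and LF while dropping empty lines.
// Hypothetical inline-script alternative to the split() expression above.
// Assumes the blob action is named Read_CSV_orders_from_blob, as in Snippet 1.
const content = String(workflowContext.actions.Read_CSV_orders_from_blob.outputs.body.content);
const rows = content
  .split(/\r?\n/)                       // tolerate CRLF and LF line endings
  .filter(line => line.trim() !== '');  // drop empty lines produced by blob reads
return { rows };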
Snippet 2 — JavaScript transform: “rows → SAP table‑of‑lines XML”
const lines = workflowContext.actions.Extract_rows.outputs;
function xmlEscape(value) {
return String(value)
.replace(/&/g, "&amp;")
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")
.replace(/"/g, "&quot;")
.replace(/'/g, "&apos;");
}
// NOTE: we don't want to keep empty lines (which can be produced by reading the blobs)
// the reason being that if the recipient uses a schema to validate the xml,
// it may reject it if it does not allow empty nodes.
const xml = lines
.filter(line => line && line.trim() !== '') // keep only non-empty lines
.map(line => `<zty_csv_line><line>${xmlEscape(line)}</line></zty_csv_line>`)
.join('');
return { xml };
Snippet 3 — XPath extraction of response fields (namespace-robust)
EXCEPTIONMSG:
@xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(
/*[local-name()="Z_GET_ORDERS_ANALYSISResponse"]
/*[local-name()="EXCEPTIONMSG"])')
RETURN/MESSAGE:
@xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(
/*[local-name()="Z_GET_ORDERS_ANALYSISResponse"]
/*[local-name()="RETURN"]
/*[local-name()="MESSAGE"])')
Snippet 4 — Failure email body composition
concat(
'Error message: ', outputs('Save_RETURN_message'), ', details: ',
xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V1\"])'),
xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V2\"])'),
xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V3\"])'),
xpath(body('[RFC]_Call_Z_GET_ORDERS_ANALYSIS')?['content'], 'string(//*[local-name()=\"MESSAGE_V4\"])'),
'; ',
'Exception message: ', outputs('Save_EXCEPTION_message'), '.')
3. SAP Support
To make the SAP/Logic Apps boundary simple, I model the incoming CSV as a table of “raw lines” on the SAP side. The function module Z_GET_ORDERS_ANALYSIS exposes a single table parameter, IT_CSV, typed using a custom line structure.
Figure: IT_CSV is a table of CSV lines (ZTY_CSV_LINE), with a single LINE field (CHAR2048).
IT_CSV uses the custom structure ZTY_CSV_LINE, which contains a single component LINE (CHAR2048). This keeps the SAP interface stable: the workflow can send CSV lines without SAP having to know the schema up front, and the parsing/validation logic can evolve independently.
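Because LINE is CHAR2048, rows longer than 2,048 characters would be truncated on the SAP side. A cheap pre-flight guard in the sender’s inline script can fail fast instead; this is a sketch, assuming lines is the row array from Snippet 2:
// Hypothetical pre-flight check: ZTY_CSV_LINE-LINE is CHAR2048, so reject
// rows that would otherwise be silently clipped by the SAP data type.
const MAX_LINE = 2048;
const tooLong = lines.filter(line => line.length > MAX_LINE);
if (tooLong.length > 0) {
  throw new Error(`${tooLong.length} CSV row(s) exceed ${MAX_LINE} characters`);
}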
The diagram below shows the plumbing that connects SAP to Azure Logic Apps in two common patterns: SAP sending IDocs to a workflow and SAP calling a remote-enabled endpoint via an RFC destination. I’m showing all three pieces together—the ABAP call site, the SM59 RFC destination, and the Logic Apps SAP built-in trigger—because most “it doesn’t work” problems come down to a small set of mismatched configuration values rather than workflow logic.
The key takeaway is that both patterns hinge on the same contract: Program ID plus the SAP Gateway host/service. In SAP, those live in SM59 (TCP/IP destination, registered server program). In Logic Apps, the SAP built-in trigger listens using the same Program ID and gateway settings, while the trigger configuration (for example, IDoc format and degree of parallelism) controls how messages are interpreted and processed. Once these values line up, the rest of the implementation becomes “normal workflow engineering”: validation, predictable error propagation, and response shaping.
Before diving into workflow internals, I make the SAP-side contract explicit. The function module interface below shows the integration boundary: CSV lines come in as IT_CSV, results come back as ANALYSIS, and status/error information is surfaced both as a human-readable EXCEPTIONMSG and as a structured RETURN (BAPIRET2). I also use a dedicated exception (SENDEXCEPTIONTOSAPSERVER) to signal workflow-raised failures cleanly.
Contract (what goes over RFC):
- Input: IT_CSV (CSV lines)
- Outputs: ANALYSIS (analysis payload), EXCEPTIONMSG (human-readable status)
- Return structure: RETURN (BAPIRET2) for structured SAP-style success/error
- Custom exception: SENDEXCEPTIONTOSAPSERVER for workflow-raised failures
Here is the ABAP wrapper that calls the remote implementation and normalizes the result.
FUNCTION z_get_orders_analysis.
*"----------------------------------------------------------------------
*" This module acts as a caller wrapper.
*" Important: the remote execution is determined by DESTINATION.
*" Even though the function name is the same, this is not recursion:
*" the call runs in the remote RFC server registered under DESTINATION "DEST".
*"----------------------------------------------------------------------
*" Contract:
*" TABLES it_csv "CSV lines
*" IMPORTING analysis "Result payload
*" EXPORTING exceptionmsg "Human-readable status / error
*" CHANGING return "BAPIRET2 return structure
*" EXCEPTIONS sendexceptiontosapserver
*"----------------------------------------------------------------------
CALL FUNCTION 'Z_GET_ORDERS_ANALYSIS' DESTINATION dest
IMPORTING
analysis = analysis
TABLES
it_csv = it_csv
CHANGING
return = return
EXCEPTIONS
sendexceptiontosapserver = 1
system_failure = 2 MESSAGE exceptionmsg
communication_failure = 3 MESSAGE exceptionmsg
OTHERS = 4.
CASE sy-subrc.
WHEN 0.
exceptionmsg = 'ok'.
"Optional: normalize success into RETURN for callers that ignore EXCEPTIONMSG
IF return-type IS INITIAL.
return-type = 'S'.
return-message = 'OK'.
ENDIF.
WHEN 1.
exceptionmsg =
|Exception from workflow: SENDEXCEPTIONTOSAPSERVER { sy-msgv1 }{ sy-msgv2 }{ sy-msgv3 }{ sy-msgv4 }|.
return-type = 'E'.
return-message = exceptionmsg.
WHEN 2 OR 3.
"system_failure / communication_failure usually already populate exceptionmsg
IF exceptionmsg IS INITIAL.
exceptionmsg = |RFC system/communication failure.|.
ENDIF.
return-type = 'E'.
return-message = exceptionmsg.
WHEN OTHERS.
exceptionmsg =
|Error in workflow: { sy-msgv1 }{ sy-msgv2 }{ sy-msgv3 }{ sy-msgv4 }|.
return-type = 'E'.
return-message = exceptionmsg.
ENDCASE.
ENDFUNCTION.
The wrapper is intentionally small: it forwards the payload to the remote implementation via the RFC destination and then normalizes the outcome into a predictable shape. The point isn’t fancy ABAP — it’s reliability. With a stable contract (IT_CSV, ANALYSIS, RETURN, EXCEPTIONMSG) the Logic Apps side can evolve independently while SAP callers still get consistent success/error semantics.
Important: in CALL FUNCTION 'Z_GET_ORDERS_ANALYSIS' DESTINATION dest, the name of the called function must match the name of the ABAP wrapper function module, because the SAP built-in trigger in the Logic App uses the signature of the function module Z_GET_ORDERS_ANALYSIS (i.e., its metadata) as the contract.
To sum up, the integration contract is intentionally shaped around three elements: the raw input table (IT_CSV), a standardized SAP return structure (RETURN / BAPIRET2), and a readable status string (EXCEPTIONMSG). The custom exception (SENDEXCEPTIONTOSAPSERVER) gives me a clean way to surface workflow failures back into SAP without burying them inside connector-specific error payloads. This is depicted in the figure below.
4. Destination Workflow
The diagram below shows the destination workflow at a high level. I designed it as a staged pipeline: guard early, normalize input, validate, and then split the workload into two paths—operational handling of invalid records (notifications and optional IDoc remediation) and analysis of the validated dataset. Importantly, the SAP response is intentionally narrow: SAP receives only the final analysis (or a structured error), while validation details are delivered out-of-band via email.
How to read this diagram
Figure: Destination workflow with staged validation, optional IDoc remediation, and an SAP response.
Reading the workflow top-to-bottom, the main design choice is separation of concerns. Validation is used to filter and operationalize bad records (notify humans, optionally create IDocs), while the SAP-facing response stays clean and predictable: SAP receives the final analysis for the validated dataset, or an error if the run can’t complete. This keeps the SAP contract stable even as validation rules and reporting details evolve.
Step‑by‑step walkthrough
Phase 0 — Entry and routing
Phase 1 — Normalize input into a workflow‑friendly payload
Phase 2 — Validate the dataset (AI agent loop)
Note: The detailed AI prompt/agent mechanics are covered in Part 2. In Part 1, the focus is on the integration flow and how data moves.
Phase 3 — Operational handling of invalid records (email + optional SAP remediation)
After validation, the workflow treats invalid records as an operational concern: they are reported to humans and can optionally be routed into an SAP remediation path. This is shown in the right‑hand “Create IDocs” block.
Why this matters: Validation results are made visible (email) and optionally actionable (IDocs), without polluting the primary analysis response that SAP receives.
Phase 4 — Analyze only the validated dataset (AI analysis)
The workflow runs AI analysis on the validated dataset, explicitly excluding invalid order IDs discovered during the validation phase; a sketch of this filtering step follows below. The analysis prompt instructs the model to produce outputs such as trends, predictions, and recommendations. Note: The AI analysis prompt design and output shaping are covered in Part 2.
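The exclusion itself can be a simple set difference over order IDs. A minimal sketch, with hypothetical action names (Extract_rows, Get_invalid_order_ids) and the assumption that the first CSV column is the order ID:
// Hypothetical post-validation filter: keep the header plus every row whose
// order ID (assumed to be the first CSV column) was not flagged as invalid.
const rows = workflowContext.actions.Extract_rows.outputs;
const invalidIds = new Set(workflowContext.actions.Get_invalid_order_ids.outputs);
const validated = rows.filter((line, index) =>
  index === 0 || !invalidIds.has(line.split(',')[0]));
return { validated };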
Phase 5 — Post‑process the AI response and publish outputs
Key design choice: SAP receives a clean, stable contract—analysis on success, structured error on failure. Validation details are handled out‑of‑band via email (and optionally via IDoc creation). Note: the analysis email sent by the destination workflow is there for testing purposes, to verify that the HTML content matches what is sent back to the source workflow.
Useful snippets
Snippet 1 - Join each CSV line in the XML to make a CSV table:
join(
xpath(
xml(triggerBody()?['content']),
'/*[local-name()=\"Z_GET_ORDERS_ANALYSIS\"]
/*[local-name()=\"IT_CSV\"]
/*[local-name()=\"ZTY_CSV_LINE\"]
/*[local-name()=\"LINE\"]/text()'
),
'\r\n')
Note: For the sake of simplicity, XPath is used here and throughout all places where XML is parsed. In the general case however, the Parse XML with schema action is the better and recommended way to strictly enforce the data contract between senders and receivers. More information about Parse XML with schema is provided in Appendix 1.
Snippet 2 - Format markdown to html (simplified):
const raw = workflowContext.actions.Extract_analysis.outputs;
// Basic HTML escaping for safety (keeps <code> blocks clean)
const escapeHtml = s => s.replace(/[&<>"]/g, c => ({'&':'&amp;','<':'&lt;','>':'&gt;','"':'&quot;'}[c]));
// Normalize line endings
let md = raw.replace(/\r\n/g, '\n').trim();
// Convert code blocks (``` ... ```)
md = md.replace(/```([\s\S]*?)```/g, (m, p1) => `<pre><code>${escapeHtml(p1)}</code></pre>`);
// Horizontal rules --- or ***
md = md.replace(/(?:^|\n)---+(?:\n|$)/g, '<hr/>');
// Headings ###### to #
for (let i = 6; i >= 1; i--) {
const re = new RegExp(`(?:^|\\n)${'#'.repeat(i)}\\s+(.+?)\\s*(?=\\n|$)`, 'g');
md = md.replace(re, (m, p1) => `<h${i}>${p1.trim()}</h${i}>`);
}
// Bold and italic
md = md.replace(/\*\*([^*]+)\*\*/g, '<strong>$1</strong>');
md = md.replace(/\*([^*]+)\*/g, '<em>$1</em>');
// Unordered lists (lines starting with -, *, +)
md = md.replace(/(?:^|\n)([-*+]\s.+(?:\n[-*+]\s.+)*)/g, (m) => {
const items = m.trim().split(/\n/).map(l => l.replace(/^[-*+]\s+/, '').trim());
return '\n<ul>' + items.map(i => `<li>${i}</li>`).join('\n') + '</ul>';
});
// Paragraphs: wrap remaining text blocks in <p>...</p>
const blocks = md.split(/\n{2,}/).map(b => {
if (/^<h\d>|^<ul>|^<pre>|^<hr\/>/.test(b.trim())) return b;
return `<p>${b.replace(/\n/g, '<br/>')}</p>`;
});
const html = blocks.join('');
return { html };
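To make the behavior concrete, here is an illustrative (made-up) input and the approximate output of the transform above:
// Example (illustrative) input:
//   "## Trends\n\n- Demand is rising\n- Stock is flat\n\nOverall stable."
// Approximate output (modulo internal newlines):
//   "<h2>Trends</h2><ul><li>Demand is rising</li><li>Stock is flat</li></ul><p>Overall stable.</p>"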
5. Exception Handling
To illustrate exception handling, suppose that multiple workflows listen on the same Program ID (by design or unexpectedly) and can therefore receive messages meant for other receivers. The first thing the destination workflow does is therefore validate that the incoming function name is the expected one, as sketched below.
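A minimal inline-script version of that guard could look like the following; the actual workflow uses a Condition action, and I’m assuming the SAP built-in trigger exposes the called function name as FunctionName in its output body (the same property Destination workflow #2 checks for IDOC_INBOUND_ASYNCHRONOUS):
// Hypothetical guard: accept only requests addressed to our RFC.
const fn = workflowContext.trigger.outputs.body.FunctionName;
return { isExpected: fn === 'Z_GET_ORDERS_ANALYSIS' };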
In this section I show three practical ways to surface workflow failures back to SAP using the Logic Apps action “Send exception to SAP server”, and the corresponding ABAP patterns used to handle them. The core idea is the same in all three: Logic Apps raises an exception on the SAP side, SAP receives it as an RFC exception, and your ABAP wrapper converts that into something predictable (for example, a readable EXCEPTIONMSG, a populated RETURN, or both). The differences are in how much control you want over the exception identity and whether you want to leverage SAP message classes for consistent, localized messages.
5.1 Default exception
This first example shows the default behavior of Send exception to SAP server. When the action runs without a custom exception name configuration, the connector raises a pre-defined exception that can be handled explicitly in ABAP.
On the Logic Apps side, the action card “Send exception to SAP server” sends an Exception Error Message (for example, “Unexpected action in request: …”). On the ABAP side, the RFC call lists SENDEXCEPTIONTOSAPSERVER = 1 under EXCEPTIONS, and the code uses CASE sy-subrc to map that exception to a readable message.
Figure: Default exception
The key takeaway is that you get a reliable “out-of-the-box” exception path: ABAP can treat sy-subrc = 1 as the workflow‑raised failure and generate a consistent EXCEPTIONMSG. This is the simplest option and works well when you don’t need multiple exception names—just one clear “workflow failed” signal.
5.2 Message exception
If you want more control than the default, you can configure the action to raise a named exception declared in your ABAP function module interface. This makes it easier to route different failure types without parsing free-form text.
The picture shows Advanced parameters under the Logic Apps action, including “Exception Name” with helper text indicating it must match an exception declared in the ABAP function module definition.
Figure: Message exception
This option is useful when you want to distinguish workflow error categories (e.g., validation vs. routing vs. downstream failures) using exception identity, not just message text. The contract stays explicit: Logic Apps raises a named exception, and ABAP can branch on that name (or on sy-subrc mapping) with minimal ambiguity.
5.3 Message class exception
The third approach uses SAP’s built-in message class mechanism so that the exception raised by the workflow can map cleanly into SAP’s message catalog (T100). This is helpful when you want consistent formatting and localization aligned with standard SAP patterns.
On the Logic Apps side, the action shows advanced fields including Message Class, Message Number, and an Is ABAP Message toggle, with helper text stating the message class can come from message maintenance (SE91) or be custom. On the ABAP side, the code highlights an error-handling block that calls using sy-msgid, sy-msgno, and variables sy-msgv1…sy-msgv4, then stores the resulting text in EXCEPTIONMSG.
Figure: Message class exception
This pattern is ideal when you want workflow exceptions to look and behave like “native” SAP messages. Instead of hard-coding strings, you rely on the message catalog and let ABAP produce a consistent final message via FORMAT_MESSAGE. The result is easier to standardize across teams and environments—especially if you already manage message classes as part of your SAP development process.
Refer to Appendix 2 for further information on FORMAT_MESSAGE.
5.4 Choosing an exception strategy that SAP can act on
Across these examples, the goal is consistent: treat workflow failures as first‑class outcomes in SAP, not as connector noise buried in run history. The Logic Apps action Send exception to SAP server gives you three increasingly structured ways to do that, and the “right” choice depends on how much semantics you want SAP to understand.
- Default exception (lowest ceremony): Use this when you just need a reliable “workflow failed” signal. The connector raises a pre-defined exception name (for example, SENDEXCEPTIONTOSAPSERVER), and ABAP can handle it with a simple EXCEPTIONS … = 1 mapping and a sy-subrc check. This is the fastest way to make failures visible and deterministic.
- Named exception(s) (more routing control): Use this when you want SAP to distinguish failure types without parsing message text. By raising an exception name declared in the ABAP function module interface, you can branch cleanly in ABAP (or map to different return handling) and keep the contract explicit and maintainable.
- Message class + number (most SAP-native): Use this when you want errors to look and behave like standard SAP messages—consistent wording, centralized maintenance, and better alignment with SAP operational practices. In this mode, ABAP can render the final localized string using FORMAT_MESSAGE and return it as EXCEPTIONMSG (and optionally BAPIRET2-MESSAGE), which makes the failure both human-friendly and SAP-friendly.
A practical rule of thumb: start with the default exception while you stabilize the integration, move to named exceptions when you need clearer routing semantics, and adopt message classes when you want SAP-native error governance (standardization, maintainability, and localization). Regardless of the option, the key is to end with a predictable SAP-side contract: a clear success path, and a failure path that produces a structured return and a readable message.
6. Response Handling
This section shows how the destination workflow returns either a successful analysis response or a workflow exception back to SAP, and how the source (caller) workflow interprets the RFC response structure to produce a single, human‑readable outcome (an email body). The key idea is to keep the SAP-facing contract stable: SAP always returns a Z_GET_ORDERS_ANALYSISResponse envelope, and the caller workflow decides between success and error using just two fields: EXCEPTIONMSG and RETURN/MESSAGE. To summarize the steps:
- Destination workflow either:
  - sends a normal response via Respond to SAP server, or
  - raises an exception via Send exception to SAP server (with an error message).
- SAP server exposes those outcomes through the RFC wrapper:
  - sy-subrc = 0 → success (EXCEPTIONMSG = 'ok')
  - sy-subrc = 1 → workflow exception (SENDEXCEPTIONTOSAPSERVER)
  - sy-subrc = 2/3 → system/communication failures
- Source workflow calls the RFC, extracts EXCEPTIONMSG and RETURN/MESSAGE, and uses a Has errors gate to choose between a success email body (analysis) or a failure email body (error summary).
The figure below shows the full return path for results and failures. On the right, the destination workflow either responds normally (Respond to SAP server) or raises a workflow exception (Send exception to SAP server). SAP then maps that into the RFC outcome (sy-subrc and message fields). On the left, the source workflow parses the RFC response structure and populates a single EmailBody variable using two cases: failure (error details) or success (analysis text).
Figure: Response/exception flow
Two things make this pattern easy to operationalize. First, the caller workflow does not need to understand every SAP field—only EXCEPTIONMSG and RETURN/MESSAGE are required to decide success vs failure. Second, the failure path intentionally aggregates details (MESSAGE_V1…MESSAGE_V4 plus the exception text) into a single readable string so errors don’t get trapped in run history.
Callout: The caller workflow deliberately treats EXCEPTIONMSG != "ok" or RETURN/MESSAGE present as the single source of truth for failure, which keeps the decision logic stable even if the response schema grows.
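Expressed as an inline-script predicate (the workflow itself uses a Condition action), the gate reduces to two checks over the Save_* action outputs shown earlier:
// Hedged sketch of the "Has errors" gate: fail if the RFC reported anything
// other than 'ok', or if RETURN/MESSAGE came back non-empty.
const exceptionMsg = String(workflowContext.actions.Save_EXCEPTION_message.outputs || '');
const returnMessage = String(workflowContext.actions.Save_RETURN_message.outputs || '');
return { hasErrors: exceptionMsg.toLowerCase() !== 'ok' || returnMessage.trim() !== '' };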
Detailed description
Phase 1 — Destination workflow: choose “response” vs “exception”
Outcome: SAP receives either a normal response or a raised exception for the RFC call.
Phase 2 — SAP server: map workflow outcomes to RFC results
The SAP-side wrapper code shown in the figure calls the remote implementation via the RFC destination and then uses CASE sy-subrc to map each outcome (see the wrapper in Section 3). Outcome: regardless of why it failed, SAP can provide a consistent set of fields back to the caller: a return structure and an exception/status message.
Phase 3 — Source workflow: parse response and build one “email body”
After the RFC action ([RFC] Call Z_GET_ORDERS_ANALYSIS) completes, the source workflow extracts EXCEPTIONMSG and RETURN/MESSAGE, evaluates the Has errors condition, and populates the EmailBody variable accordingly.
Outcome: the caller produces a single artifact (EmailBody) that is readable and actionable, without requiring anyone to inspect the raw RFC response.
7. Destination Workflow #2: Persisting failed rows as custom IDocs
In this section I zoom in on the optional “IDoc persistence” branch at the end of the destination workflow. After the workflow identifies invalid rows (via the Data Validation Agent) and emails a verification summary, it can optionally call a second SAP RFC to save the failed rows as IDocs for later processing.
This is mainly included to showcase another common SAP integration scenario—creating/handling IDocs—and to highlight that you can combine “AI-driven validation” with traditional enterprise workflows. The deeper motivation for invoking this as part of the agent tooling is covered in Part 2; here, the goal is to show the connector pattern and the custom RFC used to create IDocs from CSV input.
The figure below shows the destination workflow at two levels: a high-level overview at the top, and a zoomed view of the post-validation remediation steps at the bottom. The zoom starts from Data Validation Agent → Summarize CSV payload review and then expands the sequence that runs after Send verification summary: Transform CSV to XML followed by an SAP RFC call that creates IDocs from the failed data.
Figure: Zoomed remediation branch
The key point is that this branch is not the main “analysis response” path. It’s a practical remediation option: once invalid rows are identified and reported, the workflow can persist them into SAP using a dedicated RFC (Z_CREATE_ONLINEORDER_IDOC) and a simple IT_CSV payload. This keeps the end-to-end flow modular: analysis can remain focused on validated data, while failed records can be routed to SAP for follow-up processing on their own timeline.
Callout: This branch exists to showcase an IDoc-oriented connector scenario. The “why this is invoked from the agent tooling” context is covered in Part 2; here the focus is the mechanics of calling Z_CREATE_ONLINEORDER_IDOC with IT_CSV and receiving ET_RETURN / ET_DOCNUMS.
The screenshot shows an XML body with the RFC root element and an SAP namespace:
<z_create_onlineorder_idoc xmlns="http://Microsoft.LobServices.Sap/2007/03/Rfc/">
<iv_direction>...</iv_direction>
<iv_sndptr>...</iv_sndptr>
<iv_sndprn>...</iv_sndprn>
<iv_rcvptr>...</iv_rcvptr>
<iv_rcvprn>...</iv_rcvprn>
<it_csv>
@{ ...Outputs... }
</it_csv>
<et_return></et_return>
<et_docnums></et_docnums>
</z_create_onlineorder_idoc>
What to notice:
- The workflow passes invalid CSV rows in IT_CSV, and SAP returns a status table (ET_RETURN) and created document numbers (ET_DOCNUMS) for traceability.
- The payload includes standard-looking control fields (IV_DIRECTION, IV_SNDPTR, IV_SNDPRN, IV_RCVPTR, IV_RCVPRN) and the actual failed-row payload as IT_CSV.
- IT_CSV is populated via a Logic Apps expression (shown as @{ ...Outputs... } in the screenshot), which is the bridge between the prior transform step and the RFC call.
- The response side indicates table-like outputs: ET_RETURN and ET_DOCNUMS.
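The “prior transform step” can reuse the table-of-lines shape from the sender workflow. A sketch under two assumptions: a hypothetical Get_invalid_rows action supplies the rejected lines, and Z_CREATE_ONLINEORDER_IDOC’s IT_CSV uses the same ZTY_CSV_LINE line type:
// Hypothetical "Transform CSV to XML" for the remediation branch: wrap only
// the invalid rows into the table-of-lines XML placed under <it_csv>.
const invalidRows = workflowContext.actions.Get_invalid_rows.outputs;
const xmlEscape = v => String(v)
  .replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;')
  .replace(/"/g, '&quot;').replace(/'/g, '&apos;');
const xml = invalidRows
  .filter(line => line && line.trim() !== '')
  .map(line => `<zty_csv_line><line>${xmlEscape(line)}</line></zty_csv_line>`)
  .join('');
return { xml };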
7.1 From CSV to IDocs
I’ll cover the details of Destination workflow #2 in Part 2. In this post (Part 1), I focus on the contract and the end-to-end mechanics: what the RFC expects, what it returns, and how the created IDocs show up in the receiving workflow.
Before looking at the RFC itself, it helps to understand the payload we’re building inside the IDoc. The screenshot below shows the custom segment definition used by the custom IDoc type. This segment is intentionally shaped to mirror the columns of the CSV input so the mapping stays direct and easy to reason about.
Figure: Custom segment ZONLINEORDER000 (segment type ZONLINEORDER)
This segment definition is the contract anchor: it makes the CSV-to-IDoc mapping explicit and stable. Each CSV record becomes one segment instance with the same 14 business fields. That keeps the integration “boringly predictable,” which is exactly what you want when you’re persisting rejected records for later processing.
The figure below shows the full loop for persisting failed rows as IDocs. The source workflow calls the custom RFC and sends the invalid CSV rows as XML. SAP converts each row into the custom segment and creates outbound IDocs. Those outbound IDocs are then received by Destination workflow #2, which processes them asynchronously (one workflow instance per IDoc) and appends results into shared storage for reporting.
Figure: Persisting rejected rows as IDocs
This pattern deliberately separates concerns:
- the first destination workflow identifies invalid rows and decides whether to persist them,
- SAP encapsulates the mechanics of IDoc creation behind a stable RFC interface, and
- a second destination workflow processes those IDocs asynchronously (one per IDoc), which is closer to how IDoc-driven integrations typically operate in production.
Destination workflow #2 is included here to show the end-to-end contract and the “receipt” side of the connector scenario:
- is triggered by the SAP built-in trigger and checks FunctionName = IDOC_INBOUND_ASYNCHRONOUS
- extracts DOCNUM from the IDoc control record (EDI_DC40/DOCNUM)
- appends a “verification info” line to shared storage for reporting
The implementation details of that workflow (including why it is invoked from the agent tooling) are covered in Part 2.
7.2 Z_CREATE_ONLINEORDER_IDOC - Contract overview
The full source code for Z_CREATE_ONLINEORDER_IDOC is included in the supporting material. It’s too long to reproduce inline, so this post focuses on the contract—the part you need to call the RFC correctly and interpret its results.
A quick note on authorship: most of the implementation was generated with Copilot, with manual review and fixes to resolve build errors and align the behavior with the intended integration pattern. The contract is deliberately generic because the goal was to produce an RFC that’s reusable across more than one scenario, rather than tightly coupled to a single workflow.
At a high level, the RFC is designed to support:
- Both inbound and outbound IDoc creation: it can either write IDocs to the SAP database (inbound-style persistence) or create/distribute IDocs outbound.
- Multiple IDoc/message/segment combinations: IDoc type (IDOCTYP), message type (MESTYP), and segment type (SEGTP) are configurable so the same RFC can be reused.
- Explicit partner/port routing control: optional sender/receiver partner/port fields can be supplied when routing matters.
- Traceability of created artifacts: the RFC returns created IDoc numbers so the caller can correlate “these failed rows” to “these IDocs.”
Contract:
Inputs (import parameters)
Tables
Outputs
8. Concluding Remarks
Part 1 established a stable SAP ↔ Logic Apps integration baseline: CSV moves end‑to‑end using explicit contracts, and failures are surfaced predictably. The source workflow reads CSV from Blob, wraps rows into the IT_CSV table‑of‑lines payload, calls Z_GET_ORDERS_ANALYSIS, and builds one outcome using two fields from the RFC response: EXCEPTIONMSG and RETURN/MESSAGE. The destination workflow gates requests, validates input, and returns only analysis (or errors) back to SAP while handling invalid rows operationally (notification + optional persistence).
On the error path, we covered three concrete patterns to raise workflow failures back into SAP: the default connector exception (SENDEXCEPTIONTOSAPSERVER), named exceptions (explicit ABAP contract), and message‑class‑based errors (SAP‑native formatting via FORMAT_MESSAGE). On the remediation side, we added a realistic enterprise pattern: persist rejected rows as custom IDocs via Z_CREATE_ONLINEORDER_IDOC (IT_CSV in, ET_RETURN + ET_DOCNUMS out), using the custom segment ZONLINEORDER000 as the schema anchor and enabling downstream receipt in Destination workflow #2 (one run per IDoc, correlated via DOCNUM).
Part 2 is separate because it tackles a different problem: the AI layer. With contracts and error semantics now fixed, Part 2 can focus on the agent/tooling details that tend to iterate—rule retrieval, structured validation outputs, prompt constraints, token/history controls, and how the analysis output is generated and shaped—without muddying the transport story.
Appendix 1: Parse XML with schema
In this section I use CSV payload creation as the example, but parsing XML with a schema applies anywhere we receive XML input to process, such as SAP responses, exceptions, or requests/responses from other RFCs.
Strong contract
The Create_CSV_payload step in the implementation shown above uses an xpath() + join() expression to extract LINE values from the incoming XML:
join(
xpath(
xml(triggerBody()?['content']),
'/*[local-name()="Z_GET_ORDERS_ANALYSIS"]
/*[local-name()="IT_CSV"]
/*[local-name()="ZTY_CSV_LINE"]
/*[local-name()="LINE"]/text()'
),
'\r\n'
)
That approach works, but it’s essentially a “weak contract”: it assumes the message shape stays stable and that your XPath continues to match. By contrast, the Parse XML with schema action turns the XML payload into structured data based on an XSD, which gives you a “strong contract” and enables downstream steps to bind to known fields instead of re-parsing XML strings.
The figure below compares two equivalent ways to build the CSV payload from the RFC input. On the left is the direct xpath() compose (labeled “weak contract”). On the right is the schema-based approach (labeled “strong contract”), where the workflow parses the request first and then builds the CSV payload by iterating over typed rows.
Figure: comparison Compose/XPath vs. Parse XML with schema.
What’s visible in the diagram is the key tradeoff:
- XPath compose path (left): the workflow creates the CSV payload directly using join(xpath(...), '\r\n'), with the XPath written using local-name() selectors. This is fast to prototype, but the contract is implicit—your workflow “trusts” the XML shape and your XPath accuracy.
- Parse XML with schema path (right): the workflow inserts a Parse XML with schema step (“Parse Z GET ORDERS ANALYSIS request”), initializes variables, loops For each CSV row, and Appends to CSV payload, then performs join(variables('CSVPayload'), '\r\n'). Here, the contract is explicit—your XSD defines what IT_CSV and LINE mean, and downstream steps bind to those fields rather than re-parsing XML.
A good rule of thumb is: XPath is great for lightweight extraction, while Parse XML with schema is better when you want contract enforcement and long-term maintainability, especially in enterprise integration / BizTalk migration scenarios where schemas are already part of the integration culture.
Implementation details
The next figure shows the concrete configuration for Parse XML with schema and how its outputs flow into the “For each CSV row” loop. This is the “strong contract” version of the earlier XPath compose.
Figure: Parse XML with schema - details.
This screenshot highlights three practical implementation details:
- The Parse action is schema-backed. In the Parameters pane, the action uses:
  - Content: the incoming XML Response
  - Schema source: LogicApp
  - Schema name: Z_GET_ORDERS_ANALYSIS
  The code view snippet shows the same idea: type: "XmlParse" with content: "@triggerBody()?['content']" and schema: { source: "LogicApp", name: "Z_GET_ORDERS_ANALYSIS.xsd" }.
- The parsed output becomes typed “dynamic content.” The loop input is shown as “JSON Schema for element 'Z_GET_ORDERS_ANALYSIS: IT_CSV'”. This is the key benefit: you are no longer scraping strings—you are iterating over a structured collection that was produced by schema-based parsing.
- The LINE extraction becomes trivial and readable. The “Append to CSV payload” step appends @item()?['LINE'] to the CSVPayload variable (as shown in the code snippet). Then the final Create CSV payload becomes a simple join(variables('CSVPayload'), '\r\n'). This is exactly the kind of “workflow readability” benefit you get once XML parsing is schema-backed.
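For comparison, once the parse output is typed, the whole loop can collapse into a few lines of inline script. A sketch, assuming the parse action name shown above and that its output mirrors the IT_CSV/ZTY_CSV_LINE structure:
// Hedged sketch: read the typed rows produced by Parse XML with schema and
// rebuild the CSV in one pass (equivalent to the For-each + Append pattern).
const parsed = workflowContext.actions.Parse_Z_GET_ORDERS_ANALYSIS_request.outputs.body;
const rows = parsed.Z_GET_ORDERS_ANALYSIS.IT_CSV.ZTY_CSV_LINE; // assumed shape
const csv = rows.map(row => row.LINE).join('\r\n');
return { csv };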
Schema generation
The Parse action requires XSD schemas, which can be stored in the Logic App (or via a linked Integration Account). The final figure shows a few practical ways to obtain and manage those XSDs:
- Generate Schema (SAP connector): a “Generate Schema” action with Operation Type = RFC and an RFC Name field, which is a practical way to bootstrap schema artifacts when you already know the RFC you’re calling.
- Run Diagnostics / Fetch RFC Metadata: a “Run Diagnostics” action showing Operation type = Fetch RFC Metadata and RFC Name, which is useful to confirm the shape of the RFC interface and reconcile it with your XSD/contract.
If you don’t want to rely solely on connector-side schema generation, there are also classic “developer tools” approaches:
- Infer XSD from a sample XML using .NET’s XmlSchemaInference (good for quick starting points).
- Generate XSD from an XML instance using xsd.exe (handy when you already have representative sample payloads), or by asking your favorite AI assistant.
When to choose XPath vs Parse XML with schema (practical guidance)
Generally speaking, choose XPath when…
- You need a quick extraction and you’re comfortable maintaining a single XPath.
- You don’t want to manage schema artifacts yet (early prototypes).
Choose Parse XML with schema when…
- You want a stronger, explicit contract (XSD defines what the payload is).
- You want the designer to expose structured outputs (“JSON Schema for element …”) so downstream steps are readable and less brittle.
- You expect the message shape to evolve over time and prefer schema-driven changes over XPath surgery.
Appendix 2: Using FORMAT_MESSAGE to produce SAP‑native error text
When propagating failures from Logic Apps back into SAP (for example via Send exception to SAP server), I want the SAP side to produce a predictable, human‑readable message without forcing callers to parse connector‑specific payloads. ABAP’s FORMAT_MESSAGE is ideal for this because it converts SAP’s message context—message class, message number, and up to four variables—into the final message text that SAP would normally display, but without raising a UI message.
What FORMAT_MESSAGE does
FORMAT_MESSAGE formats a message defined in SAP’s message catalog (T100 / maintained via SE91) using the values in sy-msgid, sy-msgno, and sy-msgv1…sy-msgv4. Conceptually, it answers the question:
“Given message class + number + variables, what is the rendered message string?”
This is particularly useful after an RFC call fails, where ABAP may have message context available even if the exception itself is not a clean string.
Why this matters in an RFC wrapper
In the message class–based exception configuration, the workflow can provide message metadata (class/number/type) so that SAP can behave “natively”: ABAP receives a failure (sy-subrc <> 0), formats the message using FORMAT_MESSAGE, and returns the final text in a field like EXCEPTIONMSG (and/or in BAPIRET2-MESSAGE). The result is:
- consistent wording across systems and environments
- easier localization (SAP selects language-dependent text)
- separation of concerns: code supplies variables; message content lives in message maintenance
A robust pattern
After the RFC call, I use this order of precedence:
- Use any explicit text already provided (for example via system_failure … MESSAGE exceptionmsg), because it’s already formatted.
- If that’s empty but SAP message context exists (sy-msgid / sy-msgno), call FORMAT_MESSAGE to produce the final string.
- If neither is available, fall back to a generic message that includes sy-subrc.
Here is a compact version of that pattern:
DATA: lv_text TYPE string.
CALL FUNCTION 'Z_GET_ORDERS_ANALYSIS' DESTINATION dest
IMPORTING
analysis = analysis
TABLES
it_csv = it_csv
CHANGING
return = return
EXCEPTIONS
sendexceptiontosapserver = 1
system_failure = 2 MESSAGE exceptionmsg
communication_failure = 3 MESSAGE exceptionmsg
OTHERS = 4.
IF sy-subrc <> 0.
"Prefer explicit message text if it already exists
IF exceptionmsg IS INITIAL.
"Otherwise format SAP message context into a string
IF sy-msgid IS NOT INITIAL AND sy-msgno IS NOT INITIAL.
CALL FUNCTION 'FORMAT_MESSAGE'
EXPORTING
id = sy-msgid
no = sy-msgno
v1 = sy-msgv1
v2 = sy-msgv2
v3 = sy-msgv3
v4 = sy-msgv4
IMPORTING
msg = lv_text.
exceptionmsg = lv_text.
ELSE.
exceptionmsg = |RFC failed (sy-subrc={ sy-subrc }).|.
ENDIF.
ENDIF.
"Optionally normalize into BAPIRET2 for structured consumption
return-type = 'E'.
return-message = exceptionmsg.
ENDIF.
Common gotchas
FORMAT_MESSAGEonly helps ifsy-msgidandsy-msgnoare set. If the failure did not originate from an SAP message (or message mapping is disabled), these fields may be empty—so keep a fallback.- Message numbers are typically 3-digit strings (e.g., 001, 012), matching how messages are stored in the catalog.
FORMAT_MESSAGEformats text; it does not raise or display a message. That makes it safe to use in RFC wrappers and background processing.
Bottom line: FORMAT_MESSAGE is a simple tool that helps workflow‑originated failures “land” in SAP as clean, SAP‑native messages—especially when using message classes to standardize and localize error text.
References
Agentic Logic Apps Integration with SAP - Part 2: AI Agents
Handling Errors in SAP BAPI Transactions | Microsoft Community Hub
Access SAP from workflows | Microsoft Learn
Create common SAP workflows | Microsoft Learn
Generate Schemas for SAP Artifacts via Workflows | Microsoft Learn
Parse XML using Schemas in Standard workflows - Azure Logic Apps | Microsoft Learn
Announcing XML Parse and Compose for Azure Logic Apps GA
Exception Handling | ABAP Keyword Documentation
Handling and Propagating Exceptions - ABAP Keyword Documentation
SAP .NET Connector 3.1 Overview
SAP .NET Connector 3.1 Programming Guide
All supporting content for this post may be found in the companion GitHub repository.