Implementing Pagination with the Copy Activity in Microsoft Fabric
Published Aug 31 2023 06:11 AM


APIs often return a large amount of data. Pagination allows you to retrieve a manageable subset of this data at a time, preventing unnecessary strain on resources and reducing the likelihood of timeouts or errors.

In this example, the client starts by requesting the first page of data from the server. The server responds with both the data and metadata indicating the current page and the total number of records. The client then requests subsequent pages until it reaches the last page. This approach allows for efficient data retrieval and processing without overwhelming the client or the server.
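The loop above can be sketched in plain Python. This is a minimal illustration, not the Copy activity's internals: fetch_page is a stub standing in for a real HTTP request, and the page size and total mirror the PokéAPI example used later in this post.

```python
# Sketch of the client-side pagination loop described above.
# fetch_page is a stub standing in for a real HTTP call.

def fetch_page(offset, limit=500, total=1281):
    """Stub server: one page of fake records plus paging metadata."""
    end = min(offset + limit, total)
    return {
        "count": total,
        "results": [{"name": f"record-{i}"} for i in range(offset, end)],
        "next": end if end < total else None,  # None signals the last page
    }

# Client: keep requesting pages until the server reports no next page.
records, offset = [], 0
while offset is not None:
    page = fetch_page(offset)
    records.extend(page["results"])
    offset = page["next"]

print(len(records))  # 1281 records collected across 3 requests
```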

In this walkthrough, we want to land a single file in ADLS containing all the data from the API without using extra activities such as Until or ForEach; the Copy activity alone will perform all the pagination needed to collect the data.






Prerequisites:

1. Basic knowledge of REST APIs.

2. A workspace in Microsoft Fabric.

3. An ADLS storage account.

API used: PokéAPI.


The default limit, according to the PokéAPI documentation, is 20 records per request.
In this tutorial, I want to raise that limit to 500 records per request, like so:



The initial API call will be made using the following URL:


Subsequently, the API calls will proceed as follows:
Second call:


Third call:


In each successive request, the offset value will be incremented by 500 to retrieve the next set of records.
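The sequence of calls can be sketched as follows. The base URL and the assumption that the first request starts at offset 0 (the PokéAPI default) are mine; the limit and total count come from this example.

```python
# Hypothetical sketch of the URLs issued across the paginated run.
# Assumes the standard PokéAPI base URL and a first request at offset 0.
BASE = "https://pokeapi.co/api/v2/pokemon"
LIMIT = 500
TOTAL = 1281  # total record count reported by the API

urls = [f"{BASE}?limit={LIMIT}&offset={off}" for off in range(0, TOTAL, LIMIT)]
for url in urls:
    print(url)  # three requests: offsets 0, 500, and 1000
```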

Step 1: Prepare your workspace.
In your Fabric workspace, navigate to the Data Factory experience and add a pipeline; then drag a Copy activity onto the canvas.
Follow the steps in the Microsoft documentation: Module 1 - Create a pipeline with Data Factory - Microsoft Fabric | Microsoft Learn

Step 2: Configure the Copy activity.
      1. Source settings:

         Data store type: External
         Connection: Add new -> click on the REST connection.




       Fill in the connection settings like so:



          Click on Create.
          Relative URL:




        Here I'm adding a value to the offset parameter; pageOffset is a variable that will take its value from the pagination rule.
 In Advanced, under Pagination Rules, add a value to the variable 'pageOffset':



This indicates that we will run from 1 to 1281 with an offset of 500, so each call to the API jumps ahead by 500 records, as described above.

      2. Destination settings:

          I would like to write the data as a .csv file named outputPartitioning.csv. I added my ADLS connection to my lakehouse; follow the steps in the Microsoft documentation: Create an Azure Data Lake Storage Gen2 shortcut - Microsoft Fabric | Microsoft Learn



     3. Mapping tab:

       After configuring both source and destination, we need to map our data. The data from the API comes back as JSON with these attributes:


    {
        "count": 1281,
        "next": "",
        "previous": "",
        "results": [
            {
                "name": "ivysaur",
                "url": ""
            },
            {
                "name": "venusaur",
                "url": ""
            }
        ]
    }

   We don't care about the metadata provided by the API (the count, next, and previous keys); we only want the results array.
Click on Import schemas, then fill in the collection reference:




Delete the extra results that you see below, and make sure the name and url keys are saved as String in the destination, like so:
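What the mapping does can be sketched in plain Python: ignore the top-level metadata and flatten the results array into rows of strings. The sample payload below mirrors the response shape shown earlier (URLs left blank, as in the article); the collection-reference path $['results'] is the usual value for this payload shape and is my assumption here.

```python
import json

# Sample payload shaped like the PokéAPI response (URLs left blank,
# as in the article's excerpt).
payload = json.loads("""
{
  "count": 1281,
  "next": "",
  "previous": "",
  "results": [
    {"name": "ivysaur",  "url": ""},
    {"name": "venusaur", "url": ""}
  ]
}
""")

# Collection reference $['results']: iterate the array, skip the
# count/next/previous metadata, keep name and url as strings.
rows = [(str(r["name"]), str(r["url"])) for r in payload["results"]]
print(rows[0])  # ('ivysaur', '')
```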



Step 3: Run the Copy activity.

After running the Copy activity, you should see the output file in your ADLS storage account and the activity marked as succeeded in the Fabric workspace.


   I downloaded my file from my ADLS storage account and opened it in Visual Studio.



We can see that we got 1,281 records, as promised by the API, so pagination worked.
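A quick way to double-check that count is to count the data rows in the downloaded CSV. The snippet below uses an in-memory sample in place of the real outputPartitioning.csv; on the actual file, the same logic should report 1,281 rows.

```python
import csv
import io

# In-memory sample standing in for the downloaded outputPartitioning.csv;
# swap in open("outputPartitioning.csv", newline="") for the real file.
sample = "name,url\nivysaur,\nvenusaur,\n"
rows = list(csv.reader(io.StringIO(sample)))
data_rows = len(rows) - 1  # exclude the header row
print(data_rows)  # 2 for this sample; 1281 expected for the real file
```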


Module 1 - Create a pipeline with Data Factory - Microsoft Fabric | Microsoft Learn 

Create an Azure Data Lake Storage Gen2 shortcut - Microsoft Fabric | Microsoft Learn

Documentation - PokéAPI

How to configure REST in a copy activity - Microsoft Fabric | Microsoft Learn

- Make sure to establish all connections in ADLS and in the Fabric workspace.
- Check the Microsoft documentation on pagination in the Copy activity.
- Please help us improve by sharing your valuable feedback.
- Follow me on LinkedIn: Sally Dabbah | LinkedIn
