
Call API Endpoint dynamically and download data using the copy task

MrFlinstone 761 Reputation points
2026-03-26T14:26:56.7766667+00:00

Hi All.

I have a PowerShell script that does the following:

  1. Obtain a bearer token from the Azure tenant.
  2. Take a base URL and declare an array of endpoint names.
  3. Loop through the endpoints, using the bearer token from step 1 and including the subscription key in the request to hit each endpoint.
  4. Get the JSON data from the endpoints and save it to a storage account.

What I am trying to do now is implement steps 1-4 within a Synapse pipeline. This is what my PowerShell looks like to obtain the token:
       
           $tokenBody = @{
               grant_type    = $grantType
               client_id     = $clientID
               client_secret = $clientSecret
               scope         = $scope
           }
       
           $tokenResponse = Invoke-RestMethod `
               -Method Post `
               -Uri $tokenEndpoint `
               -Body $tokenBody `
               -ContentType "application/x-www-form-urlencoded" `
               -ErrorAction Stop
       
           if (-not $tokenResponse.access_token) {
               throw "Token response did not contain an access_token."
           }
       
           $accessToken = $tokenResponse.access_token
       
           Write-Host "Token retrieved successfully." -ForegroundColor Green
       
    
This is what the other call looks like:

foreach ($resourcePath in $resourcePaths) {
    $apiEndpoint = "$baseUrl/$contextPath/$resourcePath"

    Write-Host ""
    Write-Host "Calling API endpoint: $apiEndpoint" -ForegroundColor Cyan

    $apiResponse = Invoke-RestMethod `
        -Method Get `
        -Uri $apiEndpoint `
        -Headers $headers `
        -ErrorAction Stop
}

$resourcePaths here is an array consisting of endpoint names.

From synapse pipeline perspective, this is what I have done so far.

  1. Web activity to get the client secret from Key Vault.
  2. Web activity to get the bearer token from the Azure tenant, using the client secret from step 1.
  3. Web activity to get the API subscription key.
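
For context, the token Web activity in step 2 looks roughly like this in pipeline JSON. All names here (GetToken, GetClientSecret, the pipeline parameters) are placeholders, and the body mirrors the PowerShell $tokenBody hashtable as a URL-encoded form string:

```json
{
  "name": "GetToken",
  "type": "WebActivity",
  "dependsOn": [ { "activity": "GetClientSecret", "dependencyConditions": [ "Succeeded" ] } ],
  "typeProperties": {
    "url": "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
    "method": "POST",
    "headers": { "Content-Type": "application/x-www-form-urlencoded" },
    "body": {
      "value": "@concat('grant_type=', pipeline().parameters.grantType, '&client_id=', pipeline().parameters.clientId, '&client_secret=', encodeUriComponent(activity('GetClientSecret').output.value), '&scope=', pipeline().parameters.scope)",
      "type": "Expression"
    }
  }
}
```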

Issues

With the Copy activity, I have created a ForEach loop and fed the array of endpoint names to it; the idea is that this will loop through each endpoint.

The issue I have now is how to pass the headers to the REST source and dynamically include the credentials as part of those headers. I have selected anonymous authentication; from the research I did, this is the way to go if one wants to inject headers that contain the bearer token and the subscription key.

I would be grateful for some guidance on how to do this or perhaps an example that works.

Azure Synapse Analytics

An Azure analytics service that brings together data integration, enterprise data warehousing, and big data analytics. Previously known as Azure SQL Data Warehouse.


1 answer

  1. Q&A Assist
    2026-03-26T14:27:23.59+00:00
    AI-generated content: this answer was created with AI from Q&A Assist.

    Use a REST linked service with Anonymous auth and parameterized headers, then drive it from a ForEach.

    High-level pattern in Synapse/ADF:

    1. Use Web activity to get bearer token (already done).
    2. Use Web activity (or Key Vault) to get subscription key (already done).
    3. Store both values in pipeline variables or pass them directly as expressions.
    4. Create a REST linked service with authenticationType = Anonymous.
    5. In the REST dataset (or directly in the Copy activity source), set additionalHeaders using dynamic content that injects the token and subscription key.
    6. Use a ForEach activity over the array of resourcePaths, and inside it run a Copy activity whose relative URL is built from the loop item.
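
As an outline, the pipeline's activities array would be wired along these lines (activity names are illustrative; the Copy activity's source and sink details are shown later):

```json
{
  "activities": [
    { "name": "GetClientSecret", "type": "WebActivity" },
    { "name": "GetToken", "type": "WebActivity",
      "dependsOn": [ { "activity": "GetClientSecret", "dependencyConditions": [ "Succeeded" ] } ] },
    { "name": "GetSubKey", "type": "WebActivity",
      "dependsOn": [ { "activity": "GetToken", "dependencyConditions": [ "Succeeded" ] } ] },
    {
      "name": "ForEachEndpoint",
      "type": "ForEach",
      "dependsOn": [ { "activity": "GetSubKey", "dependencyConditions": [ "Succeeded" ] } ],
      "typeProperties": {
        "items": { "value": "@pipeline().parameters.resourcePaths", "type": "Expression" },
        "activities": [ { "name": "CopyEndpointToStorage", "type": "Copy" } ]
      }
    }
  ]
}
```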

    Details from the REST connector and Web activity behavior:

    • REST connector supports Anonymous auth and custom headers via authHeaders/additionalHeaders:
      • Anonymous is appropriate when the header must carry the bearer token and subscription key instead of using built-in auth types.
      • Copy activity with REST source/sink supports custom headers via additionalHeaders.
    • Web activity and REST connector both require JSON responses; REST source supports JSON payloads.

    Example configuration outline

    1. REST linked service (JSON, via code or UI equivalent):
    {
      "name": "LS_RestApi",
      "properties": {
        "type": "RestService",
        "typeProperties": {
          "url": "https://your-base-url",  
          "authenticationType": "Anonymous"
        }
      }
    }
    
    2. REST dataset (optional) or inline source in Copy activity:

    In the Copy activity → Source → Additional headers (UI) or additionalHeaders (JSON), use dynamic content:

    "additionalHeaders": {
      "Authorization": "@concat('Bearer ', pipeline().variables.accessToken)",
      "Ocp-Apim-Subscription-Key": "@pipeline().variables.subscriptionKey"
    }
    

    Where accessToken and subscriptionKey are pipeline variables set (via Set Variable activities) from the Web activities’ outputs, for example:

    • After token Web activity (name: GetToken):
    @activity('GetToken').output.access_token
    
    • After subscription key Web activity (name: GetSubKey):
    @activity('GetSubKey').output.yourKeyField
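
    A Set Variable activity wiring one of these outputs into the accessToken variable might look like this (activity and variable names assumed; yourKeyField depends on the actual response shape):

```json
{
  "name": "SetAccessToken",
  "type": "SetVariable",
  "dependsOn": [ { "activity": "GetToken", "dependencyConditions": [ "Succeeded" ] } ],
  "typeProperties": {
    "variableName": "accessToken",
    "value": { "value": "@activity('GetToken').output.access_token", "type": "Expression" }
  }
}
```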
    
    3. ForEach over endpoints
    • Create an array parameter or variable resourcePaths that contains the endpoint names.
    • ForEach items: @pipeline().parameters.resourcePaths (or variable equivalent).
    • Inside ForEach, Copy activity source relative URL:
    "relativeUrl": "@concat('yourContextPath/', item())"
    

    The REST connector will then call:

    {base url from linked service}/{relativeUrl}

    with the headers defined in additionalHeaders, which include the bearer token and subscription key.
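
    Putting the relative URL and headers together, the Copy activity inside the ForEach could be sketched as follows. This assumes a REST dataset named DS_RestApi that exposes a relativeUrl parameter; the dataset name, parameter name, and context path are all placeholders:

```json
{
  "name": "CopyEndpointToStorage",
  "type": "Copy",
  "inputs": [ {
    "referenceName": "DS_RestApi",
    "type": "DatasetReference",
    "parameters": {
      "relativeUrl": { "value": "@concat('yourContextPath/', item())", "type": "Expression" }
    }
  } ],
  "typeProperties": {
    "source": {
      "type": "RestSource",
      "requestMethod": "GET",
      "additionalHeaders": {
        "Authorization": "@concat('Bearer ', variables('accessToken'))",
        "Ocp-Apim-Subscription-Key": "@variables('subscriptionKey')"
      }
    },
    "sink": { "type": "JsonSink" }
  }
}
```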

    4. Sink to storage
    • Configure the Copy activity sink as your storage (e.g., Azure Blob/ADLS) and use dynamic file names based on item() so each endpoint’s data is written to a separate file.
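
    For example, if the sink dataset exposes a fileName parameter (parameter name assumed), the Copy activity's sink dataset reference could pass a per-endpoint file name:

```json
"parameters": {
  "fileName": { "value": "@concat(item(), '.json')", "type": "Expression" }
}
```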

    This approach reproduces the PowerShell logic:

    • Token acquisition → Web activity.
    • Header construction → dynamic additionalHeaders using pipeline expressions.
    • Loop over endpoints → ForEach with item().
    • Download JSON and save to storage → Copy activity REST source → storage sink.


