
Is there a way to create datasets for multiple tables together?

Justin Doh 1,000 Reputation points
2026-03-24T20:50:26.7133333+00:00

I have a question for Azure Data Factory.

I am trying to create datasets for 50 tables in a SQL Server database on a VM and move the data into the 50 corresponding tables in an Azure SQL database.

While creating the datasets, is there a way to select all tables in the source database (the VM SQL database) at once, instead of creating a dataset for each table one by one?

Or is there a way to bulk-copy all of the tables together into the destination database?

Azure Data Factory

An Azure service for ingesting, preparing, and transforming data at scale.


Answer accepted by question author
  1. Pilladi Padma Sai Manisha 6,430 Reputation points Microsoft External Staff Moderator
    2026-03-24T22:45:23.96+00:00

    Hi Justin Doh,
    Yes, there is a way to handle this efficiently in Azure Data Factory without creating 50 datasets manually.

    ADF does not support selecting all tables in a single dataset, as each dataset is designed to point to one table or structure. However, you can achieve bulk processing by using a dynamic and parameterized approach.

    Create one source dataset and one sink dataset, and add a parameter such as TableName. Use this parameter inside the dataset so that the table name can be passed dynamically at runtime.
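    As a rough sketch (the dataset and linked-service names here are placeholders, and the exact type properties depend on your connector version), a parameterized Azure SQL dataset could look like:

    ```json
    {
        "name": "SinkTableDataset",
        "properties": {
            "type": "AzureSqlTable",
            "linkedServiceName": {
                "referenceName": "AzureSqlDatabaseLinkedService",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "TableName": { "type": "String" }
            },
            "typeProperties": {
                "tableName": {
                    "value": "@dataset().TableName",
                    "type": "Expression"
                }
            }
        }
    }
    ```

    A matching source dataset is defined the same way against the SQL Server linked service, so one pair of datasets can serve all 50 tables.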

    Then use a Lookup activity to retrieve the list of all tables from your source database. You can run a query like:

    SELECT TABLE_SCHEMA, TABLE_NAME
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'BASE TABLE';

    After that, use a ForEach activity to iterate through the list of tables returned by the Lookup. Inside the loop, add a Copy Activity and pass the table name dynamically to both the source and destination datasets. This way, each iteration copies one table, and all tables get migrated automatically.
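    The Lookup → ForEach → Copy pattern above can be sketched as pipeline JSON. This is a minimal illustration, not a complete definition: the activity and dataset names are placeholders, and the Lookup passes a placeholder table name because its query overrides the dataset's table:

    ```json
    {
        "name": "CopyAllTablesPipeline",
        "properties": {
            "activities": [
                {
                    "name": "GetTableList",
                    "type": "Lookup",
                    "typeProperties": {
                        "source": {
                            "type": "SqlServerSource",
                            "sqlReaderQuery": "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'"
                        },
                        "dataset": {
                            "referenceName": "SourceTableDataset",
                            "type": "DatasetReference",
                            "parameters": { "TableName": "placeholder" }
                        },
                        "firstRowOnly": false
                    }
                },
                {
                    "name": "CopyEachTable",
                    "type": "ForEach",
                    "dependsOn": [
                        { "activity": "GetTableList", "dependencyConditions": [ "Succeeded" ] }
                    ],
                    "typeProperties": {
                        "items": {
                            "value": "@activity('GetTableList').output.value",
                            "type": "Expression"
                        },
                        "activities": [
                            {
                                "name": "CopyOneTable",
                                "type": "Copy",
                                "inputs": [
                                    {
                                        "referenceName": "SourceTableDataset",
                                        "type": "DatasetReference",
                                        "parameters": { "TableName": "@item().TABLE_NAME" }
                                    }
                                ],
                                "outputs": [
                                    {
                                        "referenceName": "SinkTableDataset",
                                        "type": "DatasetReference",
                                        "parameters": { "TableName": "@item().TABLE_NAME" }
                                    }
                                ],
                                "typeProperties": {
                                    "source": { "type": "SqlServerSource" },
                                    "sink": { "type": "AzureSqlSink" }
                                }
                            }
                        ]
                    }
                }
            ]
        }
    }
    ```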

    This is the recommended and scalable approach for handling multiple tables. There is no built-in option in ADF to select or copy all tables at once directly, but this method effectively achieves the same result.

    References:
    Metadata-driven copy pipelines in ADF
    ADF dynamic pipeline example (Q&A guidance)
    ForEach activity in Azure Data Factory

    Hope this helps!


1 additional answer

  1. Q&A Assist
    2026-03-24T20:50:47.29+00:00
    AI-generated content. This answer was created with AI from Q&A Assist.

    Yes. Instead of creating 50 separate datasets manually, use a parameterized dataset and drive the table names from a pipeline (typically via a ForEach activity).

    Key idea:

    • Create one source dataset and one sink dataset.
    • Do not hard-code the table name in the dataset; make it a parameter.
    • Use a pipeline (with ForEach) to iterate over the list of table names and pass each table name into the dataset at runtime.

    Example pattern from the documentation:

    1. Source dataset (SQL Server on VM)
      • Create a dataset pointing to the SQL Server linked service.
      • Do not select a table; the Copy activity will use a SQL query or dynamic table name.
      • This is shown as SourceDataset where the Copy activity uses a SQL query instead of a fixed table:
        • “The Copy activity in the pipeline uses a SQL query to load the data rather than load the entire table.”
    2. Sink dataset (Azure SQL Database)
      • Create a dataset pointing to the Azure SQL Database linked service.
      • Add a parameter for the table name, for example SinkTableName.
      • On the Connection tab, set Table to dynamic content @dataset().SinkTableName.
      • At runtime, the pipeline sets SinkTableName for each iteration.
      From the portal tutorial:
      • A parameter SinkTableName is defined on the dataset.
      • In Connection, the table is set via dynamic content so the dataset can be reused for multiple tables.
    3. Pipeline with ForEach
      • Build a list of the 50 table names (e.g., in a Lookup activity or hard-coded array).
      • Use a ForEach activity to iterate over that list.
      • Inside ForEach, use a Copy activity that:
        • Uses the shared source dataset.
        • Uses the parameterized sink dataset and passes the current table name to SinkTableName.
    4. Bulk-copy pattern
      • The bulk-copy tutorial uses the same concept: one source dataset with a dummy table and one sink dataset whose tableName is set as a parameter. The ForEach activity passes the actual table name at runtime.
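    Inside the ForEach, the Copy activity's source can build its query from the current item. A minimal sketch, assuming the Lookup returned TABLE_SCHEMA and TABLE_NAME columns (bracketed identifiers guard against special characters in names):

    ```json
    {
        "source": {
            "type": "SqlServerSource",
            "sqlReaderQuery": {
                "value": "@concat('SELECT * FROM [', item().TABLE_SCHEMA, '].[', item().TABLE_NAME, ']')",
                "type": "Expression"
            }
        },
        "sink": { "type": "AzureSqlSink" }
    }
    ```

    The sink side passes the same value into the parameterized dataset, for example SinkTableName set to @concat(item().TABLE_SCHEMA, '.', item().TABLE_NAME).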

    This approach lets a single pair of datasets handle all 50 tables, avoiding manual creation of 50 separate datasets.

