This tutorial shows you how to run an existing SSIS package that writes files to Azure Data Lake Storage (ADLS) Gen2, and then surface those files in OneLake by using a shortcut. By combining the Invoke SSIS Package activity in Data Factory for Microsoft Fabric with OneLake shortcuts, you can centralize all your data in OneLake - even data produced by legacy SSIS workloads.
Use case
Many organizations have SSIS packages that extract and transform data, then write the results as flat files (CSV, Parquet, XML, and others) to Azure Data Lake Storage Gen2. These files are consumed by downstream analytics and reporting systems.
With Microsoft Fabric, you can bring those files into OneLake without changing your SSIS package logic:
- Preserve existing SSIS investments - Continue using battle-tested packages that write files to ADLS Gen2 through the Azure Storage connection manager. No package rewrite is required.
- Centralize data in OneLake - Create an ADLS Gen2 shortcut in a Fabric lakehouse so that files written by SSIS appear automatically in OneLake, ready for consumption by Spark, SQL, Power BI, and other Fabric workloads.
- Orchestrate in Fabric - Use the Invoke SSIS Package activity in a Fabric pipeline to schedule and monitor package execution alongside other Fabric-native activities.
Prerequisites
Before you begin, make sure you have:
- A Microsoft Fabric workspace with a Fabric capacity or trial.
- A lakehouse in the workspace.
- An Azure Data Lake Storage Gen2 storage account with hierarchical namespace enabled.
- An SSIS package (.dtsx) that uses an Azure Storage connection manager to write files to ADLS Gen2.
- Credentials for the ADLS Gen2 account - for example, an account key, shared access signature (SAS), service principal, or organizational account - with at least the Storage Blob Data Contributor role.
Overview
The end-to-end workflow has four steps:
| Step | What you do | Result |
|---|---|---|
| 1 | Configure the SSIS package to write files to ADLS Gen2 | Package produces output files in your storage account |
| 2 | Create an ADLS Gen2 shortcut in a Fabric lakehouse | Files written to ADLS Gen2 appear in OneLake automatically |
| 3 | Upload the SSIS package to OneLake | Package is stored in OneLake and ready to be invoked |
| 4 | Run the package from a Fabric pipeline | Pipeline orchestrates execution; output files surface in OneLake through the shortcut |
Step 1 - Configure the SSIS package to write files to ADLS Gen2
In this step you make sure your SSIS package uses an Azure Storage connection manager to write files to your ADLS Gen2 account.
Open your SSIS project in Visual Studio with the SQL Server Integration Services Projects extension.
Install the Azure Feature Pack for Integration Services (SSIS). The Feature Pack provides the Azure Storage connection manager, Azure Blob Source, Azure Blob Destination, and other Azure-related tasks and components needed to connect to ADLS Gen2 from an SSIS package.
In the Connection Managers tray, add (or verify) an Azure Storage connection manager. Set the following properties:
| Property | Value |
|---|---|
| Service | ADLS Gen2 |
| Authentication | Choose one: AccessKey, ServicePrincipal, or SharedAccessSignature |
| Account name | Your ADLS Gen2 storage account name |
Configure your data flow or file system task to use this connection manager and write output files to a container and folder path in the storage account - for example, mycontainer/myfolder.
Test the connection and verify that the package runs correctly on your local machine.
For full details on the Azure Storage connection manager, see Azure Storage connection manager.
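The output location the connection manager writes to can also be expressed as standard ADLS Gen2 endpoints, which is useful when validating the setup from outside SSIS. A minimal sketch of the two URI forms, assuming placeholder account, container, and folder names (not values from your environment):

```python
# Sketch: derive the ADLS Gen2 endpoints for the SSIS output location.
# Account, container, and folder names are placeholders.

def dfs_endpoint(account_name: str) -> str:
    """DFS endpoint used by the Azure Storage connection manager and by shortcuts."""
    return f"https://{account_name}.dfs.core.windows.net"

def abfss_uri(account_name: str, container: str, folder: str = "") -> str:
    """abfss:// URI for the container/folder the package writes to."""
    base = f"abfss://{container}@{account_name}.dfs.core.windows.net"
    return f"{base}/{folder.strip('/')}" if folder else base

print(dfs_endpoint("mystorageacct"))
# https://mystorageacct.dfs.core.windows.net
print(abfss_uri("mystorageacct", "mycontainer", "myfolder"))
# abfss://mycontainer@mystorageacct.dfs.core.windows.net/myfolder
```

The DFS endpoint is the value you reuse later when creating the shortcut connection.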
Tip
If your package uses the DontSaveSensitive protection level, credentials aren't persisted in the package file; you supply them at runtime through the Connection Managers tab of the Invoke SSIS Package activity. Alternatively, set the package protection level to EncryptSensitiveWithPassword, which encrypts credentials inside the package. You then provide the package password in the Invoke SSIS Package activity (Step 4) instead of supplying individual connection manager credentials.
Step 2 - Create an ADLS Gen2 shortcut in a Fabric lakehouse
A shortcut makes the files written by your SSIS package visible in OneLake without copying data. Any Fabric workload - Spark, SQL analytics endpoint, Power BI - can read the files through the shortcut.
Open your lakehouse in the Fabric portal.
In the Explorer pane, right-click the Files folder (or a subfolder) and select New shortcut.
Under External sources, select Azure Data Lake Storage Gen2.
Enter the connection URL - the DFS endpoint for your storage account:

https://<STORAGE_ACCOUNT_NAME>.dfs.core.windows.net

Select an existing connection or create a new one. Choose an authentication kind that has at least the Storage Blob Data Reader role on the storage account.
Select Next, then browse to the container and folder where your SSIS package writes files (for example, mycontainer).
Select the target folder, then select Next → Create.
The shortcut now appears in your lakehouse. Any file that the SSIS package writes to the ADLS Gen2 target folder is automatically accessible in OneLake through this shortcut.
For detailed instructions, see Create an Azure Data Lake Storage Gen2 shortcut. For more information about shortcuts, see OneLake shortcuts.
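Once the shortcut exists, every file the package writes becomes addressable by a OneLake path as well. A sketch of how an ADLS-relative file path maps to its OneLake abfss equivalent; the workspace, lakehouse, shortcut, and file names are hypothetical placeholders:

```python
# Sketch: map a file written by SSIS to its OneLake path through the shortcut.
# Workspace, lakehouse, shortcut, and file names are hypothetical.

def onelake_path(workspace: str, lakehouse: str, shortcut: str, rel_path: str) -> str:
    """OneLake abfss path for a file reached through a lakehouse Files shortcut."""
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/Files/{shortcut}/{rel_path.lstrip('/')}")

print(onelake_path("MyWorkspace", "MyLakehouse", "ssis-output", "sales/2024.csv"))
# abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/MyLakehouse.Lakehouse/Files/ssis-output/sales/2024.csv
```

Spark or other ADLS-compatible clients can read the file at this path without any copy having taken place.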
Step 3 - Upload the SSIS package to OneLake
The Invoke SSIS Package activity reads packages from OneLake. Upload your .dtsx file (and optional .dtsConfig file) to a lakehouse.
In the Fabric portal, open the lakehouse where you want to store the package.
In the Files section, create a folder - for example, ssis-packages.
Upload the package by using one of these methods:

| Method | How |
|---|---|
| Fabric portal | Select Upload → Upload files and choose your .dtsx file. |
| OneLake file explorer | Drag and drop the file into the ssis-packages folder through the OneLake file explorer on your desktop. |
For more information about uploading files to OneLake, see the Invoke SSIS Package activity documentation.
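Before uploading, it can help to sanity-check that the .dtsx file is well-formed XML and declares the connection manager you expect. A minimal sketch using only the standard library; the sample XML and the AzureStorageCM name are illustrative, but the DTS namespace is the standard SSIS package namespace:

```python
# Sketch: sanity-check a .dtsx file before uploading it to OneLake.
# The sample XML and connection manager name are illustrative.
import xml.etree.ElementTree as ET

DTS_NS = "www.microsoft.com/SqlServer/Dts"  # standard SSIS package namespace

def connection_manager_names(dtsx_xml: str) -> list[str]:
    """Return the names of connection managers declared in a package."""
    root = ET.fromstring(dtsx_xml)
    tag = f"{{{DTS_NS}}}ConnectionManager"
    attr = f"{{{DTS_NS}}}ObjectName"
    return [cm.get(attr, "") for cm in root.iter(tag)]

sample = (
    '<DTS:Executable xmlns:DTS="www.microsoft.com/SqlServer/Dts">'
    '<DTS:ConnectionManagers>'
    '<DTS:ConnectionManager DTS:ObjectName="AzureStorageCM"/>'
    '</DTS:ConnectionManagers>'
    '</DTS:Executable>'
)
print(connection_manager_names(sample))  # ['AzureStorageCM']
```

In practice you would pass the contents of your real .dtsx file instead of the inline sample.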
Step 4 - Run the package in a Fabric pipeline
In your Fabric workspace, create a new Data Pipeline or open an existing one.
From the Activities pane, add the Invoke SSIS Package activity to the pipeline canvas.
On the Settings tab, configure the activity:
| Setting | Value |
|---|---|
| Package path | Browse to the .dtsx file you uploaded in Step 3. |
| Configuration path (optional) | Browse to the .dtsConfig file, if applicable. |
| Encryption password (optional) | If the package protection level is EncryptSensitiveWithPassword or EncryptAllWithPassword, provide the password used to encrypt the package. |
| Enable logging | Select to write execution logs to OneLake. |

Select Save, then select Run to execute the pipeline immediately, or select Schedule to set up recurring execution.
Monitor progress in the pipeline Output tab or the workspace Monitor hub. If logging is enabled, the activity output includes the logging path on OneLake.
For full configuration details, see Use the Invoke SSIS Package activity to run an SSIS package.
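Besides the portal's Run and Schedule buttons, a pipeline can also be triggered programmatically. A hedged sketch against the Fabric REST API's on-demand job endpoint, as documented at the time of writing; the workspace ID, pipeline item ID, and bearer token are placeholders you must supply:

```python
# Sketch: trigger an on-demand pipeline run via the Fabric REST API.
# Workspace/item IDs and the token are placeholders; a 202 response means accepted.
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def run_pipeline_url(workspace_id: str, pipeline_id: str) -> str:
    """URL of the on-demand job instance endpoint for a pipeline item."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")

def run_pipeline(workspace_id: str, pipeline_id: str, token: str) -> int:
    """POST an on-demand run request; returns the HTTP status code."""
    req = urllib.request.Request(
        run_pipeline_url(workspace_id, pipeline_id),
        data=json.dumps({}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    print(run_pipeline_url("<WORKSPACE_ID>", "<PIPELINE_ID>"))
```

This is useful when an external scheduler, rather than Fabric itself, decides when the SSIS package should run.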
Verify the results
After the pipeline run completes successfully:
- Open the lakehouse and navigate to the shortcut you created in Step 2.
- Confirm that the output files written by the SSIS package appear in the shortcut folder.
Summary
By combining a few Fabric capabilities, you can bring file-based SSIS output into OneLake without modifying your existing packages:
- Azure Storage connection manager writes files to ADLS Gen2 from within your SSIS package.
- OneLake shortcut surfaces those files in a Fabric lakehouse - no data copy required.
- Package upload to OneLake makes the .dtsx file available for Fabric pipeline execution.
- Invoke SSIS Package activity orchestrates and monitors package execution in a Fabric pipeline.
This pattern lets you manage all your data in OneLake while preserving your existing SSIS investments.