Cannot get data from azure data lake gen2 Common Data Model

OUADAH Djilali 25 Reputation points
2026-03-31T14:44:22.3833333+00:00

Hello,

I tried to export data from my Data Lake Storage account with a data flow in Azure Data Factory. The schema looks correct and the rows appear, but none of the fields contain any data:

[screenshot]

I have tried with schema drift enabled and without it, and the configuration seems fine:

[screenshot]

The only thing that bothers me is that I don't see a "Common Data Model" option, so I selected this option instead, which looks really similar:

[screenshots]

The files are updated automatically by the Microsoft Power Platform environment:

[screenshot]

Thank you for your help!

Best regards,

Djilali.

Azure Data Factory

An Azure service for ingesting, preparing, and transforming data at scale.


Answer accepted by question author
  1. Pilladi Padma Sai Manisha 6,430 Reputation points Microsoft External Staff Moderator
    2026-03-31T23:56:06.92+00:00

    Hey OUADAH Djilali,

    It looks like the reason you're getting rows full of NULLs is that you've hooked up the wrong inline dataset type in your Mapping Data Flow. "Common Data Service" is the Dataverse/OData connector; it won't actually read your CDM model.json and its CSV partitions in ADLS Gen2. To pull real values from a Power Platform-generated CDM folder, you need to use the Common Data Model file format.

    Here’s what I’d try:

    1. In your Data Flow, edit the Source transform and choose Inline dataset.
    2. For Type of inline dataset, pick File, then under Format select Common Data Model.
      • If you don’t see “Common Data Model” in the drop-down, choose an ADLS Gen2 inline dataset and then set Format to Common Data Model.
    3. Point the linked service to your Data Lake Storage Gen2 account.
    4. Under Source options → Metadata format, pick Model.json (or try Manifest if you have a manifest file).
    5. In Root location, browse exactly to the folder containing your model.json (not to a parent or child folder).
    6. Expand Entity reference and pick your entity (crf19_registration_distributor).
    7. Switch to the Projection tab and click Import Projection (or turn on "Infer drifted column types" if you need to tweak date/time formats).
    8. Go to Data preview and you should now see actual values in the columns.
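    Before wiring up the data flow, it can help to sanity-check what the CDM folder actually declares. Here's a minimal sketch (not an official tool) that lists the entities and data partition locations in a model.json; it assumes the standard model.json fields (`entities`, `name`, `partitions`, `location`) and a locally downloaded copy of the file:

```python
import json

def list_cdm_entities(model_json_path):
    """Print each entity in a CDM model.json together with its data partition locations."""
    with open(model_json_path, encoding="utf-8") as f:
        model = json.load(f)
    for entity in model.get("entities", []):
        print(entity.get("name"))
        for partition in entity.get("partitions", []):
            print("  partition:", partition.get("location"))
```

    If the entity you expect (crf19_registration_distributor) shows up with no partitions, the metadata is the problem rather than the data flow configuration.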

    A few additional troubleshooting tips if you still get NULLs:

    • Make sure the CSV partitions in your CDM folder all have the correct number of delimiters (N–1 commas for N columns).

    • If your model.json partition URIs include unsupported characters (like @snapshot=…), remove that from both the file name and model.json.

    • Confirm your ADF/Synapse managed identity or service principal has at least “Storage Blob Data Reader” on the container.

    Give that a shot and let me know what you see!


    Note: This content was drafted with the help of an AI system. Please verify the information before relying on it for decision-making.


1 additional answer

  1. Q&A Assist
    2026-03-31T14:44:41.57+00:00
    AI generated content. This answer was created with AI from Q&A Assist.

    The behavior (schema visible but all values NULL) matches a known Common Data Model issue in Azure Data Factory mapping data flows when the CDM folder only contains physical models and not a logical model.

    In this case, the CDM source can read the entity definition and build the projection, but it cannot resolve the data partitions correctly, so no data rows are returned.

    To fix the issue:

    1. Ensure the CDM folder is separated into logical and physical models.
      • The logical entity definition should describe the entity and attributes.
      • The physical (resolved) entities and their data partitions should be described in a manifest that points to the actual CSV files.
    2. Use a logical model as the entity reference in the data flow source:
      • In Source → Source options, keep Metadata format set to Model.json (or Manifest if applicable).
      • Point to the manifest/model.json that contains the logical entity definition.
      • Ensure that this manifest also describes the locations of the resolved entities and their data partitions.
    3. Validate that the data partition paths in the manifest/model.json point to ADLS Gen2 (*.dfs.core.windows.net) and not to blob (*.blob.core.windows.net). Incorrect corpus paths can also prevent data from being read.

    If the current export from Power Platform only produces physical models, regenerate or adjust the CDM metadata so that a logical model and corresponding manifest are present, then reconfigure the ADF source to use that logical model.
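    Step 3 above can also be scripted. A hypothetical helper (assuming the model.json has already been loaded into a dict) that flags partition locations pointing at the blob endpoint instead of the ADLS Gen2 dfs endpoint:

```python
def find_blob_endpoints(model):
    """Return partition locations that point to *.blob.core.windows.net
    instead of *.dfs.core.windows.net."""
    bad = []
    for entity in model.get("entities", []):
        for partition in entity.get("partitions", []):
            location = partition.get("location", "")
            if ".blob.core.windows.net" in location:
                bad.append(location)
    return bad
```

    Any locations it returns would need their corpus paths corrected before the CDM source can read the data.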


