amit_kothari
Employee

Overview

Incorta’s new Databricks data destination enables you to push curated, enriched, or transformed datasets directly into Databricks.

Supported Incorta Versions: 2024.7.6 and above

Configuration

To enable the Databricks data destination, complete the following configuration in both Databricks and Incorta:

  1. Ensure that Incorta is running on Azure ADLS.
  2. Ensure that the Databricks connector is installed in Incorta.
  3. Confirm the following prerequisites:
  • A working Azure Databricks cluster
  • A Personal Access Token (PAT) with access to the Databricks cluster
  • A Databricks catalog to write to
  • A Shared Access Signature (SAS) with Read and List permissions on the Incorta tenant folder, specifically the source folder. The load process creates a temporary folder under the source folder and reads the schemas from there.
  • In Databricks, access to the system catalog and its information schema (Unity Catalog privilege syntax; `SELECT` on a schema covers all current and future views):
    GRANT USE CATALOG ON CATALOG system TO `your_principal`;
    GRANT USE SCHEMA ON SCHEMA system.information_schema TO `your_principal`;
    GRANT SELECT ON SCHEMA system.information_schema TO `your_principal`;
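The prerequisites above can be sanity-checked before configuring anything in Incorta. The sketch below is a minimal pre-flight check against the Databricks Unity Catalog REST API; the workspace URL, token, and catalog name are placeholders, not values from this article.

```python
# Pre-flight check for the prerequisites above. Workspace URL, PAT, and
# catalog name below are illustrative placeholders.
import urllib.request
import urllib.error

def auth_request(workspace_url: str, token: str, path: str) -> urllib.request.Request:
    """Build an authenticated request against the Databricks REST API."""
    return urllib.request.Request(
        f"{workspace_url.rstrip('/')}{path}",
        headers={"Authorization": f"Bearer {token}"},
    )

def check_catalog(workspace_url: str, token: str, catalog: str) -> bool:
    """Return True if the PAT can see the target catalog via the Unity Catalog API."""
    req = auth_request(workspace_url, token, f"/api/2.1/unity-catalog/catalogs/{catalog}")
    try:
        with urllib.request.urlopen(req, timeout=10):
            return True
    except urllib.error.HTTPError:
        return False

if __name__ == "__main__":
    # Replace with your workspace URL, PAT, and catalog before running.
    ok = check_catalog("https://adb-1234.5.azuredatabricks.net", "dapiXXXX", "incorta_catalog")
    print("catalog reachable:", ok)
```

Running this from the Incorta host also confirms that network egress to the workspace is open, which matters for the 403 troubleshooting step later in this article.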
    
    
Create the Databricks Data Destination

  • Here is a sample screenshot of the Databricks data destination configuration:
[Screenshot: Databricks data destination configuration]
  • Please refer to the Incorta documentation to understand all the properties of a Databricks data destination.
  • Test and Configure the Data Destination

    1. Test the Connection and save the data destination.

      • If you encounter an error such as:
        HTTP 403 — Unauthorized network access to workspace: 754120970557878,
        this indicates that the Databricks workspace is restricted by front-end IP allowlists, authorized networks, or Private Link settings. In other words, the workspace can only be accessed from approved source IPs or through Private Link, and your client’s egress IP is not currently authorized.
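A connection-test failure can be triaged quickly from its HTTP status code. The sketch below maps common statuses to likely causes; the hint text is illustrative, not official Databricks error wording.

```python
# Map HTTP status codes from a failed Databricks connection test to a
# likely cause. Hint text is illustrative, not official Databricks output.
def diagnose_connection_error(status: int) -> str:
    hints = {
        401: "Invalid or expired Personal Access Token.",
        403: ("Workspace network restriction: a front-end IP allowlist, "
              "authorized networks, or Private Link is blocking this egress IP."),
        404: "Workspace URL or API path is wrong.",
    }
    return hints.get(status, f"Unexpected HTTP status {status}; check workspace logs.")

print(diagnose_connection_error(403))
```

For the 403 case, ask your Databricks workspace admin to add the Incorta host's egress IP to the workspace IP access list, or route the connection through the approved Private Link path.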

    2. Set the Data Destination for the physical schema that will push tables to Databricks:

      • In the Navigation bar, go to Schema.

      • In the Action bar, select Settings (gear icon) → Set Data Destination.

      • In the Set Schema Destination for <SCHEMA_NAME> dialog:

        • Choose the desired Schema Destination.

        • Enter the Target Schema Name (the name of the schema to create in Databricks).

          • By default, this value matches the schema name in Incorta.

    3. Load the schema and verify that the tables appear correctly in Databricks.
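One way to verify the load is to query the target catalog's information schema from a Databricks notebook or SQL warehouse. The sketch below builds that query; the catalog and schema names are placeholders.

```python
# Sketch: verify that the pushed tables landed in Databricks by listing
# them from information_schema. Catalog/schema names are placeholders; run
# the resulting SQL in a Databricks notebook or SQL warehouse.
def verification_query(catalog: str, schema: str) -> str:
    """Build a SQL query listing the tables Incorta created in the target schema."""
    return (
        "SELECT table_name, created "
        f"FROM {catalog}.information_schema.tables "
        f"WHERE table_schema = '{schema}' "
        "ORDER BY table_name"
    )

print(verification_query("incorta_catalog", "sales"))
```

The result set should contain one row per physical-schema table that Incorta pushed; compare the `created` timestamps against the schema load time to confirm the load you just ran produced them.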

Last update: 10-08-2025 04:03 PM