JDBC-ODBC Bridge Connector for Azure Data Factory (Pipeline)

In this article you will learn how to integrate JDBC-ODBC Bridge data in Azure Data Factory (Pipeline) without coding, in just a few clicks (live / bi-directional connection to JDBC-ODBC Bridge). The JDBC-ODBC Bridge driver lets non-Java applications (e.g. Excel, Power BI, C#) consume data from any JDBC driver. This matters because many applications written in C++ or .NET (e.g. Excel, Power BI, Informatica) have no direct support for Java-based JDBC driver technology.

Using the JDBC-ODBC Bridge Connector you will be able to connect, read, and write data from within Azure Data Factory (Pipeline). Follow the steps below to see how to accomplish that.

Download Documentation

Create ODBC Data Source (DSN) based on ZappySys JDBC Driver

Step-by-step instructions

To get data from JDBC-ODBC Bridge using Azure Data Factory (Pipeline), we first need to create a DSN (Data Source Name) which will access data from JDBC-ODBC Bridge. We will later read that data from Azure Data Factory (Pipeline). Perform these steps:

  1. Install ZappySys ODBC PowerPack.

  2. Open ODBC Data Sources (x64):
    Open ODBC Data Source

  3. Create a System Data Source (System DSN) based on ZappySys JDBC Bridge Driver

    ZappySys JDBC Bridge Driver
    Create new System DSN for ZappySys JDBC Bridge Driver
    Create a System DSN (instead of a User DSN) if the client application is launched under a Windows system account, e.g. as a Windows Service. If the client application is 32-bit (x86) and uses a System DSN, use ODBC Data Sources (32-bit) instead of the 64-bit version. You may create a User DSN instead, but then the connection cannot be used from Windows Services (or any other application running under a Windows system account).
  4. Now we need a JDBC Bridge connection, so let's create it. When the DSN Config Editor with the ZappySys logo appears, first change the default DSN Name at the top, then configure the JDBC Bridge driver. Enter credentials (in this example we use PostgreSQL credentials) and click Test Connection to verify them.

    Note: Enter credentials for whichever JDBC driver you are using; you can read data from any JDBC driver.
    ODBC JDBC Bridge Driver - Create Connection

  5. This example shows how to write a simple SOQL query (JDBC Bridge Object Query Language); you can also add a WHERE clause to filter rows. For more SOQL queries click here.
    SOQL is similar to the SQL query language used by databases, but much simpler; many features you would use in a database query are not supported in SOQL (for example, the JOIN clause). You can, however, use queries for Insert, Update, Delete, and Upsert (update, or insert the record if not found).

    SELECT orderid, customerid, orderdate, orderamount FROM "public"."zappysys"
    ZappySys ODBC Driver - Select Table and Preview Data
  6. Click OK to finish creating the data source.
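Once the DSN exists, any ODBC-capable client can read through it. As a quick sanity check outside Azure Data Factory, here is a minimal Python sketch using pyodbc; the DSN name `JdbcOdbcBridgeDSN` and the table name are illustrative and should match what you configured above:

```python
def make_conn_str(dsn: str) -> str:
    """Build the ODBC connection string for a named DSN (same format ADF uses)."""
    return f"DSN={dsn}"

def read_orders(dsn: str = "JdbcOdbcBridgeDSN"):
    """Fetch rows through the DSN. Requires pyodbc and the DSN configured on this machine."""
    import pyodbc  # imported lazily so make_conn_str works even without the driver installed
    # Query mirrors the example from step 5; table name is illustrative.
    sql = 'SELECT orderid, customerid, orderdate, orderamount FROM "public"."zappysys"'
    with pyodbc.connect(make_conn_str(dsn)) as conn:
        return conn.cursor().execute(sql).fetchall()
```

If Test Connection succeeded in the DSN editor, `read_orders()` should return the same rows you saw in the preview grid.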

Video instructions

Read data in Azure Data Factory (ADF) from ODBC datasource (JDBC-ODBC Bridge)

  1. To start, press the New button:

    Create new Self-Hosted integration runtime
  2. Select "Azure, Self-Hosted" option:

    Create new Self-Hosted integration runtime
  3. Select "Self-Hosted" option:

    Create new Self-Hosted integration runtime
  4. Set a name; we will use "OnPremisesRuntime":

    Set a name for IR
  5. Download and install Microsoft Integration Runtime.

  6. Launch the Integration Runtime and copy/paste the Authentication Key from the Integration Runtime configuration in the Azure Portal:

    Copy/paste Authentication Key
  7. After registering the Integration Runtime node, you should see a view similar to this:

    Check Integration Runtime node status
  8. Go back to Azure Portal and finish adding new Integration Runtime. You should see it was successfully added:

    Integration Runtime status
  9. Go to Linked services section and create a new Linked service based on ODBC:

    Add new Linked service
  10. Select "ODBC" service:

    Add new ODBC service
  11. Configure the new ODBC service. Use the same DSN name we used in the previous step and copy it into the Connection string box:

    JdbC-OdbcBridgeDSN
    DSN=JdbC-OdbcBridgeDSN
    Configure new ODBC service
  12. For the created ODBC service, create an ODBC-based dataset:

    Add new ODBC dataset
  13. Go to your pipeline and add the Copy data activity into the flow. In the Source section, use the OdbcDataset we created as the source dataset:

    Set source in Copy data
  14. Then go to the Sink section and select a destination (sink) dataset. In this example we use a pre-created AzureBlobStorageDataset which saves data into an Azure Blob:

    Set sink in Copy data
  15. Finally, run the pipeline and watch the data being transferred from OdbcDataset to your destination dataset:

    Run the flow
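The linked service you configured in steps 9-11 can also be expressed as JSON, which is how Azure Data Factory stores it behind the UI. The sketch below builds that definition in Python; the service, DSN, and runtime names are the illustrative ones used in the steps above and should be adjusted to match your own:

```python
import json

# Sketch of the ODBC linked service definition behind steps 9-11.
# Names ("OdbcLinkedService", the DSN, "OnPremisesRuntime") are the examples
# used in this article; substitute your own values.
linked_service = {
    "name": "OdbcLinkedService",
    "properties": {
        "type": "Odbc",
        "typeProperties": {
            # The same connection string entered in step 11; ADF hands it to the ODBC driver.
            "connectionString": "DSN=JdbC-OdbcBridgeDSN",
        },
        "connectVia": {
            # The self-hosted Integration Runtime created in step 4.
            "referenceName": "OnPremisesRuntime",
            "type": "IntegrationRuntimeReference",
        },
    },
}

print(json.dumps(linked_service, indent=2))
```

Because the self-hosted Integration Runtime runs on the machine where the DSN was created, the `connectVia` reference is what lets the cloud pipeline reach your local ODBC data source.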

Conclusion

In this article we discussed how to connect to JDBC-ODBC Bridge in Azure Data Factory (Pipeline) and integrate data without any coding. Click here to download the JDBC-ODBC Bridge Connector for Azure Data Factory (Pipeline) and see for yourself how easy it is. If you still have any questions, ask here, or simply click the live chat icon (bottom-right corner of this page) and ask our expert.

Download JDBC-ODBC Bridge Connector for Azure Data Factory (Pipeline) Documentation 

More integrations

Other application integration scenarios for JDBC-ODBC Bridge

Other connectors for Azure Data Factory (Pipeline)

