Salesforce Connector for Azure Data Factory (Pipeline)
In this article you will learn how to use the Salesforce Connector to connect, read, and write data from within Azure Data Factory (Pipeline). Follow the steps below to see how to accomplish that. The driver mentioned above is part of ODBC PowerPack, a collection of high-performance drivers for various API data sources (i.e. REST API, JSON, XML, CSV, Amazon S3 and many more). Using the familiar SQL query language you can make live connections and read/write data from API sources or JSON / XML / CSV files inside SQL Server (T-SQL) or your favorite reporting tools (i.e. Power BI, Tableau, Qlik, SSRS, MicroStrategy, Excel, MS Access) and ETL tools (i.e. Informatica, Talend, Pentaho, SSIS). You can also call our drivers from programming languages such as Java, C#, Python, PowerShell, etc. If you are new to ODBC and ZappySys ODBC PowerPack, check the following links to get started.
Connect to Salesforce in other apps
Create ODBC Data Source (DSN) based on ZappySys Salesforce Driver
Step-by-step instructions
To get data from Salesforce using Azure Data Factory (Pipeline), we first need to create a DSN (Data Source Name) that will access data in Salesforce. We will later read this data in Azure Data Factory (Pipeline). Perform these steps:
- Install ZappySys ODBC PowerPack.
- Open ODBC Data Sources (x64):
- Create a System Data Source (System DSN) based on the ZappySys Salesforce Driver.
  You should create a System DSN (instead of a User DSN) if the client application is launched under a Windows system account, e.g. as a Windows Service. If the client application is 32-bit (x86) and uses a System DSN, use ODBC Data Sources (32-bit) instead of the 64-bit version. A User DSN may be created instead, but then you will not be able to use the connection from Windows Services (or any application running under a Windows system account).
- Now we need a Salesforce connection. Let's create it.
- When the DSN Config Editor with the ZappySys logo appears, first change the default DSN name at the top, then click the Preview tab. Select a table from the Tables dropdown, or enter or modify a SOQL query, and click Preview Data.
  This example shows how to write a simple SOQL (Salesforce Object Query Language) query that uses a WHERE clause. For more SOQL queries, click here.
  SOQL is similar to the SQL query language used by databases, but it is much simpler, and many features you use in database queries are not supported in SOQL (for example, the JOIN clause). You can also use queries for Insert, Update, Delete, and Upsert (update, or insert a record if not found).
  SELECT * FROM Account WHERE Name like '%Oil%'
- Click OK to finish creating the data source.
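Beyond the SELECT query shown above, here are a few illustrative query shapes for the driver's read/write support. These are sketches only: the object and field names (Account, Name, Phone) and the exact write-query syntax are assumptions, so consult the driver documentation for the precise supported forms, including Upsert:

```sql
-- Read: filter accounts by name (same pattern as the example above)
SELECT Id, Name, Phone FROM Account WHERE Name LIKE '%Oil%'

-- Write (illustrative; assumed field names):
INSERT INTO Account (Name, Phone) VALUES ('Acme Oil', '555-0100')
UPDATE Account SET Phone = '555-0101' WHERE Name = 'Acme Oil'
DELETE FROM Account WHERE Name = 'Acme Oil'
```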
Video instructions
Read data in Azure Data Factory (ADF) from ODBC datasource (Salesforce)
- To start, press the New button:
- Select the "Azure, Self-Hosted" option:
- Select the "Self-Hosted" option:
- Set a name; we will use "OnPremisesRuntime":
- Download and install the Microsoft Integration Runtime.
- Launch the Integration Runtime and copy/paste the Authentication Key from the Integration Runtime configuration in the Azure Portal:
- After finishing registering the Integration Runtime node, you should see a similar view:
- Go back to the Azure Portal and finish adding the new Integration Runtime. You should see that it was successfully added:
- Go to the Linked services section and create a new linked service based on ODBC:
- Select the "ODBC" service:
- Configure the new ODBC service. Use the same DSN name we used in the previous step in the Connection string box:
  DSN=SalesforceDSN
- For the created ODBC service, create an ODBC-based dataset:
- Go to your pipeline and add the Copy data connector into the flow. In the Source section, use the OdbcDataset we created as the source dataset:
- Then go to the Sink section and select a destination/sink dataset. In this example we use a precreated AzureBlobStorageDataset, which saves data into an Azure Blob:
- Finally, run the pipeline and watch the data being transferred from OdbcDataset to your destination dataset:
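The same connection string used in the linked service above (DSN=SalesforceDSN) also works from code, as mentioned in the introduction. Below is a minimal sketch assuming the DSN name "SalesforceDSN" from the steps above and the pyodbc package; the actual connection is left commented out because it requires the installed driver and a live Salesforce login:

```python
# Build the ODBC connection string for the DSN created earlier.
# "SalesforceDSN" is the DSN name used in this walkthrough; substitute your own.
def build_connection_string(dsn_name: str) -> str:
    return f"DSN={dsn_name}"

connection_string = build_connection_string("SalesforceDSN")
print(connection_string)  # DSN=SalesforceDSN

# With ZappySys ODBC PowerPack installed, rows could then be read like this
# (requires a live Salesforce connection, so it is commented out here):
# import pyodbc
# with pyodbc.connect(connection_string) as conn:
#     cursor = conn.cursor()
#     cursor.execute("SELECT * FROM Account WHERE Name LIKE '%Oil%'")
#     for row in cursor.fetchall():
#         print(row)
```

This is one way to reuse the exact same DSN from Python, PowerShell, or any other ODBC-capable client, so the data source you configured for Azure Data Factory is not tied to a single tool.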
Conclusion
In this article we discussed how to connect to Salesforce in Azure Data Factory (Pipeline) and integrate data without any coding. Click here to download the Salesforce Connector for Azure Data Factory (Pipeline) and try it yourself to see how easy it is. If you still have any questions, ask here, or simply click on the live chat icon below (bottom-right corner of this page) and ask our expert.
Download Salesforce Connector for Azure Data Factory (Pipeline)
Documentation
More integrations
Other application integration scenarios for Salesforce
Other connectors for Azure Data Factory (Pipeline)