Salesforce Connector for Azure Data Factory (Pipeline)

In this article you will learn how to quickly and efficiently integrate Salesforce data in Azure Data Factory (Pipeline) without coding. We will use the high-performance Salesforce Connector to connect to Salesforce and then access the data inside Azure Data Factory (Pipeline). The Salesforce Connector can be used to extract/load large amounts of data from/into Salesforce.com without any programming. You can use simple Table mode or Query mode with full SOQL query language support (SOQL = Salesforce.com Object Query Language). Let's follow the steps below to see how we can accomplish that!

Salesforce Connector for Azure Data Factory (Pipeline) is based on the ZappySys Salesforce Driver, which is part of ODBC PowerPack. ODBC PowerPack is a collection of high-performance ODBC drivers that enable you to integrate data in SQL Server, SSIS, a programming language, or any other ODBC-compatible application. It supports various file formats, sources, and destinations, including REST/SOAP APIs, SFTP/FTP, storage services, and plain files, to mention a few.
Create ODBC Data Source (DSN) based on ZappySys Salesforce Driver
Step-by-step instructions
To get data from Salesforce in Azure Data Factory (Pipeline), we first need to create a DSN (Data Source Name) that connects to Salesforce. We will then be able to read the data in Azure Data Factory (Pipeline). Perform these steps:
- Install ZappySys ODBC PowerPack.
- Open ODBC Data Sources (x64):
- Create a User data source (User DSN) based on the ZappySys Salesforce Driver:
  - Create and use a User DSN if the client application runs under a User Account. This is the ideal option at design time, when developing a solution, e.g. in Visual Studio 2019. Use it for both types of applications: 64-bit and 32-bit.
  - Create and use a System DSN if the client application is launched under a System Account, e.g. as a Windows Service. This is usually the ideal option in a production environment. Use ODBC Data Source Administrator (32-bit) instead of the 64-bit version if the Windows Service is a 32-bit application.
  Azure Data Factory (Pipeline) uses a Service Account when a solution is deployed to a production environment, so for production you have to create and use a System DSN.
- Now we need a Salesforce connection. Let's create it:
- When the DSN Configuration Editor with the ZappySys logo appears, the first thing to do is change the default DSN Name at the top. Then click the Preview tab, select a table from the Tables dropdown (or enter/modify a SOQL query), and click Preview Data.
  This example shows how to write a simple SOQL (Salesforce Object Query Language) query that uses a WHERE clause. For more SOQL queries, click here.
  SOQL is similar to the SQL query language used by databases, but it is much simpler, and many features you use in database queries are not supported in SOQL (for example, the JOIN clause). You can also use queries for Insert, Update, Delete, and Upsert (update, or insert a record if not found).
  SELECT * FROM Account WHERE Name like '%Oil%'
- Click OK to finish creating the data source. A short, optional script for verifying the new DSN outside the configuration editor is shown after these steps.
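Before moving on to Azure Data Factory, you can verify the new DSN from any ODBC-capable tool. Below is a minimal sketch, assuming the pyodbc package is installed and the DSN was named SalesforceDSN as in this article; it simply runs the sample SOQL query and prints a few rows.

import pyodbc

# Connect using the DSN created above (name assumed to be SalesforceDSN)
conn = pyodbc.connect("DSN=SalesforceDSN", autocommit=True)
cursor = conn.cursor()

# Run the same sample SOQL query shown in the DSN Preview tab
cursor.execute("SELECT * FROM Account WHERE Name LIKE '%Oil%'")

# Print the first few rows to confirm data comes back
for row in cursor.fetchmany(5):
    print(row)

conn.close()

If rows are printed, the DSN is ready to be used from the self-hosted Integration Runtime configured in the next section.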
Video instructions
Read data in Azure Data Factory (ADF) from the ODBC data source (Salesforce)
- To start, press the New button:
- Select the "Azure, Self-Hosted" option:
- Select the "Self-Hosted" option:
- Set a name; we will use "OnPremisesRuntime":
- Download and install Microsoft Integration Runtime.
- Launch Integration Runtime and copy/paste the Authentication Key from the Integration Runtime configuration in the Azure Portal:
- After finishing registering the Integration Runtime node, you should see a view similar to this:
- Go back to the Azure Portal and finish adding the new Integration Runtime. You should see that it was added successfully:
- Go to the Linked services section and create a new Linked service based on ODBC:
- Select the "ODBC" service:
- Configure the new ODBC service. Use the same DSN name we used in the previous step and copy it into the Connection string box:
  DSN=SalesforceDSN
For created ODBC service create ODBC-based dataset:
-
Go to your pipeline and add Copy data connector into the flow. In Source section use OdbcDataset we created as a source dataset:
-
Then go to Sink section and select a destination/sink dataset. In this example we use precreated AzureBlobStorageDataset which saves data into an Azure Blob:
-
Finally, run the pipeline and see data being transferred from OdbcDataset to your destination dataset:
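Once the pipeline runs correctly from the portal, you may also want to trigger and monitor it programmatically. The sketch below uses the Azure SDK for Python (azure-identity and azure-mgmt-datafactory); the subscription ID, resource group, data factory, and pipeline names are placeholders you would replace with your own values.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder values - replace with your own subscription, resource group,
# data factory, and pipeline names
subscription_id = "<your-subscription-id>"
resource_group = "<your-resource-group>"
factory_name = "<your-data-factory>"
pipeline_name = "<your-pipeline>"

# Authenticate (works with Azure CLI login, managed identity, etc.)
credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, subscription_id)

# Trigger the pipeline run
run = adf_client.pipelines.create_run(resource_group, factory_name, pipeline_name, parameters={})
print("Started pipeline run:", run.run_id)

# Check the run status (poll until it reports Succeeded or Failed)
status = adf_client.pipeline_runs.get(resource_group, factory_name, run.run_id)
print("Current status:", status.status)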
Conclusion
In this article we showed you how to connect to Salesforce in Azure Data Factory (Pipeline) and integrate data without any coding, saving you time and effort. We encourage you to download the Salesforce Connector for Azure Data Factory (Pipeline) and see how easy it is to use, whether for yourself or your team.
If you have any questions, feel free to contact the ZappySys support team. You can also open a live chat immediately by clicking on the chat icon below.
Download Salesforce Connector for Azure Data Factory (Pipeline) | Documentation