Cosmos DB Connector for Azure Data Factory (SSIS)
Connect to your Azure Cosmos DB databases to read, query, create, update, and delete documents and more!
In this article you will learn how to
quickly and efficiently integrate Cosmos DB data in Azure Data Factory (SSIS) without coding.
We will use the high-performance Cosmos DB Connector
to easily connect to Cosmos DB and then access the data inside Azure Data Factory (SSIS).
Let's follow the steps below to see how we can accomplish that!
Download
Documentation
Cosmos DB Connector for Azure Data Factory (SSIS) is based on ZappySys API Connector Framework
which is a part of SSIS PowerPack.
It is a collection of high-performance SSIS connectors that enable you
to integrate data with virtually any data provider supported by SSIS, including SQL Server.
SSIS PowerPack supports various file formats, sources, and destinations,
including REST/SOAP APIs, SFTP/FTP, storage services, and plain files, to name a few
(if you are new to SSIS and SSIS PowerPack, find out more on how to use them).
Connect to Cosmos DB in other apps
Create SSIS package
First of all, create an SSIS package that connects to Cosmos DB.
Once you do that, you are one step closer to deploying and running it in Azure-SSIS integration runtime in Azure Data Factory (ADF).
Then simply proceed to the next step: preparing custom setup files for the Azure-SSIS runtime.
Prepare custom setup files for Azure-SSIS runtime
Now it's time to start preparing custom setup files for Azure-SSIS runtime.
During Azure-SSIS runtime creation you can instruct ADF to perform a custom setup on a VM (Azure-SSIS node);
i.e. to run the custom installer, copy files, execute PowerShell scripts, etc.
In that case, your custom setup files are downloaded and run in the Azure-SSIS node (a VM) when you start the runtime.
In this section we will prepare custom setup files so that you can run SSIS packages with SSIS PowerPack connectors inside the Azure-SSIS runtime.
Trial Users
Use the steps below if you are a Trial User, i.e. you have not purchased a license key.
Proceed with these steps:
- Download the SSIS PowerPack trial installer. Make sure you don't rename the installer and keep it named SSISPowerPackSetup_64bit_Trial.msi.
- Create a text file and name it main.cmd (make it all lowercase, very important).
- Copy and paste this script into it and save it:
set DIR=%CUSTOM_SETUP_SCRIPT_LOG_DIR%
echo Calling Step 1 : %TIME% >> "%DIR%\steps_log.txt"
dir /s /b > "%DIR%\file_list.txt"
echo Calling Step 2 : %TIME% >> "%DIR%\steps_log.txt"
::Install SSIS PowerPack
msiexec /i "SSISPowerPackSetup_64bit_Trial.msi" ADDLOCAL=ALL /q /L*V "%DIR%\powerpack_trial_install_log.txt"
echo Calling Step 3 : %TIME% >> "%DIR%\steps_log.txt"
dir "C:\Program Files\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
dir "C:\Program Files (x86)\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
echo DONE : %TIME% >> "%DIR%\steps_log.txt"
echo complete
This is the entry-point script that is executed when Azure-SSIS runtime is started.
- At last! You are ready to upload these two files — main.cmd & SSISPowerPackSetup_64bit_Trial.msi — into your Azure Blob Storage container's folder, which we will do in the Upload custom setup files to Azure Blob Storage container step.
Paid Customers
Use the steps below if you are a Paid Customer, i.e. you have purchased a license.
Proceed with these steps:
- Download the SSIS PowerPack paid installer. Make sure you don't rename the installer and keep it named SSISPowerPackSetup_64bit.msi.
- Have your SSIS PowerPack license key handy; we will need it in the script below.
- Create a text file and name it main.cmd (make it all lowercase, very important).
- Copy and paste the script below into it.
- Paste your license key by replacing the --register parameter's argument with your real license key.
- Finally, save main.cmd:
set DIR=%CUSTOM_SETUP_SCRIPT_LOG_DIR%
echo Calling Step 1 : %TIME% >> "%DIR%\steps_log.txt"
dir /s /b > "%DIR%\file_list.txt"
echo Calling Step 2 : %TIME% >> "%DIR%\steps_log.txt"
::Install SSIS PowerPack
msiexec /i "SSISPowerPackSetup_64bit.msi" ADDLOCAL=ALL /q /L*V "%DIR%\powerpack_install_log.txt"
echo Calling Step 3 : %TIME% >> "%DIR%\steps_log.txt"
::Activate PowerPack license (Optional)
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -p SSISPowerPack --register "lgGAAO0-----REPLACE-WITH-YOUR-LICENSE-KEY-----czM=" --logfile "%DIR%\powerpack_register_log.txt"
::Show System Info
echo Calling Step 4 : %TIME% >> "%DIR%\steps_log.txt"
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -i -l "%DIR%\sysinfo_log.txt"
echo Calling Step 5 : %TIME% >> "%DIR%\steps_log.txt"
dir "C:\Program Files\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
dir "C:\Program Files (x86)\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
echo DONE : %TIME% >> "%DIR%\steps_log.txt"
echo complete
This is the entry-point script that is executed when Azure-SSIS runtime is started.
- At last! You are ready to upload these two files — main.cmd & SSISPowerPackSetup_64bit.msi — into your Azure Blob Storage container's folder, which we will do in the Upload custom setup files to Azure Blob Storage container step.
Upload custom setup files to Azure Blob Storage container
In an Azure Blob Storage container we will store the custom setup files we prepared in the previous step, so that the Azure-SSIS runtime can use them in the custom setup process.
Just perform these very simple, but very important steps:
- Create an Azure Blob Storage container, if you haven't done so already.
Make sure you create and use an Azure Blob Storage container instead of an Azure Data Lake Storage folder;
Azure Data Lake Storage won't allow creating a SAS URI for the container, which is a crucial step in the process.
- Find the Blob Containers node, right-click on it, and hit the Create Blob Container option:
- Upload the two custom setup files — main.cmd & the MSI installer — into your Azure Blob Storage container's folder (use Azure Storage Explorer, or a script like the sketch after this list):
- It was easy, wasn't it? It's time we create a SAS URI in the next step.
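If you prefer to script the upload, here is a minimal sketch using the azure-storage-blob Python package (the account URL, key, container name, and installer name are placeholders; adjust them to your setup):

# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

ACCOUNT_URL = "https://<your-account>.blob.core.windows.net"  # placeholder
ACCOUNT_KEY = "<your-storage-account-key>"                    # placeholder
CONTAINER = "custom-setup"                                    # hypothetical container name

service = BlobServiceClient(account_url=ACCOUNT_URL, credential=ACCOUNT_KEY)
container_client = service.get_container_client(CONTAINER)

# Upload the entry-point script and the installer into the container's folder
for file_name in ("main.cmd", "SSISPowerPackSetup_64bit.msi"):
    with open(file_name, "rb") as data:
        container_client.upload_blob(name=file_name, data=data, overwrite=True)
    print("Uploaded", file_name)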
Create SAS URI for Azure Blob Container
Once you have the custom setup files prepared, it's time to generate a SAS URI.
This SAS URI will be used by a new Azure-SSIS runtime to install SSIS PowerPack inside the runtime's node, a VM.
Let's proceed together by performing the steps below:
- Install and launch Azure Storage Explorer.
- Right-click on the Storage Accounts node and then hit the Connect to Azure storage... menu item:
- Proceed by right-clicking on your container node and selecting the Get Shared Access Signature... option.
- Next, set the Expiry time field to a date far in the future.
If you restart the Azure-SSIS runtime and your SAS URI has expired by then, the runtime will not start.
- Select Read, Create, Write, and List permissions (we also recommend adding the Delete permission to support future functionality):
- Copy the SAS URL to the clipboard and save it for the next step:
You can also generate and copy SAS URL from within Azure Portal itself:
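By the way, the same SAS URI can also be produced in code. A small sketch with azure-storage-blob (account name, key, and container name are placeholders; the permissions and far-future expiry follow the recommendations above):

# pip install azure-storage-blob
from datetime import datetime, timedelta
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

sas_token = generate_container_sas(
    account_name="<your-account>",             # placeholder
    container_name="custom-setup",             # hypothetical container name
    account_key="<your-storage-account-key>",  # placeholder
    permission=ContainerSasPermissions(
        read=True, create=True, write=True, list=True, delete=True,
    ),
    expiry=datetime.utcnow() + timedelta(days=5 * 365),  # a date far in the future
)
print("https://<your-account>.blob.core.windows.net/custom-setup?" + sas_token)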
Create Azure-SSIS integration runtime
Once you have the SAS URL obtained in the previous step, we are ready to move on and create an Azure-SSIS runtime in Azure Data Factory:
- Firstly, perform the steps described in the Create an Azure-SSIS integration runtime article in the Azure Data Factory documentation.
- On the Advanced settings page, configure the Custom setup container SAS URI you obtained in the previous step:
- And you are done! That was quick! You can see your Azure-SSIS runtime up and running:
The custom setup script is executed only once per start, at the time the Azure-SSIS runtime is started.
It runs again whenever you stop and then start the runtime.
Deploy SSIS package in Visual Studio
We are ready to deploy the SSIS package to the Azure-SSIS runtime. Once you do that, proceed to the next step for the grand finale!
Execute SSIS package in SQL Server Management Studio (SSMS)
After all the hard work, we are ready to execute the SSIS package in SQL Server Management Studio (SSMS):
- Connect to the SQL Server instance which is linked to your Azure-SSIS runtime and contains the SSISDB database.
- Navigate to Integration Services Catalog » Your Folder » Your Project » Your Package, right-click on it, and hit Execute...:
- To view the status of past executions, navigate to Integration Services Catalog » Your Folder » Your Project » Your Package, right-click on it, and select the Reports » Standard Reports » All Executions menu item:
Scenarios
Moving SSIS PowerPack license to another Azure-SSIS runtime
If you are a Paid Customer, there may come a time when you no longer use an Azure-SSIS runtime or you need to use your license on a different ADF instance.
To transfer a license from one Azure-SSIS runtime to another, perform these steps:
- Copy & paste this script into the main.cmd we prepared earlier:
set DIR=%CUSTOM_SETUP_SCRIPT_LOG_DIR%
echo Calling Step 1 : %TIME% >> "%DIR%\steps_log.txt"
dir /s /b > "%DIR%\file_list.txt"
echo Calling Step 2 : %TIME% >> "%DIR%\steps_log.txt"
::Install SSIS PowerPack
msiexec /i "SSISPowerPackSetup_64bit.msi" ADDLOCAL=ALL /q /L*V "%DIR%\powerpack_install_log.txt"
echo Calling Step 3 : %TIME% >> "%DIR%\steps_log.txt"
::De-Activate same license
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -p SSISPowerPack --unregister --logfile "%DIR%\powerpack_un_register_log.txt"
::Show System Info
echo Calling Step 4 : %TIME% >> "%DIR%\steps_log.txt"
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -i -l "%DIR%\sysinfo_log.txt"
echo Calling Step 5 : %TIME% >> "%DIR%\steps_log.txt"
dir "C:\Program Files\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
dir "C:\Program Files (x86)\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
echo DONE : %TIME% >> "%DIR%\steps_log.txt"
echo complete
- Start the Azure-SSIS runtime. This will unregister your license on the original Azure-SSIS runtime.
- Stop the Azure-SSIS runtime to deallocate resources in Azure.
- Now you are free to activate the license on another Azure-SSIS runtime.
Advanced topics
Actions supported by Cosmos DB Connector
Cosmos DB Connector supports the following actions for REST API integration.
If an action you need is not listed below, you can easily edit the connector file and enhance the out-of-the-box functionality.
Get List of Databases
Gets a list of the databases in the current database account. [Read more...]

Get Database Information by Id or Name
Gets a database by its Id. [Read more...]

Parameter | Description
--- | ---
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
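For reference, here is roughly what these two actions do, sketched with the azure-cosmos Python SDK (endpoint, key, and database name are placeholders):

# pip install azure-cosmos
from azure.cosmos import CosmosClient

client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")

# Get List of Databases: enumerate the databases in the account
for db_props in client.list_databases():
    print(db_props["id"])

# Get Database Information by Id or Name: read one database's properties
database = client.get_database_client("<database>")
print(database.read())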
Get List of Tables
Gets a list of the tables in the database. (Tables are also called 'containers' or 'collections'.) [Read more...]

Parameter | Description
--- | ---
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.

Get Table Information by Id or Name
Gets a table by its Id. (Tables are also called 'containers' or 'collections'.) [Read more...]

Parameter | Description
--- | ---
Table Name (Case-Sensitive) |
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
Get Table Partition Key Ranges
Gets partition key ranges for a table. This is useful when querying, if you want to limit the scan to a specific partition. (Tables are also called 'containers' or 'collections'.) [Read more...]

Parameter | Description
--- | ---
Table Name (Case-Sensitive) |
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
Query documents using Cosmos DB SQL query language
Gets data based on the specified SQL query. [Read more...]

Parameter | Description
--- | ---
Table Name (Case-Sensitive) |
SQL Query | Query for Cosmos DB.
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
Allow Query Scan | Options: true, false.
Allow Cross Partition Query | Options: true, false.
Cross Partition Key Range Id |
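To make the parameters above concrete, here is an equivalent call sketched with the azure-cosmos Python SDK (endpoint, key, names, and the query itself are placeholders); the Allow Cross Partition Query option corresponds to the SDK's enable_cross_partition_query flag:

# pip install azure-cosmos
from azure.cosmos import CosmosClient

client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")
container = client.get_database_client("<database>").get_container_client("<table>")

# Query documents using Cosmos DB SQL query language
items = container.query_items(
    query="SELECT c.id, c.status FROM c WHERE c.status = @status",
    parameters=[{"name": "@status", "value": "active"}],
    enable_cross_partition_query=True,  # Allow Cross Partition Query
)
for item in items:
    print(item)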
Get All Documents for a Table
Gets all documents for a table. [Read more...]

Parameter | Description
--- | ---
Table Name (Case-Sensitive) |
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
Get Document by Id
Gets a document by its Id. [Read more...]

Parameter | Description
--- | ---
Document Id |
Table Name (Case-Sensitive) |
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
EnableCrossPartition | Options: true, false.
Partition Key Value (default is supplied Id) | The partition key value for the document. Must be included if and only if the collection is created with a partitionKey definition. Options: Default (.), SingleKeyValue (e.g. ["someValue1"]), MultiKeyValue (e.g. ["some_value1","some_value2"]).
ConsistencyLevel | The consistency level override. The valid values are: Strong, Bounded, Session, or Eventual (in order of strongest to weakest). The override must be the same or weaker than the account's configured consistency level.
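In Cosmos DB terms this action is a point read, the fastest way to fetch a document when you know its Id and partition key value. A sketch with the azure-cosmos Python SDK (all names are placeholders):

# pip install azure-cosmos
from azure.cosmos import CosmosClient

client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")
container = client.get_database_client("<database>").get_container_client("<table>")

# Point read: Document Id plus Partition Key Value identify the document
doc = container.read_item(item="<document-id>", partition_key="someValue1")
print(doc["id"])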
Delete Document by Id
Deletes a document by its Id. [Read more...]

Parameter | Description
--- | ---
Document Id |
Table Name (Case-Sensitive) |
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
EnableCrossPartition | Options: true, false.
Partition Key Value (default is supplied Id) | The partition key value for the document. Must be included if and only if the collection is created with a partitionKey definition. Options: Default (.), SingleKeyValue (e.g. ["someValue1"]), MultiKeyValue (e.g. ["some_value1","some_value2"]).
Get All Users for a Database
Gets all users for a database. [Read more...]

Parameter | Description
--- | ---
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.

Get User by Id or Name
Gets database user information for a specific Id. [Read more...]

Parameter | Description
--- | ---
User Name (Case-Sensitive) |
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.

Create User
Creates a new user, which you can later use to create a permission set and obtain a resource token. [Read more...]

Parameter | Description
--- | ---
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
RequestBody |
Create a Document in the Container
Inserts a JSON document into a Cosmos DB container. [Read more...]

Upsert a Document in the Container
Inserts a JSON document into a Cosmos DB container, or updates it if a document with the same Id already exists. [Read more...]

Parameter | Description
--- | ---
Upsert |

Update Document in the Container
Updates the full document, or part of it, in a Cosmos DB container. [Read more...]
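These three write actions map to Cosmos DB's create, upsert, and replace document operations. A sketch with the azure-cosmos Python SDK (names and the sample document are made up):

# pip install azure-cosmos
from azure.cosmos import CosmosClient

client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")
container = client.get_database_client("<database>").get_container_client("<table>")

doc = {"id": "order-1001", "customer": "sam", "total": 42.5}  # sample document

container.create_item(body=doc)   # Create: fails if this id already exists
doc["total"] = 45.0
container.upsert_item(body=doc)   # Upsert: insert or replace as needed
container.replace_item(item=doc["id"], body=doc)  # Update: full replace of the document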
Create Permission Token for a User (One Table)
Creates a permission for a user on a table, which you can use to obtain a resource token. [Read more...]

Parameter | Description
--- | ---
Permission Name (e.g. read_orders) |
Database Name (keep blank to use default), Case-Sensitive | Leave blank to use the default database set on the connection screen.
User Name (Case-Sensitive) |
PermissionMode | Options: All, Read, Write, Delete.
Table (Add Permission for this) |
ExpiresInSecond | The validity period of the resource token returned by the operation. By default, a resource token is valid for one hour. To override the default, set this header with the desired validity period in seconds. The maximum override value is 18000, which is five hours.
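For context, the user-plus-permission flow can be sketched with the azure-cosmos Python SDK as follows (user name, permission name, and the resource link are placeholders; per the Cosmos DB REST API, the resource token is returned in the permission's _token property):

# pip install azure-cosmos
from azure.cosmos import CosmosClient

client = CosmosClient("https://<your-account>.documents.azure.com:443/", credential="<your-key>")
database = client.get_database_client("<database>")

user = database.create_user({"id": "report_reader"})  # hypothetical user
permission = user.create_permission({
    "id": "read_orders",                         # Permission Name
    "permissionMode": "Read",                    # PermissionMode
    "resource": "dbs/<database>/colls/<table>",  # Table (Add Permission for this)
})
print(permission.properties["_token"])  # resource token to hand to a client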
Generic Request
This is a generic endpoint. Use it when some action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Header, etc. Most parameters are optional, except URL. [Read more...]

Parameter | Description
--- | ---
Url | API URL goes here. You can enter a full URL, or a partial URL relative to the Base URL. If it is a full URL, then the domain name must be part of ServiceURL or part of TrustedDomains.
Body | Request Body content goes here.
IsMultiPart | Set this option if you want to upload file(s) using either raw file data (i.e., POST raw file data) or the multi-part encoding method (i.e., Content-Type: multipart/form-data). See the details below this table.
Filter | Enter a filter to extract an array from the response, e.g. $.rows[*] or $.customers[*].orders[*]. Check your response document and find the hierarchy you would like to extract. See the examples below this table.
Headers | Headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.

IsMultiPart details:
A multi-part request allows you to mix key/value pairs and upload files in the same request. Raw upload, on the other hand, allows only a single file to be uploaded (without any key/value data).

Raw Upload (Content-Type: application/octet-stream): to upload a single file in raw mode, check this option and specify the full file path starting with the @ sign in the Body (e.g. @c:\data\myfile.zip).

Form-Data / Multipart Upload (Content-Type: multipart/form-data): to treat your request data as multi-part fields, you must specify key/value pairs separated by new lines in the RequestData field (i.e., Body). Each key/value pair should be entered on a new line, and key and value are separated using an equal sign (=). Leading and trailing spaces are ignored, and blank lines are also ignored.

If a field value contains any special character(s), use escape sequences (e.g., for NewLine: \r\n, for Tab: \t, for at (@): \@). When the value of any field starts with the at sign (@), it is automatically treated as a file you want to upload. By default, the file content type is determined based on the file extension; however, you can supply a content type manually for any field using this format: [YourFileFieldName.Content-Type=some-content-type].

By default, file upload fields always include Content-Type in the request (non-file fields do not have Content-Type by default unless you supply it manually). If, for some reason, you don't want to use the Content-Type header in your request, then supply a blank Content-Type to exclude this header altogether (e.g., SomeFieldName.Content-Type=).

See the example below of uploading multiple files along with additional fields (Content-Type is supplied for file2 and SomeField1; all other fields use the default content type). If some API requires you to pass Content-Type: multipart/mixed rather than multipart/form-data, then manually set Request Header => Content-Type: multipart/mixed (it must start with multipart/ or it will be ignored).

file1=@c:\data\Myfile1.txt
file2=@c:\data\Myfile2.json
file2.Content-Type=application/json
SomeField1=aaaaaaa
SomeField1.Content-Type=text/plain
SomeField2=12345
SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
SomeFieldStartingWithAtSign=\@MyTwitterHandle

Filter examples:
Option | Value
--- | ---
No filter |
Example1 | $.store.books[*]
Example2 (Sections Under Books) | $.store.books[*].sections[*]
Example3 (Equals) | $.store.books[?(@author=='sam')]
Example4 (Equals - Any Section) | $..[?(@author=='sam')]
Example5 (Not Equals - Any Section) | $..[?(@author!='sam')]
Example6 (Number less than) | $.store.books[?(@.price<10)]
Example7 (Regular Expression - Contains Pattern) | $.store.books[?(@author=~ /sam\|bob/ )]
Example8 (Regular Expression - Does Not Contain Pattern) | $.store.books[?(@author=~ /^((?!sam\|bob).)*$/ )]
Example9 (Regular Expression - Exact Pattern Match) | $.store.books[?(@author=~ /^sam\|bob$/ )]
Example10 (Regular Expression - Starts With) | $.store.books[?(@author=~ /^sam/ )]
Example11 (Regular Expression - Ends With) | $.store.books[?(@author=~ /sam$/ )]
Example12 (Between) | $.store.employees[?( @.hiredate>'2015-01-01' && @.hiredate<'2015-01-04' )]
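To illustrate what the multi-part mode sends on the wire, here is a rough equivalent using Python's requests library (the URL, file paths, and field names are invented for the example):

# pip install requests
import requests

url = "https://example.com/api/upload"  # placeholder endpoint

# Files plus plain fields in one multipart/form-data request,
# with an explicit content type supplied for the JSON file
files = {
    "file1": open(r"c:\data\Myfile1.txt", "rb"),
    "file2": ("Myfile2.json", open(r"c:\data\Myfile2.json", "rb"), "application/json"),
}
data = {"SomeField1": "aaaaaaa", "SomeField2": "12345"}

response = requests.post(url, files=files, data=data)
print(response.status_code)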
Generic Request (Bulk Write)
This is a generic endpoint for bulk write purposes. Use it when some action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Header, etc. Most parameters are optional, except URL. [Read more...]

Parameter | Description
--- | ---
Url | API URL goes here. You can enter a full URL, or a partial URL relative to the Base URL. If it is a full URL, then the domain name must be part of ServiceURL or part of TrustedDomains.
IsMultiPart | Set this option if you want to upload file(s) using either raw file data (i.e., POST raw file data) or the multi-part encoding method (i.e., Content-Type: multipart/form-data). See the IsMultiPart details under Generic Request above.
Filter | Enter a filter to extract an array from the response, e.g. $.rows[*] or $.customers[*].orders[*]. Check your response document and find the hierarchy you would like to extract.
Headers | Headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.
Conclusion
In this article we showed you how to connect to Cosmos DB in Azure Data Factory (SSIS) and integrate data without any coding, saving you time and effort.
It's worth noting that the underlying ZappySys API Connector Framework allows you to connect not only to Cosmos DB,
but also to many other REST/SOAP API-based services
(just use a different API connector and configure it appropriately).
We encourage you to download Cosmos DB Connector for Azure Data Factory (SSIS) and see how easy it is to use it for yourself or your team.
If you have any questions, feel free to contact ZappySys support team.
You can also open a live chat immediately by clicking on the chat icon below.
Download Cosmos DB Connector for Azure Data Factory (SSIS)
Documentation
More integrations
Other connectors for Azure Data Factory (SSIS)
Other application integration scenarios for Cosmos DB