Cosmos DB Connector for SSIS

In this article you will learn how to quickly and efficiently integrate Cosmos DB data in SSIS without coding. We will use the high-performance Cosmos DB Connector to easily connect to Cosmos DB and then access the data inside SSIS.

Connect to your Azure Cosmos DB databases to read, query, create, update, and delete documents and more!

Let's follow the steps below to see how we can accomplish that!

Download | Documentation

Video Tutorial - Integrate Cosmos DB data in SSIS

This video covers the following topics and more, so watch it carefully. After watching the video, follow the steps described in this article.

  • How to download / install required driver for Cosmos DB integration in SSIS
  • How to configure connection for Cosmos DB
  • API Source features (Authentication / Query Language / Examples / Driver UI)
  • Using Cosmos DB Connection in SSIS

Prerequisites

Before we begin, make sure the following prerequisites are met:

  1. SSIS designer installed. It is sometimes referred to as BIDS or SSDT (download it from Microsoft).
  2. Basic knowledge of SSIS package development using Microsoft SQL Server Integration Services.
  3. SSIS PowerPack is installed (if you are new to SSIS PowerPack, then get started!).

Read data from Cosmos DB in SSIS (Export data)

In this section we will learn how to configure and use the Cosmos DB Connector in the API Source to extract data from Cosmos DB.

  1. Begin by opening Visual Studio and creating a New Project.

  2. Select Integration Services Project, set an appropriate name and location for the project in the new project window, and click OK.

  3. In the new SSIS project screen you will find the following:

    1. SSIS Toolbox on the left sidebar
    2. Solution Explorer and Properties Window on the right sidebar
    3. Control Flow, Data Flow, Event Handlers, and Package Explorer in tab windows
    4. Connection Manager Window at the bottom

    SSIS Project Screen
    Note: If you don't see ZappySys SSIS PowerPack Task or Components in SSIS Toolbox, please refer to this help link.
  4. Now, drag and drop SSIS Data Flow Task from the SSIS Toolbox. Double click the Data Flow Task to open the Data Flow designer.

    SSIS Data Flow Task - Drag and Drop
  5. From the SSIS Toolbox, drag and drop API Source (Predefined Templates) onto the data flow designer surface, and double click it to edit:
    SSIS API Source (Predefined Templates) - Drag and Drop

  6. Select New Connection to create a new connection:
    API Source - New Connection

  7. Use a preinstalled Cosmos DB Connector from the Popular Connector List or press the Search Online radio button to download the Cosmos DB Connector. Once downloaded, simply use it in the configuration:

    Cosmos DB
    Cosmos DB Connector Selection

  8. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is correct). Finally, fill in all the required parameters and set optional parameters if needed. You may press the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section. (A short Python sketch after these steps shows what this key-based authentication looks like at the REST level.)

    Steps to get and use Cosmos DB credentials: API Key [Http]
    Connecting to your Azure Cosmos DB data requires you to authenticate your REST API access. Follow the instructions below:
    1. Go to your Azure portal homepage: https://portal.azure.com/.
    2. In the search bar at the top of the homepage, enter Azure Cosmos DB. In the dropdown that appears, select Azure Cosmos DB.
    3. Click the name of the database account you want to connect to (also copy the database account name for later use).
    4. On the next page, where you can see all of the database account information, look along the left side and select Keys.
    Use API key to get Cosmos DB data via REST API in Azure
    5. On the Keys page you will find two tabs: Read-write Keys and Read-only Keys. If you are going to write data to your database, remain on the Read-write Keys tab. If you are only going to read data from your database, select the Read-only Keys tab.
    6. On the Keys page, copy the PRIMARY KEY value and paste it somewhere for later use (the SECONDARY KEY value may also be copied and used).
    7. Now go to the SSIS package or ODBC data source and use this PRIMARY KEY in the API Key authentication configuration.
    8. Enter the primary or secondary key you recorded in step 6 into the Primary or Secondary Key field.
    9. Then enter the database account name you recorded in step 3 into the Database Account field.
    10. Next, enter or select the default database you want to connect to using the Default Database field.
    11. Continue by entering or selecting the default table (i.e. container/collection) you want to connect to using the Default Table (Container/Collection) field.
    12. Select the Test Connection button at the bottom of the window to verify connectivity with your Azure Cosmos DB account.
    13. If the connection test succeeds, select OK.
    14. Done! Now you are ready to use the Cosmos DB Connector!

    Configuring authentication parameters:
    • Connector: Cosmos DB
    • Authentication: API Key [Http]
    • API Base URL: https://[$Account$].documents.azure.com
    • Required parameters: Primary or Secondary Key; Account Name (Case-Sensitive); Database Name (keep blank to use default, Case-Sensitive); API Version
    • Optional parameters: Default Table (needed to invoke #DirectSQL)
    • Connection manager: ZappySys Http Connection

  9. Select the desired endpoint, change/pass the property values, and click the Preview Data button to make the API call.

    API Source - Cosmos DB
    Connect to your Azure Cosmos DB databases to read, query, create, update, and delete documents and more!
    API Source - Select Endpoint

  10. That's it, we are done! In just a few clicks we configured the call to Cosmos DB using the Cosmos DB Connector.

    You can load the source data into your desired destination using the Upsert Destination, which supports SQL Server, PostgreSQL, and Amazon Redshift. We also offer other destinations such as CSV, Excel, Azure Table, Salesforce, and more. Check out our SSIS PowerPack Tasks and components for more options. (In the screenshot below, data is loaded into a Trash Destination.)

    Execute Package - Reading data from Cosmos DB and load into target
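
Curious what the connector does under the hood? The API Key [Http] authentication used above corresponds to the Cosmos DB REST API's master-key scheme. Below is a minimal Python sketch, not part of the connector and with placeholder account name and key, that builds the Authorization header and lists databases, roughly what happens when you press Preview Data on the Get List of Databases endpoint:

    import base64, hashlib, hmac, urllib.parse
    from email.utils import formatdate
    import requests

    ACCOUNT = "your-account"      # placeholder: your Cosmos DB account name
    MASTER_KEY = "<primary-key>"  # placeholder: PRIMARY KEY from the Keys page

    def build_auth(verb, resource_type, resource_link, date_str):
        """Build the Cosmos DB master-key Authorization header value."""
        key = base64.b64decode(MASTER_KEY)
        # Signature payload: lowercase verb, resource type, and date; the resource link is case-sensitive.
        payload = f"{verb.lower()}\n{resource_type.lower()}\n{resource_link}\n{date_str.lower()}\n\n"
        digest = hmac.new(key, payload.encode("utf-8"), hashlib.sha256).digest()
        signature = base64.b64encode(digest).decode()
        return urllib.parse.quote(f"type=master&ver=1.0&sig={signature}", safe="")

    date_str = formatdate(usegmt=True)  # RFC 1123 date, e.g. "Tue, 01 Jan 2030 00:00:00 GMT"
    response = requests.get(
        f"https://{ACCOUNT}.documents.azure.com/dbs",
        headers={
            "x-ms-date": date_str,
            "x-ms-version": "2018-12-31",
            "Authorization": build_auth("GET", "dbs", "", date_str),
        },
    )
    print(response.json())  # the response contains a "Databases" array

The later sketches in this article reuse ACCOUNT and build_auth from this example.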

Write data to Cosmos DB using SSIS (Import data)

In this section we will learn how to configure and use the Cosmos DB Connector in the API Destination to write data to Cosmos DB.

Video tutorial

This video covers the following topics and more, so watch it carefully. After watching the video, follow the steps described in this article.

  • How to download SSIS PowerPack for Cosmos DB integration in SSIS
  • How to configure connection for Cosmos DB
  • How to write or lookup data to Cosmos DB
  • SSIS API Destination features
  • Using Cosmos DB Connector in SSIS

Step-by-step instructions

In the previous section we learned how to read data; now we will learn how to configure the Cosmos DB Connector in the API Destination to POST data to Cosmos DB.

  1. Read data from any desired source component. In this example we will use the ZappySys Dummy Data Source component.

  2. From the SSIS Toolbox, drag and drop API Destination (Predefined Templates) onto the Data Flow Designer surface, connect the source component to it, and double click it to edit.
    SSIS API Destination (Predefined Templates) - Drag and Drop

  3. Select New Connection to create a new connection:

    API Destination - Cosmos DB
    Connect to your Azure Cosmos DB databases to read, query, create, update, and delete documents and more!
    API Destination - New Connection

  4. Use a preinstalled Cosmos DB Connector from the Popular Connector List or press the Search Online radio button to download the Cosmos DB Connector. Once downloaded, simply use it in the configuration:

    Cosmos DB
    Cosmos DB Connector Selection

  5. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is correct). Finally, fill in all the required parameters and set optional parameters if needed. You may press the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section.

    Steps to get and use Cosmos DB credentials: API Key [Http]
    Connecting to your Azure Cosmos DB data requires you to authenticate your REST API access. Follow the instructions below:
    1. Go to your Azure portal homepage: https://portal.azure.com/.
    2. In the search bar at the top of the homepage, enter Azure Cosmos DB. In the dropdown that appears, select Azure Cosmos DB.
    3. Click the name of the database account you want to connect to (also copy the database account name for later use).
    4. On the next page, where you can see all of the database account information, look along the left side and select Keys.
    Use API key to get Cosmos DB data via REST API in Azure
    5. On the Keys page you will find two tabs: Read-write Keys and Read-only Keys. If you are going to write data to your database, remain on the Read-write Keys tab. If you are only going to read data from your database, select the Read-only Keys tab.
    6. On the Keys page, copy the PRIMARY KEY value and paste it somewhere for later use (the SECONDARY KEY value may also be copied and used).
    7. Now go to the SSIS package or ODBC data source and use this PRIMARY KEY in the API Key authentication configuration.
    8. Enter the primary or secondary key you recorded in step 6 into the Primary or Secondary Key field.
    9. Then enter the database account name you recorded in step 3 into the Database Account field.
    10. Next, enter or select the default database you want to connect to using the Default Database field.
    11. Continue by entering or selecting the default table (i.e. container/collection) you want to connect to using the Default Table (Container/Collection) field.
    12. Select the Test Connection button at the bottom of the window to verify connectivity with your Azure Cosmos DB account.
    13. If the connection test succeeds, select OK.
    14. Done! Now you are ready to use the Cosmos DB Connector!

    Configuring authentication parameters:
    • Connector: Cosmos DB
    • Authentication: API Key [Http]
    • API Base URL: https://[$Account$].documents.azure.com
    • Required parameters: Primary or Secondary Key; Account Name (Case-Sensitive); Database Name (keep blank to use default, Case-Sensitive); API Version
    • Optional parameters: Default Table (needed to invoke #DirectSQL)
    • Connection manager: ZappySys Http Connection

  6. Select the desired endpoint, change/pass the property values, and go to the Mappings tab to map the columns.

    API Destination - Cosmos DB
    Connect to your Azure Cosmos DB databases to read, query, create, update, and delete documents and more!
    API Destination - Select Endpoint

  7. Finally, map the desired columns:

    API Destination - Cosmos DB
    Connect to your Azure Cosmos DB databases to read, query, create, update, and delete documents and more!
    API Destination - Columns Mapping

  8. That's it; we successfully configured the POST API call. In a few clicks we configured the Cosmos DB API call using the ZappySys Cosmos DB Connector. (The Python sketch after these steps shows the equivalent document write at the REST level.)

    Execute Package - Reading data from API Source and load into target
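
To illustrate what such a write boils down to, here is a minimal Python sketch of the Cosmos DB REST call that creates a document. Database, container, and document values are placeholders, and it reuses ACCOUNT and build_auth from the read section above:

    import json
    import requests
    from email.utils import formatdate

    DB, COLL = "MyDatabase", "MyContainer"  # placeholder database and container names
    link = f"dbs/{DB}/colls/{COLL}"
    doc = {"id": "order-1001", "status": "active", "total": 99.5}

    date_str = formatdate(usegmt=True)
    response = requests.post(
        f"https://{ACCOUNT}.documents.azure.com/{link}/docs",
        headers={
            "x-ms-date": date_str,
            "x-ms-version": "2018-12-31",
            "Authorization": build_auth("POST", "docs", link, date_str),
            # Required when the container has a partition key definition:
            "x-ms-documentdb-partitionkey": json.dumps([doc["id"]]),
            # Set to "true" to upsert (insert or replace) instead of plain insert:
            "x-ms-documentdb-is-upsert": "true",
        },
        json=doc,
    )
    print(response.status_code)  # 201 Created on insert, 200 OK on upsert-replace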

Load Cosmos DB data into SQL Server using Upsert Destination (Insert or Update)

Once you have configured the data source, you can load Cosmos DB data into SQL Server using the Upsert Destination.

Upsert Destination can merge or synchronize source data with the target table. It supports Microsoft SQL Server, PostgreSQL, and Redshift databases as targets. Upsert Destination also supports a very fast bulk upsert operation along with bulk delete.

An upsert is a database operation that performs an INSERT or an UPDATE depending on whether the record already exists in the target table: records that have no match on the key columns are inserted, and records that do have a match are updated.

Upsert Destination supports INSERT, UPDATE, and DELETE operations, so it is similar to SQL Server's MERGE command, except it can be used directly in an SSIS package.
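
Conceptually, for a SQL Server target an upsert is equivalent to a T-SQL MERGE keyed on the columns you select. The following minimal Python/pyodbc sketch, with hypothetical table and column names (dbo.Orders with Id, Status, Total) rather than the component's actual implementation, shows that logic for a single row:

    import pyodbc  # assumes an ODBC driver for SQL Server is installed

    # MERGE inserts the row if the key has no match, otherwise updates it.
    MERGE_SQL = """
    MERGE dbo.Orders AS target
    USING (VALUES (?, ?, ?)) AS source (Id, Status, Total)
        ON target.Id = source.Id  -- key column(s) to match on
    WHEN MATCHED THEN
        UPDATE SET Status = source.Status, Total = source.Total
    WHEN NOT MATCHED THEN
        INSERT (Id, Status, Total) VALUES (source.Id, source.Status, source.Total);
    """

    conn = pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};Server=.;Database=TestDB;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()
    cursor.execute(MERGE_SQL, ("order-1001", "active", 99.5))
    conn.commit()

Upsert Destination performs this matching in bulk, which is why selecting the right Key columns in the steps below matters.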

  1. From the SSIS Toolbox drag-and-drop Upsert Destination component onto the Data Flow designer background.

  2. Connect your SSIS source component to Upsert Destination.

  3. Double-click the Upsert Destination component to open the configuration window.

  4. Start by selecting the Action from the list.

  5. Next, select the desired target connection or create one by clicking <New [provider] Connection> menu item from the Target Connection dropdown.

  6. Then select a table from the Target Table list or click New button to create a new table based on the source columns.

  7. Continue by checking Insert and Update options according to your scenario (e.g. if Update option is unchecked, no updates will be made).

  8. Finally, click Map All button to map all columns and then select the Key columns to match the columns on:

    Configure SSIS Upsert Destination component to merge data with SQL Server, PostgreSQL, or Redshift table
  9. Click OK to save the configuration.

  10. Run the package and Cosmos DB data will be merged with the target table in SQL Server, PostgreSQL, or Redshift:

    Execute Package - Reading data from API Source and load into target
  11. Done!

Deploy and schedule SSIS package

After you are done creating the SSIS package, you will most likely want to deploy it to the SQL Server Catalog and run it periodically. Just follow the instructions in this article:

Running SSIS package in Azure Data Factory (ADF)

To use SSIS PowerPack in ADF, you must first prepare the Azure-SSIS Integration Runtime. Follow this link for detailed instructions:

Advanced topics

Actions supported by Cosmos DB Connector

The Cosmos DB Connector supports the following actions for REST API integration. If an action you need is not listed below, you can easily edit the connector file and enhance the out-of-the-box functionality.
Get List of Databases
Gets a list of the databases in the current database account.    [ Read more... ]

Get Database Information by Id or Name
Gets a database by its Id.    [ Read more... ]
Parameters:
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen

Get List of Tables
Gets a list of the tables in the database. (Tables are also called 'containers' or 'collections'.)    [ Read more... ]
Parameters:
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen

Get Table Information by Id or Name
Gets a table by its Id. (Tables are also called 'containers' or 'collections'.)    [ Read more... ]
Parameters:
  • Table Name (Case-Sensitive)
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen

Get Table Partition Key Ranges
Gets the partition key ranges for a table. This is useful for queries when you want to limit the scan to a specific partition. (Tables are also called 'containers' or 'collections'.)    [ Read more... ]
Parameters:
  • Table Name (Case-Sensitive)
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen
Query Documents using the Cosmos DB SQL Query Language
Gets data based on the specified SQL query.    [ Read more... ]
Parameters:
  • Table Name (Case-Sensitive)
  • SQL Query: the query for Cosmos DB
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen
  • Allow Query Scan (options: true, false)
  • Allow Cross Partition Query (options: true, false)
  • Cross Partition Key Range Id
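
As a rough sketch of what this action submits, here is a parameterized query via the Cosmos DB REST API in Python, reusing ACCOUNT and build_auth from the read section earlier. The query headers shown are the documented REST counterparts of the parameters above:

    import requests
    from email.utils import formatdate

    DB, COLL = "MyDatabase", "MyContainer"  # placeholder names
    link = f"dbs/{DB}/colls/{COLL}"
    body = {
        "query": "SELECT c.id, c.status FROM c WHERE c.status = @status",
        "parameters": [{"name": "@status", "value": "active"}],
    }

    date_str = formatdate(usegmt=True)
    response = requests.post(
        f"https://{ACCOUNT}.documents.azure.com/{link}/docs",
        headers={
            "x-ms-date": date_str,
            "x-ms-version": "2018-12-31",
            "Authorization": build_auth("POST", "docs", link, date_str),
            "Content-Type": "application/query+json",
            "x-ms-documentdb-isquery": "true",
            # Counterpart of the Allow Cross Partition Query parameter:
            "x-ms-documentdb-query-enablecrosspartition": "true",
        },
        json=body,
    )
    print(response.json()["Documents"])  # matching documents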
Get All Documents for a Table
Gets all documents for a table.    [ Read more... ]
Parameters:
  • Table Name (Case-Sensitive)
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen

Get Document by Id
Gets a document by Id.    [ Read more... ]
Parameters:
  • Document Id
  • Table Name (Case-Sensitive)
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen
  • EnableCrossPartition (options: true, false)
  • Partition Key Value (default is the supplied Id): the partition key value for the document. Must be included if and only if the collection is created with a partitionKey definition. Options: Default (.), SingleKeyValue (["someValue1"]), MultiKeyValue (["some_value1","some_value2"])
  • ConsistencyLevel: the consistency level override. Valid values (strongest to weakest): Strong, Bounded, Session, Eventual. The override must be the same as or weaker than the account's configured consistency level.

Delete a Document by Id
Deletes a document by Id.    [ Read more... ]
Parameters:
  • Document Id
  • Table Name (Case-Sensitive)
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen
  • EnableCrossPartition (options: true, false)
  • Partition Key Value (default is the supplied Id): the partition key value for the document. Must be included if and only if the collection is created with a partitionKey definition. Options: Default (.), SingleKeyValue (["someValue1"]), MultiKeyValue (["some_value1","some_value2"])
Get All Users for a Database
Gets all users for a database.    [ Read more... ]
Parameters:
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen

Get User by Id or Name
Gets database user information for a specific Id.    [ Read more... ]
Parameters:
  • User Name (Case-Sensitive)
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen

Create User for Database
Creates a new user which you can later use to create a permission set and obtain a resource token.    [ Read more... ]
Parameters:
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen
  • RequestBody

Create a Document in the Container
Inserts a JSON document into a Cosmos DB container.    [ Read more... ]

Upsert a Document in the Container
Inserts a JSON document into a Cosmos DB container, or updates it if it already exists.    [ Read more... ]
Parameters:
  • Upsert

Update Document in the Container
Updates all or part of a document in a Cosmos DB container.    [ Read more... ]

Create Permission Token for a User (One Table)
Creates a permission for a user on a table so you can obtain a resource token.    [ Read more... ]
Parameters:
  • Permission Name (e.g. read_orders)
  • Database Name (keep blank to use default, Case-Sensitive): leave blank to use the default DB set on the connection screen
  • User Name (Case-Sensitive)
  • PermissionMode (options: All, Read, Write, Delete)
  • Table (add permission for this table)
  • ExpiresInSecond: the validity period of the resource token returned by the operation. By default, a resource token is valid for one hour. To override the default, set this header to the desired validity period in seconds. The maximum override value is 18000 (five hours).
Generic Request
This is a generic endpoint. Use this endpoint when an action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. Most parameters are optional except URL.    [ Read more... ]
Parameters:
  • Url: API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of ServiceURL or part of TrustedDomains.
  • Body: request body content goes here.
  • IsMultiPart: set this option if you want to upload file(s) (i.e. POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data). A multi-part request allows you to mix key/value fields and file uploads in the same request; raw upload allows only a single file upload (without any key/value fields).
    Raw upload (Content-Type: application/octet-stream): to upload a single file in raw mode, check this option and specify the full file path starting with an @ sign in the Body (e.g. @c:\data\myfile.zip).
    Form-data / multipart upload (Content-Type: multipart/form-data): to treat your request data as multi-part fields, specify key/value pairs in the Body, one pair per line, with key and value separated by an equal sign (=). Leading and trailing spaces are ignored, as are blank lines. If a field value contains special characters, use escape sequences (newline: \r\n, tab: \t, at sign: \@). When the value of a field starts with an at sign (@), it is automatically treated as a file to upload. By default the file content type is determined by the extension; you can supply a content type manually for any field like this: YourFileFieldName.Content-Type=some-content-type. File upload fields always include a Content-Type in the request (non-file fields do not, unless you supply one manually). If you do not want a Content-Type header for a field, supply a blank value to exclude the header altogether (e.g. SomeFieldName.Content-Type= ). If an API requires Content-Type: multipart/mixed rather than multipart/form-data, manually set the request header Content-Type: multipart/mixed (it must start with multipart/ or it will be ignored). Example of uploading multiple files along with additional fields (Content-Type is supplied for file2 and SomeField1; all other fields use the default content type):
      file1=@c:\data\Myfile1.txt
      file2=@c:\data\Myfile2.json
      file2.Content-Type=application/json
      SomeField1=aaaaaaa
      SomeField1.Content-Type=text/plain
      SomeField2=12345
      SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
      SomeFieldStartingWithAtSign=\@MyTwitterHandle
  • Filter: enter a filter to extract an array from the response, e.g. $.rows[*] or $.customers[*].orders[*]. Check your response document to find the hierarchy you want to extract. Examples:
      No filter
      Example1: $.store.books[*]
      Example2 (Sections Under Books): $.store.books[*].sections[*]
      Example3 (Equals): $.store.books[?(@author=='sam')]
      Example4 (Equals - Any Section): $..[?(@author=='sam')]
      Example5 (Not Equals - Any Section): $..[?(@author!='sam')]
      Example6 (Number Less Than): $.store.books[?(@.price<10)]
      Example7 (Regular Expression - Contains Pattern): $.store.books[?(@author=~ /sam|bob/ )]
      Example8 (Regular Expression - Does Not Contain Pattern): $.store.books[?(@author=~ /^((?!sam|bob).)*$/ )]
      Example9 (Regular Expression - Exact Pattern Match): $.store.books[?(@author=~ /^sam|bob$/ )]
      Example10 (Regular Expression - Starts With): $.store.books[?(@author=~ /^sam/ )]
      Example11 (Regular Expression - Ends With): $.store.books[?(@author=~ /sam$/ )]
      Example12 (Between): $.store.employees[?( @.hiredate>'2015-01-01' && @.hiredate<'2015-01-04' )]
  • Headers: headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.
Generic Request (Bulk Write)
This is a generic endpoint for bulk-write purposes. Use this endpoint when an action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. Most parameters are optional except URL.    [ Read more... ]
Parameters:
  • Url: same as for Generic Request above.
  • IsMultiPart: same as for Generic Request above.
  • Filter: enter a filter to extract an array from the response, e.g. $.rows[*] or $.customers[*].orders[*]. Check your response document to find the hierarchy you want to extract.
  • Headers: headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.
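
For example, to call a Cosmos DB endpoint the connector does not implement, you might configure the Generic Request as follows (hypothetical values; the $.Databases[*] filter assumes the response shape of the list-databases REST call):

    Url: /dbs
    Method: GET
    Filter: $.Databases[*]

This would return one row per database extracted from the raw REST response.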

Conclusion

In this article we showed you how to connect to Cosmos DB in SSIS and integrate data without any coding, saving you time and effort. We encourage you to download the Cosmos DB Connector for SSIS and see for yourself how easy it is to use.

If you have any questions, feel free to contact the ZappySys support team. You can also open a live chat immediately by clicking the chat icon below.

Download Cosmos DB Connector for SSIS | Documentation

More integrations

Other connectors for SSIS
