ElasticSearch Connector for SSIS

In this article you will learn how to quickly and efficiently integrate ElasticSearch data in SSIS without coding. We will use the high-performance ElasticSearch Connector to easily connect to ElasticSearch and then access the data inside SSIS.

Read and write ElasticSearch data inside your app and perform many ElasticSearch operations without coding, using the easy-to-use, high-performance API Connector for ElasticSearch.

Let's follow the steps below to see how we can accomplish that!


Video Tutorial - Integrate ElasticSearch data in SSIS

This video covers the following topics and more, so watch carefully. After watching the video, follow the steps described in this article.

  • How to download and install the required driver for ElasticSearch integration in SSIS
  • How to configure the connection for ElasticSearch
  • Features of the API Source (Authentication / Query Language / Examples / Driver UI)
  • Using the ElasticSearch connection in SSIS

Prerequisites

Before we begin, make sure the following prerequisites are met:

  1. SSIS designer installed. It is sometimes referred to as BIDS or SSDT (download it from Microsoft).
  2. Basic knowledge of SSIS package development using Microsoft SQL Server Integration Services.
  3. SSIS PowerPack is installed (if you are new to SSIS PowerPack, then get started!).

Read data from ElasticSearch in SSIS (Export data)

In this section we will learn how to configure and use the ElasticSearch Connector in the API Source to extract data from ElasticSearch.

  1. Begin by opening Visual Studio and creating a new project.

  2. Select Integration Services Project, and in the new project window set the appropriate name and location for the project. Then click OK.

  3. In the new SSIS project screen you will find the following:

    1. SSIS Toolbox on the left sidebar
    2. Solution Explorer and Properties window on the right
    3. Control Flow, Data Flow, Event Handlers, and Package Explorer tabs
    4. Connection Managers window at the bottom

    [Image: SSIS project screen]
    Note: If you don't see the ZappySys SSIS PowerPack tasks or components in the SSIS Toolbox, please refer to this help link.
  4. Now drag and drop a Data Flow Task from the SSIS Toolbox. Double-click on the Data Flow Task to open the Data Flow designer.

    [Image: SSIS Data Flow Task - Drag and Drop]
  5. From the SSIS Toolbox drag and drop API Source (Predefined Templates) onto the Data Flow designer surface, and double-click on it to edit it:
    [Image: SSIS API Source (Predefined Templates) - Drag and Drop]

  6. Select New Connection to create a new connection:
    [Image: API Source - New Connection]

  7. Use a preinstalled ElasticSearch Connector from the Popular Connector List, or press the Search Online radio button to download the ElasticSearch Connector. Once downloaded, simply use it in the configuration:

    [Image: ElasticSearch Connector selection]

  8. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is the right one). Finally, fill in all the required parameters and set optional parameters if needed. You may click the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section.

    Steps to get and use ElasticSearch credentials

    For a local instance, or an instance hosted by you:

    1. Get your user ID and password and enter them in the connection UI.

    For a managed instance (hosted by Bonsai):

    If your instance is hosted by Bonsai, perform these steps to get your credentials for the API call:
    1. Go to https://app.bonsai.io/clusters/{your-instance-id}/tokens
    2. Copy the Access Key and Access Secret and enter them in the connection UI. Click Test Connection.
    3. If your cluster has no data, you can generate sample data by visiting this URL and clicking Add Sample Data: https://{your-cluster-id}.apps.bonsaisearch.net/app/home#/tutorial_directory

    [Image: ZappySys HTTP connection - configuring authentication parameters]

    Basic Authentication (UserId/Password) [Http]
    API Base URL: http://localhost:9200
    Optional parameters: User Name (or Access Key), Password (or Access Secret), Ignore certificate related errors

    Windows Authentication (No Password) [Http]
    API Base URL: http://localhost:9200
    Optional parameters: Ignore certificate related errors

  9. Select the desired endpoint, change or pass the property values, and click the Preview Data button to make the API call.

    [Image: API Source - Select Endpoint]

  10. That's it, we are done! In just a few clicks we configured the call to ElasticSearch using the ElasticSearch Connector.

    You can load the source data into your desired destination using the Upsert Destination, which supports SQL Server, PostgreSQL, and Amazon Redshift. We also offer other destinations such as CSV, Excel, Azure Table, Salesforce, and more; check out our SSIS PowerPack tasks and components for more options. (In the screenshot below, data is loaded into a Trash Destination.) A minimal REST sketch of the same read operation follows these steps.

    [Image: Execute Package - Reading data from ElasticSearch and loading into the target]
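
To make the steps above concrete, here is a minimal sketch of the kind of raw REST call the API Source issues when it reads documents from an index. This is an illustration only, not the connector's implementation; the index name my-index, the localhost URL, and the credentials are assumptions based on the defaults shown earlier.

    import requests  # third-party HTTP library (pip install requests)

    BASE_URL = "http://localhost:9200"    # API Base URL from the connection UI (assumed default)
    INDEX = "my-index"                    # hypothetical index name

    # Basic Authentication (UserId/Password), as configured in the connection UI.
    response = requests.get(
        f"{BASE_URL}/{INDEX}/_search",
        params={"size": 100},             # page size for this sketch; the connector handles paging
        auth=("my-user", "my-password"),  # hypothetical credentials
    )
    response.raise_for_status()

    # Each hit's _source is one document, similar to a row output by the API Source.
    for hit in response.json()["hits"]["hits"]:
        print(hit["_id"], hit["_source"])

If your server uses HTTPS with a self-signed certificate, passing verify=False to requests.get mirrors the Ignore certificate related errors option (use it with care, and only against trusted hosts).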

Write data to ElasticSearch using SSIS (Import data)

In this section we will learn how to configure and use the ElasticSearch Connector in the API Destination to write data to ElasticSearch.

Video tutorial

This video covers the following topics and more, so watch carefully. After watching the video, follow the steps described in this article.

  • How to download SSIS PowerPack for ElasticSearch integration in SSIS
  • How to configure the connection for ElasticSearch
  • How to write or look up data in ElasticSearch
  • Features of the SSIS API Destination
  • Using the ElasticSearch Connector in SSIS

Step-by-step instructions

In the previous section we learned how to read data; now we will learn how to configure the ElasticSearch Connector in the API Destination to POST data to ElasticSearch.

  1. Read the data from a source, which can be any desired source component. In this example we will use the ZappySys Dummy Data Source component.

  2. From the SSIS Toolbox drag and drop API Destination (Predefined Templates) onto the Data Flow designer surface, connect the source component to it, and double-click on it to edit it:
    [Image: SSIS API Destination (Predefined Templates) - Drag and Drop]

  3. Select New Connection to create a new connection:

    [Image: API Destination - New Connection]

  4. Use a preinstalled ElasticSearch Connector from the Popular Connector List, or press the Search Online radio button to download the ElasticSearch Connector. Once downloaded, simply use it in the configuration:

    [Image: ElasticSearch Connector selection]

  5. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is the right one). Finally, fill in all the required parameters and set optional parameters if needed. You may click the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section.

    Steps to get and use ElasticSearch credentials

    For a local instance, or an instance hosted by you:

    1. Get your user ID and password and enter them in the connection UI.

    For a managed instance (hosted by Bonsai):

    If your instance is hosted by Bonsai, perform these steps to get your credentials for the API call:
    1. Go to https://app.bonsai.io/clusters/{your-instance-id}/tokens
    2. Copy the Access Key and Access Secret and enter them in the connection UI. Click Test Connection.
    3. If your cluster has no data, you can generate sample data by visiting this URL and clicking Add Sample Data: https://{your-cluster-id}.apps.bonsaisearch.net/app/home#/tutorial_directory

    [Image: ZappySys HTTP connection - configuring authentication parameters]

    Basic Authentication (UserId/Password) [Http]
    API Base URL: http://localhost:9200
    Optional parameters: User Name (or Access Key), Password (or Access Secret), Ignore certificate related errors

    Windows Authentication (No Password) [Http]
    API Base URL: http://localhost:9200
    Optional parameters: Ignore certificate related errors

  6. Select the desired endpoint, change or pass the property values, and go to the Mappings tab to map the columns.

    [Image: API Destination - Select Endpoint]

  7. Finally, map the desired columns:

    [Image: API Destination - Columns Mapping]

  8. That's it; we successfully configured the POST API call. In just a few clicks we configured the ElasticSearch API call using the ZappySys ElasticSearch Connector. A minimal REST sketch of an equivalent bulk write follows these steps.

    [Image: Execute Package - Reading data from the API Source and loading into the target]
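
As with reading, the API Destination ultimately issues REST calls against ElasticSearch. Below is a minimal sketch of an equivalent raw bulk insert using ElasticSearch's standard _bulk endpoint; the index name, credentials, and sample rows are assumptions, and the connector's actual request format may differ.

    import json
    import requests  # pip install requests

    BASE_URL = "http://localhost:9200"    # assumed default Base URL
    INDEX = "my-index"                    # hypothetical index name

    rows = [
        {"_id": "1", "name": "TV", "comment": "50 inch screen"},
        {"_id": "2", "name": "Radio", "comment": "portable"},
    ]

    # The _bulk API expects newline-delimited JSON: an action line followed by
    # a document line for each record.
    lines = []
    for row in rows:
        doc = {k: v for k, v in row.items() if k != "_id"}
        lines.append(json.dumps({"index": {"_index": INDEX, "_id": row["_id"]}}))
        lines.append(json.dumps(doc))
    body = "\n".join(lines) + "\n"        # the payload must end with a newline

    response = requests.post(
        f"{BASE_URL}/_bulk",
        data=body,
        headers={"Content-Type": "application/x-ndjson"},
        auth=("my-user", "my-password"),  # hypothetical credentials
    )
    response.raise_for_status()
    print(response.json()["errors"])      # False means every row was written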

Load ElasticSearch data into SQL Server using Upsert Destination (Insert or Update)

Once you have configured the data source, you can load ElasticSearch data into SQL Server using the Upsert Destination.

Upsert Destination can merge or synchronize source data with a target table. It supports Microsoft SQL Server, PostgreSQL, and Redshift databases as targets. Upsert Destination also supports a very fast bulk upsert operation along with bulk delete.

An upsert operation is a database operation that performs an INSERT or UPDATE SQL command based on whether the record already exists in the target table: records that have no match on the key columns are inserted, and records that do match are updated.

Upsert Destination supports INSERT, UPDATE, and DELETE operations, so it is similar to SQL Server's MERGE command, except that it can be used directly in an SSIS package. A minimal sketch of the insert-or-update logic appears below.
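
To illustrate the insert-or-update decision described above, here is a minimal Python sketch using the standard-library sqlite3 module. It only demonstrates the semantics; it is not how Upsert Destination is implemented, and the table and column names are made up for the example (a real package would target SQL Server, PostgreSQL, or Redshift instead).

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE docs (id TEXT PRIMARY KEY, name TEXT, comment TEXT)")

    def upsert(row):
        # Check the record's existence by the key column, then UPDATE or INSERT:
        # the same decision Upsert Destination makes for every incoming row.
        exists = conn.execute("SELECT 1 FROM docs WHERE id = ?", (row["id"],)).fetchone()
        if exists:
            conn.execute("UPDATE docs SET name = ?, comment = ? WHERE id = ?",
                         (row["name"], row["comment"], row["id"]))
        else:
            conn.execute("INSERT INTO docs (id, name, comment) VALUES (?, ?, ?)",
                         (row["id"], row["name"], row["comment"]))

    upsert({"id": "1", "name": "TV", "comment": "new"})      # no match on key: inserted
    upsert({"id": "1", "name": "TV", "comment": "edited"})   # key matches: updated, not duplicated
    print(conn.execute("SELECT * FROM docs").fetchall())     # [('1', 'TV', 'edited')]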

  1. From the SSIS Toolbox drag and drop the Upsert Destination component onto the Data Flow designer background.

  2. Connect your SSIS source component to Upsert Destination.

  3. Double-click on the Upsert Destination component to open the configuration window.

  4. Start by selecting the Action from the list.

  5. Next, select the desired target connection, or create one by clicking the <New [provider] Connection> menu item in the Target Connection dropdown.

  6. Then select a table from the Target Table list, or click the New button to create a new table based on the source columns.

  7. Continue by checking Insert and Update options according to your scenario (e.g. if Update option is unchecked, no updates will be made).

  8. Finally, click the Map All button to map all columns, and then select the Key columns to match on:

    [Image: Configure the SSIS Upsert Destination component to merge data with a SQL Server, PostgreSQL, or Redshift table]
  9. Click OK to save the configuration.

  10. Run the package and ElasticSearch data will be merged with the target table in SQL Server, PostgreSQL, or Redshift:

    [Image: Execute Package - Reading data from the API Source and loading into the target]
  11. Done!

Deploy and schedule SSIS package

After you are done creating the SSIS package, you will most likely want to deploy it to the SSIS Catalog on SQL Server and run it periodically. Just follow the instructions in this article.

Running SSIS package in Azure Data Factory (ADF)

To use SSIS PowerPack in ADF, you must first prepare the Azure-SSIS Integration Runtime. Follow this link for detailed instructions.

Advanced topics

Actions supported by ElasticSearch Connector

The ElasticSearch Connector supports the following actions for REST API integration. If an action you need is not listed below, you can easily edit the connector file and enhance the out-of-the-box functionality.
 Create Index
Creates a new index.    [ Read more... ]
Parameters: New Index Name

 Delete Index
Deletes an existing index.    [ Read more... ]
Parameters: Index to delete

 List indexes
Lists indexes.    [ Read more... ]

 List aliases
Lists aliases.    [ Read more... ]

 Get Index or Alias metadata
Gets index or alias metadata.    [ Read more... ]

 Get documents from Index or Alias
Gets documents from an index or alias.    [ Read more... ]

 Get document by ID from Index or Alias
Gets a single document by its ID.    [ Read more... ]
Parameters: Enter Document ID

 Search / Query documents
Gets documents (using the JSON query language).    [ Read more... ]
 Count documents
Counts documents in one or more indices.    [ Read more... ]
Parameters:
  • Index (choose one, enter *, or enter comma-separated names): Enter the index name(s) for which you would like to perform the document count. Enter * (asterisk) to search across all indices, enter a comma-separated list (e.g. myidx1,myidx2), or select one from the populated list.
  • Enter Query (JSON Format). Example values:
      All records: {"match_all": { } }
      Records where comment or name contains the word TV: {"query_string": {"query": "comment:TV OR name:TV"} }
      Records where the comment field exists: {"query_string": {"query": "_exists_:comment"} }
A minimal REST sketch of a count call follows this entry.
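
For reference, here is a minimal sketch of the raw REST call behind a document count, using one of the query examples above. The connector builds this call for you; the index names and credentials here are assumptions.

    import requests  # pip install requests

    BASE_URL = "http://localhost:9200"       # assumed default Base URL

    # Count documents where comment or name contains the word TV
    # (the query_string example from the table above, wrapped in a "query" object).
    body = {"query": {"query_string": {"query": "comment:TV OR name:TV"}}}

    response = requests.get(
        f"{BASE_URL}/myidx1,myidx2/_count",  # comma-separated index list, or * for all indices
        json=body,
        auth=("my-user", "my-password"),     # hypothetical credentials
    )
    response.raise_for_status()
    print(response.json()["count"])
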
 Insert documents
Inserts documents.    [ Read more... ]

 Upsert documents
Inserts or updates (upserts) documents. If the _id column is not supplied, this acts like an INSERT call. If the _id column is supplied and a matching document is found, an UPDATE happens; otherwise an INSERT. Look at the Result column in the output to see whether the document was created or updated.    [ Read more... ]
A minimal REST sketch of a document upsert follows this entry.
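
As an illustration of these upsert semantics at the REST level (the connector's exact call may differ, and all names below are assumptions), ElasticSearch's _update endpoint with doc_as_upsert updates the document when the _id exists and creates it otherwise:

    import requests  # pip install requests

    BASE_URL = "http://localhost:9200"    # assumed default Base URL
    INDEX = "my-index"                    # hypothetical index name
    DOC_ID = "1"                          # value of the _id column

    response = requests.post(
        f"{BASE_URL}/{INDEX}/_update/{DOC_ID}",
        json={
            "doc": {"name": "TV", "comment": "55 inch"},
            "doc_as_upsert": True,        # insert the document if _id "1" does not exist
        },
        auth=("my-user", "my-password"),  # hypothetical credentials
    )
    response.raise_for_status()
    print(response.json()["result"])      # "created" on insert, "updated" on update
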
 Update documents
Updates documents.    [ Read more... ]

 Delete documents
Deletes documents.    [ Read more... ]
Parameters: Index
 Generic Request
This is a generic endpoint. Use this endpoint when some action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. Most parameters are optional except the URL.    [ Read more... ]
Parameters:

  • Url: The API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, then the domain name must be part of the ServiceURL or part of TrustedDomains.

  • Body: The request body content goes here.

  • IsMultiPart: Set this option if you want to upload file(s) (i.e. POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data). A multi-part request allows you to mix key/value pairs and file uploads in the same request; raw upload, on the other hand, allows only a single file upload (without any key/value pairs).

    Raw upload (Content-Type: application/octet-stream): To upload a single file in raw mode, check this option and specify the full file path starting with the @ sign in the Body (e.g. @c:\data\myfile.zip).

    Form-data / multipart upload (Content-Type: multipart/form-data): To treat your request data as multi-part fields, specify key/value pairs separated by new lines in the Body. Each key/value pair goes on its own line, with the key and value separated by an equal sign (=). Leading and trailing spaces are ignored, as are blank lines. If a field value contains special characters, use escape sequences (e.g. for newline: \r\n, for tab: \t, for the at sign (@): \@). When a field value starts with an at sign (@), it is automatically treated as a file to upload. By default the file content type is determined from the file extension, but you can supply a content type manually for any field like this: YourFileFieldName.Content-Type=some-content-type. A file upload field always includes a Content-Type in the request by default (non-file fields do not, unless you supply one manually). If you do not want a Content-Type header for a field, supply a blank Content-Type to exclude the header altogether (e.g. SomeFieldName.Content-Type=). If an API requires Content-Type: multipart/mixed rather than multipart/form-data, manually set the request header Content-Type: multipart/mixed (the value must start with multipart/ or it will be ignored). In the example below, a Content-Type is supplied for file2 and SomeField1; all other fields use the default content type:

      file1=@c:\data\Myfile1.txt
      file2=@c:\data\Myfile2.json
      file2.Content-Type=application/json
      SomeField1=aaaaaaa
      SomeField1.Content-Type=text/plain
      SomeField2=12345
      SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
      SomeFieldStartingWithAtSign=\@MyTwitterHandle

  • Filter: Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document and find the hierarchy you would like to extract. Example values:

      No filter
      Example 1: $.store.books[*]
      Example 2 (sections under books): $.store.books[*].sections[*]
      Example 3 (equals): $.store.books[?(@author=='sam')]
      Example 4 (equals, any section): $..[?(@author=='sam')]
      Example 5 (not equals, any section): $..[?(@author!='sam')]
      Example 6 (number less than): $.store.books[?(@.price<10)]
      Example 7 (regular expression, contains pattern): $.store.books[?(@author=~ /sam|bob/ )]
      Example 8 (regular expression, does not contain pattern): $.store.books[?(@author=~ /^((?!sam|bob).)*$/ )]
      Example 9 (regular expression, exact pattern match): $.store.books[?(@author=~ /^sam|bob$/ )]
      Example 10 (regular expression, starts with): $.store.books[?(@author=~ /^sam/ )]
      Example 11 (regular expression, ends with): $.store.books[?(@author=~ /sam$/ )]
      Example 12 (between): $.store.employees[?( @.hiredate>'2015-01-01' && @.hiredate<'2015-01-04' )]

  • Headers: Headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.

A minimal REST sketch of a generic request follows this entry.
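
Because the Generic Request endpoint is essentially a pass-through HTTP call, an equivalent raw request is easy to sketch. The partial URL _cluster/health below is only an example of an action the predefined endpoints do not cover, and the credentials are assumptions.

    import requests  # pip install requests

    BASE_URL = "http://localhost:9200"    # assumed default Base URL

    # Url parameter: a partial URL relative to the Base URL.
    response = requests.get(
        f"{BASE_URL}/_cluster/health",
        headers={"Accept": "application/json"},
        auth=("my-user", "my-password"),  # hypothetical credentials
    )
    response.raise_for_status()
    print(response.json()["status"])      # e.g. "green", "yellow", or "red"
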
 Generic Request (Bulk Write)
This is a generic endpoint for bulk write purposes. Use this endpoint when some action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. Most parameters are optional except the URL.    [ Read more... ]
Parameters: Url, IsMultiPart, Filter, and Headers, as described for the Generic Request endpoint above.

Conclusion

In this article we showed you how to connect to ElasticSearch in SSIS and integrate data without any coding, saving you time and effort. We encourage you to download the ElasticSearch Connector for SSIS and see how easy it is to use for yourself or your team.

If you have any questions, feel free to contact ZappySys support team. You can also open a live chat immediately by clicking on the chat icon below.

