Nativo Connector for SSIS

In this article you will learn how to quickly and efficiently integrate Nativo data in SSIS without coding. We will use the high-performance Nativo Connector to easily connect to Nativo and then access its data inside SSIS.

The Nativo Connector can be used to integrate operations supported by the Nativo REST API.

Let's follow the steps below to see how we can accomplish that!

Download Documentation

Video Tutorial - Integrate Nativo data in SSIS

This video covers the following topics and more, so watch it carefully. After watching the video, follow the steps described in this article.

  • How to download / install required driver for Nativo integration in SSIS
  • How to configure connection for Nativo
  • Features of the API Source (Authentication / Query Language / Examples / Driver UI)
  • Using Nativo Connection in SSIS

Prerequisites

Before we begin, make sure the following prerequisites are met:

  1. SSIS designer installed. It is sometimes referred to as BIDS or SSDT (download it from Microsoft).
  2. Basic knowledge of SSIS package development using Microsoft SQL Server Integration Services.
  3. SSIS PowerPack is installed (if you are new to SSIS PowerPack, then get started!).

Read data from Nativo in SSIS (Export data)

In this section we will learn how to configure and use the Nativo Connector in the API Source to extract data from Nativo.

  1. Begin with opening Visual Studio and Create a New Project.

  2. Select Integration Services Project, and in the new project window set the appropriate name and location for the project. Then click OK.

  3. In the new SSIS project screen you will find the following:

    1. SSIS Toolbox on the left side bar
    2. Solution Explorer and Properties window on the right side bar
    3. Control Flow, Data Flow, Event Handlers, and Package Explorer in tab windows
    4. Connection Managers window at the bottom

    SSIS Project Screen
    Note: If you don't see ZappySys SSIS PowerPack Task or Components in SSIS Toolbox, please refer to this help link.
  4. Now, drag and drop the SSIS Data Flow Task from the SSIS Toolbox. Double-click the Data Flow Task to open the Data Flow designer.

    SSIS Data Flow Task - Drag and Drop
  5. From the SSIS Toolbox, drag and drop the API Source (Predefined Templates) onto the Data Flow designer surface, and double-click it to edit it:
    SSIS API Source (Predefined Templates) - Drag and Drop

  6. Select New Connection to create a new connection:
    API Source - New Connection

  7. Use a preinstalled Nativo Connector from the Popular Connector List or press the Search Online radio button to download the Nativo Connector. Once downloaded, simply use it in the configuration:

    Nativo
    Nativo Connector Selection

  8. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is correct). Finally, fill in all the required parameters and set optional parameters if needed. You may press the Steps to Configure link, which helps set certain parameters. More info is available in the Authentication section.

    Please refer to the API reference link below (external site) for the Http [Http] authentication type:

    https://api-docs.nativo.com/docs/introduction


    Configuring authentication parameters
      Connector:              Nativo
      Authentication Type:    Http [Http]
      API Base URL:           https://api.nativo.com/v2
      Optional Parameters:
        Api Key
        Api Secret
        RetryMode:             RetryWhenStatusCodeMatch
        RetryStatusCodeList:   429
        RetryCountMax:         5
        RetryMultiplyWaitTime: True
      Connection name:        ZappySys Http Connection
    (A Python sketch of this retry behavior appears right after these steps.)

  9. Select the desired endpoint, change/pass the property values, and click the Preview Data button to make the API call.

    API Source - Nativo
    The Nativo Connector can be used to integrate operations supported by the Nativo REST API.
    API Source - Select Endpoint

  10. That's it, we are done! In just a few clicks we configured the call to Nativo using the Nativo Connector.

    You can load the source data into your desired destination using the Upsert Destination, which supports SQL Server, PostgreSQL, and Amazon Redshift. We also offer other destinations such as CSV, Excel, Azure Table, Salesforce, and more. You can check out our SSIS PowerPack Tasks and components for more options. (*in this example the data is loaded into a Trash Destination)

    Execute Package - Reading data from Nativo and load into target
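The retry settings shown in the connection configuration above (RetryMode=RetryWhenStatusCodeMatch, RetryStatusCodeList=429, RetryCountMax=5, RetryMultiplyWaitTime=True) tell the connection to retry throttled requests with an increasing wait time. Below is a minimal Python sketch of that behavior, assuming the requests package; the /campaigns path and the X-Api-Key / X-Api-Secret header names are placeholders only, not the documented Nativo authentication scheme.

    import time
    import requests

    BASE_URL = "https://api.nativo.com/v2"   # API Base URL from the connection settings

    def get_with_retry(path, headers=None, max_retries=5):
        """Retry on HTTP 429, multiplying the wait time on each attempt
        (mirrors RetryWhenStatusCodeMatch / RetryCountMax=5 / RetryMultiplyWaitTime=True)."""
        wait = 1  # seconds to wait before the next retry
        for attempt in range(max_retries + 1):
            resp = requests.get(BASE_URL + path, headers=headers, timeout=30)
            if resp.status_code != 429:
                resp.raise_for_status()
                return resp.json()
            time.sleep(wait)
            wait *= 2  # multiply the wait time on each retry
        raise RuntimeError("Request kept getting throttled (HTTP 429)")

    # Hypothetical usage -- header names and endpoint path are placeholders only:
    # data = get_with_retry("/campaigns", headers={"X-Api-Key": "...", "X-Api-Secret": "..."})

The connector handles all of this for you; the sketch only illustrates what those four retry settings mean.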

Write data to Nativo using SSIS (Import data)

In this section we will learn how to configure and use the Nativo Connector in the API Destination to write data to Nativo.

Video tutorial

This video covers the following topics and more, so watch it carefully. After watching the video, follow the steps described in this article.

  • How to download SSIS PowerPack for Nativo integration in SSIS
  • How to configure connection for Nativo
  • How to write or lookup data to Nativo
  • Features of the SSIS API Destination
  • Using Nativo Connector in SSIS

Step-by-step instructions

In the previous section we learned how to read data; in this section we will learn how to configure the Nativo Connector in the API Destination to POST data to Nativo.

  1. Read the data from the source, which can be any desired source component. In this example we will use the ZappySys Dummy Data Source component.

  2. From the SSIS Toolbox, drag and drop the API Destination (Predefined Templates) onto the Data Flow designer surface, connect the source component to it, and double-click it to edit it.
    SSIS API Destination (Predefined Templates) - Drag and Drop

  3. Select New Connection to create a new connection:

    API Destination - Nativo
    The Nativo Connector can be used to integrate operations supported by the Nativo REST API.
    API Destination - New Connection

  4. Use a preinstalled Nativo Connector from the Popular Connector List or press the Search Online radio button to download the Nativo Connector. Once downloaded, simply use it in the configuration:

    Nativo
    Nativo Connector Selection

  5. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is correct). Finally, fill in all the required parameters and set optional parameters if needed. You may press the Steps to Configure link, which helps set certain parameters. More info is available in the Authentication section.

    Please refer to the API reference link below (external site) for the Http [Http] authentication type:

    https://api-docs.nativo.com/docs/introduction


    Configuring authentication parameters
      Connector:              Nativo
      Authentication Type:    Http [Http]
      API Base URL:           https://api.nativo.com/v2
      Optional Parameters:
        Api Key
        Api Secret
        RetryMode:             RetryWhenStatusCodeMatch
        RetryStatusCodeList:   429
        RetryCountMax:         5
        RetryMultiplyWaitTime: True
      Connection name:        ZappySys Http Connection

  6. Select the desired endpoint, change/pass the property values, and go to the Mappings tab to map the columns.

    API Destination - Nativo
    The Nativo Connector can be used to integrate operations supported by the Nativo REST API.
    API Destination - Select Endpoint

  7. Finally, map the desired columns:

    API Destination - Nativo
    The Nativo Connector can be used to integrate operations supported by the Nativo REST API.
    API Destination - Columns Mapping

  8. That's it; we successfully configured the POST API call. In a few clicks we configured the Nativo API call using the ZappySys Nativo Connector.

    Execute Package - Reading data from API Source and load into target

Load Nativo data into SQL Server using Upsert Destination (Insert or Update)

Once you have configured the data source, you can load Nativo data into SQL Server using the Upsert Destination.

Upsert Destination can merge or synchronize source data with the target table. It supports Microsoft SQL Server, PostgreSQL, and Redshift databases as targets. Upsert Destination also supports very fast bulk upsert operations along with bulk delete.

An upsert operation is a database operation that performs an INSERT or UPDATE SQL command based on whether a record already exists in the target table: records with no match on the key columns are inserted, and records that do match are updated.

Upsert Destination supports INSERT, UPDATE, and DELETE operations, so it is similar to SQL Server's MERGE command, except that it can be used directly in an SSIS package.
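For reference, the SQL semantics that Upsert Destination provides are roughly those of the MERGE statement below. This is a minimal sketch only, assuming the pyodbc package and a hypothetical dbo.NativoCampaigns table keyed on CampaignId; Upsert Destination itself generates and bulk-executes the equivalent logic for you, so you never have to write this by hand.

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=MyDb;Trusted_Connection=yes"
    )
    cursor = conn.cursor()

    # Insert the row if the key is new, otherwise update it (same idea as Upsert Destination).
    merge_sql = """
    MERGE dbo.NativoCampaigns AS target
    USING (SELECT ? AS CampaignId, ? AS CampaignName) AS source
    ON target.CampaignId = source.CampaignId
    WHEN MATCHED THEN
        UPDATE SET CampaignName = source.CampaignName
    WHEN NOT MATCHED THEN
        INSERT (CampaignId, CampaignName) VALUES (source.CampaignId, source.CampaignName);
    """
    cursor.execute(merge_sql, 1001, "Spring Campaign")
    conn.commit()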

  1. From the SSIS Toolbox, drag and drop the Upsert Destination component onto the Data Flow designer surface.

  2. Connect your SSIS source component to Upsert Destination.

  3. Double-click the Upsert Destination component to open the configuration window.

  4. Start by selecting the Action from the list.

  5. Next, select the desired target connection or create one by clicking <New [provider] Connection> menu item from the Target Connection dropdown.

  6. Then select a table from the Target Table list or click New button to create a new table based on the source columns.

  7. Continue by checking Insert and Update options according to your scenario (e.g. if Update option is unchecked, no updates will be made).

  8. Finally, click Map All button to map all columns and then select the Key columns to match the columns on:

    Configure SSIS Upsert Destination component to merge data with SQL Server, PostgreSQL, or Redshift table
  9. Click OK to save the configuration.

  10. Run the package and Nativo data will be merged with the target table in SQL Server, PostgreSQL, or Redshift:

    Execute Package - Reading data from API Source and load into target
  11. Done!

Deploy and schedule SSIS package

After you are done creating the SSIS package, you will most likely want to deploy it to the SSIS Catalog in SQL Server and run it periodically. Just follow the instructions in this article:

Running SSIS package in Azure Data Factory (ADF)

To use SSIS PowerPack in ADF, you must first prepare the Azure-SSIS Integration Runtime. Follow this link for detailed instructions:

Advanced topics

Actions supported by Nativo Connector

The Nativo Connector supports the following actions for REST API integration. If an action is not listed below, you can easily edit the Connector file and enhance the out-of-the-box functionality.
 Read Campaign Data    [ Read more... ]
   Parameter       Description
   advertiser_id   Return campaigns only for this advertiser.
 Read Advertisers Data
 Read Metrics
 Read DirectCampaign Data
 Read Preferred Campaign Data
 Read Managed Campaign Data
 Read Demand Campaign Data
 Read Auction Campaign Data
 Read Inventory Campaign Data
 Read Performance Campaign Data (Deprecated)
 Generic Request
This is a generic endpoint. Use this endpoint when an action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. All parameters except URL are optional.    [ Read more... ]
Parameter Description
Url The API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of ServiceURL or part of TrustedDomains.
Body Request body content goes here.
IsMultiPart Set this option if you want to upload file(s) (i.e. POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data). A multi-part request lets you mix key/value fields and file uploads in the same request, whereas a raw upload allows only a single file (without any key/value fields).
  Raw Upload (Content-Type: application/octet-stream): to upload a single file in raw mode, check this option and specify the full file path starting with an @ sign in the Body (e.g. @c:\data\myfile.zip).
  Form-Data / Multipart Upload (Content-Type: multipart/form-data): to treat your request data as multi-part fields, enter key/value pairs separated by new lines in the RequestData field (i.e. Body). Each key/value pair goes on its own line, with key and value separated by an equal sign (=). Leading/trailing spaces and blank lines are ignored. If a field value contains special characters, use an escape sequence (e.g. newline: \r\n, tab: \t, at sign (@): \@). When a field value starts with an at sign (@), it is automatically treated as a file to upload. By default the file content type is determined from the extension, but you can supply it manually for any field like this: YourFileFieldName.Content-Type=some-content-type. A file upload field always includes Content-Type in the request (non-file fields do not, unless you supply one manually). If you do not want a Content-Type header for a field at all, supply a blank Content-Type to exclude the header altogether (e.g. SomeFieldName.Content-Type= ). In the example below, Content-Type is supplied for file2 and SomeField1; all other fields use the default. If an API requires a multipart Content-Type other than multipart/form-data, manually set the request header Content-Type: multipart/mixed (it must start with multipart/ or it will be ignored).
  Example of uploading multiple files along with additional fields:
    file1=@c:\data\Myfile1.txt
    file2=@c:\data\Myfile2.json
    file2.Content-Type=application/json
    SomeField1=aaaaaaa
    SomeField1.Content-Type=text/plain
    SomeField2=12345
    SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
    SomeFieldStartingWithAtSign=\@MyTwitterHandle
Filter Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document to find the hierarchy you want to extract.
Option Value
No filter
Example1 $.store.books[*]
Example2 (Sections Under Books) $.store.books[*].sections[*]
Example3 (Equals) $.store.books[?(@author=='sam')]
Example4 (Equals - Any Section) $..[?(@author=='sam')]
Example5 (Not Equals - Any Section) $..[?(@author!='sam')]
Example6 (Number less than) $.store.books[?(@.price<10)]
Example7 (Regular Expression - Contains Pattern) $.store.books[?(@author=~ /sam|bob/ )]
Example8 (Regular Expression - Does Not Contain Pattern) $.store.books[?(@author=~ /^((?!sam|bob).)*$/ )]
Example9 (Regular Expression - Exact Pattern Match) $.store.books[?(@author=~ /^sam|bob$/ )]
Example10 (Regular Expression - Starts With) $.store.books[?(@author=~ /^sam/ )]
Example11 (Regular Expression - Ends With) $.store.books[?(@author=~ /sam$/ )]
Example12 (Between) $.store.employees[?( @.hiredate>'2015-01-01' && @.hiredate<'2015-01-04' )]
Headers Headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.
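To illustrate how the Url and Filter parameters work together, here is a minimal Python sketch: it calls a partial URL relative to the Base URL and then applies a JSONPath filter such as $.rows[*] to pull the array out of the response. It assumes the requests and jsonpath-ng packages; the /campaigns path and the rows property are hypothetical examples, not documented Nativo response shapes.

    import requests
    from jsonpath_ng import parse

    BASE_URL = "https://api.nativo.com/v2"     # Base URL from the connection
    partial_url = "/campaigns"                 # Url parameter (relative to Base URL) -- hypothetical path

    response = requests.get(BASE_URL + partial_url, timeout=30)
    response.raise_for_status()
    document = response.json()

    # Filter parameter: extract the array of interest from the response document.
    rows = [match.value for match in parse("$.rows[*]").find(document)]
    print(f"Extracted {len(rows)} rows")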
 Generic Request (Bulk Write)
This is a generic endpoint for bulk write purposes. Use this endpoint when an action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. All parameters except URL are optional.    [ Read more... ]
Parameter Description
Url The API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of ServiceURL or part of TrustedDomains.
IsMultiPart Set this option if you want to upload file(s) (i.e. POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data). A multi-part request lets you mix key/value fields and file uploads in the same request, whereas a raw upload allows only a single file (without any key/value fields).
  Raw Upload (Content-Type: application/octet-stream): to upload a single file in raw mode, check this option and specify the full file path starting with an @ sign in the Body (e.g. @c:\data\myfile.zip).
  Form-Data / Multipart Upload (Content-Type: multipart/form-data): to treat your request data as multi-part fields, enter key/value pairs separated by new lines in the RequestData field (i.e. Body). Each key/value pair goes on its own line, with key and value separated by an equal sign (=). Leading/trailing spaces and blank lines are ignored. If a field value contains special characters, use an escape sequence (e.g. newline: \r\n, tab: \t, at sign (@): \@). When a field value starts with an at sign (@), it is automatically treated as a file to upload. By default the file content type is determined from the extension, but you can supply it manually for any field like this: YourFileFieldName.Content-Type=some-content-type. A file upload field always includes Content-Type in the request (non-file fields do not, unless you supply one manually). If you do not want a Content-Type header for a field at all, supply a blank Content-Type to exclude the header altogether (e.g. SomeFieldName.Content-Type= ). In the example below, Content-Type is supplied for file2 and SomeField1; all other fields use the default. If an API requires a multipart Content-Type other than multipart/form-data, manually set the request header Content-Type: multipart/mixed (it must start with multipart/ or it will be ignored).
  Example of uploading multiple files along with additional fields:
    file1=@c:\data\Myfile1.txt
    file2=@c:\data\Myfile2.json
    file2.Content-Type=application/json
    SomeField1=aaaaaaa
    SomeField1.Content-Type=text/plain
    SomeField2=12345
    SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
    SomeFieldStartingWithAtSign=\@MyTwitterHandle
Filter Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document to find the hierarchy you want to extract.
Headers Headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.
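The Body convention described above (one key=value pair per line, with @ marking a file path) corresponds to an ordinary multipart/form-data request. As a rough illustration only, here is a Python sketch using the requests package; the upload URL and field names are hypothetical and simply mirror the file1/SomeField1 example from the table.

    import requests

    url = "https://example.com/upload"   # hypothetical endpoint, not a Nativo URL

    # Equivalent of the Body lines:
    #   file1=@c:\data\Myfile1.txt
    #   file2=@c:\data\Myfile2.json
    #   file2.Content-Type=application/json
    #   SomeField1=aaaaaaa
    files = {
        "file1": open(r"c:\data\Myfile1.txt", "rb"),
        "file2": ("Myfile2.json", open(r"c:\data\Myfile2.json", "rb"), "application/json"),
    }
    data = {"SomeField1": "aaaaaaa", "SomeField2": "12345"}

    # requests builds the multipart/form-data body and Content-Type header automatically.
    response = requests.post(url, files=files, data=data)
    response.raise_for_status()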

Conclusion

In this article we showed you how to connect to Nativo in SSIS and integrate data without any coding, saving you time and effort. We encourage you to download the Nativo Connector for SSIS and see for yourself how easy it is to use for you or your team.

If you have any questions, feel free to contact the ZappySys support team. You can also open a live chat immediately by clicking on the chat icon below.

Download Nativo Connector for SSIS Documentation

More integrations

Other connectors for SSIS

Other application integration scenarios for Nativo

  • How to connect Nativo in SSIS?

  • How to get Nativo data in SSIS?

  • How to read Nativo data in SSIS?

  • How to load Nativo data in SSIS?

  • How to import Nativo data in SSIS?

  • How to pull Nativo data in SSIS?

  • How to push data to Nativo in SSIS?

  • How to write data to Nativo in SSIS?

  • How to POST data to Nativo in SSIS?

  • Call Nativo API in SSIS

  • Consume Nativo API in SSIS

  • Nativo SSIS Automate

  • Nativo SSIS Integration

  • Integration Nativo in SSIS

  • Consume real-time Nativo data in SSIS

  • Consume real-time Nativo API data in SSIS

  • Nativo ODBC Driver | ODBC Driver for Nativo | ODBC Nativo Driver | SSIS Nativo Source | SSIS Nativo Destination

  • Connect Nativo in SSIS

  • Load Nativo in SSIS

  • Load Nativo data in SSIS

  • Read Nativo data in SSIS

  • Nativo API Call in SSIS