Reference

Endpoint: Make Generic API Request


Name

generic_request

Description

This is a generic endpoint. Use it when an action you need is not implemented by the connector. Enter a partial URL (required) along with Body, Method, Headers, etc. All parameters except URL are optional.

Related Tables

Generic Table (Bulk Read / Write)

Parameters

Parameter Required Options
Name: Url

Label: Url

API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of ServiceURL or TrustedDomains.
YES
Name: Body

Label: Body

Request Body content goes here
Name: IsMultiPart

Label: IsMultiPart

Set this option if you want to upload file(s) using either raw file data (i.e., POST raw file data) or the multi-part encoding method (i.e., Content-Type: multipart/form-data). A multi-part request allows you to mix key/value pairs and upload files in the same request. Raw upload, on the other hand, allows only a single file to be uploaded (without any key/value data).

==== Raw Upload (Content-Type: application/octet-stream) ====
To upload a single file in raw mode, check this option and specify the full file path, starting with the @ sign, in the Body (e.g., @c:\data\myfile.zip).

==== Form-Data / Multipart Upload (Content-Type: multipart/form-data) ====
To treat your request data as multi-part fields, specify key/value pairs in the RequestData field (i.e., Body). Enter each key/value pair on its own line, with key and value separated by an equal sign (=). Leading and trailing spaces are ignored, and blank lines are ignored. If a field value contains any special character(s), use escape sequences (e.g., for NewLine: \r\n, for Tab: \t, for at (@): \@). When the value of a field starts with the at sign (@), it is automatically treated as a file you want to upload. By default, the file content type is determined from the file extension; however, you can supply a content type manually for any field using this format: [YourFileFieldName.Content-Type=some-content-type]. By default, file upload fields always include Content-Type in the request (non-file fields do not include Content-Type unless you supply it manually). If, for some reason, you don't want the Content-Type header on a field, supply a blank Content-Type to exclude the header altogether (e.g., SomeFieldName.Content-Type=). In the example below, we have supplied Content-Type for file2 and SomeField1; all other fields use the default content type.

The example below uploads multiple files along with additional fields. If an API requires a multipart content type other than multipart/form-data (e.g., multipart/mixed), manually set Request Header => Content-Type: multipart/mixed (the value must start with multipart/ or it will be ignored).

file1=@c:\data\Myfile1.txt
file2=@c:\data\Myfile2.json
file2.Content-Type=application/json
SomeField1=aaaaaaa
SomeField1.Content-Type=text/plain
SomeField2=12345
SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
SomeFieldStartingWithAtSign=\@MyTwitterHandle
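The multipart key/value Body format described above can be sketched in Python. This is a hypothetical parser written for illustration only, not the connector's actual implementation:

```python
# Hypothetical sketch of how the multipart Body format could be parsed.
# NOT the connector's real parser; it only illustrates the rules above:
# key=value per line, @-prefix means file, .Content-Type suffix overrides,
# and \r\n, \t, \@ are escape sequences.

def parse_multipart_body(body: str):
    """Split key=value lines into (fields, files, content_types)."""
    fields, files, content_types = {}, {}, {}
    for line in body.splitlines():
        line = line.strip()          # leading/trailing spaces are ignored
        if not line:
            continue                 # blank lines are ignored
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        if key.endswith(".Content-Type"):
            # e.g. file2.Content-Type=application/json
            content_types[key[: -len(".Content-Type")]] = value
        elif value.startswith("@"):
            # @-prefixed values are treated as file paths to upload
            files[key] = value[1:]
        else:
            # unescape the documented sequences
            value = (value.replace("\\r\\n", "\r\n")
                          .replace("\\t", "\t")
                          .replace("\\@", "@"))
            fields[key] = value
    return fields, files, content_types

fields, files, ctypes = parse_multipart_body(
    "file1=@c:\\data\\Myfile1.txt\n"
    "file2=@c:\\data\\Myfile2.json\n"
    "file2.Content-Type=application/json\n"
    "SomeField1=aaaaaaa\n"
    "SomeFieldStartingWithAtSign=\\@MyTwitterHandle\n"
)
```

Note that a value starting with the escape `\@` never matches the file-upload rule, because the raw text begins with a backslash; the escape is resolved only for plain fields.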
Name: Filter

Label: Filter

Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document to find the hierarchy you want to extract.
Option Value
No filter
Example1 $.store.books[*]
Example2 (Sections Under Books) $.store.books[*].sections[*]
Example3 (Equals) $.store.books[?(@.author=='sam')]
Example4 (Equals - Any Section) $..[?(@.author=='sam')]
Example5 (Not Equals - Any Section) $..[?(@.author!='sam')]
Example6 (Number less than) $.store.books[?(@.price<10)]
Example7 (Regular Expression - Contains Pattern) $.store.books[?(@.author=~ /sam|bob/ )]
Example8 (Regular Expression - Does Not Contain Pattern) $.store.books[?(@.author=~ /^((?!sam|bob).)*$/ )]
Example9 (Regular Expression - Exact Pattern Match) $.store.books[?(@.author=~ /^(sam|bob)$/ )]
Example10 (Regular Expression - Starts With) $.store.books[?(@.author=~ /^sam/ )]
Example11 (Regular Expression - Ends With) $.store.books[?(@.author=~ /sam$/ )]
Example11 (Regular Expression - Ends With) $.store.books[?(@author=~ /sam$/ )]
Example12 (Between) $.store.employees[?( @.hiredate>'2015-01-01' && @.hiredate<'2015-01-04' )]
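The JSONPath filters above can be hard to visualize. As a plain-Python illustration (not the connector's JSONPath engine), here is what a few of them select from a made-up response document:

```python
# Illustration only: what some of the filters above select from a
# sample response document (the data below is made up).
response = {
    "store": {
        "books": [
            {"author": "sam", "price": 8,
             "sections": [{"title": "Intro"}]},
            {"author": "bob", "price": 12,
             "sections": [{"title": "Basics"}, {"title": "Advanced"}]},
        ]
    }
}

# $.store.books[*]  -> every element of the books array
books = response["store"]["books"]

# $.store.books[*].sections[*]  -> sections flattened across all books
sections = [s for b in books for s in b["sections"]]

# $.store.books[?(@.price<10)]  -> books filtered by a predicate
cheap = [b for b in books if b["price"] < 10]
```

Each row the connector emits corresponds to one element of the selected array, so picking the right hierarchy determines the shape of your result set.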
Name: Headers

Label: Headers

Headers for the request. To enter multiple headers, use a double pipe (||) or a new line after each {header-name}:{value} pair.
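The header syntax can be sketched in Python. This is a hypothetical parser for illustration, not the connector's actual code:

```python
import re

# Hypothetical sketch: split a Headers value on double pipes or new lines
# into {header-name}: {value} pairs (illustration only).
def parse_headers(raw: str) -> dict:
    headers = {}
    for part in re.split(r"\|\||\r?\n", raw):
        part = part.strip()
        if not part:
            continue
        name, _, value = part.partition(":")
        headers[name.strip()] = value.strip()
    return headers

hdrs = parse_headers("Accept: */* || Cache-Control: no-cache")
```

The same input could equally be written on two lines, one `{header-name}:{value}` pair per line.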

Output Columns

Label Data Type (SSIS) Data Type (SQL) Length Description
There are no Static columns defined for this endpoint. This endpoint detects columns dynamically at runtime.

Input Columns

Label Data Type (SSIS) Data Type (SQL) Length Description
There are no Static columns defined for this endpoint. This endpoint detects columns dynamically at runtime.

Examples

SSIS

Use the OneDrive Connector in the API Source or API Destination SSIS Data Flow components to read or write data.

API Source

This endpoint belongs to the Generic Table (Bulk Read / Write) table; it is therefore better to use that table instead of accessing the endpoint directly:

API Source - OneDrive
OneDrive Connector can be used to integrate OneDrive with your defined data source, e.g. Microsoft SQL, Oracle, Excel, Power BI, etc. Get, write, and delete OneDrive data in a few clicks!
OneDrive
Generic Table (Bulk Read / Write)
Required Parameters
Url Fill-in the parameter...
Request Method Fill-in the parameter...
Optional Parameters
Body
IsMultiPart
Filter
ExcludedProperties
Encoding
CharacterSet
EnableCustomReplace
SearchFor
ReplaceWith
JSON - Flatten Small Array (Not preferred for more than 10 items)
JSON - Max Array Items To Flatten 10
JSON - Array Transform Type
JSON - Array Transform Column Name Filter
JSON - Array Transform Row Value Filter
JSON - Array Transform Enable Custom Columns
EnablePivot
FileCompressionType
DateFormatString
Request Format (Content-Type) ApplicationJson
Response Format Default
Headers Accept: */* || Cache-Control: no-cache
Pagination - Mode
Pagination - Attribute Name
Pagination - Increment By 1
Pagination - Expression for Next URL
Pagination - Wait time after each request 0
Csv - Column Delimiter ,
Csv - Has Header Row True
Xml - ElementsToTreatAsArray
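The pagination parameters listed above (Pagination - Mode, Attribute Name, Increment By, Wait time) drive a repeat-until-empty request loop. A minimal sketch of the increment-by behavior, with hypothetical names and made-up paged data (not the connector's implementation):

```python
# Rough illustration of "increment by" pagination: repeat the request,
# bumping a page attribute each time, until a page comes back empty.
# fetch_page stands in for one HTTP call; names are hypothetical.
def fetch_all(fetch_page, start=1, increment_by=1):
    page, rows = start, []
    while True:
        batch = fetch_page(page)
        if not batch:       # empty page -> no more data
            break
        rows.extend(batch)
        page += increment_by
    return rows

# hypothetical paged source: three pages of data
data = {1: ["a", "b"], 2: ["c", "d"], 3: ["e"]}
rows = fetch_all(lambda p: data.get(p, []))
```

"Pagination - Wait time after each request" would simply add a sleep between iterations of such a loop, which matters for rate-limited APIs.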
SSIS API Source - Read from table or endpoint

API Destination

This endpoint belongs to the Generic Table (Bulk Read / Write) table; it is therefore better to use that table instead of accessing the endpoint directly. Use this table and table-operation pair to make a generic API request:

API Destination - OneDrive
OneDrive Connector can be used to integrate OneDrive with your defined data source, e.g. Microsoft SQL, Oracle, Excel, Power BI, etc. Get, write, and delete OneDrive data in a few clicks!
OneDrive
Generic Table (Bulk Read / Write)
Select
Required Parameters
Url Fill-in the parameter...
Request Method Fill-in the parameter...
Optional Parameters
Body
IsMultiPart
Filter
ExcludedProperties
Encoding
CharacterSet
EnableCustomReplace
SearchFor
ReplaceWith
JSON - Flatten Small Array (Not preferred for more than 10 items)
JSON - Max Array Items To Flatten 10
JSON - Array Transform Type
JSON - Array Transform Column Name Filter
JSON - Array Transform Row Value Filter
JSON - Array Transform Enable Custom Columns
EnablePivot
FileCompressionType
DateFormatString
Request Format (Content-Type) ApplicationJson
Response Format Default
Headers Accept: */* || Cache-Control: no-cache
Pagination - Mode
Pagination - Attribute Name
Pagination - Increment By 1
Pagination - Expression for Next URL
Pagination - Wait time after each request 0
Csv - Column Delimiter ,
Csv - Has Header Row True
Xml - ElementsToTreatAsArray
SSIS API Destination - Access table operation

ODBC application

Use these SQL queries in your ODBC application data source:

Get __DynamicRequest__

SELECT * FROM __DynamicRequest__

The generic_request endpoint belongs to the __DynamicRequest__ table and can therefore be used via that table.

SQL Server

Use these SQL queries in SQL Server after you create a data source in Data Gateway:

Get __DynamicRequest__

DECLARE @MyQuery NVARCHAR(MAX) = 'SELECT * FROM __DynamicRequest__';

EXEC (@MyQuery) AT [LS_TO_ONEDRIVE_IN_GATEWAY];

The generic_request endpoint belongs to the __DynamicRequest__ table and can therefore be used via that table.