Amazon Selling Partner (SP-API) Connector for Azure Data Factory (Pipeline)
In this article you will learn how to integrate Amazon Selling Partner (SP-API) data into Azure Data Factory (Pipeline). Using the Amazon Selling Partner (SP-API) Connector you will be able to connect, read, and write data from within Azure Data Factory (Pipeline); follow the steps below to see how to accomplish that. The driver mentioned above is part of ODBC PowerPack, a collection of high-performance ODBC drivers for various API data sources (e.g. REST API, JSON, XML, CSV, Amazon S3 and many more). Using the familiar SQL query language you can make live connections and read/write data from API sources or JSON / XML / CSV files inside SQL Server (T-SQL), your favorite reporting tools (e.g. Power BI, Tableau, Qlik, SSRS, MicroStrategy, Excel, MS Access), or ETL tools (e.g. Informatica, Talend, Pentaho, SSIS). You can also call our drivers from programming languages such as Java, C#, Python, PowerShell, etc. If you are new to ODBC and ZappySys ODBC PowerPack then check the following links to get started.
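For example, once the driver is exposed through a linked server (the Advanced topics section later in this article shows how), live SP-API data can be read with plain T-SQL. This is a minimal sketch, assuming a linked server named [MY_API_SERVICE] as in the later examples:
-- Read Amazon SP-API Orders through a linked server that points at the ZappySys driver / Data Gateway
SELECT TOP 10 * FROM OPENQUERY([MY_API_SERVICE], 'SELECT * FROM Orders')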
Connect to Amazon Selling Partner (SP-API) in other apps
Create ODBC Data Source (DSN) based on ZappySys API Driver
Step-by-step instructions
To get data from Amazon Selling Partner (SP-API) using Azure Data Factory (Pipeline) we first need to create a DSN (Data Source) which will access data from Amazon Selling Partner (SP-API). We will later be able to read data using Azure Data Factory (Pipeline). Perform these steps:
- Install ZappySys ODBC PowerPack.
- Open ODBC Data Sources (x64):
- Create a System Data Source (System DSN) based on ZappySys API Driver.
You should create a System DSN (instead of a User DSN) if the client application is launched under a Windows System Account, e.g. as a Windows Service. If the client application is 32-bit (x86) and runs with a System DSN, use ODBC Data Sources (32-bit) instead of the 64-bit version. A User DSN may be created instead, but then you will not be able to use the connection from Windows Services (or any application running under a Windows System Account).
- When the Configuration window appears, give your data source a name if you haven't done that already, then select "Amazon Selling Partner (SP-API)" from the list of Popular Connectors. If "Amazon Selling Partner (SP-API)" is not present in the list, click "Search Online" and download it, then set the path to the location where you downloaded it. Finally, click Continue >> to proceed with configuring the DSN:
- Now it's time to configure the Connection Manager. Select Authentication Type, e.g. Token Authentication. Then select the API Base URL (in most cases the default one is the right one). More info is available in the Authentication section.
Steps to get Amazon Selling Partner (SP-API) credentials: OAuth (Self Authorize - Private App) [OAuth]
To call the Amazon SP-API you need to register as a developer and create an app to obtain a Client ID / Client Secret, then authorize the app to get a Refresh Token. Perform the following steps (detailed steps can be found in each link below):
- Go to Register as a Private App developer. Approval may take a day or two; you can check the status this way.
- Once your developer account is approved, log in to your account, create a new app, and obtain the Client ID and Client Secret.
- In the very last step, click here to learn how to obtain a Refresh Token (Self-Authorize).
- Copy the Client ID, Client Secret, and Refresh Token and paste them into the Connector UI. The Refresh Token field is found under the grid on the Connector UI; the Client ID and Client Secret are found in the grid.
- For a video tutorial, check this blog post.
Fill in all required parameters and set optional parameters if needed:
Authentication: OAuth (Self Authorize - Private App) [OAuth]
API Base URL: https://sellingpartnerapi-na.amazon.com
Parameters shown in the UI (required and optional):
- ClientId
- ClientSecret
- TokenUrl
- TokenUIMode
- AuthUrl (Do not use for Private App - Self Authorization)
- OrdersApiVersion
- SellerApiVersion
- ShippingApiVersion
- ServicesApiVersion
- FbaApiVersion
- SalesApiVersion
- ReportsApiVersion
- ProductsFeesApiVersion
- ProductPricingApiVersion
- CatalogItemsApiVersion
- VendorOrdersApiVersion
- RetryMode
- RetryStatusCodeList
- RetryCountMax
- RetryMultiplyWaitTime
- Once the data source has been configured, you can preview data. Select the Preview tab and use settings similar to the following to preview data (a sample preview query is also shown after these steps):
- Click OK to finish creating the data source.
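If you prefer typing a query on the Preview tab instead of using the UI settings, a simple query works as well. This is a minimal sketch that reuses the Orders table and the CreatedAfter attribute from the examples later in this article:
-- Preview orders created after the given date
SELECT * FROM Orders
WITH(CreatedAfter='1900-01-01T00:00:00')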
Video instructions
Read data in Azure Data Factory (ADF) from ODBC datasource (Amazon Selling Partner (SP-API))
- To start, press the New button:
- Select the "Azure, Self-Hosted" option:
- Select the "Self-Hosted" option:
- Set a name; we will use "OnPremisesRuntime":
- Download and install Microsoft Integration Runtime.
- Launch Integration Runtime and copy/paste the Authentication Key from the Integration Runtime configuration in the Azure Portal:
- After finishing registering the Integration Runtime node, you should see a similar view:
- Go back to the Azure Portal and finish adding the new Integration Runtime. You should see it was successfully added:
- Go to the Linked services section and create a new Linked service based on ODBC:
- Select the "ODBC" service:
- Configure the new ODBC service. Use the same DSN name we used in the previous step and copy it into the Connection string box:
DSN=AmazonSellingPartner(SP-ApI)DSN
- For the created ODBC service, create an ODBC-based dataset:
- Go to your pipeline and add the Copy data connector into the flow. In the Source section, use the OdbcDataset we created as the source dataset (a sample source query is shown after these steps):
- Then go to the Sink section and select a destination/sink dataset. In this example we use a precreated AzureBlobStorageDataset, which saves data into an Azure Blob:
- Finally, run the pipeline and see data being transferred from OdbcDataset to your destination dataset:
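Instead of copying the whole OdbcDataset, the Copy activity's ODBC source also lets you supply a query. A hedged sketch of such a source query, reusing the Orders table and attributes shown in the examples below:
-- Source query for the Copy activity (executed by the ZappySys ODBC driver)
SELECT * FROM Orders
WITH(
  CreatedAfter='1900-01-01T00:00:00'
, MarketplaceIds='ATVPDKIKX0DER'
)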
Advanced topics
Create Custom Stored Procedure in ZappySys Driver
You can create procedures to encapsulate custom logic and then pass only a handful of parameters, rather than a long SQL statement, to execute your API call.
Steps to create a Custom Stored Procedure in the ZappySys Driver. You can insert placeholders anywhere inside the procedure body. Read more about placeholders here.
- Go to the Custom Objects tab, click the Add button, and select Add Procedure:
- Enter the desired procedure name and click OK:
- Select the created stored procedure, write your desired stored procedure body, and save it. This creates the custom stored procedure in the ZappySys Driver:
Here is an example stored procedure for ZappySys Driver. You can insert Placeholders anywhere inside Procedure Body. Read more about placeholders here
CREATE PROCEDURE [usp_get_orders] @fromdate = '<<yyyy-MM-dd,FUN_TODAY>>' AS SELECT * FROM Orders where OrderDate >= '<@fromdate>';
- That's it. Now go to the Preview tab and execute your stored procedure using the EXEC command. In this example it will extract the orders from the date 1996-01-01 (a call relying on the default parameter value is shown after these steps):
Exec usp_get_orders '1996-01-01';
- Let's generate the SQL Server query code to make the API call using the stored procedure. Go to the Code Generator tab, select SQL Server as the language, and click the Generate button to generate the code.
Since we already created a linked server for this data source, simply copy the SELECT query and use your linked server name in place of the [MY_API_SERVICE] placeholder.
SELECT * FROM OPENQUERY([MY_API_SERVICE], 'EXEC usp_get_orders @fromdate=''1996-07-30''')
- Now go to SQL Server and execute that query. It will make the API call using the stored procedure and return the response.
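Because @fromdate defaults to the <<yyyy-MM-dd,FUN_TODAY>> placeholder, the stored procedure created above can also be executed without any argument, in which case it should fall back to today's date (a small sketch based on the procedure definition shown earlier):
-- Uses the default @fromdate value (today's date via the FUN_TODAY placeholder)
Exec usp_get_orders;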
Create Custom Virtual Table in ZappySys Driver
ZappySys API Drivers support a flexible query language, so you can override default properties configured on the Data Source, such as URL and Body. This way you don't have to create multiple Data Sources if you want to read data from multiple endpoints. However, not every application supports supplying custom SQL to the driver; in those applications you can only select a table from the list returned by the driver.
If you're dealing with Microsoft Access and need to import data from an SQL query, it's important to note that Access doesn't allow direct import of SQL queries. Instead, you can create custom objects (Virtual Tables) to handle the import process.
Many applications like MS Access and Informatica Designer won't give you the option to specify custom SQL when you import objects. In such cases a Virtual Table is very useful. You can create many Virtual Tables on the same Data Source (e.g. if you have 50 URLs with slight variations, you can create virtual tables with just the URL as a parameter setting), as sketched below.
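For example, since a virtual table is just a saved query, you can keep several variants of the same query side by side. A sketch based on the Orders query used in the steps below (the second table name is hypothetical, purely for illustration):
-- Body of vt__usa_orders_only : orders shipped to USA
SELECT * FROM "Orders" WHERE "ShipCountry"='USA'
-- Body of a hypothetical vt__uk_orders_only : orders shipped to UK
SELECT * FROM "Orders" WHERE "ShipCountry"='UK'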
- Go to the Custom Objects tab, click the Add button, and select Add Table:
- Enter the desired table name and click OK:
- It will open a New Query window; click Cancel to close that window and go back to the Custom Objects tab.
- Select the created table, set Text Type to SQL, write your desired SQL query, and save it. This creates the custom table in the ZappySys Driver:
Here is an example SQL query for ZappySys Driver. You can insert Placeholders also. Read more about placeholders here
SELECT "ShipCountry", "OrderID", "CustomerID", "EmployeeID", "OrderDate", "RequiredDate", "ShippedDate", "ShipVia", "Freight", "ShipName", "ShipAddress", "ShipCity", "ShipRegion", "ShipPostalCode" FROM "Orders" Where "ShipCountry"='USA'
- That's it. Now go to the Preview tab and execute your custom virtual table query. In this example it will extract the orders for the USA shipping country only:
SELECT * FROM "vt__usa_orders_only"
- Let's generate the SQL Server query code to make the API call using the virtual table. Go to the Code Generator tab, select SQL Server as the language, and click the Generate button to generate the code.
Since we already created a linked server for this data source, simply copy the SELECT query and use your linked server name in place of the [MY_API_SERVICE] placeholder.
SELECT * FROM OPENQUERY([MY_API_SERVICE], 'SELECT * FROM "vt__usa_orders_only"')
- Now go to SQL Server and execute that query. It will make the API call using the virtual table and return the response.
Actions supported by Amazon Selling Partner (SP-API) Connector
Amazon Selling Partner (SP-API) Connector supports the following actions for REST API integration. If an action you need is not listed below, you can easily edit the Connector file and enhance the out-of-the-box functionality. The main actions and the parameters they accept are summarized below:

Report actions (download_report / get_report_tsv / get_report_csv / get_report_xml):
- ReportType
- Filter for XML File (XML reports)
- Filter for JSON File (JSON reports)

FBA inventory:
- MarketplaceIds
- Include details
- Granularity Type
- Granularity Id
- Start Date
- SellerSku (Single)
- SellerSkus (Multiple)

Catalog items:
- MarketplaceIds
- Identifiers (comma-delimited list)
- IdentifiersType
- IncludedData
- Filter
- Locale
- SellerId
- Keywords (comma-delimited list)
- BrandNames (comma-delimited list)
- Classification Ids (comma-delimited list)
- KeywordsLocale

Vendor orders:
- CreatedAfter
- CreatedBefore
- ChangedAfter
- ChangedBefore
- IncludeDetails
- SortOrder
- PoItemState
- IsPOChanged
- PurchaseOrderState
- OrderingVendorCode

Order lookup (get_order / get_order_items):
- AmazonOrderId

Generic request (generic_request):
- Url
- Body
- IsMultiPart
- Filter
- Headers

Several actions also accept a standalone Filter parameter for filtering the returned rows.
Amazon Selling Partner (SP-API) Connector Examples for Azure Data Factory (Pipeline) Connection
This page offers a collection of SQL examples designed for seamless integration with the ZappySys API ODBC Driver under an ODBC Data Source (32/64-bit) or the ZappySys Data Gateway, enhancing your ability to connect and interact with prebuilt connectors effectively.
Read Orders [Read more...]
Read orders with search criteria such as CreatedAfter, CreatedBefore, MarketPlaceIds, OrderStatuses, PaymentType and many more
SELECT * FROM Orders
--WHERE AmazonOrderId='902-1845936-5435065'
WITH(
CreatedAfter='1900-01-01T00:00:00'
-- , CreatedBefore='1900-01-01T00:00:00'
-- , LastUpdatedAfter='1900-01-01T00:00:00'
-- , LastUpdatedBefore='1900-01-01T00:00:00'
-- , OrderStatuses='Pending~Unshipped~PartiallyShipped~PendingAvailability~Shipped~Canceled~Unfulfillable'
-- , MarketplaceIds='ATVPDKIKX0DER~A2Q3Y263D00KWC~A2EUQ1WTGCTBG2'
-- , FulfillmentChannels='AFN~MFN'
-- , PaymentMethods='COD~CVS~Other'
-- , AmazonOrderIds='1111111,222222,333333'
)
--CONNECTION(
-- ServiceUrl='https://sellingpartnerapi-na.amazon.com'
--)
Read Single Order [Read more...]
Read a single order by order ID
SELECT * FROM Orders
Where AmazonOrderId='902-1845936-5435065'
--CONNECTION(
-- ServiceUrl='https://sellingpartnerapi-na.amazon.com'
--)
Read Order Items (For Single Order) [Read more...]
Read order items for a specified orderid
SELECT * FROM get_order_items
WITH(
AmazonOrderId ='902-1845936-5435065'
)
--CONNECTION(
-- ServiceUrl='https://sellingpartnerapi-na.amazon.com'
--)
Read Order Items (For All Orders - Slow) [Read more...]
Read order items with search criteria on orders such as CreatedAfter, CreatedBefore, MarketPlaceIds, OrderStatuses, PaymentType and many more. This is a slow way of pulling all items for all orders, without reading each order one by one.
SELECT * FROM OrderItems
WITH(
CreatedAfter='1900-01-01T00:00:00'
-- , CreatedBefore='1900-01-01T00:00:00'
-- , LastUpdatedAfter='1900-01-01T00:00:00'
-- , LastUpdatedBefore='1900-01-01T00:00:00'
-- , OrderStatuses='Pending~Unshipped~PartiallyShipped~PendingAvailability~Shipped~Canceled~Unfulfillable'
-- , MarketplaceIds='ATVPDKIKX0DER~A2Q3Y263D00KWC~A2EUQ1WTGCTBG2'
-- , FulfillmentChannels='AFN~MFN'
-- , PaymentMethods='COD~CVS~Other'
-- , AmazonOrderIds='1111111,222222,333333'
)
--CONNECTION(
-- ServiceUrl='https://sellingpartnerapi-na.amazon.com'
--)
Sandbox - Read Orders (Fake data for testing) [Read more...]
Read orders which have fake values (sandbox data)
SELECT *
FROM Orders
--DO NOT use WHERE AmazonOrderId='TEST_CASE_200' (a WHERE clause) against the sandbox endpoint; it will return an empty row. Against the Live API it should work.
WITH(
CreatedAfter='TEST_CASE_200'
--CreatedAfter='TEST_CASE_200_NEXT_TOKEN'
, MarketplaceIds='ATVPDKIKX0DER'
)
CONNECTION(
ServiceUrl='https://sandbox.sellingpartnerapi-na.amazon.com'
)
Sandbox - Read Single Order (Fake data for testing) [Read more...]
Read a single order by order ID with fake values (sandbox data)
SELECT *
FROM get_order
--DO NOT use WHERE AmazonOrderId='TEST_CASE_200' (a WHERE clause) against the sandbox endpoint; it will return an empty row. Against the Live API it should work.
WITH(
AmazonOrderId='TEST_CASE_200'
-- AmazonOrderId='TEST_CASE_IBA_200'
)
CONNECTION(
ServiceUrl='https://sandbox.sellingpartnerapi-na.amazon.com'
)
Sandbox - Read Order Items (Fake data for testing) [Read more...]
Read order items by order ID with fake values (sandbox data)
SELECT *
FROM get_order_items
--DO NOT use WHERE AmazonOrderId='TEST_CASE_200' (a WHERE clause) against the sandbox endpoint; it will return an empty row. Against the Live API it should work.
WITH(
AmazonOrderId='TEST_CASE_200'
-- AmazonOrderId='TEST_CASE_IBA_200'
)
CONNECTION(
ServiceUrl='https://sandbox.sellingpartnerapi-na.amazon.com'
)
Generic Request - Read Any API Endpoint [Read more...]
Read any API endpoint using the generic request endpoint
SELECT *
FROM generic_request
WITH(
URL='/orders/v0/orders/TEST_CASE_200/orderItems'
, Filter='$.payload.OrderItems[*]'
, IncludeParentColumns=1
-- , RequestMethod='GET'
-- , Body=''
-- , IsMultiPart=0
-- , RequestContentTypeCode='Default'
-- , ResponseFormat='Default' --Json, Csv, Xml
-- , Headers='Accept: */* || Cache-Control: no-cache'
-- , PagingMode=''
-- , PagingByUrlAttributeName=''
-- , PagingIncrementBy='1'
-- , NextUrlAttributeOrExpr=''
-- , NextUrlWaitInMs='0'
-- , ColumnDelimiter=','
-- , HasColumnHeaderRow='True'
-- , ElementsToTreatAsArray=''
)
CONNECTION(
ServiceUrl='https://sandbox.sellingpartnerapi-na.amazon.com'
)
Get Report Types [Read more...]
Lists report types which you can use with the download_report, get_report_tsv, get_report_csv, or get_report_xml endpoints
SELECT * FROM ReportTypes
Download Report to Local Disk [Read more...]
This example shows how to run a report and download the data to a local disk file. You can save a report in any file format by calling this endpoint.
SELECT * FROM download_report
WITH(
ReportType='GET_XML_BROWSE_TREE_DATA'
, TargetFilePath='c:\temp\GET_XML_BROWSE_TREE_DATA.gz'
, MarketplaceIds='ATVPDKIKX0DER'
--, FileOverwriteMode='FailIfExists' (Default is 'AlwaysOverwrite')
--, StartDate='2012-12-31'
--, EndDate='today-1d'
)
Generate Report [Read more...]
This example shows how to get data from a specified report
SELECT *
FROM get_report_tsv
WITH(
ReportType='GET_MERCHANT_LISTINGS_ALL_DATA'
, MarketplaceIds='ATVPDKIKX0DER'
)
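The same pattern should work with the other report endpoints listed earlier (get_report_csv / get_report_xml). A hedged sketch using the XML report type from the Download Report example above, assuming your seller account can generate that report:
SELECT *
FROM get_report_xml
WITH(
  ReportType='GET_XML_BROWSE_TREE_DATA'
, MarketplaceIds='ATVPDKIKX0DER'
)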
Conclusion
In this article we discussed how to connect to Amazon Selling Partner (SP-API) in Azure Data Factory (Pipeline) and integrate data without any coding. Click here to download the Amazon Selling Partner (SP-API) Connector for Azure Data Factory (Pipeline) and try it yourself to see how easy it is. If you still have any questions, ask here or simply click on the live chat icon below and ask our expert (see the bottom-right corner of this page).
Download Amazon Selling Partner (SP-API) Connector for Azure Data Factory (Pipeline)
Documentation
More integrations
Other application integration scenarios for Amazon Selling Partner (SP-API)
Other connectors for Azure Data Factory (Pipeline)