SQL Query Examples

Driver SQL Syntax

On this page you will find various query examples for the JSON ODBC Driver / Data Gateway Connector. This driver offers a highly flexible client-side SQL query engine which runs on the client to provide a rich query language much like a traditional RDBMS. Keep in mind that it is not the same as traditional database SQL, so many features may not be supported; also, because the engine is invoked on the client side, there may be some performance impact with very large datasets. The sections below show the high-level SQL grammar / syntax for writing SQL queries for this driver.

SELECT Syntax / INTO

The section below shows the use of the SELECT statement. It can output data or store data into a temporary table (which must be named with the # prefix, e.g. #temp_table).

SELECT 
 [DISTINCT] [* | [ expression [[AS] column_name_alias] [, ...] ]
 [INTO #temp_table_name]
 [FROM table_name]
 [WHERE condition [, ...] ]
 [GROUP BY expression [, ...] ]
 [HAVING condition [, ...] ]	
 [ORDER BY expression [ASC | DESC] [, ...] ]
 [LIMIT row_count] 
 [WITH (option_name=option_value] [, ...]) ] [;]
 

option_name:=IDENTIFIER
option_value:= NUMBER | STRING
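
For example, a minimal query following this grammar might look like the sketch below; the embedded JSON, column names and LIMIT value are purely illustrative:

SELECT id, name AS customer_name
FROM rows
WHERE id > 1
ORDER BY name DESC
LIMIT 5
WITH (DATA='{ rows : [{id:1, name: "AAA"}, {id:2, name: "BBB"}, {id:3, name: "CCC"}]}')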

Select using Temp Table

This driver supports storing data into a temporary table and then using it later in the same script. You can issue multiple SQL statements in the same script, separated by semicolons (;). Only the output of the last non-SELECT INTO statement is returned. When you query a #temp_table you get support for INNER / OUTER JOIN statements.

 SELECT [DISTINCT] [* | [ expression [[AS] column_name_alias] [, ...] ] 
 FROM #temp_table_name
 [[INNER | OUTER | CROSS] JOIN #temp_table_name [alias] ON condition 
   [join_statement] 
   [join_statement] 
   ... 
 ]
 [WHERE condition [, ...] ]
 [GROUP BY expression [, ...] ]
 [HAVING condition [, ...] ]	
 [ORDER BY expression [ASC | DESC] [, ...] ]
 [LIMIT row_count] 
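
As a quick sketch of this syntax, the query below assumes #cust and #ord temp tables were already populated earlier in the same script via SELECT ... INTO; the column names are illustrative only:

SELECT c.customerid, o.orderid, o.total
FROM #cust c
INNER JOIN #ord o ON c.customerid = o.customerid
WHERE o.total > 1000
ORDER BY o.orderid
LIMIT 10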

Multiple SQL Statements

This driver supports executing multiple statements in a single command. Issue the SQL statements in the same script, separated by semicolons (;). Only the output of the last non-SELECT INTO statement is returned. When you query a #temp_table you get support for INNER / OUTER JOIN statements.

select_into_temp; 
[select_into_temp;] 
[select_into_temp;]
...
select_from_temp_table;
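
A minimal sketch of this pattern is shown below; the embedded JSON is illustrative, and only the final SELECT produces output:

SELECT * INTO #t1 FROM rows 
WITH (DATA='{ rows : [{id:1, name: "AAA"}, {id:2, name: "BBB"}, {id:3, name: "CCC"}]}');

SELECT * FROM #t1 
WHERE id > 1;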

Comment Syntax

The ZappySys driver supports single-line comments and block comments anywhere in SQL.

--some comment in single line 
|
/* some block comment */ 

UNION / UNION ALL Syntax

You can use the UNION or UNION ALL keyword to output multiple temp tables with a similar structure in a single resultset. The UNION ALL keyword does not apply a DISTINCT operation to eliminate duplicate rows, so it is faster.

select_from_temp_table_1
   UNION | UNION ALL
select_from_temp_table_2
  [UNION | UNION ALL]
select_from_temp_table_3	
...
...
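
For example, assuming #tbl1 and #tbl2 were loaded earlier in the same script via SELECT ... INTO, the syntax looks like this:

SELECT * FROM #tbl1
UNION ALL
SELECT * FROM #tbl2;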

Stored Procedure Syntax

This driver supports writing Stored Procedures. This is useful for reusable, parameterized SQL. Rather than calling long SQL from your application, you can call a short name with parameters to invoke the same logic. Stored procedures support multiple statements using TEMP tables (i.e. the SELECT ... INTO #mytemptable clause). To invoke a stored procedure, use the EXEC keyword before its name. See the syntax below.
CREATE PROCEDURE

CREATE PROCEDURE proc_name 
  [@parameter_name [= 'Some Default Value']  ] 
  [@parameter_name [= 'Some Default Value']  ] 
  ...
  ...
AS 
  sql-statement; 
 [sql-statement | sql-temp-table-usage] ;
 [sql-statement | sql-temp-table-usage] ;
 ...
 ... 

Executing PROCEDURE
EXEC proc_name [value1] [,value2] ... [,valueN]
Deleting PROCEDURE
DROP PROCEDURE proc_name
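
Below is a minimal sketch of the full cycle (the procedure name is illustrative; the URL reuses the Northwind sample service used elsewhere on this page, and each statement would typically be issued as its own command):

CREATE PROCEDURE usp_list_invoices
AS
  SELECT * FROM value
  WITH (SRC='https://services.odata.org/V3/Northwind/Northwind.svc/Invoices?$format=json');

EXEC usp_list_invoices;

DROP PROCEDURE usp_list_invoices;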

SQL Examples

Basic Read (API, File, Embedded) Query JSON API Service

This example shows how to query a JSON API service. In order to see the table list you must populate the required connection parameters on the parameter grid.
SELECT * FROM value 
WITH (SRC='https://services.odata.org/V3/Northwind/Northwind.svc/Invoices?$format=json')

Basic Read (API, File, Embedded) Query Single File

This example shows how to query a single JSON file (for multiple files use a wildcard, e.g. *.json). It selects the root-level hierarchy (i.e. $ as the table name). Use the table selector from the toolbar to extract data from a different hierarchy (i.e. table). In order to see the table list you must populate File/URL and other required connection parameters on the parameter grid.

SELECT * FROM $ WITH (SRC='C:\Data\cust-1.json')

Basic Read (API, File, Embedded) Query Multiple Files

This example shows how to query multiple JSON files using a wildcard (e.g. *.json). It selects the root-level hierarchy (i.e. $ as the table name). Use the table selector from the toolbar to extract data from a different hierarchy (i.e. table).
SELECT * FROM $ 
WITH (
    SRC='C:\Data\cust*-?.json'
    --,RECURSIVE='True' --Include files from sub folder
)

Basic Read (API, File, Embedded) Query direct JSON string (embedded inside query)

This example shows how to query a JSON string embedded directly inside the query. This is useful for testing purposes; in the real world the data usually comes from a file or URL (API call). The example selects the root-level hierarchy (i.e. $ as the table name). Use the table selector from the toolbar to extract data from a different hierarchy (i.e. table). In order to see the table list you must populate File/URL and other required connection parameters on the parameter grid.
SELECT * FROM rows 
WITH 
(DATA='
{          
  rows : [
        {id:1, name: "AAA"}, 
        {id:2, name: "BBB"}, 
        {id:3, name: "CCC"}
  ]
}'
)

Basic Read (API, File, Embedded) Query Command line Output (Console Capture)

This example shows how to return data from command line output / a batch file. To treat Data as a command line, simply prefix the source command line with cmd:>. Syntax: cmd:>[exe / bat folder]<exe-name> [arguments]
The first part is the exe name or the full path to the exe or bat file. The second part is the arguments for the command line program. You can use double quotes around the exe / batch file path if it contains spaces.
For a *.bat or *.cmd file make sure to add [ @echo off ] on the first line (without brackets), otherwise the command itself is added to the output. See the product help file for more information about streaming data from command line output.

====================
Examples:
====================
cmd:>cmd /c dir *.dll /b
cmd:>aws iam list-users --output json
cmd:>az vm list --output json
cmd:>py c:\scripts\run-python.py
cmd:>powershell -executionpolicy bypass -File "c:\scrips\run.ps1"
cmd:>powershell -executionpolicy bypass -Command "[System.IO.StreamReader]::new((Invoke-WebRequest -URI https://zappysys.com/downloads/files/test/invoices.json).RawContentStream).ReadToEnd()"
cmd:>c:\folder\my-batch-file.bat
cmd:>c:\folder\my-batch-file.bat option1 option2
cmd:>curl -k https://httpbin.org/get
cmd:>curl.exe -k https://httpbin.org/get
cmd:>c:\folder\curl.exe -k https://httpbin.org/get
cmd:>"c:\folder with space\curl.exe" -k https://httpbin.org/get
SELECT * FROM $ 
WITH (
  Data='cmd:>powershell -executionpolicy bypass -Command "[System.IO.StreamReader]::new((Invoke-WebRequest -URI https://zappysys.com/downloads/files/test/invoices.json).RawContentStream).ReadToEnd()"'
, Filter='$.value[*]')

Basic Read (API, File, Embedded) Escape Sequence (String Literals with new lines, tabs)

This example shows how to use escape sequences in string literals (quoted values (single quotes) in WITH, SELECT or other clauses if the driver supports them). You can use the @ symbol before a string literal to allow special characters (e.g. \r \n \t \v \u0041 \\). \r=carriage return, \n=new line, \t=tab, \uXXXX=any ASCII/Unicode character by hex number. To produce a Unicode character using an escape sequence, use a hex number starting with \u, e.g. \u0041 will produce A (uppercase).
SELECT * FROM rows 
WITH 
(DATA=@'{rows : [{id:1, name: "AA\u0041"},\r\n {id:2, name: "BBB"},\r\n {id:3, name: "CCC"}]}'
)

API Authentication (HTTP,OAuth) Basic Authentication (User Id/Password)

This example shows how to query a JSON API service which requires Basic authentication. In order to see the table list you must populate the required connection parameters on the parameter grid.

--Credentials via Query
SELECT * FROM $ WITH (SRC='http://httpbin.org/basic-auth/user1/pass123',DataConnectionType='HTTP', CredentialType='Basic',UserName='user1',Password='pass123')

--Credentials inside connection string [Use Data source setup grid >> Select Connection = HTTP (Click Configure) >> Set Credential type=Basic, UserId=user1, Password=pass123]
--SELECT * FROM $ WITH (SRC='http://httpbin.org/basic-auth/user1/pass123')

API Authentication (HTTP,OAuth) NTLM / Windows Authentication

This example shows how to query a JSON API service which requires Windows authentication (NTLM).

--Credentials via Query
SELECT * FROM $ 
WITH (SRC='http://yourhost/some-service',DataConnectionType='HTTP',CredentialType='Ntlm')

--Credentials inside connection string [Use Data source setup grid >> Authentication type=Ntlm]
--SELECT * FROM $ WITH (SRC='http://yourhost/some-service')

API Authentication (HTTP,OAuth) OAuth 2.0 Authentication (3-Legged) - Custom API Example

This example shows how to query a JSON API service which requires OAuth2 authentication.

--Credentials via Query
SELECT * FROM $ WITH (
    SRC='http://yourhost/api/some-service'
    ,DataConnectionType='OAuth'
    ,UseCustomApp='True'
    ,OAuthVersion='OAuth2'
    ,AuthUrl='https://yourhost/api/authorize'
    ,TokenUrl='https://yourhost/api/token'
    ,ClientId='xxxxxxxxxxx'
    ,ClientSecret='xxxxxxxxxxxxx'
    ,AccessToken='xxxxxxxxxxx'
    ,RefreshToken='xxxxxxxxxxx'  --//This is only obtained from UI by clicking Generate Token Button
)

--Credentials inside connection string [Use Data source setup grid >> Authentication type=OAuth >> Click OAuth Button to configure connection]
SELECT * FROM $ WITH (SRC='http://yourhost/api/some-service')

API Authentication (HTTP,OAuth) Dynamic Token Authentication (2 Steps) - Query REST API

This example shows how to call a REST API service using the Dynamic Token approach. It demonstrates how to authenticate with a user id / password to obtain a temporary token which is then passed to the API call, and how to pass a Body and Headers using HTTP POST. For more information about Dynamic Token visit https://zappysys.com/links/?id=10091
SELECT * FROM $
WITH(
	 Src='https://zappysys.com/downloads/files/test/rest-getdata.aspx?format=json'
	--,RequestMethod='POST'
	--,RequestData='{sessionid: "[$token$]"}'
	,Filter='$.rows[*]'
	,Header='Content-Type: application/json || Accept: */* || Cache-Control: no-cache'	
	,DataConnectionType='HTTP'
	,CredentialType='TokenDynamic'
	,UserName='MyUser001'
    ,Password='P@$$w0rdAAc12'
	,AuthScheme='{none}'
	,TokenUrl='https://zappysys.com/downloads/files/test/rest-login.aspx'
	,TokenRequestData='{ user:"[$userid$]", password:"[$password$]" }'
	,TokenRequestMethod='POST'
	,TokenResponseContentFilter='$.sessionid'
	,TokenRequestHeaders='Content-Type:application/json | Accept:*/* | Cache-Control:no-cache'
	,TokenResponseContentType='Json'

)

API Authentication (HTTP,OAuth) OAuth 2.0 Authentication (3-Legged) - Google API Example

This example shows how to query a Google API using OAuth2 authentication. The demo uses the Google Analytics API, but you can apply the same concept to any other OAuth2-compatible Google API.

--Credentials via Query
--Check this link to learn more about how to generate ClientId / ClientSecret >> https://zappysys.com/blog/register-google-oauth-application-get-clientid-clientsecret/
--To get Access Token and Refresh Token >> Goto properties grid >> Change Authentication type=OAuth >> Click OAuth Button to configure connection >> Select Provider='Google' >> Use Custom App >> Select Scope and click Generate Token >> Save tokens to file

SELECT * FROM $ WITH
(
 SRC='https://www.googleapis.com/analytics/v3/metadata/ga/columns'
,UseCustomApp='True'
,DataConnectionType='OAuth'
,ServiceProvider='Google'
,ClientId='3855xxxxxx-xxxxxxxxxxxxxxxxx.apps.googleusercontent.com'
,ClientSecret='qF7axxxxxxxxxxxWfs'
,AccessToken='ya29.GlsxxxxxxxxxIam'
,RefreshToken='1/pFOxxxxxxxxxxgU' --//This is only obtained from UI by clicking Generate Token Button
)


--Credentials inside connection string [Use Data source setup grid >> Authentication type=OAuth >> Click OAuth Button to configure connection]
SELECT * FROM $ WITH (SRC='https://www.googleapis.com/analytics/v3/metadata/ga/columns')

API GET (Read Data) Query OData API Service

This example shows how to query an OData API. By specifying DataFormat='OData' the driver automatically paginates through multiple requests for you, so you don't have to worry about additional pagination configuration.
SELECT * FROM value WITH (SRC='https://services.odata.org/V3/Northwind/Northwind.svc/Order_Details?$format=json',DataFormat='OData')

Compression (Zip,GZip) Query Large Compressed File (From URL or Local disk)

This example shows how to read a very large file from local disk or a URL. You can read an uncompressed or compressed file (in this example a GZip compressed file). Using the --FAST option in your Filter enables Streaming Mode. To use STREAM mode you must turn off the IncludeParentColumns option, as in the example.
SELECT * FROM $ 
--LIMIT 10
WITH(
 Filter='$.LargeArray[*]--FAST' --//Adding --FAST option turn on STREAM mode (large files)
,SRC='https://zappysys.com/downloads/files/test/large_file_10k_largearray_prop.json.gz'
--,SRC='C:\Data\large_file*.json.gz'
,IncludeParentColumns='False'  --//This Must be OFF for STREAM mode (read very large files)
,FileCompressionType='GZip' --Zip or None (Zip format only available for Local files)
)

API Pagination Paginate based on Response Attribute

This example shows how to paginate API calls until the last page is detected. In this example the next page is indicated by an attribute called nextlink (found in the response). If this attribute is missing or null, the driver stops fetching the next page.
SELECT * FROM $
WITH(
SRC='https://zappysys.com/downloads/files/test/pagination_nextlink_inarray_1.json'
,NextUrlAttributeOrExpr = '$.nextlink'  --keep reading until this attribute is missing. If attribute name contains dot then use brackets like this $.['my.attr.name']
)

API Pagination Paginate based on URL Parameter (Loop until certain StatusCode)

This example shows how to paginate API calls where you need to pass a page number via the URL. The driver keeps incrementing the page number and calling the next URL until the last page is detected (401 error). There are a few ways to indicate the last page (e.g. by status code, by row count, by response size). If you don't specify end detection, the default is used (i.e. no records found).
SELECT * FROM $
WITH (
SRC='https://zappysys.com/downloads/files/test/page-json.aspx?page=1&mode=DetectBasedOnResponseStatusCode'
,PagingMode='ByUrlParameter'
,PagingByUrlAttributeName='page'
,PagingByUrlEndStrategy='DetectBasedOnResponseStatusCode'
,PagingByUrlCheckResponseStatusCode=401
,IncrementBy=1
)

API Pagination Paginate based on URL Path (Loop until no record)

This example shows how to paginate API calls where you need to pass a page number via the URL path. The driver keeps incrementing the page number and calling the next URL until the last page is detected. There are a few ways to indicate the last page (e.g. by status code, by row count, by response size). If you don't specify end detection, the default is used (i.e. no records found).
SELECT * FROM $
WITH (
SRC='https://zappysys.com/downloads/files/test/cust-<%page%>.json'
,PagingMode='ByUrlPath'
,PagingByUrlAttributeName='<%page%>'
,PagingByUrlEndStrategy='DetectBasedOnRecordCount'
,IncrementBy=1
)

URL JOIN Pass parent URL data to Child URL

This example shows how to perform a URL JOIN. Using this pattern you can pass parent URL data to a child API URL in one query. You can pass parent data in the child URL or Body (for POST). The placeholder syntax is [$parent_alias.parent_col_name$]. Each level must have an ALIAS attribute. For the root level you do not have to prefix attributes, but for any other level you must prefix each attribute name with JOINxxxxx. For more information on the API JOIN pattern check https://zappysys.com/links/?id=10139. You can use functions when you pass parent data to the child URL, e.g. [$a.firstname,FUN_TRIM$]. Visit this help page to see all placeholder functions you can use: https://zappysys.com/onlinehelp/odbc-powerpack/scr/odbc-format-static-placeholders.htm. Here is another example of using the BASE64DEC function in a child URL: https://zappysys.zendesk.com/hc/en-us/articles/360049791574

SELECT 
    c.custid,c.name,c.city
    ,o.orderid,o.orderdate,o.total
    /* ,i.itemid,i.qty,i.shipped */
FROM $
WITH(
    --1st level (root)--
	 src='https://zappysys.com/downloads/files/test/join/customers.json'
    ,NextUrlAttributeOrExpr='$.nextpage'
	,filter='$.customers[*]'
    ,alias='c'
    ,WaitTimeMs='500'
    
    --2nd level--
    --You can pass parent column to child URL using placeholder [$parent_alias.parent_col_name$] syntax (e.g. [$c.custid$] )
    ,join1_src='https://zappysys.com/downloads/files/test/join/c[$c.custid$]/orders.json'
    ,join1_filter='$.orders[*]'
    ,join1_alias='o'
    ,join1_NextUrlAttributeOrExpr='$.nextpage'    
    ,join1_WaitTimeMs='500'

    /*
    --3rd level-- 
    --You can pass parent column to child URL using placeholder [$parent_alias.parent_col_name$] syntax (e.g. [$c.custid$] and [$o.orderid$])
    ,join2_src='https://zappysys.com/downloads/files/test/join/c[$c.custid$]/o[$o.orderid$]/items.json'
    ,join2_filter='$.items[*]'
    ,join2_alias='i'
    ,join2_NextUrlAttributeOrExpr='$.nextpage'    
    ,join2_WaitTimeMs='500'

    --You can go a maximum of 7 levels deep. Any attribute allowed in the WITH clause can be prefixed by joinN_xxx
    -- join3_xxxxx, join4_xxxxxx .......... join7_xxxxx
    */
) 

URL JOIN Custom Metadata for Child URL and final output

This example shows how to perform a URL JOIN and supply custom META for the joined URL (child) and the final output (SELECT list). Using this pattern you can control datatype parsing at child levels. Notice that the first META includes CUST_ID and CUST_CITY; these are aliased columns from the SELECT list, and you must use the exact names from the final output. Include any columns (aliased or not) that you intend to use in the final SELECT in the META clause to control the exact metadata. If you skip the output META and only use JOIN1_META (for the child), then datatypes for the final result are detected automatically and the length is sometimes set to a 16MB maximum to support long strings. For more information on the API JOIN pattern check https://zappysys.com/links/?id=10139
SELECT 
 _pid,q._rid
, r.custid as CUST_ID, r.city as CUST_CITY
, q.VAL1, q.VAL2, q.VAL3, q.DESCR 
FROM $
ORDER BY _pid,q._rid
WITH (

 SRC='https://zappysys.com/downloads/files/test/join/customers.json'
,filter= '$.customers[*]'    
,ALIAS='r'

,JOIN1_DATA='{"ORDERS":{"ROW":[
{"VAL1":"HINTROW","VAL2":"000100","VAL3":"000100","DESCR":"Some notes1"},
{"VAL1":"5023","VAL2":"000200","VAL3":"000200","DESCR":"Some notes2"},
{"VAL1":"0000","VAL2":"000300","VAL3":"000300","DESCR":"Some notes3"}]}}'
,JOIN1_FILTER='$.ORDERS.ROW[*]'
,JOIN1_IncludeParentColumns='False'
,JOIN1_ALIAS='q'

--metadata for final output (include column, type, length for output of SELECT). 
--This meta does not perform type casting
,Meta='[
  {Name: "_rid",Type: "Int32"},
  {Name: "_pid",Type: "Int32"},
 
  {Name: "CUST_ID",Type: "String",Length: 100},
  {Name: "CUST_CITY",Type: "String",Length: 100},
  
  {Name: "VAL1",Type: "String",Length: 100},
  {Name: "VAL2",Type: "Int32"},
  {Name: "VAL3",Type: "String"},
  {Name: "DESCR",Type: "String",Length: 254}
  ]'
  
 --child level1 metadata (length not needed). Datatype used below may perform some type casting ---
,JOIN1_meta='[
  {Name: "VAL1",Type: "String"},
  {Name: "VAL2",Type: "Int32"},
  {Name: "VAL3",Type: "String"},
  {Name: "DESCR",Type: "String"}
  ]'
)

URL JOIN Download files to local disk

This example shows how to download files to local disk. It first gets records by calling the first URL and then passes them to the second URL. For more information on the API JOIN pattern check https://zappysys.com/links/?id=10139
select a.url,b.data,file_write_binary(base64_to_bytes(b.data) ,'c:\temp\' || name  ) as bytes_written 
from $
WITH ( 
  SRC='https://zappysys.com/downloads/files/test/imagelist.json',
  FILTER='$.list[*]',
  ALIAS='a',
  
  JOIN1_SRC='[$a.url$]',
  JOIN1_ALIAS='b',
  JOIN1_SaveContentAsBinary='true'  
) 

URL JOIN Download binary files as Base64 and Upload to different URL

This example shows how to download binary files from one URL and upload the data to another URL as Base64 using the URL JOIN technique. When the SaveContentAsBinary property is supplied, the binary response is converted to Base64 for easy string operations. For more information on the API JOIN pattern check https://zappysys.com/links/?id=10139
select a.url,c.* 
from $
WITH ( 
  --Step1: Get List of URLs to download
  SRC='https://zappysys.com/downloads/files/test/imagelist.json',
  FILTER='$.list[*]',
  ALIAS='a',
  
  --Step2: Download files as base64 data for each URL
  JOIN1_SRC='[$a.url$]',  --//SRC can be local or URL c:\image.png or http://some/url/file.pdf
  JOIN1_SaveContentAsBinary='true',  --//This property returns data as a base64 string rather than a byte array, so it is easy to pass to the next step
  JOIN1_ALIAS='b',
  
  --Step3: Upload File - POST File Base64 data to target URL
  JOIN2_SRC='https://zappysys.com/downloads/files/test/callapi.aspx',
  JOIN2_METHOD='POST',
  JOIN2_BODY='{ name: "[$a.name$]", base64:"[$b.data$]"}',
  JOIN2_ALIAS='c'  
) 

URL JOIN Read unstructured data and pass to API call

This example shows how to query unstructured data and parse it using Regex, XPath or JSONPath.

Examples of XPaths: https://zappysys.com/blog/ssis-extract-single-xml-node-using-xpath-soap-response/
Examples of JsonPath: https://zappysys.com/blog/jsonpath-examples-expression-cheetsheet/
Examples of Regular Expression: https://zappysys.com/blog/using-regular-expressions-in-ssis/

Also try the following query examples to extract raw data.

SELECT * FROM $
WITH(
Src='http://google.com'
,EnableRawOutputModeSingleRow='True'
)

SELECT * FROM $
WITH(
Src='http://google.com'
,EnableRawOutputModeSingleRow='True'
,RawOutputFilterExpr='<title>([^<]*)<\/title>{{0,1}}||content="([^"]*)"{{0,1}}' --can be Regex, JsonPath or XPath. Use double pipe to split multiple columns
,RawOutputDataRowTemplate=@'{title:"[$1]", content:"[$2]" }' --must be Json template
,RawOutputExtractMode='Regex' --can be Regex, Xml, Json or None. -- Comment this and RawOutputFilterExpr, RawOutputDataRowTemplate to extract Raw data without column parsing (i.e. returns raw text of URL/Path)
)

SELECT * FROM $
WITH(
Src='http://httpbin.org/get'
,EnableRawOutputModeSingleRow='True'
,RawOutputFilterExpr='$.origin||$.url' --can be Regex, JsonPath or XPath. Use double pipe to split multiple columns
--,RawOutputDataRowTemplate=@'{title:"[$1]", content:"[$2]" }' --must be Json template
,RawOutputExtractMode='Json' --can be Regex, Xml, Json or None. -- Comment this and RawOutputFilterExpr, RawOutputDataRowTemplate to extract Raw data without column parsing (i.e. returns raw text of URL/Path)
)

SELECT c.title,o.url,o.data 
FROM $
WITH(
	 Src='http://google.com'
	,EnableRawOutputModeSingleRow='True'
	,RawOutputFilterExpr='<title>([^<]*)<\/title>{{0,1}}||content="([^"]*)"{{0,1}}' --can be Regex, JsonPath or XPath. Use double pipe to split multiple columns
	,RawOutputDataRowTemplate=@'{title:"[$1]", content:"[$2]" }' --must be Json template. Comment this line to generate default columns e.g. col1, col2...If you comment this then use c.col1 and c.col2 instread of c.title and c.content in Select and join1_xxxxx attributes
	,RawOutputExtractMode='Regex' --can be Regex, Xml, Json or None. -- Comment this and RawOutputFilterExpr, RawOutputDataRowTemplate to extract Raw data without column parsing (i.e. returns raw text of URL/Path)
    ,Alias='c'
    ,join1_RequestMethod='POST'
    ,join1_RequestData='my data [$c.content$]'
    ,join1_src='https://httpbin.org/post?id=[$c.title$]'
    ,join1_RequestContentTypeCode = 'TextPlain'
    ,join1_alias='o'
    ,join1_EnableRawOutputModeSingleRow='False'
)

URL JOIN Query API with Status Check Loop (Wait until value found)

This example shows how to use the status check feature to keep waiting until a desired value is found in the response. This pattern is very useful for job-style APIs where you have to wait until the API returns a Success or Failure job status. You can adjust the iteration wait time, the maximum wait time before timeout, and the regex pattern used to search for Success or Failure. If the Success value is found, data is returned. If the Failure value is found or a timeout occurs, an error is thrown.

Examples of XPaths: https://zappysys.com/blog/ssis-extract-single-xml-node-using-xpath-soap-response/
Examples of JsonPath: https://zappysys.com/blog/jsonpath-examples-expression-cheetsheet/
Examples of Regular Expression: https://zappysys.com/blog/using-regular-expressions-in-ssis/

SELECT * FROM $ WITH 
(
SRC='https://zappysys.com/downloads/files/test/cust-1.json', 
EnableStatusCheck='True', 
StatucCheckMaxWaitSeconds=7,
StatucCheckIterationWaitSeconds=3,
--StatusCheckForFailedValue = 'True',
--StatusFailedValue = 'Failed|Cancelled',
--StatusFieldFilterExpr='$.response.status', --Use XPath, JsonPath or Regex to narrow down to single property / content under which you like to search 
--StatusFieldFilterType='Json', --or use Xml, Regex, None
StatusSuccessValue='C1' --Wait until C1 or C2 or C3 found in response (Use Regular expression syntax) 
)

/*
--- See real world example of status check pattern here (Use of XML Driver)... https://zappysys.com/blog/import-bing-ads-data-sql-server-performance-reports/ 

SELECT c.* FROM $
WITH(
	--///////////////////////////////////
	--Step-1: submit report request for XML format
	--///////////////////////////////////
	 alias='a' 
	,Src='https://reporting.api.bingads.microsoft.com/Api/Advertiser/Reporting/v13/ReportingService.svc'
	,Filter='$.s:Envelope.s:Body.SubmitGenerateReportResponse'
	,RequestData='@C:\Requests\BingAds\sql_body1.xml'  --Get Body from File for clean Look (Read blog post above to see actual body)
	,IsMultiPart='True'
	,RequestContentTypeCode='TextXml'
	,Header='SOAPAction: "SubmitGenerateReport"'
	,RequestMethod = 'POST'
	,EnableBodyPlaceholderForMultiPart = 'True'
	,Meta = '[{ "Name": "ReportRequestId","Type": "Int64"}]'

    --///////////////////////////////////
    --///Step-2: keep checking status until report is ready - returns final url when done
    --///////////////////////////////////
	,join1_alias = 'b'
	,join1_Filter = '$.s:Envelope.s:Body.PollGenerateReportResponse.ReportRequestStatus'
	,join1_RequestData = '@C:\Requests\BingAds\sql_body2.xml'
	,join1_IsMultiPart = 'True'
	,join1_Src = 'https://reporting.api.bingads.microsoft.com/Api/Advertiser/Reporting/v13/ReportingService.svc'
	,join1_RequestContentTypeCode = 'TextXml'
	,join1_Header = 'SOAPAction: "PollGenerateReport"'
	,join1_RequestMethod = 'POST'
	,join1_EnableBodyPlaceholderForMultiPart = 'True'
	,join1_EnableStatusCheck = 'True', 
	,join1_StatucCheckMaxWaitSeconds = 300, --wait max 5 mins ?
	,join1_StatucCheckIterationWaitSeconds = 5, --check every 5 sec ?
	,join1_StatusSuccessValue = 'Success', --look for success word in response
      --, join1_Meta = '[{ "Name": "ReportDownloadUrl","Type": "String",Length: "500"},{ "Name": "Status","Type": "String",Length: "20"}]'

   --///////////////////////////////////
   --Step-3: Download report(Zip file) and Parse rows
   --///////////////////////////////////
	,join2_alias = 'c'
	,join2_DataConnectionType = 'Default'
	,join2_Src = '[$b.ReportDownloadUrl$]'
	,join2_Filter = '$.Report.Table.Row[*]'
	,join2_ElementsToTreatAsArray = 'Row'
	,join2_RequestMethod = 'GET'
	,join2_IsMultiPart = 'False'
	,join2_FileCompressionType = 'Zip'
)

*/

Performance Handling Large Dataset / Avoid OutOfMemory Error (Using Disk for Query Engine)

An advanced query can invoke the client-side query engine, which means the entire table is fetched into memory for processing, potentially causing an OutOfMemory exception (OOM). To avoid this error you can specify disk-based temporary storage for the query engine. The default is in-memory storage.
SELECT * FROM $
WITH(

   Src='https://zappysys.com/downloads/files/test/cust-1.json?really-large-file=1'
  ,TempStorageMode='Disk' --or 'Memory'
--OR
--,TempStorageMode=1 --//Memory=0, Disk=1
)

Metadata Query with static Metadata (META option / Override Mode)

This example shows how to supply custom metadata rather than relying on dynamic metadata detection (based on scanning 300 rows). After you run a query you can click the [View Metadata] button to generate default metadata. If you don't want to supply all columns in the META clause, use @OverrideMode:1 along with the list of columns you want to alter. Possible types for META are: <none>, String, Int64, Long, Int, Int32, Short, Byte, Decimal, Double, Float, DateTime, Date, Boolean.
SELECT * FROM rows 
WITH 
(DATA='
{          
  rows : [
        {id:1, checknumber:"0001", name: "AAA"}, 
        {id:2, checknumber:"0002", name: "BBB"}, 
        {id:3, checknumber:"0003", name: "CCC"}
  ]
}'
--//Method-1: >>>DIRECT META (compact mode)<<<
,META='id:int;checknumber:string(10);name:string(10)'
-- Add: @OverrideMode:1 in the column list to override meta for only specific column(s) - In below example we are changing default datatype/length for id and checknumber, all other columns are dynamic so dont have to supply in META clause
--,META='@OverrideMode:1;id:int;checknumber:string(10)'  

--//Method-2: >>>DIRECT META (json mode)<<<
--,META='[{Name:"id",Type:"Int32"},{Name:"checknumber",Type:"String",Length:10},{Name:"name",Type:"String",Length:10}]'
-- Add: @OverrideMode entry in the column list to override meta for only specific column(s). No need to set Type for that entry - In below example we are changing default datatype/length for id and checknumber, all other columns are dynamic so dont have to supply in META clause
--,META='[{Name:"@OverrideMode"}, {Name:"id",Type:"Int32"},{Name:"checknumber",Type:"String",Length:10}]'

--//Method-3: >>>META from File <<<
--,META='c:\data\meta.json'

--//Method-4: >>>META from Internal Storage - Just refer entry by name <<<
--,META='MY_META_STORAGE_NAME'

)

API POST (Send Data) Simple POST (embedded string)

This example shows how to POST data to an API service where the data (i.e. request body) comes from an embedded string.
SELECT * FROM $
limit 3 WITH 
(METHOD='POST', HEADER='Content-Type: text/plain || x-hdr1: AAA'
,SRC='http://httpbin.org/post'
,BODY='
{
    id:1,
    notes:"Line1\r\nLine2"
}'
)

API POST (Send Data) Simple POST with Static Output Metadata

This example shows how to speed up POST API calls by supplying static metadata about the output columns. By default each API request is sent twice (first to get metadata and second to get the actual data). This approach is fine in most cases, but some POST API calls might fail if you submit the same request twice (e.g. Create New Account with a unique constraint on Name). Check this article to learn more about the metadata feature: https://zappysys.com/links/?id=10111
SELECT * FROM $
WITH (METHOD='POST', HEADER='Content-Type: text/plain || x-hdr1: AAA'
,SRC='http://httpbin.org/post'
,BODY='{id:1,notes:"Line1\r\nLine2"}'
--define output metadata using static value like below or from file
,META='[{Name: "id",Type: "Int32"},{Name: "origin",Type: "String",Length: 64}]'
--,META='id=Int32;name=String(10)'  --compact meta format
--,META='c:\query_meta.json'
)

API POST (Send Data) File Upload data from local file

This example shows how to POST data to an API service where the data (i.e. request body) comes from a local file.
SELECT * FROM $
limit 3 WITH 
(METHOD='POST', HEADER='Content-Type: text/plain || x-hdr1: AAA'
,SRC='http://httpbin.org/post'
,BODY='@c:\files\dump.json'
,IsMultiPart='True'
)

Stored Proc Call stored proc (Ad-Hoc HTTP request).

This example shows how to fetch data from any HTTP URL via a stored procedure call. Using this method you can fetch any data, e.g. from HTML pages, not necessarily JSON.
exec webRequest 'http://httpbin.org/post', 'POST', '*BODY*', 'x-hdr1: AA||x-hdr2: BB'

--You can also arrange parameters in any order using named parameters and optional parameters
--exec webRequest @verb='POST', @uri='https://httpbin.org/post', @body='*BODY*', @headers='Insecure-Requests: 0||User-Data: httpbin.org'
--exec webRequest @verb='GET', @uri='https://httpbin.org/get'


File Operations Download File from URL (Raw Text) / Write to file / Read from file

This example shows how to read RAW data from a URL and then save the content to a disk file. When the EnableRawOutputMode flag is set, the parser is not invoked to parse the data into rows/columns; instead the data is returned as RAW text in the RAW_DOCUMENT column. The other two commented lines show how to read file text from local disk and how to write binary data (downloaded from a URL with no authentication).
SELECT 
     FILE_WRITE_TEXT( RAW_DOCUMENT,'c:\temp\dump.json' ) AS ColBytesWritten1 --//write text to file
    --,FILE_WRITE_TEXT( RAW_DOCUMENT,'c:\temp\dump.json', 1 ) AS ColBytesWritten1 --//write text to text file - fail if file exists
    --,BASE64_TO_FILE('c:\temp\dump.png',  FILE_TO_BASE64('http://httpbin.org/image/png') ) AS ColBytesWritten2 --//This is how you can for binary file input must be base64 of data
    --,FILE_READ_TEXT('c:\temp\dump.txt' ) DiskFileContent
FROM $
WITH(
	 Src='https://httpbin.org/json'
	,EnableRawOutputMode='True' --//set this flag to read data as RAW text
)

Functions String Manipulation

This example shows how to use string functions. When you use a function, your entire dataset is first cached locally and then the transformation is applied, so you may notice some speed degradation.
SELECT 
  substr('ZappySys', 1, 5)
, printf('%s %s %.2f', 'v1', CompanyName, CompanyId)
, ('Str1' || CompanyName || 'Str2') as ConcatExample
, upper('ZappySys')
, lower('ZappySys')
, replace('ZappySys', 'Sys', 'XYZ')
, instr('ZappySys', 'Sys')
, trim('   Zappy Sys  ') trim1
, trim('@@@ZappySys@@@', '@') trim2
, rtrim('    ZappySys   ')
, ltrim('    ZappySys   ')
, length( CompanyName )
FROM $ WITH (
FILTER='$.Info.Rows[*]',
DATA='{ Info: { Rows: [ { CompanyId: 1000 , CompanyName: "ZappySys" }, { CompanyId: 1001 , CompanyName: "Microsoft" } ] } }'
)

Functions Control flow

This example shows how to use control flow functions. When you use a function, your entire dataset is first cached locally and then the transformation is applied, so you may notice some speed degradation.
SELECT 
	 COALESCE(10,20) coalesce1 --returns first non-null arg
    ,COALESCE(null,20) coalesce2 --returns first non-null arg
    ,COALESCE(null,20,30,40) coalesce3 --returns first non-null arg
    ,IFNULL(null,20) --returns first non-null arg. Same function as COALESCE but simple (just two arguments allowed)
    ,NULLIF(20,20) nullif1 --returns null if both argument same
    ,NULLIF(10,20) nullif2 --returns null if both argument same
FROM $ WITH (
FILTER='$.Info.Rows[*]',
DATA='{ Info: { Rows : [ { CompanyId: 1000 , CompanyName: "ZappySys" }, { CompanyId: 1001 , CompanyName: "Microsoft" }] } }')

Functions Date/Time

This example shows how to use date/time functions. When you use a function, your entire dataset is first cached locally and then the transformation is applied, so you may notice some speed degradation.

The JSON driver supports the following 5 date/time functions. The date and time functions use a subset of ISO-8601 date and time formats. The date() function returns the date in this format: YYYY-MM-DD. The time() function returns the time as HH:MM:SS. The datetime() function returns YYYY-MM-DD HH:MM:SS.

date(timestring, modifier, modifier, ...)
time(timestring, modifier, modifier, ...)
datetime(timestring, modifier, modifier, ...)
julianday(timestring, modifier, modifier, ...)
strftime(format, timestring, modifier, modifier, ...)

=======================================
time string (input data)
=======================================
A time string (input) can be in any of the following formats. You can hard code or supply from column.

YYYY-MM-DD
YYYY-MM-DD HH:MM
YYYY-MM-DD HH:MM:SS
YYYY-MM-DD HH:MM:SS.SSS
YYYY-MM-DDTHH:MM
YYYY-MM-DDTHH:MM:SS
YYYY-MM-DDTHH:MM:SS.SSS
HH:MM
HH:MM:SS
HH:MM:SS.SSS
now
DDDDDDDDDD

=======================================
format specifier for strftime function
=======================================
Use the format specifiers below to output a date in the desired format. For example, to output a date as YYYY-MM-DD use a SELECT query like the following:

select STRFTIME('%Y-%m-%d','now') formatted_date_time from mytable
select STRFTIME('%Y-%m-%d',[my column]) formatted_date_time from mytable

%d day of month: 00
%f fractional seconds: SS.SSS
%H hour: 00-24
%j day of year: 001-366
%J Julian day number
%m month: 01-12
%M minute: 00-59
%s seconds since 1970-01-01
%S seconds: 00-59
%w day of week 0-6 with Sunday==0
%W week of year: 00-53
%Y year: 0000-9999
%% %

=================
Modifiers
=================
The time string (input) can be followed by zero or more modifiers that alter date and/or time. Each modifier is a transformation that is applied to the time value to its left. Modifiers are applied from left to right; order is important. The available modifiers are as follows.

For example, to add 5 days to the current date, call it as below:

select DATE('now', '+5 day') from mytable

NNN days
NNN hours
NNN minutes
NNN.NNNN seconds
NNN months
NNN years
start of month
start of year
start of day
weekday N
unixepoch
localtime
utc

The first six modifiers (1 through 6) simply add the specified amount of time to the date and time specified by the preceding timestring and modifiers. The 's' character at the end of the modifier names is optional. Note that '±NNN months' works by rendering the original date into the YYYY-MM-DD format, adding the ±NNN to the MM month value, then normalizing the result. Thus, for example, the date 2001-03-31 modified by '+1 month' initially yields 2001-04-31, but April only has 30 days so the date is normalized to 2001-05-01. A similar effect occurs when the original date is February 29 of a leap year and the modifier is ±N years where N is not a multiple of four.

The 'start of' modifiers (7 through 9) shift the date backwards to the beginning of the current month, year or day.

The 'weekday' modifier advances the date forward to the next date where the weekday number is N. Sunday is 0, Monday is 1, and so forth.
SELECT 
 DATE('now') date_now
,STRFTIME('%Y-%m-%dT%H:%M:%f','now') formatted_Date_time
,DATETIME('now') datetime_now
,DATE('now','localtime') datetime_now_local
,TIME('now') time_now
,JULIANDAY('now')
,DATE('now', 'start of month', '+1 month' , '-1 day' ) dt_modify1
,DATE('now', '+5 day') dt_modify2

FROM $ WITH (DATA='{ Info: { Rows : [ { CompanyId: 1000 , CompanyName: "ZappySys" }, { CompanyId: 1001 , CompanyName: "Microsoft" } ] } }')

Functions Math

This example shows how to use math functions. When you use a function, your entire dataset is first cached locally and then the transformation is applied, so you may notice some speed degradation.
SELECT 
	 abs(-500)
    ,random()
    ,round(1000.236, 2)
FROM $  WITH (DATA='{ Info: { Rows: [ { CompanyId: 1000 , CompanyName: "ZappySys" }, { CompanyId: 1001 , CompanyName: "Microsoft" } ] } }')

Functions JSON

This example shows how to use various JSON functions.
select 
json_value(j1,"$.V1"),
json_array_first(j2),
json_array_last(j2),
json_array_nth(j2,1)
from array WITH (DATA=
'{"array" : [{"c1": "abc", "c2": "ABC", "j1": "{''V1'':''value1''}", "j2": "[13,22,''31'']"}] }')

Functions JSON (Flattening un-collapsed JSON)

This example shows how to flatten JSON arrays which are not flattened by the driver. Another way to flatten small arrays without using json_value is to set EnableArrayFlattening=true.

/*
SELECT * FROM value
WITH(SRC='https://services.odata.org/TripPinRESTierService/People'
, EnableArrayFlattening = 'True' --This is another way to extract data from small arrays. It will flatten arrays upto N items
, MaxArrayItemsToFlatten = 5
)
*/

-- OR --

SELECT 
	UserName,
	Features as Features_Raw,
	json_value(Features,'$.[0]') || ',' || json_value(Features,'$.[1]')  Features_CSV,
	AddressInfo as AddressInfo_Raw,
	json_value(AddressInfo,'$.[0].Address') Address1_Address,
	json_value(AddressInfo,'$.[0].City.Name') Address1_City,
	json_value(AddressInfo,'$.[0].City.CountryRegion') Address1_Country,
	json_value(AddressInfo,'$.[0].City.Region') Address1_Region,
	
	json_value(AddressInfo,'$.[1].Address') Address2_Address,
	json_value(AddressInfo,'$.[1].City.Name') Address2_City,
	json_value(AddressInfo,'$.[1].City.CountryRegion') Address2_Country,
	json_value(AddressInfo,'$.[1].City.Region') Address2_Region,
	Cost
FROM value
WITH(SRC='https://services.odata.org/TripPinRESTierService/People'
)

Functions Regular Expression

This example shows how to use regular expression functions.
SELECT 
 regex_ismatch(SSN, '\d{3}-\d{3}-\d{4}') IsValidSSN, --check pattern found
 regex_replace(SSN, '\d','*') MaskedSSN, --search/replace pattern
 regex_search(SSN, '\d{4}')  SSNLast4Digits, --extract substring using pattern search
 regex_search(Email, '(?<name>\w+)@(?<domain>\w+.com)',0, 2)  DomainOf1stEmailByIndex, --extract 1st email (zero based index) and get domain name (by group name). For 2nd argument you can use numeric index or named argument. Both arguments are zero based index. 
 regex_search(Email, '(?<name>\w+)@(?<domain>\w+.com)',2, 2)  DomainOf3rdEmailByIndex, --extract 3rd email (zero based index) and get 2nd part of match (i.e. domain name). Both arguments are zero based index.
 regex_search('a@x.com | b@y.com | c@z.com', '(?<name>\w+)@(?<domain>\w+.com)','-0', 2)  DomainOfLastEmail, --extract last email (use negative sign in quote - zero based index) and get 2nd part of match (i.e. domain name). Both arguments are zero based index.
 regex_search(
       'a@x.com | b@y.com | c@z.com'
     , '(?<name>\w+)@(?<domain>\w+.com)'
     , '*' --occurrence match index ( * , N or -N ) negative number will do reverse find (e.g. to get last match use -0 , to get second last use -1)
     , 2  --group index from match (default is full match or -1)
     --, 0  -- ignore case for pattern match (0=strict match, 1=ignore case, default is 0) 
     --, 0  -- throw error if not matched 0=don't throw error, 1=throw error
     --, '#'  -- separator for multi match
    )  GetAllDomains --extract domains from all emails and show as email (zero based index) and get 2nd part of match (i.e. domain name). Both arguments are zero based index.    
FROM $ WITH (Filter='$.data.row[*]', DATA= '{
 data: {
  row:[{SSN: "111-222-4444", Email: "a@x.com | b@y.com | c@z.com" },{SSN: "111-X2Z-AB44", Email: "a@x.com | b@y.com | c@z.com"}]
 }
}
')

Functions Using Placeholder Functions anywhere in SQL

This example shows how to use placeholder functions anywhere in your SQL query. Placeholders are searched and replaced in the SQL text before the parser analyzes the query, so you can use them anywhere in the SQL. Another advantage of placeholder functions is that they do not invoke the client-side query processing engine, unlike other advanced SQL constructs (e.g. WHERE, GROUP BY, ORDER BY, UDF). Here is a list of placeholder functions.

<<FUN_NOW>>
<<FUN_TODAY>>
<<today+2d,FUN_TO_DATE>>
<<yearstart,FUN_TO_DATE>>
<<monthstart-1d,FUN_TO_DATE>>
<<now+2h,FUN_TO_DATETIME>>
<<your_file_path,FUN_READFILE>>
<<input_text,FUN_BASE64ENC>>
<<input_text,FUN_BASE64DEC>>
<<input_text,FUN_URLENC>>
<<input_text,FUN_URLDEC>>
<<date_format,FUN_TODAY>>
<<date_format,FUN_TO_UTC_DATE>>
<<date_format,FUN_TO_UTC_DATETIME>>
<<date_format,FUN_TO_LOCAL_DATE>>
<<date_format,FUN_TO_LOCAL_DATETIME>>

Full list of functions found here: https://zappysys.com/links/?id=10103

Where date_format can be any valid format specifier from https://docs.microsoft.com/en-us/dotnet/standard/base-types/custom-date-and-time-format-strings

Examples:
select * from $ WITH (SRC='http://httpbin.org/get', HEADER='Authorization: Basic <<Myuser12:MyP@ssW!,FUN_BASE64ENC>>')
select * from $ WITH (SRC='http://httpbin.org/get?id=<<FUN_NEW_GUID>>')
select * from $ WITH (SRC='<<c:\temp\path.txt,FUN_READFILE>>')
select * from $ WITH (SRC='http://httpbin.org/get?startdate=<<yyyy-MM-dd HH:mm:ss,FUN_TODAY>>')
select * from $ WITH (SRC='http://httpbin.org/get', HEADER='date:<<2012-01-01T00:00:00,FUN_TO_UTC_DATE>>')
SELECT * FROM $ WITH(SRC='http://httpbin.org/json?date-para=<<yyyy-MM-dd,FUN_TODAY>>')

Functions Use Function(s) without FROM clause

This example shows how to call function(s) without using the FROM clause.
SELECT RTRIM('trimmed     ') Col1, 
               DATE('now', '+5 day') Col2

Advanced Options Custom Search / Replace (Remove special characters)

This example shows how to use the custom search/replace feature. This function is invoked before the parser parses the data, so it is useful for unique scenarios that require removing conflicting characters, renaming columns, etc.
SELECT * FROM rows 
WITH 
(DATA='
{          
  rows : [
        {_id:1, _name: "AAA",number:{}}, 
        {_id:2, _name: "BBB",number:{}}, 
        {_id:3, _name: "CCC",number:{}}
  ]
}'
,EnableCustomReplace=1
,SearchFor='number:{}'  --//or USE: --regex  , --regex-ic (case-insensitive) or no suffix for plain text search.
,ReplaceWith='number:{blank:1}'
)

Hierarchy Filter / Flattening Flatten Hierarchy with Filter (JSONPath / Regular Expression / Regex)

This example shows how to query data using a direct Filter expression and some other filter-related settings. It also shows how to use regular expressions to filter nodes.
/*
===========================
Filter Examples (for in-memory filter. Full object loaded in memory):
===========================
$.store.books[*] //get all books for store
$.store.employees[*] //get all employees for store
$.store.books[*].sections[*] //get all sections from all books
$.store.books[*].author //get all authors of all books for store
$.store.books[*] //get all books for store
$.store.books[2] //get 3rd book record
$.store.books[:2] //get first 2 books from the top
$.store.books[-2:] //get last 2 books
$.store.books[?(@author=='sam')] //get all books where author attribute equals to 'sam'
$..[?(@author=='sam')] //get all documents / sub documents (any level) where author attribute equals to 'sam'
$.store.books[?(@author=~ /sam|bob/ )] //Using Regular Expression: get all books where author attribute contains 'sam' or 'bob'
$.store.books[?(@author=~ /^((?!sam|bob).)*$/ )] //Using Regular Expression: get all books where author attribute does not contain word 'sam' or 'bob'
$.store.books[?(@author=~ /^((?!sam).)*$/ && @category=~ /^((?!science).)*$/ )] //Using Regular Expression: get all books where author attribute does not contain word 'sam' and category attribute does not contain word 'science'
$.store.books[?(@author=~ /^sam|bob$/ )] //Using Regular Expression: get all books where author name is exactly 'sam' or 'bob'
$.store.books[?(@author=~ /^sam/ )] //Using Regular Expression: get all books where author name starts with 'sam'
$.store.books[?(@author=~ /sam$/ )] //Using Regular Expression: get all books where author name ends with 'sam'
$.store.employees[?( @.hiredate>'2015-01-01' && @.hiredate<'2015-01-04' )] //get all employees where hiredate between two dates
$.store.books[?(@.price<10)] //get books where price is less than 10
$.store.books[?(@.tag)] //filter all books with tag
$.[0] //Get first node from Array Json format (only works if JSON starts with array)
===========================
Filter Examples (for Stream Mode filter):
===========================
$.store.books--FAST //get all books in stream mode. Means books property (which may have many records) is not loaded in memory but its streamed as needed
//stream mode is very fast but has limited filter capability so use very simple expressions (e.g. $.prop1.innerProp2)
//by default all simple expressions (e.g. no array brackets in expression) are STREAM mode
*/
Sample JSON document referenced by the filter examples above:
{ "store": { "employees": [ { "name": "bob", "hiredate": "2015-01-01" }, { "name": "sam", "hiredate": "2015-01-02" }, { "name": "ken", "hiredate": "2015-01-03" } ], "books": [ { "category": "reference", "author": "bob", "title": "hellooo1", "price": 1.95, "sections": [ "s1", "s2", "s3" ] }, { "category": "fiction", "author": "sam", "title": "hellooo2", "price": 1.96, "sections": [ "s4", "s1", "s3" ] }, { "category": "science", "author": "steve", "title": "hellooo3", "tag": "1bcd", "price": 11, "sections": [ "s9", "s2", "s3" ] } ], "location": { "street": "123 Main St.", "city": "Newyork", "state": "GA" } } }
SELECT * FROM $ 
WITH 
(
--enter path in SRC or use static value in DATA
--SRC='http://some/api/data.json'
--SRC='C:\somefile.json'
--SRC='C:\some*.json'
DATA='
{ version:1,         
  doc: {     
	  location: {lat:111.55,lon:-222.12}, 
	  rows: [
	        {id:1, name: "AAA"}, 
	        {id:2, name: "BBB"}, 
	        {id:3, name: "CCC"}
	  ]
  }
}'
,Filter='$.doc.rows[*]' -- for more information check this https://zappysys.com/blog/jsonpath-examples-expression-cheetsheet/
--,Filter='$.doc.rows[?(@name=='AAA')]' --//equal match example
--,Filter='$..[?(@name=='AAA')]' --//equal match example (scan all levels - parent or child)
--,Filter='$.doc.rows[?(@name=~/AAA|BBB/)]' --//regex match example

,IncludeParentColumns='true'  --this will include any columns found above filtered node (e.g. rows).
,ParentColumnPrefix='p_'
)

Language Features Pivot Value (Columns to Rows)

This example shows how to pivot columns to rows (i.e. transform a property name into a row value). By default the tag name (i.e. property) is output as a column name, but when your column names are dynamic you can use the Pivot feature to transform the data.
SELECT
  Pivot_Name as ProjectName, 
  Pivot_Value_id as Id, 
  Pivot_Value_code as Code,
  Pivot_Value_budget as Budget
FROM $
WITH(
Filter='$.projects',
--enter path in SRC or use static value in DATA
--SRC='http://some/api/data.json'
--SRC='C:\some*.json'
--SRC='C:\somefile.json'
DATA='
{          
  projects : {
    P101 : {id:1, code: "AAA", budget:1500.10}, 
    P102 : {id:2, code: "BBB", budget:7000.20}, 
    P103 : {id:3, code: "CCC", budget:1100.10}
  }
}
',
EnablePivot='True'
)

Array Transformation Query Simple 2D Array

This example shows how to query a compact 2D array (array inside array) where column names are not repeated. You can specify the Simple 2D transformation along with a filter to select the column names (e.g. $.columns[*]).
SELECT * FROM rows
WITH(
--enter path in SRC or use static value in DATA
--SRC='http://some/api/data.json'
--SRC='C:\some*.json'
--SRC='C:\somefile.json'
DATA='
{
  "columns": ["RecordID","CustomerID","CustomerName"],
  "rows": [ [1,"AAA","Customer A"], [2,"BBB","Customer B"], [3,"CCC","Customer C"] ]
}'
,ArrayTransformType='TransformSimpleTwoDimensionalArray'
,ArrayTransColumnNameFilter='$.columns[*]'
)

Array Transformation Query Simple 2D Array (With Missing Columns)

This example shows how to query a compact 2D array (array inside array) where the column names are missing, so custom column names are supplied.
SELECT * FROM rows
WITH(
--enter path in SRC or use static value in DATA
--SRC='http://some/api/data.json'
--SRC='C:\some*.json'
--SRC='C:\somefile.json'
DATA='
{
  "rows": [ [1,"AAA","Customer A"], [2,"BBB","Customer B"], [3,"CCC","Customer C"] ]
}'
,ArrayTransformType='TransformSimpleTwoDimensionalArray'
,ArrayTransEnableCustomColumns='True'
,ArrayTransCustomColumns='RecordID,CustomerID,CustomerName'
)

Array Transformation Query Complex 2D Array

This example shows how to query a complex 2D array where column names and row values are stored under separate nodes. You can specify the Complex 2D transformation along with filters to select the column names (e.g. $.schema.fields[*].name) and the row values (e.g. $.f[*].v).
SELECT * FROM rows
WITH(
--enter path in SRC or use static value in DATA
--SRC='http://some/api/data.json'
--SRC='C:\some*.json'
--SRC='C:\somefile.json'
DATA='
{ 
	requestid:1, 
	schema:{ fields:[{name:"id"},{name:"name"},{name:"birthdate"}] }, 
	rows: [ { f:[{v:10},{v:"aaa"},{v:"2015-01-01"}]}, { f:[{v:20},{v:"bbb"},{v:"2015-02-01"}] } ]
}'
,ArrayTransformType = 'TransformComplexTwoDimensionalArray'
,ArrayTransColumnNameFilter = '$.schema.fields[*].name'
,ArrayTransRowValueFilter = '$.f[*].v'
)

Advanced Options Query Multiple URLs / Paths without UNION keyword

This example shows how to call multiple URLs and union the results. Use this approach when you don't have an option for API pagination or you want to query selected files / URLs only. You can also use temp tables along with the UNION statement, but this approach can be better because it doesn't invoke the client-side engine.
SELECT * FROM $ WITH 
(
--//separate URLs or Local Paths using double colon ( :: ) or new line (like below). 
--//Example of double colon :: as separator
--//http://a.com/storeid=1::http://a.com/storeid=5

SRC='https://zappysys.com/downloads/files/test/cust-1.json
     https://zappysys.com/downloads/files/test/cust-2.json', 
EnableMultiPathMode='True'
)

Language Features DateTime field Compare in WHERE / ORDER BY

This example shows how to use a date/time field in WHERE / ORDER BY. Date/time fields need special handling: you have to use the DATETIME function, otherwise a string compare is done rather than a true date/time compare. For example: WHERE OrderDate<=DATETIME('1997-12-31 00:00:00') -- OR WHERE OrderDate<=DATE('1997-12-31'). You must use DATETIME on either the left side or the right side. You must also use the 'yyyy-MM-dd', 'yyyy-MM-dd HH:mm:ss' or 'yyyy-MM-dd HH:mm:ss.fff' format (where fff is milliseconds).
SELECT * FROM value 
WHERE OrderDate<=DATETIME('1997-12-31 00:00:00') -- OR use DATE('1997-12-31') DateTime column can be queried this way. You must wrap DATETIME function around it. Also must use 'yyyy-MM-dd' or 'yyyy-MM-dd HH:mm:ss'  or 'yyyy-MM-dd HH:mm:ss.fff' format (where fff is milliseconds)
ORDER BY DATETIME(OrderDate) 
WITH (SRC='https://services.odata.org/V3/Northwind/Northwind.svc/Invoices?$format=json')

Language Features Group By / Limit / Order By

This example shows how to use GROUP BY / ORDER BY and LIMIT clauses.
SELECT 
  Country AS Invoice_Country
, SUM(UnitPrice * Quantity) AS Invoice_Total
FROM value 
WHERE Discount > 0
--AND OrderDate<=DATETIME('1997-12-31 00:00:00') -- OR use DATE('1997-12-31') DateTime column can be queried this way. You must wrap DATETIME function around it. Also must use 'yyyy-MM-dd' or 'yyyy-MM-dd HH:mm:ss'  or 'yyyy-MM-dd HH:mm:ss.fff' format (where fff is milliseconds)
GROUP BY Country 
HAVING SUM(UnitPrice * Quantity) > 1000
ORDER BY Invoice_Total DESC--,DATETIME(OrderDate) 
LIMIT 3
WITH (SRC='https://services.odata.org/V3/Northwind/Northwind.svc/Invoices?$format=json')

Language Features Case Statement

This example shows how to write a CASE statement to select / output a different value based on multiple conditions.
SELECT 
  name 
, CASE id 
 WHEN 1 THEN 1+1 
 WHEN 2 THEN 2+2 
 ELSE 0 
 END ThisIsCaseColumn
FROM rows 
WITH 
(DATA='
{          
  rows : [
        {id:1, name: "AAA"}, 
        {id:2, name: "BBB"}, 
        {id:3, name: "CCC"}
  ]
}'
)

Language Features SELECT INTO (Temporary Table) / Multi Statements

This example shows how to use the SELECT INTO statement to save a result into a temporary table and use it later in the same batch. A temp table is indicated by the # prefix (e.g. #tmp). This example also uses multiple statements in the same batch. If you have multiple results in the same batch (i.e. multiple SELECT * FROM statements), then only the last result is returned and the other results are discarded. Each statement must be separated by a semicolon ( ; ).
SELECT * INTO #tmp FROM rows 
WITH 
(
--enter path in SRC or use static value in DATA
--SRC='http://some/api/data.json'
--SRC='C:\some*.json'
--SRC='C:\somefile.json'
--OR--
DATA='
{          
  rows : [
        {id:1, name: "AAA"}, 
        {id:2, name: "BBB"}, 
        {id:3, name: "CCC"}
  ]
}'
);

--//second statement in the same batch 
select * from #tmp 
where id > 1
--AND SomeDateColumn=DATETIME('2012-12-31 00:00:00')  --DateTime column can be queried this way... Must use 'yyyy-MM-dd HH:mm:ss' format
;

Language Features JOIN Statement using SELECT INTO / Multi Statements

This example shows how to use the SELECT INTO statement together with a JOIN, using multiple statements in the same batch. Only the output of the last SELECT is returned if you have multiple SELECT * FROM statements in the same batch. Each statement must be separated by a semicolon ( ; ).

SELECT * INTO #cust FROM rows 
WITH (
--enter path in SRC or use static value in DATA
--SRC='http://some/api/customers.json'
--SRC='C:\cust*.json'
--SRC='C:\customers.json'
DATA='{
rows: [
  {customerid: 1, name: "AAA"},
  {customerid: 1, name: "BBB"},
  {customerid: 1, name: "CCC"}
 ]
}
'
);

SELECT * INTO #ord FROM rows
WITH (
DATA='{
rows: [
  {orderid:1000, customerid: 1, orderdate: "2012-01-01T00:00:00",total:2000.50},
  {orderid:1001, customerid: 1, orderdate: "2012-01-01T00:00:00",total:2000.50},
  {orderid:1002, customerid: 2, orderdate: "2013-01-01T00:00:00",total:5000.70},
  {orderid:1003, customerid: 2, orderdate: "2013-01-02T00:00:00",total:6000.70}
 ]
}'
);

--//second statement in the same batch 
SELECT c.customerid,o.orderid,o.orderdate,o.total 
FROM #cust c
INNER JOIN #ord o ON c.customerid=o.customerid
--LEFT OUTER JOIN #ord o ON c.customerid=o.customerid
--WHERE o.orderdate=DATETIME('2012-01-01 00:00:00') --for datetime column format must be queried in 'yyyy-MM-dd HH:mm:ss' format only
-- To Use greater than less than must need to use DATETIME function (e.g. WHERE o.OrderDate<=DATETIME('1997-12-31 23:12:59')
;            

Language Features UNION ALL / UNION Statement

This example shows how to write a UNION or UNION ALL statement to combine results from multiple SQL queries. To remove duplicate rows from the result, use UNION rather than UNION ALL.
SELECT * into #tbl1 FROM rows 
WITH (
--enter path in SRC or use static value in DATA
--SRC='http://some/api/data.json'
--SRC='C:\some*.json'
--SRC='C:\somefile.json'

DATA='{ rows : [{id:1, name: "AAA"}, {id:2, name: "BBB"}]}'
);

SELECT * into #tbl2 FROM rows 
WITH (DATA='{ rows : [{id:3, name: "CCC"}, {id:4, name: "DDD"}]}' );

select * from #tbl1
UNION ALL
select * from #tbl2;

Language Features Basic SQL queries

Basic query examples.

--Whenever possible try to write a simple query. When you use WHERE / GROUP BY / ORDER BY expressions, the driver may invoke client-side processing if the server side doesn't support that construct.
--For comments use a line comment with two dashes (e.g. --my comment) or a block comment for multi-line comments (e.g. /*my comment*/ )
--Any query must contain at least two required parts
--data source (or table) names and a list of column names
--column names come first (preceded by SELECT keyword)
--table names follow (preceded by FROM keyword)

--Here is the query that returns all rows with all columns (* stands for all columns) from 'Accounts' table
SELECT * FROM Accounts

--Use quoted identifiers when you have space or other character in names (e.g. Column or Table name contains space)
SELECT ID, "Account Name" FROM "Closed Accounts"
SELECT ID, [Account Name] FROM [Closed Accounts]

--returns all rows from 'Accounts' table, each row will have Id and Amount columns
SELECT Id, Amount FROM Accounts

--returns only those rows of 'Accounts' table that meet next condition: Amount is greater or equal to 1000.0
SELECT Id, Amount FROM Accounts WHERE Amount >= 1000.0

--returns only those rows of 'Accounts' table that meet next condition: Amount is greater than 1000 or DepartamentID is equal to 123 
SELECT Id, Amount FROM Accounts WHERE (Amount > 1000.0 OR DepartamentID = 123)

--returns only those rows of 'Accounts' table that meet next condition: DepartamentID is equal to 1, 101 or 202 
SELECT Id, Amount FROM Accounts WHERE DepartamentID IN (1, 101, 202)

--returns only those rows of 'Accounts' table that meet next condition: DepartamentID is equal to 321 and Amount is less than 100 
SELECT Id, Amount FROM Accounts WHERE DepartamentID = 321 AND Amount < 100

--returns only those rows of 'Persons' table that meet next condition: Name column value starts with 'John'
SELECT * FROM Person WHERE Name LIKE 'John%'

--case statement
SELECT 
 FirstName,
 LastName,
 CASE Country 
  WHEN 'US' THEN 'NA' 
  WHEN 'Canada' THEN 'NA'
  WHEN 'India' THEN 'Asia'
  WHEN 'China' THEN 'Asia'
  ELSE 'Unknown' 
 END ColRegion 
 FROM Person 

--escaping string values (use of \n \r \t \v allowed)
SELECT * FROM MyTable WITH( Body=@'Line1\nLine2')


Copyrights reserved. ZappySys LLC.