Google Drive Connector for Azure Data Factory (SSIS)
Read and write Google Drive data inside your app and perform many Google Drive operations without coding, using the easy-to-use, high-performance API Connector for Google Drive.
In this article you will learn how to
quickly and efficiently integrate Google Drive data in Azure Data Factory (SSIS) without coding.
We will use high-performance Google Drive Connector
to easily connect to Google Drive and then access the data inside Azure Data Factory (SSIS).
Let's follow the steps below to see how we can accomplish that!
Download
Documentation
Google Drive Connector for Azure Data Factory (SSIS) is based on ZappySys API Connector Framework
which is a part of SSIS PowerPack.
It is a collection of high-performance SSIS connectors that enable you
to integrate data with virtually any data provider supported by SSIS, including SQL Server.
SSIS PowerPack supports various file formats, sources and destinations,
including REST/SOAP API, SFTP/FTP, storage services, and plain files, to mention a few
(if you are new to SSIS and SSIS PowerPack, find out more on how to use them).
Create SSIS package
First of all, create an SSIS package, which will connect to Google Drive in SSIS.
Once you do that, you are one step closer to deploying and running it in Azure-SSIS integration runtime in Azure Data Factory (ADF).
Then simply proceed to the next step - preparing custom setup files for the Azure-SSIS runtime.
Prepare custom setup files for Azure-SSIS runtime
Now it's time to start preparing custom setup files for Azure-SSIS runtime.
During Azure-SSIS runtime creation you can instruct ADF to perform a custom setup on a VM (Azure-SSIS node);
e.g. run a custom installer, copy files, execute PowerShell scripts, and so on.
In that case, your custom setup files are downloaded and run in the Azure-SSIS node (a VM) when you start the runtime.
In this section we will prepare custom setup files so that you can run SSIS packages with SSIS PowerPack connectors inside the Azure-SSIS runtime.
Trial Users
Use the steps below if you are a Trial User, i.e. you have not purchased a license key yet.
Proceed with these steps:
-
Download SSIS PowerPack trial installer.
Make sure you don't rename the installer and keep it named as SSISPowerPackSetup_64bit_Trial.msi.
-
Create a text file and name it main.cmd (make it all lowercase, very important).
-
Copy and paste this script into it and save it:
set DIR=%CUSTOM_SETUP_SCRIPT_LOG_DIR%
echo Calling Step 1 : %TIME% >> "%DIR%\steps_log.txt"
dir /s /b > "%DIR%\file_list.txt"
echo Calling Step 2 : %TIME% >> "%DIR%\steps_log.txt"
::Install SSIS PowerPack
msiexec /i "SSISPowerPackSetup_64bit_Trial.msi" ADDLOCAL=ALL /q /L*V "%DIR%\powerpack_trial_install_log.txt"
echo Calling Step 3 : %TIME% >> "%DIR%\steps_log.txt"
dir "C:\Program Files\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
dir "C:\Program Files (x86)\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
echo DONE : %TIME% >> "%DIR%\steps_log.txt"
echo complete
This is the entry-point script that is executed when Azure-SSIS runtime is started.
-
At last! You are ready to upload these two files — main.cmd & SSISPowerPackSetup_64bit_Trial.msi — into your Azure Blob Storage container's folder, which we will do in the Upload custom setup files to Azure Blob Storage container step.
Paid Customers
Use the steps below if you are a Paid Customer, i.e. you have purchased a license.
Proceed with these steps:
-
Download SSIS PowerPack paid installer.
Make sure you don't rename the installer and keep it named as SSISPowerPackSetup_64bit.msi.
-
Have your SSIS PowerPack license key handy; we will need it in the script below.
-
Create a text file and name it main.cmd (make it all lowercase, very important).
- Copy and paste the script below into it.
- Replace the placeholder value of the
--register
argument with your real license key.
-
Finally, save main.cmd:
set DIR=%CUSTOM_SETUP_SCRIPT_LOG_DIR%
echo Calling Step 1 : %TIME% >> "%DIR%\steps_log.txt"
dir /s /b > "%DIR%\file_list.txt"
echo Calling Step 2 : %TIME% >> "%DIR%\steps_log.txt"
::Install SSIS PowerPack
msiexec /i "SSISPowerPackSetup_64bit.msi" ADDLOCAL=ALL /q /L*V "%DIR%\powerpack_install_log.txt"
echo Calling Step 3 : %TIME% >> "%DIR%\steps_log.txt"
::Activate PowerPack license (Optional)
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -p SSISPowerPack --register "lgGAAO0-----REPLACE-WITH-YOUR-LICENSE-KEY-----czM=" --logfile "%DIR%\powerpack_register_log.txt"
::Show System Info
echo Calling Step 4 : %TIME% >> "%DIR%\steps_log.txt"
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -i -l "%DIR%\sysinfo_log.txt"
echo Calling Step 5 : %TIME% >> "%DIR%\steps_log.txt"
dir "C:\Program Files\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
dir "C:\Program Files (x86)\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
echo DONE : %TIME% >> "%DIR%\steps_log.txt"
echo complete
This is the entry-point script that is executed when Azure-SSIS runtime is started.
-
At last! You are ready to upload these two files — main.cmd & SSISPowerPackSetup_64bit.msi — into your Azure Blob Storage container's folder, which we will do in the Upload custom setup files to Azure Blob Storage container step.
Upload custom setup files to Azure Blob Storage container
In the Azure Blob Storage container we will store the custom setup files we prepared in the previous step, so that the Azure-SSIS runtime can use them in the custom setup process.
Just perform these very simple, but very important steps:
-
Create an Azure Blob Storage container, if you haven't done so already.
Make sure you create and use an Azure Blob Storage container and not an Azure Data Lake Storage folder.
Azure Data Lake Storage does not allow creating a SAS URI for the container, which is a crucial step in this process.
-
Find Blob Containers node, right-click on it and hit Create Blob Container option:
-
Upload the two custom setup files — main.cmd & the MSI installer — into your Azure Blob Storage container's folder:
- That was easy, wasn't it? It's time to create a SAS URI in the next step.
Create SAS URI for Azure Blob Container
Once you have the custom setup files prepared, it's time to generate a SAS URI.
This SAS URI will be used by a new Azure-SSIS runtime to install SSIS PowerPack inside the runtime's node, a VM.
Let's proceed together by performing the steps below:
- Install and launch Azure Storage Explorer.
-
Right-click on the Storage Accounts node and then hit Connect to Azure storage... menu item:
-
Proceed by right-clicking on that container node and select Get Shared Access Signature... option.
-
Next, set the Expiry time field to a date far in the future.
If you restart the Azure-SSIS runtime and your SAS URI has expired by that time, the runtime will not start.
-
Select Read, Create, Write, and List permissions:
We also recommend adding the Delete permission to support future functionality.
-
Copy SAS URL to the clipboard and save it for the next step:
You can also generate and copy SAS URL from within Azure Portal itself:
Create Azure-SSIS integration runtime
Once you have the SAS URL from the previous step, you are ready to create an Azure-SSIS runtime in Azure Data Factory:
-
Firstly, perform the steps described in Create an Azure-SSIS integration runtime article in Azure Data Factory reference.
-
In the Advanced settings section, configure the Custom setup container SAS URI you obtained in the previous step:
-
And you are done! That was quick! You can see your Azure-SSIS runtime up and running:
The custom setup script is executed only once, at the time an Azure-SSIS runtime is started.
It is executed again if you stop and then start the Azure-SSIS runtime.
Deploy SSIS package in Visual Studio
We are ready to deploy the SSIS package to Azure-SSIS runtime. Once you do that, proceed to the next step for the grand finale!
Execute SSIS package in SQL Server Management Studio (SSMS)
After all the hard work, we are ready to execute the SSIS package in SQL Server Management Studio (SSMS):
- Connect to the SQL Server which is linked to your Azure-SSIS runtime and contains SSISDB database.
-
Navigate to Integration Services Catalog » Your Folder » Your Project » Your Package, right-click on it, and hit Execute...:
-
To view the status of past executions, navigate to
Integration Services Catalog » Your Folder » Your Project » Your Package, right-click on it, and select Reports » Standard Reports » All Executions menu item:
Scenarios
Moving SSIS PowerPack license to another Azure-SSIS runtime
If you are a Paid Customer, there may come a time when you no longer use an Azure-SSIS runtime or need to use your license on a different ADF instance.
To transfer a license from one Azure-SSIS runtime to another, perform these steps:
-
Copy and paste this script into the main.cmd file we used previously:
set DIR=%CUSTOM_SETUP_SCRIPT_LOG_DIR%
echo Calling Step 1 : %TIME% >> "%DIR%\steps_log.txt"
dir /s /b > "%DIR%\file_list.txt"
echo Calling Step 2 : %TIME% >> "%DIR%\steps_log.txt"
::Install SSIS PowerPack
msiexec /i "SSISPowerPackSetup_64bit.msi" ADDLOCAL=ALL /q /L*V "%DIR%\powerpack_install_log.txt"
echo Calling Step 3 : %TIME% >> "%DIR%\steps_log.txt"
::De-Activate same license
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -p SSISPowerPack --unregister --logfile "%DIR%\powerpack_un_register_log.txt"
::Show System Info
echo Calling Step 4 : %TIME% >> "%DIR%\steps_log.txt"
"C:\Program Files (x86)\ZappySys\SSIS PowerPack (64 bit)\LicenseManager.exe" -i -l "%DIR%\sysinfo_log.txt"
echo Calling Step 5 : %TIME% >> "%DIR%\steps_log.txt"
dir "C:\Program Files\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
dir "C:\Program Files (x86)\Microsoft SQL Server\*Zappy*.*" /s /b >> "%DIR%\installed_files.txt"
echo DONE : %TIME% >> "%DIR%\steps_log.txt"
echo complete
- Start Azure-SSIS runtime.
This will unregister your license on the original Azure-SSIS runtime.
- Stop Azure-SSIS runtime to deallocate resources in Azure.
- Now you are free to activate the license on another Azure-SSIS runtime.
Advanced topics
Actions supported by Google Drive Connector
The Google Drive Connector supports the following actions for REST API integration.
If an action you need is not listed below, you can easily edit the Connector file and enhance the out-of-the-box functionality.
Lists the user's shared drives (formerly known as Team Drives) with optional search criteria [
Read more...
]
List files / folders with search criteria [
Read more...
]
Parameter |
Description |
Search Criteria |
Data filter (e.g. somecolumn -eq 'somevalue'). You can combine multiple filter criteria using the 'AND' / 'OR' operators. Note: refer to this link for more filter criteria: https://zappysys.com/links?url=https://developers.google.com/drive/api/guides/search-files
Option |
Value |
None |
|
By name (exact name match) |
name='abc' |
By name (contains sub string) |
name contains 'abc' |
By name (does not contain) |
not name contains 'abc' |
By text (search inside file) |
fullText contains 'abc' |
List items from a folder |
'your-folder-id' in parents and trashed=false |
List deleted items from a folder |
'your-folder-id' in parents and trashed=true |
By Parent Folder Id |
'your-folder-id' in parents |
By created time |
createdTime > '2012-06-04T12:00:00' |
By modified time |
modifiedTime > '2012-06-04T12:00:00' |
Allow only shared files and folders |
sharedWithMe=true |
Exclude trashed files |
trashed=false |
Include files from trash |
mimeType!='application/vnd.google-apps.folder' |
Exclude files from trash |
mimeType!='application/vnd.google-apps.folder' and trashed!=true |
Exclude Folders |
mimeType!='application/vnd.google-apps.folder' |
Exclude App Script |
mimeType!='application/vnd.google-apps.script' |
Search for spreadsheet |
mimeType = 'application/vnd.google-apps.spreadsheet' |
Search for multiple files type |
mimeType contains 'application/vnd.google-apps.spreadsheet' OR mimeType contains 'application/vnd.google-apps.document' OR mimeType contains 'application/vnd.google-apps.presentation' OR mimeType contains 'application/vnd.google-apps.drawing' |
|
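Outside of SSIS, the Search Criteria values in the table above map directly onto the q parameter of Google Drive's public files.list endpoint. A minimal sketch of that mapping (the helper name and the bearer-token placeholder are mine, not connector internals); individual filters are simply joined with and:

```python
from urllib.parse import urlencode

def build_files_list_url(*criteria, page_size=100):
    # Each criterion is one row from the table above; Drive's query
    # language combines them with 'and' / 'or'.
    params = {
        "q": " and ".join(criteria),
        "pageSize": page_size,
        "fields": "nextPageToken, files(id, name, mimeType, modifiedTime)",
    }
    return "https://www.googleapis.com/drive/v3/files?" + urlencode(params)

url = build_files_list_url(
    "name contains 'report'",
    "mimeType != 'application/vnd.google-apps.folder'",
    "trashed = false",
)
# Send with an OAuth bearer token, e.g.:
# requests.get(url, headers={"Authorization": "Bearer " + ACCESS_TOKEN})
print(url)
```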
List files / folders from a parent folder (Recursive)
List files or folders under a specified parent folder [
Read more...
]
Parameter |
Description |
Extra Query (must start with ' and ' --OR-- ' or ') |
Data filter (e.g. somecolumn -eq 'somevalue'). You can combine multiple filter criteria using the 'AND' / 'OR' operators. Note: refer to this link for more filter criteria: https://zappysys.com/links?url=https://developers.google.com/drive/api/guides/search-files
Option |
Value |
None |
|
By name (exact name match) |
and name='abc' |
By name (contains sub string) |
and name contains 'abc' |
By name (does not contain) |
and not name contains 'abc' |
By text (search inside file) |
and fullText contains 'abc' |
By created time |
and createdTime > '2012-06-04T12:00:00' |
By modified time |
and modifiedTime > '2012-06-04T12:00:00' |
Allow only shared files and folders |
and sharedWithMe=true |
Exclude App Script |
and mimeType!='application/vnd.google-apps.script' |
Search for spreadsheet |
and mimeType = 'application/vnd.google-apps.spreadsheet' |
Search for multiple files type |
and mimeType contains 'application/vnd.google-apps.spreadsheet' OR mimeType contains 'application/vnd.google-apps.document' OR mimeType contains 'application/vnd.google-apps.presentation' OR mimeType contains 'application/vnd.google-apps.drawing' |
|
Search under Folder Id (Keep Blank for all folders) |
Folder Id for which you like to list all files
|
Include files from trash |
Option |
Value |
false |
false |
true |
true |
|
Search Item type (i.e. files or folders) |
Option |
Value |
all |
all |
files |
files |
folders |
folders |
files_native |
files_native |
files_exclude_native |
files_exclude_native |
sheets |
sheets |
documents |
documents |
|
Search items (i.e. files / folders)
Lists items (i.e. files / folders) with search criteria [
Read more...
]
Lists folders [
Read more...
]
Parameter |
Description |
Search Criteria |
Data filter (e.g. somecolumn -eq 'somevalue'). You can combine multiple filter criteria using the 'AND' / 'OR' operators. Note: refer to this link for more filter criteria: https://zappysys.com/links?url=https://developers.google.com/drive/api/guides/search-files
Option |
Value |
None |
|
By name (exact name match) |
name='abc' |
By name (contains sub string) |
name contains 'abc' |
By name (does not contain) |
not name contains 'abc' |
By text (search inside file) |
fullText contains 'abc' |
By created time |
createdTime > '2012-06-04T12:00:00' |
Include folders from trash |
mimeType='application/vnd.google-apps.folder' |
Exclude folders from trash |
mimeType='application/vnd.google-apps.folder' and trashed!=true |
By modified time |
modifiedTime > '2012-06-04T12:00:00' |
Allow only shared files and folders |
sharedWithMe=true |
Exclude trashed files |
trashed=false |
Exclude Folders |
mimeType!='application/vnd.google-apps.folder' |
Exclude App Script |
mimeType!='application/vnd.google-apps.script' |
Search for spreadsheet |
mimeType = 'application/vnd.google-apps.spreadsheet' |
Search for multiple files type |
mimeType contains 'application/vnd.google-apps.spreadsheet' OR mimeType contains 'application/vnd.google-apps.document' OR mimeType contains 'application/vnd.google-apps.presentation' OR mimeType contains 'application/vnd.google-apps.drawing' |
|
List deleted files / folders
Lists only deleted files / folders from trash [
Read more...
]
Parameter |
Description |
Search Criteria |
|
Gets information of a file [
Read more...
]
Parameter |
Description |
Id |
Id of a file you want to see information for
|
Downloads a file [
Read more...
]
Parameter |
Description |
File Id |
Id of a file you want to download
|
Export Google editor files (Documents, Spreadsheets, Drawings, Presentations, Apps Scripts) to common file formats, as defined in this link: https://developers.google.com/drive/api/guides/ref-export-formats [
Read more...
]
Parameter |
Description |
File Id |
Id of a file you want to export; the file must be an editor type (e.g. Document, Spreadsheet, Drawing, Presentation, Apps Script)
|
Export As (Mime Type) |
The MIME type of the format requested for this export.
Option |
Value |
None |
|
Export to PDF |
application/pdf |
Export to HTML |
text/html |
Export to HTML (zipped) |
application/zip |
Export to Plain text |
text/plain |
Export to Rich text |
application/rtf |
Export to Open Office doc |
application/vnd.oasis.opendocument.text |
Export to MS Word document |
application/vnd.openxmlformats-officedocument.wordprocessingml.document |
Export to EPUB |
application/epub+zip |
Export to MS Excel |
application/vnd.openxmlformats-officedocument.spreadsheetml.sheet |
Export to Open Office sheet |
application/x-vnd.oasis.opendocument.spreadsheet |
Export to TSV (sheet only) |
text/tab-separated-values |
Export to JPEG |
image/jpeg |
Export to PNG |
image/png |
Export to SVG |
image/svg+xml |
Export to MS PowerPoint |
application/vnd.openxmlformats-officedocument.presentationml.presentation |
Export to Open Office presentation |
application/vnd.oasis.opendocument.presentation |
Export to JSON |
application/vnd.google-apps.script+json |
|
fields |
fields
|
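The export MIME types listed above plug straight into Drive's files.export endpoint. A rough illustration of building such a request URL (the helper function and the shorthand format keys are my own naming, not connector options):

```python
from urllib.parse import quote, urlencode

# A few of the export formats from the table above, keyed by a shorthand.
EXPORT_MIME = {
    "pdf": "application/pdf",
    "docx": "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
    "xlsx": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
    "pptx": "application/vnd.openxmlformats-officedocument.presentationml.presentation",
}

def build_export_url(file_id: str, fmt: str) -> str:
    # files.export only works for Google editor files (Docs, Sheets, ...).
    return ("https://www.googleapis.com/drive/v3/files/"
            + quote(file_id, safe="")
            + "/export?" + urlencode({"mimeType": EXPORT_MIME[fmt]}))

url = build_export_url("1AbC_example_file_id", "pdf")
print(url)
```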
Uploads a file. If the file exists, it does not overwrite it. If you would like to overwrite an existing file, use the [Replace file data] endpoint instead; it requires the FileId of the file you want to replace. Get the file id by calling the list_parent_items or list_files endpoint (search for the file name and get its id). The Google API does not allow performing create-or-replace in one operation. [
Read more...
]
Parameter |
Description |
Keep Revision Forever |
Defines whether uploaded file has revisions
Option |
Value |
true |
true |
false |
false |
|
Parent FolderId |
Id of a parent folder you want to upload the file in. Use value 'root' to place the file in the topmost level.
|
Shared DriveId |
By default, files are listed from My Drive; if you would like to search another Shared Drive, set this parameter.
|
Drive Type |
Default search context is User's drive. Bodies of items (files/documents) to which the query applies. Supported bodies are 'user', 'domain', 'drive', and 'allDrives'. Prefer 'user' or 'drive' to 'allDrives' for efficiency. By default, corpora is set to 'user'. However, this can change depending on the filter set through the 'Query' parameter.
Option |
Value |
My Drive |
user |
Shared Drive |
drive |
|
Supports all drives (e.g. My and Shared) |
Whether the requesting application supports both My Drives and shared drives.
Option |
Value |
true |
true |
false |
false |
|
Target FileName |
A filename the file will have in Google Drive
|
Local FilePath |
Specify a disk file path
|
AddParents |
A comma-separated list of parent IDs to add
|
OcrLanguage |
A language hint for OCR processing during image import (ISO 639-1 code).
|
UseContentAsIndexableText |
Whether to use the uploaded content as indexable text.
|
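Under the hood, a create-style upload to the Drive API sends the file's metadata and its bytes in a single multipart/related request. A sketch of how such a body is assembled (the helper, the boundary string, and the sample metadata are illustrative, not connector internals):

```python
import json

def build_multipart_related(metadata: dict, content: bytes,
                            content_type: str = "application/octet-stream",
                            boundary: str = "drive_upload_boundary") -> bytes:
    # Part 1: JSON metadata (name, parents, ...); part 2: raw file bytes.
    head = (
        f"--{boundary}\r\n"
        "Content-Type: application/json; charset=UTF-8\r\n\r\n"
        f"{json.dumps(metadata)}\r\n"
        f"--{boundary}\r\n"
        f"Content-Type: {content_type}\r\n\r\n"
    ).encode("utf-8")
    tail = f"\r\n--{boundary}--\r\n".encode("utf-8")
    return head + content + tail

body = build_multipart_related(
    {"name": "report.csv", "parents": ["root"]},  # 'root' = topmost level, as above
    b"id,name\n1,test\n",
    content_type="text/csv",
)
# POST to https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart
# with header: Content-Type: multipart/related; boundary=drive_upload_boundary
```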
Upload a file (with overwrite action)
Uploads a file (if a file with the same name exists, it is overwritten; otherwise a new file is created). [
Read more...
]
Parameter |
Description |
Keep Revision Forever |
Defines whether uploaded file has revisions
Option |
Value |
true |
true |
false |
false |
|
Parent FolderId |
Id of a parent folder you want to upload the file in. Use value 'root' to place the file in the topmost level.
|
Shared DriveId |
By default, files are listed from My Drive; if you would like to search another Shared Drive, set this parameter.
|
Drive Type |
Default search context is User's drive. Bodies of items (files/documents) to which the query applies. Supported bodies are 'user', 'domain', 'drive', and 'allDrives'. Prefer 'user' or 'drive' to 'allDrives' for efficiency. By default, corpora is set to 'user'. However, this can change depending on the filter set through the 'Query' parameter.
Option |
Value |
My Drive |
user |
Shared Drive |
drive |
|
Supports all drives (e.g. My and Shared) |
Whether the requesting application supports both My Drives and shared drives.
Option |
Value |
true |
true |
false |
false |
|
Target FileName |
A filename the file will have in Google Drive
|
Local FilePath |
Specify a disk file path
|
File Overwrite Mode |
|
AddParents |
A comma-separated list of parent IDs to add
|
OcrLanguage |
A language hint for OCR processing during image import (ISO 639-1 code).
|
UseContentAsIndexableText |
Whether to use the uploaded content as indexable text.
|
Creates a folder [
Read more...
]
Parameter |
Description |
Name |
A folder name the folder will have in Google Drive
|
Parent FolderId |
Id of a parent folder you want to create the folder in. Use value 'root' to create the folder in the topmost level.
|
Duplicates a file [
Read more...
]
Parameter |
Description |
File Id |
Id of a file you want to duplicate
|
Replace file data (keep same file id)
Update file with new content / metadata (keep same file Id) [
Read more...
]
Parameter |
Description |
File Id |
Id of a file you want to update
|
DiskFilePath |
A disk file path you want to update file contents with
|
KeepRevisionForever |
Defines whether uploaded file has revisions
|
AddParents |
A comma-separated list of parent IDs to add
|
OcrLanguage |
A language hint for OCR processing during image import (ISO 639-1 code).
|
UseContentAsIndexableText |
Whether to use the uploaded content as indexable text.
|
Update metadata in a file
Updates meta-data in a file [
Read more...
]
Parameter |
Description |
File Id |
Id of a file whose metadata you want to update
|
Keep Revision Forever |
Defines whether uploaded file has revisions
Option |
Value |
True |
True |
False |
False |
|
AddParents |
A comma-separated list of parent IDs to add
|
OcrLanguage |
A language hint for OCR processing during image import (ISO 639-1 code).
|
UseContentAsIndexableText |
Whether to use the uploaded content as indexable text.
|
Generic Request
This is a generic endpoint. Use this endpoint when some actions are not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. Most parameters are optional except URL. [
Read more...
]
Parameter |
Description |
Url |
API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of ServiceURL or TrustedDomains
|
Body |
Request Body content goes here
|
IsMultiPart |
Set this option if you want to upload file(s) using either raw file data (i.e., POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data).
A multi-part request allows you to mix key/value pairs and upload files in the same request. On the other hand, raw upload allows only a single file to be uploaded (without any key/value data).
==== Raw Upload (Content-Type: application/octet-stream) =====
To upload a single file in raw mode, check this option and specify the full file path starting with the @ sign in the Body (e.g. @c:\data\myfile.zip)
==== Form-Data / Multipart Upload (Content-Type: multipart/form-data) =====
To treat your request data as multi-part fields, you must specify key/value pairs separated by new lines in the RequestData field (i.e., Body). Each key/value pair should be entered on a new line, and key/value are separated using an equal sign (=). Leading and trailing spaces are ignored, and blank lines are also ignored.
If a field value contains any special character(s), use escape sequences (e.g., for NewLine: \r\n, for Tab: \t, for at (@): \@). When the value of any field starts with the at sign (@), it is automatically treated as a file you want to upload. By default, the file content type is determined based on the file extension; however, you can supply a content type manually for any field using this format: [YourFileFieldName.Content-Type=some-content-type].
By default, file upload fields always include Content-Type in the request (non-file fields do not have Content-Type by default unless you supply it manually). If, for some reason, you don't want to use the Content-Type header in your request, then supply a blank Content-Type to exclude this header altogether (e.g., SomeFieldName.Content-Type=).
In the example below, we have supplied Content-Type for file2 and SomeField1. All other fields are using the default content type.
See the example below of uploading multiple files along with additional fields. If some API requires you to pass Content-Type: multipart/mixed rather than multipart/form-data, then manually set Request Header => Content-Type: multipart/mixed (it must start with multipart/ or it will be ignored).
file1=@c:\data\Myfile1.txt
file2=@c:\data\Myfile2.json
file2.Content-Type=application/json
SomeField1=aaaaaaa
SomeField1.Content-Type=text/plain
SomeField2=12345
SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
SomeFieldStartingWithAtSign=\@MyTwitterHandle
|
Filter |
Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document and find the hierarchy you would like to extract
Option |
Value |
No filter |
|
Example1 |
$.store.books[*] |
Example2 (Sections Under Books) |
$.store.books[*].sections[*] |
Example3 (Equals) |
$.store.books[?(@author=='sam')] |
Example4 (Equals - Any Section) |
$..[?(@author=='sam')] |
Example5 (Not Equals - Any Section) |
$..[?(@author!='sam')] |
Example6 (Number less than) |
$.store.books[?(@.price<10)] |
Example7 (Regular Expression - Contains Pattern) |
$.store.books[?(@author=~ /sam|bob/ )] |
Example8 (Regular Expression - Does Not Contain Pattern) |
$.store.books[?(@author=~ /^((?!sam|bob).)*$/ )] |
Example9 (Regular Expression - Exact Pattern Match) |
$.store.books[?(@author=~ /^sam|bob$/ )] |
Example10 (Regular Expression - Starts With) |
$.store.books[?(@author=~ /^sam/ )] |
Example11 (Regular Expression - Ends With) |
$.store.books[?(@author=~ /sam$/ )] |
Example12 (Between) |
$.store.employees[?( @.hiredate>'2015-01-01' && @.hiredate<'2015-01-04' )] |
|
Headers |
Headers for Request. To enter multiple headers use double pipe (||) or new line after each {header-name}:{value} pair
|
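To see what a Filter such as $.customers[*].orders[*] does to a response document, here is a rough pure-Python analogue of that simple wildcard case (real JSONPath, including the predicate examples in the table above, is considerably more powerful):

```python
def extract(doc, *keys):
    # Descend through each key in turn, fanning out over lists at
    # every level -- roughly what $.customers[*].orders[*] selects.
    rows = [doc]
    for key in keys:
        next_rows = []
        for row in rows:
            value = row.get(key, [])
            next_rows.extend(value if isinstance(value, list) else [value])
        rows = next_rows
    return rows

doc = {"customers": [
    {"name": "A", "orders": [{"id": 1}, {"id": 2}]},
    {"name": "B", "orders": [{"id": 3}]},
]}
rows = extract(doc, "customers", "orders")
print(rows)  # three flattened order rows
```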
Generic Request (Bulk Write)
This is a generic endpoint for bulk write purposes. Use this endpoint when some actions are not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. Most parameters are optional except URL. [
Read more...
]
Parameter |
Description |
Url |
API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of ServiceURL or TrustedDomains
|
IsMultiPart |
Set this option if you want to upload file(s) using either raw file data (i.e., POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data).
A multi-part request allows you to mix key/value pairs and upload files in the same request. On the other hand, raw upload allows only a single file to be uploaded (without any key/value data).
==== Raw Upload (Content-Type: application/octet-stream) =====
To upload a single file in raw mode, check this option and specify the full file path starting with the @ sign in the Body (e.g. @c:\data\myfile.zip)
==== Form-Data / Multipart Upload (Content-Type: multipart/form-data) =====
To treat your request data as multi-part fields, you must specify key/value pairs separated by new lines in the RequestData field (i.e., Body). Each key/value pair should be entered on a new line, and key/value are separated using an equal sign (=). Leading and trailing spaces are ignored, and blank lines are also ignored.
If a field value contains any special character(s), use escape sequences (e.g., for NewLine: \r\n, for Tab: \t, for at (@): \@). When the value of any field starts with the at sign (@), it is automatically treated as a file you want to upload. By default, the file content type is determined based on the file extension; however, you can supply a content type manually for any field using this format: [YourFileFieldName.Content-Type=some-content-type].
By default, file upload fields always include Content-Type in the request (non-file fields do not have Content-Type by default unless you supply it manually). If, for some reason, you don't want to use the Content-Type header in your request, then supply a blank Content-Type to exclude this header altogether (e.g., SomeFieldName.Content-Type=).
In the example below, we have supplied Content-Type for file2 and SomeField1. All other fields are using the default content type.
See the example below of uploading multiple files along with additional fields. If some API requires you to pass Content-Type: multipart/mixed rather than multipart/form-data, then manually set Request Header => Content-Type: multipart/mixed (it must start with multipart/ or it will be ignored).
file1=@c:\data\Myfile1.txt
file2=@c:\data\Myfile2.json
file2.Content-Type=application/json
SomeField1=aaaaaaa
SomeField1.Content-Type=text/plain
SomeField2=12345
SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
SomeFieldStartingWithAtSign=\@MyTwitterHandle
|
Filter |
Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document and find the hierarchy you would like to extract
|
Headers |
Headers for Request. To enter multiple headers use double pipe (||) or new line after each {header-name}:{value} pair
|
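The key/value body syntax described above can be summarized as: one field per line, a leading @ marks a file path to upload, and \@ escapes a literal at sign. An illustrative parser of that convention (this is my reading of the description, not an official ZappySys implementation):

```python
def parse_multipart_fields(body: str):
    fields = []
    for line in body.splitlines():
        line = line.strip()
        if not line or "=" not in line:
            continue  # blank lines are ignored, per the description
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        if value.startswith("\\@"):          # \@ -> literal @, not a file
            value, is_file = value[1:], False
        else:
            is_file = value.startswith("@")  # @path -> upload this file
        fields.append({"name": key, "value": value, "is_file": is_file})
    return fields

example = "file1=@c:\\data\\Myfile1.txt\nSomeField2=12345\nX=\\@MyHandle"
fields = parse_multipart_fields(example)
print(fields)
```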
Conclusion
In this article we showed you how to connect to Google Drive in Azure Data Factory (SSIS) and integrate data without any coding, saving you time and effort.
It's worth noting that the ZappySys connector framework allows you to connect not only to Google Drive,
but to many other data sources and REST APIs
(just use a different connector and configure it appropriately).
We encourage you to download Google Drive Connector for Azure Data Factory (SSIS) and see how easy it is to use it for yourself or your team.
If you have any questions, feel free to contact ZappySys support team.
You can also open a live chat immediately by clicking on the chat icon below.
Download Google Drive Connector for Azure Data Factory (SSIS)
Documentation