SSIS Google BigQuery Connector

Read and write Google BigQuery data inside your app without coding, using an easy-to-use, high-performance API connector.
You can use this connector to integrate Google BigQuery data inside SSIS and SQL Server. Let's take a look at the steps below to see exactly how to accomplish that.


Video Tutorial - Integrate Google BigQuery data in SSIS

This video covers the following topics and more, so watch carefully. After watching the video, follow the steps described in this article.

  • How to download / install required driver for Google BigQuery integration in SSIS
  • How to configure connection for Google BigQuery
  • API Source features (Authentication / Query Language / Examples / Driver UI)
  • Using Google BigQuery Connection in SSIS

Prerequisites

Before we perform the steps listed in this article, you will need to make sure the following prerequisites are met:
  1. SSIS designer installed. It is sometimes referred to as BIDS or SSDT (download it from the Microsoft site).
  2. Basic knowledge of SSIS package development using Microsoft SQL Server Integration Services.
  3. Make sure ZappySys SSIS PowerPack is downloaded and installed (download it). Check the Getting Started section for more information.
  4. Optional (if you want to deploy and schedule): Deploy and Schedule SSIS Packages

How to read data from Google BigQuery in SSIS (Export data)

In this section we will learn how to configure and use the Google BigQuery Connector in the API Source to extract data from Google BigQuery.

  1. Begin by opening Visual Studio and creating a new project.

  2. Select Integration Services Project, set an appropriate name and location in the new project window, and click OK.

  3. In the new SSIS project screen you will find the following:

    1. SSIS Toolbox on the left sidebar
    2. Solution Explorer and the Properties window on the right sidebar
    3. Control Flow, Data Flow, Event Handlers and Package Explorer in tabbed windows
    4. Connection Managers window at the bottom
    SSIS Project Screen
    Note: If you don't see ZappySys SSIS PowerPack Task or Components in SSIS Toolbox, please refer to this help link.

  4. Now drag and drop the SSIS Data Flow Task from the SSIS Toolbox. Double click the Data Flow Task to open the Data Flow designer.
    SSIS Data Flow Task - Drag and Drop

  5. From the SSIS Toolbox drag and drop API Source (Predefined Templates) onto the Data Flow designer surface, and double click it to edit:
    SSIS API Source (Predefined Templates) - Drag and Drop

  6. Select New Connection to create a new connection:
    API Source - New Connection

  7. Use the preinstalled Google BigQuery Connector from the Popular Connector List, or select the Search Online radio button to download the Google BigQuery Connector. Once downloaded, simply use it in the configuration:

    Google BigQuery Connector Selection

  8. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is correct). Finally, fill in all required parameters and set optional parameters if needed. You may press the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section.

    Steps to get Google BigQuery Credentials
    This connection can be configured in two ways: use the default app (created by ZappySys) OR use a custom app created by you.
    For minimal setup you can start with the ZappySys-created app. Just set UseCustomApp=false on the properties grid so you don't need a ClientID / Secret. When you click Generate Token you might see a warning that the app is not trusted (simply click the Advanced link to expand the hidden section, then click the Go to App link to proceed).

    To register a custom app, perform the following steps (detailed steps can be found in the help link at the end):

    1. Go to the Google API Console
    2. From the Project dropdown (usually found in the top bar) click Select Project
    3. In the project popup click CREATE PROJECT
    4. Once the project is created, click Select Project to switch the context (you can click the notification link or choose from the top dropdown)
    5. Click ENABLE APIS AND SERVICES
    6. Now we need to enable two APIs one by one (BigQuery API and Cloud Resource Manager API).
    7. Search for BigQuery API. Select it and click ENABLE
    8. Search for Cloud Resource Manager API. Select it and click ENABLE
    9. Go back to the main screen of the Google API Console
    10. Click the OAuth consent screen tab. Enter the necessary details and save:

      1. Choose Testing as the Publishing status
      2. Set the application User type to Internal, if possible
      3. If the MAKE INTERNAL option is disabled, add a user in the Test users section; you will use this user in the authentication process when generating Access and Refresh tokens
    11. Click the Credentials tab
    12. Click CREATE CREDENTIALS (somewhere in the top bar) and select the OAuth Client ID option.
    13. When prompted, select Desktop App as the Application Type and click Create to receive your ClientID and Secret. You can now use this information to configure the connection with UseCustomApp=true.
    Connection type: Google BigQuery, User Account [OAuth]
    Base URL: https://www.googleapis.com/bigquery/v2
    Required parameters: UseCustomApp, ProjectId (choose after [Generate Token] is clicked), DatasetId (choose after [Generate Token] is clicked and ProjectId is selected)
    Optional parameters: ClientId, ClientSecret, Scope, RetryMode, RetryStatusCodeList, RetryCountMax, RetryMultiplyWaitTime, Job Location
    Steps to get Google BigQuery Credentials
    Use these steps to authenticate as a service account rather than a Google / GSuite user. Learn more about service accounts here.

    To call the Google API as a service account, we need to perform the steps listed in the following sections (detailed steps can be found in the help link at the end).

    Create Project

    The first step is to create a Project so we can call the Google API. Skip this section if you already have a Project (go to the next section).
    1. Go to the Google API Console
    2. From the Project dropdown (usually found in the top bar) click Select Project
    3. In the project popup click CREATE PROJECT
    4. Once the project is created, click Select Project to switch the context (you can click the notification link or choose from the top dropdown)
    5. Click ENABLE APIS AND SERVICES
    6. Now we need to enable two APIs one by one (BigQuery API and Cloud Resource Manager API).
    7. Search for BigQuery API. Select it and click ENABLE
    8. Search for Cloud Resource Manager API. Select it and click ENABLE

    Create Service Account

    Once the Project is created and the APIs are enabled, we can create a service account under that project. A service account has an ID which looks like an email address (not to be confused with a Google / Gmail email address).
    1. Go to Create Service Account
    2. From the Project dropdown (usually found in the top bar) click Select Project
    3. Enter a Service account name and Service account description
    4. Click Create. Now you should see an option to assign service account permissions (see the next section).

    Give Permission to Service Account

    By default a service account can't access BigQuery data or list BigQuery projects, so we need to grant those permissions using the steps below.
    1. After you create the service account, look for the Permissions drop-down in the wizard.
    2. Choose the BigQuery -> BigQuery Admin role so we can read/write data. (NOTE: If you just need read-only access, you can choose BigQuery Data Viewer instead.)
    3. Now add one more role, Project -> Viewer, so we can query Project IDs.
    4. Click Continue. Now you should see an option to create a key (see the next section).

    Create Key (P12)

    Once the service account is created and permissions are assigned, we need to create a key file.
    1. In the Cloud Console, click the email address of the service account that you created.
    2. Click Keys.
    3. Click Add key, then click Create new key.
    4. Select the P12 format and click Create. A P12 key file is downloaded to your computer. We will use this file in our API connection.
    5. Click Close.
    6. You may now use the downloaded *.p12 key file as the secret file and the Service Account Email as the Client ID (e.g. some_name@some_name.iam.gserviceaccount.com).

    Manage Permissions / Give Access to Other Projects

    We saw how to add permissions for the service account during the account creation wizard, but if you wish to edit them after creation, or grant permissions for other projects, perform the following steps.
    1. From the top, select the Project for which you would like to edit permissions.
    2. Go to the IAM menu option (here)
      Link to IAM: https://console.cloud.google.com/iam-admin/iam
    3. Go to the Permissions tab, where you will find the ADD button.
    4. Enter the service account email to which you would like to grant permission, and select the role you wish to assign.
    Connection type: Google BigQuery, Service Account (Using Private Key File) [OAuth]
    Base URL: https://www.googleapis.com/bigquery/v2
    Required parameters: Service Account Email, Service Account Private Key Path (i.e. *.p12), ProjectId, DatasetId (choose after ProjectId)
    Optional parameters: Scope, RetryMode, RetryStatusCodeList, RetryCountMax, RetryMultiplyWaitTime, Job Location

  9. Select the desired endpoint, change or pass property values, and click the Preview Data button to make the API call (a sample query you can try appears after these steps):

    API Source - Select Endpoint

  10. That's it; we are done. In a few clicks we configured the call to Google BigQuery using the ZappySys Google BigQuery Connector.
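
  A quick way to verify the new connection is to run a query against a Google public dataset in the Preview window; it requires no dataset of your own. A minimal sketch, using the connector's SQL query endpoint and the same public sample table as Example1 in the Actions section later in this article (in standard SQL the backticks are required because the project name contains dashes):

    SELECT title, id, language
    FROM `bigquery-public-data.samples.wikipedia`
    LIMIT 10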

Load Google BigQuery data in SQL Server using Upsert Destination (Insert or Update)

Once you have read data from the desired source, let's see how to load Google BigQuery data into SQL Server using Upsert Destination. Upsert Destination can merge/synchronize data from source to target for Microsoft SQL Server, PostgreSQL and Redshift. It supports a very fast Bulk Upsert (Update or Insert) operation as well as Bulk Delete.
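
Conceptually, an upsert is equivalent to a T-SQL MERGE statement; Upsert Destination performs the same logic in bulk behind its UI. A minimal sketch with hypothetical table and column names (dbo.Customers, #StagingFromBigQuery, CustomerId) showing what "insert if not matching in target else update" means:

    MERGE dbo.Customers AS target
    USING #StagingFromBigQuery AS source
        ON target.CustomerId = source.CustomerId      -- match on primary key
    WHEN MATCHED THEN
        UPDATE SET target.Name = source.Name,
                   target.City = source.City           -- update existing rows
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerId, Name, City)
        VALUES (source.CustomerId, source.Name, source.City);  -- insert new rows

Upsert Destination hides this plumbing and batches the operation for speed, so you never have to write the MERGE yourself.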

  1. From the SSIS toolbox drag and drop Upsert Destination on the dataflow designer surface

  2. Connect our Source component to Upsert Destination

  3. Double click on Upsert Destination component to configure it.

  4. Select the desired Microsoft SQL Server/PostgreSQL/Redshift Target Connection or click NEW to create a new connection. Select the Target Table or click NEW to create a new table based on the source columns.
    Configure SSIS Upsert Destination Connection - Loading data (REST / SOAP / JSON / XML /CSV) into SQL Server or other target using SSIS

  5. Set Action to Upsert => (insert if not matching in target else update). Select the Target Connection and Target Table. Check Insert and Update. Click Map All to map all columns, and check Only Primary Key columns.
    SSIS SQL Upsert option

  6. Click OK to save the Upsert Destination settings.

  7. That's it; we are done. Run the SSIS package and it will read the data from Google BigQuery and load it into SQL Server/PostgreSQL/Redshift.

Video Tutorial - Write or lookup data to Google BigQuery using SSIS

This video covers the following topics and more, so watch carefully. After watching the video, follow the steps described in this article.

  • How to download SSIS PowerPack for Google BigQuery integration in SSIS
  • How to configure connection for Google BigQuery
  • How to Write or lookup data to Google BigQuery
  • SSIS API Destination features
  • Using Google BigQuery Connector in SSIS

How to write or lookup data to Google BigQuery in SSIS (Import data)

In the previous section we learned how to read data; in this section we will learn how to configure Google BigQuery in the API Destination to post data to Google BigQuery.

  1. Read the data from the source, using any desired source component. In this example we will use the ZappySys Dummy Data Source component.

  2. From the SSIS Toolbox drag and drop API Destination (Predefined Templates) onto the Data Flow designer surface, connect the source component to it, and double click it to edit:
    SSIS API Destination (Predefined Templates) - Drag and Drop

  3. Select New Connection to create a new connection:

    API Destination - New Connection

  4. Use the preinstalled Google BigQuery Connector from the Popular Connector List, or select the Search Online radio button to download the Google BigQuery Connector. Once downloaded, simply use it in the configuration:

    Google BigQuery Connector Selection

  5. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default one is correct). Finally, fill in all required parameters and set optional parameters if needed. You may press the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section.

    The authentication options and connection parameters are identical to those described earlier in this article. See Steps to get Google BigQuery Credentials in the read section above, for both the User Account [OAuth] and the Service Account (Using Private Key File) [OAuth] connection types.

  6. Select the desired endpoint, change or pass property values, and go to the Mappings tab to map the columns.

    API Destination - Select Endpoint

  7. Finally, map the desired columns:

    API Destination - Columns Mapping

  8. That's it; we successfully configured the POST API call to Google BigQuery using the ZappySys Google BigQuery Connector.
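
  After the package runs, you can sanity-check the load with a quick row count through the connector's SQL query endpoint (or in the BigQuery console). The names below are hypothetical; substitute your own dataset and table:

    SELECT COUNT(*) AS row_count
    FROM TestDataset.Table1;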

Conclusion

In this article we discussed how to connect to Google BigQuery in SSIS and integrate data without any coding. Click here to download the Google BigQuery Connector for SSIS and try it yourself to see how easy it is. If you still have any questions, ask here or simply click the live chat icon in the bottom-right corner of this page and ask our expert.

Download Google BigQuery Connector for SSIS | Documentation

Actions supported by Google BigQuery Connector

The Google BigQuery Connector supports the following actions for REST API integration. If an action you need is not listed below, you can easily edit the connector file and enhance the out-of-the-box functionality.
 Read Data using SQL Query -OR- Execute Script (i.e. CREATE, SELECT, INSERT, UPDATE, DELETE)
Runs a BigQuery SQL query synchronously and returns query results if the query completes within a specified timeout    [Read more...]
Parameter Description
SQL Statement (i.e. SELECT / DROP / CREATE)
Option Value
Example1 SELECT title,id,language,wp_namespace,reversion_id ,comment,num_characters FROM bigquery-public-data.samples.wikipedia LIMIT 1000
Example2 CREATE TABLE TestDataset.Table1 (ID INT64,Name STRING,BirthDate DATETIME, Active BOOL)
Example3 INSERT TestDataset.Table1 (ID, Name,BirthDate,Active) VALUES(1,'AA','2020-01-01',true),(2,'BB','2020-01-02',true),(3,'CC','2020-01-03',false)
Use Legacy SQL Syntax? (see the syntax example after this action's parameter list)
Option Value
false false
true true
timeout (Milliseconds) Wait until the timeout is reached.
Job Location The geographic location where the job should run. For non-US and non-EU data centers we suggest supplying this parameter to avoid errors.
Option Value
System Default
Data centers in the United States US
Data centers in the European Union EU
Columbus, Ohio us-east5
Iowa us-central1
Las Vegas us-west4
Los Angeles us-west2
Montréal northamerica-northeast1
Northern Virginia us-east4
Oregon us-west1
Salt Lake City us-west3
São Paulo southamerica-east1
Santiago southamerica-west1
South Carolina us-east1
Toronto northamerica-northeast2
Delhi asia-south2
Hong Kong asia-east2
Jakarta asia-southeast2
Melbourne australia-southeast2
Mumbai asia-south1
Osaka asia-northeast2
Seoul asia-northeast3
Singapore asia-southeast1
Sydney australia-southeast1
Taiwan asia-east1
Tokyo asia-northeast1
Belgium europe-west1
Finland europe-north1
Frankfurt europe-west3
London europe-west2
Madrid europe-southwest1
Milan europe-west8
Netherlands europe-west4
Paris europe-west9
Warsaw europe-central2
Zürich europe-west6
AWS - US East (N. Virginia) aws-us-east-1
Azure - East US 2 azure-eastus2
Custom Name (Type your own) type-region-id-here
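
For reference, the Use Legacy SQL Syntax? option changes how tables are referenced. A minimal sketch of the same query in both dialects, using the public sample table from Example1 (exact behavior depends on your BigQuery settings):

    -- Use Legacy SQL Syntax? = true  (square brackets, colon between project and dataset)
    SELECT title FROM [bigquery-public-data:samples.wikipedia] LIMIT 10

    -- Use Legacy SQL Syntax? = false (standard SQL, backticks, dots throughout)
    SELECT title FROM `bigquery-public-data.samples.wikipedia` LIMIT 10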
 Read Table Rows
Reads rows of data from the specified table (by table ID).    [Read more...]
Parameter Description
ProjectId Leave this value blank to use ProjectId from connection settings
DatasetId Leave this value blank to use DatasetId from connection settings
TableId
 [$parent.tableReference.datasetId$].[$parent.tableReference.tableId$]
A dynamically generated endpoint, one per table: reads data from [$parent.tableReference.datasetId$].[$parent.tableReference.tableId$] for the configured project.    [Read more...]
Parameter Description
 List Projects
Lists projects that the caller has permission to access and that satisfy the specified filter.    [Read more...]
Parameter Description
SearchFilter An expression for filtering the results of the request. Filter rules are case insensitive. If multiple fields are included in a filter query, the query will return results that match any of the fields. Some eligible fields for filtering are: name, id, labels.{key} (where key is the name of a label), parent.type, parent.id, lifecycleState. Example: name:how*
 List Datasets
Lists all BigQuery datasets in the specified project to which the user has been granted the READER dataset role.    [Read more...]
Parameter Description
ProjectId
SearchFilter An expression for filtering the results of the request. Filter rules are case insensitive. If multiple fields are included in a filter query, the query will return results that match any of the fields. Some eligible fields for filtering are: name, id, labels.{key} (where key is the name of a label), parent.type, parent.id, lifecycleState. Example: name:how*
all Whether to list all datasets, including hidden ones
Option Value
True True
False False
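
If you prefer SQL over this action, a similar listing can be obtained through the Read Data using SQL Query endpoint with INFORMATION_SCHEMA (a sketch, assuming standard SQL; datasets outside your default region may require a region qualifier such as region-us):

    SELECT schema_name
    FROM INFORMATION_SCHEMA.SCHEMATA;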
 Create Dataset
Creates a new empty dataset.    [Read more...]
Parameter Description
ProjectId
Dataset Name Enter dataset name
Description
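
The same result can also be achieved through the SQL query endpoint with a DDL statement (a sketch; TestDataset is a hypothetical name, and in BigQuery SQL a dataset is called a schema):

    CREATE SCHEMA IF NOT EXISTS TestDataset
    OPTIONS (description = 'Dataset created via the SQL endpoint');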
 Delete Dataset
Deletes the dataset specified by the datasetId value. Before you can delete a dataset, you must delete all its tables, either manually or by specifying deleteContents. Immediately after deletion, you can create another dataset with the same name.    [Read more...]
Parameter Description
ProjectId
DatasetId
Delete All Tables If True, delete all the tables in the dataset. If False and the dataset contains tables, the request will fail. Default is False
Option Value
True True
False False
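
The SQL equivalent, if you prefer the query endpoint, is a DROP SCHEMA statement (a sketch with a hypothetical dataset name; CASCADE corresponds to Delete All Tables = True, RESTRICT to False):

    DROP SCHEMA IF EXISTS TestDataset CASCADE;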
 Delete Table
Deletes the table specified by the tableId value from the dataset. If the table contains data, all the data will be deleted.    [Read more...]
Parameter Description
ProjectId
DatasetId
TableId
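
Equivalently, via the SQL query endpoint (a sketch with hypothetical names):

    DROP TABLE IF EXISTS TestDataset.Table1;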
 List Tables
Lists BigQuery Tables for the specified project / dataset to which the user has been granted the READER dataset role.    [Read more...]
Parameter Description
ProjectId
DatasetId
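
The SQL counterpart via the query endpoint uses the dataset's INFORMATION_SCHEMA (a sketch with a hypothetical dataset name):

    SELECT table_name, table_type
    FROM TestDataset.INFORMATION_SCHEMA.TABLES;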
 Get Query Schema (From SQL)
Runs a BigQuery SQL query synchronously and returns query schema    [Read more...]
Parameter Description
SQL Query
Use Legacy SQL Syntax? (see the syntax example under the Read Data using SQL Query action above)
Option Value
false false
true true
timeout (Milliseconds) Wait until the timeout is reached.
Job Location The geographic location where the job should run. For non-US and non-EU data centers we suggest supplying this parameter to avoid errors. (Same location options as listed under the Read Data using SQL Query action above.)
 Get Table Schema
Gets the specified table resource by table ID. This method does not return the data in the table, it only returns the table resource, which describes the structure of this table.    [Read more...]
Parameter Description
DatasetId
TableId
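
If you prefer SQL, the same column-level schema information is available from INFORMATION_SCHEMA via the query endpoint (a sketch with hypothetical dataset and table names):

    SELECT column_name, data_type, is_nullable
    FROM TestDataset.INFORMATION_SCHEMA.COLUMNS
    WHERE table_name = 'Table1';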
 insert_table_data
Inserts rows of data into the specified table.    [Read more...]
Parameter Description
ProjectId
DatasetId
TableId
 post_[$parent.tableReference.datasetId$]_[$parent.tableReference.tableId$]
A dynamically generated endpoint, one per table: posts (inserts) data into [$parent.tableReference.datasetId$].[$parent.tableReference.tableId$].    [Read more...]
 Generic Request
This is a generic endpoint. Use it when some action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. Most parameters are optional except URL.    [Read more...]
Parameter Description
Url API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of the ServiceURL or of TrustedDomains.
Body Request Body content goes here
IsMultiPart Check this option if you want to upload file(s) (i.e. POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data). A multi-part request allows you to mix key/value pairs and file uploads in the same request, whereas raw upload allows only a single file upload (without any key/value pairs).

Raw upload (Content-Type: application/octet-stream): to upload a single file in raw mode, check this option and specify the full file path starting with an @ sign in the Body (e.g. @c:\data\myfile.zip).

Form-data / multi-part upload (Content-Type: multipart/form-data): to treat your request data as multi-part fields, specify key/value pairs separated by new lines in the RequestData field (i.e. Body). Each key/value pair is entered on a new line, with key and value separated by an equal sign (=). Preceding and trailing spaces are ignored, as are blank lines. If a field value contains special characters, use an escape sequence (e.g. for newline: \r\n, for tab: \t, for at (@): \@). When the value of a field starts with an at sign (@), it is automatically treated as a file to upload. By default the file content type is determined from the extension, but you can supply a content type manually for any field like this: YourFileFieldName.Content-Type=some-content-type. File upload fields always include a Content-Type in the request (non-file fields have no content type unless you supply one manually). If you do not want a Content-Type header for a field, supply a blank Content-Type to exclude the header altogether (e.g. SomeFieldName.Content-Type=). If an API requires a multipart Content-Type other than multipart/form-data, manually set the request header Content-Type: multipart/mixed (it must start with multipart/ or it will be ignored). In the example below, Content-Type is supplied for file2 and SomeField1; all other fields use the default content type.

file1=@c:\data\Myfile1.txt
file2=@c:\data\Myfile2.json
file2.Content-Type=application/json
SomeField1=aaaaaaa
SomeField1.Content-Type=text/plain
SomeField2=12345
SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
SomeFieldStartingWithAtSign=\@MyTwitterHandle
Filter Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document to find the hierarchy you want to extract.
Headers Headers for the request. To enter multiple headers, use a double pipe or a new line after each {header-name}:{value} pair.
