Google BigQuery Connector for SSIS

Read and write Google BigQuery data inside your app, without coding, using an easy-to-use, high-performance API Connector

In this article you will learn how to quickly and efficiently integrate Google BigQuery data in SSIS without coding. We will use the high-performance Google BigQuery Connector to connect to Google BigQuery and then access the data inside SSIS.

Let's follow the steps below to see how we can accomplish that!

Download | Documentation

Video Tutorial - Integrate Google BigQuery data in SSIS

This video covers the following topics and more, so please watch carefully. After watching the video, follow the steps outlined in this article:

  • How to download and install the required PowerPack for Google BigQuery integration in SSIS
  • How to configure the connection for Google BigQuery
  • Features of the ZappySys API Source (Authentication / Query Language / Examples / Driver UI)
  • How to use the Google BigQuery Connector in SSIS

Prerequisites

Before we begin, make sure the following prerequisites are met:

  1. SSIS designer installed. It is sometimes referred to as BIDS or SSDT (download it from Microsoft).
  2. Basic knowledge of SSIS package development using Microsoft SQL Server Integration Services.
  3. SSIS PowerPack is installed (if you are new to SSIS PowerPack, then get started!).

Read data from Google BigQuery in SSIS (Export data)

In this section we will learn how to configure and use Google BigQuery Connector in API Source to extract data from Google BigQuery.

  1. Begin with opening Visual Studio and Create a New Project.

  2. Select Integration Services Project, set an appropriate name and location for the project in the New Project window, and click OK.

  3. In the new SSIS project screen you will find the following:

    1. SSIS Toolbox in the left sidebar
    2. Solution Explorer and Properties Window in the right sidebar
    3. Control Flow, Data Flow, Event Handlers, and Package Explorer in tabbed windows
    4. Connection Managers window at the bottom

    SSIS Project Screen
    Note: If you don't see ZappySys SSIS PowerPack Task or Components in SSIS Toolbox, please refer to this help link.
  4. Now, drag and drop the Data Flow Task from the SSIS Toolbox. Double-click on the Data Flow Task to open the Data Flow designer.

    SSIS Data Flow Task - Drag and Drop
  5. From the SSIS Toolbox drag and drop API Source (Predefined Templates) onto the Data Flow designer surface, and double-click on it to edit it:
    SSIS API Source (Predefined Templates) - Drag and Drop

  6. Select New Connection to create a new connection:
    API Source - New Connection

  7. Use the preinstalled Google BigQuery Connector from the Popular Connector List or press the Search Online radio button to download the Google BigQuery Connector. Once downloaded, simply use it in the configuration:

    Google BigQuery
    Google BigQuery Connector Selection

  8. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default is the right one). Finally, fill in all the required parameters and set optional parameters, if needed. You may click the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section.

    User accounts represent a developer, administrator, or any other person who interacts with Google APIs and services. User accounts are managed as Google Accounts, either with Google Workspace or Cloud Identity. They can also be user accounts that are managed by a third-party identity provider and federated with Workforce Identity Federation. [API reference]

    Steps how to get and use Google BigQuery credentials

    Follow these steps to create Client Credentials (User Account principal) to authenticate and access the BigQuery API in an SSIS package or ODBC data source:

    WARNING: If you are planning to automate processes, we recommend that you use the Service Account authentication method. If you still need to use a User Account, make sure you use a system/generic account (e.g. automation@my-company.com). If you use a personal account that is tied to a specific employee profile and that employee leaves the company, the token may become invalid and any automated processes using that token will start to fail.

    Step-1: Create project

    This step is optional if you already have a project in Google Cloud that you can use. If you don't, proceed with these simple steps to create one:

    1. First of all, go to Google API Console.

    2. Then click Select a project button and then click NEW PROJECT button:

      Start creating a new project in Google Cloud
    3. Name your project and click CREATE button:

      Create a new project in Google Cloud
    4. Wait until the project is created:

      Wait until project is created in Google Cloud
    5. Done! Let's proceed to the next step.

    Step-2: Enable Google Cloud APIs

    In this step we will enable BigQuery API and Cloud Resource Manager API:

    1. Select your project on the top bar:

      Select project in Google Cloud
    2. Then click the "hamburger" icon on the top left and access APIs & Services:

      Access APIs and services in Google Cloud
    3. Now let's enable several APIs by clicking ENABLE APIS AND SERVICES button:

      Enable API for project in Google Cloud
    4. In the search bar search for bigquery api and then locate and select BigQuery API:

      Search for API in Google Cloud
    5. If BigQuery API is not enabled, enable it:

      Enable Google BigQuery API
    6. Then repeat the step and enable Cloud Resource Manager API as well:

      Enable Cloud Resource Manager API
    7. Done! Let's proceed to the next step.

    Step-3: Create OAuth application

    1. First of all, click the "hamburger" icon on the top left and then hit VIEW ALL PRODUCTS:

      View all products in Google Cloud
    2. Then access Google Auth Platform to start creating an OAuth application:

      Open Google Auth Platform in Google Cloud
    3. Start by pressing GET STARTED button:

      Start creating an app in Google Cloud
    4. Next, continue by filling in App name and User support email fields:

      Fill app info in Google Cloud
    5. Choose the Internal option if it's enabled; otherwise, select External:

      Choose app audience in Google Cloud
    6. This step is optional if you chose the Internal option in the previous step. However, if you had to use the External option, click ADD USERS to add a test user:

      Add test user in Google Cloud app
    7. Then add your contact Email address:

      Enter app contact info in Google Cloud
    8. Finally, check the checkbox and click CREATE button:

      Create app in Google Cloud
    9. Done! Let's create Client Credentials in the next step.

    Step-4: Create Client Credentials

    1. In Google Auth Platform, select Clients menu item and click CREATE CLIENT button:

      Start creating app client in Google Cloud
    2. Choose Desktop app as Application type and name your credentials:

      Create OAuth app client in Google Cloud
    3. Continue by opening the created credentials:

      View app client credentials in Google Cloud
    4. Finally, copy Client ID and Client secret for the later step:

      Use client ID and secret to read Google REST API data
    5. Done! We have all the data needed for authentication, let's proceed to the last step!

    Step-5: Configure connection

    1. Now go to the SSIS package or ODBC data source and use the previously copied values in the User Account authentication configuration:

      • In the ClientId field paste the Client ID value.
      • In the ClientSecret field paste the Client secret value.
    2. Press the Generate Token button to generate the Access and Refresh Tokens (a sketch of what this button does behind the scenes follows these steps).

    3. Then choose ProjectId from the dropdown menu.

    4. Continue by choosing DatasetId from the dropdown menu.

    5. Finally, click Test Connection to confirm the connection is working.

    6. Done! Now you are ready to use Google BigQuery Connector!
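    For the curious: the Generate Token button automates a standard OAuth 2.0 exchange against Google's token endpoint. Below is a minimal, illustrative Python sketch of the refresh-token half of that flow; the connector handles all of this for you, and the placeholder values are hypothetical.

    ```python
    # Minimal sketch of the OAuth 2.0 refresh-token exchange that the
    # [Generate Token] button automates. Illustrative only -- the connector
    # manages tokens for you. All placeholder values are hypothetical.
    import requests

    TOKEN_URL = "https://oauth2.googleapis.com/token"  # Google's OAuth 2.0 token endpoint

    def refresh_access_token(client_id: str, client_secret: str, refresh_token: str) -> str:
        """Trade the long-lived refresh token for a short-lived access token."""
        resp = requests.post(TOKEN_URL, data={
            "grant_type": "refresh_token",
            "client_id": client_id,
            "client_secret": client_secret,
            "refresh_token": refresh_token,
        })
        resp.raise_for_status()
        return resp.json()["access_token"]  # typically valid for about an hour

    # access_token = refresh_access_token("<ClientId>", "<ClientSecret>", "<RefreshToken>")
    # Requests then carry the header:  Authorization: Bearer <access_token>
    ```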


    Configuring authentication parameters
    Connector: Google BigQuery
    Authentication: User Account [OAuth]
    API Base URL: https://www.googleapis.com/bigquery/v2

    Required Parameters:
      • UseCustomApp
      • ProjectId (choose after [Generate Token] is clicked)
      • DatasetId (choose after [Generate Token] is clicked and ProjectId is selected)

    Optional Parameters (with default values where applicable):
      • ClientId
      • ClientSecret
      • Scope (default: https://www.googleapis.com/auth/bigquery https://www.googleapis.com/auth/bigquery.insertdata https://www.googleapis.com/auth/cloud-platform https://www.googleapis.com/auth/cloud-platform.read-only https://www.googleapis.com/auth/devstorage.full_control https://www.googleapis.com/auth/devstorage.read_only https://www.googleapis.com/auth/devstorage.read_write)
      • RetryMode (default: RetryWhenStatusCodeMatch)
      • RetryStatusCodeList (default: 429|503)
      • RetryCountMax (default: 5)
      • RetryMultiplyWaitTime (default: True)
      • Job Location
      • Redirect URL (only for Web App)

    ZappySys OAuth Connection

    Service accounts are accounts that do not represent a human user. They provide a way to manage authentication and authorization when a human is not directly involved, such as when an application needs to access Google Cloud resources. Service accounts are managed by IAM. [API reference]

    Steps how to get and use Google BigQuery credentials

    Follow these steps to create a Service Account to authenticate and access the BigQuery API in an SSIS package or ODBC data source:

    Step-1: Create project

    This step is optional if you already have a project in Google Cloud that you can use. If you don't, proceed with these simple steps to create one:

    1. First of all, go to Google API Console.

    2. Then click Select a project button and then click NEW PROJECT button:

      Start creating a new project in Google Cloud
    3. Name your project and click CREATE button:

      Create a new project in Google Cloud
    4. Wait until the project is created:

      Wait until project is created in Google Cloud
    5. Done! Let's proceed to the next step.

    Step-2: Enable Google Cloud APIs

    In this step we will enable BigQuery API and Cloud Resource Manager API:

    1. Select your project on the top bar:

      Select project in Google Cloud
    2. Then click the "hamburger" icon on the top left and access APIs & Services:

      Access APIs and services in Google Cloud
    3. Now let's enable several APIs by clicking ENABLE APIS AND SERVICES button:

      Enable API for project in Google Cloud
    4. In the search bar search for bigquery api and then locate and select BigQuery API:

      Search for API in Google Cloud
    5. If BigQuery API is not enabled, enable it:

      Enable Google BigQuery API
    6. Then repeat the step and enable Cloud Resource Manager API as well:

      Enable Cloud Resource Manager API
    7. Done! Let's proceed to the next step and create a service account.

    Step-3: Create Service Account

    Use the steps below to create a Service Account in Google Cloud:

    1. First of all, go to IAM & Admin in Google Cloud console:

      Access IAM & Admin in Google Cloud
    2. Once you do that, click Service Accounts on the left side and click CREATE SERVICE ACCOUNT button:

      Start creating service account in Google Cloud
    3. Then name your service account and click CREATE AND CONTINUE button:

      Create service account in Google Cloud
    4. Continue by clicking the Select a role dropdown and start granting the service account the BigQuery Admin and Project Viewer roles:

      Start granting service account project roles in Google Cloud
    5. Find BigQuery group on the left and then click on BigQuery Admin role on the right:

      Grant service account BigQuery Admin role
    6. Then click ADD ANOTHER ROLE button, find Project group and select Viewer role:

      Grant service account project viewer role
    7. Finish adding roles by clicking CONTINUE button:

      Finish granting service account project roles in Google Cloud
      You can always add or modify permissions later in IAM & Admin.
    8. Finally, in the last step, just click the DONE button:

      Finish configuring service account in Google Cloud
    9. Done! We are ready to add a Key to this service account in the next step.

    Step-4: Add Key to Service Account

    We are ready to add a Key (JSON or P12 key file) to the created Service Account:

    1. In Service Accounts open newly created service account:

      Open service account in Google Cloud
    2. Next, copy the email address of your service account for a later step:

      Copy service account email address in Google Cloud
    3. Continue by selecting KEYS tab, then press ADD KEY dropdown, and click Create new key menu item:

      Start creating key for service account in Google Cloud
    4. Finally, select JSON (Engine v19+) or P12 option and hit CREATE button:

      Create JSON or P12 key for service account in Google Cloud
    5. The key file downloads to your machine. We now have all the data needed for authentication, so let's proceed to the last step!

    Step-5: Configure connection

    1. Now go to the SSIS package or ODBC data source and configure these fields in the Service Account authentication configuration:

      • In the Service Account Email field paste the service account Email address value you copied in the previous step.
      • In the Service Account Private Key Path (i.e. *.json OR *.p12) field, enter the path of the downloaded key file.
    2. Done! Now you are ready to use Google BigQuery Connector! (A quick way to sanity-check the key file outside SSIS is shown below.)
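    If you want to verify the key file independently of SSIS, a quick sanity check is possible with Google's official Python client (google-cloud-bigquery). This is purely illustrative; the file name and project ID are placeholders.

    ```python
    # Quick sanity check of the service account key file, outside SSIS.
    # Requires: pip install google-cloud-bigquery
    # "key.json" and "my-project" are placeholders for your own values.
    from google.cloud import bigquery
    from google.oauth2 import service_account

    creds = service_account.Credentials.from_service_account_file(
        "key.json",
        scopes=["https://www.googleapis.com/auth/bigquery"],
    )
    client = bigquery.Client(project="my-project", credentials=creds)

    # If the key and roles are set up correctly, this lists your datasets.
    for dataset in client.list_datasets():
        print(dataset.dataset_id)
    ```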

    Configuring authentication parameters
    Connector: Google BigQuery
    Authentication: Service Account [OAuth]
    API Base URL: https://www.googleapis.com/bigquery/v2

    Required Parameters:
      • Service Account Email
      • Service Account Private Key Path (i.e. *.json OR *.p12)
      • ProjectId
      • DatasetId (choose after ProjectId)

    Optional Parameters (with default values where applicable):
      • Scope (default: https://www.googleapis.com/auth/bigquery https://www.googleapis.com/auth/bigquery.insertdata https://www.googleapis.com/auth/cloud-platform https://www.googleapis.com/auth/cloud-platform.read-only https://www.googleapis.com/auth/devstorage.full_control https://www.googleapis.com/auth/devstorage.read_only https://www.googleapis.com/auth/devstorage.read_write)
      • RetryMode (default: RetryWhenStatusCodeMatch)
      • RetryStatusCodeList (default: 429)
      • RetryCountMax (default: 5)
      • RetryMultiplyWaitTime (default: True)
      • Job Location
      • Impersonate As (enter Email Id)

    ZappySys OAuth Connection

  9. Select the desired endpoint, change/pass the property values, and click the Preview Data button to make the API call.

    API Source - Google BigQuery
    API Source - Select Endpoint

  10. That's it! In just a few clicks we configured the call to Google BigQuery using the Google BigQuery Connector.

    You can load the source data into your desired destination using the Upsert Destination, which supports SQL Server, PostgreSQL, and Amazon Redshift. We also offer other destinations such as CSV, Excel, Azure Table, Salesforce, and more. You can check out our SSIS PowerPack Tasks and components for more options. (In the screenshot below the data is loaded into a Trash Destination for demonstration.)

    Execute Package - Reading data from Google BigQuery and load into target

Write data to Google BigQuery using SSIS (Import data)

In this section we will learn how to configure and use Google BigQuery Connector in the API Destination to write data to Google BigQuery.

Video tutorial

This video covers the following topics and more, so please watch carefully. After watching the video, follow the steps described in this article:

  • How to download SSIS PowerPack for Google BigQuery integration in SSIS
  • How to configure connection for Google BigQuery
  • How to write or lookup data to Google BigQuery
  • Features of the SSIS API Destination
  • Using Google BigQuery Connector in SSIS

Step-by-step instructions

In the previous section we learned how to read data; in this section we will learn how to configure the Google BigQuery Connector in the API Destination to POST data to Google BigQuery.

  1. Read the data from the source, which can be any desired source component. In this example we will use the ZappySys Dummy Data Source component.

  2. From the SSIS Toolbox drag and drop API Destination (Predefined Templates) onto the Data Flow designer surface, connect the source component to it, and double-click it to edit:
    SSIS API Destination (Predefined Templates) - Drag and Drop

  3. Select New Connection to create a new connection:

    API Destination - Google BigQuery
    API Destination - New Connection

  4. Use the preinstalled Google BigQuery Connector from the Popular Connector List or press the Search Online radio button to download the Google BigQuery Connector. Once downloaded, simply use it in the configuration:

    Google BigQuery
    Google BigQuery Connector Selection

  5. Proceed by selecting the desired Authentication Type. Then select the API Base URL (in most cases the default is the right one). Finally, fill in all the required parameters and set optional parameters, if needed. You may click the Steps to Configure link, which will help you set certain parameters. More info is available in the Authentication section.

    User accounts represent a developer, administrator, or any other person who interacts with Google APIs and services. User accounts are managed as Google Accounts, either with Google Workspace or Cloud Identity. They can also be user accounts that are managed by a third-party identity provider and federated with Workforce Identity Federation. [API reference]

    Steps how to get and use Google BigQuery credentials

    Follow these steps to create Client Credentials (User Account principal) to authenticate and access the BigQuery API in an SSIS package or ODBC data source:

    WARNING: If you are planning to automate processes, we recommend that you use the Service Account authentication method. If you still need to use a User Account, make sure you use a system/generic account (e.g. automation@my-company.com). If you use a personal account that is tied to a specific employee profile and that employee leaves the company, the token may become invalid and any automated processes using that token will start to fail.

    Step-1: Create project

    This step is optional if you already have a project in Google Cloud that you can use. If you don't, proceed with these simple steps to create one:

    1. First of all, go to Google API Console.

    2. Then click Select a project button and then click NEW PROJECT button:

      Start creating a new project in Google Cloud
    3. Name your project and click CREATE button:

      Create a new project in Google Cloud
    4. Wait until the project is created:

      Wait until project is created in Google Cloud
    5. Done! Let's proceed to the next step.

    Step-2: Enable Google Cloud APIs

    In this step we will enable BigQuery API and Cloud Resource Manager API:

    1. Select your project on the top bar:

      Select project in Google Cloud
    2. Then click the "hamburger" icon on the top left and access APIs & Services:

      Access APIs and services in Google Cloud
    3. Now let's enable several APIs by clicking ENABLE APIS AND SERVICES button:

      Enable API for project in Google Cloud
    4. In the search bar search for bigquery api and then locate and select BigQuery API:

      Search for API in Google Cloud
    5. If BigQuery API is not enabled, enable it:

      Enable Google BigQuery API
    6. Then repeat the step and enable Cloud Resource Manager API as well:

      Enable Cloud Resource Manager API
    7. Done! Let's proceed to the next step.

    Step-3: Create OAuth application

    1. First of all, click the "hamburger" icon on the top left and then hit VIEW ALL PRODUCTS:

      View all products in Google Cloud
    2. Then access Google Auth Platform to start creating an OAuth application:

      Open Google Auth Platform in Google Cloud
    3. Start by pressing GET STARTED button:

      Start creating an app in Google Cloud
    4. Next, continue by filling in App name and User support email fields:

      Fill app info in Google Cloud
    5. Choose the Internal option if it's enabled; otherwise, select External:

      Choose app audience in Google Cloud
    6. This step is optional if you chose the Internal option in the previous step. However, if you had to use the External option, click ADD USERS to add a test user:

      Add test user in Google Cloud app
    7. Then add your contact Email address:

      Enter app contact info in Google Cloud
    8. Finally, check the checkbox and click CREATE button:

      Create app in Google Cloud
    9. Done! Let's create Client Credentials in the next step.

    Step-4: Create Client Credentials

    1. In Google Auth Platform, select Clients menu item and click CREATE CLIENT button:

      Start creating app client in Google Cloud
    2. Choose Desktop app as Application type and name your credentials:

      Create OAuth app client in Google Cloud
    3. Continue by opening the created credentials:

      View app client credentials in Google Cloud
    4. Finally, copy Client ID and Client secret for the later step:

      Use client ID and secret to read Google REST API data
    5. Done! We have all the data needed for authentication, let's proceed to the last step!

    Step-5: Configure connection

    1. Now go to the SSIS package or ODBC data source and use the previously copied values in the User Account authentication configuration:

      • In the ClientId field paste the Client ID value.
      • In the ClientSecret field paste the Client secret value.
    2. Press the Generate Token button to generate the Access and Refresh Tokens.

    3. Then choose ProjectId from the dropdown menu.

    4. Continue by choosing DatasetId from the dropdown menu.

    5. Finally, click Test Connection to confirm the connection is working.

    6. Done! Now you are ready to use Google BigQuery Connector!


    Configuring authentication parameters
    Connector: Google BigQuery
    Authentication: User Account [OAuth]
    API Base URL: https://www.googleapis.com/bigquery/v2

    Required Parameters:
      • UseCustomApp
      • ProjectId (choose after [Generate Token] is clicked)
      • DatasetId (choose after [Generate Token] is clicked and ProjectId is selected)

    Optional Parameters (with default values where applicable):
      • ClientId
      • ClientSecret
      • Scope (default: https://www.googleapis.com/auth/bigquery https://www.googleapis.com/auth/bigquery.insertdata https://www.googleapis.com/auth/cloud-platform https://www.googleapis.com/auth/cloud-platform.read-only https://www.googleapis.com/auth/devstorage.full_control https://www.googleapis.com/auth/devstorage.read_only https://www.googleapis.com/auth/devstorage.read_write)
      • RetryMode (default: RetryWhenStatusCodeMatch)
      • RetryStatusCodeList (default: 429|503)
      • RetryCountMax (default: 5)
      • RetryMultiplyWaitTime (default: True)
      • Job Location
      • Redirect URL (only for Web App)

    ZappySys OAuth Connection

    Service accounts are accounts that do not represent a human user. They provide a way to manage authentication and authorization when a human is not directly involved, such as when an application needs to access Google Cloud resources. Service accounts are managed by IAM. [API reference]

    Steps how to get and use Google BigQuery credentials

    Follow these steps to create a Service Account to authenticate and access the BigQuery API in an SSIS package or ODBC data source:

    Step-1: Create project

    This step is optional if you already have a project in Google Cloud that you can use. If you don't, proceed with these simple steps to create one:

    1. First of all, go to Google API Console.

    2. Then click Select a project button and then click NEW PROJECT button:

      Start creating a new project in Google Cloud
    3. Name your project and click CREATE button:

      Create a new project in Google Cloud
    4. Wait until the project is created:

      Wait until project is created in Google Cloud
    5. Done! Let's proceed to the next step.

    Step-2: Enable Google Cloud APIs

    In this step we will enable BigQuery API and Cloud Resource Manager API:

    1. Select your project on the top bar:

      Select project in Google Cloud
    2. Then click the "hamburger" icon on the top left and access APIs & Services:

      Access APIs and services in Google Cloud
    3. Now let's enable several APIs by clicking ENABLE APIS AND SERVICES button:

      Enable API for project in Google Cloud
    4. In the search bar search for bigquery api and then locate and select BigQuery API:

      Search for API in Google Cloud
    5. If BigQuery API is not enabled, enable it:

      Enable Google BigQuery API
    6. Then repeat the step and enable Cloud Resource Manager API as well:

      Enable Cloud Resource Manager API
    7. Done! Let's proceed to the next step and create a service account.

    Step-3: Create Service Account

    Use the steps below to create a Service Account in Google Cloud:

    1. First of all, go to IAM & Admin in Google Cloud console:

      Access IAM & Admin in Google Cloud
    2. Once you do that, click Service Accounts on the left side and click CREATE SERVICE ACCOUNT button:

      Start creating service account in Google Cloud
    3. Then name your service account and click CREATE AND CONTINUE button:

      Create service account in Google Cloud
    4. Continue by clicking the Select a role dropdown and start granting the service account the BigQuery Admin and Project Viewer roles:

      Start granting service account project roles in Google Cloud
    5. Find BigQuery group on the left and then click on BigQuery Admin role on the right:

      Grant service account BigQuery Admin role
    6. Then click ADD ANOTHER ROLE button, find Project group and select Viewer role:

      Grant service account project viewer role
    7. Finish adding roles by clicking CONTINUE button:

      Finish granting service account project roles in Google Cloud
      You can always add or modify permissions later in IAM & Admin.
    8. Finally, in the last step, just click the DONE button:

      Finish configuring service account in Google Cloud
    9. Done! We are ready to add a Key to this service account in the next step.

    Step-4: Add Key to Service Account

    We are ready to add a Key (JSON or P12 key file) to the created Service Account:

    1. In Service Accounts open newly created service account:

      Open service account in Google Cloud
    2. Next, copy the email address of your service account for a later step:

      Copy service account email address in Google Cloud
    3. Continue by selecting KEYS tab, then press ADD KEY dropdown, and click Create new key menu item:

      Start creating key for service account in Google Cloud
    4. Finally, select JSON (Engine v19+) or P12 option and hit CREATE button:

      Create JSON or P12 key for service account in Google Cloud
    5. The key file downloads to your machine. We now have all the data needed for authentication, so let's proceed to the last step!

    Step-5: Configure connection

    1. Now go to the SSIS package or ODBC data source and configure these fields in the Service Account authentication configuration:

      • In the Service Account Email field paste the service account Email address value you copied in the previous step.
      • In the Service Account Private Key Path (i.e. *.json OR *.p12) field, enter the path of the downloaded key file.
    2. Done! Now you are ready to use Google BigQuery Connector!

    Configuring authentication parameters
    Connector: Google BigQuery
    Authentication: Service Account [OAuth]
    API Base URL: https://www.googleapis.com/bigquery/v2

    Required Parameters:
      • Service Account Email
      • Service Account Private Key Path (i.e. *.json OR *.p12)
      • ProjectId
      • DatasetId (choose after ProjectId)

    Optional Parameters (with default values where applicable):
      • Scope (default: https://www.googleapis.com/auth/bigquery https://www.googleapis.com/auth/bigquery.insertdata https://www.googleapis.com/auth/cloud-platform https://www.googleapis.com/auth/cloud-platform.read-only https://www.googleapis.com/auth/devstorage.full_control https://www.googleapis.com/auth/devstorage.read_only https://www.googleapis.com/auth/devstorage.read_write)
      • RetryMode (default: RetryWhenStatusCodeMatch)
      • RetryStatusCodeList (default: 429)
      • RetryCountMax (default: 5)
      • RetryMultiplyWaitTime (default: True)
      • Job Location
      • Impersonate As (enter Email Id)

    ZappySys OAuth Connection

  6. Select the desired endpoint, change/pass the property values, and go to the Mappings tab to map the columns.

    API Destination - Google BigQuery
    API Destination - Select Endpoint

  7. Finally, map the desired columns:

    API Destination - Google BigQuery
    API Destination - Columns Mapping

  8. That's it! In just a few clicks we configured a POST API call to Google BigQuery using the ZappySys Google BigQuery Connector.

    Execute Package - Reading data from API Source and load into target

Load Google BigQuery data into SQL Server using Upsert Destination (Insert or Update)

Once you have configured the data source, you can load Google BigQuery data into SQL Server using Upsert Destination.

Upsert Destination can merge or synchronize source data with the target table. It supports Microsoft SQL Server, PostgreSQL, and Redshift databases as targets, and also supports a very fast bulk upsert operation along with bulk delete.

An upsert is a database operation that performs an INSERT or UPDATE command based on whether a record already exists in the target table: records with no match on the key columns are inserted, and records with a match are updated.

Upsert Destination supports INSERT, UPDATE, and DELETE operations, so it is similar to SQL Server's MERGE command, except that it can be used directly in an SSIS package.
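To make that concrete, here is roughly what an upsert amounts to for a SQL Server target, sketched in Python via pyodbc as a MERGE statement. The table, columns, and connection string are hypothetical, and the actual component uses an optimized bulk path rather than a hand-written statement like this.

```python
# Conceptual equivalent of what an upsert performs against a SQL Server
# target. Table/column names and the connection string are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=localhost;Database=TestDB;Trusted_Connection=yes;"
)

merge_sql = """
MERGE dbo.Customers AS tgt
USING staging.Customers AS src
    ON tgt.CustomerId = src.CustomerId          -- the Key column(s) you select
WHEN MATCHED THEN                               -- Update option checked
    UPDATE SET tgt.Name = src.Name, tgt.Email = src.Email
WHEN NOT MATCHED BY TARGET THEN                 -- Insert option checked
    INSERT (CustomerId, Name, Email)
    VALUES (src.CustomerId, src.Name, src.Email);
"""

cur = conn.cursor()
cur.execute(merge_sql)
conn.commit()
cur.close()
conn.close()
```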

  1. From the SSIS Toolbox drag-and-drop Upsert Destination component onto the Data Flow designer background.

  2. Connect your SSIS source component to Upsert Destination.

  3. Double-click on the Upsert Destination component to open the configuration window.

  4. Start by selecting the Action from the list.

  5. Next, select the desired target connection or create one by clicking <New [provider] Connection> menu item from the Target Connection dropdown.

  6. Then select a table from the Target Table list or click New button to create a new table based on the source columns.

  7. Continue by checking Insert and Update options according to your scenario (e.g. if Update option is unchecked, no updates will be made).

  8. Finally, click Map All button to map all columns and then select the Key columns to match the columns on:

    Configure SSIS Upsert Destination component to merge data with SQL Server, PostgreSQL, or Redshift table
  9. Click OK to save the configuration.

  10. Run the package and Google BigQuery data will be merged with the target table in SQL Server, PostgreSQL, or Redshift:

    Execute Package - Reading data from API Source and load into target
  11. Done!

Deploy and schedule SSIS package

After you are done creating the SSIS package, you will most likely want to deploy it to the SSIS Catalog in SQL Server and run it periodically. Just follow the instructions in this article:

Running SSIS package in Azure Data Factory (ADF)

To use SSIS PowerPack in ADF, you must first prepare the Azure-SSIS Integration Runtime. Follow this link for detailed instructions:

Actions supported by Google BigQuery Connector

Google BigQuery Connector supports the following actions for REST API integration:

[Dynamic Action]

Description

Read data from [$parent.tableReference.datasetId$].[$parent.tableReference.tableId$] for the specified project.

Parameters

You can provide the following parameters to this action:

  • N/A

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • -Dynamic-
  • [Dynamic Column]_DT

Visit documentation for more information.

Create Dataset

Description

Creates a new empty dataset.

Parameters

You can provide the following parameters to this action:

  • Dataset Name
  • ProjectId
  • Description

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • datasetId
  • projectId
  • kind
  • id
  • location
  • friendlyName
  • description
  • access

Visit documentation for more information.
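For comparison, the same operation performed with Google's official Python client (google-cloud-bigquery) looks like this; the project and dataset names are placeholders.

```python
# The same "create an empty dataset" call, shown via Google's Python
# client for reference. Names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
dataset = bigquery.Dataset("my-project.my_new_dataset")
dataset.location = "US"
dataset.description = "Created alongside the SSIS demo"

dataset = client.create_dataset(dataset, exists_ok=True)  # wraps datasets.insert
print(f"Created {dataset.project}.{dataset.dataset_id}")
```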

Delete Dataset

Description

Deletes the dataset specified by the datasetId value. Before you can delete a dataset, you must delete all its tables, either manually or by specifying deleteContents. Immediately after deletion, you can create another dataset with the same name.

Parameters

You can provide the following parameters to this action:

  • DatasetId
  • ProjectId
  • Delete All Tables

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • Response

Visit documentation for more information.
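For reference, the equivalent call in Google's Python client is shown below; the delete_contents flag appears to correspond to this action's Delete All Tables parameter, and the dataset ID is a placeholder.

```python
# Equivalent dataset deletion via Google's Python client, for reference.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
client.delete_dataset(
    "my-project.my_old_dataset",  # placeholder dataset ID
    delete_contents=True,         # also drop the tables inside
    not_found_ok=True,            # no error if it is already gone
)
```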

Delete Table

Description

Deletes the table specified by the tableId value from the dataset. If the table contains data, all the data will be deleted.

Parameters

You can provide the following parameters to this action:

  • TableId
  • ProjectId
  • DatasetId

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • Response

Visit documentation for more information.

Get Query Schema (From SQL)

Description

Runs a BigQuery SQL query synchronously and returns query schema.

Parameters

You can provide the following parameters to this action:

  • SQL Query
  • Filter
  • Use Legacy SQL Syntax?
  • timeout (Milliseconds)
  • Job Location

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • name
  • type

Visit documentation for more information.

Get Table Schema

Description

Gets the specified table resource by table ID. This method does not return the data in the table, it only returns the table resource, which describes the structure of this table.

Parameters

You can provide the following parameters to this action:

  • DatasetId
  • TableId
  • Filter

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • name
  • type

Visit documentation for more information.
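The underlying tables.get call can also be made with Google's Python client; the name/type pairs printed below mirror this action's output fields (IDs are placeholders).

```python
# Fetching a table's schema (tables.get) via Google's Python client.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
table = client.get_table("my-project.my_dataset.my_table")

for field in table.schema:
    print(field.name, field.field_type)  # e.g. "customer_id STRING"
```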

Insert Table Data

Description

Not available.

Parameters

You can provide the following parameters to this action:

  • ProjectId
  • DatasetId
  • TableId

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • index
  • reason
  • location
  • debugInfo
  • message

Visit documentation for more information.
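This action's output fields (index, reason, location, message) match the per-row error entries returned by BigQuery's streaming insert API (tabledata.insertAll), which this action most likely wraps. A minimal sketch with Google's Python client, using placeholder IDs:

```python
# Streaming rows into a table via tabledata.insertAll. Per-row failures
# come back with index/reason/location/message entries, matching the
# output fields listed above. IDs are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
rows = [
    {"id": 1, "name": "Alice"},
    {"id": 2, "name": "Bob"},
]

errors = client.insert_rows_json("my-project.my_dataset.my_table", rows)
if errors:
    print("Some rows failed:", errors)
```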

List Datasets

Description

Lists all BigQuery datasets in the specified project to which the user has been granted the READER dataset role.

Parameters

You can provide the following parameters to this action:

  • ProjectId
  • SearchFilter
  • all

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • datasetId
  • projectId
  • kind
  • id
  • location

Visit documentation for more information.

List Projects

Description

Lists the projects that the caller has permission to access and that satisfy the specified filter.

Parameters

You can provide the following parameters to this action:

  • SearchFilter

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • projectId
  • name
  • projectNumber
  • lifecycleState
  • createTime

Visit documentation for more information.

List Tables

Description

Lists BigQuery Tables for the specified project / dataset to which the user has been granted the READER dataset role.

Parameters

You can provide the following parameters to this action:

  • DatasetId
  • ProjectId

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • tableId
  • datasetId
  • projectId
  • kind
  • id
  • type
  • creationTime

Visit documentation for more information.
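For reference, the List Datasets and List Tables actions correspond to BigQuery's datasets.list and tables.list calls, sketched here with Google's Python client (the project ID is a placeholder).

```python
# Listing datasets and their tables, for comparison with the
# List Datasets / List Tables actions above.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

for dataset in client.list_datasets():          # datasets.list
    print("dataset:", dataset.dataset_id)
    for table in client.list_tables(dataset):   # tables.list
        print("  table:", table.table_id)
```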

Post Dynamic Endpoint

Description

Not available.

Parameters

You can provide the following parameters to this action:

  • N/A

Input Fields

You can provide the following fields to this action:

  • -Dynamic-
  • [Dynamic Column]_DT

Output Fields

The following fields are returned after calling this action:

  • index
  • reason
  • location
  • debugInfo
  • message

Visit documentation for more information.

Read Data using SQL Query -OR- Execute Script (i.e. CREATE, SELECT, INSERT, UPDATE, DELETE)

Description

Runs a BigQuery SQL query synchronously and returns query results if the query completes within a specified timeout.

Parameters

You can provide the following parameters to this action:

  • SQL Statement (i.e. SELECT / DROP / CREATE)
  • Use Legacy SQL Syntax?
  • timeout (Milliseconds)
  • Job Location

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • -Dynamic-
  • [Dynamic Column]_DT

Visit documentation for more information.
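An equivalent synchronous query execution with Google's Python client is shown below for reference. The query runs against a public dataset; note that the client's timeout is expressed in seconds, whereas this action's timeout parameter is in milliseconds.

```python
# Running a query synchronously, analogous to this action.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project
sql = """
    SELECT name, SUM(number) AS total
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    GROUP BY name
    ORDER BY total DESC
    LIMIT 5
"""

rows = client.query(sql).result(timeout=60)  # blocks until the job finishes
for row in rows:
    print(row["name"], row["total"])
```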

Read Table Rows

Description

Reads rows of data directly from the table specified by the table ID, without running a query.

Parameters

You can provide the following parameters to this action:

  • TableId
  • ProjectId
  • DatasetId

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • -Dynamic-
  • [Dynamic Column]_DT

Visit documentation for more information.
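Reading rows directly (BigQuery's tabledata.list) without running a query can be sketched with Google's Python client as follows; the IDs are placeholders.

```python
# Reading raw table rows (tabledata.list) without a query job.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
for row in client.list_rows("my-project.my_dataset.my_table", max_results=100):
    print(dict(row))  # each row as a column-name -> value mapping
```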

Make Generic API Request

Description

This is a generic endpoint. Use it when an action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. All parameters except URL are optional.

Parameters

You can provide the following parameters to this action:

  • Url
  • Body
  • IsMultiPart
  • Filter
  • Headers

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • N/A

Visit documentation for more information.
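To see what such a generic call amounts to, here is a hand-rolled sketch in Python: a partial URL appended to the connector's API Base URL, with a bearer token in the Authorization header. The endpoint shown (datasets.list) is a real BigQuery v2 path; acquiring the access token is omitted here (see the OAuth sketch earlier in this article).

```python
# Hand-rolled equivalent of a generic API request: partial URL appended
# to the API Base URL. Token acquisition is omitted for brevity.
import requests

BASE_URL = "https://www.googleapis.com/bigquery/v2"
access_token = "<access token from one of the auth flows above>"  # placeholder

resp = requests.get(
    f"{BASE_URL}/projects/my-project/datasets",   # partial URL: /projects/{id}/datasets
    headers={"Authorization": f"Bearer {access_token}"},
)
resp.raise_for_status()
print(resp.json())
```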

Make Generic API Request (Bulk Write)

Description

This is a generic endpoint for bulk write purposes. Use it when an action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Headers, etc. All parameters except URL are optional.

Parameters

You can provide the following parameters to this action:

  • Url
  • IsMultiPart
  • Filter
  • Headers

Input Fields

You can provide the following fields to this action:

  • N/A

Output Fields

The following fields are returned after calling this action:

  • N/A

Visit documentation for more information.

Conclusion

In this article we showed you how to connect to Google BigQuery in SSIS and integrate data without any coding, saving you time and effort.

We encourage you to download the Google BigQuery Connector for SSIS and see for yourself how easy it is to use.

If you have any questions, feel free to contact the ZappySys support team. You can also open a live chat immediately by clicking on the chat icon below.

Download Google BigQuery Connector for SSIS | Documentation
