SSAS Google BigQuery API Connector

In this article, you will learn how to integrate Google BigQuery data into SSAS without coding, in just a few clicks (live / bi-directional connection to Google BigQuery). Read and write Google BigQuery data inside your app without coding, using an easy-to-use, high-performance API connector.

Using the Google BigQuery API Connector, you will be able to connect to, read, and write data from within SSAS. Let's take a look at the steps below to see exactly how to accomplish that.


Create Data Source in ZappySys Data Gateway based on ZappySys API Driver

  1. Download and install ZappySys ODBC PowerPack.

  2. Search for gateway in the Start menu and open ZappySys Data Gateway:
    Open ZappySys Data Gateway

  3. Go to the Users tab to add our first Gateway user. Click Add; we will name it tdsuser and enter a password of your choice. Check the Admin option and click OK to save. We will use these details later when we create the linked server:
    ZappySys Data Gateway - Add User

  4. Now we are ready to add a data source. Click Add, give the data source a name (copy this name somewhere; we will need it later), and then select Native - ZappySys API Driver. Finally, click OK.

    GoogleBigQueryDSN

    ZappySys Data Gateway - Add Data Source

  5. When a window appears, first give your data source a name if you haven't done so already, then select "Google BigQuery" from the list of Popular Connectors. If "Google BigQuery" is not present in the list, click "Search Online" and download it, then set the path to the location where you downloaded it. Finally, hit the Continue >> button to continue configuring the DSN:

    GoogleBigQueryDSN
    Google BigQuery
    ODBC DSN Template Selection

  6. Another window appears, and it's time to configure the Connection Manager. First, select the Authentication Type, e.g. Token Authentication. Then select the API Base URL (in most cases the default one is correct). More info is available in the Authentication section.

    Steps to get Google BigQuery Credentials
    This connection can be configured in two ways: use the default app (created by ZappySys) OR use a custom app created by you.
    To start with minimal settings, use the ZappySys-created app. Just set UseCustomApp=false in the properties grid so you don't need a ClientID / Secret. When you click Generate Token, you might see a warning that the app is not trusted (simply click the Advanced link to expand the hidden section, then click the Go to App link to proceed).

    To register a custom app, perform the following steps (detailed steps can be found in the help link at the end):

    1. Go to Google API Console
    2. From the project dropdown (usually found in the top bar), click Select Project
    3. In the project popup, click CREATE PROJECT
    4. Once the project is created, click Select Project to switch the context (you can click the notification link or choose from the top dropdown)
    5. Click ENABLE APIS AND SERVICES
    6. Now we need to enable two APIs, one by one (BigQuery API and Cloud Resource Manager API).
    7. Search BigQuery API. Select and click ENABLE
    8. Search Cloud Resource Manager API. Select and click ENABLE
    9. Go back to the main screen of the Google API Console
    10. Click the OAuth Consent Screen tab. Enter the necessary details and save.
    11. Click the Credentials tab
    12. Click CREATE CREDENTIALS (somewhere in the top bar) and select the OAuth Client ID option.
    13. When prompted, select Desktop App as the Application Type and click Create to receive your ClientID and Secret. You can now use this information to configure the connection with UseCustomApp=true.

    API Reference (External Site)

    Finally, fill in all the required parameters and set the optional parameters if needed:

    GoogleBigQueryDSN
    Google BigQuery
    User Account [OAuth]
    https://www.googleapis.com/bigquery/v2
    Required Parameters:
      • UseCustomApp
      • ProjectId
      • DatasetId
    Optional Parameters:
      • ClientId
      • ClientSecret
      • Scope
    ODBC DSN Oauth Connection Configuration

  7. Once you have configured the data source, you can preview data. Hit the Preview tab and use settings similar to these to preview data:
    ODBC ZappySys Data Source Preview

  8. Click OK to finish creating the data source.

Read data in SQL Server from the ZappySys Data Gateway data source

  1. To read the data in SQL Server, the first thing you have to do is create a Linked Server. Go to SQL Server Management Studio and configure it in a similar way (if you prefer T-SQL over the UI, see the scripted alternative after these steps): SSMS SQL Server Configure Linked Server

  2. Then click on the Security option and configure the username we created in ZappySys Data Gateway in one of the previous steps: SSMS SQL Server Configure Linked Server User Name

  3. Finally, open a new query window and execute the query we prepared in one of the previous steps:

    SELECT * FROM OPENQUERY([MY_LINKED_SERVER_NAME], 'SELECT * FROM Products')

    SSMS SQL Server Query Data Results
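
If you would rather script the linked server than configure it through the SSMS dialogs, a T-SQL script like the one below should work. This is a minimal sketch: the gateway host and port (localhost,5000), the data source name (GoogleBigQueryDSN), and the user (tdsuser) are assumptions carried over from the earlier steps; adjust them to match your setup.

    -- Create a linked server that points at the ZappySys Data Gateway.
    -- The gateway accepts SQL Server (TDS) connections, so a standard
    -- SQL Server OLE DB provider can connect to it.
    EXEC master.dbo.sp_addlinkedserver
        @server     = N'MY_LINKED_SERVER_NAME',
        @srvproduct = N'',
        @provider   = N'MSOLEDBSQL',         -- or SQLNCLI11 on older systems
        @datasrc    = N'localhost,5000',     -- gateway host and port (assumed defaults)
        @catalog    = N'GoogleBigQueryDSN';  -- the data source name created earlier

    -- Map local logins to the gateway user created in the Users tab.
    EXEC master.dbo.sp_addlinkedsrvlogin
        @rmtsrvname  = N'MY_LINKED_SERVER_NAME',
        @useself     = N'False',
        @locallogin  = NULL,
        @rmtuser     = N'tdsuser',           -- gateway user from the earlier step
        @rmtpassword = N'*********';         -- the password you chose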

Firewall settings

So far we have assumed that the gateway is running on the same machine as SQL Server. However, there will be cases where ZappySys ODBC PowerPack is installed on a different machine than SQL Server. In such a case you may have to perform additional firewall configuration. On most computers, firewall settings won't allow outside traffic to reach ZappySys Data Gateway. In that case, perform the following steps to allow other machines to connect to the gateway.

Method-1 (Preferred)

If you are using a newer version of ZappySys Data Gateway, then adding a firewall rule takes just a single click.

  1. Search for gateway in the Start menu and open ZappySys Data Gateway.
  2. Go to the Firewall tab and click the Add Firewall Rule button as shown below. This will create a firewall rule allowing all inbound traffic on port 5000 (unless you changed it). Allow Inbound Traffic - Add Firewall Rule for ZappySys Data Gateway

Method-2 Here is another way to add or edit an inbound traffic rule in Windows Firewall. Use the method below if you want to customize your rule (for advanced users).
  1. Search for Windows Firewall Advanced Security in the Start menu.
  2. Under Inbound Rules, right-click and select [New Rule] >> Click Next
  3. Select Port as the Rule Type >> Click Next
  4. Click TCP and enter 5000 as the specific local port (use a different one if you changed the default port) >> Click Next
  5. Select Profile (i.e. Private, Public) >> Click Next
  6. Enter a rule name [e.g. ZappySys Data Gateway – Allow Inbound] >> Click Next
  7. Click OK to save the rule
SQL Server Firewall Allow Inbound Data Gateway

Create Custom Stored Procedure in ZappySys Driver

You can create stored procedures to encapsulate custom logic and then pass only a handful of parameters, rather than a long SQL statement, to execute your API call.

Steps to create a custom stored procedure in the ZappySys Driver. You can insert placeholders anywhere inside the procedure body. Read more about placeholders here.

  1. Go to the Custom Objects tab, click the Add button, and select Add Procedure:
    ZappySys Driver - Add Store Procedure

  2. Enter the desired procedure name and click OK:
    ZappySys Driver - Add Store Procedure Name

  3. Select the newly created stored procedure, write your desired procedure body, and save it; this will create the custom stored procedure in the ZappySys Driver:
    Here is an example stored procedure for the ZappySys Driver. You can insert placeholders anywhere inside the procedure body. Read more about placeholders here.

    CREATE PROCEDURE [usp_get_orders]
    @fromdate = '<<yyyy-MM-dd,FUN_TODAY>>'
    AS
    SELECT * FROM Orders where OrderDate >= '<@fromdate>'

    ZappySys Driver - Create Custom Store Procedure

  4. That's it! Now go to the Preview tab and execute your stored procedure using the EXEC command. In this example it will extract orders placed on or after 1996-01-01:

    Exec usp_get_orders '1996-01-01'

    ZappySys Driver - Execute Custom Store Procedure
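
You can also invoke this custom stored procedure from SQL Server through the linked server we created earlier. A minimal sketch, assuming the linked server MY_LINKED_SERVER_NAME from the previous section (the EXEC ... AT form additionally requires the RPC Out option enabled on the linked server):

    -- Run the gateway-side procedure via OPENQUERY
    SELECT * FROM OPENQUERY([MY_LINKED_SERVER_NAME], 'EXEC usp_get_orders ''1996-01-01''')

    -- Or pass the statement through with EXEC ... AT (requires RPC Out = true)
    EXEC ('EXEC usp_get_orders ''1996-01-01''') AT [MY_LINKED_SERVER_NAME]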

Create Custom Virtual Table in ZappySys Driver

ZappySys API Drivers support a flexible query language, so you can override the default properties you configured on the data source, such as URL and Body. This way you don't have to create multiple data sources if you want to read data from multiple endpoints. However, not every application supports supplying custom SQL to the driver; in some you can only select a table from the list returned by the driver.

Many applications, like MS Access or Informatica Designer, won't give you the option to specify custom SQL when you import objects. In such cases a virtual table is very useful. You can create many virtual tables on the same data source (e.g. if you have 50 URLs with slight variations, you can create virtual tables with just the URL as a parameter setting).

  1. Go to the Custom Objects tab, click the Add button, and select Add Table:
    ZappySys Driver - Add Table

  2. Enter the desired table name and click OK:
    ZappySys Driver - Add Table Name

  3. A New Query window will open. Click Cancel to close it and go back to the Custom Objects tab.

  4. Select the newly created table, select Text Type AS SQL, write your desired SQL query, and save it; this will create the custom table in the ZappySys Driver:
    Here is an example SQL query for the ZappySys Driver. You can also insert placeholders. Read more about placeholders here.

    SELECT
    "ShipCountry",
    "OrderID",
    "CustomerID",
    "EmployeeID",
    "OrderDate",
    "RequiredDate",
    "ShippedDate",
    "ShipVia",
    "Freight",
    "ShipName",
    "ShipAddress",
    "ShipCity",
    "ShipRegion",
    "ShipPostalCode"
    FROM "Orders"
    Where "ShipCountry"='USA'

    ZappySys Driver - Create Custom Table

  5. That's it! Now go to the Preview tab and execute your custom virtual table query. In this example it will extract only the orders shipped to the USA:

    SELECT * FROM "vt__usa_orders_only"

    ZappySys Driver - Execute Custom Virtual Table Query
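
The virtual table is also visible through the linked server, so the same pre-filtered view can be queried from SQL Server. A minimal sketch, again assuming the linked server MY_LINKED_SERVER_NAME from the earlier steps:

    SELECT * FROM OPENQUERY([MY_LINKED_SERVER_NAME], 'SELECT * FROM vt__usa_orders_only')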

Conclusion

In this article we discussed how to connect to Google BigQuery in SSAS and integrate data without any coding. Click here to download the Google BigQuery Connector for SSAS and try it yourself to see how easy it is. If you still have any questions, ask here, or simply click the live chat icon below and ask our expert (see the bottom-right corner of this page).


Actions supported by Google BigQuery Connector

The Google BigQuery Connector supports the following actions for REST API integration. If an action you need is not listed below, you can easily edit the connector file and enhance the out-of-the-box functionality.
Read Data using SQL Query -OR- Execute Script (i.e. CREATE, SELECT, INSERT, UPDATE, DELETE)
Runs a BigQuery SQL query synchronously and returns query results if the query completes within a specified timeout.
Parameters:
  • SQL Statement (i.e. SELECT / DROP / CREATE). Examples:
      SELECT title, id, language, wp_namespace, reversion_id, comment, num_characters FROM bigquery-public-data.samples.wikipedia LIMIT 1000
      CREATE TABLE TestDataset.Table1 (ID INT64, Name STRING, BirthDate DATETIME, Active BOOL)
      INSERT TestDataset.Table1 (ID, Name, BirthDate, Active) VALUES (1,'AA','2020-01-01',true), (2,'BB','2020-01-02',true), (3,'CC','2020-01-03',false)
  • Use Legacy SQL Syntax? (Options: false, true)
  • timeout (Milliseconds): Wait until the timeout is reached.
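
As with the Products example earlier, a statement like Example1 can be wrapped in OPENQUERY so that SQL Server forwards it to the gateway data source. A minimal sketch (the linked server name is carried over from the earlier steps; how much of the inner query is pushed down to BigQuery versus evaluated by the driver depends on your data source configuration):

    SELECT * FROM OPENQUERY([MY_LINKED_SERVER_NAME],
        'SELECT title, id, language FROM bigquery-public-data.samples.wikipedia LIMIT 1000')
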
Read Table Rows
Gets the specified table resource by table ID. This method does not return the data in the table; it only returns the table resource, which describes the structure of the table.
Parameters:
  • ProjectId: Leave this value blank to use the ProjectId from the connection settings
  • DatasetId: Leave this value blank to use the DatasetId from the connection settings
  • TableId

[$parent.tableReference.datasetId$].[$parent.tableReference.tableId$]
Reads data from [$parent.tableReference.datasetId$].[$parent.tableReference.tableId$] for the project. (No parameters.)

List Projects
Lists projects that the caller has permission to access and that satisfy the specified filter.
Parameters:
  • SearchFilter: An expression for filtering the results of the request. Filter rules are case-insensitive. If multiple fields are included in a filter query, the query will return results that match any of the fields. Some eligible fields for filtering are: name, id, labels.{key} (where key is the name of a label), parent.type, parent.id, lifecycleState. Example: name:how*

List Datasets
Lists all BigQuery datasets in the specified project to which the user has been granted the READER dataset role.
Parameters:
  • ProjectId
  • SearchFilter: An expression for filtering the results of the request. Filter rules are case-insensitive. If multiple fields are included in a filter query, the query will return results that match any of the fields. Some eligible fields for filtering are: name, id, labels.{key} (where key is the name of a label), parent.type, parent.id, lifecycleState. Example: name:how*
  • all: Whether to list all datasets, including hidden ones (Options: True, False)

Create Dataset
Creates a new empty dataset.
Parameters:
  • ProjectId
  • Dataset Name: Enter the dataset name
  • Description

Delete Dataset
Deletes the dataset specified by the datasetId value. Before you can delete a dataset, you must delete all its tables, either manually or by specifying deleteContents. Immediately after deletion, you can create another dataset with the same name.
Parameters:
  • ProjectId
  • DatasetId
  • Delete All Tables: If True, delete all the tables in the dataset. If False and the dataset contains tables, the request will fail. Default is False. (Options: True, False)

Delete Table
Deletes the table specified by the tableId value. If the table contains data, all the data will be deleted.
Parameters:
  • ProjectId
  • DatasetId
  • TableId

List Tables
Lists BigQuery tables for the specified project / dataset to which the user has been granted the READER dataset role.
Parameters:
  • ProjectId
  • DatasetId

Get Query Schema (From SQL)
Runs a BigQuery SQL query synchronously and returns the query schema.
Parameters:
  • SQL Query
  • Use Legacy SQL Syntax? (Options: false, true)
  • timeout (Milliseconds): Wait until the timeout is reached.

Get Table Schema
Gets the specified table resource by table ID. This method does not return the data in the table; it only returns the table resource, which describes the structure of the table.
Parameters:
  • DatasetId
  • TableId

insert_table_data
Parameters:
  • ProjectId
  • DatasetId
  • TableId

post_[$parent.tableReference.datasetId$]_[$parent.tableReference.tableId$]

Generic Request
This is a generic endpoint. Use it when an action is not implemented by the connector. Just enter a partial URL (required), Body, Method, Header, etc. Most parameters are optional except URL.
Parameters:
  • Url: The API URL goes here. You can enter a full URL or a partial URL relative to the Base URL. If it is a full URL, the domain name must be part of the ServiceURL or part of TrustedDomains.
  • Body: The request body content goes here.
  • IsMultiPart: Check this option if you want to upload file(s) (i.e. POST raw file data) or send data using the multi-part encoding method (i.e. Content-Type: multipart/form-data). A multi-part request lets you mix key/value pairs and file uploads in the same request, while a raw upload allows only a single file (without any key/value pairs).
      Raw Upload (Content-Type: application/octet-stream): To upload a single file in raw mode, check this option and specify the full file path, starting with an @ sign, in the Body (e.g. @c:\data\myfile.zip).
      Form-Data / Multipart Upload (Content-Type: multipart/form-data): To treat your request data as multi-part fields, specify key/value pairs separated by new lines in the RequestData field (i.e. Body). Each key/value pair goes on a new line, with key and value separated by an equals sign (=). Leading and trailing spaces are ignored, as are blank lines. If a field value contains special characters, use an escape sequence (e.g. for newline: \r\n, for tab: \t, for at (@): \@). When the value of a field starts with an at sign (@), it is automatically treated as a file to upload. By default, the file content type is determined from the file extension; however, you can supply a content type manually for any field like this: YourFileFieldName.Content-Type=some-content-type. A file-upload field always includes Content-Type in the request by default (non-file fields have no content type unless you supply one). If you don't want a Content-Type header in your request at all, supply a blank Content-Type to exclude the header altogether (e.g. SomeFieldName.Content-Type=). In the example below, Content-Type is supplied for file2 and SomeField1; all other fields use the default content type.
      Example of uploading multiple files along with additional fields:
        file1=@c:\data\Myfile1.txt
        file2=@c:\data\Myfile2.json
        file2.Content-Type=application/json
        SomeField1=aaaaaaa
        SomeField1.Content-Type=text/plain
        SomeField2=12345
        SomeFieldWithNewLineAndTab=This is line1\r\nThis is line2\r\nThis is \ttab \ttab \ttab
        SomeFieldStartingWithAtSign=\@MyTwitterHandle
  • Filter: Enter a filter to extract an array from the response. Example: $.rows[*] --OR-- $.customers[*].orders[*]. Check your response document and find the hierarchy you want to extract.
  • Headers: Headers for the request. To enter multiple headers, use a double pipe or a new line after each {header-name}:{value} pair.
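
As an illustration, an endpoint like this can typically be called from the driver's SQL layer as well. The sketch below is hypothetical: the table name generic_request and the WITH-clause option names (Url, Filter) are assumptions for illustration; check the connector's documentation for the exact names it exposes.

    -- Hypothetical sketch: invoke the generic endpoint with a partial URL and a
    -- JSONPath filter, forwarded through the linked server created earlier
    SELECT *
    FROM OPENQUERY([MY_LINKED_SERVER_NAME],
        'SELECT * FROM generic_request
         WITH (Url=''/projects'', Filter=''$.projects[*]'')')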
