Google BigQuery Connector Help
Version 8
File Version: v8
Supported Engine: 9

Google BigQuery Connector Examples


The ZappySys API Driver is a user-friendly interface designed to facilitate the seamless integration of various applications with the Google BigQuery API. With its intuitive design and robust functionality, the ZappySys API Driver simplifies the process of configuring specific API endpoints to efficiently read or write data from Google BigQuery.

On this page you will find SQL examples that can be used with the API ODBC Driver or the Data Gateway API Connector.

Native Query (ServerSide): Query using Simple SQL

Server-side BigQuery SQL query example. Prefix the SQL with the keyword #DirectSQL to invoke the server-side engine (pass-through SQL). This query reads the free public dataset table (bigquery-public-data.samples.wikipedia).

#DirectSQL SELECT * FROM bigquery-public-data.samples.wikipedia LIMIT 1000 /* try your own dataset or a free public dataset like nyc-tlc.yellow.trips -- 3 parts: [Project.]Dataset.Table */

Native Query (ServerSide): Query using Complex SQL

Server-side BigQuery SQL query example. Prefix the SQL with the keyword #DirectSQL to invoke the server-side engine (pass-through SQL). This query aggregates the free public dataset table (bigquery-public-data.usa_names.usa_1910_2013).

#DirectSQL 
SELECT name, gender, SUM(number) AS total
FROM bigquery-public-data.usa_names.usa_1910_2013
GROUP BY name, gender
ORDER BY total DESC
LIMIT 10

Native Query (ServerSide): Create Table / Run Other DDL

Example of how to run a valid BigQuery DDL statement. Prefix the SQL with the keyword #DirectSQL to invoke the server-side engine (pass-through SQL).

#DirectSQL CREATE TABLE TestDataset.Table1 (ID INT64,Name STRING,BirthDate DATETIME, Active BOOL)
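
The same pass-through approach can run other DDL statements as well. Below is a minimal sketch (assuming the TestDataset.Table1 table created above) that adds a column:

#DirectSQL ALTER TABLE TestDataset.Table1 ADD COLUMN Notes STRING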

Native Query (ServerSide): UPDATE Table data for complex types (e.g. Nested RECORD, Geography, JSON)

Example of how to run a valid BigQuery DML statement (e.g. UPDATE / INSERT / DELETE). This use case shows how to update a record with complex data types such as RECORD (i.e. array), GEOGRAPHY, JSON and more. Prefix the SQL with the keyword #DirectSQL to invoke the server-side engine (pass-through SQL).

#DirectSQL 
Update TestDataset.DataTypeTest 
Set ColTime='23:59:59.123456',
 ColGeography=ST_GEOGPOINT(34.150480, -84.233870),
 ColRecord=(1,"AA","Column3 data"),
 ColBigNumeric=1222222222222222222.123456789123456789123456789123456789,
 ColJson= JSON_ARRAY('{"doc":1, "values":[{"id":1},{"id":2}]}') 
Where ColInteger=1
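
The same #DirectSQL prefix works for the other DML statements mentioned above. For example, a minimal DELETE sketch against the same hypothetical table:

#DirectSQL 
DELETE FROM TestDataset.DataTypeTest 
WHERE ColInteger=1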

Native Query (ServerSide): DROP Table (if exists) / Other DDL

Example of how to run a valid BigQuery DDL statement. Prefix the SQL with the keyword #DirectSQL to invoke the server-side engine (pass-through SQL).

#DirectSQL DROP TABLE IF EXISTS Myproject.Mydataset.Mytable

Native Query (ServerSide): Call Stored Procedure

Example of how to run a BigQuery stored procedure and pass parameters. Assuming you created a valid stored procedure called usp_GetData in TestDataset, call it as shown below.

#DirectSQL CALL TestDataset.usp_GetData(1)
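
If the procedure accepts more than one parameter, pass the values as a comma-separated list. A sketch assuming a hypothetical two-parameter procedure:

#DirectSQL CALL TestDataset.usp_GetDataByDate(1, '2023-01-01') /* hypothetical procedure name and parameter values */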

INSERT Single Row

This sample shows how you can insert into BigQuery using the ZappySys query language. You can also specify ProjectId='myproject-id' in the WITH clause.

INSERT INTO MyBQTable1(SomeBQCol1, SomeBQCol2) Values(1,'AAA')
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')

INSERT Multiple Rows from SQL Server

This example shows how to bulk insert into a Google BigQuery table from Microsoft SQL Server as an external source. Notice that the INSERT statement has no column list; the columns are supplied by the source query, which must therefore produce valid column names found in the target BigQuery table (you can use a SQL alias in the column name to produce matching names).

INSERT INTO MyBQTable1 
SOURCE(
    'MSSQL'
  , 'Data Source=localhost;Initial Catalog=tempdb;Integrated Security=true'
  , 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM SomeTable Where SomeCol=123'
)
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')

INSERT Multiple Rows from any ODBC Source (DSN)

This example shows how to bulk insert into a Google BigQuery table from any external ODBC source (assuming you have installed the ODBC driver and configured a DSN). Notice that the INSERT statement has no column list; the columns are supplied by the source query, which must therefore produce valid column names found in the target BigQuery table (you can use a SQL alias in the column name to produce matching names).

INSERT INTO MyBQTable1 
SOURCE(
    'ODBC'
  , 'DSN=MyDsn'
  , 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM SomeTable Where SomeCol=123'
) 
WITH(DatasetId='TestDataset')

INSERT Multiple Rows from any JSON Files / API (Using ZappySys ODBC JSON Driver)

This example shows how to bulk insert into a Google BigQuery table from any external JSON API / file source via ODBC (assuming you have installed the ZappySys ODBC Driver for JSON). Notice that the INSERT statement has no column list; the columns are supplied by the source query, which must therefore produce valid column names found in the target BigQuery table (you can use a SQL alias in the column name to produce matching names). You can use a similar approach to read from CSV or XML files: just use the CSV / XML driver rather than the JSON driver in the connection string (see the sketch after the example below). Refer to this page for more JSON query examples: https://zappysys.com/onlinehelp/odbc-powerpack/scr/json-odbc-driver-sql-query-examples.htm

INSERT INTO MyBQTable1 
SOURCE(
    'ODBC'
  , 'Driver={ZappySys JSON Driver};Src=''https://some-url/get-data'''
  , 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM _root_'
)
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')
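
As noted above, CSV or XML files follow the same pattern; only the driver name in the connection string (and the source query) changes. Below is a minimal sketch assuming the ZappySys CSV Driver is installed (the exact driver name may differ on your system); the file path, table reference ($) and column names are placeholders to adjust for your source:

INSERT INTO MyBQTable1 
SOURCE(
    'ODBC'
  , 'Driver={ZappySys CSV Driver};Src=''c:\data\somefile.csv'''
  , 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM $'
)
--WITH(DatasetId='TestDataset',Output='*')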

List Projects

Lists the projects to which the user has access.

SELECT * FROM list_projects

List Datasets

Lists the datasets for the specified project. If you do not specify ProjectId, the connection-level setting is used.

SELECT * FROM list_datasets
--WITH(ProjectId='MyProjectId')

List Tables

Lists the tables for the specified project / dataset. If you do not specify ProjectId or DatasetId, the connection-level settings are used.

SELECT * FROM list_tables
--WITH(ProjectId='MyProjectId')
--WITH(ProjectId='MyProjectId',DatasetId='MyDatasetId')

Delete dataset

Deletes the dataset with the specified ID. If you would like to also delete all tables under that dataset, set deleteContents='true' (see the additional example below).

SELECT * FROM delete_dataset WITH(DatasetId='MyDatasetId', deleteContents='False')
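
To drop the dataset along with any tables it still contains, set deleteContents to true, as the description above mentions:

SELECT * FROM delete_dataset WITH(DatasetId='MyDatasetId', deleteContents='true')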

Getting Started with Examples

ZappySys API Driver is a powerful software solution designed to facilitate the extraction and integration of data from a wide range of sources through APIs. Its intuitive design and extensive feature set make it an essential asset for any organization dealing with complex data integration tasks.

To get started with examples using the ZappySys API Driver, please click on one of the following applications:

Connect Google BigQuery in SQL Server
Connect Google BigQuery in Power BI
Connect Google BigQuery in SSRS
Connect Google BigQuery in Informatica
Connect Google BigQuery in MS Access
Connect Google BigQuery in MS Excel
Connect Google BigQuery in SSAS
Connect Google BigQuery in C#
Connect Google BigQuery in Python
Connect Google BigQuery in JAVA
Connect Google BigQuery in Tableau
Connect Google BigQuery in SAP Crystal Reports
Connect Google BigQuery in Azure Data Factory (ADF)
Connect Google BigQuery in ODBC

The API ODBC driver facilitates reading and writing data from numerous popular online services (refer to the complete list here) using familiar SQL, without the complexity of learning REST API calls. The driver can query nested structures and return the output as a flat table. You can also create your own ODBC / Data Gateway API connector file and use it with this driver.

Key features of the ZappySys API Driver include:

  1. Intuitive Configuration: The interface is designed to be user-friendly, enabling users to easily set up the specific API endpoints within Google BigQuery without requiring extensive technical expertise or programming knowledge.

  2. Customizable Endpoint Setup: Users can conveniently configure the API endpoint settings, including the HTTP request method, endpoint URL, and any necessary parameters, to precisely target the desired data within Google BigQuery.

  3. Data Manipulation Capabilities: The ZappySys API Driver allows for seamless data retrieval and writing, enabling users to fetch data from Google BigQuery and perform various data manipulation operations as needed, all through an intuitive and straightforward interface.

  4. Secure Authentication Integration: The driver provides secure authentication integration, allowing users to securely connect to the Google BigQuery API by inputting the necessary authentication credentials, such as API tokens or other authentication keys.

  5. Error Handling Support: The interface is equipped with comprehensive error handling support, ensuring that any errors or exceptions encountered during the data retrieval or writing process are efficiently managed and appropriately communicated to users for prompt resolution.

  6. Data Visualization and Reporting: The ZappySys API Driver facilitates the seamless processing and presentation of the retrieved data from Google BigQuery, enabling users to generate comprehensive reports and visualizations for further analysis and decision-making purposes.

Overall, the ZappySys API Driver serves as a powerful tool for streamlining the integration of applications with Google BigQuery, providing users with a convenient and efficient way to access and manage data, all through a user-friendly and intuitive interface.