Google BigQuery Connector for Talend Studio
In this article you will learn how to integrate Google BigQuery data into Talend Studio. Using the Google BigQuery Connector you will be able to connect, read, and write data from within Talend Studio. Follow the steps below to see how to accomplish that. The driver mentioned above is part of ODBC PowerPack, a collection of high-performance ODBC drivers for various API data sources (i.e. REST API, JSON, XML, CSV, Amazon S3 and many more). Using the familiar SQL query language you can make live connections and read/write data from API sources or JSON / XML / CSV files inside SQL Server (T-SQL), your favorite reporting tools (i.e. Power BI, Tableau, Qlik, SSRS, MicroStrategy, Excel, MS Access), or ETL tools (i.e. Informatica, Talend, Pentaho, SSIS). You can also call our drivers from programming languages such as Java, C#, Python, PowerShell, etc. If you are new to ODBC and ZappySys ODBC PowerPack, check the following links to get started.
Connect to Google BigQuery in other apps
Create Data Source in ZappySys Data Gateway based on API Driver
- Download and install ZappySys ODBC PowerPack.
- Search for gateway in the Start menu and open ZappySys Data Gateway:
- Go to the Users tab to add our first Data Gateway user. Click Add; we will give it the name tdsuser and enter a password of your choice. Check the Admin option and click OK to save. We will use these details later when we connect from Talend Studio:
- Now we are ready to add a data source. Click Add, give the data source a name (copy this name somewhere, we will need it later), and then select Native - ZappySys API Driver. Finally, click OK; this will create the data source and open the ZappySys driver UI.
- When the Configuration window appears, give your data source a name if you haven't done that already, then select "Google BigQuery" from the list of Popular Connectors. If "Google BigQuery" is not present in the list, click "Search Online" and download it, then set the path to the location where you downloaded it. Finally, click Continue >> to proceed with configuring the DSN:
Now it's time to configure the Connection Manager. Select Authentication Type, e.g. Token Authentication. Then select API Base URL (in most cases, the default one is the right one). More info is available in the Authentication section.
User accounts represent a developer, administrator, or any other person who interacts with Google APIs and services. User accounts are managed as Google Accounts, either with Google Workspace or Cloud Identity. They can also be user accounts that are managed by a third-party identity provider and federated with Workforce Identity Federation. [API reference]
Steps to get and use Google BigQuery credentials
Follow these steps to create Client Credentials (User Account principal) to authenticate and access the BigQuery API in an SSIS package or ODBC data source:
WARNING: If you are planning to automate processes, we recommend that you use the Service Account authentication method. If you still need to use a User Account, make sure you use a system/generic account (e.g. automation@my-company.com). When you use a personal account that is tied to a specific employee profile and that employee leaves the company, the token may become invalid and any automated processes using that token will start to fail.
Step-1: Create project
This step is optional if you already have a project in Google Cloud that you can use. If you don't, proceed with these simple steps to create one:
- First of all, go to Google API Console.
- Then click the Select a project button and then click the NEW PROJECT button:
- Name your project and click the CREATE button:
- Wait until the project is created:
- Done! Let's proceed to the next step.
Step-2: Enable Google Cloud APIs
In this step we will enable BigQuery API and Cloud Resource Manager API:
- Select your project on the top bar:
- Then click the "hamburger" icon on the top left and access APIs & Services:
- Now let's enable several APIs by clicking the ENABLE APIS AND SERVICES button:
- In the search bar, search for bigquery api, then locate and select BigQuery API:
- If BigQuery API is not enabled, enable it:
- Then repeat the step and enable Cloud Resource Manager API as well:
- Done! Let's proceed to the next step.
Step-3: Create OAuth application
- First of all, click the "hamburger" icon on the top left and then hit VIEW ALL PRODUCTS:
- Then access Google Auth Platform to start creating an OAuth application:
- Start by pressing the GET STARTED button:
- Next, continue by filling in the App name and User support email fields:
- Choose the Internal option if it's enabled; otherwise select External:
- This step is optional if you used the Internal option in the previous step. If you had to use the External option, click ADD USERS to add a user:
- Then add your contact Email address:
- Finally, check the checkbox and click the CREATE button:
- Done! Let's create Client Credentials in the next step.
Step-4: Create Client Credentials
- In Google Auth Platform, select the Clients menu item and click the CREATE CLIENT button:
- Choose Desktop app as the Application type and name your credentials:
- Continue by opening the created credentials:
- Finally, copy the Client ID and Client secret for a later step:
- Done! We have all the data needed for authentication; let's proceed to the last step!
Step-5: Configure connection
- Now go to your SSIS package or ODBC data source and use the previously copied values in the User Account authentication configuration:
  - In the ClientId field paste the Client ID value.
  - In the ClientSecret field paste the Client secret value.
- Press the Generate Token button to generate Access and Refresh Tokens.
- Then choose the ProjectId from the drop-down menu.
- Continue by choosing the DatasetId from the drop-down menu.
- Finally, click Test Connection to confirm the connection is working.
- Done! Now you are ready to use the Google BigQuery Connector!
Fill in all required parameters and set optional parameters if needed:
Example: data source GoogleBigqueryDSN, connector Google BigQuery, authentication User Account [OAuth], API Base URL https://www.googleapis.com/bigquery/v2.
Required Parameters:
- UseCustomApp
- ProjectId (choose after [Generate Token] is clicked)
- DatasetId (choose after [Generate Token] is clicked and ProjectId is selected)
Optional Parameters:
- ClientId
- ClientSecret
- Scope (e.g. https://www.googleapis.com/auth/bigquery, https://www.googleapis.com/auth/bigquery.insertdata, https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/cloud-platform.read-only, https://www.googleapis.com/auth/devstorage.full_control, https://www.googleapis.com/auth/devstorage.read_only, https://www.googleapis.com/auth/devstorage.read_write)
- RetryMode (RetryWhenStatusCodeMatch)
- RetryStatusCodeList (429|503)
- RetryCountMax (5)
- RetryMultiplyWaitTime (True)
- Job Location
- Redirect URL (Only for Web App)
Service accounts are accounts that do not represent a human user. They provide a way to manage authentication and authorization when a human is not directly involved, such as when an application needs to access Google Cloud resources. Service accounts are managed by IAM. [API reference]
Steps to get and use Google BigQuery credentials
Follow these steps to create a Service Account to authenticate and access the BigQuery API in an SSIS package or ODBC data source:
Step-1: Create project
This step is optional if you already have a project in Google Cloud that you can use. If you don't, proceed with these simple steps to create one:
- First of all, go to Google API Console.
- Then click the Select a project button and then click the NEW PROJECT button:
- Name your project and click the CREATE button:
- Wait until the project is created:
- Done! Let's proceed to the next step.
Step-2: Enable Google Cloud APIs
In this step we will enable BigQuery API and Cloud Resource Manager API:
- Select your project on the top bar:
- Then click the "hamburger" icon on the top left and access APIs & Services:
- Now let's enable several APIs by clicking the ENABLE APIS AND SERVICES button:
- In the search bar, search for bigquery api, then locate and select BigQuery API:
- If BigQuery API is not enabled, enable it:
- Then repeat the step and enable Cloud Resource Manager API as well:
- Done! Let's proceed to the next step and create a service account.
Step-3: Create Service Account
Use the steps below to create a Service Account in Google Cloud:
- First of all, go to IAM & Admin in the Google Cloud console:
- Once you do that, click Service Accounts on the left side and click the CREATE SERVICE ACCOUNT button:
- Then name your service account and click the CREATE AND CONTINUE button:
- Continue by clicking the Select a role dropdown and start granting the service account the BigQuery Admin and Project Viewer roles:
- Find the BigQuery group on the left and then click the BigQuery Admin role on the right:
- Then click the ADD ANOTHER ROLE button, find the Project group and select the Viewer role:
- Finish adding roles by clicking the CONTINUE button. You can always add or modify permissions later in IAM & Admin.
- Finally, in the last step, just click the DONE button:
- Done! We are ready to add a Key to this service account in the next step.
Step-4: Add Key to Service Account
We are ready to add a Key (P12 certificate) to the created Service Account:
- In Service Accounts, open the newly created service account:
- Next, copy the email address of your service account for a later step:
- Continue by selecting the KEYS tab, then press the ADD KEY dropdown, and click the Create new key menu item:
- Finally, select the P12 option and hit the CREATE button:
- The P12 certificate downloads to your machine. We have all the data needed for authentication; let's proceed to the last step!
Step-5: Configure connection
- Now go to your SSIS package or ODBC data source and configure these fields in the Service Account authentication configuration:
  - In the Service Account Email field paste the service account Email address value you copied in the previous step.
  - In the Service Account Private Key Path (i.e. *.p12) field use the downloaded certificate's file path.
- Done! Now you are ready to use the Google BigQuery Connector!
Fill in all required parameters and set optional parameters if needed:
Example: data source GoogleBigqueryDSN, connector Google BigQuery, authentication Service Account [OAuth], API Base URL https://www.googleapis.com/bigquery/v2.
Required Parameters:
- Service Account Email
- Service Account Private Key Path (i.e. *.p12)
- ProjectId
- DatasetId (choose after ProjectId)
Optional Parameters:
- Scope (e.g. https://www.googleapis.com/auth/bigquery, https://www.googleapis.com/auth/bigquery.insertdata, https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/cloud-platform.read-only, https://www.googleapis.com/auth/devstorage.full_control, https://www.googleapis.com/auth/devstorage.read_only, https://www.googleapis.com/auth/devstorage.read_write)
- RetryMode (RetryWhenStatusCodeMatch)
- RetryStatusCodeList (429)
- RetryCountMax (5)
- RetryMultiplyWaitTime (True)
- Job Location
- Impersonate As (Enter Email Id)

- Once the data source has been configured, you can preview data. Select the Preview tab and use settings similar to the following to preview data:
- Click OK to finish creating the data source.
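For the preview, a simple query against one of the connector's built-in tables works well; for instance, list_projects (also shown in the SQL examples later in this article) returns the projects your account can access. A minimal sketch:
SELECT * FROM list_projects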
Read Google BigQuery data in Talend Studio
To read Google BigQuery data in Talend Studio, we'll need to complete several steps. Let's get through them all right away!
Create connection for input
- First of all, open Talend Studio.
- Create a new connection:
- Select Microsoft SQL Server connection:
- Name your connection:
- Fill in the connection parameters and then click Test connection (typical values are sketched after this list):
- If the List of modules not installed for this operation window shows up, download and install all of them. Review and accept all additional module license agreements during the process.
- Finally, you should see a successful connection test result at the end:
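ZappySys Data Gateway speaks the Microsoft SQL Server (TDS) wire protocol, which is why a Microsoft SQL Server connection type is used here. As a rough sketch, the connection values typically look like the following; the host and port are assumptions (5000 is the usual Data Gateway default, but confirm it in the Data Gateway UI), and the database is the data source name you created earlier:
Host:     localhost          -- machine where ZappySys Data Gateway runs
Port:     5000               -- assumed default; verify in the Data Gateway settings
Database: GoogleBigqueryDSN  -- the data source name created earlier
Login:    tdsuser            -- the gateway user created in the Users tab
In Talend this usually translates to a JDBC URL similar to jdbc:jtds:sqlserver://localhost:5000/GoogleBigqueryDSN (again an assumption; your host, port, and driver may differ).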
Add input
- Once we have a connection to ZappySys Data Gateway created, we can proceed by creating a job:
- Simply drag and drop the ZappySys Data Gateway connection onto the job:
- Then create an input based on the ZappySys Data Gateway connection:
- Continue by configuring a SQL query (see the example below) and clicking the Guess schema button:
- Finish by configuring the schema, for example:
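For reference, here is a minimal sketch of an input query; MyBQTable1 is an illustrative table name, so replace it with a table from your own dataset (built-in tables such as list_tables, shown later in this article, work too):
SELECT * FROM MyBQTable1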
Add output
We are ready to add an output. From the Palette, drag and drop a tFileOutputDelimited component and connect it to the input:
Run the job
Finally, run the job and integrate your Google BigQuery data:
Advanced topics
Create Custom Stored Procedure in ZappySys Driver
You can create procedures to encapsulate custom logic and then pass just a handful of parameters rather than a long SQL statement to execute your API call.
Steps to create a Custom Stored Procedure in the ZappySys Driver. You can insert Placeholders anywhere inside the Procedure Body. Read more about placeholders here.
- Go to the Custom Objects tab, click the Add button and select Add Procedure:
- Enter the desired procedure name and click OK:
- Select the created stored procedure, write your desired stored procedure body and save it; this creates the custom stored procedure in the ZappySys Driver. Here is an example stored procedure for the ZappySys Driver (you can insert Placeholders anywhere inside the Procedure Body; read more about placeholders here):
CREATE PROCEDURE [usp_get_orders] @fromdate = '<<yyyy-MM-dd,FUN_TODAY>>' AS SELECT * FROM Orders where OrderDate >= '<@fromdate>';
- That's it. Now go to the Preview tab and execute your stored procedure using the EXEC command. In this example it will extract the orders from the date 1996-01-01 onwards:
Exec usp_get_orders '1996-01-01';
- Let's generate the SQL Server query code to make the API call using the stored procedure. Go to the Code Generator tab, select SQL Server as the language and click the Generate button to generate the code. As we already created the linked server for this data source, you just need to copy the SELECT query and use your linked server name in place of the [MY_API_SERVICE] placeholder:
SELECT * FROM OPENQUERY([MY_API_SERVICE], 'EXEC usp_get_orders @fromdate=''1996-07-30''')
- Now go to SQL Server and execute that query; it will make the API call using the stored procedure and return the response.
Create Custom Virtual Table in ZappySys Driver
ZappySys API Drivers support a flexible query language, so you can override default properties you configured on the data source, such as URL and Body. This way you don't have to create multiple data sources if you want to read data from multiple endpoints. However, not every application supports supplying custom SQL to the driver, so sometimes you can only select a table from the list returned by the driver.
If you're dealing with Microsoft Access and need to import data from an SQL query, it's important to note that Access doesn't allow direct import of SQL queries. Instead, you can create custom objects (Virtual Tables) to handle the import process.
Many applications like MS Access or Informatica Designer won't give you the option to specify custom SQL when you import objects. In such cases a Virtual Table is very useful. You can create many Virtual Tables on the same data source (e.g. if you have 50 URLs with slight variations, you can create virtual tables with just the URL as a parameter setting).
- Go to the Custom Objects tab, click the Add button and select Add Table:
- Enter the desired table name and click OK:
- It will open a New Query window; click Cancel to close that window and go back to the Custom Objects tab.
- Select the created table, select Text Type AS SQL, write your desired SQL query and save it; this creates the custom table in the ZappySys Driver. Here is an example SQL query for the ZappySys Driver (you can insert Placeholders also; read more about placeholders here):
SELECT "ShipCountry", "OrderID", "CustomerID", "EmployeeID", "OrderDate", "RequiredDate", "ShippedDate", "ShipVia", "Freight", "ShipName", "ShipAddress", "ShipCity", "ShipRegion", "ShipPostalCode" FROM "Orders" Where "ShipCountry"='USA'
- That's it. Now go to the Preview tab and execute your custom virtual table query. In this example it will extract the orders for the USA shipping country only:
SELECT * FROM "vt__usa_orders_only"
- Let's generate the SQL Server query code to make the API call using the virtual table. Go to the Code Generator tab, select SQL Server as the language and click the Generate button to generate the code. As we already created the linked server for this data source, you just need to copy the SELECT query and use your linked server name in place of the [MY_API_SERVICE] placeholder:
SELECT * FROM OPENQUERY([MY_API_SERVICE], 'SELECT * FROM "vt__usa_orders_only"')
- Now go to SQL Server and execute that query; it will make the API call using the virtual table and return the response.
Actions supported by Google BigQuery Connector
Google BigQuery Connector supports the following actions for REST API integration, such as executing SQL statements and queries, listing projects, datasets and tables, creating and deleting datasets, deleting tables, reading table data, and making generic API requests. If some actions are not listed below, you can easily edit the Connector file and enhance the out-of-the-box functionality. The parameters used by these actions include:
- Query actions: SQL Statement (i.e. SELECT / DROP / CREATE), SQL Query, Filter, Use Legacy SQL Syntax?, timeout (Milliseconds), Job Location
- Project / dataset / table actions: ProjectId, DatasetId, TableId, Dataset Name, Description, Delete All Tables, SearchFilter, all
- Generic API request actions: Url, Body, IsMultiPart, Filter, Headers
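As a quick illustration of how these action parameters are supplied, built-in tables can be queried with a WITH clause that passes the parameters; a sketch based on the SQL examples later in this article (ProjectId and DatasetId values are illustrative):
SELECT * FROM list_tables
WITH(ProjectId='MyProjectId', DatasetId='MyDatasetId')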
Google BigQuery Connector Examples for Talend Studio Connection
This page offers a collection of SQL examples designed for seamless integration with the ZappySys API ODBC Driver under an ODBC Data Source (32/64 bit) or ZappySys Data Gateway, enhancing your ability to connect and interact with prebuilt connectors effectively.
Native Query (ServerSide): Query using Simple SQL [Read more...]
Server-side BigQuery SQL query example. Prefix the SQL with the word #DirectSQL to invoke the server-side engine (pass-through SQL). Queries a free dataset table (bigquery-public-data.samples.wikipedia):
#DirectSQL SELECT * FROM bigquery-public-data.samples.wikipedia LIMIT 1000 /* try your own dataset or Some FREE dataset like nyc-tlc.yellow.trips -- 3 parts ([Project.]Dataset.Table) */
Native Query (ServerSide): Query using Complex SQL [Read more...]
Server-side SQL query example for BigQuery. Prefix the SQL with the word #DirectSQL to invoke the server-side engine (pass-through SQL). Queries a free dataset table (bigquery-public-data.usa_names.usa_1910_2013):
#DirectSQL
SELECT name, gender, SUM(number) AS total
FROM bigquery-public-data.usa_names.usa_1910_2013
GROUP BY name, gender
ORDER BY total DESC
LIMIT 10
Native Query (ServerSide): Delete Multiple Records (Call DML) [Read more...]
This server-side SQL query example for BigQuery shows how to invoke a DELETE statement. To do that, prefix the SQL with the word #DirectSQL to invoke the server-side engine (pass-through SQL). Run it against a table in your own dataset:
#DirectSQL DELETE FROM TestDataset.MyTable Where Id > 5
Native Query (ServerSide): Query with CAST unix TIMESTAMP datatype column as datetime [Read more...]
This example shows how to query a unix TIMESTAMP column as DateTime. E.g. 73833719.524272 should be displayed as 1972-05-04 (or, with milliseconds, as 1972-05-04 1:21:59.524 PM); to achieve this, use the CAST function (you must use the #DirectSQL prefix):
#DirectSQL
SELECT id, col_timestamp, CAST(col_timestamp as DATE) AS timestamp_as_date, CAST(col_timestamp as DATETIME) AS timestamp_as_datetime
FROM MyProject.MyDataset.MyTable
LIMIT 10
Native Query (ServerSide): Create Table / Run Other DDL [Read more...]
Example of how to run a valid BigQuery DDL statement. Prefix the SQL with the word #DirectSQL to invoke the server-side engine (pass-through SQL):
#DirectSQL CREATE TABLE TestDataset.Table1 (ID INT64,Name STRING,BirthDate DATETIME, Active BOOL)
Native Query (ServerSide): UPDATE Table data for complex types (e.g. Nested RECORD, Geography, JSON) [Read more...]
Example of how to run a valid BigQuery DML statement (e.g. UPDATE / INSERT / DELETE). This use case shows how to update a record with complex data types such as RECORD (i.e. Array), Geography, JSON and more. Prefix the SQL with the word #DirectSQL to invoke the server-side engine (pass-through SQL):
#DirectSQL
Update TestDataset.DataTypeTest
Set ColTime='23:59:59.123456',
ColGeography=ST_GEOGPOINT(34.150480, -84.233870),
ColRecord=(1,"AA","Column3 data"),
ColBigNumeric=1222222222222222222.123456789123456789123456789123456789,
ColJson= JSON_ARRAY('{"doc":1, "values":[{"id":1},{"id":2}]}')
Where ColInteger=1
Native Query (ServerSide): DROP Table (if exists) / Other DDL [Read more...]
Example of how to run a valid BigQuery DDL statement. Prefix the SQL with the word #DirectSQL to invoke the server-side engine (pass-through SQL):
#DirectSQL DROP TABLE IF EXISTS Myproject.Mydataset.Mytable
Native Query (ServerSide): Call Stored Procedure [Read more...]
Example of how to run a BigQuery stored procedure and pass parameters. Assuming you created a valid stored proc called usp_GetData in TestDataset, call it like below:
#DirectSQL CALL TestDataset.usp_GetData(1)
INSERT Single Row [Read more...]
This sample shows how you can insert into BigQuery using the ZappySys query language. You can also use ProjectId='myproject-id' in the WITH clause:
INSERT INTO MyBQTable1(SomeBQCol1, SomeBQCol2) Values(1,'AAA')
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')
INSERT Multiple Rows from SQL Server [Read more...]
This example shows how to bulk insert into a Google BigQuery table from Microsoft SQL Server as an external source. Notice that the INSERT is missing the column list; it is provided by the source query, which must produce valid column names found in the target BigQuery table (you can use a SQL alias in the column name to produce matching names):
INSERT INTO MyBQTable1
SOURCE(
'MSSQL'
, 'Data Source=localhost;Initial Catalog=tempdb;Integrated Security=true'
, 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM SomeTable Where SomeCol=123'
)
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')
INSERT Multiple Rows from any ODBC Source (DSN) [Read more...]
This example shows how to bulk insert into a Google BigQuery table from any external ODBC source (assuming you have installed the ODBC driver and configured a DSN). Notice that the INSERT is missing the column list; it is provided by the source query, which must produce valid column names found in the target BigQuery table (you can use a SQL alias in the column name to produce matching names):
INSERT INTO MyBQTable1
SOURCE(
'ODBC'
, 'DSN=MyDsn'
, 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM SomeTable Where SomeCol=123'
)
WITH(DatasetId='TestDataset')
INSERT Multiple Rows from any JSON Files / API (Using ZappySys ODBC JSON Driver) [Read more...]
This example shows how to bulk insert into a Google BigQuery table from any external ODBC JSON API / file source (assuming you have installed the ZappySys ODBC Driver for JSON). Notice that the INSERT is missing the column list; it is provided by the source query, which must produce valid column names found in the target BigQuery table (you can use a SQL alias in the column name to produce matching names). You can use a similar approach to read from CSV or XML files; just use the CSV / XML driver rather than the JSON driver in the connection string. Refer to this page for more examples of JSON queries: https://zappysys.com/onlinehelp/odbc-powerpack/scr/json-odbc-driver-sql-query-examples.htm
INSERT INTO MyBQTable1
SOURCE(
'ODBC'
, 'Driver={ZappySys JSON Driver};Src=https://some-url/get-data'
, 'SELECT Col1 as SomeBQCol1,Col2 as SomeBQCol2 FROM _root_'
)
--WITH(DatasetId='TestDataset',Output='*')
--WITH(DatasetId='TestDataset',ProjectId='MyProjectId',Output='*')
List Projects [Read more...]
Lists the projects to which the user has access:
SELECT * FROM list_projects
List Datasets [Read more...]
Lists datasets for the specified project. If you do not specify ProjectId, connection-level details are used:
SELECT * FROM list_datasets
--WITH(ProjectId='MyProjectId')
List Tables [Read more...]
Lists tables for the specified project / dataset. If you do not specify ProjectId or DatasetId, connection-level details are used:
SELECT * FROM list_tables
--WITH(ProjectId='MyProjectId')
--WITH(ProjectId='MyProjectId',DatasetId='MyDatasetId')
Delete dataset [Read more...]
Deletes the dataset with the specified ID. If you would like to delete all tables under that dataset as well, set deleteContents='true' (see the sketch after this example):
SELECT * FROM delete_dataset WITH(DatasetId='MyDatasetId', deleteContents='False')
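Following the note above, here is the same call sketched with deleteContents='true' so that all tables inside the dataset are removed as well (DatasetId is illustrative):
SELECT * FROM delete_dataset WITH(DatasetId='MyDatasetId', deleteContents='True')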
Conclusion
In this article we discussed how to connect to Google BigQuery in Talend Studio and integrate data without any coding. Click here to download the Google BigQuery Connector for Talend Studio and try it yourself to see how easy it is. If you still have any questions, ask here or simply click on the live chat icon below and ask our expert (see the bottom-right corner of this page).
Download Google BigQuery Connector for Talend Studio
Documentation
More integrations
Other application integration scenarios for Google BigQuery
Other connectors for Talend Studio