Apache Spark Connector for Talend Studio

Apache Spark Connector lets you connect to Apache Spark, a unified engine for large-scale data analytics.

In this article you will learn how to quickly and efficiently integrate Apache Spark data in Talend Studio without coding. We will use the high-performance Apache Spark Connector to easily connect to Apache Spark and then access the data inside Talend Studio.

Let's follow the steps below to see how we can accomplish that!


Prerequisites

Before we begin, make sure you meet the following prerequisite:

  • A Java Runtime Environment (JRE) is installed on the machine running ZappySys Data Gateway (the JDBC Bridge needs Java to load the Hive JDBC driver).

If you already have a JRE installed, you can try using it too. However, if you experience any issues, we recommend installing a separate JRE (you can install an additional JRE next to the existing one; just don't forget to configure the default Java in the Windows Environment Variables).

Download Apache Spark JDBC driver

To connect to Apache Spark in Talend Studio, you will have to download a JDBC driver for it, which we will use in later steps. It is recommended to use a JDBC driver compiled for Java 8, if possible. Let's perform these little steps right away:

  1. Visit MVN Repository.
  2. Download the standalone version of the Apache Hive JDBC driver (a commonly used driver for connecting to Spark), e.g. hive-jdbc-4.0.1-standalone.jar. The standalone JAR avoids Java library dependency errors.
  3. Save it locally, e.g. to D:\Drivers\JDBC\hive-jdbc-standalone.jar.
  4. Done! That was easy, wasn't it? Let's proceed to the next step.

Create Data Source in ZappySys Data Gateway based on JDBC Bridge Driver

  1. Download and install ODBC PowerPack.

  2. Search for gateway in the Start menu and open ZappySys Data Gateway:

    Open ZappySys Data Gateway
  3. Go to the Users tab to add our first Gateway user. Click Add, give it a name (e.g., tdsuser), and enter a password of your choice. Check the Admin option and click OK to save. We will use these credentials later when we create the connection in Talend Studio:

    ZappySys Data Gateway - Add User
  4. Now we are ready to add a data source. Click Add, give the data source a name (e.g., ApacheSparkDSN; copy this name somewhere, we will need it later), and then select Native - ZappySys JDBC Bridge Driver. Finally, click OK. The data source will be created and the driver configuration UI will open:

    ZappySys Data Gateway - ZappySys JDBC Bridge Driver
  5. Now we need to configure the JDBC connection in the new data source. Simply enter the connection string and credentials, configure other settings, and then click the Test Connection button to test the connection:

    JDBC-ODBC Bridge driver data source settings

    Use these values when setting parameters:

    • Connection string: jdbc:hive2://spark-thrift-server-host:10000
    • JDBC driver file(s): D:\Drivers\JDBC\hive-jdbc-standalone.jar
    • Connection parameters: []

  6. You should see a message saying that the connection test is successful:

    ODBC connection test is successful

    Otherwise, if you are getting an error, check out our Community for troubleshooting tips.

  7. We are at the point where we can preview a SQL query. For more SQL query examples, visit the JDBC Bridge documentation (a couple of additional query sketches also follow this list):

    -- Basic SELECT with a WHERE clause
    SELECT
        id,
        name,
        salary
    FROM employees
    WHERE department = 'Sales';
    JDBC ODBC Bridge data source preview
    You can also click on the <Select Table> dropdown and select a table from the list.

    The ZappySys JDBC Bridge Driver acts as a transparent intermediary, passing SQL queries directly to the underlying Apache Hive JDBC driver, which then handles the query execution against the Spark Thrift Server. This means the Bridge Driver simply relays the SQL query without altering it.

    Some JDBC drivers don't support INSERT/UPDATE/DELETE statements, so you may get an error saying "action is not supported" or a similar one. Please be aware that this is not a limitation of the ZappySys JDBC Bridge Driver, but of the specific JDBC driver you are using.

  8. Click OK to finish creating the data source.
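
Before moving on to Talend Studio, you may want to try a few more queries in the preview window from step 7. The sketch below is only an illustration: the employees and departments tables and their columns are assumptions, not objects that necessarily exist in your Spark catalog, so substitute your own table names. Because the Bridge Driver relays SQL unchanged, any syntax your Spark Thrift Server accepts (Spark SQL / HiveQL) should work here.

    -- Aggregation example (table and column names are illustrative)
    SELECT
        department,
        COUNT(*)    AS employee_count,
        AVG(salary) AS avg_salary
    FROM employees
    GROUP BY department
    ORDER BY avg_salary DESC
    LIMIT 10;

    -- Join example (both tables are hypothetical)
    SELECT
        e.name,
        d.location
    FROM employees e
    JOIN departments d
        ON e.department_id = d.id
    WHERE d.location = 'London';

As noted above, if an INSERT, UPDATE, or DELETE fails, the error comes from the underlying JDBC driver, not from the Bridge.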

Read Apache Spark data in Talend Studio

To read Apache Spark data in Talend Studio, we'll need to complete several steps. Let's get through them all right away!

This article is compatible with Talend Open Studio (a free version, now retired by Qlik). If you don't have it, you can purchase a commercial version of Talend Studio from Qlik.

Create connection for input

  1. First of all, open Talend Studio.
  2. Create a new connection: Creating a new connection in Talend Studio
  3. Select Microsoft SQL Server connection: Creating SQL Server connection in Talend Studio
  4. Name your connection: Naming a connection in Talend Studio
  5. Fill in the connection parameters, pointing the connection at the ZappySys Data Gateway (use the gateway user we created earlier and the data source name, e.g. ApacheSparkDSN), and then click Test connection:
    Configuring the ZappySys Data Gateway connection in Talend Studio
  6. If the List of modules not installed for this operation window shows up, then download and install all of them: Configure the connection
    Review and accept all additional module license agreements during the process
  7. Finally, you should see a successful connection test result at the end: Connection test successful

Add input

  1. Once we have a connection to ZappySys Data Gateway created, we can proceed by creating a job: Create a job in Talend Studio
  2. Simply drag and drop ZappySys Data Gateway connection onto the job: Creating an input based on ZappySys Data Gateway connection
  3. Then create an input based on ZappySys Data Gateway connection: Creating an input based on ZappySys Data Gateway connection
  4. Continue by configuring a SQL query (for example, the sketch after this list) and click the Guess schema button: Configuring a SQL query in Talend Studio
  5. Finish by configuring the schema, for example: Configuring a schema in Talend Studio
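
For reference, the statement below is the kind of query you might paste into the input component in step 4 before clicking Guess schema. It is a hedged sketch: the employees table and its columns are assumptions carried over from the earlier preview example, so adjust them to your own data. Listing columns explicitly (rather than SELECT *) tends to make the guessed schema more predictable.

    -- Illustrative input query (table and columns are assumptions)
    SELECT
        id,
        name,
        salary,
        department
    FROM employees
    WHERE department = 'Sales';

Guess schema should derive the column names and types from the result set metadata returned through the Data Gateway; review the generated types and lengths before saving the schema.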

Add output

We are ready to add an output. From the Palette, drag and drop a tFileOutputDelimited component and connect it to the input: Connecting tFileOutputDelimited output in Talend Studio

Run the job

Finally, run the job and integrate your Apache Spark data: Integrating Apache Spark data in Talend Studio

Conclusion

In this article we showed you how to connect to Apache Spark in Talend Studio and integrate data without any coding, saving you time and effort. It's worth noting that the ZappySys JDBC Bridge Driver allows you to connect not only to Apache Spark, but to any data source that provides a JDBC driver (just use a different JDBC driver and configure it appropriately).

We encourage you to download the Apache Spark Connector for Talend Studio and see how easy it is to use for yourself or your team.

If you have any questions, feel free to contact ZappySys support team. You can also open a live chat immediately by clicking on the chat icon below.

