ElasticSearch Connector for SAP Crystal Reports : Upsert documents via SQL

Integrate SAP Crystal Reports and ElasticSearch

Learn how to upsert documents using the ElasticSearch Connector for SAP Crystal Reports. This connector enables you to read and write Elasticsearch data effortlessly. Integrate, manage, and automate indexes and documents — almost no coding required. We'll walk you through the exact setup.

Let's dive in!

Create data source using ElasticSearch ODBC Driver

  1. Download and install ODBC PowerPack (if you haven't already).

  2. Search for odbc and open the ODBC Data Sources (64-bit):

    Open ODBC Data Source
  3. Create a User data source (User DSN) based on the ZappySys API Driver:

    ZappySys API Driver
    Create new User DSN for ZappySys API Driver
    • Create and use a User DSN if the client application runs under a User Account. This is the ideal option at design time (e.g., when developing in Visual Studio). Use it for both types of applications (64-bit and 32-bit).
    • Create and use a System DSN if the client application runs under a System Account (e.g., as a Windows Service). This is usually the required option in a production environment. If your Windows Service is a 32-bit application, you must use the 32-bit ODBC Data Source Administrator to configure it.
  4. When the Configuration window appears, give your data source a name if you haven't done so already. Then select "ElasticSearch" from the list of Popular Connectors. If "ElasticSearch" is not present in the list, click "Search Online" and download it, then set the path to the location where you downloaded it. Finally, click Continue >> to proceed with configuring the DSN:

    ElasticsearchDSN
    ElasticSearch
    ODBC DSN Template Selection
  5. Now it's time to configure the Connection Manager. Select Authentication Type, e.g. Token Authentication. Then select API Base URL (in most cases, the default one is the right one). More info is available in the Authentication section.

    ElasticSearch authentication

    For a local / self-hosted instance

    1. Get your user ID and password and enter them in the connection UI

    For a managed instance (hosted by Bonsai)

    If your instance is hosted by Bonsai, perform these steps to get your credentials for the API call:
    1. Go to https://app.bonsai.io/clusters/{your-instance-id}/tokens
    2. Copy the Access Key and Access Secret and enter them in the connection UI. Click Test Connection.
    3. If your cluster has no data, you can generate sample data by visiting this URL and clicking Add Sample Data: https://{your-cluster-id}.apps.bonsaisearch.net/app/home#/tutorial_directory
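    Whether the credentials come from a local install or from Bonsai, the Basic Authentication option sends them as a standard HTTP `Authorization: Basic` header. The sketch below shows how that header is built; `basic_auth_header` is a hypothetical helper for illustration, not part of the driver.

    ```python
    import base64

    def basic_auth_header(user, password):
        """Build the HTTP Basic Authorization header used by
        Basic Authentication (UserId/Password): base64 of "user:password"."""
        token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
        return {"Authorization": f"Basic {token}"}

    # With Bonsai, the Access Key acts as the user and the Access Secret as the password:
    headers = basic_auth_header("my-access-key", "my-access-secret")
    ```

    The same header format works for a self-hosted cluster secured with a user ID and password.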
    API Connection Manager configuration

    Just perform these simple steps to finish authentication configuration:

    1. Set Authentication Type to Basic Authentication (UserId/Password) [Http]
    2. Optional step: modify the API Base URL if needed (in most cases the default will work).
    3. Fill in all the required parameters and set optional parameters if needed.
    4. Finally, hit the OK button:
    ElasticsearchDSN
    ElasticSearch
    Basic Authentication (UserId/Password) [Http]
    http://localhost:9200
    Optional Parameters
    User Name (or Access Key)
    Password (or Access Secret)
    Ignore certificate related errors
    ODBC DSN HTTP Connection Configuration
    ElasticSearch authentication


    API Connection Manager configuration

    Just perform these simple steps to finish authentication configuration:

    1. Set Authentication Type to Windows Authentication (No Password) [Http]
    2. Optional step: modify the API Base URL if needed (in most cases the default will work).
    3. Fill in all the required parameters and set optional parameters if needed.
    4. Finally, hit the OK button:
    ElasticsearchDSN
    ElasticSearch
    Windows Authentication (No Password) [Http]
    http://localhost:9200
    Optional Parameters
    Ignore certificate related errors
    ODBC DSN HTTP Connection Configuration

  6. Then go to the Preview tab to start building a SQL query.

  7. Once you do that, proceed by opening Query Builder:

    ZappySys API Driver - ElasticSearch
    Read and write Elasticsearch data effortlessly. Integrate, manage, and automate indexes and documents — almost no coding required.
    ElasticsearchDSN
    Open Query Builder in API ODBC Driver to read and write data to REST API
  8. Then simply select the [Dynamic Table] table and Upsert operation.

  9. Continue by configuring the Required parameters. You can set Optional parameters too.

  10. Move on by hitting Preview Data button to preview the results.

  11. If you see the results you need, simply copy the generated query:

    [Dynamic Table]
    Upsert
    Required Parameters
    Index Select the value from the dropdown
    Optional Parameters
    Alias (Deprecated - Use Index instead)
    Advanced Properties
    RowHeaderFooterContinueOnError True
    RowHeader
    DoNotIndentArray True
    Ignore certificate related errors
    UPSERT INTO datatype_test (
        _id, 
    	binary_field,
    	boolean_field,
    	byte_field,
    	date_field,
    	double_field,
    	float_field,
    	geo_point_field,  --raw
    	--OR--
    	--"geo_point_field.lat",
    	--"geo_point_field.lon",
    	
    	geo_shape_field,  --raw
    	--OR--
    	--"geo_shape_field.type",
    	--"geo_shape_field.coordinates",
    	
    	integer_field,
    	ip_field,
    	keyword_field,
    	long_field,
    	nested_field, --raw
    
    	object_field, --raw
    	--OR--
    	--"object_field.field1",
    	--"object_field.field2",
    	
    	short_field,
    	text_field
    )
    VALUES(
        2, -- _id (Optional - if not supplied then it inserts with auto-generated _id)
    	'SGVsbG8gd29ybGQ=', --binary_field  --base64 value of "Hello world"
    	false, --bool
    	117, --byte_field
    	'2012-12-31T23:59:59.123', --date_field
    	1.123456789, --double_field
    	1.123456789, --float_field
    	--raw JSON must be in one line
    	'{ "lat": 40.7128, "lon": -74.0060 }', --geo_point_field
    	--OR--
    	-- 40.7128, -74.0060,
    	
    	'{ "type": "polygon", "coordinates": [[[-74.0060, 40.7128], [-73.9960, 40.7128], [-73.9960, 40.7028], [-74.0060, 40.7028], [-74.0060, 40.7128]]] }', --geo_shape_field
    	--OR--
    	--'polygon',
    	--'[[[-74.0060, 40.7128], [-73.9960, 40.7128], [-73.9960, 40.7028], [-74.0060, 40.7028], [-74.0060, 40.7128]]]',
    	
    	123, --integer_field
    	'127.0.0.1', --ip_field
    	'this is text', --keyword_field
    	1234567890, --long_field
    	--raw JSON must be in one line
    	'[{"nested_property_1":"nested text 1", "nested_property_2":100}, {"nested_property_1":"nested text 2", "nested_property_2":101}]', --nested_field
    	'{"field1":"A","field2":"B"}', --object_field (Raw Value)
    	--OR--
    	--'object field keyword 1', --object_field.field1
    	--123,                       --object_field.field2	
    	1, --short_field
    	'text field ' --text_field
    
    )
    Query Builder
  12. Click OK to use built SQL query and close the Query Builder.

  13. Now hit Preview Data button to preview the data using the generated SQL query. If you are satisfied with the result, use this query in SAP Crystal Reports:

    API ODBC Driver-based data source data preview
    You can also access data quickly from the tables dropdown by selecting <Select table>.
    A WHERE clause or LIMIT keyword is applied on the client side, meaning the whole result set is first retrieved from the ElasticSearch API and only then filtered. If possible, it is recommended to use parameters in Query Builder to filter the data on the server side (on the ElasticSearch servers).
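The client-side behavior described above can be illustrated with a short sketch: all rows come back from the API before WHERE/LIMIT run, so the work saved by server-side parameters grows with the result set. The rows and `client_side_filter` helper below are made up for illustration.

```python
def client_side_filter(all_rows, predicate, limit=None):
    """Mimic client-side WHERE/LIMIT: the full result set is already
    in memory before any filtering or truncation happens."""
    matched = [row for row in all_rows if predicate(row)]
    return matched if limit is None else matched[:limit]

# All four rows are fetched from the API even though only two survive the filter:
rows = [{"ShipCountry": c} for c in ("USA", "UK", "USA", "France")]
usa_only = client_side_filter(rows, lambda r: r["ShipCountry"] == "USA")
```

With server-side parameters, only the two matching rows would have crossed the wire in the first place.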

Let's not stop here: in the next steps we'll explore SQL query examples, including how to use them in stored procedures and views (virtual tables).

ElasticSearch SQL query examples

Use these SQL queries in your SAP Crystal Reports data source:

Upsert (Update or Insert) documents using various data types (supply _id)

This example shows how to update or insert a document with fields of various data types. _id is optional: if the _id column is supplied, an UPSERT (update or insert) is performed; if it is not, only an insert is performed (with an auto-generated _id). Some fields accept a raw JSON value (e.g. nested, object, geo_point, geo_shape). An object field can also accept values via nested fields (e.g. [object_field.field1]). Look at the Result column in the output to see whether the document was created or updated.

UPSERT INTO datatype_test (
    _id, 
	binary_field,
	boolean_field,
	byte_field,
	date_field,
	double_field,
	float_field,
	geo_point_field,  --raw
	--OR--
	--"geo_point_field.lat",
	--"geo_point_field.lon",
	
	geo_shape_field,  --raw
	--OR--
	--"geo_shape_field.type",
	--"geo_shape_field.coordinates",
	
	integer_field,
	ip_field,
	keyword_field,
	long_field,
	nested_field, --raw

	object_field, --raw
	--OR--
	--"object_field.field1",
	--"object_field.field2",
	
	short_field,
	text_field
)
VALUES(
    2, -- _id (Optional - if not supplied then it inserts with auto-generated _id)
	'SGVsbG8gd29ybGQ=', --binary_field  --base64 value of "Hello world"
	false, --bool
	117, --byte_field
	'2012-12-31T23:59:59.123', --date_field
	1.123456789, --double_field
	1.123456789, --float_field
	--raw JSON must be in one line
	'{ "lat": 40.7128, "lon": -74.0060 }', --geo_point_field
	--OR--
	-- 40.7128, -74.0060,
	
	'{ "type": "polygon", "coordinates": [[[-74.0060, 40.7128], [-73.9960, 40.7128], [-73.9960, 40.7028], [-74.0060, 40.7028], [-74.0060, 40.7128]]] }', --geo_shape_field
	--OR--
	--'polygon',
	--'[[[-74.0060, 40.7128], [-73.9960, 40.7128], [-73.9960, 40.7028], [-74.0060, 40.7028], [-74.0060, 40.7128]]]',
	
	123, --integer_field
	'127.0.0.1', --ip_field
	'this is text', --keyword_field
	1234567890, --long_field
	--raw JSON must be in one line
	'[{"nested_property_1":"nested text 1", "nested_property_2":100}, {"nested_property_1":"nested text 2", "nested_property_2":101}]', --nested_field
	'{"field1":"A","field2":"B"}', --object_field (Raw Value)
	--OR--
	--'object field keyword 1', --object_field.field1
	--123,                       --object_field.field2	
	1, --short_field
	'text field ' --text_field

)
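Under the hood, a write like this maps naturally onto Elasticsearch's document index API: with an explicit _id the request is a PUT to /index/_doc/{id} (create or replace), and without one it is a POST to /index/_doc (auto-generated _id). The sketch below shows that routing; `index_request` is a hypothetical helper for illustration, and the driver's actual requests may differ.

```python
import base64
import json

def index_request(index, doc, doc_id=None):
    """Pick the Elasticsearch document API call for an upsert-style write:
    PUT /index/_doc/{id} when _id is supplied (create or replace),
    POST /index/_doc when it is not (auto-generated _id)."""
    if doc_id is not None:
        return "PUT", f"/{index}/_doc/{doc_id}", json.dumps(doc)
    return "POST", f"/{index}/_doc", json.dumps(doc)

# The binary_field in the query above is just base64 text, e.g. "Hello world":
doc = {"binary_field": base64.b64encode(b"Hello world").decode("ascii"),
       "keyword_field": "this is text"}
method, path, body = index_request("datatype_test", doc, doc_id=2)
```

This also explains the _id rule in the example: supplying _id targets a specific document (so it can be updated), while omitting it always creates a new one.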


The upsert_documents endpoint belongs to the [Dynamic Table] table and can therefore be used via that table.

Create SQL view in ODBC data source

ZappySys API Drivers support a flexible query language, so you can override the default properties you configured on the data source, such as the URL or Body. This way you don't have to create multiple data sources if you want to read data from multiple endpoints. However, not every application supports supplying custom SQL to the driver; some only let you select a table from the list returned by the driver.

If you're dealing with Microsoft Access and need to import data from an SQL query, it's important to note that Access doesn't allow direct import of SQL queries. Instead, you can create custom objects (Virtual Tables) to handle the import process.

Many applications, like MS Access and Informatica Designer, won't give you the option to specify custom SQL when you import objects. In such cases a virtual table is very useful. You can create many virtual tables on the same data source (e.g., if you have 50 URLs with slight variations, you can create virtual tables with just the URL as a parameter setting).

  1. Go to the Custom Objects tab, click the Add button, and select Add Table:
    ZappySys Driver - Add Table

  2. Enter the desired table name and click OK:
    ZappySys Driver - Add Table Name

  3. It will open the New Query window. Click Cancel to close it and go back to the Custom Objects tab.

  4. Select the created table, select Text Type as SQL, write your desired SQL query, and save it. This creates the custom table in the ZappySys Driver:
    Here is an example SQL query for the ZappySys Driver. You can also insert placeholders. Read more about placeholders here

    SELECT
      "ShipCountry",
      "OrderID",
      "CustomerID",
      "EmployeeID",
      "OrderDate",
      "RequiredDate",
      "ShippedDate",
      "ShipVia",
      "Freight",
      "ShipName",
      "ShipAddress",
      "ShipCity",
      "ShipRegion",
      "ShipPostalCode"
    FROM "Orders"
    WHERE "ShipCountry"='USA'

    ZappySys Driver - Create Custom Table
  5. That's it! Now go to the Preview tab and execute your custom virtual table query. In this example it extracts the orders for the USA shipping country only:

    SELECT * FROM "vt__usa_orders_only"
    ZappySys Driver - Execute Custom Virtual Table Query
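Conceptually, a virtual table is just a saved name-to-SQL mapping that the driver exposes in its table list: when a client selects from the virtual table, the driver runs the saved SQL instead. The registry and `resolve` helper below are hypothetical and only illustrate the idea; the driver's real storage and parsing are its own.

```python
# Hypothetical registry pairing a virtual table name with its saved SQL text.
virtual_tables = {
    "vt__usa_orders_only": "SELECT * FROM \"Orders\" WHERE \"ShipCountry\"='USA'",
}

def resolve(sql):
    """If the incoming query targets a known virtual table,
    substitute the saved SQL; otherwise pass the query through."""
    for name, saved_sql in virtual_tables.items():
        if f'"{name}"' in sql:
            return saved_sql
    return sql

query = resolve('SELECT * FROM "vt__usa_orders_only"')
```

This is why virtual tables work in applications that only let you pick a table name: the custom SQL travels with the name.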

Upsert documents in SAP Crystal Reports via SQL view

  1. First of all, open SAP Crystal Reports and create a new Crystal Report.

    Create New Crystal Report
  2. It will open the new data source selection window. Under ODBC (RDO), double-click Make New Connection and select the desired ODBC DSN; in our case we select ElasticsearchDSN, which we created in the section above. Then click Next.

    ElasticsearchDSN
    SAP Crystal Report - Select ODBC ElasticsearchDSN DSN

  3. Expand the connection, select the desired table(s) or view(s) under the data node, click the Add > button, and click Next.

    ElasticsearchDSN
    ElasticsearchDSN
    SAP Crystal Report - Add ODBC ElasticsearchDSN DSN Tables
  4. Add the desired fields to display in the report. Here we add all fields, then click Finish.

    SAP Crystal Report - Add Display Fields
  5. That's it! You will now be able to load the data in the report.

    SAP Crystal Report - ElasticsearchDSN Data Output

Advanced topics

Creating SQL stored procedures

You can create procedures to encapsulate custom logic and then pass just a handful of parameters, rather than a long SQL statement, to execute your API call.

Here are the steps to create a custom stored procedure in the ZappySys Driver. You can insert placeholders anywhere inside the procedure body. Read more about placeholders here

  1. Go to the Custom Objects tab, click the Add button, and select Add Procedure:
    ZappySys Driver - Add Stored Procedure

  2. Enter the desired procedure name and click OK:
    ZappySys Driver - Add Stored Procedure Name

  3. Select the created stored procedure, write your desired procedure body, and save it. This creates the custom stored procedure in the ZappySys Driver. Here is an example stored procedure for the ZappySys Driver; you can insert placeholders anywhere inside the procedure body (read more about placeholders here):

    CREATE PROCEDURE [usp_get_orders]
        @fromdate = '<<yyyy-MM-dd,FUN_TODAY>>'
     AS
        SELECT * FROM Orders where OrderDate >= '<@fromdate>';
    
    ZappySys Driver - Create Custom Stored Procedure
  4. That's it! Now go to the Preview tab and execute your stored procedure using the EXEC command. In this example it extracts the orders from 1996-01-01 onward:

    Exec usp_get_orders '1996-01-01';
    ZappySys Driver - Execute Custom Stored Procedure
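The procedure above works by placeholder substitution: the supplied parameter value replaces <@fromdate> in the body before the query runs. The sketch below shows that substitution in miniature; `expand_placeholders` is a hypothetical helper, and the driver's real placeholder engine (including defaults like FUN_TODAY) is more capable.

```python
import re

def expand_placeholders(body, params):
    """Replace <@name> placeholders in a procedure body with supplied
    parameter values; unknown placeholders are left untouched."""
    def substitute(match):
        name = match.group(1)
        return params.get(name, match.group(0))
    return re.sub(r"<@(\w+)>", substitute, body)

body = "SELECT * FROM Orders where OrderDate >= '<@fromdate>';"
sql = expand_placeholders(body, {"fromdate": "1996-01-01"})
```

Running `Exec usp_get_orders '1996-01-01';` effectively performs this expansion and then executes the resulting SELECT.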

Conclusion

And there you have it: a complete guide on how to upsert documents in SAP Crystal Reports without writing complex code. All of this was powered by the ElasticSearch ODBC Driver, which handled REST API pagination and authentication for us automatically.

Download the trial now or ping us via chat if you have any questions or are looking for a specific feature (you can also reach out to us by submitting a ticket):

More actions supported by ElasticSearch Connector

Got another use case in mind? We've documented the exact setups for a variety of essential ElasticSearch operations directly in SAP Crystal Reports, so you can skip the trial and error. Find your next step-by-step guide below:
