Quick Start Guide 101
Introduction to Streaming Integration
The Streaming Integrator is one of the integrators in WSO2 Enterprise Integrator. It reads streaming data from files, cloud-based applications, streaming applications, and databases, and processes them. It also allows downstream applications to access streaming data by publishing information in a streaming manner. It can analyze streaming data, identify trends and patterns, and trigger integration flows.
The purpose of this guide is to help you understand the basic functions of the Streaming Integrator in 30 minutes.
To learn how to use the key functions of the Streaming Integrator, consider a laboratory that is observing the temperature of a range of rooms in a building via a sensor and needs to use the temperature readings as the input to derive other information.
Before you begin:
Creating your first Siddhi application
Create a basic Siddhi application for a simple use case.
Extract the Streaming Integrator Tooling pack to a preferred location. Hereafter, the extracted location is referred to as `<SI_TOOLING_HOME>`.
Navigate to the `<SI_TOOLING_HOME>/bin` directory and issue the command appropriate for your operating system to start Streaming Integrator Tooling.
Access Streaming Integrator Tooling via the default URL in your web browser.
Streaming Integrator Tooling opens as shown below.
Open a new Siddhi file by clicking New.
The new file opens as follows.
Specify a name for the new Siddhi application via the `@App:name` annotation, and a description via the `@App:description` annotation:
```
@App:name("TemperatureApp")
@App:description("This application captures the room temperature and analyzes it, and presents the results as logs in the output console.")
```
The details to be captured are the room ID, device ID, and the temperature. To specify this, define an input stream with attributes to capture each of these details.
```
define stream TempStream (roomNo string, deviceID long, temp double);
```
The technicians need to know the average temperature with each new temperature reading. To publish this information, define an output stream including these details as attributes in the schema.
```
define stream AverageTempStream (roomNo string, deviceID long, avgTemp double);
```
The average temperature needs to be logged. Therefore, connect a sink of the `log` type to the output stream as shown below.
```
@sink(type = 'log', @map(type = 'passThrough'))
define stream AverageTempStream (roomNo string, deviceID long, avgTemp double);
```
`passThrough` is specified as the mapping type because, in this scenario, the attributes are published exactly as they are defined in the stream and do not need to be mapped.
To get the input events, calculate the average temperature and direct the results to the output stream, add a query below the stream definitions as follows:
To name the query, add the `@info` annotation and enter `CalculateAvgTemp` as the query name.
```
@info(name = 'CalculateAvgTemp')
```
To indicate that the input is taken from the `TempStream` stream, add the clause `from TempStream`.
Specify how the values for the output stream attributes are derived by adding a `select` clause as follows.

```
select roomNo, deviceID, avg(temp) as avgTemp
```
To insert the results into the output stream, add the `insert into` clause as follows.

```
insert into AverageTempStream;
```
The completed Siddhi application is as follows:
```
@App:name('TemperatureApp')
@App:description('This application captures the room temperature and analyzes it, and presents the results as logs in the output console.')

define stream TempStream (roomNo string, deviceID long, temp double);

@sink(type = 'log', @map(type = 'passThrough'))
define stream AverageTempStream (roomNo string, deviceID long, avgTemp double);

@info(name = 'CalculateAvgTemp')
from TempStream
select roomNo, deviceID, avg(temp) as avgTemp
insert into AverageTempStream;
```
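To see what this query computes, the following is a minimal Python sketch (the event values are hypothetical, and this is an illustration of the semantics, not Siddhi itself) of the running average that `avg(temp)` maintains: with no window attached to the stream, the aggregate runs over all events received so far, and one output event is emitted per input event, which the log sink then prints.

```python
def running_average(events):
    """Yield one (roomNo, deviceID, avgTemp) output event per input event,
    where avgTemp is the average over all temperatures received so far."""
    total = 0.0
    count = 0
    for room_no, device_id, temp in events:
        total += temp
        count += 1
        yield (room_no, device_id, total / count)

# Hypothetical sensor readings: (roomNo, deviceID, temp).
events = [
    ("NW106", 262626367171371717, 26.0),
    ("NW106", 262626367171371717, 28.0),
    ("NW107", 262626367171371718, 30.0),
]

for output_event in running_average(events):
    print(output_event)
```

For example, after the second reading the average is (26.0 + 28.0) / 2 = 27.0, so the second output event carries `avgTemp` 27.0.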
Testing your Siddhi application
The application you created needs to be tested before you use it to process the actual data received. You can test it in the following ways:
To simulate events for the Siddhi application, you can use the event simulator available within Streaming Integrator Tooling as explained in the procedure below.
In the Streaming Integrator Tooling, click the following icon for event simulation on the side panel.
The Simulation panel opens as shown below.
In the Single Simulation tab of the simulation panel, select TemperatureApp from the list for the Siddhi App Name field.
You need to send events to the input stream. Therefore, select TempStream in the Stream Name field. As a result, the attribute list appears in the simulation panel.
Then enter values for the attributes as follows:
| Attribute | Value |
|-----------|-------|
| roomNo | NW106 |
| deviceID | 262626367171371717 |
| temp | 26 |
Click Start and Send.
The output is logged in the console as follows:
To debug your Siddhi application, you need to mark debug points, and then simulate events as you did in the previous section. The complete procedure is as follows:
- Open the TemperatureApp.siddhi file.
To run the Siddhi application in the debug mode, click Run => Debug, or click the following icon for debugging.
As a result, the Debug console opens in a new tab below the Siddhi application as follows.
Apply debug points in the lines with the `insert into` clauses. To mark a debug point, click to the left of the required line number so that it is marked with a dot as shown in the image below.
You can only mark lines with `from` or `insert into` clauses as debug points.
- Now simulate a single event the same way you simulated it in the previous section, with the following values.
| Attribute | Value |
|-----------|-------|
| roomNo | NW106 |
| deviceID | 262626367171371717 |
| temp | 26 |
Click Send to send the event.
When a debug point is hit, the line marked as a debug point is highlighted as shown below. The status of the event and the query is displayed in the debug console below the Siddhi application.
Deploying Siddhi applications
After creating and testing the TemperatureApp Siddhi application, you can deploy it in the Streaming Integrator server, export it as a Docker image, or deploy it in Kubernetes.
Deploying in Streaming Integrator server
To deploy your Siddhi application in the Streaming Integrator server, follow the procedure below:
To deploy the Siddhi application, you need to run both the Streaming Integrator server and Streaming Integrator Tooling. The home directory of the Streaming Integrator server is referred to as `<SI_HOME>`, and the home directory of Streaming Integrator Tooling is referred to as `<SI_TOOLING_HOME>`.
Start the Streaming Integrator server by navigating to the `<SI_HOME>/bin` directory from the CLI, and issuing the startup command appropriate for your operating system (Windows, or Linux/Mac OS).
In the Streaming Integrator Tooling, click Deploy and then click Deploy to Server.
The Deploy Siddhi Apps to Server dialog box opens as follows.
In the Add New Server section, enter information as follows:
| Field | Value |
|-------|-------|
| Host | Your host |
| Port | Your port |
Then click Add.
Select the check boxes for the TemperatureApp.siddhi Siddhi application and the server you added as shown below.
As a result, the TemperatureApp Siddhi application is saved in the `<SI_HOME>/deployment/siddhi-files` directory, and the following message is displayed in the dialog box.
Deploying in Docker
To export the TemperatureApp Siddhi application as a Docker artifact, follow the procedure below:
Open the Streaming Integrator Tooling.
Click Export in the top menu, and then click For Docker.
As a result, Step 1 of the Export Siddhi Apps for Docker image wizard opens as follows.
Select the TemperatureApp.siddhi check box and click Next.
In Step 2, you can template values of the Siddhi Application.
Click Next without templating any value of the Siddhi application.
For detailed information about templating the values of a Siddhi Application, see Exporting Siddhi Apps for Docker Image.
In Step 3, you can update configurations of the Streaming Integrator.
Leave the default configurations, and click Next.
In Step 4, you can provide arguments for the values that were templated in Step 2.
There are no values to be configured because you did not template any values in Step 2. Therefore click Next.
In Step 5, you can choose additional dependencies to be bundled. This is applicable when sources, sinks, or other extensions with additional dependencies are used in the Siddhi application (e.g., a Kafka source/sink, or a MongoDB store). In this scenario, there are no such dependencies; therefore, no additional JARs are shown.
Click Export. The Siddhi application is exported as a Docker artifact in a zip file to the default location in your machine, based on your operating system and browser settings.
Extending the Streaming Integrator
The Streaming Integrator is shipped with most of the available Siddhi extensions by default. If a Siddhi extension you require is not shipped by default, you can download and install it.
In this scenario, let's assume that the laboratory requires the siddhi-execution-extrema extension to carry out more advanced calculations for different types of time windows. To download and install it, follow the procedure below:
Open the Siddhi Extensions page. The available Siddhi extensions are displayed as follows.
Search for the siddhi-execution-extrema extension.
Click V4.1.1 for this scenario. As a result, the following page opens.
To download the extension, click Download Extension. Then enter your email address in the dialog box that appears, and click Submit. As a result, a JAR file is downloaded into a location on your machine (the location depends on your browser settings).
To install the siddhi-execution-extrema extension in your Streaming Integrator, place the JAR file you downloaded in the relevant directory of the Streaming Integrator server.
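As a rough illustration of the kind of calculation an extension like this performs, the following is a minimal Python sketch of a maximum computed over a sliding window of the last N readings. This is a conceptual illustration only; the function name, window size, and readings are hypothetical and do not reflect the extension's actual API.

```python
from collections import deque

def sliding_max(temps, window_size=3):
    """Yield the maximum temperature over the last `window_size` readings."""
    window = deque(maxlen=window_size)  # oldest reading is dropped automatically
    for temp in temps:
        window.append(temp)
        yield max(window)

# Hypothetical temperature readings.
readings = [26.0, 28.0, 27.0, 31.0, 25.0]
print(list(sliding_max(readings)))  # [26.0, 28.0, 28.0, 31.0, 31.0]
```

Note how 31.0 remains the maximum after 25.0 arrives, because 31.0 is still inside the three-reading window.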
- For a quicker demonstration of the Streaming Integrator, see Getting Started with the Streaming Integrator in Five Minutes.
- For a quick guide on how the Streaming Integrator works with the Micro Integrator to trigger integration flows, see Getting SI Running with MI in 5 Minutes.
- To get the Streaming Integrator running with Docker in five minutes, see Getting SI Running with Docker in 5 Minutes.
- To get the Streaming Integrator running in a Kubernetes cluster in five minutes, see Getting SI Running with Kubernetes in 5 Minutes.