
Quick Start Guide 101

Introduction to Streaming Integration

The Streaming Integrator is one of the integrators in WSO2 Enterprise Integrator. It reads streaming data from files, cloud-based applications, streaming applications, and databases, and processes them. It also allows downstream applications to access streaming data by publishing information in a streaming manner. It can analyze streaming data, identify trends and patterns, and trigger integration flows.

The purpose of this guide is to help you understand the basic functions of the Streaming Integrator in 30 minutes.

To learn how to use the key functions of the Streaming Integrator, consider a laboratory that is observing the temperature of a range of rooms in a building via a sensor and needs to use the temperature readings as the input to derive other information.

Before you begin:

Creating your first Siddhi application

Create a basic Siddhi application for a simple use case.

  1. Extract the Streaming Integrator Tooling pack to a preferred location. Hereafter, the extracted location is referred to as <SI_TOOLING_HOME>.

  2. Navigate to the <SI_TOOLING_HOME>/bin directory and issue the appropriate command depending on your operating system to start the Streaming Integrator Tooling.

    • For Windows: tooling.bat

    • For Linux/MacOS: ./tooling.sh

  3. Access the Streaming Integrator Tooling via the http://<HOST_NAME>:<TOOLING_PORT>/editor URL.


    The default URL is http://localhost:9390/editor.

The Streaming Integrator Tooling opens as shown below.

Welcome Page

  1. Open a new Siddhi file by clicking New.

    The new file opens as follows.

    New Siddhi File

  2. Specify a name for the new Siddhi application via the @App:name annotation, and a description via the @App:description annotation.

    @App:name("TemperatureApp")
    @App:description("This application captures the room temperature and analyzes it, and presents the results as logs in the output console.")
  3. The details to be captured include the room ID, device ID, and the temperature. To specify this, define an input stream with attributes to capture each of these details.

    define stream TempStream (roomNo string, deviceID long, temp double);

  4. The technicians need to know the average temperature with each new temperature reading. To publish this information, define an output stream including these details as attributes in the schema.

    define stream AverageTempStream (roomNo string, deviceID long, avgTemp double);

  5. The average temperature needs to be logged. Therefore, connect a sink of the log type to the output stream as shown below.

    @sink(type = 'log', 
        @map(type = 'passThrough'))
    define stream AverageTempStream (roomNo string, deviceID long, avgTemp double);

    passThrough is specified as the mapping type because, in this scenario, the events are published with the attributes exactly as they are defined in the stream, so no mapping is needed.
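    For comparison, if a downstream consumer expected a different format, you could specify another mapping type instead of passThrough. For example, a JSON mapping (shown here only as an illustration; the rest of this guide keeps passThrough) would make the log sink emit each event as a JSON payload:

    ```
    @sink(type = 'log',
        @map(type = 'json'))
    define stream AverageTempStream (roomNo string, deviceID long, avgTemp double);
    ```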

  6. To get the input events, calculate the average temperature and direct the results to the output stream, add a query below the stream definitions as follows:

    1. To name the query, add the @info annotation and enter CalculateAvgTemp as the query name.

      @info(name = 'CalculateAvgTemp')

    2. To indicate that the input is taken from the TempStream stream, add the from clause as follows:

      from TempStream

    3. Specify how the values for the output stream attributes are derived by adding a select clause as follows.

      select roomNo, deviceID, avg(temp) as avgTemp

    4. To insert the results into the output stream, add the insert into clause as follows.

      insert into AverageTempStream;

The completed Siddhi application is as follows:

@App:name('TemperatureApp')
@App:description('This application captures the room temperature and analyzes it, and presents the results as logs in the output console.')
define stream TempStream (roomNo string, deviceID long, temp double);
@sink(type = 'log', 
    @map(type = 'passThrough'))
define stream AverageTempStream (roomNo string, deviceID long, avgTemp double);

@info(name = 'CalculateAvgTemp')
from TempStream 
select roomNo, deviceID, avg(temp) as avgTemp 
insert into AverageTempStream;
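Because the CalculateAvgTemp query defines no window, avg(temp) behaves as a running aggregate: each arriving event produces one output event carrying the average of all temperatures received so far. As a rough illustration of that behavior (a plain Python sketch, not the Siddhi engine; the names mirror the application above):

```python
# Sketch of a per-event running average, mirroring avg(temp) with no window:
# every input event yields one output event with the cumulative average.
def calculate_avg_temp(events):
    """events: iterable of (roomNo, deviceID, temp) -> list of output events."""
    total = 0.0
    count = 0
    output = []
    for room_no, device_id, temp in events:
        total += temp
        count += 1
        output.append((room_no, device_id, total / count))
    return output

readings = [("NW106", 262626367171371717, 26.0),
            ("NW106", 262626367171371717, 28.0),
            ("NW106", 262626367171371717, 30.0)]
print(calculate_avg_temp(readings))  # averages: 26.0, 27.0, 28.0
```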

Testing your Siddhi application

The application you created needs to be tested before you use it to process actual data. You can test it by simulating events as follows:

  1. In the Streaming Integrator Tooling, click the following icon for event simulation on the side panel.

    Event Simulator icon

The Simulation panel opens as shown below.

Simulation Panel

  2. In the Single Simulation tab of the simulation panel, select TemperatureApp from the list for the Siddhi App Name field.

  3. You need to send events to the input stream. Therefore, select TempStream in the Stream Name field. As a result, the attribute list appears in the simulation panel.

  4. Then enter values for the attributes as follows:

    | Attribute | Value              |
    |-----------|--------------------|
    | roomNo    | NW106              |
    | deviceID  | 262626367171371717 |
    | temp      | 26                 |

    Single Simulation

  5. Click Start and Send.

The output is logged in the console as follows: Console Log

Deploying Siddhi applications

After creating and testing the TemperatureApp Siddhi application, you can deploy it in the Streaming Integrator server, export it as a Docker image, or deploy it in Kubernetes.

Deploying in Streaming Integrator server

To deploy your Siddhi application in the Streaming Integrator server, follow the procedure below:


To deploy the Siddhi application, you need to run both the Streaming Integrator server and Streaming Integrator Tooling. The home directory of the Streaming Integrator server is referred to as <SI_HOME>, and the home directory of Streaming Integrator Tooling is referred to as <SI_TOOLING_HOME>.

  1. Start the Streaming Integrator server by navigating to the <SI_HOME>/bin directory from the CLI, and issuing the appropriate command based on your operating system:

    • For Windows: server.bat --run

    • For Linux/Mac OS: ./server.sh

  2. In the Streaming Integrator Tooling, click Deploy and then click Deploy to Server.

    Deploy to Server Menu Option

    The Deploy Siddhi Apps to Server dialog box opens as follows.

    Deploy Siddhi Apps to Server

  3. In the Add New Server section, enter information as follows:

    | Field     | Value     |
    |-----------|-----------|
    | Host      | Your host |
    | Port      | 9443      |
    | User Name | admin     |
    | Password  | admin     |

    Add Server

    Then click Add.

  4. Select the check boxes for the TemperatureApp.siddhi Siddhi application and the server you added as shown below.

    Deploy Siddhi Apps to Server

  5. Click Deploy.

    As a result, the TemperatureApp Siddhi application is saved in the <SI_HOME>/deployment/siddhi-files directory, and the following message is displayed in the dialog box.

    Siddhi App successfully deployed

Deploying in Docker

To export the TemperatureApp Siddhi application as a Docker artifact, follow the procedure below:

  1. Open the Streaming Integrator Tooling.

  2. Click Export in the top menu, and then click For Docker.

    Export as Docker/Kubernetes Menu

    As a result, Step 1 of the Export Siddhi Apps for Docker image wizard opens as follows.

    Export as Docker dialog box

  3. Select the TemperatureApp.siddhi check box and click Next.

  4. In Step 2, you can template values of the Siddhi Application.

    Template Siddhi Apps dialog box

    Click Next without templating any value of the Siddhi application.


    For detailed information about templating the values of a Siddhi Application, see Exporting Siddhi Apps for Docker Image.

  5. In Step 3, you can update configurations of the Streaming Integrator.

    Update Streaming Integrator Configurations dialog box

    Leave the default configurations, and click Next.

  6. In Step 4, you can provide arguments for the values that were templated in Step 2.

    Populate Arguments Template dialog box

    There are no values to be configured because you did not template any values in Step 2. Therefore click Next.

  7. In Step 5, you can choose additional dependencies to be bundled. This is applicable when sources, sinks, or other components with additional dependencies are used in the Siddhi application (e.g., a Kafka source/sink, or a MongoDB store). In this scenario, there are no such dependencies. Therefore, nothing is shown as additional JARs.

    Bundle Additional Dependencies dialog box

    Click Export. The Siddhi application is exported as a Docker artifact in a zip file to the default location in your machine, based on your operating system and browser settings.

Extending the Streaming Integrator

The Streaming Integrator is shipped with most of the available Siddhi extensions by default. If a Siddhi extension you require is not shipped by default, you can download and install it.

In this scenario, let's assume that the laboratories require the siddhi-execution-extrema extension to carry out more advanced calculations for different types of time windows. To download and install it, follow the procedure below:

  1. Open the Siddhi Extensions page. The available Siddhi extensions are displayed as follows. Extensions Home Page

  2. Search for the siddhi-execution-extrema extension. Searched Extension

  3. Click V4.1.1 for this scenario. As a result, the following page opens. Extrema Extension Page

  4. To download the extension, click Download Extension. Then enter your email address in the dialog box that appears, and click Submit. As a result, a JAR file is downloaded to a location in your machine (the location depends on your browser settings).

  5. To install the siddhi-execution-extrema extension in your Streaming Integrator, place the JAR file you downloaded in the <SI_HOME>/lib directory.

Further references