Step 1: Download Streaming Integrator and Dependencies¶
First, download the Streaming Integrator and the other software required for the scenario you are trying out. To do this, follow the topics below.
Downloading the Streaming Integrator runtime and tooling¶
To download the Streaming Integrator runtime, visit the Streaming Integrator Product Page. Enter your email address and agree to the license. Then click Zip Archive to download the Streaming Integrator as a zip file.
To download Streaming Integrator Tooling, click Tooling on the Streaming Integrator Product Page. Enter your email address and agree to the license. Then click the download option for your operating system (for example, Zip Archive) to download Streaming Integrator Tooling.
Downloading the other dependencies for your scenario¶
This section shows how to prepare your production environment for the scenario described in the Streaming Integration Overview section.
Setting up a MySQL database table¶
In this scenario, the Streaming Integrator reads input data from a MySQL database table. Therefore, let's download and install MySQL and define the database and the database table as follows:
Download MySQL 5.1.49 from MySQL Community Downloads.
Enable binary logging in the MySQL server. For detailed instructions, see the Enabling the Binlog section of the Debezium tutorial.
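Concretely, enabling the binlog usually amounts to adding settings along these lines to the MySQL server configuration file (my.cnf); the values below are illustrative, and the Debezium tutorial covers the details:

```ini
[mysqld]
# Unique ID for this server within its replication topology (illustrative value).
server-id     = 223344
# Base name for the binary log files; setting this turns binary logging on.
log_bin       = mysql-bin
# Row-based logging, required by change-data-capture clients such as Debezium.
binlog_format = ROW
```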
Once you install MySQL and start the MySQL server, create the database and the database table you require as follows:
Create a new database in the MySQL server to use throughout this tutorial. To do this, execute the following query.
CREATE SCHEMA production;
Create a new user by executing the following SQL query.
GRANT SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, REPLICATION CLIENT ON *.* TO 'wso2si' IDENTIFIED BY 'wso2';
Switch to the production database and create a new table by executing the following queries:
USE production;
CREATE TABLE SweetProductionTable (name VARCHAR(20), amount DOUBLE(10,2));
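The statements above can be collected into a single script file so the whole database setup is replayable in one step. This is a minimal sketch; setup.sql is a hypothetical file name, and the final mysql invocation assumes a local server reachable as root:

```shell
# Write the setup statements from the steps above into one script file.
cat > setup.sql <<'EOF'
CREATE SCHEMA IF NOT EXISTS production;
GRANT SELECT, RELOAD, SHOW DATABASES, REPLICATION SLAVE, REPLICATION CLIENT
  ON *.* TO 'wso2si' IDENTIFIED BY 'wso2';
USE production;
CREATE TABLE IF NOT EXISTS SweetProductionTable (name VARCHAR(20), amount DOUBLE(10,2));
EOF

# Replay it against a running MySQL server (prompts for the root password):
#   mysql -u root -p < setup.sql
```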
Download Kafka and create topics¶
This scenario involves publishing some filtered production data to a Kafka topic named eclair-production.
Download the Kafka broker from the Apache site and extract it. This directory is referred to as <KAFKA_HOME> from here on.
Start Kafka as follows:
First, start a ZooKeeper node. To do this, navigate to the <KAFKA_HOME> directory and issue the following command.
sh bin/zookeeper-server-start.sh config/zookeeper.properties
Next, start a Kafka server node. To do this, issue the following command from the same directory.
sh bin/kafka-server-start.sh config/server.properties
To create a Kafka topic named eclair-production, issue the following command from the same directory.
bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic eclair-production
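The three commands above can be bundled into a small start-up script. This is a sketch only: start-kafka.sh is a hypothetical file name, and the sleep durations are rough assumptions to give each service time to come up before the next command runs.

```shell
# Write a start-up script that runs the three Kafka steps above in order.
cat > start-kafka.sh <<'EOF'
#!/bin/sh
# Run from <KAFKA_HOME>. Starts ZooKeeper, then the broker, then creates the topic.
sh bin/zookeeper-server-start.sh config/zookeeper.properties &
sleep 10   # assumption: give ZooKeeper time to start
sh bin/kafka-server-start.sh config/server.properties &
sleep 10   # assumption: give the broker time to register
sh bin/kafka-topics.sh --create --bootstrap-server localhost:9092 \
  --replication-factor 1 --partitions 1 --topic eclair-production
EOF
chmod +x start-kafka.sh
```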
Starting the WSO2 Streaming Integrator Server¶
To start WSO2 Streaming Integrator, navigate to the <SI_HOME>/bin directory from the CLI, and issue the appropriate command based on your operating system:
- For Linux: sh server.sh
- For Windows: server.bat --run
Now you have completed a WSO2 Streaming Integrator setup that allows you to do the following:
Design, test and deploy Siddhi applications via Streaming Integrator Tooling.
Consume data from as well as publish data to MySQL databases.
Consume data from as well as publish data to Kafka topics.
To design a Siddhi application, proceed to Step 2: Create the Siddhi Application.