LESSON

Kafka - Source Example

Description

In this video, we'll connect a Kafka topic to a SQL database table via the Kafka Connector Module.

Video recorded using: Ignition 8.3

Transcript


[00:00] In this lesson, I'll demonstrate how to synchronize Kafka event data with a SQL database via the Kafka Connector Module. A brief explanation of my scenario: I'm running containerized versions of both Ignition and my Kafka broker. I have a Kafka topic for user data messages that I want to forward to a cloud database that I've already created a connection to, for later processing and reporting. To start, I'll need to create a connection to my broker. I'll open my Gateway webpage and navigate to Connections, Service Connectors, and Connections. Then I'll click Create Connection. I'll choose Kafka Connector and then click Next. I'll call this connection "Kafka connector". Then I need to point this at my bootstrap server. I've configured listeners for non-containerized applications and for other containers. Since my Ignition installation is running in a container, I'm going to choose the latter, which in my case requires me to enter kafka:9093. I'm not going to use any passwords, so I'll make sure mine are set to none, and then I'll click Create Connection.
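The dual-listener setup I mention is configured on the broker itself, not in Ignition. As a rough sketch only (the service name, image, listener names, and ports here are illustrative assumptions, not the exact setup used in this video), a docker-compose Kafka service might advertise one listener for host applications and one for other containers like this:

```yaml
# Hypothetical docker-compose fragment -- names, image, and ports are
# illustrative assumptions, not the exact configuration from this video.
services:
  kafka:
    image: apache/kafka:latest
    environment:
      # Two listeners: one for non-containerized clients on the host
      # (localhost:9092), one for containers on the compose network (kafka:9093).
      KAFKA_LISTENERS: HOST://0.0.0.0:9092,DOCKER://0.0.0.0:9093
      KAFKA_ADVERTISED_LISTENERS: HOST://localhost:9092,DOCKER://kafka:9093
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: HOST:PLAINTEXT,DOCKER:PLAINTEXT
    ports:
      - "9092:9092"
```

Because my Ignition container resolves the broker by its service name on the shared network, the bootstrap server it needs is the in-network advertised listener, kafka:9093.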

[01:04] If you want to configure additional security properties, that can be done in the lower portion of the form. My status should say Connected with a green check mark, which means I'm ready to use this connection and consume data. Now I'll switch to my Designer. I'm going to create a new event stream, call it "Kafka to DB Users", and pick the Kafka source. This is going to listen for data from the specific broker and topic that I point it to. The connector I created is already selected from the dropdown, so I can open the topic dropdown and choose the topic I need; in my example, that's my "users" topic. I'll set a group ID of one and then leave the rest of these alone and move on. The default encoder is set to JSON Object, which is perfect for my data, but if I needed to change that, I could do so here. Next, I'll move on to my handler stage. I'll click the plus sign to add a new one, and then I'll select Database. Then I'll select my AuroraDB from the database dropdown. I'll leave the mode on insert, and then I'll select my users table that's already been created.

[02:07] However, I could provide a table name and let the handler automatically create one if I needed it to. Next, I'll start mapping my database columns. I'm going to grab one of my example user messages to help me build this out, and then I'll start adding the columns for my fields. I just need to provide the column name for my database table, the column type, and an expression that will return the value to store in the corresponding column. My messages are JSON objects, so I can use the jsonGet expression function on event.data to extract the specific values I'm looking for. I'll continue doing this with the rest of my integer and string fields. If I use jsonGet on the registered field, it'll just return a string, but I'm storing that in a DATETIME column type. I can use the toDate expression function to try to coerce the string that gets returned from jsonGet into a date. It'll try to coerce it based on certain supported date formats, but my date doesn't match any of them.
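To make the column mapping concrete, here is a plain-Python analogue of what the jsonGet expression does against event.data. The field names and values below are assumptions for illustration, not the actual schema of my users topic:

```python
import json

# A hypothetical Kafka message payload -- the field names and values here
# are assumptions for illustration, not the topic's actual schema.
message = json.dumps({
    "id": 42,
    "name": "Ada Lovelace",
    "email": "ada@example.com",
    "registered": "2024-05-01T09:30:00"
})

# Plain-Python analogue of jsonGet(event.data, "path") in a column
# expression: parse the JSON message and pull out one field per column.
data = json.loads(message)
user_id = data["id"]             # maps to an integer column
name = data["name"]              # maps to a string column
registered = data["registered"]  # comes back as a string; a DATETIME
                                 # column still needs a date coercion
```

Each database column in the handler gets one such expression, returning the value to insert for that column.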

[03:05] I can get around this with a transform script. I'll enable a transform and then add the following line to the script. The script just replaces the capital T in the registered date with a space, which makes it match one of the supported date formats. All right, now I'm done configuring the event stream. I'll save my project, and I'm ready to start sending Kafka messages to my SQL database. I'll open up my Database Query Browser real quick, and then I'll produce a couple of messages on my other screen. Now we should see messages start to show up in my database from Kafka. With that, I've linked my Kafka broker to Ignition, so incoming Kafka messages are sent to my SQL database via an event stream with a Kafka source. The Kafka Connector Module also provides a new scripting library; I'll add a user manual link if you'd like to explore that in your solutions.
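The essence of that transform can be sketched in plain Python. The sample timestamp below is an assumption for illustration; in the event stream the string comes from the message, and toDate handles the final coercion:

```python
from datetime import datetime

# Hypothetical registered timestamp in ISO 8601 form, with the "T"
# separator that toDate did not recognize.
raw = "2024-05-01T09:30:00"

# The transform boils down to this one line: replace the "T" with a space
# so the string matches a supported format like "yyyy-MM-dd HH:mm:ss".
fixed = raw.replace("T", " ")

# Plain-Python check that the reshaped string now parses as a date;
# in the event stream, toDate performs the equivalent coercion.
parsed = datetime.strptime(fixed, "%Y-%m-%d %H:%M:%S")
```

Once the string is in a supported shape, the toDate expression around the transform's output succeeds and the value lands in the DATETIME column.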
