LESSON LIST
- 8:15 Event Streams
- 3:59 Kafka - Source Example
- 3:46 Kafka - Handler Example
- 7:58 Form Component - Structure and Widgets
- 3:39 Form Component - Contingent Actions
- 4:55 Form Component - Submission Management
- 4:21 Form Component - Submitting Form Data in Offline Mode
- 13:37 Form Component - Basic Form
- 2:13 Offline Submissions
- 4:19 File Association and Deep Links
- 11:38 Drawing Component - Overview
- 16:14 Drawing Component - Elements
- 5:05 Drawing Component - Modification Tools
- 10:43 Drawing Component - Layering and Alignment
- 2:14 SQL Historian
- 4:55 Deployment Modes
- 5:29 Siemens Symbolic Driver - Browsing
- 4:46 Siemens Symbolic Driver - Migrating
- 4:18 Internal Secret Providers - Referenced Secrets
- 1:35 Internal Secret Providers - Embedded Secrets
- 7:05 Alarm Metrics Filtering and Aggregation
LESSON
Kafka - Handler Example
Description
Create an Event Stream with an Event Listener source and a Kafka handler to forward alarm data to a Kafka topic.
Video recorded using: Ignition 8.3
Transcript
[00:00] In this lesson, I'll demonstrate how to forward alarm data to a Kafka topic via the Kafka connector module. In my setup, I'm running containerized versions of both Ignition and my Kafka broker. I already created a Kafka topic for alarm messages and set up alarm notification in Ignition. I wanna bridge my existing alarming setup with Kafka and stream specific alarm events to my broker. I'll start by creating a connection to my broker. I'll open my Gateway webpage and navigate to Connections, Service Connectors, and Connections. Then I'll click Create Connection. I'll choose Kafka Connector and then click Next. I'll just call this connection Kafka Connector. And then I need to point this at my bootstrap server. I've configured listeners for non-containerized applications and for other containers. Since my Ignition installation is running in a container, I'm gonna choose the latter, which in my case requires me to enter kafka:9093. I'm not gonna use any passwords, so I'll make sure mine's set to none, and then I'll click Create Connection. If you want to configure additional security properties, you can do so in the lower portion of the form.
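The dual-listener setup mentioned above is a common Kafka deployment pattern: one listener advertised to applications on the host, and a second one (here reachable at kafka:9093) advertised to other containers on the same network. As a rough illustration only, the broker's environment might look something like this with a Bitnami-style Kafka image; the variable names, listener names, and the 9092 host port are assumptions, with only kafka:9093 coming from the lesson:

```yaml
# Hypothetical Kafka broker environment (Bitnami-style image).
# EXTERNAL serves non-containerized clients on the host;
# INTERNAL serves other containers, such as Ignition, at kafka:9093.
KAFKA_CFG_LISTENERS: "EXTERNAL://0.0.0.0:9092,INTERNAL://0.0.0.0:9093"
KAFKA_CFG_ADVERTISED_LISTENERS: "EXTERNAL://localhost:9092,INTERNAL://kafka:9093"
KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: "EXTERNAL:PLAINTEXT,INTERNAL:PLAINTEXT"
KAFKA_CFG_INTER_BROKER_LISTENER_NAME: "INTERNAL"
```

Which address a client should use depends entirely on where it runs: a container uses the internal advertised listener, while a process on the host uses the external one.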
[01:09] My status should say Connected with a check mark, which means I'm ready to use this connection and consume data. Now, I'll switch to my Designer. I'm gonna create a new event stream, call it "Alarm to Kafka", and then I'll need to pick the Event Listener source. There's nothing to configure with this source; it'll simply listen for events sent to it from elsewhere in Ignition. The default encoder is set to jsonObject, which is fine, but if I needed to change that, I could do so here. Next, I'll move on to the filter stage. I wanna set this up so that only high priority alarms move forward to my handler. To do this, I'll click this checkbox to enable a filter, and then I just need to add a few lines of Python code that decide whether or not to move an event through the rest of the event stream. The event data is going to have a key for the alarm priority, just named "priority", so I'm gonna check its value. I know that high priority events have an integer value of three, so I'll add a line to check if the value is equal to three.
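The filter logic described above can be sketched as plain Python. The exact names Ignition exposes to an event stream filter script may differ; the `priority` key and the integer value of three come from the lesson, while the function and parameter names here are illustrative:

```python
def filter_event(event):
    """Pass only high-priority alarms downstream; drop everything else."""
    # The event payload carries the alarm priority under the "priority" key.
    # High-priority alarms have an integer priority of 3.
    return event.get("priority") == 3
```

Returning True lets the event continue to the handler stage; returning False drops it, which is why low and medium priority alarms never reach the Kafka topic.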
[02:02] If it is, I want it to return true, which means it can move forward. Otherwise, I want this to return false and drop the event. Now that my filter is set up, I'll move on to the handler stage. I'll click the plus sign to add a new one and select Kafka. The connector I just created is already selected in this dropdown, but I'll need to make sure that this is pointed at the correct topic. These fields with the function icons allow me to dynamically assign values via the expression language. I'm just gonna give the topic field a hard-coded string of "alarms" since that's the name of the topic I wanna stream this data to. Next, I wanna specify a key and a value. For the key, I'm gonna pass another string of "high priority alarm", and for the value, I'm gonna pass the entire event data object. I'm done configuring the event stream, so I'll save my project, and then I'm gonna open up my existing alarm notification pipeline. This was set up to email operators and supervisors if any of my alarms go active. I'll add onto this with the Event Stream Source block, so that it'll also send the alarm event to that stream, and then I'll connect it to the rest of my pipeline. I'll save the project again, and now I should have all of the setup needed to forward alarm events to my Kafka topic.
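To make the handler configuration concrete, here is a hypothetical helper, not the connector's actual API, showing what record ends up on the topic: a hard-coded topic and key as described above, with the event data serialized to JSON (matching the jsonObject encoder). The sample event contents are illustrative:

```python
import json

def build_kafka_record(event):
    """Sketch of the Kafka handler's output: (topic, key, value)."""
    topic = "alarms"             # hard-coded topic name from the handler config
    key = "high priority alarm"  # static key string passed for every record
    value = json.dumps(event)    # entire event data object, JSON-encoded
    return topic, key, value

# Illustrative event payload; real alarm events carry more fields.
topic, key, value = build_kafka_record({"priority": 3})
```

In the real handler, the topic, key, and value fields could instead be bound dynamically with the expression language; the hard-coded strings here mirror the choices made in this lesson.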
[03:08] To verify this works, I'll open up a Perspective view with an alarm status table and activate my alarms. This tag has three separate alarms on it with low, medium, and high priorities. If I set the tag value to 75, it should activate all three. I'll open up my Kafka UI to view my messages, and I should be able to confirm that only the high priority alarm was actually sent to my topic, and it looks like that's the case. With that, I've linked my alarm data to Kafka via event streams and the Kafka connector module. The Kafka connector module also provides a new scripting library that I'll add a user manual link to if you'd like to explore that in your solutions.