LESSON LIST
- Event Streams (8:15)
- Kafka - Source Example (3:59)
- Kafka - Handler Example (3:46)
- Form Component - Structure and Widgets (7:58)
- Form Component - Contingent Actions (3:39)
- Form Component - Submission Management (4:55)
- Form Component - Submitting Form Data in Offline Mode (4:21)
- Form Component - Basic Form (13:37)
- Offline Submissions (2:13)
- File Association and Deep Links (4:19)
- Drawing Component - Overview (11:38)
- Drawing Component - Elements (16:14)
- Drawing Component - Modification Tools (5:05)
- Drawing Component - Layering and Alignment (10:43)
- SQL Historian (2:14)
- Deployment Modes (4:55)
- Siemens Symbolic Driver - Browsing (5:29)
- Siemens Symbolic Driver - Migrating (4:46)
- Internal Secret Providers - Referenced Secrets (4:18)
- Internal Secret Providers - Embedded Secrets (1:35)
- Alarm Metrics Filtering and Aggregation (7:05)
LESSON
Event Streams
Description
Event Streams are project resources intended to easily handle "event-driven" data. This video will teach you how to configure them as well as test them from your Designer.
Video recorded using: Ignition 8.3
Transcript
[00:00] Event streams are project resources that allow your gateway to handle unsolicited event-driven data. An event stream combines a source with one or more handlers. The source provides data objects and the handlers do something with those objects. In this lesson, we'll take a look at how to create and test an event stream. Since event streams are project resources, I'll need to have a project open in the Designer to interact with them via the Event Streams workspace. To create a new one, I'll simply right-click Event Streams and click New Event Stream. I'll need to provide a name for the event stream, and then I need to choose a source. I'll wanna pick a source depending on the expected origin of the data. Right now I have four sources available to me: Tag Event, Kafka, Event Listener, and HTTP. Be aware that the options here will depend on your installed modules. A Tag Event Source listens for any change to a tag's timestamp, value, or quality. The Kafka Source, provided by the Kafka Connector Module, subscribes to a Kafka topic and waits for event payloads.
[01:05] The Event Listener Source listens for events from other parts of Ignition. And the HTTP Source, provided by the Web Dev Module, opens up a new HTTP endpoint in Ignition for POST and PUT events. I'm gonna select the HTTP Source. I'll give my event stream a name, and then I'll click Create. Once an event stream is opened, we'll see the name at the top of the window here, and then there are some controls over on the top right corner for enabling or disabling it and for showing the test controls. I'm gonna hide the test controls for now so we can focus on the other parts first. There are seven different stages to an event stream, and each of those stages is shown here. The first stage is for source configuration. For the HTTP source, you can configure whether or not to require HTTPS or authentication, the user source and roles for authentication, and the max number of authentication retries. I'll leave these alone and click Encoder to move to the next stage. The Encoder Stage allows you to encode the raw data into a specific format. I can encode the data as a string, a Python dictionary, a Sparkplug message, a JSON object, or a byte array.
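To illustrate what the encoder stage is responsible for, here is a conceptual sketch in plain Python. It is not Ignition's internal API; it only mirrors the idea that raw payload bytes from the source are turned into the object type the later stages will see.

```python
import json

def encode(raw, fmt):
    """Conceptual sketch of an encoder stage: convert the raw payload
    bytes from the source into the object type later stages receive.
    Illustrative only; this is not Ignition's implementation."""
    if fmt == "string":
        return raw.decode("utf-8")      # plain text
    elif fmt == "json":
        return json.loads(raw)          # parsed JSON object
    elif fmt == "bytearray":
        return bytearray(raw)           # untouched bytes
    raise ValueError("unknown format: " + fmt)

print(encode(b'{"temp": 72}', "json"))  # {'temp': 72}
```

Choosing the right encoder up front means the filter, transform, and handler stages can all assume a consistent data type.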
[02:08] I'll leave it set to string and move on to the filter. The Filter Stage provides an opportunity to add a script that can filter out specific data objects. If the given event evaluates to true, then it can proceed with the rest of the stages. However, if it evaluates to false, it would stop here and be dropped. I'll add a filter to mine that returns false if the event data is equal to "unwanted" and will return true for everything else. Next, the Transform Stage provides another script that allows you to transform the data to prepare it for the handlers so that they can then do something with that data. This would be the last chance to manipulate the data into a specific format, or coalesce it with any other data before sending it away. After the Transform Stage is the Buffer Stage. This is the staging area for events before they're passed to the handlers. If your handlers are already working through a set of events, the buffer will not pass along queued events until it's finished with the current events, and the properties here will define when they're forwarded. The Debounce property is the amount of time in milliseconds that the event stream will wait for another event before passing the current event or batch of events to the handlers.
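The filter and transform stages described above are both just small scripts, and their logic can be sketched as simple Python functions. The function names and the shape of the event argument here are illustrative; the actual script signatures come from Ignition's event stream editor.

```python
def filter_event(event_data):
    """Drop events whose payload is exactly "unwanted"; pass the rest.
    Returning False stops the event here; True lets it continue on
    through the remaining stages."""
    if event_data == "unwanted":
        return False
    return True

def transform_event(event_data):
    """Example transform: wrap the payload in a dict with extra context
    before the handlers receive it (illustrative shape only)."""
    return {"payload": event_data, "source": "http"}

print(filter_event("unwanted"), filter_event("test"))  # False True
```

Because the transform stage is the last chance to reshape the data, this is where you would coalesce the event with any other values the handlers need.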
[03:09] Events will be batched together unless the Debounce Time or the next property, the Max Wait, are reached. The Max Wait is just as the name suggests, the maximum amount of time before an event or batch of events is sent to the handlers. If the queue starts to build up, you can decide whether or not to start dropping events with these two properties. The Max Queue Size will control the maximum number of events that can be queued before it starts dropping them. The default value is zero, which means there is no limit. If you do set a Max Queue Size, you can then decide to start dropping the oldest or newest events with this dropdown. The next stage is the Handler Stage. This is where you can configure what happens with the data objects. You can add multiple handlers, and they can include things like firing scripts, calling HTTP methods, storing records in a database, or sending messages to other systems in Ignition. The list of handlers that you'll see will depend on the modules that you have installed on your gateway, such as the SQL Bridge Module for database handlers, and future modules may add additional types as well.
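The buffer rules above (debounce, max wait, max queue size, and the oldest/newest drop policy) can be sketched as a small Python class. This is a conceptual model of the batching behavior, not Ignition's actual buffer implementation.

```python
class Buffer:
    """Conceptual sketch of the buffer stage's batching rules.
    Events accumulate in a batch; the batch is flushed to the handlers
    when no new event arrives within the debounce window, or when
    max_wait elapses since the first queued event. A max_queue_size
    of 0 means unlimited."""

    def __init__(self, debounce_ms, max_wait_ms, max_queue_size=0, drop="oldest"):
        self.debounce_ms = debounce_ms
        self.max_wait_ms = max_wait_ms
        self.max_queue_size = max_queue_size
        self.drop = drop
        self.batch = []

    def add(self, event):
        if self.max_queue_size and len(self.batch) >= self.max_queue_size:
            if self.drop == "oldest":
                self.batch.pop(0)   # discard the oldest queued event
            else:
                return              # discard the incoming (newest) event
        self.batch.append(event)

    def should_flush(self, ms_since_last_event, ms_since_first_event):
        # Flush when the debounce window passes quietly, or when the
        # batch has simply waited long enough overall.
        return (ms_since_last_event >= self.debounce_ms
                or ms_since_first_event >= self.max_wait_ms)
```

For example, with a max queue size of 2 and the "oldest" drop policy, queuing a third event pushes out the first one rather than growing the batch.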
[04:07] Since this is a simple demonstration, I'm just gonna add a logger handler to mine to confirm that the event stream is working properly. The fields in this handler allow for expressions, so I'll use the expression syntax to add a string for the logger name and for the message. Then I'll change this to an info level message. You'll notice that I included "event.data" in brackets in my message. This is how you can reference the actual event object. You can then use it with different expression functions to extract different parts of it. Finally, if this handler were to run into any issues, I can decide how it should fail by either aborting, ignoring, or retrying. This leads me to the Error Handler stage, which will be invoked any time an individual handler throws an exception and the failure mode is set to abort. This would then act as a final catchall for any exceptions that aren't caught in the handler logic, and this will allow them to be handled gracefully. I'll save my project and quickly show the Status Window.
[05:07] This will give you information about the event stream and the number of events received, filtered, and the execution times. Now that my event stream is configured, I'd like to test that it works, so I'll click the button up here to show the test controls. Again, this reveals an area for passing data to this event stream and testing different sections of it. I'll type "test" here, and then I can choose whether or not to run all of the stages with this data or just run up to the filter or the transform. There's also this dry run checkbox, which will allow you to test the event stream, but prevent the handlers from actually handing off the data. For example, if I had a database handler and this was checked, my event stream would send the data object through each stage, but stop right before the data is actually handed off to the database, so nothing would end up being stored. I wanna see my logger message, so I'll leave that unchecked and click run all. Then I'll open up the diagnostic logs in my gateway webpage, and I can see my logger message appear in the logs, which tells me that the event stream ran and the handler worked.
[06:01] I'd like to confirm that my filter works, so I'll go back to the filter stage. My script is looking at the event data for a specific string, and if that string matches, it should return false, so this event will never make it to the handler stage. I'll uncomment this logger message that lets me know when the filter fires. Now, I'll provide "unwanted" as my test data and click Run All. Back in the logs, I see my filter logging message here, but no handler message, so my filter is working properly. If this were a real event stream, I'd probably wanna remove any logging messages once I've finished testing and confirmed that everything works, so I'm not flooding my logs and preventing myself from seeing any other important messages. I was able to run this event stream through the test controls window, but since it's configured with an HTTP source, the intention would be to listen for HTTP messages from outside of Ignition. They would be sent to a new endpoint that gets created that I'll show you at the bottom of the screen. Your endpoint would contain your gateway's hostname or IP address, and the port. Then "system", then "eventstream", and then the name of your project, and the path to your event stream.
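Following the endpoint pattern just described, an external client could post an event with Python's standard library. The gateway address, port, project name, and event stream name below are all hypothetical placeholders; substitute your own.

```python
from urllib import request

# Hypothetical values; replace with your gateway address, project name,
# and event stream path. The path shape follows the pattern described
# above: /system/eventstream/<project>/<event stream path>.
gateway = "http://my-gateway:8088"
project = "MyProject"
stream = "MyHttpStream"
endpoint = f"{gateway}/system/eventstream/{project}/{stream}"

def post_event(data):
    """POST a payload to the event stream's HTTP endpoint."""
    req = request.Request(endpoint, data=data.encode("utf-8"), method="POST")
    return request.urlopen(req)  # raises on network or HTTP errors

# post_event("test")  # would send "test" through the event stream
```

If the source is configured to require HTTPS or authentication, the request would also need the appropriate scheme and credentials.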
[07:03] Like I mentioned earlier, the event listener source is designed to listen for messages from other systems in Ignition. One example would be the alarm notification system, which has a new notification block specifically for event streams. I'll open up an existing pipeline I have, and you can see that I have an event stream source block available. I can drag that into my pipeline, point it to an event listener event stream that I've already created, connect it to my script, and then I'll trigger one of my alarms that uses this pipeline. This should end up triggering the event stream, which I've set up to log another message to my gateway logs. If I open my logs, I'll see a message that shows me that my pipeline script ran, and then the event stream handler ran. As you can see, event streams provide a streamlined way to handle a variety of messages that might be sent to your gateway. The gateway doesn't need to poll these sources, but will instead passively listen once an event stream with the proper source is set up. Then, you can configure any number of handlers on your end to send records to your database, fire scripts, write to tags, or you can send messages to other parts of Ignition.
[08:08] The possibilities are up to you and what you need from your project.