Sponsored by IBM
Center for Automated Integration

How Events and APIs Enable Modern Real-Time Applications


Instead of just opening up closed-off systems via APIs, many businesses are looking to use event-based triggers as well to react to changes in real time.

Competitive pressures are driving the need for new thinking when it comes to developing applications that help a business react in real time. Being able to make real-time decisions based on events is at the heart of those efforts.

Any event, whether generated internally (a transaction, a state change, a database update) or externally by customer activity, calls for an action. But working with events introduces new technical requirements.

One way to look at the different requirements is to compare the handling of events to the way applications built using APIs might work. Applications often use APIs to form a one-to-one relationship between different app components. A mobile banking app, for instance, would use an API to let a customer query a backend system for a bank balance. One query is sent, and one result is delivered. And the app components at each end of the session must both be online at the same time.
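The synchronous, one-to-one pattern described above can be sketched as a simple blocking call. This is a minimal illustration; the function name, account IDs, and balance data are hypothetical stand-ins for a real backend.

```python
# Hypothetical stand-in for a backend banking system.
BALANCES = {"acct-1001": 2500.00}

def get_balance(account_id: str) -> float:
    """One query in, one result out: the classic API interaction."""
    return BALANCES[account_id]

# The mobile app blocks here until the backend replies; both ends of
# the session must be online at the same time for this to succeed.
balance = get_balance("acct-1001")
```

The caller can do nothing else until the result comes back, which is exactly the coupling that event-based designs relax.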

Event-based apps break this one-to-one relationship, and they change the way the interactions work. Traditional apps require a request, an intervention to trigger the next action, so things happen sequentially: a customer asks for his or her bank balance, and the balance amount is returned. With event-based applications, systems react to events as they occur naturally. A system doesn't have to wait for a response to take another action.
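The contrast can be made concrete with a minimal in-process publish/subscribe sketch. The topic name, handler, and event fields here are illustrative assumptions, and a real system would hand events off through a broker rather than direct function calls; the point is that the publisher emits an event and moves on without waiting for any particular reply.

```python
from collections import defaultdict
from typing import Callable

# Map each topic to the handlers subscribed to it.
subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def subscribe(topic: str, handler: Callable[[dict], None]) -> None:
    subscribers[topic].append(handler)

def publish(topic: str, event: dict) -> None:
    # The publisher does not expect a response; it hands the event
    # off and is free to continue with its next action.
    for handler in subscribers[topic]:
        handler(event)

updates = []
subscribe("balance.updated", lambda e: updates.append(e["amount"]))
publish("balance.updated", {"account": "acct-1001", "amount": 2500.00})
```

Any number of handlers can subscribe to the same topic, which is what breaks the one-to-one relationship of the API pattern.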

Use cases abound

Event-driven applications can be used in a wide range of industries, including manufacturing, financial services, transportation, logistics, retail, and more.

Frequently, a single event is used by multiple applications for different purposes at different times. For example, if an airline passenger changes flights, that change impacts seat assignments on both the old and the new flights. If the trip was booked through a travel agency, the change might impact other aspects of the trip. A hotel reservation might need to be shifted from one night to another, and adjustments might need to be made to car rental and ground transportation services.

In retail, it is easy to understand how a single event like a purchase on a website must be shared with other systems, including inventory, tax calculation and collection, billing/payment processing, shipping, and more. The point to keep in mind is that multiple disparate systems, including some that may not be controlled by the enterprise, must all work together to give the customer a seamless experience.
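The retail fan-out described above can be sketched as one event delivered to several independent consumers. The handler names and event fields are illustrative assumptions, not a real integration:

```python
# Each function stands in for a separate downstream system. In practice
# these would be independent services, possibly outside the enterprise.
handled = []

def inventory(event): handled.append(("inventory", event["sku"]))
def billing(event):   handled.append(("billing", event["total"]))
def shipping(event):  handled.append(("shipping", event["order_id"]))

consumers = [inventory, billing, shipping]

# A single purchase event is seen by every consumer.
purchase = {"order_id": "o-42", "sku": "sku-7", "total": 19.99}
for consume in consumers:
    consume(purchase)
```

Each consumer acts on the same event for its own purpose, on its own schedule, without the website needing to know who is listening.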

See also: Supercharging the Data Flowing into Advanced Analytics

MQ vs. Kafka: Not necessarily one or the other

When working with events, the conduit, the software that sits between the different event producers and event consumers, must have special properties. Quite often, the choice comes down to one of two general categories of solutions: message queuing, such as IBM MQ, or event streaming, such as Apache Kafka, the open-source distributed event streaming platform.

The two are often presented as competitive solutions. But in reality, they do different things and are designed for different uses.

Kafka is used to build real-time streaming data pipelines and real-time streaming applications. It enables a data pipeline to reliably process and move data from one system to another and allows a streaming application to consume streams of data.

IBM MQ supports the exchange of information between applications, systems, services, and files by sending and receiving message data via messaging queues. This simplifies the creation and maintenance of business applications. IBM MQ works with a broad range of computing platforms and can be deployed across a range of different environments, including on-premises, cloud, and hybrid cloud deployments. IBM MQ supports a number of different APIs, including Message Queue Interface (MQI), Java Message Service (JMS), REST, .NET, IBM MQ Light, and MQTT.

As such, one differentiator between Kafka and IBM MQ is that Kafka is very much about a stream of events or sequence of events, whereas MQ is more about individual messages.
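That distinction can be illustrated with two toy in-memory models. These are not the real Kafka or IBM MQ APIs, just sketches of the semantics: a Kafka-style log keeps an ordered, replayable sequence of events that each consumer reads from its own offset, while a queue hands each individual message to a consumer and then removes it.

```python
from collections import deque

class EventLog:
    """Kafka-style: an ordered, replayable sequence of events."""
    def __init__(self):
        self.events = []
    def append(self, event):
        self.events.append(event)
    def read(self, offset):
        # Consumers track their own offsets and can re-read the stream.
        return self.events[offset:]

class MessageQueue:
    """MQ-style: individual messages, each consumed once."""
    def __init__(self):
        self.messages = deque()
    def put(self, msg):
        self.messages.append(msg)
    def get(self):
        # Once delivered, the message leaves the queue.
        return self.messages.popleft()

log = EventLog()
for e in ("created", "updated", "shipped"):
    log.append(e)

q = MessageQueue()
q.put("order-1")
first = q.get()  # "order-1" is now gone from the queue
```

A late-joining consumer of the log can still read the whole sequence from offset 0, whereas a queue consumer sees each message exactly once, which is why the two technologies suit different problems rather than competing head-on.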

Separation is the key

The way modern applications are developed results in independent elements working together as one. The decoupling of the various elements of a larger application is becoming the norm. APIs, IBM MQ, and Kafka serve as the glue between the elements. Each has its own purpose in different applications.

Businesses are making the different components available as services, often through APIs. However, instead of just opening up once closed-off systems via APIs, many businesses are looking to use event-based triggers as well to react to changes in real time.

Salvatore Salamone

About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He also is the author of three business technology books.
