Stream Mainframe Application and z/OS System Data into Apache Kafka

Luminex MDI zKonnect

Data-Driven Enterprises Need Mainframe Data… Now

As data centers face a growing number of independent systems generating and processing data, building a central data hub and streaming data pipelines has become a priority for enterprises looking to make timely business decisions for competitive advantage.

In response to this demand, data centers have implemented Apache Kafka™ and Confluent Platform as an enterprise-wide event streaming and processing platform — a "central nervous system" that enables immediate, agile responses using input from the entire data center.


Connecting the Mainframe to Kafka

As powerful a tool as Kafka can be, it is only as good as the data it has access to, and the last holdout likely contains an enterprise's most valuable data… the mainframe. The familiar challenges arise when sharing mainframe data with the rest of the enterprise: security, the cost and speed of data movement, and the need for an agile approach.

Luminex MDI zKonnect addresses these challenges by eliminating mainframe TCP/IP from the process and providing a simple JCL-based method for publishing mainframe data to Kafka topics. Data can also be pushed back to the mainframe by distributed applications, either from new sources or as part of a cross-platform workflow. The mainframe and its data stay more secure by sending data over FICON rather than insecure FTP. Data transfer is faster, more reliable (especially for large data sets) and uses significantly fewer MSUs than encrypted TCP/IP. Most importantly, the process is easy and agile enough for mainframe teams to meet the ever-changing demands of data architects, scientists and analysts.
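On the Kafka side, mainframe data published this way arrives on ordinary topics, so downstream tuning uses standard Kafka producer settings. The fragment below is an illustrative sketch only — the broker hostnames are placeholders and these are generic Apache Kafka producer properties, not zKonnect-specific configuration:

```properties
# Hypothetical producer settings for a topic fed with mainframe data.
# Broker hostnames are placeholders, not zKonnect defaults.
bootstrap.servers=kafka-broker1:9092,kafka-broker2:9092
# Wait for acknowledgement from all in-sync replicas before a record
# is considered committed -- prudent for transactional mainframe data.
acks=all
# Compress batches to reduce network traffic for large data sets.
compression.type=lz4
# Encrypt traffic between the publishing gateway and the brokers.
security.protocol=SSL
```

Settings such as `acks=all` trade a little latency for durability, which typically matters more than raw throughput for financial or operational records originating on the mainframe.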

zKonnect enables mainframe data to be securely and efficiently integrated with Kafka for real-time enterprise-wide decision-making.

By using FICON instead of mainframe-based TCP/IP, zKonnect eliminates the overhead, bottlenecks and security risks associated with mainframe data movement, enabling more efficient and agile streaming data pipelines.

The Integrated, Streaming Data Center

As a central hub for streaming data, Kafka enables various purpose-built systems to be continuously interconnected for enterprise-wide processing. For example, mainframe transactions can be published to Kafka, consumed and transformed and/or analyzed by Pentaho, then republished to Kafka for consumption by numerous other downstream processes… including the mainframe. Each step has the potential to feed multiple business applications and microservices, ensuring that data is made accessible when and where it is needed.
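The publish, transform, republish flow described above can be sketched in a few lines. Since a real deployment requires a Kafka client library and running brokers, this minimal Python sketch substitutes in-memory queues for Kafka topics (topic names and record fields are illustrative assumptions, not zKonnect specifics); only the pipeline shape — produce, consume/transform, republish — mirrors the description:

```python
from collections import deque

# In-memory stand-ins for Kafka topics. A real pipeline would use a Kafka
# client and brokers; topic names here are illustrative only.
topics = {"mainframe.transactions": deque(), "transactions.enriched": deque()}

def publish(topic, record):
    """Producer side: append a record to a topic."""
    topics[topic].append(record)

def consume(topic):
    """Consumer side: pop the oldest record, or None if the topic is empty."""
    return topics[topic].popleft() if topics[topic] else None

# 1. Publish mainframe-style transaction records.
publish("mainframe.transactions", {"account": "0001", "amount": 250})
publish("mainframe.transactions", {"account": "0002", "amount": -75})

# 2. An intermediate service consumes, transforms, and republishes.
while (txn := consume("mainframe.transactions")) is not None:
    enriched = {**txn, "flagged": txn["amount"] < 0}
    publish("transactions.enriched", enriched)

# 3. Any number of downstream consumers can now read the enriched topic.
print(list(topics["transactions.enriched"]))
```

Because each stage only ever talks to a topic, not to another system directly, a transformation step can be added, replaced, or fanned out to multiple consumers without any change to the mainframe-side publisher.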

Related

More Efficient and More Secure… MDI solutions from Luminex.

For information about Luminex products and services, please contact us.