An in-memory digital integration hub enables flexible, real-time information flow between mainframes and external systems, unlocking mainframe data for digital transformation.

Companies undergoing digital transformation typically require agile, efficient, and real-time integration between their core business systems and hybrid cloud deployments. How can such real-time integration be achieved when the core business system is a mainframe? The short answer: an in-memory digital integration hub.

Most large enterprises, especially those in financial services and insurance, rely on mainframes for mission-critical business operations, and these business systems create a high volume of high-value data. For these organizations, successful transformation requires effective and flexible information flow between these mainframes and cloud applications. An in-memory-powered digital integration hub optimized for the mainframe enables them to achieve exactly that goal.

There are numerous potential use cases for a digital integration hub, including the delivery of 360-degree customer or business views. These enable the real-time flow of banking information to cloud-based functions such as rates and pricing or compliance, and provide insurance organizations with a current, comprehensive view of clients and their policies.

The importance of mainframes in digital transformation

Mainframes provide key value as the center of gravity for operational data in large enterprises. Two-thirds of the Fortune 500, 44 of the top 50 worldwide banks, and eight of the top 10 insurance companies use mainframes to process core batch as well as real-time transactional workloads. For these businesses to reach their digital transformation goals and aggregate the information they need from across disparate, siloed data sources, they must be able to combine data generated by multiple mainframe applications with data from other data environments in real time.
They must then be able to share this information effectively and efficiently with consuming applications in hybrid cloud environments, without impacting their systems of record, in order to drive real-time business processes.

For example, to generate top-line growth, enterprises need to power real-time upsell and cross-sell opportunities by leveraging a holistic view of the client. In the financial services sector, multinational and regional banks benefit from linking client information across multiple lines of business, which may include demand deposits, retail banking operations such as credit cards and mortgages, investments, and private banking. The information required to create this valuable linkage must be derived and composed from numerous transactional and batch environments and their related data sources, many of which are hosted on mainframes. In some cases, combining this information with sources typically not on the mainframe, such as data warehouses, data lakes, or SaaS applications, can provide significant value.

The most efficient and cost-effective way to integrate information from all of these sources is via a high-performance digital integration hub running on the mainframe, which supports high-throughput, low-latency use cases. A digital integration hub running on the mainframe provides efficient integration with transactional and batch systems of record and can also integrate with other data hubs in the enterprise architecture.

How a digital integration hub creates a more effective information flow

An efficient digital integration hub on the mainframe can aggregate and compose information derived from multiple on-premises transactional and batch systems with minimal or no impact on systems of record. In addition, it can combine mainframe data with data not residing on the mainframe.
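To make the idea of composing a holistic client view concrete, here is a minimal sketch in Python. All of the system names, record fields, and the function itself are hypothetical illustrations of the aggregation pattern, not any product's API:

```python
# Minimal sketch: composing a 360-degree client view from several
# hypothetical line-of-business sources. Each "source" stands in for
# records surfaced from a system of record; field names are illustrative.

def compose_client_view(client_id, sources):
    """Merge per-system records for one client into a single composed view."""
    view = {"client_id": client_id}
    for system_name, records in sources.items():
        record = records.get(client_id)
        if record is not None:
            view[system_name] = record
    return view

# Records keyed by client ID in three hypothetical source systems.
sources = {
    "deposits":    {"C100": {"balance": 5200.00}},
    "credit_card": {"C100": {"limit": 10000, "outstanding": 430.25}},
    "mortgage":    {"C100": {"principal": 250000, "rate": 0.039}},
}

view = compose_client_view("C100", sources)
```

In a real hub, the per-system records would be surfaced by the data virtualization layer rather than passed in as plain dictionaries, but the composition step follows the same shape: one client key, many contributing systems, one merged view.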
A digital integration hub can create and compose information based on that high-value data and populate the resulting information into a powerful and scalable in-memory cache. The aggregated information, residing in the high-performance in-memory cache, can be accessed in real time by numerous business applications using a variety of APIs, such as REST and JDBC/ODBC, or Apache Kafka for event-based architectures. Consuming applications can range from back-office systems to customer-facing applications, such as those related to omnichannel banking. A synchronization layer, or change data capture layer, ensures that current data in the systems of record is processed, and it initiates the re-computation or re-aggregation of information whenever source data changes.

The most popular and cost-effective way to build a real-time digital integration hub is with an in-memory data grid, a component of an in-memory computing platform. An in-memory data grid pools the available CPUs and RAM and distributes data and compute across a cluster of servers or cores. By aggregating data in memory and applying massively parallel processing (MPP) across the distributed cluster, an in-memory data grid maximizes processing speed. While an in-memory data grid can be deployed on a cluster of commodity servers, a digital integration hub optimized for mainframe data benefits from being deployed on specialty mainframe processor cores. Generally, an in-memory data grid can be deployed on-premises, in a public or private cloud, or in a hybrid environment.

The combination of in-memory data caching and MPP provides real-time performance that is up to 1,000x faster than a solution built on disk-based data storage. In-memory data grids may support a range of APIs, including support for key-value data and, in some solutions, SQL. ACID transaction support may also be available to ensure no loss of data when transactions are processed on the in-memory data.
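The interaction between the cache and the change data capture layer described above can be sketched as follows. This is a deliberately simplified model — class and method names are invented for illustration, and a real synchronization layer would consume change events from log-based CDC rather than direct callbacks:

```python
# Sketch of a CDC-driven cache: when a source record changes, the
# affected cached view is re-aggregated so consumers never read stale
# composed data. All names here are illustrative, not a real API.

class IntegrationHubCache:
    def __init__(self, sources):
        self.sources = sources   # system name -> {client_id: record}
        self.cache = {}          # client_id -> composed view

    def _recompute(self, client_id):
        """Re-aggregate one client's view from all source systems."""
        view = {"client_id": client_id}
        for name, records in self.sources.items():
            if client_id in records:
                view[name] = records[client_id]
        self.cache[client_id] = view

    def get(self, client_id):
        if client_id not in self.cache:
            self._recompute(client_id)   # lazy initial aggregation
        return self.cache[client_id]

    def on_change(self, system, client_id, record):
        """CDC callback: apply the source change, then refresh the view."""
        self.sources[system][client_id] = record
        if client_id in self.cache:
            self._recompute(client_id)

hub = IntegrationHubCache({"deposits": {"C1": {"balance": 100}}})
before = hub.get("C1")["deposits"]["balance"]
hub.on_change("deposits", "C1", {"balance": 250})
after = hub.get("C1")["deposits"]["balance"]
```

The essential point the sketch captures is that consuming applications only ever read the composed view; the synchronization layer is what keeps that view current as the systems of record change.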
The distributed computing architecture of the in-memory computing platform also makes it easy to increase the compute power and RAM of the cluster by adding new nodes or additional hardware resources. The in-memory computing platform automatically detects additional resources and redistributes data to ensure optimal use of the available CPUs and RAM. Relevant data is cached in the digital integration hub, with the compute workload distributed across the in-memory computing server or core cluster. Business applications, from back-office systems to consumer-facing websites to mobile applications, can access information such as 360-degree customer or business views that would be impossible to achieve in real time without the digital integration hub.

The advantages of running a digital integration hub on a mainframe

A digital integration hub on the mainframe is best suited for use cases where most of the data originates on the mainframe. It has two major technology components: the in-memory compute runtime and caching layer, and the data virtualization and synchronization layer. Both components provide unique advantages and optimizations.

The in-memory computing layer in the digital integration hub on the mainframe pools the CPU cores and RAM allocated for its use; MPP is accomplished by running cache queries in parallel across the allocated cores. The data virtualization and synchronization layer provides automatic parallelization of access to a large variety of data from core systems and can provide change data capture functionality for those data sources. The communication between the data abstraction layer and the in-memory compute layer is further optimized within the mainframe to leverage shared memory.
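The data distribution and redistribution behavior described above can be sketched with simple hash partitioning. Production in-memory data grids use more sophisticated schemes (such as rendezvous or consistent hashing, which move far fewer keys when the cluster grows); this modulo version only illustrates that every key deterministically maps to a node, and that adding a node rebalances the data:

```python
# Sketch of hash-based data partitioning in an in-memory data grid:
# each key maps to exactly one node, and growing the cluster from
# three nodes to four redistributes the keys across all nodes.

import hashlib

def node_for(key, node_count):
    """Deterministically assign a key to one of node_count nodes."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % node_count

def distribute(keys, node_count):
    """Partition all keys across the cluster's nodes."""
    nodes = {n: [] for n in range(node_count)}
    for key in keys:
        nodes[node_for(key, node_count)].append(key)
    return nodes

keys = [f"client-{i}" for i in range(1000)]
three_nodes = distribute(keys, 3)
four_nodes = distribute(keys, 4)   # adding a node rebalances the keys
```

Because the assignment is a pure function of the key and the cluster size, any node can locate any entry without a central directory — which is what lets the grid run cache queries in parallel across all partitions.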
Additional benefits of running the digital integration hub on a mainframe include:

- Currency of information: Locating the digital integration hub where the underlying data originates means the aggregated information is derived from current business operational data in real time, rather than from stale data after it has been moved off the mainframe.
- Security and governance: With a digital integration hub on the mainframe, computation and aggregation of data occurs on a highly secure platform, leveraging its existing auditing and governance capabilities rather than exposing raw data off-platform.
- Cost: A digital integration hub on the mainframe can take advantage of attractive pricing mechanisms of the platform, such as specialty engines. It also reduces the I/O costs of moving bulk raw data off-platform during operations.
- Scalability: Co-locating the digital integration hub with the transactional and batch systems provides greater scalability. It also provides high performance via memory-to-memory techniques for data integration and via event aggregation at the source of the data.
- Disaster recovery: Persisting in-memory digital integration hub instances on the mainframe to an RDBMS such as IBM Db2 for z/OS enables users to leverage their existing investments in highly reliable and well-tested mainframe disaster recovery infrastructures.
- Ease of information access: A digital integration hub on the mainframe facilitates information sharing with existing business applications that are already integrated with the mainframe.

How the digital integration hub can reduce time to value

A digital integration hub abstracts and decouples the underlying data contexts and formats from the consuming applications, allowing for greater flexibility when adding new consuming applications.
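The decoupling of source formats from consuming applications can be sketched as a registry of format "templates" that translate raw source records into the hub's canonical fields. Everything here is a hypothetical illustration — the registry, the field names, and the translation logic are invented, and real payment formats such as ISO 8583 are far richer than these toy mappings:

```python
# Sketch: a registry of format templates in a data abstraction layer.
# Consuming applications read canonical fields only; adding support for
# a new source format means registering one more template, with no
# changes to consumers. All names and fields are illustrative.

TEMPLATES = {}

def register_template(name, parser):
    """Register a translation from one source format to canonical fields."""
    TEMPLATES[name] = parser

def to_canonical(format_name, raw):
    """Translate a raw source record into the hub's canonical fields."""
    return TEMPLATES[format_name](raw)

# Two hypothetical templates mapping source fields to canonical names.
register_template("ach",    lambda raw: {"amount": raw["amt"],   "account": raw["acct"]})
register_template("custom", lambda raw: {"amount": raw["value"], "account": raw["id"]})

payment = to_canonical("ach", {"amt": 125.00, "acct": "0042"})
```

This is the "implement once and reuse" idea in miniature: each source format is handled once, in the abstraction layer, and every consuming application benefits.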
The digital integration hub also reduces the complexity of joining, aggregating, and computing information from multiple systems of record on the mainframe, without requiring changes to those systems. A digital integration hub cache can also serve multiple consuming applications and scenarios, providing "implement once and reuse" benefits.

An ecosystem of integrated business solutions that leverage the digital integration hub on the mainframe can further reduce time to value. For example, it may allow users to avoid custom application coding and maintenance costs by using packaged business solutions that call the hub's APIs. The digital integration hub can enable financial services businesses to build ACH, ISO 20022 (SWIFT), ISO 8583-1 (credit card format), and custom templates into the data abstraction and virtualization layer, enabling interaction with the systems of record without impacting their performance.

A digital integration hub underpinned by in-memory computing is rapidly becoming a foundational technology for powering digital transformations. For use cases in which the center of gravity of a company's operational data resides on a mainframe, a digital integration hub deployed on the mainframe provides a vital, cost-effective, performant, and secure solution for enabling high-value information flow between systems of record and business processes across the hybrid cloud environment.

Nikita Ivanov is founder and CTO of GridGain Systems, where he has led the development of advanced and distributed in-memory data processing technologies. He has more than 20 years of experience in software application development, building HPC and middleware platforms, and contributing to the efforts of companies including Adaptec, Visa, and BEA Systems.

—

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth.
The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to newtechforum@infoworld.com.