The need to have on-premises systems talk to public cloud systems is becoming critical, but few enterprises are prepared.

You are trying to get an end-of-quarter report out and you're having some trouble. It seems that while sales are recorded on a public cloud system, inventory is recorded on an on-premises system. You need to combine both data stores for the report, and you have no way of doing so. How was this allowed to happen?

The fact of the matter is that not all legacy systems and data can migrate to the public cloud, so those on-premises systems need to integrate with the data on the public cloud systems to function. While this was a known problem in 2011 when we started on the cloud journey, in 2018 many organizations still have not gotten around to solving it.

Enterprises typically don't think about data, process, and service integration until there is a tactical need. Even then, they typically get around the issues by pulling together a quick and dirty solution, which often involves FTP, a file drop, or even Federal Express.

The result of all this is that a lot of integration between the cloud and on-premises systems remains undone, be it data integration, process integration, or service integration. This will become a crisis in 2019 for many enterprises, because they could spend the entire year, or more, just pulling together integration solutions for their public cloud systems, which they now depend on for some mission-critical processes.

To avoid that crisis, here's what you need to do.

First, catalog all data, services, and processes, using some sort of repository to track them all. You need to do this for all on-premises systems and all public cloud systems, with the intent of understanding most of the properties so you can make sure the right things are talking to the right things.

Second, figure out logically how things need to be integrated.
This means understanding at a high level what data needs to flow where, and why. You will then break this down to a more primitive level, where you'll identify the data elements and server properties as well.

Third, pick the tools and technology you'll need to carry out the integration. Enterprises too often go directly to this step, but that only ensures they pick the wrong tools, because they don't yet know enough.

Of course, there are more complexities to deal with, such as security, governance, and networking, which will have to be figured out as well. But start with the basics I covered here, because you can't deal with the complexities until you've dealt with the basics.
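The first two steps above can be sketched in code. What follows is an illustrative, minimal example, not any particular catalog product: all the names (`Asset`, `Flow`, `cross_environment_flows`) are hypothetical, and a real repository would track many more properties. The point is simply that once assets and flows are cataloged with their locations, finding the integrations that cross the on-premises/cloud boundary becomes a trivial query.

```python
from dataclasses import dataclass

# Hypothetical, minimal catalog entries: every asset records what it is
# and where it lives, so boundary-crossing flows can be identified.
@dataclass(frozen=True)
class Asset:
    name: str
    kind: str       # "data", "service", or "process"
    location: str   # "on-prem" or "cloud"

@dataclass
class Flow:
    source: Asset
    target: Asset
    elements: list  # the data elements that move, identified in step two

def cross_environment_flows(flows):
    """Return only the flows that cross the on-prem/cloud boundary --
    the ones that need real integration work and tooling."""
    return [f for f in flows if f.source.location != f.target.location]

# Example: the end-of-quarter report from the opening scenario.
sales = Asset("sales-orders", "data", "cloud")
inventory = Asset("inventory", "data", "on-prem")
report = Asset("quarterly-report", "process", "on-prem")

flows = [
    Flow(sales, report, ["order_id", "amount", "sku"]),
    Flow(inventory, report, ["sku", "qty_on_hand"]),
]

needs_integration = cross_environment_flows(flows)
# Only the sales-to-report flow crosses the boundary; inventory-to-report
# stays on premises and needs no cloud integration.
```

Notice that the tooling decision (step three) never appears here; the catalog and the flow map are what tell you which tools you actually need.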