The hyperscalers now offer multicloud ops tools. Cloud-native tools sound good in theory, but here are a few other things to keep in mind.

The rise of cloudops tools (such as AIops) is in full swing. There are three basic choices: a non-native tool delivered on demand, a tool hosted on premises, or a cloud-native tool hosted by a public cloud provider. Which door should you choose?

The on-demand, non-native category includes the majority of AIops tools that run on a hosting service, sometimes on a public cloud. The wide variety of the tools' options drives this choice more than the preferred deployment model. If more on-premises systems need to be monitored and controlled, that's better accomplished with on-premises hosting, because the data does not need to flow back to a centralized hosting service over the open internet. At times it may make sense for the ops tool to run in both places, and some tools can do that in coordinated ways. If it's a solid tool, it should not matter how you deploy it.

Cloud-native tools are owned by a specific cloud provider. They were created to monitor and control that provider's own native cloud services, but they can also monitor and control services on other clouds. This support for multicloud deployments is logical when you consider the growing number of multicloud configurations. However, you need to consider the capabilities of the tool now, as well as its ability to address future needs as your deployments become more complex and heterogeneous over time.

At this moment, I could make the argument that using a native tool is a good idea. Most enterprises follow an 80/20 rule when deploying to multiple cloud brands: 80% of the applications and data reside in one cloud brand while the other 20% reside in other brands, for example 80% Microsoft, 15% AWS, 5% Google. Thus, it may make sense to leverage a cloud-native ops tool that does a better job of supporting its own native services and can also be deployed as a multicloud ops tool that supports other public clouds. The mix makes sense given your ops approach, at least for now.

The trouble with multicloud is that it's always changing. Although the mix in the example above is the state today, tomorrow's market may include two more public clouds, say IBM and Oracle, as well as a more even spread of applications and data across cloud brands. We could even see a common deployment pattern where a single public cloud holds less than 30% of the workloads and data on average, with the rest distributed across four or more public clouds as part of the multicloud.

Here's the question that comes up: If you use a single cloud-native tool running on a single public cloud provider and it can monitor and control other cloud brands as well, should you select that ops tool? The answer is probably no, and it has nothing to do with the tool being native to a specific public cloud provider. It's the architectural reality that ops tools need to be centralized and decoupled from the platforms they control. They need to support the monitoring and management of all public clouds in your multicloud, as well as most traditional on-premises systems.

A hosted cloud-native tool (option 3) could solve your problems in the short run. However, in the long run, your cloudops tool needs to run on a neutral platform to ensure the most effective solutions now and into the future.
Therefore, the best cloudops tool choices lie in options 1 (hosted, on demand) or 2 (on premises), or both.
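To make the idea of a neutral, decoupled ops layer concrete, here is a minimal sketch in Python. The names used here (CloudAdapter, MulticloudOps, collect_metrics, and the canned metrics) are hypothetical illustrations, not any vendor's actual API. The point is only that the central ops logic depends on a thin, provider-agnostic interface, so cloud brands can be added or dropped without rewriting the core.

```python
# A minimal sketch of a cloud-neutral ops layer. Every class and method name
# here is a hypothetical illustration, not a real vendor API: the orchestrator
# depends only on a thin adapter interface, so adding or dropping a cloud
# brand never touches the core ops logic.

from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Metric:
    """A normalized observation, regardless of which cloud produced it."""
    provider: str
    resource: str
    name: str
    value: float


class CloudAdapter(ABC):
    """One adapter per cloud brand; each hides that brand's native APIs."""

    @abstractmethod
    def collect_metrics(self) -> list[Metric]:
        ...


class AwsAdapter(CloudAdapter):
    def collect_metrics(self) -> list[Metric]:
        # A real tool would call the provider's monitoring API here;
        # canned data keeps the sketch self-contained and runnable.
        return [Metric("aws", "vm-1", "cpu_percent", 72.0)]


class AzureAdapter(CloudAdapter):
    def collect_metrics(self) -> list[Metric]:
        return [Metric("azure", "vm-9", "cpu_percent", 41.0)]


class MulticloudOps:
    """The centralized, platform-neutral layer described above."""

    def __init__(self, adapters: list[CloudAdapter]):
        self.adapters = adapters

    def snapshot(self) -> list[Metric]:
        # Aggregate observations across every registered cloud in one pass.
        return [m for a in self.adapters for m in a.collect_metrics()]


if __name__ == "__main__":
    ops = MulticloudOps([AwsAdapter(), AzureAdapter()])
    for metric in ops.snapshot():
        print(f"{metric.provider}/{metric.resource}: {metric.name}={metric.value}")
```

Whatever hosting option you pick, this shape is what keeps the tool portable: if the 80/20 mix shifts or a fifth cloud shows up, you write another adapter rather than replace the ops platform.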