Public clouds are changing the way we look at IT, whether or not you're using cloud computing.

Remember when there was a distinct difference between public clouds and the systems you could see and touch in your data center? That is no longer the case. The lines are blurring between traditional systems, meaning hardware and software purchased or licensed for millions of dollars in sunk costs to sit in your own physical data centers, and public clouds with their flexibility, scalability, and instant provisioning.

Legacy or traditional systems are looking more like clouds these days, and what was once a clear decision is no longer so clear. I call this "the cloud effect."

Traditional software and hardware players have adopted much of what makes public cloud computing compelling. This includes pay-as-you-go pricing and agreements for hardware and software, and even public cloud–connected systems that sit within a data center, often called edge clouds or microclouds, such as Microsoft's Azure Stack and AWS Outposts. No longer is this a clear path.

Is this blurring a good thing? Anything that makes the use of technology more flexible and less expensive is a positive evolution, and this is no exception. You may recall that when we moved to PCs, we changed the way we leveraged mainframe and minicomputer hardware and software. The cloud effect is no different, except that it's about 100 times greater a game changer than any technological shift I've seen. So there are benefits, even for enterprises that have yet to move to a single cloud.

For sure, data centers have become "stickier," with many enterprises opting to delay migration to the cloud or cut back on the number of systems that will migrate. They are doing this for strictly business reasons, including the fact that the systems in their data centers are already becoming more cloudlike and thus more cost-effective.

The downside is that some enterprises may delay migrations for the wrong reasons. If they are seeking more speed and innovation, cloud computing is typically a better fit than traditional computing approaches. The risk is that vendors of systems that run in data centers become good at retaining customers, and at times customers make the wrong decisions for what seem to be the right reasons.

I often play devil's advocate and take the side of staying in the data center when there is too much religion around cloud. Or I become a cloud advocate when nobody wants to take on the risk and cost of making the journey to the cloud, not considering the value left on the table. There must be a compelling reason in each case.

Neither path will be a slam dunk; it's mostly going to be a mix of on-premises and cloud. The mix is the problem to solve.