AI and other forces are lessening the gravitational pull of public cloud platforms. This trend might be good for enterprises.

The phenomenon of data gravity moving away from the cloud continues as businesses reassess their cloud strategies. Many now realize that the initial allure of unlimited storage and processing power often comes with unforeseen costs and complications.

Historically, the cloud was championed for its promise of scalability and flexibility. Most enterprises have since encountered a reality starkly different from the marketing hype. Although metrics vary from company to company, the actual costs of operating applications and storing data in the cloud are at least 2.5 times what enterprises had expected.

Stripped of its once-invincible allure, the cloud is experiencing a migration of its own; it is no longer deemed the default destination for data. Respondents in various surveys point to higher-than-anticipated expenses as a key reason many organizations have moved significant portions of their workloads back to on-premises environments.

It’s AI’s fault

As businesses adopt generative AI technologies, AI is playing a pivotal role in pulling data gravity away from the cloud. AI applications often generate and process vast amounts of data that must be handled close to the source to minimize latency. This need for real-time processing means organizations are gravitating toward edge computing, where data is analyzed near the point of generation. By keeping data local, enterprises improve performance and avoid the cost of transferring large data sets to the cloud.

AI’s requirements for immediate data access and analysis are also prompting organizations to rethink their cloud strategies. Traditional cloud environments can introduce delays and performance bottlenecks, particularly when dealing with high-volume data streams from Internet of Things devices or real-time analytics. By migrating specific workloads back on premises or to hybrid infrastructures, companies can leverage AI more effectively, ensuring swift response times and better user experiences.

AI can also help optimize resource allocation by identifying the best location for each workload. AI-driven insights let organizations assess usage patterns, operational costs, and data flows, uncovering opportunities to streamline processes and flagging cases where on-premises systems are more advantageous than cloud storage. That analysis leads to better-informed data management strategies and, ultimately, reinforces the shift in data gravity.

Rising data security and privacy concerns

As businesses grapple with regulatory compliance and the need to safeguard sensitive information, many are opting to house their data within local infrastructure. This movement isn’t merely reactionary; it’s strategic, allowing organizations to maintain greater control and mitigate the risks associated with cloud vulnerabilities.

The need for data governance and security is escalating as AI becomes more prevalent. Organizations are increasingly aware of the risks of cloud environments, especially regarding regulatory compliance. Keeping sensitive data on premises allows for tighter controls and adherence to industry standards, which are often critical for AI applications that handle personal or confidential information.
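To make the placement analysis described above more concrete, here is a minimal sketch of how an organization might score workloads for on-premises versus cloud placement. The workload attributes, weights, and thresholds are purely illustrative assumptions, not a prescribed methodology or any vendor's tooling.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    monthly_data_gb: float    # data generated or processed per month
    latency_sensitive: bool   # needs near-real-time responses (IoT, AI inference)
    regulated_data: bool      # subject to compliance or privacy rules
    bursty: bool              # demand spikes that are hard to plan for

def placement(w: Workload) -> str:
    """Toy scoring: a positive score favors on-prem, negative favors cloud.
    The weights below are illustrative assumptions only."""
    score = 0
    score += 2 if w.latency_sensitive else 0          # keep data near its source
    score += 2 if w.regulated_data else 0             # tighter control, easier compliance
    score += 1 if w.monthly_data_gb > 10_000 else 0   # transfer and egress costs add up
    score -= 3 if w.bursty else 0                     # elasticity is where the cloud shines
    return "on-premises" if score > 0 else "cloud"

workloads = [
    Workload("iot-telemetry-analytics", 50_000, True, False, False),
    Workload("quarterly-campaign-site", 200, False, False, True),
    Workload("claims-ml-training", 30_000, False, True, False),
]

for w in workloads:
    print(f"{w.name}: {placement(w)}")
```

Latency, data volume, cost, and compliance are exactly the kinds of factors that drive the hybrid decisions discussed next.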
The convergence of these factors signals a broader reevaluation of cloud-first strategies, leading to hybrid models that balance the benefits of cloud computing with the reliability of traditional infrastructure. This hybrid approach allows a tailored fit for each workload, optimizing performance while ensuring compliance and security.

The best place for your data

Data can exist on any platform, and in theory its location should be transparent: Accessibility should be the same whether data resides in a public cloud or on premises. In practice, where you store data determines both how much an enterprise spends and how accessible that data is to major strategic applications, including AI. Currently, on-prem is the most cost-effective AI platform for most data sets and most solutions. Obviously, I’m speaking in generalities; no one-size-fits-all solution exists for AI platforms.

What about scalability?

Most enterprises that keep data in the cloud understand their data growth patterns and have plenty of time to adjust capacity, such as storage. If your data load genuinely bursts up and down, the cloud has a role. However, most enterprises don’t experience unmanageable bursts, so they end up paying a premium for scalability they rarely use.

On-prem data storage systems are typically much cheaper than their analogs from public cloud providers, and their reliability approaches that of the cloud. Of course, nothing will be done for you the way it is in the cloud, but most enterprises already have employees who manage on-prem systems.

The benefits of data gravity moving to on-prem are clear. On-prem data storage does not diminish the value of public cloud computing; it simply provides a better alternative that allows enterprises to extract more value from their data systems. At the end of the day, it’s all about value.
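The cost argument is easy to sanity-check against your own numbers. Below is a minimal back-of-the-envelope sketch in Python; the per-gigabyte storage price, egress fee, hardware amortization rate, and operations overhead are placeholder assumptions, not any provider's actual pricing, so substitute real quotes before drawing conclusions.

```python
def cloud_monthly_cost(stored_gb: float, egress_gb: float,
                       storage_per_gb: float = 0.023,
                       egress_per_gb: float = 0.09) -> float:
    """Hypothetical cloud object-storage bill: storage plus data egress."""
    return stored_gb * storage_per_gb + egress_gb * egress_per_gb

def onprem_monthly_cost(stored_gb: float,
                        hardware_per_gb_month: float = 0.010,
                        ops_overhead: float = 500.0) -> float:
    """Hypothetical on-prem cost: amortized hardware plus flat operations overhead."""
    return stored_gb * hardware_per_gb_month + ops_overhead

stored, egress = 200_000, 40_000   # 200 TB stored, 40 TB pulled out per month
cloud = cloud_monthly_cost(stored, egress)
onprem = onprem_monthly_cost(stored)
print(f"cloud:   ${cloud:,.0f}/month")
print(f"on-prem: ${onprem:,.0f}/month")
print(f"ratio:   {cloud / onprem:.1f}x")
```

In this toy scenario, egress charges account for much of the gap, which echoes the data-transfer concerns raised earlier; your own mileage will depend entirely on the real numbers you plug in.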