Perhaps the biggest thing since open source or Google, LLMs may have companies fighting for supremacy, but it's the developers who come out ahead.

Two months ago, Amazon didn't make a single mention of AI on its earnings call (Google and Microsoft mentioned AI dozens of times each). This past week, by contrast, the company's cloud division, Amazon Web Services (AWS), could talk about little else. As announced by Swami Sivasubramanian, vice president of database, analytics, and machine learning at AWS, the company is all in on AI, launching new large language models (LLMs) and APIs to access them, as well as CodeWhisperer, a GitHub Copilot competitor, and more.

It's not that AWS wasn't working on AI before; Amazon has been working with AI for decades. Rather, it's now impossible to ignore AI. For developers, I've recently argued, the time is now to start learning how to put LLMs to work in your code development. AWS, never one to chase competitors, has decided it can't remain silent when everyone else is talking about the power of LLMs and AI to transform software development. Just in time, too, as RedMonk's James Governor argues that OpenAI is the new AWS. To the folks at Jina, OpenAI is the new Google. Either way, it's big, with the potential to dramatically change how the clouds compete.

OpenAI as the new AWS

When Governor calls OpenAI "the new AWS," he's not suggesting that OpenAI, the company behind ChatGPT, will be rolling out its own version of Amazon EC2 or Amazon S3 anytime soon. Rather, he's talking about the impact LLMs can have on software development.
Inspired by a discussion Governor and I had recently over lunch, I wrote about this, suggesting that "the race is on for developers to learn how to query LLMs to build and test code but also to learn how to train LLMs with context (like code samples) to get the best possible outputs." For Governor, past revolutions in developer productivity were launched by "AWS, open source, and GitHub" because "all of that stuff came together to help people learn and build." With LLMs, he continues, "We're at that point again."

LLMs lower barriers to developer productivity much like open source (no need to get purchasing's approval for a software license) and cloud (swipe a credit card rather than requisition a server). In this case, Governor says, it's not about reducing the time to gain access to software or hardware, or about collaboration (GitHub), but rather about dramatically reducing the time to learn. As he stresses, "AI makes it easier than ever to learn new skill sets."

Back to AWS. One reason for the bevy of announcements this past week is that Microsoft, not AWS, has been at the forefront of enabling developer productivity with AI. Microsoft bought GitHub years ago, and before that it developed Visual Studio, the number one IDE and code editor among developers. Together, that's a powerful one-two punch. Add OpenAI's ChatGPT, which Microsoft has built into Bing, Copilot, and other Microsoft services, and Microsoft is now in pole position to earn developer loyalty.

As Governor puts it, given that any developer using ChatGPT is running on Azure, "What about a toolchain that eliminated Azure as a gating factor for developers building apps in GitHub and [Visual Studio] Code?" In other words, could Microsoft refocus developers away from the underlying cloud infrastructure and onto the applications being built with ChatGPT? It could. Microsoft has done well with Azure, but it's still catching up to AWS.
By elevating the application experience and removing the "undifferentiated heavy lifting" of even thinking about the underlying cloud infrastructure, "Microsoft has the opportunity to create a once and future developer experience which finally and properly brings the pain to AWS," to borrow Governor's phrase.

My InfoWorld colleague David Linthicum correctly contends that "'cost savings' are a terrible way to define the value of cloud-based platforms." Instead, he posits, cloud is "about delivering the more critical business values of agility and speed to innovation." Nowhere is that more true than in this greenfield area of AI. The way the clouds surface that developer agility, rather than forcing developers to continue to muddle through mountains of different infrastructure services, will determine who wins the next $100 billion in cloud spend. And as the cloud companies contend for that spend, the biggest winners of all will be developers.

Game on.