In the Build 2023 session list, you will find a reference to how developers can build a “company copilot” using Azure ML and GPT-4. This is, I suspect, the first big step since the Bing chatbot toward adapting Microsoft’s latest AI innovations for use by third parties.
“Large AI models and AI-embedded applications like ChatGPT are transforming the way we live and work, made possible by the confluence of advancements in big data, algorithms, and powerful AI supercomputers,” the listing for the session notes. “Harnessing these technologies for real-world applications requires purpose-built tooling to enable effective prompt engineering, experimentation, and safety mechanisms that deliver great customer experiences.”
The speaker is Gregory Buehrer, a Microsoft Distinguished Engineer and the CTO of Azure Machine Learning.
Somewhat related to this, Microsoft last night also posted about how developers can get started with OpenAI in .NET. This is the start of a months-long, multi-article series about “AI-related building blocks to help you add OpenAI-powered AI capabilities to your .NET applications.” It explains that GPT and related models are accessed via REST APIs and libraries, and that the Azure OpenAI Service provides access to OpenAI’s advanced language AI models like GPT-4.
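To give a rough sense of the REST-based access pattern the series describes, here is a minimal sketch of how a chat-completions request to the Azure OpenAI Service might be assembled. The resource name, deployment name, API version, and key below are all placeholders of my own, not values from Microsoft’s post, and the request is only constructed, not sent; I’m using Python for brevity even though the article’s series targets .NET.

```python
import json

# Placeholder values -- substitute your own Azure OpenAI resource details.
RESOURCE = "my-resource"      # Azure OpenAI resource name (hypothetical)
DEPLOYMENT = "gpt-4"          # model deployment name (hypothetical)
API_VERSION = "2023-05-15"    # example REST API version string

def build_chat_request(prompt: str) -> dict:
    """Assemble the URL, headers, and JSON body for a chat-completions call.

    The request is only constructed here, not sent; pass the result to any
    HTTP client (e.g. requests.post) with a real api-key to execute it.
    """
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
    )
    headers = {
        "Content-Type": "application/json",
        "api-key": "<YOUR-API-KEY>",  # placeholder; keep real keys out of source
    }
    body = {"messages": [{"role": "user", "content": prompt}]}
    return {"url": url, "headers": headers, "body": json.dumps(body)}

request = build_chat_request("Summarize the Build 2023 session list.")
```

In practice, Microsoft’s recommendation (per the quote below from Luis Quintanilla) is to use the Azure OpenAI .NET SDK rather than hand-rolling HTTP calls like this; the sketch just shows what the SDK abstracts away.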
What’s curious here is that Microsoft is both partnering with and competing against OpenAI.
“The [Azure OpenAI Service] APIs are co-developed with OpenAI to ensure compatibility and a smooth transition between the two,” Microsoft’s Luis Quintanilla explains. “Customers also benefit from private networking, regional availability, and responsible AI content filtering. We recommend using the Azure OpenAI .NET SDK which supports both OpenAI and Azure OpenAI Service.”
There will be more soon, which makes sense: Build is less than six weeks away, and with Microsoft CTO Kevin Scott providing an AI keynote, I suspect there will be a lot of related news at the show.