After months of silence, Amazon today announced that it will take on ChatGPT, Microsoft's Bing chatbot, and Google Bard with Bedrock, a way for customers to build and scale generative AI-based applications using foundation models (FMs).
“Recently, generative AI applications like ChatGPT have captured widespread attention and imagination,” AWS vice president Swami Sivasubramanian writes in the announcement post. “At AWS, we have played a key role in democratizing ML and making it accessible to anyone who wants to use it, including more than 100,000 customers of all sizes and industries. AWS has the broadest and deepest portfolio of AI and ML services at all three layers of the stack. That’s why today I’m excited to announce several new innovations that will make it easy and practical for our customers to use generative AI in their businesses.”
Bedrock provides access to a range of powerful FMs for text and images. These include Amazon’s own Titan FMs, two new large language models (LLMs) that Amazon also announced today, all made available through a scalable, reliable, and secure AWS-managed service. Bedrock is now available in a limited preview.
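Amazon has not published full API details for the limited preview, but because Bedrock is delivered as a managed AWS service, invoking one of its hosted models will presumably follow the familiar AWS SDK pattern. The sketch below illustrates what that might look like with boto3; the client name, model identifier, and request payload are assumptions, not documented specifics from today's announcement.

```python
# Illustrative sketch only: Bedrock is in limited preview, so the client name,
# model ID, and request schema shown here are assumptions based on the usual
# AWS SDK conventions, not confirmed details from the announcement.
import json

import boto3

# Hypothetical runtime client for invoking a Bedrock-hosted foundation model.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Assumed request payload for a Titan text model; the real schema may differ.
body = json.dumps({
    "inputText": "Summarize the benefits of managed foundation model services.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.7},
})

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # placeholder model identifier
    contentType="application/json",
    accept="application/json",
    body=body,
)

# The service returns a streaming body containing the generated text as JSON.
print(json.loads(response["body"].read()))
```

The appeal of this model is that customers consume FMs through standard SDK calls and IAM-controlled endpoints rather than provisioning and operating their own inference infrastructure.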
Additionally, Amazon announced that Amazon EC2 Trn1n instances powered by AWS Trainium and Amazon EC2 Inf2 instances powered by AWS Inferentia2, which it describes as the most cost-effective cloud infrastructure for generative AI, are now generally available. The company also made CodeWhisperer, its AI coding companion, generally available and free for individual developers. It works with languages like Python, Java, JavaScript, TypeScript, C#, Go, Kotlin, Rust, PHP, and SQL in IDEs like Visual Studio Code, IntelliJ IDEA, AWS Cloud9, and others.
You can learn more about Amazon Bedrock, the AWS Trainium-based Trn1n instances, the AWS Inferentia2-based Inf2 instances, and Amazon CodeWhisperer on the AWS website.