Amazon has launched Amazon Bedrock, a platform that enables the creation of generative AI-powered apps using pretrained models hosted on AWS by third-party startups such as AI21 Labs, Anthropic, and Stability AI. Bedrock is available in a “limited preview” and also offers access to Titan FMs, a family of foundation models trained in-house by AWS.
AWS has formed partnerships with generative AI startups in recent months, including Stability AI and Hugging Face, to bring their text-generating models onto the AWS platform, but Bedrock marks a more significant investment in the technology. Vasi Philomin, VP of generative AI at AWS, stated that AWS believes every application can be transformed by generative AI. The company recently launched a generative AI accelerator for startups and announced a partnership with Nvidia to create new infrastructure for training AI models.
Bedrock and custom models
The generative AI market could be worth close to $110 billion by 2030, and Bedrock is aimed at large customers building “enterprise-scale” AI apps, differentiating it from some of the other AI model hosting services out there. Third-party models hosted on Bedrock include AI21 Labs’ Jurassic-2 family, which are multilingual and can generate text in several languages, and Stability AI’s suite of text-to-image models. Amazon has not, however, announced formal pricing or revealed the terms of its model licensing and hosting agreements.
Through Bedrock, AWS customers can access AI models from different providers, including Amazon’s in-house Titan models, via a single API. The Titan FM family currently comprises two models: a text-generating model and an embedding model, both aimed at customers building “enterprise-scale” AI apps. The text-generating model is similar in purpose to OpenAI’s GPT-4 and can perform tasks such as writing blog posts and emails, summarizing documents and extracting information from databases. Customers can also customize any Bedrock model by providing the service with a few labeled examples.
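The single-API design means calling a hosted model looks roughly the same regardless of which provider built it. As a rough sketch (AWS had not published the API at preview time; this assumes the shape of the later boto3 `bedrock-runtime` client, and the model ID and payload field names are illustrative, not a documented schema):

```python
import json


def build_body(prompt, max_tokens=256):
    """Build an illustrative request payload for a text-generation model.
    The field names here are assumptions, not a confirmed Bedrock schema."""
    return json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens},
    })


def invoke_text_model(prompt, model_id="amazon.titan-text-express-v1"):
    """Call a hosted model through the (assumed) bedrock-runtime API.
    Requires AWS credentials and Bedrock access, so it is not run here."""
    import boto3  # AWS SDK for Python

    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(modelId=model_id, body=build_body(prompt))
    return json.loads(response["body"].read())
```

The appeal for enterprise customers is that swapping providers would mostly mean changing the `model_id` and payload shape rather than integrating a different vendor SDK.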
Philomin also described the embedding model, which translates text inputs into numerical representations that capture the semantic meaning of the text. He did not reveal which data the Titan models were trained on, emphasizing instead that the models were designed to detect and remove harmful content in customer-provided data and to filter out outputs containing hate speech, profanity and violence.
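Because embeddings are just vectors, “semantic meaning” becomes something measurable: texts with similar meanings map to vectors pointing in similar directions, typically compared with cosine similarity. A minimal illustration using hand-made toy vectors (not actual Titan output, which would have far more dimensions):

```python
import math


def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: values near 1.0 mean the
    vectors point the same way (semantically similar); near 0.0, unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Toy 3-dimensional "embeddings" chosen by hand for illustration.
cat = [0.9, 0.1, 0.0]
kitten = [0.85, 0.15, 0.05]
invoice = [0.0, 0.2, 0.95]

# "cat" should land much closer to "kitten" than to "invoice".
print(cosine_similarity(cat, kitten) > cosine_similarity(cat, invoice))  # True
```

This is the building block behind embedding-powered features such as semantic search and recommendations: store vectors for your documents, embed the query, and rank by similarity.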
Generative AI models tend to amplify biases in their training data, but Philomin played down concerns about prompt injection attacks that could coax the models into producing sexist, racist or misinformative content, saying that AWS is committed to the responsible use of the technology and is monitoring the regulatory landscape. It remains unclear who would be held liable in the event of a lawsuit: AWS customers, AWS itself, or the offending model’s creator.
CodeWhisperer, Trainium and Inferentia2 launch in GA
Amazon has made its AI-powered code-generating service, CodeWhisperer, free for developers without any usage restrictions. The move is seen as an attempt to boost uptake, as CodeWhisperer has not seen the same adoption as its rival, GitHub’s Copilot, which has over a million users, thousands of whom are enterprise customers. To win more corporate customers, Amazon has launched a CodeWhisperer Professional Tier, which adds single sign-on with AWS Identity and Access Management integration, as well as higher limits on scanning for security vulnerabilities.
Meanwhile, Amazon has launched Elastic Compute Cloud (EC2) Inf2 instances in general availability, powered by its AWS Inferentia2 chips, as well as EC2 Trn1n instances powered by AWS Trainium, Amazon’s custom-designed chip for AI training. Both are designed to speed up AI workloads, offering higher throughput and lower latency.