Amazon continues to bring the latest Anthropic generative AI models to its cloud.


The most up-to-date version of Anthropic’s Claude AI, Claude 3 Opus, is now available on Amazon’s Bedrock managed AI service, bringing higher performance on more open-ended tasks to enterprise developers who rely on Amazon for generative AI model access.

According to an announcement today from Amazon, Claude 3 Opus roughly doubles the accuracy of the AI’s responses to difficult, novel, and open-ended questions. By offering Claude 3 Opus via Bedrock, Amazon said it is trying to enable enterprise developers to build more robust, feature-rich generative AI applications for use cases such as complex financial forecasting and R&D.

Task automation, research, and even high-level tasks like formulating strategy are all within the reach of Claude 3 Opus, Amazon said. “As enterprise customers rely on Claude across industries like healthcare, finance and legal research, improved accuracy is essential for safety and performance,” the statement said.

Amazon is also a direct investor in Anthropic: late last month it announced an additional $2.75 billion in funding, bringing its total investment in the company to $4 billion.

Claude 3 Opus was released in March, and its inclusion in Bedrock – a cloud platform designed to let developers work with a range of different generative AI models through a common API – continues the trend of Amazon bringing the latest models from Anthropic to its platform. Opus, the latest and most capable version of Claude, produces fewer hallucinations, handles visual input better, and makes fewer incorrect refusals to perform harmless tasks, Anthropic said at the time.
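For developers, that common API is the Bedrock Runtime interface, which accepts model-specific request bodies behind a single invocation call. The sketch below, using the AWS boto3 SDK, shows roughly what calling Claude 3 Opus through Bedrock can look like; the model identifier and the Messages-style request body are assumptions based on how Anthropic models are generally exposed on Bedrock, not details from Amazon’s announcement, so check the Bedrock console for the exact values available in your region.

```python
# Minimal sketch: invoking Claude 3 Opus through the Bedrock Runtime API with boto3.
# The model ID and request body format are assumptions; verify them in your account.
import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",  # assumed Messages API version string
    "max_tokens": 512,
    "messages": [
        {"role": "user", "content": "Summarize the key risks in this quarterly forecast."}
    ],
}

response = client.invoke_model(
    modelId="anthropic.claude-3-opus-20240229-v1:0",  # assumed Claude 3 Opus model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps(body),
)

# The response body is a stream of JSON; the generated text sits under "content".
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```

Because every model on Bedrock is reached through the same invoke_model call, swapping in a different provider’s model is largely a matter of changing the model ID and the shape of the request body.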

“It exhibits near-human levels of comprehension and fluency on complex tasks, leading the frontier of general intelligence,” the company said.

Amazon’s strategy with generative AI is shifting, CEO Andy Jassy said last week in his annual shareholder letter, moving away from in-house, consumer-facing AI applications and toward systems like Bedrock, which let the company sell AI services over the web to business customers.

It’s an area where huge hyperscalers like Amazon have a key advantage, according to experts; actually operating the LLMs that underpin generative AI and its associated applications requires the type of vast computing infrastructure that only major platform providers and the largest corporations can afford.

Bedrock competes with similar offerings from other hyperscalers, including Microsoft’s Azure AI Studio and Google’s Vertex AI Generative AI Studio. It lacks Azure AI Studio’s access to OpenAI models and offers fewer AI models overall, but prompt engineering and certain types of app development tend to cost less on Amazon’s platform.
