AWS said the expanded partnership introduces three limited preview offerings on Amazon Bedrock: the latest OpenAI models; Codex on Amazon Bedrock; and Amazon Bedrock Managed Agents, powered by OpenAI. The company framed the rollout around a common enterprise demand: access to advanced AI systems without giving up the security, governance and operating controls already used in production on AWS. The additions are intended to place OpenAI capabilities inside existing Bedrock workflows rather than force customers onto separate infrastructure or management layers.
Key Highlights
- OpenAI models become available on Amazon Bedrock in limited preview
- Codex launches on Amazon Bedrock for enterprise software development
- Bedrock Managed Agents, powered by OpenAI, enters limited preview
- AWS says OpenAI usage can count toward cloud commitments
- Bedrock Managed Agents uses OpenAI agent harness on AWS
The OpenAI models will be available through the same Amazon Bedrock APIs and controls customers already use for model access, fine-tuning and orchestration. AWS said that lets customers compare and deploy OpenAI models alongside offerings from Anthropic, Meta, Mistral, Cohere and Amazon through one service. Those models inherit existing Bedrock controls, including IAM-based access management, AWS PrivateLink connectivity, guardrails, encryption at rest and in transit, logging through AWS CloudTrail and links to current compliance frameworks. AWS also said customers can apply OpenAI model usage toward existing cloud commitments and combine that spending with other AWS workloads.
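Because the models sit behind the same Bedrock APIs customers already use, calling one should look like any other Bedrock invocation. A minimal sketch using boto3's standard Converse API follows; the model ID `openai.example-model-v1:0` is a placeholder, since AWS has not published the preview model identifiers, and the call assumes AWS credentials with Bedrock access are configured.

```python
# Sketch: invoking an OpenAI model through the standard Amazon Bedrock
# Converse API. The model ID below is hypothetical; real preview IDs will
# differ. Requires configured AWS credentials with Bedrock permissions.

def build_converse_request(model_id: str, prompt: str, max_tokens: int = 512) -> dict:
    """Assemble the keyword arguments for a bedrock-runtime Converse call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens, "temperature": 0.2},
    }

def ask(model_id: str, prompt: str) -> str:
    """Send one prompt to Bedrock and return the model's text reply."""
    import boto3  # deferred so the request builder works without the AWS SDK

    client = boto3.client("bedrock-runtime")  # region taken from AWS config
    response = client.converse(**build_converse_request(model_id, prompt))
    return response["output"]["message"]["content"][0]["text"]

if __name__ == "__main__":
    print(ask("openai.example-model-v1:0", "Summarize our deployment runbook."))
```

Because the request shape is the Converse API's, swapping in an Anthropic, Meta or Amazon model is a one-line `modelId` change, which is the side-by-side comparison AWS describes.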
Codex on Amazon Bedrock extends that approach to software development. AWS said the OpenAI coding agent is now available in limited preview inside the AWS environments where enterprise teams already build and run systems at scale. Customers can sign in with AWS credentials, run inference through Amazon Bedrock infrastructure and count Codex usage toward AWS cloud commitments. The service is available through the Bedrock API, beginning with the Codex CLI, the Codex desktop app and a Visual Studio Code extension. AWS said more than 4 million people use Codex each week for tasks that include writing code, refactoring, explaining systems, generating tests and speeding software delivery.
The new managed agent service focuses on the added pieces required to run AI agents in production. AWS said capable agents need more than model intelligence, including memory that persists across sessions, encoded skills, permissions controls and task-suited compute. Bedrock Managed Agents, powered by OpenAI, is designed to combine OpenAI frontier models and agent capabilities with AWS infrastructure and services already used by millions of organizations. AWS said the service is built with the OpenAI agent harness and optimized for OpenAI models on AWS, with features aimed at faster execution, stronger reasoning and dependable control over long-running tasks.
AWS said each managed agent gets its own identity, records every action for auditability and operates inside the customer's environment, with all model inference running on Amazon Bedrock. The company added that organizations scaling to hundreds of thousands of agents can use AWS global infrastructure while staying close to their data, applications and services. Bedrock AgentCore serves as the default compute environment for Bedrock Managed Agents, and AWS said the two services together will add capabilities including authorization policy enforcement, agent and tool discovery, plus observability and evaluation. The company described the launches as an early step in a deeper collaboration with OpenAI that will keep bringing newer model and agent advances to Amazon Bedrock.
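Since per-agent actions and inference calls are logged through AWS CloudTrail, the audit trail AWS describes can be queried with the standard CloudTrail API. A sketch of pulling recent Bedrock activity follows; the event-record parsing assumes CloudTrail's usual schema (a JSON `CloudTrailEvent` payload with `requestParameters`), and the exact fields emitted for managed agents in preview may differ.

```python
# Sketch: auditing recent Bedrock activity via AWS CloudTrail, per the
# article's note that inference and agent actions are logged there. The
# record layout assumed here is CloudTrail's standard event schema.
import json

def invoked_model_ids(events: list[dict]) -> list[str]:
    """Extract the requested modelId from CloudTrail event records."""
    ids = []
    for event in events:
        detail = json.loads(event["CloudTrailEvent"])  # payload is a JSON string
        model_id = detail.get("requestParameters", {}).get("modelId")
        if model_id:
            ids.append(model_id)
    return ids

def recent_bedrock_calls() -> list[str]:
    """Look up recent events emitted by the Bedrock service."""
    import boto3  # deferred so the parser above works without the AWS SDK

    client = boto3.client("cloudtrail")
    page = client.lookup_events(
        LookupAttributes=[{"AttributeKey": "EventSource",
                           "AttributeValue": "bedrock.amazonaws.com"}]
    )
    return invoked_model_ids(page["Events"])
```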
What This Means (Our Analysis)
This expansion matters because it narrows the gap between access to advanced OpenAI systems and the operational demands of enterprise computing. Instead of asking organizations to bolt on separate controls, AWS is placing those capabilities inside the cloud environment many already use for security, logging, procurement and compliance.
The practical shift is in how AI moves from testing to day-to-day deployment. By combining model access, coding assistance and managed agents under Amazon Bedrock, AWS is packaging OpenAI tools in a way that reduces setup friction and keeps governance close to the workload, which is where enterprise adoption often gets decided.
Our Take: The partnership is pushing enterprise AI toward a more unified operating model. If customers can evaluate models, run coding workflows and deploy agents through one service while applying usage to existing cloud commitments, the path from experiment to scaled use becomes much clearer.