LAS VEGAS – Amazon Web Services (AWS) used its annual re:Invent conference, running through December 5, 2025, to unveil a series of enterprise-focused artificial intelligence (AI) products and services. Key announcements centered on advanced AI agents, expanded tooling for customizing large language models (LLMs), and new AI training hardware.
AWS CEO Matt Garman opened the conference on December 2 with a keynote emphasizing the growing role of AI agents in unlocking business value. "AI assistants are starting to give way to AI agents that can perform tasks and automate on your behalf," Garman said. Swami Sivasubramanian, Vice President of Agentic AI at AWS, elaborated on December 3, describing agents as tools that can generate plans, write code, and execute solutions from natural language prompts. AWS introduced "Frontier agents," including the Kiro autonomous agent, which is designed to learn a team's workflows and operate independently for extended periods, alongside agents for security processes and DevOps tasks.
To support deeper customization, AWS announced new capabilities for its Amazon Bedrock and Amazon SageMaker AI platforms. SageMaker now offers serverless model customization, letting developers build models without managing the underlying compute. Bedrock gained Reinforcement Fine Tuning, which provides pre-set workflows for automated LLM customization. AWS also launched four new Nova AI models, three for text generation and one for text and image creation, along with the Nova Forge service, which lets customers train pre-existing models on their proprietary data. The company also expanded its AgentCore platform with features such as Policy in AgentCore, for setting boundaries on agent behavior, and prebuilt evaluation systems.
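For developers, access to Bedrock-hosted models typically runs through the AWS SDK. The snippet below is a minimal, illustrative sketch of calling a model with boto3's Converse API; the model identifier is a placeholder rather than one of the newly announced Nova models, and the example is not drawn from AWS's announcement materials.

```python
# Minimal sketch: calling a Bedrock-hosted model via boto3's Converse API.
# The model ID below is a placeholder; a customized or Nova model would be
# referenced by its own identifier once provisioned in the account.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="example-model-id",  # placeholder, not a real Nova model ID
    messages=[
        {"role": "user", "content": [{"text": "Summarize this week's support tickets."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The Converse API returns the assistant message under output -> message.
print(response["output"]["message"]["content"][0]["text"])
```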
Hardware advancements included Trainium3, AWS's next-generation AI training chip, which the company says delivers up to four times the performance for AI training and inference while reducing energy consumption by 40%. AWS also revealed plans for Trainium4, designed for compatibility with Nvidia chips. Amazon CEO Andy Jassy noted on social media that the current Trainium2 chip already constitutes a "multi-billion-dollar business." AWS additionally announced "AI Factories," systems developed in partnership with Nvidia that let large corporations and governments run AWS AI systems inside their own data centers, addressing data sovereignty requirements.
Other announcements included Database Savings Plans, which offer up to a 35% reduction in database costs for customers that commit to consistent usage. Ridesharing company Lyft reported using an AI agent built on Amazon Bedrock to handle driver and rider inquiries, crediting it with an 87% reduction in average resolution time and a 70% increase in driver use of the agent.
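A support workflow of this kind would typically call a deployed Bedrock agent through the bedrock-agent-runtime client. The sketch below shows that generic pattern, assuming placeholder agent, alias, and session identifiers; it is not Lyft's implementation.

```python
# Generic sketch of invoking a deployed Amazon Bedrock agent; the agent,
# alias, and session identifiers are placeholders for illustration only.
import boto3

runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = runtime.invoke_agent(
    agentId="EXAMPLEAGENT",       # placeholder agent ID
    agentAliasId="EXAMPLEALIAS",  # placeholder alias ID
    sessionId="rider-session-001",
    inputText="I was charged for a ride that was cancelled.",
)

# invoke_agent streams its reply as an event stream; collect the chunks.
reply = ""
for event in response["completion"]:
    chunk = event.get("chunk")
    if chunk:
        reply += chunk["bytes"].decode("utf-8")

print(reply)
```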