Rig has released version 0.35, bringing broader tool-calling compatibility across providers, more reliable streaming, and dynamic model discovery. The release adds programmatic model listing for providers such as Ollama, Anthropic, OpenAI, and Gemini, making tooling easier to build. It also adds GPT Image 1.5 support and fixes that preserve conversation history across multi-turn interactions, avoiding failures on stricter providers. Further enhancements cover observability and prompt caching in rig-bedrock, strengthening the overall performance and usability of the AI library written in Rust.

Rig: Rig is an open-source Rust library for building modular, ergonomic, and production-ready LLM-powered applications and full-stack AI agents. It provides unified client interfaces for providers including OpenAI, Anthropic, Mistral, Gemini, Ollama, and local backends. The v0.35 release introduces tool-calling support for local backends like llama.cpp, GPT Image 1.5 support, dynamic model listing, reliable multi-turn streaming, and Bedrock prompt caching.
OpenAI: OpenAI develops advanced multimodal large language models in the GPT series, enabling applications in text, vision, and tool use via APIs. Their models power widespread AI integrations. Rig v0.35 adds support for GPT Image 1.5, fixes OpenAI-compatible streaming and tool execution, and includes model listing capabilities.
Anthropic: Anthropic creates safe, interpretable AI systems with the Claude family of models, offered through scalable APIs. They prioritize reliability in conversational AI. Rig v0.35 enables model listing for Anthropic and updates constants to current Claude families for seamless integration.
llama.cpp: llama.cpp is an open-source C/C++ library for lightweight LLM inference on consumer hardware using the GGUF format. It supports optimized execution across CPUs, GPUs, and recent backends like OpenVINO. Rig v0.35 incorporates llama.cpp tool-calling handling with fixes for OpenAI-compatible local backends.
gold_electrum: gold_electrum is an X user actively involved in the Rig project, posting detailed release announcements. They highlighted the v0.35.0 launch, emphasizing llama.cpp support, dynamic model listing, streaming improvements, and regression tests. Their contributions promote Rig within the Rust AI developer community.
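The "unified client interfaces" idea described above can be sketched in a few lines. This is a hypothetical illustration, not Rig's actual API: the trait name, mock provider types, and method signature are invented for the example. The point is that each backend implements one shared trait, so application code is written once against the abstraction and works with any provider.

```rust
// Hypothetical sketch of a unified provider interface (not Rig's real types).
// Each backend implements the same trait, so callers never depend on a
// specific provider's client.
trait CompletionModel {
    fn complete(&self, prompt: &str) -> String;
}

// Mock backends standing in for real provider clients.
struct MockOpenAi;
struct MockOllama;

impl CompletionModel for MockOpenAi {
    fn complete(&self, prompt: &str) -> String {
        format!("openai:{prompt}")
    }
}

impl CompletionModel for MockOllama {
    fn complete(&self, prompt: &str) -> String {
        format!("ollama:{prompt}")
    }
}

// Application code written once against the trait object.
fn run(model: &dyn CompletionModel, prompt: &str) -> String {
    model.complete(prompt)
}

fn main() {
    println!("{}", run(&MockOpenAi, "hi"));
    println!("{}", run(&MockOllama, "hi"));
}
```

Swapping providers then becomes a one-line change at the call site, which is what makes a release like v0.35 able to extend the same tool-calling and model-listing surface to backends such as llama.cpp.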

Dynamic Discovery: Rig now supports programmatic model listing from providers like Ollama, Anthropic, OpenAI, and Gemini to simplify tooling development.
Observability Boost: rig-bedrock gains OpenTelemetry metadata and prompt caching for explicit system content and messages.
Streaming Reliability: Fixes in v0.35 preserve full conversation history across tool calls, avoiding prompt duplication on stricter providers.
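The history-preservation fix above can be illustrated with a minimal sketch. The message enum and helper below are hypothetical (not Rig's actual types); they show why a multi-turn tool-call flow must append each step to one shared history rather than rebuild it per request, since stricter providers reject histories that duplicate or drop earlier turns.

```rust
// Hypothetical message model (not Rig's real API) illustrating why full
// conversation history must be preserved across tool calls.
#[derive(Debug, Clone, PartialEq)]
enum Message {
    User(String),
    ToolCall { name: String, args: String },
    ToolResult(String),
    Assistant(String),
}

/// Runs one user turn that triggers a tool call, appending every step to
/// the shared history exactly once instead of rebuilding it from scratch.
fn run_turn(history: &mut Vec<Message>, prompt: &str) {
    history.push(Message::User(prompt.to_string()));
    // The model requests a tool; record the call and its result once.
    history.push(Message::ToolCall {
        name: "add".into(),
        args: "2,3".into(),
    });
    history.push(Message::ToolResult("5".into()));
    history.push(Message::Assistant("The sum is 5.".into()));
}

fn main() {
    let mut history = Vec::new();
    run_turn(&mut history, "What is 2 + 3?");
    // The user prompt appears exactly once in the history that would be
    // sent back to the provider on the next turn -- no duplication.
    let user_turns = history
        .iter()
        .filter(|m| matches!(m, Message::User(_)))
        .count();
    assert_eq!(user_turns, 1);
    println!("{history:?}");
}
```

The failure mode the v0.35 fixes address is the opposite pattern: reconstructing the request per tool call, which either duplicates the original prompt or loses intermediate tool results.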