A daily log of product changes, improvements, and new model releases on OpenRouter

Product changes

  • Fusion server tool now available via API and chatroom — Fusion is now available as an API plugin, a server tool, and in the chatroom composer. It routes your prompt to multiple models in parallel and synthesizes their responses into a single, higher-quality answer.
  • Automatic prompt caching in the Responses API — The Responses API now accepts top-level cache_control for Anthropic Claude models, closing a gap where cached input tokens stayed at zero on /api/v1/responses. Docs
  • Logs page pagination toggle — The Generations tab on the logs page now lets you switch between “Load more” and traditional paged navigation, with your preference saved across sessions.
  • Replit community guide — Added a community guide for configuring your OpenRouter API key in Replit projects.
  • Fixed: Codex multi-turn reasoning context lost through OpenRouter — Codex CLI and Desktop conversations proxied through OpenRouter no longer lose reasoning context across turns, fixing issues where the model would repeat completed steps.
  • Fixed: chatroom artifacts ignoring prior prompts — Follow-up artifact generations now include the full prompt history, preventing the model from generating unrelated content when iterating.
  • Fixed: routing preferences ignored with fallbacks disabled — preferred_min_throughput and preferred_max_latency now take effect when allow_fallbacks is set to false. Docs
  • Fixed: mobile sidebar on iPad-size viewports — The sidebar drawer on tablet-sized screens no longer renders behind the backdrop overlay.
  • Fixed: Recraft vector model SVG output — Recraft vector model variants now return properly encoded image/svg+xml data URLs instead of incorrectly labeled raster output. Docs
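As a rough illustration of the Responses API caching change above, here is a minimal sketch of a request body that sets top-level cache_control for a Claude model. The model slug, the input shape, and the exact placement of cache_control are assumptions for illustration; confirm the real payload shape in the linked docs.

```python
import json

# Hypothetical /api/v1/responses request body with top-level cache_control.
# Field names follow Anthropic's prompt-caching convention ("ephemeral");
# the model slug and input structure are illustrative, not authoritative.
payload = {
    "model": "anthropic/claude-3.5-sonnet",
    "input": [
        {"role": "system", "content": "Long, reusable system prompt goes here..."},
        {"role": "user", "content": "Summarize the document."},
    ],
    # Marks the prompt prefix as cacheable, so repeated calls should report
    # non-zero cached input tokens instead of staying at zero.
    "cache_control": {"type": "ephemeral"},
}

body = json.dumps(payload)
# To send: POST https://openrouter.ai/api/v1/responses with an
# Authorization: Bearer <OPENROUTER_API_KEY> header (not executed here).
print(body)
```

Repeating a call with an identical cached prefix is what surfaces the previously missing cached-input-token counts.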
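For the routing-preferences fix, a sketch of a request that disables fallbacks while keeping throughput and latency preferences may help. Nesting preferred_min_throughput and preferred_max_latency inside the provider routing object is an assumption based on where allow_fallbacks lives, and the numeric values are placeholders; check the linked docs for the exact field names and units.

```python
import json

# Hypothetical chat completions request body: fallbacks disabled, with
# routing preferences that now take effect after this fix.
payload = {
    "model": "anthropic/claude-3.5-sonnet",
    "messages": [{"role": "user", "content": "Hello"}],
    "provider": {
        "allow_fallbacks": False,        # pin routing to the preferred providers only
        "preferred_min_throughput": 50,  # illustrative tokens/sec floor
        "preferred_max_latency": 2,      # illustrative latency ceiling
    },
}

# POST this to https://openrouter.ai/api/v1/chat/completions with your
# API key in the Authorization header (request not executed here).
print(json.dumps(payload, indent=2))
```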
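To verify the Recraft SVG fix on the client side, one way is to check the media type declared in the returned data URL before decoding. The sample SVG below is a stand-in for a real generated image; the parsing logic is a generic data-URL sketch, not an OpenRouter SDK call.

```python
import base64

# Stand-in for a data URL returned by a Recraft vector variant.
sample_svg = b"<svg xmlns='http://www.w3.org/2000/svg'></svg>"
data_url = "data:image/svg+xml;base64," + base64.b64encode(sample_svg).decode()

# A correctly encoded vector result declares the SVG media type in the
# header rather than a raster type like image/png.
header, b64_payload = data_url.split(",", 1)
is_svg = header.startswith("data:image/svg+xml")
svg_markup = base64.b64decode(b64_payload).decode()
print(is_svg, svg_markup[:40])
```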