Configurable Models

Create a single model instance that can be swapped at runtime via config. Use configurable_fields to expose temperature, model name, and provider as runtime parameters — no code changes needed.

Quick Reference

  • init_chat_model() with no model name = fully configurable at runtime
  • Pass config={'configurable': {'model': 'gpt-4.1'}} to any invoke/stream/batch call
  • configurable_fields=('model', 'temperature', 'max_tokens') exposes specific fields
  • config_prefix='first' scopes config keys when chaining multiple configurable models
  • bind_tools() and with_structured_output() work on configurable models

Why Configurable Models

Hardcoding a model name means redeploying to change it. Configurable models let you define the chain once and swap the model at invocation time — useful for A/B testing, user-controlled model selection, multi-tenant deployments, and debug workflows where you want to try different models without touching code.