LLM

summarize can use multiple providers. Pick one model and one provider, then make it boring and reliable.

Keys

  • OPENAI_API_KEY
  • XAI_API_KEY
  • GEMINI_API_KEY (also accepts GOOGLE_GENERATIVE_AI_API_KEY / GOOGLE_API_KEY)
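As a sketch, the keys above are read from the environment; the placeholder values here are illustrative, not real key formats:

```shell
# Set the key for whichever provider you picked (names from the list above).
export OPENAI_API_KEY="sk-..."

# Alternatives, one per provider:
# export XAI_API_KEY="..."
# export GEMINI_API_KEY="..."   # GOOGLE_GENERATIVE_AI_API_KEY / GOOGLE_API_KEY also accepted
```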

Practical advice

  • Pin --model for stable output.
  • When using --markdown llm, provider fallback is disabled by design.
  • For audits / tooling, prefer --json + fixed model.
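Putting the advice together, a pinned invocation might look like the following; the input path and model name are placeholders, only the `--model` and `--json` flags come from the notes above:

```shell
# Pin the model and request JSON output for stable, machine-readable results.
summarize ./input.txt --model <model-id> --json
```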