# summarize

## LLM
`summarize` can use multiple providers. Pick one model and one provider, then make it boring and reliable.
### Keys
- `OPENAI_API_KEY`
- `XAI_API_KEY`
- `GEMINI_API_KEY` (also accepts `GOOGLE_GENERATIVE_AI_API_KEY` / `GOOGLE_API_KEY`)
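A minimal sketch of setting one provider key before running the tool. The variable names come from the list above; the placeholder values are illustrative, not real keys:

```shell
# Export exactly one provider key for the provider you picked.
export OPENAI_API_KEY="sk-..."        # OpenAI

# Alternatives, if you use another provider:
# export XAI_API_KEY="..."            # xAI
# export GEMINI_API_KEY="..."         # Google; GOOGLE_GENERATIVE_AI_API_KEY
#                                     # or GOOGLE_API_KEY also work
```

Keeping the key in your shell environment (rather than on the command line) keeps it out of shell history and process listings.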
### Practical advice
- Pin `--model` for stable output.
- When using `--markdown llm`, provider fallback is disabled by design.
- For audits / tooling, prefer `--json` + a fixed model.
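The advice above can be sketched as one audit-friendly invocation. The `--model` and `--json` flags are from this document; the input path, model id, and argument order are assumptions for illustration only:

```shell
# Pin the model and request JSON so repeated runs are comparable.
# "notes.txt" and the model id are hypothetical examples.
summarize notes.txt --model gpt-4o-mini --json > summary.json
```

With a pinned model and machine-readable output, diffs between runs reflect input changes rather than provider-side model swaps.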