The CometLLM class is a first-class integration with CometAPI. Use it to power RAG pipelines, agents, and LLM chains with any model in CometAPI’s catalog.
Prerequisites
- Python 3.8+
- A CometAPI account with an active API key — get yours here
Tips and troubleshooting
- Models: See the CometAPI Models page for all available options.
- Using other models: Initialize with a different model ID, e.g. `CometLLM(api_key=api_key, model="your-model-id", max_tokens=1024)`.
- Fine-tuning: Pass `temperature` and `max_tokens` directly to `CometLLM(...)`.
- Error handling: Wrap calls in `try/except` to catch key errors or network issues.
- Security: Never commit API keys to version control. Use environment variables.
- More docs: LlamaIndex documentation — CometAPI quick start — Colab example
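The error-handling and security tips above can be sketched as follows. The `safe_complete` helper, the `COMETAPI_KEY` environment-variable name, and the import path in the comments are illustrative assumptions, not part of the library; check the CometAPI quick start for the exact names.

```python
import os


def safe_complete(llm_call, prompt):
    """Call an LLM and surface API-key or network errors cleanly."""
    try:
        return llm_call(prompt)
    except Exception as exc:  # e.g. invalid key, timeout, network failure
        return f"CometAPI call failed: {exc}"


# Hypothetical usage with CometLLM -- the import path and env-var name
# below are assumptions; see the CometAPI quick start for the real ones:
# from llama_index.llms.cometapi import CometLLM
# llm = CometLLM(
#     api_key=os.environ["COMETAPI_KEY"],  # read from the environment, never hard-code
#     model="your-model-id",
#     temperature=0.7,
#     max_tokens=1024,
# )
# print(safe_complete(llm.complete, "Hello!"))
```

Keeping the wrapper separate from the client means the same error-handling path covers every call site in a RAG pipeline or agent loop.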