• by sreenathmenon on 8/8/2025, 4:37:11 PM

    Just released v1.5.0 with IBM watsonx integration.

    ```python
    # Works with any provider now
    client = LLMClient(provider="watsonx")
    client = LLMClient(provider="anthropic")
    client = LLMClient(provider="openai")

    response = client.query("Analyze this data")
    ```

    New in v1.5.0:
    - IBM watsonx provider with Granite models
    - Same API across all 5 providers
    - Auto-fallback still works
    - Enterprise authentication

    Also supports 100+ local Ollama models, including OpenAI's new GPT-OSS.
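    The auto-fallback mentioned above can be sketched roughly like this. This is a minimal, self-contained illustration of the pattern, not the library's actual implementation; `FallbackClient` and everything in it are hypothetical names:

    ```python
    # Hypothetical sketch of provider auto-fallback: try each provider in
    # order and return the first successful response.
    class FallbackClient:
        def __init__(self, providers):
            # providers: ordered mapping of name -> callable(prompt) -> str
            self.providers = providers

        def query(self, prompt):
            errors = {}
            for name, call in self.providers.items():
                try:
                    return call(prompt)
                except Exception as exc:  # real code would catch provider-specific errors
                    errors[name] = exc
            raise RuntimeError(f"All providers failed: {errors}")

    # Usage: the first provider fails, so the call falls through to the second.
    def flaky(prompt):
        raise ConnectionError("watsonx unreachable")

    def ok(prompt):
        return f"echo: {prompt}"

    client = FallbackClient({"watsonx": flaky, "anthropic": ok})
    print(client.query("Analyze this data"))
    ```

    The same try-in-order loop generalizes to any number of providers, which is what makes a single `query()` call work unchanged across all of them.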