Support other LLMs
under review
Dongyoung Lim
I use Claude 3 Opus for my personal LLM chats. Please support Claude and Gemini as AI Assistant backends.
Benjamin Wallsten
I’m a fan of Claude and have a Max account that I typically use. I’m fine with using API keys and a more developer-focused approach, but I worry that focusing on full customizability (e.g. custom tool usage, web search, etc.) puts off the easier but still highly valuable intermediate step of letting users swap the existing OpenAI GPT-4.1 nano model for something more immediately useful to subscribers of these other services.
For instance, I use Imbue’s Sculptor application, which lets me either use a Claude API key or connect my Max account directly, so I get up and running with a higher tier and capability set than I would out of the box. To me, that seems a much more reasonable and intuitive approach for existing Capacities subscribers who may already have paid Pro plans with some of these services than asking them (and the Capacities dev team) to support every combination of API capabilities for each platform. Would starting here make more sense to existing Capacities subscribers?
(Caveat: I’m not experienced in the technical implementation of this, i.e. a permissioned-app approach vs. the API key approach, but my experience building API key utilities for these platforms does inform some of my thinking.)
Michael Murphy
I just plugged in my OpenRouter API key expecting a few more models to choose from. I really hope you support all OpenAI API-compatible keys soon.
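A minimal sketch of what an OpenAI API-compatible key means in practice, assuming OpenRouter's documented base URL and an example model id (neither is anything Capacities has committed to): the official openai client works unchanged once the base URL is overridden.
```typescript
// Sketch: the official "openai" npm package pointed at OpenRouter's
// OpenAI-compatible endpoint. Base URL, env var, and model id are
// illustrative assumptions, not Capacities settings.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",      // OpenRouter's OpenAI-compatible API
  apiKey: process.env.OPENROUTER_API_KEY ?? "", // the user's own key
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "anthropic/claude-3.5-sonnet",       // example model id on OpenRouter
    messages: [{ role: "user", content: "Suggest a title for this note." }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```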
Steffen Bleher
marked this post as under review
Steffen Bleher
Merged in a post:
Please add Gemini API to AI feature
Mehmet Kaba
The OpenAI limit is very low; after using it on 10-15 titles, it hits the daily limit. I am using Gemini Advanced and want to use it within Capacities as well.
Steffen Bleher
Merged in a post:
Introduce alternative AI engines in the assistant, e.g. Perplexity
Nicolas Bosshardt
This ticket is to explore the idea of integrating LLMs in addition to ChatGPT into Capacities' AI assistant, with the goal of bringing additional features / benefits, such as:
- leveraging features such as live web search and footnote referencing directly within Cap when using the AI assistant
- providing alternatives to ChatGPT if/when the user prefers to use a different language model, whether for features, result accuracy, ethics, or other reasons
This idea was initially discussed & developed here: https://discord.com/channels/940596022344843336/1120340681102204948/1283433662670438431
Related, but different ask: https://capacities.io/feedback/p/byo-ai
Steffen Bleher
Merged in a post:
Custom AI Endpoints
TJ Ferrell
It would be awesome if we could provide our own OpenAI-compatible endpoint. For example, LM Studio provides an OpenAI-compatible API (https://lmstudio.ai/docs/api/openai-api), Google provides an OpenAI-compatible API for Gemini (https://ai.google.dev/gemini-api/docs/openai), and Claude has an open-source translation layer (https://github.com/jtsang4/claude-to-chatgpt) that provides OpenAI API compatibility. Giving users this ability not only allows security-conscious users the freedom to run their own local models, but also lets people choose other models, such as Gemini or Claude, should they prefer them.
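To illustrate the interchangeability described above (a sketch under assumptions, not a confirmed Capacities feature): because all of these endpoints speak the OpenAI wire format, the only piece that has to change is the base URL. The local port and model name below assume a typical LM Studio setup.
```typescript
// Sketch: the same OpenAI-client code targeting a local LM Studio server.
// Port and model name depend entirely on the user's local configuration.
import OpenAI from "openai";

const local = new OpenAI({
  baseURL: "http://localhost:1234/v1", // LM Studio's default local server address
  apiKey: "not-needed-locally",        // local servers typically ignore the key
});

async function main() {
  const completion = await local.chat.completions.create({
    model: "llama-3.1-8b-instruct",    // whichever model is currently loaded
    messages: [{ role: "user", content: "Summarize this page in one sentence." }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```
The same pattern covers Google's OpenAI-compatible Gemini endpoint or a claude-to-chatgpt proxy: swap the base URL and key, and the request/response shape stays identical.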
Steffen Bleher
Merged in a post:
BYO AI
Nicola Fern
I have a subscription to Perplexity Pro, which is much more useful to me than OpenAI and includes $5 of API use a month. It would be nice to be able to use this more suitable AI in Capacities.
Rodrigo Ferraz
What about running your own Local AI models?
Adam Lewis
Completely agreed! I migrated to Claude from OpenAI some time ago, and I cannot fathom going back. It’s nice to not have my AI tool be a sycophant, but mostly it’s that Claude works so much better for my specific tasks.
Beth
Merged in a post:
GPT-4 Personal Account API
Corey Treaster
Would kill to be able to use my personal GPT-4 subscription API with Capacities. It feels hard to pay for two things with such overlapping functionality.