
Add support for: streaming, model lists, LM Studio, OpenRouter, more Google Gemini's #106

Open
cleesmith opened this issue Nov 29, 2024 · 2 comments


cleesmith commented Nov 29, 2024

It would be great to have support for these:

  1. streaming AI responses - totally expected in chats

  2. get a list of models for each/selected provider - models change frequently nowadays, so it's better to ask the provider for its list than to keep my own list up-to-date

  3. add LM Studio - nice local LLM server with lots of models

  4. add OpenRouter - 200+ models, and one api key to get to them all

  5. add Google's free Gemini in addition to Advanced and Vertex

Thanks for your efforts to "normalize" the LLM requests/responses to "fit" OpenAI's API ...
why all of the AI providers cannot agree on a standard API is puzzling (or is it?).
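Since LM Studio and OpenRouter both expose an OpenAI-compatible surface, requests 1 and 2 can be sketched against that shape. This is a minimal, untested sketch: the base URLs below are assumptions (LM Studio's commonly documented local default and OpenRouter's public endpoint), and the model name in the usage example is illustrative.

```python
# Sketch of requests 1 and 2 against an OpenAI-compatible API.
# Assumptions: LM Studio serves at its common local default, and
# OpenRouter at its public /api/v1 endpoint; adjust for your setup.
import json
import urllib.request

LMSTUDIO_BASE = "http://localhost:1234/v1"      # assumption: LM Studio default
OPENROUTER_BASE = "https://openrouter.ai/api/v1"


def models_request(base_url, api_key=None):
    """Build a GET /models request: ask the provider for its current
    model list instead of maintaining one by hand (item 2)."""
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    return urllib.request.Request(f"{base_url}/models", headers=headers)


def streaming_chat_request(base_url, model, prompt, api_key=None):
    """Build a POST /chat/completions request with stream=true, so the
    provider replies with incremental "data: {...}" chunks (item 1)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # the flag that turns on token-by-token delivery
    }).encode()
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(f"{base_url}/chat/completions",
                                  data=body, headers=headers, method="POST")
```

For example, `models_request(LMSTUDIO_BASE)` targets the local server with no key, while `streaming_chat_request(OPENROUTER_BASE, "google/gemini-pro", "hi", api_key=...)` shows how one OpenRouter key reaches any of its models.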

@cleesmith cleesmith changed the title Add support for: LM Studio, OpenRouter, Google's free Gemini, streaming AI responses, and list of models for each provider Add support for: streaming, model lists, LM Studio, OpenRouter, more Google Gemini's Nov 29, 2024
@chris0517-sb

yeah


rseward commented Dec 2, 2024

+1 for gemini-pro support! Don't make me create a pull request. ;-)
