Ecosyste.ms: Timeline

Browse the timeline of events for every public repo on GitHub. Data updated hourly from GH Archive.

BerriAI/litellm

krrishdholakia created a branch on BerriAI/litellm

litellm_redis_otel_tracing - Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]

atakanokan starred BerriAI/litellm
biranchi2018 starred BerriAI/litellm
tenfar starred BerriAI/litellm
collinsomniac starred BerriAI/litellm
us58 created a comment on an issue on BerriAI/litellm
Sorry, I have no idea how to monitor this. If you can explain how to do it, I'm happy to help.

krrishdholakia pushed 1 commit to litellm_dev_10_26_2024 BerriAI/litellm
  • fix(redis_cache.py): instrument otel logging for sync redis calls ensures complete coverage for all redis cache calls 7301c83
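
The commit message describes wrapping litellm's synchronous Redis cache calls in OpenTelemetry spans. A minimal sketch of that pattern (illustrative only, not the actual litellm code; assumes redis-py and the OpenTelemetry API package are installed):

```python
# Illustrative sketch only -- not the litellm implementation.
import redis
from opentelemetry import trace

tracer = trace.get_tracer(__name__)
client = redis.Redis(host="localhost", port=6379)

def traced_get(key: str):
    # Open a span so the sync Redis call shows up in the trace
    # alongside the cache calls that were already instrumented.
    with tracer.start_as_current_span("redis.get") as span:
        span.set_attribute("db.system", "redis")
        span.set_attribute("db.redis.key", key)
        return client.get(key)
```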

krrishdholakia pushed 1 commit to litellm_dev_10_26_2024 BerriAI/litellm
  • test: make testing more robust for custom pricing cbc0778

zzlgreat starred BerriAI/litellm
krrishdholakia pushed 1 commit to litellm_dev_10_26_2024 BerriAI/litellm
  • fix(main.py): register custom model pricing with specific key Ensure custom model pricing is registered to the speci... a9c8c06
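
The commit concerns registering custom pricing under the exact model key. A hedged sketch using litellm's public `register_model` helper; the model name and per-token prices below are placeholders, and the field names mirror litellm's model cost map format:

```python
# Sketch: registering per-token pricing for a custom model via litellm.
import litellm

litellm.register_model({
    "my-org/my-custom-model": {            # hypothetical model key
        "input_cost_per_token": 0.0000008,
        "output_cost_per_token": 0.0000024,
        "litellm_provider": "openai",
        "mode": "chat",
    }
})

# Later completion calls for "my-org/my-custom-model" are costed with
# the registered prices instead of the built-in model cost map.
```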

krrishdholakia created a comment on an issue on BerriAI/litellm
It shouldn't be. Is a file descriptor only created for HTTP calls? (Wondering if a new thread could cause this.)

Daylyt247 starred BerriAI/litellm
us58 created a comment on an issue on BerriAI/litellm
Found the issue. The Ubuntu default value of 1024 for the number of file descriptors caused this. No more errors after increasing it. I guess litellm internally uses an additional request of som...
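
For context, 1024 is the usual default soft limit for open file descriptors on Ubuntu (`ulimit -n`). A small sketch of checking and raising it from Python with the standard-library `resource` module:

```python
# Check and raise the soft limit on open file descriptors (Linux/macOS).
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"open-file limit: soft={soft}, hard={hard}")  # e.g. soft=1024 on a default Ubuntu install

# Raise the soft limit up to the hard limit for this process.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```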

vercel[bot] created a comment on a pull request on BerriAI/litellm

jamesev15 opened a pull request on BerriAI/litellm
Add txt file type to GCS URIs as an accepted file type for Gemini 1.5
## Title Add txt file type to GCS URIs for Gemini 1.5 models ## Relevant issues Since version 1.47.0 of **google-cloud-aiplatform**, txt files are supported in GCS URIs; this code...
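
A sketch of what this enables downstream: passing a plain-text GCS object to Gemini 1.5 through the vertexai SDK (google-cloud-aiplatform >= 1.47.0). The project, bucket, and model name below are placeholders:

```python
# Sketch: sending a txt file stored in GCS to Gemini 1.5.
import vertexai
from vertexai.generative_models import GenerativeModel, Part

vertexai.init(project="my-project", location="us-central1")  # placeholder project
model = GenerativeModel("gemini-1.5-pro")

response = model.generate_content([
    Part.from_uri("gs://my-bucket/notes.txt", mime_type="text/plain"),
    "Summarize this file.",
])
print(response.text)
```
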
jamesev15 forked BerriAI/litellm

jamesev15/litellm

Neo42 starred BerriAI/litellm
cleberhensel starred BerriAI/litellm
beat opened an issue on BerriAI/litellm
[Bug]: Adding Langfuse logging: UI is missing LANGFUSE_HOST, so a self-hosted Langfuse setup is not possible in one step
### What happened? New self-hosted LiteLLM installation (Docker) with self-hosted Langfuse. Bug 1: In the Logging & Alerts menu / Add Callback, when selecting Langfuse, only LANGFUSE_PUBLIC_KEY and LANGF...
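
For context, the SDK side needs three Langfuse settings, including the host that the UI form omits. A sketch with placeholder values (an API key for the underlying model provider is also assumed):

```python
# Wiring litellm's Langfuse callback against a self-hosted Langfuse instance.
import os
import litellm

os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://langfuse.internal.example.com"  # the setting missing from the UI

litellm.success_callback = ["langfuse"]

response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hello"}],
)
```
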
junchi9 starred BerriAI/litellm
thangnhc starred BerriAI/litellm
krrishdholakia pushed 1 commit to litellm_dev_10_26_2024 BerriAI/litellm
  • docs(exception_mapping.md): add missing exception types Fixes https://github.com/Aider-AI/aider/issues/2120#issuecom... 937ed32

beat created a comment on an issue on BerriAI/litellm
Try updating to the latest litellm version from today; it's fixed for me.

krrishdholakia created a branch on BerriAI/litellm

litellm_dev_10_26_2024 - Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]

krrishdholakia created a comment on an issue on BerriAI/litellm
This makes sense @asapMaki. > `model: llama-3.2-3b-instruct # add openai/ prefix to route as OpenAI provider` You didn't add the route prefix `openai/`, so litellm has no idea how to route this model.
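
For context, the `openai/` prefix tells litellm to use its OpenAI-format provider adapter for an otherwise unknown model name. A minimal sketch (the api_base and api_key values are placeholders):

```python
# Route a custom/local model through litellm's OpenAI-compatible adapter.
import litellm

response = litellm.completion(
    model="openai/llama-3.2-3b-instruct",   # "openai/" prefix selects the OpenAI-format provider
    api_base="http://localhost:8000/v1",    # placeholder OpenAI-compatible endpoint
    api_key="sk-placeholder",
    messages=[{"role": "user", "content": "hi"}],
)
print(response.choices[0].message.content)
```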

lukasz-kastelik starred BerriAI/litellm
minhquanhoang starred BerriAI/litellm
khromalabs starred BerriAI/litellm
ishaan-jaff pushed 1 commit to litellm_dd_llm_obs BerriAI/litellm

ishaan-jaff pushed 1 commit to litellm_dd_llm_obs BerriAI/litellm
  • add datadog_llm_observability ad6af95
