Ecosyste.ms: Timeline

Browse the timeline of events for every public repo on GitHub. Data updated hourly from GH Archive.

BerriAI/litellm

KornaAI forked BerriAI/litellm to KornaAI/litellm

ishaan-jaff created a comment on an issue on BerriAI/litellm
It looks like you're missing views that are needed. Can you run this migration script and retry, @ncecere? https://github.com/BerriAI/litellm/blob/main/db_scripts/create_views.py

View on GitHub

ishaan-jaff created a comment on an issue on BerriAI/litellm
This works for me (screenshot attached: 2024-10-25 at 11:13 AM).

ishaan-jaff created a comment on an issue on BerriAI/litellm
Fixed in #6430 @ncecere

ella-hong22 starred BerriAI/litellm
ishaan-jaff closed a pull request on BerriAI/litellm
(feat) track created_at, updated_at for virtual keys
ishaan-jaff pushed 4 commits to main on BerriAI/litellm
  • track created, updated at virtual keys cce118e
  • add created_at, updated_at for verification token 8cd8d47
  • ui show created at date 2d2a2c3
  • Merge pull request #6429 from BerriAI/litellm_ui_show_created_at_for_key (admin ui) - show created_at for virtual keys 5485a2a

ishaan-jaff closed an issue on BerriAI/litellm
[Feature]: WebUI show timestamp on virtual keys
The Feature: In the Web UI, it would be nice if the creation timestamp for the virtual key were displayed as a column and/or included in the 'info' pop-up. Motivation, pitch: helps fi...
ishaan-jaff closed a pull request on BerriAI/litellm
(admin ui) - show created_at for virtual keys
Screenshot attached (Xnapper-2024-10-25-09 23 46). Following the OpenAI UI ordering for displa...

ishaan-jaff opened a pull request on BerriAI/litellm
(admin ui / auth fix) Allow internal user to call /key/{token}/regenerate
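The `/key/{token}/regenerate` route named in this PR can be exercised with a plain HTTP POST against the proxy. A minimal sketch that only builds the request (nothing is sent); the base URL, token, and credential below are placeholders, not values from the PR:

```python
from urllib.request import Request

base = "http://localhost:4000"   # hypothetical proxy address
token = "sk-example-token"       # placeholder virtual key token

# Build (but do not send) the regenerate request an internal user would issue.
req = Request(
    f"{base}/key/{token}/regenerate",
    method="POST",
    headers={"Authorization": "Bearer sk-internal-user-key"},  # placeholder credential
)
print(req.get_method(), req.full_url)
```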
ishaan-jaff pushed 1 commit to litellm_allow_internal_user_to_regen_tokens on BerriAI/litellm

ishaan-jaff pushed 2 commits to litellm_allow_internal_user_to_regen_tokens on BerriAI/litellm

blueonrails starred BerriAI/litellm
krrishdholakia pushed 1 commit to litellm_dev_10_24_2024 on BerriAI/litellm

krrishdholakia pushed 8 commits to litellm_dev_10_24_2024 on BerriAI/litellm
  • bump: version 1.50.3 → 1.50.4 0f0470f
  • perf: remove 'always_read_redis' - adding +830ms on each llm call (#6414) * perf: remove 'always_read_redis' - addin... d59f8f9
  • feat(litellm_logging.py): refactor standard_logging_payload function … (#6388) * feat(litellm_logging.py): refactor ... c04c4a8
  • LiteLLM Minor Fixes & Improvements (10/23/2024) (#6407) * docs(bedrock.md): clarify bedrock auth in litellm docs ... 1cd1d23
  • allow configuring httpx hooks for AsyncHTTPHandler (#6290) (#6415) * allow configuring httpx hooks for AsyncHTTPHand... cc8dd80
  • feat(proxy_server.py): check if views exist on proxy server startup +… (#6360) * feat(proxy_server.py): check if vie... 4e31005
  • feat(litellm_pre_call_utils.py): support 'add_user_information_to_llm… (#6390) * feat(litellm_pre_call_utils.py): su... 9fccf82
  • Merge branch 'main' into litellm_dev_10_24_2024 292473e

eliorc created a comment on an issue on BerriAI/litellm
@ishaan-jaff I have `completion` and `batch_completion` calls, which look like this ```python3 session_uid = os.environ['SESSION_ID'] session_id = (f"{inspect.currentframe().f_code.co_name}." ...
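The truncated snippet in this comment composes a session id from an environment variable and a function name via `inspect`. A runnable sketch of that pattern with hypothetical values filled in (the original snippet's values are elided, so everything concrete below is a placeholder):

```python
import inspect
import os

os.environ.setdefault("SESSION_ID", "demo-session")  # placeholder value

def make_session_id() -> str:
    # Prefix the session uid with the *calling* function's name,
    # mirroring the inspect.currentframe() trick in the comment.
    session_uid = os.environ["SESSION_ID"]
    caller = inspect.currentframe().f_back.f_code.co_name
    return f"{caller}.{session_uid}"

def my_batch_job() -> str:  # hypothetical caller
    return make_session_id()

print(my_batch_job())
```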

codecov[bot] created a comment on a pull request on BerriAI/litellm
Codecov coverage report for pull request #6421: https://app.codecov.io/gh/BerriAI/litellm/pull/6421 ...

ishaan-jaff opened a pull request on BerriAI/litellm
(admin ui) - show created_at for virtual keys
ishaan-jaff created a branch on BerriAI/litellm

litellm_ui_show_created_at_for_key

krrishdholakia pushed 1 commit to litellm_dev_10_24_2024 on BerriAI/litellm

gaosong886 starred BerriAI/litellm
krrishdholakia pushed 1 commit to main on BerriAI/litellm
  • feat(litellm_pre_call_utils.py): support 'add_user_information_to_llm… (#6390) * feat(litellm_pre_call_utils.py): su... 9fccf82

krrishdholakia closed a pull request on BerriAI/litellm
feat(litellm_pre_call_utils.py): support 'add_user_information_to_llm_headers' param
Enables passing user info to the backend LLM (a user request for a custom vLLM server).
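The `add_user_information_to_llm_headers` flag from this PR is a proxy setting. A hypothetical config sketch; the flag name comes from the PR title, but where it lives in the config file is an assumption, not confirmed by this feed:

```yaml
# Sketch only: section placement is assumed, not confirmed by this PR.
litellm_settings:
  add_user_information_to_llm_headers: true
```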
krrishdholakia pushed 22 commits to litellm_forward_user_info_to_vllm on BerriAI/litellm
  • (feat) Arize - Allow using Arize HTTP endpoint (#6364) * arize use helper for get_arize_opentelemetry_config * u... b75019c
  • Litellm dev 10 22 2024 (#6384) * fix(utils.py): add 'disallowed_special' for token counting on .encode() Fixes er... cb2563e
  • bump: version 1.50.2 → 1.50.3 0a92923
  • build(deps): bump http-proxy-middleware in /docs/my-website (#6395) Bumps [http-proxy-middleware](https://github.com... 64c3d32
  • (docs + testing) Correctly document the timeout value used by litellm proxy is 6000 seconds + add to best practices f... 807e9dc
  • (refactor) move convert dict to model response to llm_response_utils/ (#6393) * refactor move convert dict to model ... 3991d75
  • (refactor) litellm.Router client initialization utils (#6394) * refactor InitalizeOpenAISDKClient * use helper f... b70147f
  • (fix) Langfuse key based logging (#6372) * langfuse use helper for get_langfuse_logging_config * fix get_langfus... 72a91ea
  • Revert "(refactor) litellm.Router client initialization utils (#6394)" (#6403) This reverts commit b70147f63b5ad95d... d063086
  • def test_text_completion_with_echo(stream): (#6401) test 182adec
  • fix linting - remove # noqa PLR0915 from fixed function bcce21a
  • test: cleanup codestral tests - backend api unavailable ca09f4a
  • (refactor) prometheus async_log_success_event to be under 100 LOC (#6416) * unit testig for prometheus * unit te... cdda7c2
  • (refactor) router - use static methods for client init utils (#6420) * use InitalizeOpenAISDKClient * use Inital... 17e81d8
  • (code cleanup) remove unused and undocumented logging integrations - litedebugger, berrispend (#6406) * code cleanu... c731ba4
  • bump: version 1.50.3 → 1.50.4 0f0470f
  • perf: remove 'always_read_redis' - adding +830ms on each llm call (#6414) * perf: remove 'always_read_redis' - addin... d59f8f9
  • feat(litellm_logging.py): refactor standard_logging_payload function … (#6388) * feat(litellm_logging.py): refactor ... c04c4a8
  • LiteLLM Minor Fixes & Improvements (10/23/2024) (#6407) * docs(bedrock.md): clarify bedrock auth in litellm docs ... 1cd1d23
  • allow configuring httpx hooks for AsyncHTTPHandler (#6290) (#6415) * allow configuring httpx hooks for AsyncHTTPHand... cc8dd80
  • and 2 more ...

krrishdholakia pushed 1 commit to main on BerriAI/litellm
  • feat(proxy_server.py): check if views exist on proxy server startup +… (#6360) * feat(proxy_server.py): check if vie... 4e31005

krrishdholakia closed a pull request on BerriAI/litellm
feat(proxy_server.py): check if views exist on proxy server startup + refactor startup event logic to <50 LOC
krrishdholakia pushed 37 commits to litellm_proxy_cleandb_fixes on BerriAI/litellm
  • refactor(redis_cache.py): use a default cache value when writing to r… (#6358) * refactor(redis_cache.py): use a def... 7338b24
  • feat(proxy_cli.py): add new 'log_config' cli param (#6352) * feat(proxy_cli.py): add new 'log_config' cli param A... 2b9db05
  • docs(sidebars.js): add jina ai embedding to docs e30a274
  • docs(sidebars.js): add jina ai to left nav e6e518a
  • bump: version 1.50.1 → 1.50.2 a5fe87d
  • langfuse use helper for get_langfuse_logging_config eca09ad
  • Refactor: apply early return (#6369) a0c5fee
  • (refactor) remove berrispendLogger - unused logging integration (#6363) * fix remove berrispendLogger * remove u... 7a5f997
  • Merge remote-tracking branch 'origin/main' 666741d
  • fix docs configs.md 7853cb7
  • (fix) standard logging metadata + add unit testing (#6366) * fix setting StandardLoggingMetadata * add unit test... 8359cb6
  • Revert "(fix) standard logging metadata + add unit testing (#6366)" (#6381) This reverts commit 8359cb6fa9bf7b0bf4f... 400cbff
  • add new 35 mode lcard (#6378) 21ace6d
  • Add claude 3 5 sonnet 20241022 models for all provides (#6380) * Add Claude 3.5 v2 on Amazon Bedrock and Vertex AI. ... 7939e93
  • test(skip-flaky-google-context-caching-test): google is not reliable. their sample code is also not working 24a0d26
  • test(test_alangfuse.py): handle flaky langfuse test better f943410
  • (feat) Arize - Allow using Arize HTTP endpoint (#6364) * arize use helper for get_arize_opentelemetry_config * u... b75019c
  • Litellm dev 10 22 2024 (#6384) * fix(utils.py): add 'disallowed_special' for token counting on .encode() Fixes er... cb2563e
  • bump: version 1.50.2 → 1.50.3 0a92923
  • build(deps): bump http-proxy-middleware in /docs/my-website (#6395) Bumps [http-proxy-middleware](https://github.com... 64c3d32
  • and 17 more ...

krrishdholakia pushed 1 commit to main on BerriAI/litellm
  • allow configuring httpx hooks for AsyncHTTPHandler (#6290) (#6415) * allow configuring httpx hooks for AsyncHTTPHand... cc8dd80