Ecosyste.ms: Timeline

Browse the timeline of events for every public repo on GitHub. Data updated hourly from GH Archive.

BerriAI/litellm

krrishdholakia created a comment on an issue on BerriAI/litellm
Fixed here - https://github.com/BerriAI/litellm/commit/8f37b2266717e5cfaafbac6a9c2a2bdc293bf01b @xingyaoww if my changes look okay to you, can we go with that instead - preferred as it includes ...

krrishdholakia pushed 1 commit to litellm_krrish_dev_10_25_2024 BerriAI/litellm
  • fix(factory.py): support anthropic prompt caching for tool results 58fe661

beat created a comment on an issue on BerriAI/litellm
confirming missing!

beat created a comment on an issue on BerriAI/litellm
Confirming that these views are missing, and that running that script fixes the broken "Usage" view. Was searching for hours until I found this bug ticket. Many Many Thanks! Hope that this is added...

xingyaoww created a comment on an issue on BerriAI/litellm
ahh - I set that inside `message.content`. So this one: https://github.com/BerriAI/litellm/issues/6422#issuecomment-2438757294 <img width="608" alt="image" src="https://github.com/user-attachmen...

krrishdholakia created a comment on an issue on BerriAI/litellm
their docs imply this should be working (from within the content list) <img width="752" alt="Screenshot 2024-10-25 at 2 09 21 PM" src="https://github.com/user-attachments/assets/0802f9a2-ca2f-485...

krrishdholakia created a comment on an issue on BerriAI/litellm
interesting - it fails for me <img width="1525" alt="Screenshot 2024-10-25 at 2 07 34 PM" src="https://github.com/user-attachments/assets/ed4149d6-c6be-4fda-ba6b-7f5ac95434c7">

xingyaoww created a comment on an issue on BerriAI/litellm
I was successful with this: ``` {'role': 'user', 'content': [{'type': 'tool_result', 'tool_use_id': 'toolu_01V1paXrun4CVetdAGiQaZG5', 'content': [{'type': 'text', 'text': 'OBSERVATION:\nOn bra...
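For reference, a minimal sketch of the `tool_result` message shape being discussed - a user turn whose content is a `tool_result` block wrapping text blocks. The `tool_use_id` and text here are placeholders, not values from the thread:

```python
# Hypothetical example of the Anthropic-style tool_result message shape:
# a user-role message whose content list holds a tool_result block, which
# itself carries a list of text blocks.
tool_result_message = {
    "role": "user",
    "content": [
        {
            "type": "tool_result",
            "tool_use_id": "toolu_example_id",  # placeholder ID
            "content": [
                {"type": "text", "text": "OBSERVATION:\nOn branch master"},
            ],
        }
    ],
}


def is_tool_result_turn(message: dict) -> bool:
    """Check whether a message is a user turn made up only of tool_result blocks."""
    return message.get("role") == "user" and all(
        block.get("type") == "tool_result" for block in message.get("content", [])
    )
```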

krrishdholakia created a comment on an issue on BerriAI/litellm
<img width="1153" alt="Screenshot 2024-10-25 at 2 02 32 PM" src="https://github.com/user-attachments/assets/a1eebdbb-189e-47b4-9cc5-2193591fffca">

krrishdholakia created a comment on an issue on BerriAI/litellm
Anthropic rejects this message as well 😢 ``` {'role': 'user', 'content': [{'type': 'tool_result', 'tool_use_id': 'toolu_01V1paXrun4CVetdAGiQaZG5', 'content': [{'type': 'text', 'text': 'OBSERVATI...

xingyaoww created a comment on an issue on BerriAI/litellm
You mean to have a user set the cache at the message level? I'm ok with that - as long as we can do the caching properly!

krrishdholakia created a comment on an issue on BerriAI/litellm
hmm isn't that a bit weird @xingyaoww - so i might intend to just cache a single item in the list, but am now caching the entire list -> this would be unexpected behaviour imo. my preference here (o...

xingyaoww created a comment on an issue on BerriAI/litellm
@krrishdholakia yep! what i did in #6425 is check for any cache_control inside content and, if found, remove it from the content blocks and move it to the message level 😓
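The transformation described above - hoisting `cache_control` out of individual content blocks up to the message level - can be sketched as a small standalone helper. This is an illustrative approximation of what the thread describes, not the actual code from #6425:

```python
from copy import deepcopy


def hoist_cache_control(message: dict) -> dict:
    """Move any cache_control found inside content blocks to the message level.

    Illustrative sketch only: if any block in message["content"] carries a
    cache_control entry, strip it from the blocks and set one copy on the
    message itself, so the whole turn is marked for caching.
    """
    message = deepcopy(message)  # leave the caller's message untouched
    found = None
    for block in message.get("content", []):
        if isinstance(block, dict) and "cache_control" in block:
            found = block.pop("cache_control")
    if found is not None:
        message["cache_control"] = found
    return message
```

A side effect worth noting (and the objection raised above): marking a single block now ends up caching the whole message, which may be broader than the caller intended.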

codecov[bot] created a comment on a pull request on BerriAI/litellm
## [Codecov](https://app.codecov.io/gh/BerriAI/litellm/pull/6433?dropdown=coverage&src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=BerriAI) ...

krrishdholakia created a comment on an issue on BerriAI/litellm
so then the correct input message would be ``` { "content": [ { "type": "text", "text": 'OBSERVATION:\nOn branch master\...

xingyaoww created a comment on an issue on BerriAI/litellm
Yeah - i contacted anthropic yesterday, and it seems you need to do it on `message.cache_control` for it to work 😓

krrishdholakia created a comment on an issue on BerriAI/litellm
``` E litellm.exceptions.BadRequestError: litellm.BadRequestError: AnthropicException - {"type":"error","error":{"type":"invalid_request_error","message":"messages.2....

KirillSuhodolov starred BerriAI/litellm

ishaan-jaff created a comment on an issue on BerriAI/litellm
thanks so much for the help on this @tyler-liner, the graph is really helpful

> So, I removed all threading-related code and compared the memory usage of the application. (all threading.Thread(...

tkg61 created a comment on an issue on BerriAI/litellm
It is **after** I click the green submit button that I get the above error. What is the route it should take after going to FQDN.com/fallback/login - is it FQDN.com/ui or something else?

xingyaoww created a comment on an issue on BerriAI/litellm
> would that be cached? anthropic's docs say the cached content has to be large enough

yes - in my local branch, i got the cache write tokens to be the same as the input tokens - this is the correct case

xingyaoww created a comment on an issue on BerriAI/litellm
another thing is that, with the fix here: https://github.com/BerriAI/litellm/commit/8f37b2266717e5cfaafbac6a9c2a2bdc293bf01b#diff-39721500d7df0dd4c6aa7d60d7ef78a38afaf136f182114d15f4e3c1e863a413 ...

krrishdholakia created a comment on an issue on BerriAI/litellm
would that be cached? anthropic's docs say the cached content has to be large enough

oh - i think i'll update my test to catch that it's being sent correctly

xingyaoww created a comment on an issue on BerriAI/litellm
i found that it can get "silently omitted" without a careful fix -- you can check the cache write/read tokens from the response; if both are zero, then it doesn't work :(
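That check can be written directly against the usage counters on a response. The field names below follow the cache counters Anthropic reports (`cache_creation_input_tokens`, `cache_read_input_tokens`); the function itself is just a sketch:

```python
def caching_took_effect(usage: dict) -> bool:
    """Return True if the response shows any cache activity.

    Anthropic reports cache activity via cache_creation_input_tokens
    (a cache write) and cache_read_input_tokens (a cache hit). If both
    are zero or absent, the prompt was silently not cached.
    """
    write = usage.get("cache_creation_input_tokens") or 0
    read = usage.get("cache_read_input_tokens") or 0
    return write > 0 or read > 0
```

On the first request you would expect a nonzero write count; on a repeat request with the same cached prefix, a nonzero read count.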

xingyaoww created a comment on an issue on BerriAI/litellm
@krrishdholakia this line: https://github.com/BerriAI/litellm/commit/8f37b2266717e5cfaafbac6a9c2a2bdc293bf01b#diff-6b37b401056b3fb9e91a7fbe1631be6f14440c26756a87b39157fcb00562de39R523

krrishdholakia created a comment on an issue on BerriAI/litellm
oh it works for me

krrishdholakia created a comment on an issue on BerriAI/litellm
the monthly global spend should be fixed on v`1.50.4`. for the spend issue - can you share your server logs? (run the proxy with `--detailed_debug`) this should show any errors that might be happening
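For reference, starting the proxy with the debug flag mentioned above looks roughly like this (the config path is a placeholder):

```shell
# Start the LiteLLM proxy with verbose logging so per-request errors
# (e.g. failed spend-tracking writes) show up in the server output.
# config.yaml is a placeholder path for your own proxy config.
litellm --config config.yaml --detailed_debug
```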

xingyaoww created a comment on an issue on BerriAI/litellm
@krrishdholakia the repro file i sent earlier already includes that prompt caching flag on the first request

krrishdholakia created a comment on an issue on BerriAI/litellm
The route it should be going to is `{PROXY_BASE_URL}/sso/key/generate`

krrishdholakia created a comment on an issue on BerriAI/litellm
Unable to repro. This works just fine - <img width="1230" alt="Screenshot 2024-10-25 at 12 51 35 PM" src="https://github.com/user-attachments/assets/e2c3ce84-bea7-4d41-8507-96182deddb9d"> Yo...
