Ecosyste.ms: Timeline
Browse the timeline of events for every public repo on GitHub. Data updated hourly from GH Archive.
cpsievert created a review comment on a pull request on posit-dev/py-shiny
No, `DictNormalizer` will work for either case since they define `__getitem__()` on the pydantic model. I suppose that is a weird/subtle thing that requires extra context, and it'd be nice to take ...
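A minimal sketch of the point made in this comment: dict-style access works for both a `TypedDict` and a pydantic model, as long as the model defines `__getitem__()`. The class and function names below are illustrative, not py-shiny's actual `DictNormalizer` implementation.
```python
from typing import Any, TypedDict

from pydantic import BaseModel


class MessageDict(TypedDict):
    role: str
    content: str


class MessageModel(BaseModel):
    role: str
    content: str

    # ollama 0.4's models inherit this from a "subscriptable" base class;
    # it is spelled out here so the example is self-contained.
    def __getitem__(self, key: str) -> Any:
        return getattr(self, key)


def normalize(message: Any) -> dict[str, str]:
    # Dict-style access works for the TypedDict and the pydantic model alike.
    return {"role": message["role"], "content": message["content"]}


print(normalize(MessageDict(role="assistant", content="hi")))
print(normalize(MessageModel(role="assistant", content="hi")))
```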
gadenbuie created a review comment on a pull request on posit-dev/py-shiny
Is some of the context here that you have a message normalizer for Pydantic models?
gadenbuie created a review comment on a pull request on posit-dev/py-shiny
Seems like `ImportError()` would be a good fit for our error too
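This suggestion pairs with the later "Better error reporting in get_default_tokenizer()" commit. A rough sketch of what that kind of reporting might look like, assuming the default tokenizer comes from the `tokenizers` package; the function body is illustrative, not the actual py-shiny code.
```python
def get_default_tokenizer():
    # Report a missing optional dependency with an actionable ImportError.
    try:
        from tokenizers import Tokenizer
    except ImportError as e:
        raise ImportError(
            "Failed to import the default tokenizer. "
            "Install it with `pip install tokenizers`."
        ) from e

    # Report download failures (e.g., the model hub being temporarily offline)
    # separately, so the user knows it isn't an installation problem.
    try:
        return Tokenizer.from_pretrained("bert-base-cased")
    except Exception as e:
        raise RuntimeError(
            "Failed to download the default tokenizer. "
            "Consider passing an explicit tokenizer instead."
        ) from e
```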
cpsievert created a review comment on a pull request on posit-dev/py-shiny
This change is orthogonal to the main fix of this PR, but it's a good idea, and I ran into it because [bert-base-cased](https://huggingface.co/google-bert/bert-large-cased) temporarily went offline fo...
jcheng5 created a review comment on a pull request on posit-dev/py-shiny
Would `ui.output_markdown_stream` be more consistent with our past naming, perhaps? (I think of Chat as a special case because it's equal parts output and input)
jcheng5 created a review comment on a pull request on posit-dev/py-shiny
```suggestion
```
cpsievert pushed 1 commit to markdown-stream-component posit-dev/py-shiny
- Make examples more readable: 9fbfb2e
cpsievert pushed 1 commit to ollama-0.4-fix posit-dev/py-shiny
- Better error reporting in get_default_tokenizer() 14539a8
cpsievert pushed 2 commits to ollama-0.4-fix posit-dev/py-shiny
cpsievert opened a pull request on posit-dev/py-shiny
`ui.Chat()` now correctly handles new `ollama.chat()` return value introduced in ollama 0.4
Ollama 0.4 [changed the return type of `ollama.chat()`](https://github.com/ollama/ollama-python/pull/276) (`ChatResponse`) from a `TypedDict` to a `pydantic.BaseModel`. As a result, passing that re...
cpsievert pushed 1 commit to ollama-0.4-fix posit-dev/py-shiny
- ui.Chat() now correctly handles new ollama.chat() return value introduced in ollama 0.4 ee260c5
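For context on the PR above, a short sketch of the API change it responds to, assuming ollama >= 0.4 is installed and a local model is available (the model name below is a placeholder).
```python
import ollama

resp = ollama.chat(
    model="llama3.2",  # placeholder model name
    messages=[{"role": "user", "content": "Hello!"}],
)

# In ollama < 0.4 this was a plain dict; in 0.4 it is a pydantic ChatResponse.
# The 0.4 models still support dict-style subscripting, so both forms work:
print(resp.message.content)
print(resp["message"]["content"])
```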