Ecosyste.ms: Timeline

Browse the timeline of events for every public repo on GitHub. Data updated hourly from GH Archive.

sixsixcoder

sixsixcoder created a comment on an issue on THUDM/GLM-4
Could you give an example? Also, are the `temperature` and `TOP P` parameters the same thing? Third-party AI platforms such as Coze (扣子) may also add `RAG` schemes to improve the base model's generation quality.

View on GitHub
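
On the `temperature` / `TOP P` question above: in the Hugging Face Transformers generation API these are two independent sampling parameters (temperature rescales the whole distribution, top-p truncates it to a nucleus). A minimal sketch, assuming a stock `THUDM/glm-4-9b-chat` setup; the prompt and values are illustrative only:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/glm-4-9b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, torch_dtype=torch.bfloat16, device_map="auto"
)

inputs = tokenizer("Give me one sentence about Beijing.", return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=0.8,   # rescales the softmax distribution (lower = sharper)
    top_p=0.9,         # nucleus sampling: keep the smallest token set with cumulative prob >= 0.9
    max_new_tokens=64,
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```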

sixsixcoder created a comment on an issue on THUDM/GLM-4
Is there a test example? How do you judge the performance that is shown?

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4
Thank you for your contribution.

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4
pip install gradio==4.44.1

View on GitHub

sixsixcoder closed an issue on THUDM/GLM-4
A vLLM api_server for glm4v
### Feature Request / 功能建议 1. Could an api_server be provided for GLM4V, like `glm_server.py`? Currently, in `vllm_client_vision_demo.py`, data appears to be passed to `engine.generate()` in the following form: ```python messages = [ {"role"...
sixsixcoder created a comment on an issue on THUDM/GLM-4
The file [vllm_client_vision_demo.py](https://github.com/THUDM/GLM-4/blob/main/basic_demo/vllm_cli_vision_demo.py) requires image input. This demo supports multi-turn dialogue and now works with vllm 0.6.3.

View on GitHub
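
To illustrate the image-input requirement mentioned above, here is a minimal offline-inference sketch using vLLM's multimodal prompt format on vLLM >= 0.6.3; the model name, plain-text prompt, and image path are assumptions (the actual demo builds its prompt from the model's chat template):

```python
from PIL import Image
from vllm import LLM, SamplingParams

# GLM-4v support requires vLLM >= 0.6.3; trust_remote_code pulls in the model's custom code.
llm = LLM(model="THUDM/glm-4v-9b", trust_remote_code=True, max_model_len=8192)

# vLLM's multimodal interface takes the image alongside the text prompt.
request = {
    "prompt": "Describe this image.",
    "multi_modal_data": {"image": Image.open("example.jpg")},
}

outputs = llm.generate(request, SamplingParams(temperature=0.2, max_tokens=256))
print(outputs[0].outputs[0].text)
```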

sixsixcoder opened a pull request on THUDM/GLM-4
support vllm 0.6.3
support vllm 0.6.3
sixsixcoder pushed 1 commit to main sixsixcoder/GLM-4

View on GitHub

sixsixcoder pushed 8 commits to main sixsixcoder/GLM-4
  • Merge pull request #634 from sixsixcoder/main Support for GLM-4-9B-Chat-hf and GLM-4v-9B models on vLLM >= 0.6.3 and... bca86f8
  • comment with trust_remote_code=True d71b8c2
  • model download page 0b37cf2
  • adapt transformers==4.46 af1d4f2
  • adapt transformers>=4.46 a0c5687
  • Merge pull request #639 from zhipuch/main adapt transformers==4.46 1e0fa42
  • process eval issue a7dbfa5
  • Merge pull request #649 from zhipuch/main process eval issue 4446f60

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
web demo supports multi-round dialogue

View on GitHub

sixsixcoder closed an issue on THUDM/GLM-4
AttributeError: 'ChatGLM4Tokenizer' object has no attribute '_id'
### System Info / 系統信息 My environment: cuda: 12.4 python: 3.12.7 torch: 2.5.0+cu124 transformers: 4.46.0 glm-4v-9b: the latest version (yesterday) When I run basic_demo/trans_web_vision_dem...
sixsixcoder closed an issue on THUDM/GLM-4
ValueError: too many values to unpack (expected 2)
### System Info / 系統信息 NVIDIA-SMI 555.42.02 Driver Version: 555.42.02 CUDA Version:...
sixsixcoder closed an issue on THUDM/GLM-4
ValueError: Unrecognized configuration class <class 'transformers_modules.THUDM.glm-4-9b-chat.eb55a443d66541f30869f6caac5ad0d2e95bcbaa.configuration_chatglm.ChatGLMConfig'>OSError: /tiamat-NAS/boyang/GLM4/gjm/1024/checkpoint-2000 does not appear to have a file named THUDM/glm-4-9b-chat--configuration_chatglm.py. Checkout 'https://huggingface.co//tiamat-NAS/boyang/GLM4/gjm/1024/checkpoint-2000/None' for available files.
### System Info / 系統信息 python inference.py /tiamat-NAS/boyang/GLM4/gjm/1024/checkpoint-6000 Loading checkpoint shards: 100%...
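
A common workaround for this class of error (not necessarily how this particular issue was resolved) is to load the config and tokenizer from the base repo, which ships the custom `configuration_chatglm.py` / `modeling_chatglm.py` files, and only take the weights from the local checkpoint. A hedged sketch; the checkpoint path is a placeholder:

```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

base = "THUDM/glm-4-9b-chat"        # base repo containing the custom model code
ckpt = "/path/to/checkpoint-2000"   # local fine-tuned checkpoint (placeholder path)

# Resolve the custom code and tokenizer from the base repo...
config = AutoConfig.from_pretrained(base, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(base, trust_remote_code=True)

# ...then load the fine-tuned weights from the local checkpoint with that config.
model = AutoModelForCausalLM.from_pretrained(
    ckpt, config=config, trust_remote_code=True, device_map="auto"
)
```
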
sixsixcoder closed an issue on THUDM/GLM-4
Unable to load the 4-bit glm-4-9b-chat with xinference
### System Info / 系統信息 CUDA=12.4 transformers=4.44.2 torch=2.4.1+cu124 OS=windows 11 23H2 python=3.11.10 ### Who can help? / 谁可以帮助到您? _No response_ ### Information / 问题信息 - [ ] The officia...
sixsixcoder closed an issue on THUDM/GLM-4
Inconsistent results between Hugging Face transformers 4.45.0 and vllm 0.6.3
### System Info / 系統信息 RTX3090 ### Who can help? / 谁可以帮助到您? _No response_ ### Information / 问题信息 - [ ] The official example scripts / 官方的示例脚本 - [ ] My own modified scripts / 我自己修改的脚本和任务 ### R...
sixsixcoder closed an issue on THUDM/GLM-4
Running glm-4-9b-chat with openai_api_server.py may produce repeated output
### System Info / 系統信息 There was a similar issue before: https://github.com/THUDM/GLM-4/issues/476 Python 3.11 RTX 4090 x 2 After finding the problem today, I downloaded the latest tokenization_chatglm.py (129d6b0e) from modelscope and the latest basic_demo(4...
sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
What exactly is your question?

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
Try using gradio==4.44.1.

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
I also tested with your dataset, and hallucination does occur. I am not sure whether the pre-training data included minority-language datasets; please keep an eye on the detailed technical report to be released soon.

View on GitHub

sixsixcoder opened an issue on hiyouga/LLaMA-Factory
After using lora to fine-tune the GLM-4 model, the chat template format is wrong
### Reminder - [X] I have read the README and searched the existing issues. ### System Info llamafactory 0.9.0 ### Reproduction # glm-4-9b-chat Original template ``` "chat_template": "[gMASK...
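
A quick way to see whether the exported tokenizer still carries the expected GLM-4 chat template is to render the same conversation with the original and the exported tokenizer and compare the strings. A minimal sketch; the exported-model path is a placeholder:

```python
from transformers import AutoTokenizer

messages = [{"role": "user", "content": "Hello"}]

original = AutoTokenizer.from_pretrained("THUDM/glm-4-9b-chat", trust_remote_code=True)
exported = AutoTokenizer.from_pretrained("/path/to/exported-lora-model", trust_remote_code=True)

# Render the same conversation with both tokenizers and diff the output by eye or with difflib.
print(original.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
print(exported.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
```
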
sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
> @codersun123 @sixsixcoder Have you solved this problem yet? I am running into a similar issue, also on an AutoDL machine environment.

You can try gradio==4.44.1

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4
Thank you for your interest. Questions about the underlying principles can be sent by email to the authors listed in the paper.

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
You can first submit the spoken prompt "Please help me translate the following Uyghur into Chinese", and after that turn completes, input the Uyghur you want translated. <img width="908" alt="image" src="https://github.com/user-attachments/assets/d71bb357-c6fb-430d-a1e2-aea9f65fcfab">

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
How do you attach a text instruction prompt while uploading the audio?

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4
Following your steps, inference with the fine-tuned model runs normally for me on pytorch 2.4.0 and transformers 4.45.0.

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4
Following your steps, inference with the fine-tuned model runs normally for me on pytorch 2.4.0 and transformers 4.45.0.

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4
Here is a demo of int4 quantized inference: ```Python import torch from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig import os os.environ['CUDA_VISIBLE_DEVICES'] = '0' # set the GPU index, e.g....

View on GitHub
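
The int4 demo quoted above is truncated in the feed. Below is a self-contained sketch of the same pattern, int4 (NF4) inference via `BitsAndBytesConfig`; everything beyond the visible lines of the original snippet is filled in as an assumption:

```python
import os

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # pick the GPU, as in the original demo

model_id = "THUDM/glm-4-9b-chat"
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    trust_remote_code=True,
    device_map="auto",
)

inputs = tokenizer("Hello, who are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```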

sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
Could you upload a few of the datasets so I can test them?

View on GitHub

sixsixcoder created a comment on an issue on THUDM/GLM-4-Voice
This issue has been fixed in this [PR](https://github.com/THUDM/GLM-4-Voice/pull/80).

View on GitHub

sixsixcoder opened a pull request on THUDM/GLM-4-Voice
fixed cuda bug
fixed issue: https://github.com/THUDM/GLM-4-Voice/issues/79