
[Bug]: LLM with ___ return cannot be deleted #5585

Closed
1 task done
lizheng419 opened this issue Mar 4, 2025 · 0 comments
Assignees
Labels
bug Something isn't working

Comments

@lizheng419 (Contributor)

Is there an existing issue for the same bug?

  • I have checked the existing issues.

RAGFlow workspace code commit ID

latest

RAGFlow image version

latest

Other environment information

latest

Actual behavior

[Screenshot attached]

Expected behavior

The llm_name returned by the my_llm interface is claude-3-7-sonnet-202502192-all___OpenAI-API. The front end strips the ___OpenAI-API suffix and displays only claude-3-7-sonnet-202502192-all. When delete_llm is then called, the trimmed name claude-3-7-sonnet-202502192-all is passed in, so the exact-match lookup fails and the model cannot be deleted.
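A minimal sketch of the mismatch described above. The helper names (`display_name`, `delete_llm`) and the list-based store are hypothetical illustrations, not RAGFlow's actual code; only the ___ separator and the model/factory naming come from the report.

```python
# Sketch of the bug: the stored name carries a "___<factory>" suffix,
# but the front end passes the trimmed display name to the delete call.
SEPARATOR = "___"

stored_llm_name = "claude-3-7-sonnet-202502192-all___OpenAI-API"

def display_name(llm_name: str) -> str:
    """Front end strips everything after the separator for display."""
    return llm_name.split(SEPARATOR)[0]

def delete_llm(records: list[str], llm_name: str) -> bool:
    """Delete by exact match -- misses when given the display name."""
    if llm_name in records:
        records.remove(llm_name)
        return True
    return False

records = [stored_llm_name]
shown = display_name(stored_llm_name)        # "claude-3-7-sonnet-202502192-all"
assert delete_llm(records, shown) is False   # bug: exact match fails
assert delete_llm(records, stored_llm_name)  # full name would delete
```

The root cause is simply that the delete path compares against the full stored name while the UI holds only the trimmed label.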

Steps to reproduce

1

Additional information

No response

@lizheng419 lizheng419 added the bug Something isn't working label Mar 4, 2025
@cike8899 cike8899 self-assigned this Mar 4, 2025
cike8899 added a commit to cike8899/ragflow that referenced this issue Mar 4, 2025
KevinHuSh pushed a commit that referenced this issue Mar 4, 2025
### What problem does this PR solve?

Fix: LLM with ___ return cannot be deleted #5585

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
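One plausible shape for such a fix (a hypothetical sketch, not the actual PR diff): let the delete path fall back to matching on the portion of the stored name before the ___ separator, so the trimmed display name also resolves.

```python
# Hypothetical fix sketch: accept either the full stored name or the
# display name (the part before "___") when deleting.
SEPARATOR = "___"

def delete_llm(records: list[str], llm_name: str) -> bool:
    """Exact match first; otherwise match on the prefix before ___."""
    if llm_name in records:
        records.remove(llm_name)
        return True
    for name in records:
        if name.split(SEPARATOR)[0] == llm_name:
            records.remove(name)
            return True
    return False
```

An alternative, also hypothetical, is to fix it purely on the front end: keep the full `name___factory` string as the option's value and trim it only for the rendered label.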

3 participants