Issues: ollama/ollama

I hope ollama completes my command input [feature request]
#4507 opened May 17, 2024 by taozhiyuai
sh: line 252: VERSION_ID: unbound variable [bug]
#4505 opened May 17, 2024 by aBaldoqui
Creating your own model is not reliable [bug]
#4503 opened May 17, 2024 by dehlong
paligemma [model request]
#4499 opened May 17, 2024 by wwjCMP
Add option to disable Autoupdate [feature request]
#4498 opened May 17, 2024 by Moulick
Ollama 0.1.38 has high video memory usage and runs very slowly [bug]
#4497 opened May 17, 2024 by chenwei0930
gemma 2.0 [model request]
#4495 opened May 17, 2024 by userforsource
How to load a model from a local disk path? [feature request]
#4494 opened May 17, 2024 by quzhixue-Kimi
How can we make model calls faster? [bug]
#4493 opened May 17, 2024 by userandpass
Ollama crashes after idle and can't process new requests [bug]
#4492 opened May 17, 2024 by artem-zinnatullin
Pulling using API - session timeout (5 minutes) [bug]
#4491 opened May 17, 2024 by pelletier197
Has anybody successfully imported llama-3-8b-web? [bug]
#4489 opened May 17, 2024 by Bill-XU
Not compiled with GPU offload support [bug]
#4486 opened May 17, 2024 by oldmanjk
Import a model:latest aborted (core dumped) [bug]
#4485 opened May 17, 2024 by Anorid
Gemma:latest aborted (core dumped) [bug, memory]
#4484 opened May 17, 2024 by ManuLinares
Ollama tries to re-create an existing models path [bug]
#4480 opened May 16, 2024 by LumiWasTaken
Add GPU number to ps command [feature request]
#4479 opened May 16, 2024 by saul-jb
RAM not releasing [bug]
#4478 opened May 16, 2024 by Stampsm
Support image URLs when chatting with a vision model [feature request]
#4474 opened May 16, 2024 by dickens88