Issues: ollama/ollama
#4507 [feature request] I hope ollama completes my command input. (opened May 17, 2024 by taozhiyuai)
#4505 [bug] sh: line 252: VERSION_ID: unbound variable (opened May 17, 2024 by aBaldoqui)
#4504 [feature request] on https://www.ollama.com/library add sort filter by model strengths (opened May 17, 2024 by arjunkrishna)
#4503 [bug] Creating an own model is not reliable (opened May 17, 2024 by dehlong)
#4501 [feature request] Does Ollama currently plan to support multiple acceleration frameworks (opened May 17, 2024 by glide-the)
#4498 [feature request] Add option to disable Autoupdate (opened May 17, 2024 by Moulick)
#4497 [bug] Ollama 0.1.38 has high video memory usage and runs very slowly. (opened May 17, 2024 by chenwei0930)
#4496 [bug] Settings getting fluctuated while multiple models are being downloaded (opened May 17, 2024 by suryan-s)
#4494 [feature request] How to load a model from local disk path? (opened May 17, 2024 by quzhixue-Kimi)
#4493 [bug] How can we make model calls faster (opened May 17, 2024 by userandpass)
#4492 [bug] Ollama crashes after idle and can't process new requests (opened May 17, 2024 by artem-zinnatullin)
#4491 [bug] Pulling using API - Session timeout (5 minutes) (opened May 17, 2024 by pelletier197)
#4489 [bug] Is there anybody who successfully imported llama-3-8b-web? (opened May 17, 2024 by Bill-XU)
#4486 [bug] Not compiled with GPU offload support (opened May 17, 2024 by oldmanjk)
#4485 [bug] Import a model:latest aborted (core dumped) (opened May 17, 2024 by Anorid)
#4484 [bug, memory] Gemma:latest aborted (core dumped) (opened May 17, 2024 by ManuLinares)
#4480 [bug] Ollama tries to re-create existing models path (opened May 16, 2024 by LumiWasTaken)
#4479 [feature request] Add GPU number to ps command. (opened May 16, 2024 by saul-jb)
#4477 [feature request] Expose Max threads as an environment variable or set ollama to use all the cores/threads a CPU provides (opened May 16, 2024 by haydonryan)
#4476 [bug] langchain-python-rag-privategpt "Cannot submit more than 5,461 embeddings at once" (opened May 16, 2024 by dcasota)
#4474 [feature request] support image with url when chat with vison model (opened May 16, 2024 by dickens88)