https://ollama.com/
serve Start the Ollama service so models can run
Sample: ollama serve
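Once the server is running it also exposes a local REST API (on port 11434 by default) that other tools can call directly. A minimal sketch, assuming the default address and that llama3 has already been pulled:
Sample: curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Hello", "stream": false}'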
create Build a new model from a Modelfile for custom LLMs
Sample: ollama create mymodel -f Modelfile
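The Modelfile named above is what defines the custom model. A minimal sketch of one, where the base model, parameter value and system prompt are only examples:
FROM llama3
PARAMETER temperature 0.7
SYSTEM "You are a helpful, concise assistant."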
show Display full details of a model such as parameters and template
Sample: ollama show llama3
run Execute a model and generate output from a prompt
Sample: ollama run llama3 "Hello"
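If the quoted prompt is omitted, run opens an interactive chat session instead; type /bye to leave it.
Sample: ollama run llama3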
stop Stop a model that is currently running in memory
Sample: ollama stop llama3
pull Download a model from the Ollama registry to the local machine
Sample: ollama pull llama3
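A model name can include a tag to pick a specific size or variant; which tags exist depends on the model's registry page. For example:
Sample: ollama pull llama3:70b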
push Upload a custom model to the Ollama registry for sharing
Sample: ollama push mymodel
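Pushing normally requires the model to be named under your registry account (username/model); yourusername below is only a placeholder:
Sample: ollama cp mymodel yourusername/mymodel
Sample: ollama push yourusername/mymodel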
signin Log in to ollama.com to enable registry operations
Sample: ollama signin
signout Log out from ollama.com and end the authenticated session
Sample: ollama signout
list Show all models installed locally on your system
Sample: ollama list
ps Show all models currently running in the Ollama service
Sample: ollama ps
cp Duplicate an existing model under a new name
Sample: ollama cp llama3 llama3backup
rm Remove a model from local storage to free space
Sample: ollama rm llama3backup
help Display help and usage details for any command
Sample: ollama help run
Ollama AI LLM Commands
Malik Sikandar Hayat
Oracle ACE Pro
info@erpstuff.com