
I have been using Ollama for a while now to run LLMs locally on my laptop, which makes testing and developing my AI agents much easier. This post covers, at a glance, a number of things you can do with Ollama, which is why I call it a cheatsheet.
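
To give a feel for how this works in practice, here is a minimal sketch of querying a locally running Ollama server over its HTTP API from Python. It assumes Ollama is already serving on its default port (11434) and that a model such as `llama3` has been pulled beforehand; the model name and prompt are just placeholders for illustration.

```python
import requests

# Minimal sketch: ask a locally running Ollama server to generate text.
# Assumes `ollama serve` is running on the default port 11434 and that the
# model named below (llama3, purely as an example) has already been pulled,
# e.g. with `ollama pull llama3`.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",        # swap in whichever model you have pulled
        "prompt": "Explain what Ollama does in one sentence.",
        "stream": False,          # return the full answer as a single JSON object
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])    # the generated text is in the "response" field
```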
If you haven't tried Ollama yet, give it a go, and this cheatsheet should come in handy. You may also want to star and fork it from my GitHub gist for quick reference and fiddling.