This page lets you chat with Ollama. It talks to the Ollama API at a base URL (http://localhost:11434 by default) to load the list of available models and to send your chat requests.
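The same two requests the page makes can be reproduced from a script. Below is a minimal sketch in Python using only the standard library; it assumes Ollama is running locally at the default base URL, and the model name "llama3" is just an example of a model you might have pulled:

```python
import json
from urllib import request

# Default base URL for a locally running Ollama server.
BASE_URL = "http://localhost:11434"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's POST /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one JSON response instead of a stream
    }

def list_models() -> list:
    """Return the names of locally available models via GET /api/tags."""
    with request.urlopen(f"{BASE_URL}/api/tags") as resp:
        return [m["name"] for m in json.loads(resp.read())["models"]]

def chat(model: str, prompt: str) -> str:
    """Send one chat turn and return the assistant's reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = request.Request(
        f"{BASE_URL}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```

With Ollama running, `list_models()` shows what you can chat with, and `chat("llama3", "Hello!")` returns a single reply.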
Installation and Setup Instructions
If you haven't installed Ollama yet, please download it from the Ollama download page.
On a Mac you may need to give this site permission to access Ollama. Open Terminal and run the following, substituting the origin of the site you are using for the placeholder:

launchctl setenv OLLAMA_ORIGINS "https://your-site.example"
And if you want to access Ollama from another device on your network, you can run:
launchctl setenv OLLAMA_HOST "0.0.0.0"
Remember that you should only do this on a secure network, as it allows anyone on your network to access Ollama. If you have a firewall enabled, you will also need to add a rule allowing access to the port Ollama is using (11434 by default).
Then close and re-open Ollama (or restart your laptop) so that the new setting takes effect.
For further details on how to set an environment variable on a Mac, Windows, or Linux, see this section in the Ollama documentation. N.B. To allow a site to access Ollama you change the OLLAMA_ORIGINS environment variable, not OLLAMA_HOST.
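As a rough sketch of how that looks on each platform (the origin below is a placeholder; substitute the site you are using, or "*" to allow any origin, which is less safe):

```shell
# macOS: applies until the next reboot; restart the Ollama app afterwards
launchctl setenv OLLAMA_ORIGINS "https://your-site.example"

# Linux (systemd): add an override for the ollama service, then restart it
#   sudo systemctl edit ollama.service
#     [Service]
#     Environment="OLLAMA_ORIGINS=https://your-site.example"
#   sudo systemctl restart ollama

# Windows: set a user environment variable, then restart Ollama
#   setx OLLAMA_ORIGINS "https://your-site.example"
```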