
# Set up Ollama, ngrok, and LangChain
This post shows how to quickly set up Ollama with ngrok and use it in LangChain.

## Ollama

You can easily download it for Linux, macOS, and Windows: https://ollama.com/download

Test that it's working:

```shell
$ curl http://localhost:11434 ; echo  # should print "Ollama is running"
```

Note: if this returns nothing, run `$ ollama serve` first.

Pull your preferred models:

```shell
$ ollama pull phi4-mini
$ ollama pull nomic-embed-text:v1.5
```

## ngrok

You can find it on the official website for Linux, macOS, and Windows: https://ngrok.com/download

Note: remember to add your authtoken!

```shell
$ ngrok config add-authtoken <your-ngrok-auth-token>
```

Now you can expose Ollama with ngrok and add basic authentication (change the username and password as you like):

```shell
$ ngrok http 11434 --host-header="localhost:11434" --basic-auth="username:password"
```

This will generate a public URL such as: https://09c6b3946ddd.ngrok-free.app

You can test it easily by opening the public URL in a web browser; ngrok will prompt for the credentials you configured. When you authenticate successfully, you should see the same "Ollama is running" message.
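The ngrok `--basic-auth` prompt is plain HTTP Basic authentication, so any client can pass the same credentials programmatically by sending an `Authorization: Basic <base64>` header. A minimal sketch in Python (the URL is the placeholder from the example above, and `username:password` are the placeholder credentials):

```python
import base64
import urllib.request

# Hypothetical public URL from ngrok -- replace with your own.
NGROK_URL = "https://09c6b3946ddd.ngrok-free.app"

def basic_auth_header(username: str, password: str) -> str:
    """Build the value of an HTTP Basic Authorization header."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

# Example: query the Ollama root endpoint through the tunnel.
req = urllib.request.Request(
    NGROK_URL,
    headers={"Authorization": basic_auth_header("username", "password")},
)
# urllib.request.urlopen(req).read()  # returns b"Ollama is running" when the tunnel is up
```

The network call is left commented out because it only works while your tunnel is running.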
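With the tunnel up, LangChain can talk to the remote Ollama instance. A minimal sketch, assuming the `langchain-ollama` package; the URL is the placeholder from above, and `client_kwargs` is forwarded to the underlying Ollama/httpx client, so `auth=(user, pass)` is one way to satisfy the ngrok basic-auth proxy:

```python
from langchain_ollama import ChatOllama, OllamaEmbeddings

# Hypothetical ngrok URL and credentials -- replace with your own.
NGROK_URL = "https://09c6b3946ddd.ngrok-free.app"
AUTH = ("username", "password")

llm = ChatOllama(
    model="phi4-mini",
    base_url=NGROK_URL,
    client_kwargs={"auth": AUTH},  # httpx applies HTTP Basic auth
)

embeddings = OllamaEmbeddings(
    model="nomic-embed-text:v1.5",
    base_url=NGROK_URL,
    client_kwargs={"auth": AUTH},
)

# answer = llm.invoke("Why is the sky blue?")       # needs the tunnel running
# vector = embeddings.embed_query("hello world")    # needs the tunnel running
```

The `invoke`/`embed_query` calls are commented out because they require the tunnel and the pulled models to be available.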
Continue reading on Dev.to



