This tutorial shows how to set up HelloKiosk with the Ollama server
This tutorial walks you through using the Ollama server to download the Llama model and set up an inference server. Llama is used here only as an example; you can download any open-source AI model that Ollama supports. Ollama can be installed either on the kiosk computer itself or on a server on the local network where the kiosks are deployed.
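As a minimal sketch, the setup described above comes down to three commands. This assumes a Linux or macOS host; `llama3` is just an example model tag, and the `OLLAMA_HOST` variable is only needed when kiosks reach the server over the local network rather than running Ollama on the kiosk itself.

```shell
# Install Ollama using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Download an example model; any model from the Ollama library works
ollama pull llama3

# Start the inference server.
# By default it listens on http://127.0.0.1:11434 (same-machine kiosk only);
# bind to all interfaces so kiosks on the LAN can reach it.
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

If Ollama runs directly on the kiosk computer, the `OLLAMA_HOST` override can be dropped and the default local-only binding is the safer choice.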
That's it! Your kiosk can now hold conversations through the Ollama engine.
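As a quick check that everything is wired up, you can send a test request to Ollama's REST chat endpoint. Replace `localhost` with the server's LAN address if Ollama runs on a separate machine, and `llama3` with whichever model you pulled.

```shell
curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    { "role": "user", "content": "Hello from the kiosk!" }
  ],
  "stream": false
}'
```

A JSON response containing the model's reply confirms the server is up and the model is loaded.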