This tutorial shows how to set up HelloKiosk with an Ollama server
This tutorial walks you through using Ollama to download a Llama model and set up an inference server. Llama is used here only as an example; you can download any open-source AI model that Ollama supports. Ollama can be installed either on the kiosk computer itself or on a server on the same local network as the kiosks.
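As a sketch of the setup, the commands below install Ollama on a Linux machine, pull a model, and start the server. The model name `llama3` is only an example; substitute any model from the Ollama library. Binding to `0.0.0.0` is only needed if the kiosks reach the server over the local network rather than running Ollama locally.

```shell
# Install Ollama (official install script for Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model; "llama3" is an example, any supported model works
ollama pull llama3

# Start the inference server.
# By default it listens on localhost:11434; set OLLAMA_HOST so
# kiosks on the local network can reach it.
OLLAMA_HOST=0.0.0.0 ollama serve
```

On macOS and Windows, Ollama is instead installed from the desktop installer, after which the same `ollama pull` and `ollama serve` commands apply.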
That's it! HelloKiosk can now hold conversations powered by the Ollama engine.
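To illustrate what a conversation turn looks like on the wire, here is a minimal sketch of the JSON body that Ollama's `/api/chat` endpoint expects. The helper function and the example message are illustrative, not part of HelloKiosk itself; the host and port assume Ollama's default of `localhost:11434`.

```python
import json

# Default Ollama endpoint; replace "localhost" with the LAN address
# of the server if Ollama runs on a separate machine.
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def build_chat_request(model, messages):
    """Build the JSON body for Ollama's /api/chat endpoint.

    `messages` is a list of {"role": ..., "content": ...} dicts,
    following the usual chat-message convention.
    """
    return json.dumps({
        "model": model,
        "messages": messages,
        "stream": False,  # ask for a single complete reply
    })

body = build_chat_request(
    "llama3",  # example model name; use whichever model you pulled
    [{"role": "user", "content": "Hello from the kiosk!"}],
)
```

The resulting `body` would then be sent as an HTTP POST to `OLLAMA_CHAT_URL` (for example with `urllib.request` or `requests`), and the reply text arrives in the response's `message.content` field.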