Open WebUI: A nuanced approach to locally hosted Ollama

Open WebUI offers a robust, feature-packed, and intuitive self-hosted interface that operates seamlessly offline. It supports various LLM backends, including Ollama and OpenAI-compatible APIs, and it is open source under the MIT license. It is easy to download and install, and it has excellent documentation. I chose to install it on both my Linux computer and my M2 MacBook Air. The software is written in Svelte, Python, and TypeScript and has a community … Read more

Exploring Hollama: A Minimalist Web Interface for Ollama

I’ve been continuing the large language model learning experience with my introduction to Hollama. Until now, my experience with locally hosted Ollama had been querying models with snippets of Python code, using it in REPL mode, and customizing models with plain-text Modelfiles. Last week that changed when I listened to a talk about using … Read more
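For readers unfamiliar with the customization mentioned above, here is a minimal sketch of what an Ollama Modelfile can look like. The base model, temperature value, and system prompt are illustrative choices, not ones from the posts:

```
# Illustrative Modelfile: build a customized model on top of a base model
FROM phi3
PARAMETER temperature 0.4
SYSTEM """You are a concise assistant who answers in plain language."""
```

A file like this is registered with `ollama create mymodel -f Modelfile`, after which the customized model can be run like any other.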

In search of the right GPU

I rely on my powerful Intel NUC with an i7 processor and 64 GB of RAM for my daily computing needs. However, it lacks a GPU, which makes it unsuitable for the experimentation I’ve been conducting with locally hosted large language models. To address this, I use an M2 MacBook Air, which has the necessary … Read more

Evaluating writing using open source artificial intelligence

In today’s digital age, writers seek tools that enhance their craft with real-time feedback and assistance. Enter Ollama, an open-source system that makes it easy to run large language models locally for natural language processing tasks. Coupled with the Phi-3 model, this pairing promises practical benefits … Read more

Using Ollama to evaluate what you write

Last year, a colleague shared with me how he used ChatGPT to evaluate content generated by his students at a local university. I was intrigued by the idea but also concerned about the ethical issues surrounding using publicly available LLMs to read student writing without their consent. This led me to look for locally hosted … Read more

Using Python to talk with an Ollama model

Continuing my exploration of using a locally hosted Ollama on my Linux desktop computer, I have been doing a lot of reading and research. Today, while having lunch with a university professor, he asked me some questions I didn’t have an immediate answer to. So, I went back to my research to find the answers. … Read more
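As a taste of what querying a local model from Python involves, here is a minimal sketch using only the standard library against Ollama's default REST endpoint on port 11434. The model name `phi3` and the prompt are illustrative; the post itself may use the `ollama` Python package instead:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot (non-chat) generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    # stream=False asks for a single JSON reply instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running locally with the model already pulled
    print(ask_ollama("phi3", "Explain local LLM hosting in one sentence."))
```

The same request shape works for any model Ollama has pulled; only the `model` field changes.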