Simple RAG with Ollama, OpenWebUI, and VectorDB on Ubuntu 22.04

Prerequisites: I have already installed the NVIDIA proprietary drivers and the NVIDIA CUDA Toolkit. I documented the CUDA Toolkit install in an older post, which can be found here. Since I have NVIDIA GPUs in my host system, and I intend to run some services in containers, I want to make sure that I …
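Before wiring containers into the stack, it helps to confirm that containerized workloads can actually reach the GPUs. A minimal sketch, assuming Docker and the NVIDIA Container Toolkit are already installed (the CUDA image tag below is illustrative; choose one compatible with your driver version):

```shell
# Verify the host driver is working first
nvidia-smi

# Confirm a container can see the GPUs through the NVIDIA runtime
# (image tag is an example; match it to your installed driver)
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

If the second command prints the same GPU table as the first, the container runtime is passing the devices through correctly.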

Essential Commands to Monitor Nvidia GPUs in Linux

Identify Your GPU Via the Linux CLI

Verify that your card is recognized by the OS with the CLI command below, hwinfo:

# hwinfo --gfxcard --short
graphics card:
  nVidia TU104GL [Tesla T4]
  nVidia TU104GL [Tesla T4]
  Matrox G200eR2
Primary display adapter: #58

Or you can see similar output with lshw:

# lshw -C display
*-display …
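Once the card is recognized, the standard monitoring tool is nvidia-smi, which ships with the proprietary driver. A few common invocations (a sketch; field names for --query-gpu come from `nvidia-smi --help-query-gpu`):

```shell
# One-shot overview: driver/CUDA version, utilization, memory, temperature
nvidia-smi

# Refresh the display every 2 seconds, similar to `watch`
nvidia-smi -l 2

# Query specific fields as CSV, handy for logging over time
nvidia-smi --query-gpu=timestamp,name,utilization.gpu,memory.used,temperature.gpu --format=csv
```

The CSV form is useful for piping into a log file or a plotting script when you want utilization history rather than a live view.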

Ollama CLI Quick Start Guide and Tutorial for Beginners – Part 1

This two-part guide introduces beginners to using Ollama, focusing initially on installation and model pulling. It explains essential commands and service configurations, discusses memory management, and provides troubleshooting tips for ensuring successful model installation and interaction with Ollama through the command line interface.
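The essential commands the guide covers can be sketched as follows (assuming Ollama is installed and its service is running; the model name llama3 is an example, any model from the Ollama library works):

```shell
# Download a model from the Ollama library (model name is illustrative)
ollama pull llama3

# Start an interactive chat session with the model
ollama run llama3

# List models installed locally
ollama list

# Show currently loaded models and their memory usage
ollama ps
```

ollama ps is the quickest way to check memory management behavior, since it shows whether a model is resident in GPU memory, system RAM, or split across both.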