I rely on my powerful Intel NUC, with an i7 processor and 64 GB of RAM, for my daily computing needs. However, it lacks a discrete GPU, which makes it unsuitable for the experimentation I’ve been doing with locally hosted large language models. For that work, I use an M2 MacBook Air, which has enough power for some of these tasks.
I had helped some local folks purchase a refurbished Dell computer. Within a couple of months they began to have trouble with it, and by then it was past the ninety-day warranty. Rather than see them lose their money, I wrote them a check for the original purchase price.
I believe that when you do good things, you will be rewarded in some fashion. I helped these folks purchase a new Dell Inspiron desktop with a full factory warranty, and as I was about to leave their home, they asked me if I wanted to take the defective computer. I thought I might be able to fix it or use it for parts. When I removed the cover, I discovered that this OptiPlex 5060 with an i5 CPU didn’t have a traditional hard drive as I had thought; instead, it was equipped with a Western Digital SN 270 NVMe drive. I also discovered that the only thing wrong with the unit was a bad external power switch. Once I removed the front bezel, I was easily able to power the device on.
Karma was working once again in my favor, as I have found it does when you do for others as you would have them do for you. I erased the Windows 11 install and installed Linux Mint 22 in its place. This unit also had two open low-profile expansion slots, and I wondered if I could find a graphics card to fit them that would let me experiment with Ollama and other LLMs. After some research, I decided to purchase an XFX Speedster SWFT105 Radeon RX 6400 gaming graphics card with 4 GB of VRAM from Amazon. The card arrived a couple of days later, and I installed it in one of the expansion slots.
After installing the card, I put the cover back on the machine, connected a spare Sceptre 27-inch display and an Ethernet cable, and downloaded Ollama and the Phi3 model. I also downloaded and installed the ROCm modules, which help Ollama recognize the GPU. Ollama reported that it recognized the GPU when the installation finished. I think Ollama and the Phi3 model run faster on this unit, but maybe that’s wishful thinking. I also wanted to try Stable Diffusion on this computer, so I used Easy Diffusion, which I had installed on the NUC before. I was frustrated to discover that my RX 6400 and its GPU don’t work with Easy Diffusion. Am I missing something? Is there a fix?
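If you want to check whether the GPU is actually speeding things up rather than trusting an impression, here is a minimal sketch that uses the Ollama Python client to time a Phi3 response. It assumes the ollama Python package is installed, the Ollama service is running locally, and the phi3 model has already been pulled; the prompt is just an example.

```python
# A rough timing check for a locally hosted model through the Ollama Python client.
# Assumes the `ollama` package is installed (pip install ollama), the Ollama service
# is running on this machine, and the phi3 model has already been pulled.
import time

import ollama

prompt = "Explain in two sentences what a GPU does."  # example prompt

start = time.perf_counter()
result = ollama.generate(model="phi3", prompt=prompt)
elapsed = time.perf_counter() - start

print(result["response"])
print(f"Wall-clock time: {elapsed:.1f} s")

# Ollama also reports how many tokens it generated and how long generation took
# (eval_duration is in nanoseconds), which gives a rough tokens-per-second figure.
tokens = result["eval_count"]
seconds = result["eval_duration"] / 1e9
print(f"Roughly {tokens / seconds:.1f} tokens/second")
```

Running the same prompt on the NUC and on the OptiPlex with the RX 6400 would turn a vague sense of “faster” into an actual tokens-per-second comparison.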
If you’re reading this and know of a fix for this issue, I hope you’ll share it. I’d love to find an answer. Nonetheless, doing good for others always results in good coming back to you.