Buying a desktop in 2023

I bought my last desktop in 2014.  It was a very high-end machine at the time, and while I’ve had several new laptops since then, the desktop long remained the workhorse of my gaming setup.  But with the recent AI craze, I found that my desktop didn’t have enough power to run stable-diffusion (the AI art program) or even GPT4All (an open-source ChatGPT alternative).

So I decided to finally get a new desktop, and it was harder than expected.  I bought my 2014 desktop at Fry’s Electronics, which went under during the pandemic.  With them gone, the only computer stores nearby are a fleet of Best Buys.  Best Buy isn’t bad, but I’ll warn you that it won’t come across well in this story.

When I went to Best Buy for a new computer, all I knew was that I wanted a machine powerful enough to run stable-diffusion.  And I figured that in this day and age, maybe I don’t need a desktop to do the most powerful computing.  Desktops seem like dinosaurs these days; most of my coworkers only have laptops or tablets.  I even know some people whose only computer is their phone.  So maybe a top-end laptop could do what I want?

But looking for laptops in Best Buy felt like trawling a souk for antiquities.  There was a huge language barrier, and no one seemed to know what I wanted.

I did some homework online, and it turns out that AIs don’t just need a powerful graphics card, they need a very specific type of card: an NVIDIA card with a lot of VRAM.  NVIDIA is needed because only its cards support “CUDA,” which is what makes AIs go.  CUDA is NVIDIA’s platform for parallel computing and heavy math on the GPU, and most AI software is built on top of it.  I know the AMD stans will tell me that there are libraries to run stable-diffusion on AMD, but installing stable-diffusion is already a pain, and trying to install CUDA work-arounds using barely-commented GitHub files is too much work for a simple hobby.
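(If you’ve already got a machine handy and want to know whether it can make AIs go, the quickest check I know of is to ask PyTorch, the library most stable-diffusion setups run on top of.  This is just a minimal sketch, assuming PyTorch is installed; on most AMD or Intel machines it will simply say no:)

```python
# Quick check for a CUDA-capable (i.e. NVIDIA) GPU via PyTorch.
# Assumes PyTorch is installed: pip install torch
import torch

if torch.cuda.is_available():
    print("CUDA device found:", torch.cuda.get_device_name(0))
else:
    print("No CUDA device found -- AI tools will fall back to the (slow) CPU.")
```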

And in addition to an NVIDIA card, you also need the card to have plenty of VRAM.  VRAM stands for video RAM, and it’s what lets graphics cards work at their best.  How it was explained to me is that your PC and your graphics card are like 2 major cities connected by a single dirt path.  Each city has its own big highway system, so moving data within them is quick and easy, but moving data between them is slooooooooooooooooow.  So modern cards use VRAM, which is like a data warehouse for GPU-land.

This is important because GPU-land is the part of the computer specialized for complex math.  In the old days, the demand for math processing was primarily driven by video games, which needed to calculate position and momentum of thousands of characters and particles across 3D space.  This is why GPUs are most associated with video games, but recently crypto-mining and AI have also emerged as major drivers of GPU demand since they have their own high-end math requirements.

Without enough VRAM, every time the GPU does a calculation it has to store its answer in main system memory, then ask for that answer back if it needs it for the next calculation.  It goes sort of like this:

The computer says: “What’s the square root of 2+7 over 77+23?”

The GPU says “OK 2+7 is 9.  Now what was in the denominator?”

Computer: “77+23”

GPU: “OK 77+23 is 100.  Now what was in the numerator?”

Computer: “Well, you just told me 2+7 was 9”

GPU: “OK 9/100 is 0.09.  Is that all you wanted?”

Computer: “You forgot to square-root it”

GPU: “OK, the square root of 0.09 is 0.3”

Computer: “Did you say 0.3000000000000000004?  Sounds right to me”

GPU: “Don’t forget to check for floating point errors.  See you next time!”

That’s a lot of cars going back and forth along the dirt road, and it makes for slow computing.  But with VRAM, the GPU can store all its answers locally and only talk to the computer when it’s finished calculating.  This clears a hell of a lot of traffic off the road, and without enough VRAM most modern AIs just don’t work.
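(If you want to watch the dirt road in action, here’s a toy sketch, assuming PyTorch and an NVIDIA card.  It runs the same chain of calculations two ways: bouncing every intermediate answer back to system memory, versus keeping everything on the card until the end.  Exact timings will depend on your machine, but the round-trip version should lose badly:)

```python
# Toy illustration of the "two cities and a dirt road" problem:
# shuttling intermediate results between system memory and the GPU
# versus keeping them on the card.  Assumes PyTorch with a CUDA GPU.
import time
import torch

x = torch.rand(4096, 4096)  # data starts in system memory ("computer city")

def round_trip(a, steps=20):
    # Every step ships the data to the GPU, computes, and hauls it back.
    for _ in range(steps):
        a = (a.cuda() @ a.cuda()).cpu()
        a = a / a.max()          # normalize so values don't blow up
    return a

def stay_on_gpu(a, steps=20):
    # Data moves to the GPU once and stays in "GPU city" until the end.
    a = a.cuda()
    for _ in range(steps):
        a = a @ a
        a = a / a.max()
    return a.cpu()

for fn in (round_trip, stay_on_gpu):
    torch.cuda.synchronize()
    start = time.perf_counter()
    fn(x)
    torch.cuda.synchronize()
    print(f"{fn.__name__}: {time.perf_counter() - start:.2f} seconds")
```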

So I knew I wanted a lot of VRAM, and the internet told me 16GB was a good number.  I also knew I needed an NVIDIA graphics card.  But finding all that at Best Buy was an exercise in frustration.  

I would walk up to a computer to check its specs.  The tag says it has an NVIDIA card with 16GB of RAM.  16GB?  That’s way too small to be the storage, so that 16GB must be the VRAM, right?  It also says it has a 512GB solid state drive, which I assume is the computer’s main storage.  So half a terabyte of storage and 16GB of VRAM, that’s exactly what I want, right?  But on closer inspection of the actual computer and not the tag, it says it has an Intel graphics card.  It seems this model of laptop can come with either an Intel or an NVIDIA card, and while the tag says NVIDIA, the computer itself says Intel.  So this is not what I want.

The next computer over does say NVIDIA, and it’s got a whole terabyte of storage.  It still says 16GB RAM, so I guess it’s a buy, right?  Well, dxdiag is a simple Windows command that tells you a computer’s specs, and I run it on this computer just to check.  It turns out that the 16GB is actually 6GB of display memory plus 8GB of shared memory.  I guess Best Buy uses base 8 math, where 6+8=16.  That would explain their prices, but 6+8 isn’t what I’m looking for.
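(Side note: if you don’t feel like clicking through dxdiag’s tabs on every machine, it can also dump its report to a text file you can search.  A rough sketch, with the caveat that the exact label names in the report seem to vary a bit between Windows versions:)

```python
# Dump a dxdiag report to a text file and pull out the graphics memory lines.
# Windows-only.  Label names ("Display Memory", "Dedicated Memory",
# "Shared Memory") may differ slightly between Windows versions.
import subprocess

subprocess.run(["dxdiag", "/t", "dxdiag_report.txt"], check=True)

with open("dxdiag_report.txt", errors="ignore") as report:
    for line in report:
        if any(key in line for key in
               ("Card name", "Display Memory", "Dedicated Memory", "Shared Memory")):
            print(line.strip())
```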

Even worse, I do some searching and find that only display memory is “true” VRAM.  The 8GB of shared memory is actually just normal RAM that is “reserved” for the graphics card.  Using the analogy from above, it’s like the GPU city owns a warehouse in the Computer city, so when it has too much data it can offload it there for pickup later.  The problem is that to move that data it still has to go back and forth down the dirt path between the two cities, which means it’s still very slow.  So for my purposes, 6+8=0.
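(For anyone checking at home, the AI libraries themselves seem to agree: what PyTorch reports as a card’s total memory is the dedicated part only, with the “shared” system RAM left out of the count.  A minimal sketch, again assuming PyTorch and an NVIDIA card:)

```python
# PyTorch reports only the card's dedicated VRAM; the "shared" system RAM
# that dxdiag adds on top doesn't show up here.  Assumes an NVIDIA GPU.
import torch

props = torch.cuda.get_device_properties(0)
print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB of dedicated VRAM")
```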

But here’s the thing: I’m not an expert, so I don’t know if “display memory” really is the same thing as “VRAM.”  I’m only assuming it is.  But maybe I’m wrong and the VRAM is listed elsewhere?  I flag down a Best Buy employee and ask him what display memory actually is.  He tells me “oh, it makes the graphics card go faster, but it doesn’t make it more powerful.”  That’s incredibly generic, so I ask him if “display memory” is the same as VRAM.  He says “I think kinda, yeah,” and at that point I realize he doesn’t know any more than I do, so I thank him for his time and leave.

I need true VRAM, so now I just start running dxdiag on every computer on the floor.  I find that all of them are set up like the 6+8 laptop, and none of them have a lot of “true” VRAM.  Looking online, it also seems like NVIDIA has sneakily given its laptop cards the same names as its desktop cards, despite the laptop cards having much lower specs.  I knew the 4070 and 3060 were “good” NVIDIA cards, but the laptop versions are paltry imitations of the real thing and not good enough for AI.  So it turns out I do need a desktop.

OK, well, I’m still at Best Buy, so I wander over to their desktop area.  I no longer trust tags, so I just run dxdiag on anything I see.  And there I seem to strike the mother lode: 24GB of display memory, holy crap that’s a lot of VRAM!!  Oh, it’s an AMD card.  Well, AMD may be cheaper and have way more VRAM, but it doesn’t have the CUDA, so it’s a no-go.

I finally go over to Geek Squad, Best Buy’s in-house specialists, and ask if they do build-a-desktop services.  It turns out no, that’s a service they discontinued a long time ago.  I can buy parts to build it myself, but Best Buy can’t build it for me.  I ask who could build me a computer, and every member of Geek Squad, plus a randomly patrolling employee, tells me to try Micro Center instead.  So I head there.

Micro Center was the exact opposite of Best Buy.  As soon as I started looking at graphics cards, an employee came up to ask if I had any questions.  I asked him my questions about VRAM and display memory, and he was able to point me to a specific card that had plenty of VRAM and that he told me was very good for AI.  He also gave me ideas of other cards I could buy if I wanted to move up or down in power and price, and when I finally settled on which card to buy, he offered to pick out every part I needed for a computer and put them together for me.

This was exactly what I needed: a build-a-desktop service with an expert who could actually help me buy something.  We went over all the parts, and I made whatever changes I wanted from what he suggested.  Then 2 days later I had a desktop built for just $2,000.  That may seem like a lot, but laptops with way less power were selling for $1,800, and the only laptop that seemed even capable of doing what I wanted had a $2,500 price tag.  I only just got the desktop back to my house, so I still have a few weeks before I find all the things I hate about it, but I’m already liking Micro Center a lot more than Best Buy.

Overall, buying a computer in 2023 is still the overly complicated mess it’s always been.  If you just need to write emails to your grandkids, Best Buy has $180 laptops that will probably do you fine.  But if you want the kind of power needed to play modern games and do modern activities, trying to parse all the various GPUs with their CUDAs and VRAMs and so on is way more of a hassle than it should be.

I wish more computer sellers were knowledgeable about what they’re selling.  I don’t need all of them to be experts in AI hardware, but if they could at least tell me what all the parts mean, I’d have been a lot happier.  Shouldn’t a car salesman be able to explain miles-per-gallon and what a hybrid is?  As it stands, I was dumbstruck by how helpless most of the salesfolk were, and how little the GPU business has changed in decades.  In 2008 the late Shamus Young wrote an article complaining about how confusing it was to buy a graphics card, and nothing has gotten better since then.

Maybe someday I can ask an AI what kind of graphics card I need to run it.  Then ask the AI to build it, and maybe ask the AI to install itself on there for me.  Some people are scared of AI, but I think if Skynet ever does become self-aware and try to self-replicate, just reading its own hardware requirements will give it enough of an aneurysm to drop it back down to pre-sentience.  Until then, I can’t say I’m looking forward to doing all this again in a few years’ time.
