How do I select a computer for ArcGIS Pro Spatial Analysis

12-05-2023 08:20 AM
RonaldHaug
Occasional Contributor II

Hi, I am looking for a used mid-level computer for less than $1,000 to do spatial analysis with. Does anyone have recommendations for a computer with an Intel i7 9th- or 10th-generation chip, a GPU with CUDA, an adequate power supply, and a terabyte of storage? What are the relative merits and drawbacks of using a hard drive, solid-state storage, and NVMe solid-state storage? I'm not an electrical engineer, but what I have seen is that faster chips use more power and generate more heat when running programs, and they need an adequate power supply and cooling. I would rather have an adequately fast machine that doesn't use too much power and tops out around 55 degrees Celsius. Is it better to buy a tower rather than a small form factor or laptop?

Thank you! I appreciate your carefully considered opinions and experiences.


10 Replies
VinceAngelo
Esri Esteemed Contributor

I built a minimal-power-draw, high-end gaming-class system (65W AMD Ryzen 7 (8-core), 32GB RAM, 2x 500GB M.2 drives) with a GeForce RTX 4060 graphics card for around $1K not too long ago. It was originally planned with motherboard graphics, but that was too laggy for dual HD displays, so I got the RTX on sale. It doubled the power draw during graphics benchmarking, with more than ten times the frames per second, but is still well under the capability of the 80+ Platinum 650W power supply. It runs fairly cool with just two (silent) case fans and the default fan-on-a-heat-sink CPU cooler, and our teenager is jealous of mom's WFH desktop GPU.

You can build a new system that meets your requirements or work with a build-to-suit vendor to do so, as long as you don't need to purchase new monitors, too.

- V

Brian_Wilson
Occasional Contributor III

We've used Dells here at work for years, and they have a bunch in your budget. I have a tower case at home, but today I would get a smaller one because there is no longer any reason to have multiple 3.5" drive bays. I prefer desktops because they are flexible and easy to work on and upgrade.

Don't even think about using a spinning hard drive for anything anymore other than backups.

Use NVMe for primary storage (operating system and apps) and add an SSD if you need more space.

I have 512GB here, but the computer's a year old; it's plenty for my current work. I'd be inclined to go with maybe 2TB if I were building a new one.

After 5 minutes looking at the Dell site, I'd be inclined to pick a medium-priced model for around $800-900 and then go get an Nvidia card for around $200. Dell seems to offer either no video card or a $1,000 video card, with nothing in between.

I had a pricey "workstation" Nvidia card at one point; I could not tell any difference for ArcGIS. Maybe it would do more for Autodesk products? At home I now have a $250 "gamer" card with the "Studio" driver installed. Works fine.

If you have the option to shop for NVMe, speeds vary: faster = more $. In other words, the fact that it's "NVMe" does not guarantee it will be faster than a SATA SSD. Also, the interface on the computer itself has to be PCIe Gen 4 to use the fast Gen 4 NVMe drives.
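
If you want to put numbers on a drive before and after an upgrade, here's a rough sketch -- plain Python, no extra packages. The file path and test size are placeholders, and the result is only indicative (OS caching flatters reads, which is why I only time the write), but the gap between a SATA SSD (~500 MB/s) and a Gen 4 NVMe drive (several GB/s) is usually obvious.

import os, time

PATH = "testfile.bin"            # put this on the drive you want to test
SIZE_MB = 1024                   # writes a 1 GB test file
chunk = os.urandom(1024 * 1024)  # 1 MB of random data

start = time.perf_counter()
with open(PATH, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())         # force the data to disk so the timing is honest
elapsed = time.perf_counter() - start
os.remove(PATH)
print(f"sequential write: {SIZE_MB / elapsed:.0f} MB/s")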

 

RonaldHaug
Occasional Contributor II

Thank you both for your thoughtful replies. Since nobody else chimed in, I'm going to wrap this up with my thoughts. In the last couple of weeks, I found mid-level Dell machines with Intel i7-9700 processors, 32 GB of RAM, and 1 TB of NVMe storage, but they did not have discrete GPUs with CUDA cores or sufficient power supplies.

I think as spatial analysis and AI become more mainstream, my hope is that computer manufacturers will rise to the occasion and provide us with off-the-rack machines. What I see us needing are reasonably fast machines, graphics cards with CUDA coprocessors, enough RAM and storage capacity, and power supplies that feed the needs of our machines. I'm counting on companies like Esri to give manufacturers a nudge and the specs required for doing the work.

I don't think these machines need to eat up a lot of power or get so hot they catch on fire. I would rather have reliability and a good processing cruising speed. Another concern I have is how much energy it takes to run computers and the internet -- it's a lot, with data centers and networks estimated at a few percent of global electricity use.

On another note, may I suggest gyms with gen-bikes (bikes with generators attached to them which plug into the wall and feed the grid) to help us power the future.

Thanks again, Vince and Brian.

Brian_Wilson
Occasional Contributor III

You need to dig around on the Dell site for the option to customize. I only checked the "XPS" model, but they basically let you choose everything -- you can add more RAM, a bigger power supply, and perhaps a $300 Nvidia card.

I am not a "Dell fanboy" 🙂 Personally, I spec out every component of my own desktop computers and build them myself. It's a hobby for me; having Dell build the computer for you is more practical for most people.

You should get yourself a "Kill A Watt" meter and measure what your equipment ACTUALLY uses. A modern desktop draws only about 25-50W when it's just sitting there. The big power supplies like 1000W are generally like the 5.7 HEMI in a Dodge supercar: you don't need it, you just get bragging rights with your gamer friends.

Go to an estimator like this one:

https://outervision.com/power-supply-calculator

I popped in some numbers and it says you need about a 350W supply, so the stock 450W would be enough... but you could bump up to 750W if you want -- it won't use more power; it will just run cooler and allow for more expansion later. They let you estimate how much power is used if you do rendering (or gaming), and it's really not much -- still under 400W.
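
If you'd rather see the arithmetic than trust the web form, here's a minimal sketch of the same estimate. Every wattage is an assumption for a typical mid-range build -- swap in the numbers for your actual parts.

parts = {
    "CPU (65W i7/Ryzen class)": 65,
    "GPU (RTX 4060 class)": 115,
    "motherboard + RAM": 50,
    "NVMe + SSD": 10,
    "fans, USB, misc.": 25,
}
load = sum(parts.values())   # 265 W under full load
margin = 1.3                 # ~30% headroom keeps the supply quiet and efficient
print(f"estimated load: {load} W; suggested supply: {load * margin:.0f} W")  # ~345 W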

I also used to be very active with human power stuff. Sadly humans are pathetic as electrical generators. Getting 50-100W from one person for sustained periods is asking quite a bit. So you need 5 or more people to offset your one computer. Solar panels are great!

 

RonaldHaug
Occasional Contributor II

Hi Brian, I really do feel like a rat in a maze when searching for the right computer. I'm quite surprised more people haven't contributed to this post; either they are lost in the corn maze like me or just can't be bothered to help a pilgrim find his way. After reading your post, last night I looked around on Dell's site at the Precision line, which had a number of choices for machines with GPUs with thousands of CUDA cores and 500-watt power supplies.

I have seven questions and maybe you or someone else can answer them for me.

1) What are the differences among Dell's product lines: Inspiron, Optiplex, Precision, and XPS?

2) What are the differences among Intel's processor lines (i3, i5, i7, and i9), and do they overlap temporally (is an old i5 like a new i3, an old i7 like a new i5, an old i9 like a new i7)?

3) What are the differences among Nvidia's many GPU offerings?

4) Do ArcGIS Pro's geospatial analysis tools only work with Nvidia CUDA, or can I use a GPU like Intel Arc or AMD Radeon?

5) Which computer builds are in the sweet spot for multivariable raster and vector spatial analysis?

6) Do all these product offerings provide significantly different experiences and results?

7) Do I really need to be this confused by all these product offerings?

I want something that works well, is reliable, and will last 10 years.

 

Brian_Wilson
Occasional Contributor III

1) I have torn apart a lot of computers, including a few Dells. They really are better made as you go up the line -- things like how easy they are to service and whether the case has sharp edges. Dell also markets them differently: they guide you along by asking what kind of customer you are. I would pay more attention to specs. Unless you open the computer regularly, it does not matter whether it's easy to service or the case has sharp edges. Low-end ones will have cheaper power supplies and connectors, etc., but it's hard to tell without looking.

2) Around 7 years ago I built an i9 super-duper computer at home, and it's still way faster than I need, but it won't run Windows 11. Older processors will generally be slower and use more power. From a practical point of view, for my own uses, I can't tell the difference between my i9/Nvidia rig at home and my i5 laptops at home and here at work. Yes, the i9 is snappy, but it does not affect my work enough to matter.

I have to say the 24" monitor I got at home is AWESOME; it's so crisp and sharp it almost feels like I need new glasses when I am at work. I don't remember what it is (it's a Dell); it must be 4K.

3) My video card at home came with a game that showed off rendering reflections and lighting. Awesome for games; probably useless for GIS. I am pretty convinced that for GIS work any card meeting the minimum specs is fine. I'd love to try doing processing on the video card but have never had the time or the need. Someone in another posting here pointed me at CUDA Python https://developer.nvidia.com/cuda-python (a rough sketch follows this list). Maybe next year after I retire...

5) This is for someone else to answer. These days I keep the servers running and the webmaps humming. There is no sweet spot. It used to be that no matter what I did, Esri would use one or two cores on my killer 16-core i9; I'm not sure where they are with that today. (There's one arcpy setting worth knowing; see the sketches after this list.)

6) No.

7) No. Get something, try it, don't look back. There's almost no chance it will work for 10 years; MS and Esri will see to that.
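
Since I mentioned GPU processing (point 3) and the core-count complaint (point 5), two quick sketches for anyone curious. The first uses Numba's CUDA support -- a different package than the CUDA Python bindings linked above, but the same idea -- and assumes an Nvidia card plus "pip install numba". The second is the one arcpy knob I know of for the core problem; it's a real environment setting, though not every tool respects it.

# Sketch 1: a trivial CUDA kernel from Python via Numba. Not Esri's API,
# just a demo of running your own code on the card.
import numpy as np
from numba import cuda

@cuda.jit
def add(a, b, out):
    i = cuda.grid(1)              # this thread's global index
    if i < out.size:
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)
threads = 256
blocks = (n + threads - 1) // threads
add[blocks, threads](a, b, out)   # Numba copies the arrays to and from the GPU
print(np.allclose(out, a + b))    # True

# Sketch 2: let geoprocessing tools that support it use every core.
import arcpy
arcpy.env.parallelProcessingFactor = "100%"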

RonaldHaug
Occasional Contributor II

Hi Brian, Thank you for replying again to my most recent post. I appreciate conversing with you over this platform. So I've been searching the internet and found a few new things to relay to you and anyone else who may be following along.

1) The Dell Precision line seems to have more of what I'm looking for, and the price is about $2,000. It's got the chip, the memory, the Nvidia graphics card, NVMe storage, and a power supply, ready to go off the rack.

2) On unicornplatform.com I learned that graphics cards wear out and last 3-5 years. Huh?

3) On esri.com I learned that ArcGIS Pro's GPU-accelerated geoprocessing tools only run on Nvidia CUDA (Compute Unified Device Architecture) so far. Does anyone know why this is so? This puts AMD and now Intel at a disadvantage.

4) Some of the GPUs for CAD and graphic design are very expensive, which leads me to wonder if there are fundamental differences between gaming GPUs and CAD/graphic-design GPUs. Does anyone know what's going on here?

5) I've been fretting about what to buy as time is running out for my old i5 processor running Windows 10, BUT that day of reckoning is not until October 14, 2025, when support stops. So you know what? I'm going to put off that purchase until then. By 2025 my computer will be 11 years old.

I'll let Moore's Law work in my favor, find out the answers to my questions, and I will buy new. 

Brian_Wilson
Occasional Contributor III

2. Maybe they mean they have to replace the cards to keep up? If someone does a lot of gaming, the card is a crucial part and they want the latest to remain competitive. I have no idea. They are hardware, and they don't wear out -- they just get obsolete. Like my Quadro 2000-whatever workstation card that I paid too much for 🙂

3. CUDA is a particular architecture, just like ARM vs. Intel for CPUs. You can't run CUDA code on an AMD board because the GPU architecture is different. (The environment settings at the bottom of this post show where this surfaces in Pro.)

4. Esri employees I've asked over the years are cagey about what video card to buy. They say "any fast gamer card is fine." They used to have a list of suggested cards, but they took that down maybe 10 years ago. AutoCAD, on the other hand, has a database of officially supported cards.

AutoCAD is all about rendering lines. GIS is now a grey area, with 3D and animations and lots of mixed rendering of rasters and vectors.

Nvidia provides "Game Ready" and "Studio" drivers for the board I use. I've tried both and never noticed any difference. I suspect the Game Ready driver might actually be better for raster and animation work, but I've never experimented, and it's a pain to switch drivers.

5. Upgrade the storage you have right now to make it more fun to use in the meantime. If you have an open PCI Express slot (but no M.2 slot), you can put an NVMe drive on a $10 adapter.

I am planning on switching my Win10 desktop to Linux since I will be retired by then anyway.
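
A footnote on the CUDA point (number 3): in Pro, the GPU dependency surfaces as a pair of geoprocessing environments. These settings do exist in arcpy, but which tools honor them varies by tool and version, so treat this as a sketch rather than a recipe.

import arcpy

# GPU-capable tools (some Spatial Analyst tools, the deep learning tools)
# read these environments; they only do anything useful with a
# CUDA-capable Nvidia card installed.
arcpy.env.processorType = "GPU"   # ask GPU-capable tools to use the GPU
arcpy.env.gpuId = 0               # which card, if the machine has several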

RonaldHaug
Occasional Contributor II

Hi Brian,

An update for you and everyone who's following along.

I'm watching TV last night and I see an ad for AMD Ryzen AI (artificial intelligence) processors, and I'm thinking, "Wow! This is coming along fast."

So I google "computers for AI" and there are a whole bunch. The Dell website intrigues me, so I go there, and they have a bunch of machines in the Precision line -- and these aren't the ones I was looking at the last time I wrote on this thread.

https://www.dell.com/en-us/dt/ai-technologies/index.htm#tab0=0

For Esri spatial analysis users, the question for me is, "Will Esri or its army of super-intelligent, the-future-is-now users reprogram the existing software to take advantage of the way the new hardware works?" Will the software need to analyze the computer it runs on to determine the most efficient path to run the program through?

OK, that question answers itself. Of course! I mean, that's how GPU coprocessing came about.
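
Even today the bones of that are simple. Something like this -- it uses the Numba package, which I'm told is one way to check for a usable CUDA device from Python -- picks a code path from whatever hardware it finds:

from numba import cuda

def best_backend():
    """Pick a code path based on the hardware found at run time."""
    try:
        if cuda.is_available():   # is there a usable CUDA device?
            return "gpu"
    except Exception:
        pass                      # no driver / no card: fall back
    return "cpu"

print("running the", best_backend(), "code path")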

Anyway, I'm excited by this outburst of new hardware and look forward to seeing how it affects Esri and spatial analysis. You, too?
