Cost Distance Runs Very Slowly

08-09-2016 08:47 AM
FredIrani
New Contributor

My Cost Distance (COD) run is taking 15 hours where it once took 32 minutes on the same data. The difference is that the input used to be in native GRID format, and now I'm using TIFF for both input and output. I find it hard to believe that it takes 15 hours just to convert the input data to native scratch files and back again for output. Does COD speed up if I input a 0/1 raster rather than a NoData/1 raster? Does the speed depend on whether the input .tif is 1-, 2-, 4-, 8-, or 32-bit?

Can anyone provide some insight as to how to optimize my input rasters for fastest COD processing?

6 Replies
DanPatterson_Retired
MVP Emeritus

How many classes do you have in your cost surface?

What are their values?

Can the values be represented at a lower bit depth? (i.e., there's no point in using 32-bit if your classes are just 0, 1, and 2.)

A native GRID will run faster in any event.

More questions need answering.

FredIrani
New Contributor

Thanks very much for your quick reply. My input consists of two one-meter-resolution, 8-bit unsigned, single-class .tif files (NoData and 1) for both the input raster and the cost raster. My output is directed to a .tif file. Everything else is default. Esri seems unable to handle 1-bit correctly under v10.4.1 - try performing a Copy Raster from a 1-bit TIFF to a GRID or a file geodatabase and see what you get.

I need to process 200 raster files using ModelBuilder. I've tried using native format for input, but I've had problems with GRID output when the model is interrupted (perhaps due to overnight system problems), and I end up with GRID files that cannot be deleted, overwritten, or used in any way, so I want to avoid GRID files. The output from my 15-hour run was good; I just can't give it that much time to run.

DanPatterson_Retired
MVP Emeritus

You had better check the help

Understanding cost distance analysis—Help | ArcGIS for Desktop

Tip:

If your cost raster does contain values of 0, and these values represent areas of lowest cost, change these values to a small positive value (such as 0.01) before running Cost Distance. You can do this with the Con tool. If areas with a value of 0 represent areas that should be excluded from the analysis, these values should be turned to NoData before running Cost Distance, by first running Set Null.
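For example, a minimal arcpy sketch of that tip (assuming Spatial Analyst is licensed; the file names here are hypothetical):

import arcpy
from arcpy.sa import Con, SetNull, CostDistance

arcpy.CheckOutExtension("Spatial")

cost = arcpy.Raster("cost_surface.tif")  # hypothetical cost raster

# If 0 means "lowest cost", bump it to a small positive value with Con.
cost_fixed = Con(cost == 0, 0.01, cost)

# If 0 instead means "exclude from the analysis", turn it into NoData:
# cost_fixed = SetNull(cost == 0, cost)

out = CostDistance("sources.tif", cost_fixed)  # hypothetical source raster
out.save("cost_dist.tif")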

FredIrani
New Contributor

Thanks for your input. I'm familiar with the help, although it doesn't discuss input raster pixel depth or format in terms of their impact on processing speed. I am using NoData for the background. I was hoping someone with experience could clue me in.

Your answer was helpful, though, in that it confirmed my input is good.

DanPatterson_Retired
MVP Emeritus

I have experience... get rid of the zeros, since they will bog down the system - accumulating a bunch of zeros isn't going to contribute to the analysis. I think you will find that replacing 0 with NoData will speed things up a lot. The more values that need to be accumulated, the longer it will take. Also ensure the cost surface is an integer grid; some people miss that, assuming a value of 1 will automatically be integer.
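As an illustration of the integer check (a sketch only, with a made-up file name):

import arcpy
from arcpy.sa import Int

arcpy.CheckOutExtension("Spatial")

cost = arcpy.Raster("cost_surface.tif")  # hypothetical name

# isInteger is a property of the Raster object; convert if needed.
if not cost.isInteger:
    cost = Int(cost)
    cost.save("cost_surface_int.tif")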

BrianGelder1
New Contributor III

Did you ever find an answer to this? I am experiencing the same behavior with the Combine tool, and I see the same result in 10.3.1 and 10.6.1. All inputs are TIFFs. If I set the output to GRID, my Combine runs in 4 seconds; if I set it to TIFF, it takes 120 seconds (and with Windows Defender on, 1,095 seconds!). I was wondering why my scripts were running so much slower in the newer versions, and it seems they switched to TIFF as the default raster format in 10.5.1. You can still use .save() to write a GRID, but it always creates an intermediate TIFF first.
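For reference, a minimal sketch (hypothetical paths and inputs) of controlling the saved format by name: in a folder workspace, a name with no extension is written as an Esri GRID, while a .tif extension writes a TIFF.

import arcpy
from arcpy.sa import Combine

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data"  # hypothetical folder workspace

result = Combine(["landuse.tif", "soils.tif"])  # hypothetical inputs

# No extension in a folder workspace -> Esri GRID
result.save(r"C:\data\combo_grd")

# .tif extension -> TIFF
# result.save(r"C:\data\combo.tif")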
