Size limit to NumPy Arrays?

04-13-2012 01:18 PM
NilsBabel
Occasional Contributor II
Is there a size limit to NumPy arrays?  I'm working on a script using arcpy.  I've been testing it with a subset of my raster, using arcpy.RasterToNumPyArray() to create a NumPy array, and I've got the script working the way I want.  But when I run it with my full-extent raster, it crashes on the arcpy.RasterToNumPyArray() call.  No message, nothing.  So I'm guessing it's a memory thing.  The raster I'm trying to convert isn't even that big, 185 MB.  Anyone got a clue?

Thanks in advance,

Nils
7 Replies
MarcinGasior
Occasional Contributor III
The issue is 32-bit Python and the size of your RAM.
On an 8 GB RAM system with 32-bit Python I managed to create an integer NumPy array of about 9000 x 9000. On a 3 GB RAM system it was about 5000 x 5000.
For a floating-point raster the limit may be even smaller.

Maybe you can try to split your raster into several rasters?
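Those ceilings line up with a quick back-of-the-envelope check you can run in plain NumPy (no arcpy needed); this is a rough sketch, keeping in mind that raster operations often make temporary copies that multiply the real footprint:

```python
import numpy as np

def array_mb(rows, cols, dtype):
    """Estimated in-memory size of a dense array, in megabytes."""
    return rows * cols * np.dtype(dtype).itemsize / 1024 ** 2

# A 9000 x 9000 32-bit integer array is only ~309 MB on its own...
print(array_mb(9000, 9000, np.int32))    # ~309 MB
# ...but 64-bit floats double that, and intermediate copies made during
# processing can easily push a 2 GB 32-bit process past its limit.
print(array_mb(9000, 9000, np.float64))  # ~618 MB
```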
V_StuartFoote
MVP Frequent Contributor
Nils, Marcin

The problem is even worse. 32-bit Python is not compiled LargeAddressAware, meaning each process can address only 2 GB of user memory space, whether it runs on 32-bit or 64-bit Windows.

If working outside ArcGIS Desktop on a 64-bit OS, use 64-bit Python as an alternative environment for OSGeo processing--the GDAL libraries and NumPy in this case. You'd need to keep 32-bit Python as the primary install so the ArcGIS ArcPy functions work correctly.

If just a little more headroom is needed to run the 32-bit Python code to completion, it is possible to flag the 32-bit Python executable as LargeAddressAware; so modified, it can use 3.2 GB of user addressable memory space per process on a 32-bit OS, or a full 4 GB on a 64-bit OS.  Several forum threads cover the details.

Stuart
NilsBabel
Occasional Contributor II
Thanks for the advice.  I only have 4 GB of memory on my system, and my raster is already an integer (16-bit), so I don't think I'll be able to save any memory by converting it.  I'm performing a neighborhood-type analysis, so tiling my raster into smaller pieces is not really ideal, but that may be the only solution.  I'll look into LargeAddressAware if I have some time.

Thanks again,
Nils
MarcinGasior
Occasional Contributor III
Do you really need to convert the raster to a NumPy array?
Maybe it would be enough to use Map Algebra, which is now available directly from arcpy?
Check the Raster class help.
NilsBabel
Occasional Contributor II
I can probably get by with the existing SA tools and map algebra for now.  But there is one thing I would like to do that I can't do with the existing tools, which is why I was looking at NumPy.

Thanks again.
curtvprice
MVP Esteemed Contributor

The help for RasterToNumPyArray (example 2) shows how to process your data in tiles, so you can use NumPy array processing with rasters that are too big to handle in one go.
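The same pattern can be sketched in plain NumPy. Since Nils is doing a neighborhood analysis, each tile is read with an extra "halo" of cells around it so windows near tile edges still see their full neighborhood; the arcpy-specific bookkeeping (lower-left corner coordinates, writing tiles back to a raster) from the help example is omitted here, and `mean3x3` is just an illustrative focal operation:

```python
import numpy as np

def process_in_tiles(data, tile_size, halo, func):
    """Apply `func` to overlapping tiles and stitch the results together.
    `halo` extra cells are read around each tile so that neighborhood
    operations near tile edges see their full window."""
    rows, cols = data.shape
    out = np.empty_like(data, dtype=np.float64)
    for r0 in range(0, rows, tile_size):
        for c0 in range(0, cols, tile_size):
            r1 = min(r0 + tile_size, rows)
            c1 = min(c0 + tile_size, cols)
            # Expand the read window by the halo, clipped to the array.
            rr0, cc0 = max(r0 - halo, 0), max(c0 - halo, 0)
            rr1, cc1 = min(r1 + halo, rows), min(c1 + halo, cols)
            result = func(data[rr0:rr1, cc0:cc1])
            # Keep only the interior (non-halo) part of the result.
            out[r0:r1, c0:c1] = result[r0 - rr0:r1 - rr0, c0 - cc0:c1 - cc0]
    return out

def mean3x3(a):
    """Example neighborhood op: 3x3 focal mean via shifted sums."""
    p = np.pad(a.astype(np.float64), 1, mode="edge")
    return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
```

With a halo at least as wide as the neighborhood radius (here 1 cell for a 3x3 window), the tiled result matches processing the whole array at once, while each call only holds one tile in memory.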

LotteDoyle
New Contributor II
Hi Nils,
Have you looked into writing Python generator expressions? I haven't tried it, but maybe the NumPy array output could be stored as an iterable data type?

Only today did I come across the RasterToNumPyArray arcpy function, and then your post. I recently finished writing a custom tool to convert an ESRI Arc/Info Grid to a text file, so I had to deal with memory issues with large rasters. I ended up using generators after first trying the list data type in for-loops and NumPy functions. The downside of my tool is that it first uses Raster to ASCII to generate the ASCII Grid. If the arcpy function could bypass that step while "wrapped in" generator expressions, the converter's performance could double.
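The generator approach described above might look something like this minimal sketch, assuming an ESRI ASCII Grid with the standard six-line header (the filename and header count are illustrative, not taken from the actual tool):

```python
def ascii_grid_rows(path, skip_header=6):
    """Yield one row of an ESRI ASCII Grid at a time instead of
    loading the whole raster into memory at once."""
    with open(path) as f:
        for _ in range(skip_header):  # ncols, nrows, xllcorner, ...
            next(f)
        for line in f:
            yield [float(v) for v in line.split()]

# A generator expression can then stream a statistic over the rows
# without ever materializing the full grid, e.g.:
#   total = sum(sum(row) for row in ascii_grid_rows("grid.asc"))
```

Because each row is discarded as soon as it is consumed, peak memory stays at one row regardless of raster size, which is the property that makes generators attractive for the large-raster case in this thread.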

JK