Error defining query layer on large dataset

02-07-2012 07:29 AM
IanDeMerchant
I'm trying to create a dynamic map service to serve polygons from a PostgreSQL database. The client mapping application defines the set of features to draw, so I have defined my query layer with a wide-open query string (select * from nfhdb2.nfps.mv_nfps_sur_pol_en). My table is pretty big, about 1 GB in total. When I do the "validate" on the layer, ArcMap appears to be trying to load all of the results into memory at once and I get an error: "Underlying DBMS error [out of memory for query result]". I watched the memory usage on the database server as the request was being made and it never exceeded 3%, so it looks to me as though it has nothing to do with the database. I also watched the ArcMap.exe process on my desktop during validation and noticed it climbing and climbing until it hit 1.3 GB before displaying the error message.

Why does ArcMap request all of the features during validation? Is there a way to avoid this? I tried turning off the display but that didn't help. Looks to me like I might have to break up the table into multiple map services... Any ideas?

Thanks,
Ian
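
A minimal sketch of the kind of restricted query that could be tried at the validate step, on the assumption that validation only needs a result set to work against; the objectid column name and the row cutoff are placeholders, not details confirmed in this thread:

    -- Wide-open definition from the post (validation pulls the whole table):
    SELECT * FROM nfhdb2.nfps.mv_nfps_sur_pol_en

    -- Hypothetical restricted variant for validation only; substitute the
    -- table's actual key column for "objectid", then widen the query back
    -- out once the layer validates:
    SELECT * FROM nfhdb2.nfps.mv_nfps_sur_pol_en WHERE objectid < 1000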
1 Reply
adamestrada
I am having the same error. Does anyone know how to fix this?

A