POST
Hi James - 85 million points is a lot! The key to these kinds of issues is breaking down the problem and making sure you have the shortest path to a solution, because slow response is often a death by a thousand cuts. Whilst some approaches in ArcGIS are definitely quicker than others, I've found that for large datasets the less data you send to a particular query the better. Is there a way you can pre-filter your result set by using environment settings or processing masks? That prevents the entire dataset being scanned just to exclude points that will never be in your results. At least that is my conclusion after finding it massively improves performance in raster and vector processing exercises. Others might have more in-depth knowledge of how ArcGIS does its thing under the hood. Good luck!
Posted 10-06-2017 05:07 AM | 0 | 1 | 1759
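For illustration, a minimal arcpy sketch of the environment/mask pre-filtering idea from the post above. The workspace, feature class, and study-area names are hypothetical, not taken from the thread:

```python
import arcpy

# Hypothetical workspace and layer names, purely for illustration.
arcpy.env.workspace = r"C:\data\points.gdb"
arcpy.env.extent = "study_area"   # limit the processing extent to the study-area dataset
arcpy.env.mask = "study_area"     # mask (Spatial Analyst tools) to the polygon itself

# Tools run after this only consider data inside the extent/mask,
# so the full 85-million-point dataset never has to be scanned.
arcpy.analysis.Clip("samples", "study_area", "samples_subset")
```

The same effect can often be had with a definition query or Select Layer By Attribute before the heavy tool runs; the point is simply to cut the data down before it reaches the expensive step.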
POST
The short answer is that there is no way to achieve this without creating a new table. As others have mentioned, OBJECTID is a system column, and while it's extremely useful as an ID column when you need a sequential ID, that only holds until you delete a feature. OBJECTID always continues from the last value ever assigned in the table (including the values of rows that have since been deleted), because the latest OBJECTID value is stored in a related system table. This has some serious implications for SDE data, but for a file geodatabase it's more just annoying. The unfortunate side effect of this design is that any simple attempt to create a sequential ID based off OBJECTID (like a field calculation) will also be non-sequential. Your options are:
1. Write to a new feature class/table, whereupon OBJECTID will automatically be rebuilt for the new table in the order the features are inserted.
2. Write a script that uses an independent record count to populate the sequence as you loop through the table with a cursor (a sketch follows below).
3. Use a GUID (GlobalID) or another ID that doesn't require a neat sequence.
Hope that helps!
Posted 08-24-2017 04:32 PM | 0 | 0 | 1224
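A minimal sketch of option 2 above, assuming a hypothetical feature class with an existing long-integer field SEQ_ID to hold the sequence (arcpy.da requires ArcGIS 10.1 or later):

```python
import arcpy

fc = r"C:\data\example.gdb\roads"   # hypothetical feature class
counter = 1
# Walk the table once and write an independent counter, ignoring any OBJECTID gaps.
with arcpy.da.UpdateCursor(fc, ["SEQ_ID"]) as cursor:
    for row in cursor:
        row[0] = counter
        cursor.updateRow(row)
        counter += 1
```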
POST
If the script is on a server, it most likely can't access your user context when you are not physically logged on. This could be Windows- or ArcGIS-related; it sounds like you can schedule other jobs, so it's probably ArcGIS-related. Check the actions of your script carefully for any in-memory or temporary layers: unless you explicitly declare their location, they will often default to places a service account may not have access to, so the problem is not immediately apparent. Unfortunately the only way I discovered this was to use the approach outlined above. Pull your script back to "import arcpy" and "pass", then add blocks back in until the error reappears. It can be pretty hard to trap the problem if/when there is a Windows conflict clobbering all of your Python error handling. Good luck!
Posted 06-29-2017 03:57 PM | 0 | 0 | 1323
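A hedged sketch of that strip-it-back approach, with the temporary output locations declared explicitly so a scheduled service account isn't pushed to a folder it can't write to (all paths are hypothetical):

```python
import arcpy

# Point scratch and working output at a location the service account can write to.
arcpy.env.scratchWorkspace = r"D:\scheduled_jobs\scratch.gdb"
arcpy.env.workspace = r"D:\scheduled_jobs\work.gdb"

try:
    # Re-add one block of the original script at a time here until the error returns.
    pass
except Exception:
    # A scheduled task has no console, so write the traceback somewhere reachable.
    import traceback
    with open(r"D:\scheduled_jobs\error.log", "a") as log:
        log.write(traceback.format_exc())
```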
POST
I came across this thread while searching for a solution to my own version of this issue - I think they could be the same problem. I suspect that it's actually a locking issue and nothing to do with the VAT itself. In my own script (which creates a smoothed categorized polygon from an input classified raster) I'm getting this error intermittently on the same raster. It's most prevalent when running the script and almost entirely absent when stepping through in the debugger. It's very difficult to trap what is actually happening, but I think it's ArcGIS not letting go of the table before arcpy tries to access it again. In my process it's the reclass function that throws this error: I reclass a float raster to get the categories for the polygon, and running multiple sequential reclasses on the same raster seems to be what generates the error. I've also found that when the same script runs on a computer still on ArcGIS 10.0 SP5 without the latest Windows update, none of the errors are thrown. So from looking at your script I suspect that what is actually causing the error is the build of the VAT in the block preceding the one that throws it. I haven't had time to test this theory properly in my own application, but hopefully that gives you another line to work on. Good luck, Mike
Posted 09-25-2013 05:28 PM | 1 | 0 | 1533
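A hedged Spatial Analyst sketch of the kind of sequence described above, with each intermediate reclass saved and its reference dropped before the same raster is touched again, on the theory that this releases the table/lock sooner. The raster paths and class breaks are hypothetical:

```python
import arcpy
from arcpy.sa import Reclassify, RemapRange

arcpy.CheckOutExtension("Spatial")

in_raster = r"C:\data\classified.tif"            # hypothetical float raster
breaks = RemapRange([[0, 10, 1], [10, 20, 2], [20, 30, 3]])

first = Reclassify(in_raster, "VALUE", breaks)
first.save(r"C:\data\scratch.gdb\reclass1")      # persist the result...
del first                                        # ...then drop the in-memory reference
                                                 # before the next step touches the raster

second = Reclassify(r"C:\data\scratch.gdb\reclass1", "VALUE",
                    RemapRange([[1, 2, 1], [2, 3, 2]]))
second.save(r"C:\data\scratch.gdb\reclass2")
del second
```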
POST
Hi guys - I posted this in the python forum (http://forums.arcgis.com/threads/42260-Spatial-Join-Error?p=197053&viewfull=1#post197053) but I think it is almost exactly the same error as the one investigated here. I am updating a routine analysis job that used to (and still will) execute fine in 9.3.1 but am running into unhandled errors in ArcGIS 10.0 SP4.

Unfortunately I have a similar issue with the maximum merge rule (but no answer as yet). I have a buffered road feature class with 4 numeric (float) fields that identify the category of buffer. I wish to spatially join these to a parcels layer so that each intersecting parcel is thereby identified by the buffer it intersects. By setting the merge rule for the numeric fields to maximum, I get the maximum number of intersects for any parcel that intersects the buffer feature class - this is all very nice and works swimmingly. HOWEVER, for all of the parcels that do not intersect in the spatial join (i.e. too far away from the joining class) I get -3.402823e+038 calculated into the numeric fields instead of zero. It doesn't matter whether I set the field precision or disable null values, I still get -3.402823e+038 returned for non-intersected parcels. My lead theory is that this is a floating-point representation of zero returned by the underlying ArcObject (reasoned below).

The interesting thing is that if I change all numeric inputs in both the target and join feature classes to short integer (as my feature classes only hold 0 or 1 for the fields of interest) instead of their original float type, the spatial join bombs out with a generic Windows 9999 error and an unhandled exception written to the log file. Presumably the only reason this can happen is that -3.402823e+038 is a float or double type being returned from the underlying ArcObject - it therefore generates an error when the field it must be written to is of an invalid data type (i.e. short integer). Is this a bug in the Arc10 object model? I am running ArcGIS Desktop 10.0 Service Pack 4. At this point I should probably mention that the same result does not occur in 9.3.1 SP3.

Certainly appreciate any light that can be shed - I'm stumped, as I need the maximum merge rule for the analysis! All my experiments are being done on cut-down feature classes of only a couple of hundred features. Appreciate any assistance. Cheers, Mike
Posted 05-09-2012 09:52 PM | 0 | 0 | 1752
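For reference, a hedged sketch of how the join described above might be set up in arcpy, with the merge rule for one of the buffer fields switched to Max. The feature class and field names are hypothetical, not the poster's actual data:

```python
import arcpy

target = r"C:\data\analysis.gdb\parcels"
join_fc = r"C:\data\analysis.gdb\road_buffers"
out_fc = r"C:\data\analysis.gdb\parcels_joined"

# Build the field map and set the merge rule for a buffer category field to Max.
fms = arcpy.FieldMappings()
fms.addTable(target)
fms.addTable(join_fc)

idx = fms.findFieldMapIndex("BUF_CAT1")   # hypothetical float field on the buffers
fm = fms.getFieldMap(idx)
fm.mergeRule = "Max"
fms.replaceFieldMap(idx, fm)

arcpy.SpatialJoin_analysis(target, join_fc, out_fc,
                           "JOIN_ONE_TO_ONE", "KEEP_ALL",
                           field_mapping=fms, match_option="INTERSECT")
```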
POST
Unfortunately I have a similar issue with the maximum merge rule (but no answer as yet). I have a buffered road feature class with 4 numeric (float) fields that identify the category of buffer. I wish to spatially join these to a parcels layer so that each intersecting parcel is thereby identified by the buffer it intersects. By setting the merge rule for the numeric fields to maximum, I get the maximum number of intersects for any parcel that intersects the buffer feature class - this is all very nice and works swimmingly. HOWEVER, for all of the parcels that do not intersect in the spatial join (i.e. too far away from the joining class) I get -3.402823e+038 calculated into the numeric fields instead of zero. It doesn't matter whether I set the field precision or disable null values, I still get -3.402823e+038 returned for non-intersected parcels.

The interesting thing is that if I change all numeric inputs in both the target and join feature classes to short integer (as my feature classes only hold 0 or 1 for the fields of interest) instead of their original float type, the spatial join bombs out with a generic Windows 9999 error and an unhandled exception written to the log file. Presumably the only reason this can happen is that -3.402823e+038 is a float or double type being returned from the underlying ArcObject - it therefore generates an error when the field it must be written to is of an invalid data type (i.e. short integer). Is this a bug in the Arc10 object model? I am running ArcGIS Desktop 10.0 Service Pack 4. At this point I should probably mention that the same result does not occur in 9.3.1 SP3. Certainly appreciate any light that can be shed - I'm stumped, as I need the maximum merge rule for the analysis!
Posted 05-09-2012 08:28 PM | 0 | 0 | 858
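A hedged sketch of one possible cleanup for the sentinel value described above, using the classic pre-10.1 cursor since the poster is on ArcGIS 10.0: reset any field that came back as the float minimum to zero (all names are hypothetical):

```python
import arcpy

joined = r"C:\data\analysis.gdb\parcels_joined"            # hypothetical output of the join
fields = ["BUF_CAT1", "BUF_CAT2", "BUF_CAT3", "BUF_CAT4"]  # hypothetical buffer fields

rows = arcpy.UpdateCursor(joined)
for row in rows:
    for f in fields:
        value = row.getValue(f)
        # Anything this far negative is the -3.402823e+038 sentinel, not real data.
        if value is None or value < -1e38:
            row.setValue(f, 0)
    rows.updateRow(row)
del row, rows   # release the cursor and its lock
```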