POST
|
... So, I could not solve this issue. I decided to remove that gdb from my workspace (temporarily) and try to run the script on what remained in the workspace. That failed as well, with the resulting error: Runtime error <class 'arcgisscripting.ExecuteError'>: ERROR 000278: Failed on input OID 1, could not write value â?? â?? to output field SOURCE_DATE. Failed to execute (Append). Did you ever figure out this error? I keep getting it too, on a similar process running the Merge tool.
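For anyone else hitting ERROR 000278 on a text field, one way to narrow it down is to pre-scan the rows for values that will not encode cleanly before running Append or Merge. A minimal sketch in plain Python; the `rows` structure and field name are hypothetical stand-ins for whatever cursor you read the source feature class with, not arcpy API:

```python
def find_bad_values(rows, field, encoding="ascii"):
    """Return (oid, value) pairs whose value won't encode cleanly.

    `rows` is a hypothetical list of (oid, attribute-dict) pairs standing
    in for the rows of the source feature class.
    """
    bad = []
    for oid, attrs in rows:
        value = attrs.get(field, "")
        try:
            value.encode(encoding)
        except UnicodeEncodeError:
            bad.append((oid, value))
    return bad

rows = [(1, {"SOURCE_DATE": "\u2013\u2013"}),   # en dashes, not plain ASCII
        (2, {"SOURCE_DATE": "2013-04-16"})]
bad = find_bad_values(rows, "SOURCE_DATE")      # flags OID 1 only
```

Rows flagged this way are the ones likely to produce the "could not write value" failure, so you can clean or drop them up front.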
04-16-2013 01:23 PM | 0 | 0 | 963 |
POST
|
Sorry Dan, I was disappointed myself. We have had it fixed in our 10.1 code base, and we will be pushing it through to Service Pack 2 in the next short while. The projected date for SP2, I believe, is still early 2011. The NIM report has also been updated and should get pushed to the web on the next flush. -Dave

Dave, it seems like 10 SP5 still doesn't have a fix. Can you verify that?
11-08-2012 07:40 AM | 0 | 0 | 325 |
POST
|
Has there actually been a fix for this in 10.0? I am on 10.0 SP5 here, and I have a failure that probably relates, but I think it is the way the geometries.py file is built (using the arcobjectconversion and _base modules - perhaps it actually goes even deeper into the core C++ code). Note this failure is a simplified version, as shown by Victor Velarde at gis.stackexchange. Basically, the geometry exported by arcpy cannot be re-imported as the same one by the same session of arcpy:

[PHP]
geom = arcpy.SearchCursor(polygonFC, "FID = 0").next().getValue("Shape")
geomGeoJSON = geom.__geo_interface__
geom2 = arcpy.AsShape(geomGeoJSON)
geom.equals(geom2)  # This fails
[/PHP]

Sample representations of the geometry (note that at least their representations seem identical):

[PHP]
>>> geom.__geo_interface__
{'type': 'Polygon', 'coordinates': [[(-122.8423481559999, 47.060497293000083), (-122.84239755599992, 47.059262423000064), (-122.84416913599989, 47.059309693000046), (-122.84416913599989, 47.060497293000083), (-122.8423481559999, 47.060497293000083)]]}
>>> geom2.__geo_interface__
{'type': 'Polygon', 'coordinates': [[(-122.8423481559999, 47.060497293000083), (-122.84239755599992, 47.059262423000064), (-122.84416913599989, 47.059309693000046), (-122.84416913599989, 47.060497293000083), (-122.8423481559999, 47.060497293000083)]]}
[/PHP]

Incidentally, __geo_interface__ fails miserably if the polygon has a hole, and you need to change geometries.py to handle the holes yourself (in the Polygon class). Just in case anyone at ESRI notices and wants to release this as a patch for 10.0, change the Polygon class' return values for the following methods:

[PHP]
def __geo_interface__(self):
    return {'type': 'Polygon',
            'coordinates': [[((pt.X, pt.Y) if pt else None) for pt in part]
                            for part in self]}

def _fromGeoJson(cls, data):
    ...
    return cls(Array([map(lambda p: Point(*p) if p is not None else Point(), part)
                      for part in coordinates]))
[/PHP]

Also, a side note: will we ever get a [PYTHON] tag for highlighting? The vBulletin forums have plenty of examples of syntax highlighting in multiple languages.
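To make the hole handling above concrete: an arcpy-style polygon part keeps its hole rings in the same point array as the exterior ring, separated by None entries, while GeoJSON wants each ring as its own coordinate list. A minimal sketch of that split in plain Python (points here are simple (x, y) tuples, not arcpy Point objects):

```python
def part_to_rings(part):
    """Split one arcpy-style part (points, with None separating the
    exterior ring from any hole rings) into GeoJSON-style ring lists."""
    rings, current = [], []
    for pt in part:
        if pt is None:          # ring separator
            rings.append(current)
            current = []
        else:
            current.append(pt)
    rings.append(current)
    return rings

# Square with a square hole: exterior ring, None, then the hole ring.
part = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0),
        None,
        (4, 4), (6, 4), (6, 6), (4, 6), (4, 4)]
coordinates = part_to_rings(part)   # two rings: exterior first, then hole
```

This is the reshaping that a correct __geo_interface__ (and its _fromGeoJson inverse) has to perform for polygons with holes.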
11-07-2012 08:44 AM | 0 | 0 | 325 |
POST
|
Thank you V. I will move to chunking my data outside a geodatabase, then.

A file geodatabase can hold close to an infinite number of records overall, but any one table can only hold *four* billion records. I wouldn't recommend exceeding ten million, especially in any one session. If there are no concrete plans to use the data, and large data volumes cause ArcGIS to crash, then my recommended solution is to not use a file geodatabase for this task. Instead, use a sequence of ASCII files (larger than binary, but more platform-independent) whose filenames combine the date and a rolling sequence number, capped at one tenth the expected daily throughput or 1 GB, whichever is smaller. This would give you a lightweight, scalable solution that won't degrade the performance (creating locks and such) of the file geodatabase. - V
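The naming scheme V describes (date plus a rolling sequence number, rolling over at a size cap) could be sketched like this; the prefix and cap here are hypothetical, so adjust them to your own throughput:

```python
import datetime

def record_filename(seq, when, prefix="calc_records", width=4):
    """Build a name like calc_records_20120924_0003.txt from a date
    and a rolling sequence number."""
    return "{}_{}_{:0{w}d}.txt".format(prefix, when.strftime("%Y%m%d"), seq, w=width)

def next_seq(seq, bytes_written, cap_bytes=2**30):
    """Roll to the next sequence number once the current file hits the cap."""
    return seq + 1 if bytes_written >= cap_bytes else seq

name = record_filename(3, datetime.date(2012, 9, 24))
```

Each writer then just tracks the bytes it has written and calls next_seq before opening the next file, so no single ASCII file grows past the cap.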
09-24-2012 08:50 AM | 0 | 0 | 1574 |
POST
|
The reason they are not sent to /dev/null is that we need to maintain a record so we can recreate results if ever needed. If the file geodatabase can (in theory) hold 400 billion records, my problem then becomes arcpy silently dying without any response when saving to the geodatabase. Interesting...
09-24-2012 07:43 AM | 0 | 0 | 1574 |
POST
|
I have been trying to find out the maximum number of features allowed in a file geodatabase feature class, to no avail. The best I could find was "hundreds of millions of vectors" (I assume one vector is the equivalent of one simple geometry shape). Is there a hard limit, or any way to calculate that hard limit? Is my only solution to break the features down into multiple feature classes, and if so, at which point? The first 100,000,000 features? 10,000,000? 1,000,000? I am running across some programs that simply die with no error message when trying to write a large number of features to a file geodatabase (~1,000,000,000 points, along with 2 text fields each of length 8). Yes, I understand that this is a large number of features to write, but this is simply a record of calculations that I need to maintain, in no way production data. In all likelihood these data will never be accessed again - they are going straight to an archive. Thank you, Michalis PS: The arcpy.GetMessages() command in the except: block returns two newlines as a result (very informative, I know).
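If the answer does turn out to be "split it up", the chunking itself is simple; a sketch that batches any feature stream into groups of a fixed size (the batch size is arbitrary, so pick one below whatever threshold keeps your writes stable):

```python
def chunks(features, size):
    """Yield successive lists of at most `size` items from any iterable,
    so each batch can be written to its own feature class or file."""
    batch = []
    for item in features:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:                # final partial batch
        yield batch

sizes = [len(b) for b in chunks(range(25), 10)]   # -> [10, 10, 5]
```

Because it is a generator, it never holds more than one batch in memory, which matters at the billion-feature scale described above.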
09-23-2012 02:04 PM | 0 | 4 | 4447 |
POST
|
Have you found a resolution to this problem? I am trying to do a spatial join as well, 300,000 points to about 60,000 polygons, and the Join function runs for about 30 minutes before failing with "Out of memory. Item not found in this collection" (even though memory usage only goes to about 1.5 GB out of my 8 GB).
09-13-2012 11:06 AM | 0 | 0 | 372 |
POST
|
I don't know off the top of my head how to do it in Python, but in VBScript you should be able to use the example on this help page (and divide by 7).

Thanks. I finally got it to work, after realizing that when you use !Date! in the field calculator, it actually returns a string rather than a datetime.date() object. In my pre-processor I had the following:

[PHP]
import datetime

firstTime = datetime.date(2009, 7, 16)

def timeDif(a):
    month, date, year = a.split('/')
    featDate = datetime.date(int(year), int(month), int(date))
    timeDiff = featDate - firstTime
    return timeDiff.days / 7
[/PHP]

and in my new field I simply had:

[PHP]
timeDif(!Date!)
[/PHP]

Frustrating to figure out, but hey, it works now.
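As an aside, the manual split/int() parsing can be delegated to strptime; a sketch of the same week calculation (same firstTime anchor as above, and note the // integer division if you are on Python 3):

```python
import datetime

firstTime = datetime.date(2009, 7, 16)

def timeDifStrptime(a):
    """Week difference from firstTime, letting strptime parse M/D/YYYY
    instead of splitting the string by hand."""
    featDate = datetime.datetime.strptime(a, "%m/%d/%Y").date()
    return (featDate - firstTime).days // 7
```

strptime also rejects malformed date strings with a ValueError, which the hand-rolled split would silently mangle or crash on less clearly.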
02-03-2012 01:12 PM | 1 | 0 | 222 |
POST
|
I have a field in my shapefile that is of type "Date". I want to figure out the difference between that "Date" and today, in weeks. My assumption was that (datetime.date.today() - !Date!) would produce a timedelta object, which I would simply query to find the difference. Hence my code below: (datetime.date.today() - !Date!).days / 7 Of course, that gives a syntax error, which doesn't explain much. Any pointers on manipulating dates in the field calculator?
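For what it's worth, the failure can be reproduced outside the field calculator: subtracting a string from a date raises a TypeError in plain Python, and (as noted elsewhere in this thread) !Date! arrives as a string, so it has to be parsed before the timedelta arithmetic works. A small sketch with a hypothetical sample date string:

```python
import datetime

today = datetime.date(2012, 2, 3)
raw = "1/6/2012"              # what !Date! actually hands you: a string

try:
    today - raw               # a date minus a string fails outright
    weeks = None
except TypeError:
    # Parse first, then the subtraction yields a timedelta as expected.
    parsed = datetime.datetime.strptime(raw, "%m/%d/%Y").date()
    weeks = (today - parsed).days // 7   # 28 days -> 4 weeks
```

Inside the field calculator the same parse-then-subtract expression is what makes the week calculation work.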
02-03-2012 12:54 PM | 0 | 2 | 2148 |
POST
|
This post made me go back to my code and think about this issue again, and I think I've come up with a simple solution that works. I've gotten IsNull to work in the past, so I think the problem is actually caused by casting the output path of a PolygonToRaster call to a raster by simply doing polyRaster = Raster(outputPath) - which should work, and generally does, for casting path strings to Raster objects. If I instead make a raster layer using MakeRasterLayer_management and use that layer name as the input to my subsequent operations (Con in this case), it works without issue. So my code went from this:

[PHP]
arcpy.PolygonToRaster_conversion(inPoly, valueField, outputPath, "CELL_CENTER", "#", arcpy.env.cellSize)
polyRaster = Raster(outputPath)
output = Con(IsNull(polyRaster), 0, polyRaster)  ## Returns 999998: Unexpected Error
[/PHP]

to this:

[PHP]
arcpy.PolygonToRaster_conversion(inPoly, valueField, outputPath, "CELL_CENTER", "#", arcpy.env.cellSize)
arcpy.MakeRasterLayer_management(outputPath, "polyRaster")
output = Con(IsNull("polyRaster"), 0, "polyRaster")
[/PHP]

So the solution works for me, but there must be a bug in the type casting or something...

I tried your solution too, to no avail. I get weird errors: RuntimeError: ERROR 000732: Input Raster: Dataset d:\Users\MICHAL~1.THI\AppData\Local\Temp\x923a0602_2133_4e68_bd44_b285094f1537y0.afr does not exist or is not supported. The referenced input raster was generated using MakeRasterLayer, by the way. I am not sure what the problem is.

What I realized is that geoprocessing of raster data in Python is abysmal. My work should be straightforward, but I encounter a lot of cryptic messages (errors 999998 and 999999). My workflow is simple, really, but if you have any suggestions... I receive daily locations of monitors that were put in place somewhere. I know each monitor has a radius of 5 km. For each monitor, I use the wonderful ExtractByCircle on a constant raster of value 1 to get a circular buffer around the monitor (this returns an instance of the Raster class). (NOTE: This does not work with vectors, as DISSOLVE refuses to work with more than 2 overlapping polygons, and ESRI tech support just said to find a workaround.) This is then meant to be compared with the historical coverage (i.e., did we add pi*5^2 of area, or was there an overlap?). A simple conditional statement achieves this (if the result of ExtractByCircle is null, get data from the old coverage, else grab data from our ExtractByCircle), along with a comparison of the new and old coverage.

Now, this process works just fine when performed by hand in ArcMap: run the extract, run the Con, do simple math to figure out the change, record it. In Python, this seems like a lost cause. If you use the Con() command, the returned value is of type Raster, but it does not allow you to save it or find the maximum or minimum values (Error 999998), which should be what the Raster class provides (trying to save it to verify it is valid does not work). Any suggestions?
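The conditional described in that workflow (null in the new extract, so fall back to the old coverage) is easy to state in plain Python, which can help sanity-check the logic separately from the arcpy Con/IsNull machinery; here cells are modeled as a flat list with None standing in for NoData:

```python
def merge_coverage(new_cells, old_cells):
    """Plain-Python analogue of Con(IsNull(new), old, new): where the new
    extract has NoData (None), keep the old coverage value; otherwise
    take the newly extracted value."""
    return [old if new is None else new
            for new, old in zip(new_cells, old_cells)]

new = [None, 1, None, 1]
old = [1, None, None, 1]
merged = merge_coverage(new, old)
# Newly covered area = cells present in the new extract but not the old.
added = sum(1 for n, o in zip(new, old) if n is not None and o is None)
```

The `added` count is the list-based version of the "did we add pi*5^2 of area or was there overlap" comparison.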
02-03-2012 07:04 AM | 0 | 0 | 464 |
POST
|
Unfortunately, my frustration with these changes stems from the .save function. In my script, when I hit that particular line, it blows up PyWin, and my error traps fail to document why. That makes it hard to debug the script when there is nothing to work from. I have an "if" decision statement prior to my save; I'm wondering if that may be my issue. Maybe the process dumps the temp raster when the if runs, and it's just not available when my .save statement is run.

Jon, I have the same problem. I run my Con statement, and when I try to save the result I get an unexpected error. Not sure how to deal with it at all. I tried a few things, including verifying that the result is a raster and checking that it has min and max values, but no luck.
02-02-2012 09:44 AM | 0 | 0 | 293 |
POST
|
Denise, Thank you for your reply. You are correct on the tool (Conversion Tools) and that I have background processing disabled. I am on 10 SP3 as well. So... I am not sure what the problem is, or why it happens. For now, I have resolved to generate a raster from the polygons about every 100 polygons, and then bring them together (which is probably not the ideal situation, but it works). Michalis

Michalis, You do not mention the specific tool you are using, but I assume you are using ArcToolbox > Conversion Tools > To Raster > Polygon to Raster for this process; is this correct? Do you receive the 'error 001143' with this tool? It is possible you are running into an issue with background processing. Please try running the tool with background processing disabled:

1. In ArcMap, open the Geoprocessing menu.
2. Click Geoprocessing Options.
3. In the Background Processing group, uncheck Enable.

I do not know whether you are using ArcGIS Desktop 10 SP3, but I am aware of a fix in Service Pack 3 that might resolve this issue. Please install Service Pack 3 to see if it fixes the issue. http://resources.arcgis.com/content/patches-and-service-packs?fa=viewPatch&PID=17&MetaID=1807 For a faster response from other ArcGIS Desktop users, please post your questions on a more applicable product forum, such as ArcGIS Desktop Forums - http://forums.arcgis.com/forums/5-ArcGIS-Desktop-General. Cheers, Denise
01-23-2012 10:15 AM | 0 | 0 | 430 |
POST
|
Denise, I had a similar error, but not using mobile processes. I am simply trying to convert polygons to raster. I have polygons generated through buffers at 3 distances. For the 5 km distance, the operation worked fine. But when it comes to 10 and 15 km, the operation fails with an ERROR 001134. I tried fixing the problem based on your suggestions, but to no avail. I checked whether it was a resources error as well, but I am well below the resources I have available (this is a quad-core machine with 16 GB of RAM). Are there any other suggestions? I will try the Repair Installation procedure next.

Hi Joe, One suggestion is to disable background geoprocessing by clicking Geoprocessing > Geoprocessing Options on the Standard toolbar in ArcMap and then re-run the tool. Some geoprocessing tools need as much of your computer's resources as possible, and when a tool fails because there is not enough memory available, you can try the following:

1. Identify and exit nonessential memory-intensive applications.
2. Disable background processing from the Geoprocessing Options dialog box (Geoprocessing > Geoprocessing Options). This shuts down the background process and frees up resources.
3. Rerun the tool. Using this procedure, background processing is bypassed; the additional background process does not start, and the resources it would consume are now available for your tool to make use of.
4. While the tool is executing, avoid starting any memory-intensive applications.

If the above doesn't help to resolve the "ERROR 001143: Background server threw an exception" error, please contact Esri Support and log this issue so we can help in troubleshooting. Thanks, Denise
01-16-2012 03:36 PM | 0 | 0 | 430 |
POST
|
I am attempting to run ZonalStatistics on a number of generated buffers (around 100,000) and record my results in a table. I have managed to restrict myself mostly to memory, meaning the buffer files are generated in memory and passed on to ZonalStatistics as an in-memory feature class, and it runs great. My problem is that ZonalStatistics, while it returns a variable to use in Python, also generates files in the scratch workspace. This is a big problem, as I run multiple zonal statistics per buffer (multiple source rasters), and with 100,000 buffers this becomes a very big bottleneck for me. Is there any way to set the output of ZonalStatistics to be completely in memory? Or to set the whole scratch workspace in memory? I know that the arcpy.gp.ZonalStatistics_sa that is called when you call ZonalStatistics has an out_raster parameter, but it is hard-coded to "#", so I can't change it. Thank you in advance. Michalis
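While waiting for a way to redirect the scratch output, one workaround is to accumulate simple per-zone statistics in plain dictionaries so nothing touches the scratch workspace at all. A sketch under the assumption that you can stream (zone_id, cell_value) pairs out of your rasters; the pair stream itself is hypothetical, not an arcpy call:

```python
from collections import defaultdict

def zonal_mean(pairs):
    """Accumulate per-zone sums and counts from (zone_id, value) pairs
    and return the mean value per zone, entirely in memory."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for zone, value in pairs:
        sums[zone] += value
        counts[zone] += 1
    return {zone: sums[zone] / counts[zone] for zone in sums}

means = zonal_mean([(1, 2.0), (1, 4.0), (2, 6.0)])   # -> {1: 3.0, 2: 6.0}
```

The same pattern extends to min/max/std with extra accumulators, and the final dict can be written to the results table in one pass per buffer batch.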
12-30-2011 07:06 AM | 0 | 1 | 333 |
POST
|
Vangelo, I am certain that the staff at ESRI are doing their best. You have been continuously improving the functionalities you make available to us, and we are grateful. Thank you. Michalis
05-07-2010 09:19 AM | 0 | 0 | 165 |
Title | Kudos | Posted |
---|---|---|
| 1 | 02-03-2012 01:12 PM |
Online Status | Offline |
Date Last Visited | 11-11-2020 02:23 AM |