I have ArcGIS Enterprise 10.9 deployed to EMCS. I have a geoprocessing service published to the hosting server which, as part of the operation, clips an Esri-provided image service. It works fine, but when the clipping extent is slightly larger I get the following error.
Error executing tool. RoughnessMapTool Job ID: j0ec761fca4af4acdbf7170354c52d479 :
Traceback (most recent call last):
  File "<string>", line 213, in execute
  File "C:\Program Files\ArcGIS\Server\framework\runtime\ArcGIS\Resources\ArcPy\arcpy\management.py", line 19530, in Clip
    raise e
  File "C:\Program Files\ArcGIS\Server\framework\runtime\ArcGIS\Resources\ArcPy\arcpy\management.py", line 19527, in Clip
    retval = convertArcObjectToPythonObject(gp.Clip_management(*gp_fixargs((in_raster, rectangle, out_raster, in_template_dataset, nodata_value, clipping_geometry, maintain_clipping_extent), True)))
  File "C:\Program Files\ArcGIS\Server\framework\runtime\ArcGIS\Resources\ArcPy\arcpy\geoprocessing\_base.py", line 512, in <lambda>
    return lambda *args: val(*gp_fixargs(args, True))
arcgisscripting.ExecuteError: ERROR 160235: There is not enough storage space to complete the operation.
An invalid field name was used in a query string
An invalid field name was used in a query string
An invalid field name was used in a query string
No resource could be found at that address. (status code 404).
When I run the tool locally, the output files are no bigger than 30 MB, and I have validation in place to prevent an input extent large enough to use all the space on the server.
Any ideas? Is there a setting on the ArcGIS Server I need to change to allow for a larger processing size? I have already set "Maximum Number of Records Returned by Server" to a reasonable value. I currently use the scratch folder for processing; could I potentially use in_memory for the intermediate processing before saving the result to the scratch workspace to return the file?
arcpy.env.workspace = arcpy.env.scratchFolder
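For what it's worth, here is a rough sketch of what I mean by routing the intermediate Clip output through the in-memory workspace and only writing the final raster to the scratch folder. The variable names (`in_raster`, `clip_extent`, `roughness_clip.tif`) are placeholders, not the actual tool parameters. Note that the newer `memory` workspace does not support rasters, so the legacy `in_memory` workspace would be needed here:

```python
import os
import arcpy

arcpy.env.workspace = arcpy.env.scratchFolder

# Placeholder inputs -- substitute the service's actual parameters
in_raster = "https://example.com/arcgis/rest/services/SomeImagery/ImageServer"
clip_extent = "-120.5 38.0 -120.0 38.5"  # "xmin ymin xmax ymax"

# Write the intermediate clip to the legacy in_memory workspace
# (rasters are not supported in the newer "memory" workspace)
tmp_clip = "in_memory/clipped"
arcpy.management.Clip(in_raster, clip_extent, tmp_clip)

# Persist only the final result to the scratch folder so the
# geoprocessing service can return it
out_tif = os.path.join(arcpy.env.scratchFolder, "roughness_clip.tif")
arcpy.management.CopyRaster(tmp_clip, out_tif)

# Release the in-memory intermediate as soon as it is no longer needed
arcpy.management.Delete(tmp_clip)
```

Whether this helps depends on where ERROR 160235 is actually triggered, though: if the server's temp/scratch disk is what is filling up, in_memory avoids it, but if the machine is short on RAM the in-memory workspace can make things worse.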