POST
I had a similar issue where using the arcpy.da module was creating a lock on my file geodatabase. I see you worked around your problem using arcpy.Describe(), but I discovered you can also use arcpy.da.Describe(), so long as you pass a path as the argument instead of the Layer object and then call another function that seems to force the lock release; in my case that was arcpy.management.GetCount(). There seem to be two bugs here. First, when you pass a Layer object to a function like Describe(), whether from the da module or not, a lock is created that cannot be released. Second, when you use the arcpy.da module, a lock is created that can only be released if you use paths to the data, since passing Layer objects creates further locks. See my post with additional code samples demonstrating the issues.
Posted 10-17-2023, 12:40 PM
POST
This appears to be a wider problem with the arcpy.da module. As documented in this post, arcpy.da.Describe() also creates a schema lock in a file geodatabase when a layer is used as the argument. I am hoping this topic gets some attention and is ultimately addressed without having to resort to workarounds, but for arcpy.da.Describe() I found a workaround using arcpy.management.GetCount(). In my case, calling GetCount with the path to a table in the file geodatabase that held the lock from the arcpy.da module released the lock. Note that I was using paths to the data, not arcpy.mp.Layer objects.

In the post referenced above, that wasn't quite enough. As mentioned there, using arcpy.Describe() with a path as the argument was the workaround: arcpy.Describe() with a Layer object as the argument still created a schema lock, and arcpy.da.Describe() with a path as the argument also resulted in a schema lock. Calling arcpy.management.GetCount() released the lock only when arcpy.da.Describe() was used with a path as the argument; it did not work when a Layer object was passed to either arcpy.Describe() or arcpy.da.Describe().

I've included a code sample below that demonstrates the schema locks and the partial success in releasing them for arcpy.da.Describe(). Note that you'll have to reset the Python environment between cases 3 and 4, since case 3 doesn't release the lock.

Edit: I've included a fifth case demonstrating lock files being created by the Layer object without the use of the arcpy.da module. Note again that the Python environment should be reset between cases 4 and 5.

import arcpy
aprx_path = r"path to a Pro project with a map containing feature layers in a file gdb"
aprx = arcpy.mp.ArcGISProject(aprx_path)
m = aprx.listMaps()[0]
lyrs = m.listLayers()
lyr = lyrs[0]
# case 1 - lock created by arcpy.da.Describe() with a path as the argument
# lock released using arcpy.management.GetCount() with a path as the argument
desc1 = arcpy.da.Describe(lyr.dataSource)
result1 = arcpy.management.GetCount(lyr.dataSource)
# case 2 - lock NOT created using arcpy.Describe() with a path as the argument
desc2 = arcpy.Describe(lyr.dataSource)
# case 3 - lock created by arcpy.da.Describe() with a Layer object as the argument
# lock NOT released using arcpy.management.GetCount() with a path as the argument
desc3 = arcpy.da.Describe(lyr)
result3 = arcpy.management.GetCount(lyr.dataSource)
# case 4 - lock created by arcpy.Describe() with a Layer object as the argument
# lock NOT released using arcpy.management.GetCount() with a path as the argument
desc4 = arcpy.Describe(lyr)
result4 = arcpy.management.GetCount(lyr.dataSource)
# case 5 - lock created by using arcpy.management.GetCount() with a Layer object as the argument
# lock NOT released using arcpy.management.GetCount() with a path as the argument
result5 = arcpy.management.GetCount(lyr)
del result5 # this does remove one of two lock files from the gdb
result6 = arcpy.management.GetCount(lyr.dataSource)
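To confirm what each case above does, you can watch the geodatabase folder itself rather than relying on error messages: file geodatabase schema locks show up as *.lock files inside the .gdb directory. Below is a minimal, arcpy-free sketch of that check; the geodatabase path in the usage comment is a placeholder you'd replace with your own.

```python
import glob
import os

def list_gdb_locks(gdb_path):
    """Return the names of lock files currently present in a file geodatabase folder.

    File geodatabase locks appear as *.lock files inside the .gdb directory,
    so listing them between calls shows when a lock is created or released.
    """
    pattern = os.path.join(gdb_path, "*.lock")
    return sorted(os.path.basename(p) for p in glob.glob(pattern))

# Hypothetical usage: call between each case above to watch locks come and go.
# print(list_gdb_locks(r"C:\data\example.gdb"))
```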
Posted 10-17-2023, 12:28 PM
POST
I'm using ArcGIS Pro 3.1.2. Calling TableToNumPyArray or FeatureClassToNumPyArray creates a schema lock on the file geodatabase containing the feature class or table used as input when the code is executed in a Jupyter Notebook in a standalone environment. It also creates a schema lock when executed via a standalone Python window. This does not happen when the code is executed from a Jupyter Notebook run inside ArcGIS Pro, or from the Python Console in ArcGIS Pro.

The only method I found for releasing the schema lock is restarting the kernel or exiting Python in the standalone window. This seems like a bug, since it doesn't occur when the code is executed inside Pro. Is there an undocumented way to release the schema lock?

Provide a path to your data and a valid field name, and the code below will produce locks in your geodatabase when executed in a Jupyter Notebook outside of Pro.

import arcpy
import numpy
tbl = r"path to your table here"
fld = "a field in your table"
nulls_bool = False # Tested with both True and False, doesn't matter
arr = arcpy.da.TableToNumPyArray(tbl, fld, skip_nulls=nulls_bool)
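For context on what the call above returns: TableToNumPyArray produces a NumPy structured array, so the lock persisting is notable given that the result itself is plain NumPy data with no remaining tie to the geodatabase. A rough stand-in for the result (field name and values are made up for illustration) looks like:

```python
import numpy

# A one-field structured array, the same kind of object that
# TableToNumPyArray returns for a single requested field.
# The field name and values here are illustrative only.
arr = numpy.array([(1.5,), (2.0,), (3.25,)], dtype=[("a_field", "<f8")])

print(arr.dtype.names)        # fields are addressed by name
print(arr["a_field"].mean())  # and support normal NumPy operations
```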
Posted 10-12-2023, 12:52 PM
POST
@DrewFlater Thanks for following up and posting the (future) solution.
Posted 10-06-2022, 12:10 PM
POST
By converted, I meant right-clicking to copy the tool from the existing toolbox and then right-clicking on the new atbx toolbox and pasting. However, for troubleshooting purposes, I created a new atbx toolbox and a new tool in that toolbox. The new tool had a single parameter and a Python script as simple as you can get. "Parameter corruption" may not be the best title for this post, but at the very least a string parameter populated by a value list that is set in initializeParameters is not working properly. Also, my ArcGIS Pro is up to date; this was all using version 3.0.1.
Posted 08-30-2022, 08:51 AM
POST
I converted a working toolbox with Python script tools from the tbx format to the new atbx format and experienced behavior where a tool parameter was not being populated correctly. I was able to troubleshoot and determine when the parameter is populated correctly and when it is not. The difference occurs in the JSON-type files tool.content and tool.content.rc that can be accessed through 7-Zip (or another archive utility). I'm guessing it relates to the "loading/opening of a toolbox with many tools, as well as the access of those tools' properties and parameters, when a particular tool is opened" discussed here.

The parameter has a data type of String and a Filter of Value List, and is used to list the layers in the map that pass validation. For troubleshooting, I stripped everything down to remove the validation and simply populate the parameter with all layers inside initializeParameters of the ToolValidator.

def initializeParameters(self):
    # Customize parameter properties.
    # This gets called when the tool is opened.
    upd_list = []
    project = arcpy.mp.ArcGISProject("CURRENT")
    m = project.activeMap
    for lyr in m.listLayers():
        upd_list.append(lyr.name)
    self.params[0].filter.list = upd_list
    return

Pretty basic stuff. However, when you open the tool, the parameter is sometimes populated with only blank entries. See below. After opening the tool with an active map view window open, it fails to populate; if you open the tool properties and inspect the Value List, it should have the list of layers from the map view. To get the tool working, I enabled the Multiple Values checkbox of the parameter, reopened the tool (the layers were all listed), then disabled the Multiple Values checkbox again, and the layers were still listed properly. However, the parameter breaks again if you add more layers to the map.

It seems like the performance improvements they tried to implement actually end up breaking functionality (validating/filtering the layers in a map before populating a parameter with them) that I'm not aware of being offered in another manner. The parameter items seem to be cached in the tool.content and tool.content.rc files (see below), and refreshing the toolbox doesn't seem to rebuild the cache. This makes atbx toolboxes unsuitable for sharing.

The tool.content file is below. In the file from a tool with the parameter populated, the parameter has a domain property with items reflecting the layers populated in initializeParameters.

{
"type": "ScriptTool",
"displayname": "$rc:title",
"app_ver": "13.0",
"product": "100",
"params": {
"list_of_layers": {
"displayname": "$rc:list_of_layers.title",
"datatype": {
"type": "GPString"
},
"domain": {
"type": "GPCodedValueDomain",
"items": [
{
"value": "aspect.tif",
"code": "$rc:list_of_layers.domain.aspect.tif"
},
{
"value": "slope.tif",
"code": "$rc:list_of_layers.domain.slope.tif"
},
{
"value": "fl_down.tif",
"code": "$rc:list_of_layers.domain.fl_down.tif"
}
]
}
}
}
}

The tool.content.rc file is below; it has some example layers that were included in the parameter and refers back to the contents of the tool.content file.

{
"map": {
"list_of_layers.domain.aspect.tif": "aspect.tif",
"list_of_layers.domain.fl_down.tif": "fl_down.tif",
"list_of_layers.domain.slope.tif": "slope.tif",
"list_of_layers.title": "List of Layers",
"title": "Example"
}
}

I've attached the toolbox that worked for me after I toggled the Multiple Values checkbox in the Parameters tab of the tool properties to force the rebuild of the tool.content and tool.content.rc files. Unless you create layers with the names from above, the tool won't populate the layer names from the map.

Is this a known bug/limitation? Are there plans to fix this? Are there any other ways to filter and validate which layers from a map will be included in a parameter?
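Since an .atbx file is just a zip archive, the cached value list can also be inspected programmatically instead of through 7-Zip. Below is a sketch; the internal path <tool_name>.tool/tool.content is an assumption based on what the archive looks like when opened in an archive utility, so verify it against your own toolbox.

```python
import json
import zipfile

def read_cached_value_list(atbx_path, tool_name, param_name):
    """Pull the cached value-list items for a parameter out of an .atbx toolbox.

    Assumes the archive stores each tool's definition at
    <tool_name>.tool/tool.content (path layout inferred from 7-Zip inspection).
    """
    with zipfile.ZipFile(atbx_path) as z:
        content = json.loads(z.read(f"{tool_name}.tool/tool.content"))
    domain = content["params"][param_name].get("domain", {})
    return [item["value"] for item in domain.get("items", [])]
```

Comparing this list before and after toggling the Multiple Values checkbox would show whether the cache was actually rebuilt.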
Posted 08-29-2022, 05:21 PM
POST
This continues to be an issue and is occurring for me in ArcGIS Pro 2.8.2. After selecting "Don't create index" in the Options, it took a few minutes for new entries to stop being added to the sde_setup.log in the Temp directory and for new .geodatabase files to stop being created, but it does seem to have stopped the issue. @TheodoreF, you mentioned there are two bugs, but you listed the same number twice (BUG-000138481). What's the other one, and does it show any change in status? Thanks, Shea
Posted 09-07-2021, 05:30 PM
POST
This may be old, but nothing has changed in ArcMap and it's still a problem there. It does not appear to be an issue in ArcGIS Pro, however. I was having an identical problem doing a similar task using ExtendTable and a NumPy array to add fields to an existing feature class, and thank goodness for this thread because it solved my issue. Simply casting the field names in the array to str solved it. The simplest demonstration is putting the code from Esri's ExtendTable documentation in the execute block of a Python Toolbox, like below.

# In ArcMap, this results in "data type not understood"
# In ArcGIS Pro, this succeeds
array = numpy.array([(1, 'a', 1111.0), (2, 'b', 2222.22)],
                    numpy.dtype([('idfield', numpy.int32),
                                 ('textfield', '|S256'),
                                 ('doublefield', '<f8')]))
# In ArcMap, this succeeds
# In ArcGIS Pro, this succeeds
array = numpy.array([(1, 'a', 1111.0), (2, 'b', 2222.22)],
                    numpy.dtype([(str('idfield'), numpy.int32),
                                 (str('textfield'), '|S256'),
                                 (str('doublefield'), '<f8')]))

I've also attached a Python Toolbox that uses the same code to demonstrate it.
Posted 07-20-2021, 07:57 PM
POST
I tried this again by following the steps as directed in https://www.usgs.gov/media/videos/lesson-10f-national-map-3dep-elevation-web-service-arcmap and managed to add the service to a map successfully. As @TrevorHobbs noted, it was not added as a WCS Server, which I think was my problem before. For me, the connection to add via Catalog was "New ArcGIS Server Connection" in ArcGIS Pro and "Add ArcGIS Server" in ArcMap. I'm not sure if it matters, but the URL I added was https://elevation.nationalmap.gov/arcgis/services/3DEPElevation/ImageServer/WCSServer without the query string. Shea
Posted 07-12-2021, 11:32 AM
POST
I think that's what I had done initially; otherwise, I'm not certain the SDK would have installed, but I'm not positive at this point. If it does allow you to install the SDK without having Visual Studio installed first, then I'm guessing that's what I did initially and that was my problem. So yes, I would recommend having VS 2019 installed first, then installing the SDK. Fortunately, if it allows you to and you did it backwards, the uninstall/reinstall of the SDK was fairly quick (5-10 minutes, and no reboots that I recall).
Posted 12-21-2020, 02:30 PM
POST
An uninstall/reinstall of the ArcObjects SDK fixed this issue for me, and now the add-in is created and the templates show up when creating a new project. Performing a repair of the ArcObjects SDK on Friday did not work. I'm not sure why the initial install or repair didn't work; Visual Studio Community 2019 did appear during setup (see screenshot) and there were no errors during the install. Also, while troubleshooting the issue, I noticed it is possible to have Visual Studio 2017 and 2019 installed concurrently, both recognized by the ArcObjects SDK setup program, with the SDK installed and working for both. Shea
Posted 12-07-2020, 08:31 AM
POST
According to the documentation, Visual Studio 2019 is supported for use with the ArcObjects SDK in 10.8. A 10.8 project I built in Visual Studio 2017 builds, but the add-in for the project is not created. In the past, the culprit has been the ESRI.ArcGIS.AddIns.targets entry in the XML of the project file, as documented here. I'm familiar with this and have made these changes in the past, but I couldn't figure it out quickly on my new computer with fresh installs of Visual Studio and ArcMap. So I decided to create a new add-in project from scratch in Visual Studio 2019 to look for any differences that could be causing the problem. However, on the Create New Project screen (see screenshot), I couldn't find any ArcGIS entries as described in the documentation here. Next, I went looking for the ESRI folder within the MSBuild folder referred to in the second link, so now I'm wondering if perhaps the SDK isn't being completely installed with Visual Studio 2019 installations. Has anyone else used Visual Studio 2019 with ArcMap and ArcObjects 10.8? Thanks, Shea
Posted 12-04-2020, 03:06 PM
POST
Is it possible to set the environment settings at the tool level, versus for the entire geoprocessor, in ArcObjects? This can obviously be done for specific tools run via ArcToolbox and ModelBuilder, but I can't find any examples of it being done at the tool level in ArcObjects when using ESRI.ArcGIS.Geoprocessor.Geoprocessor or ESRI.ArcGIS.Geoprocessing.GeoProcessor. Shea
Posted 06-11-2019, 01:15 PM
POST
I'm seeing this behavior in ArcGIS 10.5 with 64-bit geoprocessing installed. Any updates? Was this ever identified as a bug by ESRI?
Posted 02-13-2017, 01:35 PM
POST
I have an 8-bit raster (TIFF, IMG, and FGDB rasters all exhibit the same behavior) with a pixel type of unsigned integer. In ArcMap 10.2, the value field is incorrectly identified as a double, so it can't be used in tools like Tabulate Area that require an integer field for the zone field. In ArcMap 10.3 and 10.4.1, the value field is correctly identified as a long. The problem seems to be an introduced one, because the value field in the original raster is correctly identified as a long. The original raster was the 2011 NLCD Land Cover (Multi-Resolution Land Characteristics Consortium (MRLC)), which was then clipped using the Clip tool, and the result of this is where the value field is misidentified as a double.

In summary:
- Executing Clip on an integer raster with the output saved as a TIFF, IMG, or FGDB raster results in the value field being incorrectly identified as a double in ArcMap 10.2, but correctly identified as a long in ArcMap 10.3 and 10.4.1.
- Executing Clip on an integer raster with the output saved as a GRID results in the value field being correctly identified as a long in ArcMap 10.2, 10.3, and 10.4.1.
- Exporting the clipped TIFF, IMG, or FGDB raster to a GRID results in the field being correctly identified as a long in ArcMap 10.2, 10.3, and 10.4.1.
- Exporting the clipped TIFF, IMG, or FGDB raster to a TIFF, IMG, or FGDB raster results in the field being incorrectly identified as a double in ArcMap 10.2, but correctly identified as a long in ArcMap 10.3 and 10.4.1.

Has anyone experienced this bug, or can you reproduce it?
Posted 11-04-2016, 04:51 PM