Overwrite ArcGIS Online Feature Service using Truncate and Append

04-24-2020 05:57 AM


You may need to overwrite an ArcGIS Online hosted feature service due to feature and/or attribute updates.  However, overwriting can cause some data loss, such as configured pop-ups, symbology changes, etc., in the hosted feature service.  For example, you will receive a warning about this when you try to overwrite a feature service in ArcGIS Pro:

[Screenshot: ArcGIS Pro warning shown when overwriting a feature service]

One way around this is to use the ArcGIS API for Python.  If you are the data owner or an administrator, you can truncate the feature service and then append data.  This is essentially an overwrite of the feature service.  The script below does this given a local feature class and the item ID of the feature service you wish to update.  The script executes the following steps:

  • export the feature class to a temporary File Geodatabase
  • zip the File Geodatabase
  • upload the zipped File Geodatabase to AGOL
  • truncate the feature service
  • append the zipped File Geodatabase to the feature service
  • delete the uploaded zipped File Geodatabase in AGOL
  • delete the local zipped File Geodatabase
  • delete the temporary File Geodatabase
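
As the script shows, a true truncate is only possible when the service has no views and sync can be disabled; otherwise features are removed with delete_features in 2,000-OBJECTID batches.  That batching logic can be sketched as a small pure helper (batch_queries is a hypothetical name, not part of the script below):

```python
def batch_queries(min_oid, max_oid, oid_field="OBJECTID", size=2000):
    """Build where clauses that remove features in fixed-size OBJECTID ranges."""
    queries = []
    start = min_oid
    while start <= max_oid:
        end = start + size - 1
        queries.append(f"{oid_field} >= {start} AND {oid_field} <= {end}")
        start += size
    return queries

# Each clause would then be passed to fLyr.delete_features(where=clause)
print(batch_queries(1, 4500))
```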


Here is an explanation of the script variables:

  • username = ArcGIS Online username
  • password = password for the ArcGIS Online username
  • fc = path to the feature class used to update the feature service
  • fsItemId = item ID of the ArcGIS Online feature service
  • featureService = True if updating a Feature Service, False if updating a Hosted Table
  • hostedTable = True if updating a Hosted Table, False if updating a Feature Service
  • layerIndex = feature service layer index
  • disableSync = True to disable sync and then re-enable it after the append, False to not disable sync. Set to True if sync is not enabled
  • updateSchema = True to add/remove fields in the feature service so its schema stays in sync with the feature class, False to leave the schema unchanged


Note:  For this script to work, the field names in the feature class must match the field names in the hosted feature service.  The hosted feature service can have additional fields, though.
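
To catch a mismatch before anything is truncated, the two field lists can be compared up front.  The sketch below works on plain name lists (field_diff and the sample names are illustrative; in practice the names would come from arcpy.ListFields(fc) and fLyr.manager.properties.fields):

```python
def field_diff(service_fields, class_fields):
    """Return (only_in_service, only_in_class) field-name sets.

    Extra fields in the service are fine; fields present in the feature
    class but absent from the service will not transfer on append."""
    service = set(service_fields)
    local = set(class_fields)
    return service - local, local - service

# Illustrative field lists
only_service, only_class = field_diff(
    ["SITE_NAME", "STATUS", "NOTES"],
    ["SITE_NAME", "STATUS", "INSPECTOR"])
print(only_service)  # fields only in the service (allowed)
print(only_class)    # fields only in the class (would not append)
```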

A video of the workflow can be found at the link below:

https://youtu.be/aBYay-41gVA


import arcpy, os, time, uuid
from zipfile import ZipFile
from arcgis.gis import GIS
import arcgis.features

# Overwrite Output
arcpy.env.overwriteOutput = True

# Variables
username = "jskinner_rats"                                    # AGOL Username
password = "********"                                         # AGOL Password
fc = r"c:\projects\GeoNET\GeoNET.gdb\FACILITIES"              # Path to Feature Class/Table
fsItemId = "9d2c07aaf82249108d8669ef976e4542"                 # Feature Service Item ID to update
featureService = True                                         # True if updating a Feature Service, False if updating a Hosted Table
hostedTable = False                                           # True if updating a Hosted Table, False if updating a Feature Service
layerIndex = 0                                                # Layer Index
disableSync = True                                            # True to disable sync, and then re-enable sync after append, False to not disable sync.  Set to True if sync is not enabled
updateSchema = True                                           # True will remove/add fields from feature service keeping schema in-sync, False will not remove/add fields

# Start Timer
startTime = time.time()

# Create GIS object
print("Connecting to AGOL")
gis = GIS("https://www.arcgis.com", username, password)

# Create UUID variable for GDB
gdbId = str(uuid.uuid1())

# Function to zip the File Geodatabase
def zipDir(dirPath, zipPath):
    '''Zip File Geodatabase, skipping lock files'''
    gdb = os.path.basename(dirPath)
    # Context manager guarantees the zip handle is closed and released
    with ZipFile(zipPath, mode='w') as zipf:
        for root, _, files in os.walk(dirPath):
            for file in files:
                if 'lock' not in file:
                    filePath = os.path.join(root, file)
                    zipf.write(filePath, os.path.join(gdb, file))

print("Creating temporary File Geodatabase")
gdb = arcpy.CreateFileGDB_management(arcpy.env.scratchFolder, gdbId)[0]

# Export featureService classes to temporary File Geodatabase
fcName = os.path.basename(fc)
fcName = fcName.split('.')[-1]    # strip any database/schema qualifier (e.g. from SDE)
print(f"Exporting {fcName} to temp FGD")
if featureService == True:
    arcpy.conversion.FeatureClassToFeatureClass(fc, gdb, fcName)
elif hostedTable == True:
    arcpy.conversion.TableToTable(fc, gdb, fcName)

# Zip temp FGD
print("Zipping temp FGD")
zipDir(gdb, gdb + ".zip")

# Upload zipped File Geodatabase
print("Uploading File Geodatabase")
fgd_properties={'title':gdbId, 'tags':'temp file geodatabase', 'type':'File Geodatabase'}
fgd_item = gis.content.add(item_properties=fgd_properties, data=gdb + ".zip")

# Get featureService/hostedTable layer
serviceLayer = gis.content.get(fsItemId)
if featureService == True:
    fLyr = serviceLayer.layers[layerIndex]
elif hostedTable == True:
    fLyr = serviceLayer.tables[layerIndex]

# Truncate Feature Service
# If views exist or disableSync = False, use delete_features (OBJECTIDs will not reset)
flc = arcgis.features.FeatureLayerCollection(serviceLayer.url, gis)
hasViews = False
try:
    if flc.properties.hasViews == True:
        print("Feature Service has view(s)")
        hasViews = True
except Exception:
    hasViews = False

if hasViews == True or disableSync == False:
    objectIdField = fLyr.manager.properties.objectIdField
    # Get Min OBJECTID
    minOID = fLyr.query(out_statistics=[{"statisticType": "MIN", "onStatisticField": objectIdField, "outStatisticFieldName": "MINOID"}])
    minOBJECTID = minOID.features[0].attributes['MINOID']

    # Get Max OBJECTID
    maxOID = fLyr.query(out_statistics=[{"statisticType": "MAX", "onStatisticField": objectIdField, "outStatisticFieldName": "MAXOID"}])
    maxOBJECTID = maxOID.features[0].attributes['MAXOID']

    # If the OBJECTID range spans more than 2,000, delete in 2,000-ID batches
    if (maxOBJECTID - minOBJECTID) > 2000:
        print("Deleting features")
        x = minOBJECTID
        y = x + 1999
        while x < maxOBJECTID:
            query = f"{objectIdField} >= {x} AND {objectIdField} <= {y}"
            fLyr.delete_features(where=query)
            x += 2000
            y += 2000
    # Otherwise, delete all features in a single call
    else:
        print("Deleting features")
        fLyr.delete_features(where="1=1")

# If no views and disableSync is True: disable Sync, truncate, and then re-enable Sync.  OBJECTIDs will reset
elif hasViews == False and disableSync == True:
    if flc.properties.syncEnabled == True:
        print("Disabling Sync")
        properties = flc.properties.capabilities
        updateDict = {"capabilities": "Query", "syncEnabled": False}
        flc.manager.update_definition(updateDict)
        print("Truncating Feature Service")
        fLyr.manager.truncate()
        print("Enabling Sync")
        updateDict = {"capabilities": properties, "syncEnabled": True}
        flc.manager.update_definition(updateDict)
    else:
        print("Truncating Feature Service")
        fLyr.manager.truncate()

# Schema Sync
if updateSchema == True:
    # Get feature service fields
    print("Get feature service fields")
    featureServiceFields = {}
    for field in fLyr.manager.properties.fields:
        if field.type != 'esriFieldTypeOID' and 'Shape_' not in field.name and 'GlobalID' not in field.name:
            featureServiceFields[field.name] = field.type

    # Get feature class/table fields
    print("Get feature class/table fields")
    featureClassFields = {}
    arcpy.env.workspace = gdb
    if hostedTable == True:
        for field in arcpy.ListFields(fc):
            if field.type != 'OID' and field.type != 'Geometry' and 'GlobalID' not in field.name:
                featureClassFields[field.name] = field.type
    else:
        for field in arcpy.ListFields(fc):
            if field.type != 'OID' and field.type != 'Geometry' and 'Shape_' not in field.name and 'GlobalID' not in field.name:
                featureClassFields[field.name] = field.type

    minusSchemaDiff = set(featureServiceFields) - set(featureClassFields)
    addSchemaDiff = set(featureClassFields) - set(featureServiceFields)

    # Delete removed fields
    if len(minusSchemaDiff) > 0:
        print("Deleting removed fields")
        for key in minusSchemaDiff:
            print(f"\tDeleting field {key}")
            remove_field = {
                "name": key,
                "type": featureServiceFields[key]
            }
            update_dict = {"fields": [remove_field]}
            fLyr.manager.delete_from_definition(update_dict)

    # Create additional fields
    fieldTypeDict = {}
    fieldTypeDict['Date'] = 'esriFieldTypeDate'
    fieldTypeDict['Double'] = 'esriFieldTypeDouble'
    fieldTypeDict['Integer'] = 'esriFieldTypeInteger'
    fieldTypeDict['String'] = 'esriFieldTypeString'
    if len(addSchemaDiff) > 0:
        print("Adding additional fields")
        for key in addSchemaDiff:
            print(f"\tAdding field {key}")
            if fieldTypeDict[featureClassFields[key]] == 'esriFieldTypeString':
                new_field = {
                    "name": key,
                    "type": fieldTypeDict[featureClassFields[key]],
                    "length": [field.length for field in arcpy.ListFields(fc, key)][0]
                }
            else:
                new_field = {
                    "name": key,
                    "type": fieldTypeDict[featureClassFields[key]]
                }

            update_dict = {"fields": [new_field]}
            fLyr.manager.add_to_definition(update_dict)

# Append features from featureService class/hostedTable
print("Appending features")
fLyr.append(item_id=fgd_item.id, upload_format="filegdb", upsert=False, field_mappings=[])

# Delete Uploaded File Geodatabase
print("Deleting uploaded File Geodatabase")
fgd_item.delete()

# Delete temporary File Geodatabase and zip file
print("Deleting temporary FGD and zip file")
arcpy.Delete_management(gdb)
os.remove(gdb + ".zip")

endTime = time.time()
elapsedTime = round((endTime - startTime) / 60, 2)
print(f"Script finished in {elapsedTime} minutes")


Update 3/3/2023:  Added the ability to add/remove fields from the feature service to keep schemas in sync.  For example, if fields are added to or removed from the feature class, they will also be added to or removed from the feature service.


Comments

This method worked for me for a while but this week has started throwing a 500 error when it gets to using append(). Any ideas why that might be?

@A_Schwab after a quick test, I too am receiving this error.  I would recommend logging a case with Tech Support as I suspect this may be an issue with ArcGIS Online.  Try the following script as an alternative.

Thanks Jake. Good to know it's not just me. I've logged with tech support.

The issue that I experience is that at the end of the process, the zip file won't delete.  It says that it is still in use.  Any thoughts?

"PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\rossch\\Downloads\\996470ae-bd03-11ee-aaba-70d823122144.gdb.zip'"

@ChrisJRoss13 it could be a permissions issue for the scratch directory.  You can change this to another directory by updating line 43.  For example, change the below:

gdb = arcpy.CreateFileGDB_management(arcpy.env.scratchFolder, gdbId)[0]

 to something such as:

gdb = arcpy.CreateFileGDB_management(r"C:\temp", gdbId)[0]
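
If the directory itself is not the problem, another option is to retry the delete with a short backoff, since the zip can stay locked for a moment after the upload finishes (a sketch; remove_with_retry is a hypothetical helper, not part of the original script):

```python
import os
import time

def remove_with_retry(path, attempts=5, delay=2):
    """Delete a file, retrying on PermissionError (e.g. a lingering lock)."""
    for i in range(attempts):
        try:
            os.remove(path)
            return True
        except PermissionError:
            if i == attempts - 1:
                raise        # still locked after all attempts
            time.sleep(delay)
    return False
```

In the script, the final cleanup would then read remove_with_retry(gdb + ".zip") instead of os.remove(gdb + ".zip").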

Hi @JakeSkinner , I had actually changed the original scratch folder input to another directory, like you had indicated. This error is still occurring.

@ChrisJRoss13 did the error occur for both the scratch and other directory specified?

@JakeSkinner Yes, it occurred for both.

@JakeSkinner I'm running into an issue with the append step where no features are added, leaving the feature service empty. Everything else seems to work, though I did have to comment out the same line that @ChrisJRoss13 had trouble with. If I comment out the delete steps and check the file GDBs, everything is intact. I can "Publish" from the zipped GDB from the Portal item page and the layer it creates matches and has all of the features.

I did double-check that the feature service supports appends (it was set to true), but noticed that it doesn't list a GDB as an accepted type:

"supportedAppendFormats": "shapefile,geojson,csv,featureCollection,excel,jsonl"

Would this cause the Append step to not add any features?

@Tim-Woodfield 

I can't say for sure why this is occurring, but if you are working with Portal, it may be just as fast to execute ArcGIS Pro's Delete Features and Append tools.
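
It may also be worth confirming the format is accepted before calling append().  Below is a minimal check against the supportedAppendFormats value quoted above (supports_format is an illustrative helper; live code would read dict(fLyr.properties) instead of the hard-coded sample):

```python
def supports_format(properties, fmt):
    """Return True if the service's supportedAppendFormats includes fmt."""
    formats = properties.get("supportedAppendFormats", "")
    return fmt in [f.strip() for f in formats.split(",")]

# Sample value taken from the comment above
props = {"supportedAppendFormats": "shapefile,geojson,csv,featureCollection,excel,jsonl"}
print(supports_format(props, "filegdb"))  # a File Geodatabase append would be rejected
```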

@JakeSkinner @MelissaJohnson @ashleyf_lcpud Sorry it took me a bit to respond! 

My issue seemed to be that line 141 of the script excludes fields with "Shape_" in the name.  At least in my case, my SDE database does not have an underscore in the Shape field name.  This was causing issues with sync and append of the shape fields.  

Since the script plays nicer with layers from a file geodatabase, I used the existing temp gdb to compare schema with ArcGIS Online. I also had to add a dictionary entry to account for SmallInteger field types.  Here is my modified Schema Sync section, picking up at Line 122 of the original script: 

# Schema Sync
if updateSchema == True:
    # Get feature service fields
    print("Get feature service fields")
    featureServiceFields = {}
    for field in fLyr.manager.properties.fields:
        if field.type != 'esriFieldTypeOID' and 'Shape_' not in field.name and 'GlobalID' not in field.name:
            featureServiceFields[field.name] = field.type

    # Get feature class/table fields
    print("Get feature class/table fields")
    featureClassFields = {}
    arcpy.env.workspace = gdb
    fc_export = os.path.join(gdb, fcName) ## Added variable for temp fGDB feature class
    if hostedTable == True:
        for field in arcpy.ListFields(fc_export): ## using fgdb feature class rather than SDE fc
            if field.type != 'OID' and field.type != 'Geometry' and 'GlobalID' not in field.name:
                featureClassFields[field.name] = field.type
    else:
        for field in arcpy.ListFields(fc_export): ## using fgdb feature class rather than SDE fc
            if field.type != 'OID' and field.type != 'Geometry' and 'Shape_' not in field.name and 'GlobalID' not in field.name:
                featureClassFields[field.name] = field.type

    minusSchemaDiff = set(featureServiceFields) - set(featureClassFields)
    addSchemaDiff = set(featureClassFields) - set(featureServiceFields)

    # Delete removed fields
    if len(minusSchemaDiff) > 0:
        print("Deleting removed fields")
        for key in minusSchemaDiff:
            print(f"\tDeleting field {key}")
            remove_field = {
                "name": key,
                "type": featureServiceFields[key]
            }
            update_dict = {"fields": [remove_field]}
            fLyr.manager.delete_from_definition(update_dict)

    # Create additional fields
    fieldTypeDict = {}
    fieldTypeDict['Date'] = 'esriFieldTypeDate'
    fieldTypeDict['Double'] = 'esriFieldTypeDouble'
    fieldTypeDict['Integer'] = 'esriFieldTypeInteger'
    fieldTypeDict['SmallInteger'] = 'esriFieldTypeSmallInteger' ## Added dict entry for SmallInteger
    fieldTypeDict['String'] = 'esriFieldTypeString'
    if len(addSchemaDiff) > 0:
        print("Adding additional fields")
        for key in addSchemaDiff:
            print(f"\tAdding field {key}")
            if fieldTypeDict[featureClassFields[key]] == 'esriFieldTypeString':
                new_field = {
                    "name": key,
                    "type": fieldTypeDict[featureClassFields[key]],
                    "length": [field.length for field in arcpy.ListFields(fc_export, key)][0] ## using fgdb feature class rather than SDE fc
                }
            else:
                new_field = {
                    "name": key,
                    "type": fieldTypeDict[featureClassFields[key]]
                }

            update_dict = {"fields": [new_field]}
            fLyr.manager.add_to_definition(update_dict)


Jake, I'd be happy to share a dataset with you, but I believe my issues stemmed from using an SDE layer as the input, so I'm fairly confident that exporting my SDE data to a file gdb would mask the problem.

Hope this helps! 

@ChrisJRoss13 I am running into the same issue you are, did you ever figure it out? I'm currently using this to overwrite four feature services, and two of the four scripts run with no error, but the other two also get the PermissionError.

I can't figure out why two of them fail; there is no reason I can see. The script does still append and do everything else fine, it just leaves a scratch GDB at the end. It just makes it difficult to automate, since when the script fails Task Scheduler gives me an error code and I have to manually check to be sure the append succeeded.

@Chase_RSO try deleting the scratch folder at C:\Users\<useraccount>\appdata\local\temp.  Recreate this directory, and then grant your account Full Control on the scratch folder by right-clicking the folder > Properties > Security.

@JakeSkinner Thank you for the suggestion. I tried that no luck, I continue to get the same error message.

@JakeSkinner and @Chase_RSO I am getting that same error as well.  I changed the scratch directory to c:\temp, same error.  I have full privileges on my machine.

@Chase_RSO @SallyBickel what version of Pro are you running?

@JakeSkinner Pro 3.2.0  here

@Jake 3.2.2

@SallyBickel @Chase_RSO what IDE are you using to execute the script?  Try running the IDE as an Administrator (i.e. right-click on IDE > Run As Administrator).  Do you still receive the same error?

@JakeSkinner using PyScripter here and yes same results Run as Administrator.

Version history
Last update: 12-07-2023 09:34 AM