Truncate and Append Hosted Parcel Fabric

2 weeks ago
DenverBilling
New Contributor III

Hello,

I have a parcel fabric in an enterprise geodatabase, and I would like to overwrite the existing singlepart polygons with multipart polygons of the same data.

What is the best workflow to 'truncate and append' the existing parcels with the new multipart parcels? Truncate and Append does not appear to work with a hosted parcel fabric. Will I need to export the parcel fabric and edit it in a fGDB instead? Looking for a workaround!

Thank you,
Denver

4 Replies
AmirBar-Maor
Esri Regular Contributor
(Accepted Solution)

Hello @DenverBilling 

The term "hosted" usually refers to storing the data in the Enterprise's internal self-managed datastore.

If your data is in an enterprise geodatabase (in a DBMS), you cannot edit the data directly. We recommend performing all of the data migration in a file geodatabase, then copying it to the enterprise geodatabase and publishing it for multi-user editing (branch versioning).

Running the Truncate Table geoprocessing tool on a table that is controlled by a geodatabase topology will not work. Use the Delete Rows geoprocessing tool instead.

So the workflow might look like this (a rough arcpy sketch follows the list):

  1. Copy the data to a file geodatabase; copying the entire feature dataset is the preferred way.
  2. Use the Delete Rows geoprocessing tool to empty the parcel polygon feature class.
  3. Append the new multipart data to the empty feature class.
  4. Copy the feature dataset containing the parcel fabric back to the enterprise geodatabase.
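To make the steps concrete, here is a rough arcpy sketch of the workflow. Every path and feature class name below (the .sde connection, the staging geodatabase, "Parcels", "Parcels_multipart") is an illustrative placeholder, not a name from this thread:

import arcpy

# All paths below are placeholders -- substitute your own connection,
# staging geodatabase, and feature class names.
sde_fds = r"C:\connections\gis.sde\ParcelFabricDataset"      # enterprise feature dataset
staging_fds = r"C:\staging\parcels.gdb\ParcelFabricDataset"  # file geodatabase copy

# 1. Copy the entire feature dataset to the file geodatabase
arcpy.management.Copy(sde_fds, staging_fds)

# 2. Empty the parcel polygon feature class with Delete Rows
#    (Truncate Table fails on classes controlled by a geodatabase topology)
parcels = staging_fds + r"\Parcels"
arcpy.management.DeleteRows(parcels)

# 3. Append the new multipart features into the now-empty feature class
multipart_source = r"C:\staging\parcels.gdb\Parcels_multipart"
arcpy.management.Append(multipart_source, parcels, "NO_TEST")

# 4. Copy the updated feature dataset back to the enterprise geodatabase
#    (Copy will not overwrite -- if the dataset still exists there, delete it first)
arcpy.management.Copy(staging_fds, sde_fds)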

If you have a very large amount of data, you can consider running the Disable Parcel Topology geoprocessing tool before deleting the rows and appending the data, then running the Enable Parcel Topology tool once the load is done.
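A minimal sketch of that variation, assuming the parcel fabric lives in the staging file geodatabase (the path is a placeholder):

import arcpy

# Placeholder path to the parcel fabric in the staging file geodatabase
fabric = r"C:\staging\parcels.gdb\ParcelFabricDataset\ParcelFabric"

# Turn the parcel topology off before the bulk load...
arcpy.parcel.DisableParcelTopology(fabric)

# ...run Delete Rows and Append here, as in the sketch above...

# ...then turn the topology back on afterward
arcpy.parcel.EnableParcelTopology(fabric)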
Let us know if this works for you.

 

DenverBilling
New Contributor III

Thanks Amir,

You're right, I meant published to Portal by reference from an eGDB, not hosted.

I was able to Copy the feature dataset to a fGDB and make my edits with Delete Rows and Append.

Now that I want to copy the feature dataset back to the eGDB, I would like to overwrite the existing feature dataset. The Copy tool creates a new, separate copy even though they have the same name. What is the best practice in this instance? Do I need to remove the existing feature dataset before running Copy?

-Denver

AmirBar-Maor
Esri Regular Contributor

@DenverBilling 

Overwriting feature classes can be dangerous... I usually delete the feature dataset in my enterprise geodatabase before copying the updated feature dataset over from staging. Then you can follow the steps to publish it for editing.
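A hedged sketch of that delete-then-copy step (the paths are placeholders, and Delete is destructive, so keep a backup of the enterprise feature dataset first):

import arcpy

sde_fds = r"C:\connections\gis.sde\ParcelFabricDataset"      # placeholder enterprise path
staging_fds = r"C:\staging\parcels.gdb\ParcelFabricDataset"  # placeholder staging path

# Remove the existing feature dataset from the enterprise geodatabase...
if arcpy.Exists(sde_fds):
    arcpy.management.Delete(sde_fds)

# ...then copy the updated feature dataset over from staging
arcpy.management.Copy(staging_fds, sde_fds)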

This process is well documented here: https://pro.arcgis.com/en/pro-app/latest/help/data/parcel-editing/workflow-publishpf.htm

 

DenverBilling
New Contributor III

Thanks for the hand-holding @AmirBar-Maor, this worked for my needs!

-Denver