What is the proper workflow to recreate data every day?

02-12-2015 09:40 AM
TomMagdaleno
Occasional Contributor III

We have a company-wide database (it's Sungard HTE, aka AS400).  We have another program that pulls data out of that database and makes shapefiles; that's all it can output to, shapefiles or SDE.  So I have it creating new shapefiles every morning.  I have ArcCatalog on our server and a model in ArcCatalog which will:

1. Delete the previous day's file geodatabase, HTE.gdb

2. Create a new geodatabase, HTE.gdb

3. Import the shapefiles into HTE.gdb
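
In arcpy terms, those three steps look roughly like this (a sketch only; the paths are hypothetical):

import arcpy
import os

folder = r"D:\Data"                     # hypothetical locations; adjust to your environment
gdb = os.path.join(folder, "HTE.gdb")
shapefile_folder = r"D:\Data\Shapefiles"

# 1. Delete the previous day's file geodatabase
if arcpy.Exists(gdb):
    arcpy.Delete_management(gdb)

# 2. Create a new file geodatabase
arcpy.CreateFileGDB_management(folder, "HTE.gdb")

# 3. Import the new shapefiles
arcpy.env.workspace = shapefile_folder
shapefiles = arcpy.ListFeatureClasses("*.shp")
arcpy.FeatureClassToGeodatabase_conversion(shapefiles, gdb)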

This Geodatabase is used by users in ArcGIS Server applications and ArcReader.

Because I have been making so many changes and experiments, I find it easier to have a batch file call the ModelBuilder model and run it, instead of exporting to a new Python script every time I make a change.

Unfortunately, I have had nothing but problems with this workflow.

The current problem that is making me pull my hair out is that the previous night's HTE.gdb will not delete.  I look in the geodatabase in Windows Explorer and sometimes I see lock files from people who left their computers on with ArcReader open, or a lock file from ArcGIS Server.  I made a batch file to stop the services on the server that hit that file geodatabase, but even that does not work.  Often, with no lock file present, it still will not delete.  Other times I can get people to exit their GIS applications and the lock file will remain.  The only night it works cleanly is the night we do Windows patches and everyone's computer is forcibly rebooted.

What are other people doing?
What is the proper workflow to overwrite data on a nightly basis? 

3 Replies
JakeSkinner
Esri Esteemed Contributor
(Accepted Solution)

Hi Tom,

If you are importing the same shapefiles into the HTE.gdb every day, you should take a look at using the Truncate Table and Append tools.  These tools will work whether there are locks or not.  The tools will simply delete all rows in the feature classes, and then append the new rows.
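
In arcpy, the pattern is just two calls per feature class, something like this (a sketch; the paths are hypothetical):

import arcpy

shapefile = r"D:\Data\Shapefiles\parcels.shp"   # hypothetical paths
target_fc = r"D:\Data\HTE.gdb\parcels"

# Delete all rows in the target feature class
arcpy.TruncateTable_management(target_fc)

# Load the freshly exported rows; NO_TEST skips strict schema matching
arcpy.Append_management(shapefile, target_fc, "NO_TEST")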

BlakeTerhune
MVP Regular Contributor

I have a similar situation. I just tested using truncate and append on a file geodatabase with locks. If it was just a geodatabase lock (like someone had it open in ArcCatalog), the truncate/append worked. However, if someone had the feature class being truncated open in ArcCatalog preview, the truncate worked but the append failed with:

ERROR 000224: Cannot insert features

Failed to execute (Append).

I tested this on 10.2.2 using Python.

EDIT:

If the feature class is opened in ArcCatalog with geometry preview, truncate/append works. If it's open with table preview, truncate works but append fails. The same thing happens with ArcMap: if the feature class is open in data view it will truncate/append, but if the attribute table is also open it will truncate but fail to append.
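
One way to guard against that failure mode is to trap the append error and report it, e.g. (a sketch; the paths are hypothetical):

import arcpy

shapefile = r"D:\Data\Shapefiles\parcels.shp"   # hypothetical paths
target_fc = r"D:\Data\HTE.gdb\parcels"

arcpy.TruncateTable_management(target_fc)
try:
    arcpy.Append_management(shapefile, target_fc, "NO_TEST")
except arcpy.ExecuteError:
    # Raised for failures like ERROR 000224: Cannot insert features,
    # e.g. when someone has the table open in an ArcCatalog/ArcMap preview
    print(arcpy.GetMessages(2))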

TomMagdaleno
Occasional Contributor III

Thank you both for the replies so far.  I wasn't clear earlier, but the daily updates are not just the changes; everything is recreated from scratch.  So to prevent duplicate data, the old data will need to be deleted first.

I'm waiting until 5:00 so I can go nuclear and stop ArcGIS Server in order to delete the old HTE.gdb and recreate it.  After that I will try to build a model to do what you suggested, Jake.  I tried it out on another dataset and it went pretty fast.  I was afraid it would take a long time to delete the rows of data and then append new data to the empty feature class.  Hopefully it will let me delete the rows of old data.
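
For anyone following along, a truncate/append pass over every feature class could look something like this (a sketch; it assumes each feature class in HTE.gdb has a matching shapefile name):

import arcpy
import os

gdb = r"D:\Data\HTE.gdb"                # hypothetical paths
shapefile_folder = r"D:\Data\Shapefiles"

arcpy.env.workspace = gdb
for fc in arcpy.ListFeatureClasses():
    shapefile = os.path.join(shapefile_folder, fc + ".shp")
    # Empty the feature class, then reload it from the fresh export
    arcpy.TruncateTable_management(fc)
    arcpy.Append_management(shapefile, fc, "NO_TEST")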
