We have a company-wide database (Sungard HTE on an AS/400). We have another program that pulls data out
of those databases and makes shapefiles.
That's all it can output to: shapefiles or SDE. So I have it create new shapefiles every morning. I have
ArcCatalog on our server and a model in ArcCatalog which will:

1. Delete the previous day's file geodatabase, HTE.gdb
2. Create a new geodatabase, HTE.gdb
3. Import the shapefiles into HTE.gdb
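For what it's worth, the three model steps boil down to a guard-and-rebuild pattern. The sketch below is a stdlib-only stand-in I use to reason about it (the function name and paths are my own placeholders); the comments mark where the real arcpy calls (`Delete_management`, `CreateFileGDB_management`, then the shapefile imports) would go in the actual script.

```python
import glob
import os
import shutil

def rebuild_gdb(gdb_path):
    """Delete yesterday's file geodatabase and recreate it, but only
    if nothing is holding a lock on it. Returns True on success."""
    if os.path.isdir(gdb_path):
        # Refuse to delete while any *.lock file is present; trying to
        # delete a locked FGDB is what leaves it half-removed.
        locks = glob.glob(os.path.join(gdb_path, "*.lock"))
        if locks:
            print("Skipping rebuild, locks held: %s" % locks)
            return False
        # In the real workflow this would be arcpy.Delete_management(gdb_path)
        shutil.rmtree(gdb_path)
    # ...and this would be arcpy.CreateFileGDB_management(...), followed by
    # the shapefile imports. Here we just recreate the folder as a stand-in.
    os.makedirs(gdb_path)
    return True
```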
This geodatabase is used by users in ArcGIS Server applications and in ArcReader.
Because I have been making so many changes and running so many experiments, I
find it easier to have a batch file call the model and run it, instead of
exporting to a new Python script every time I make a change.
Unfortunately I have had nothing but problems with this workflow.
The current problem that is making me pull my hair out is that the previous night's HTE.gdb will not delete. When I look in the geodatabase in
Windows Explorer, sometimes I see lock files from people who left their
computers on with ArcReader open, or a lock file from ArcGIS Server. I made a batch file to stop the services on the
server that hit that file geodatabase, but even that does not work. Often it still will not
delete even with no lock file present; other times I can get people to exit their GIS applications and the lock file remains. The only night it works cleanly is the night
we do Windows patches and everyone's computer is forcibly rebooted.
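One thing that at least helps me see who is holding the geodatabase open is logging the lock file names before the nightly run. From what I observe in Explorer (this is an observation, not a documented format), the lock names embed the client machine name, so a tiny hypothetical helper like this can name the offending machines in the job's log:

```python
import glob
import os

def report_locks(gdb_path):
    """Return the *.lock file names inside a file geodatabase so the
    nightly job can log who appears to be holding it open."""
    locks = glob.glob(os.path.join(gdb_path, "*.lock"))
    # FGDB lock names appear to embed the client machine name, e.g.
    # "_gdb.SOMEPC.4242.sr.lock" -- handy for tracking down an ArcReader
    # session left open overnight. (Assumed format, not a documented
    # contract.)
    return sorted(os.path.basename(p) for p in locks)
```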
What are other people doing?
What is the proper workflow to overwrite data on a nightly basis?