Insert Fails at 1000 records

06-11-2010 10:34 AM
MichaelFischer
New Contributor II
I'm trying to insert a number of points into an ArcSDE 9.3.1 geodatabase, inside an edit session.

I've found that at 1000 features I get an exception: HRESULT 0x8005018B.

This only happens against SDE; the same code against a personal geodatabase is fine.

I've tried flushing the insert cursor at 500-record intervals, but it made no difference.
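
The loop is essentially this (simplified sketch; field assignments are omitted and the variable names are illustrative):

```csharp
// Simplified sketch of the insert loop. Assumes featureClass is the
// target IFeatureClass and the edit session is already started.
using ESRI.ArcGIS.Geodatabase;
using ESRI.ArcGIS.Geometry;

IFeatureCursor insertCursor = featureClass.Insert(true);  // buffered insert cursor
IFeatureBuffer buffer = featureClass.CreateFeatureBuffer();

for (int i = 0; i < pointCount; i++)
{
    IPoint point = new PointClass();
    point.PutCoords(xValues[i], yValues[i]);
    buffer.Shape = point;
    insertCursor.InsertFeature(buffer);

    if ((i + 1) % 500 == 0)
        insertCursor.Flush();  // flush at 500-record intervals
}
insertCursor.Flush();  // final flush for the remainder
```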

Any ideas? 

Thanks
5 Replies
AlexanderGray
Occasional Contributor III
Right off the bat it seems like a problem with SDE or the RDBMS. It sounds like there is a buffer or temp table that is filling up, or one that has a maximum set on it. I am not really an SDE or Oracle guru, but I would look in that direction.
KirkKuykendall
Occasional Contributor III
Are you doing your inserts between IEditor.StartOperation and IEditor.StopOperation?
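
If not, the usual pattern looks roughly like this (sketch; assumes m_editor is the application's IEditor and an edit session is already running):

```csharp
// Wrap the whole bulk insert in a single edit operation.
m_editor.StartOperation();
try
{
    // ... run the insert loop here ...
    m_editor.StopOperation("Bulk insert points");
}
catch
{
    m_editor.AbortOperation();  // roll back the partial operation
    throw;
}
```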
JamesGonsoski
New Contributor III
It may or may not have anything to do with your issue, but check this help page on ArcSDE Initialization Parameters. Note the default value of the AUTOCOMMIT parameter.
JohnHauck
Occasional Contributor II
What type of underlying database is it? Is SDE also at 9.3.1? Can you post some example code? I tried a quick test against SQL Server Express and didn't see a problem inserting 100,000 features.
RemigijusPankevicius
New Contributor
It must be the ArcSDE AUTOCOMMIT parameter, which is 1000 by default.
http://proceedings.esri.com/library/userconf/proc01/professional/papers/pap869/p869.htm

I've seen examples of how to do these bulk inserts without any DB tuning, at the cost of working outside the transaction (i.e., outside the edit session).
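
Roughly like this (untested sketch): do the inserts outside any edit session, so there is no long-running transaction and ArcSDE commits on its own AUTOCOMMIT interval.

```csharp
// Untested sketch: bulk insert with no edit session or edit operation.
// ArcSDE commits on its AUTOCOMMIT interval; the trade-off is that a
// failure partway through leaves the already-flushed rows committed.
IFeatureCursor cursor = featureClass.Insert(true);  // buffered cursor
IFeatureBuffer buffer = featureClass.CreateFeatureBuffer();

foreach (IPoint point in points)  // 'points' is illustrative
{
    buffer.Shape = point;
    cursor.InsertFeature(buffer);
}
cursor.Flush();  // note: no StartEditing/StartOperation anywhere
```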