Any advice/experience with a large number of feature classes in a file geodatabase?

02-15-2024 07:20 AM
DuncanHornby
MVP Notable Contributor

I have a general question about best practice with file geodatabases.

I'm currently building a file geodatabase which will contain approximately 2,500 feature classes, each containing 1 million rows (cells from a vector grid created by the Fishnet tool).

I have subsequently run the Compact tool on the file geodatabase, which reduced it by approximately 50 GB, then ran the Compress tool to further reduce the data volume. I'm currently copying about 400 compressed feature classes into this database. It's chuggin' away and taking forever.
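For reference, a minimal arcpy sketch of that compact/compress/copy sequence might look like the following (the geodatabase paths are placeholders, not my real ones):

```python
# Minimal sketch only -- both paths are hypothetical placeholders.
import arcpy

target_gdb = r"C:\data\grids.gdb"   # the large file geodatabase being loaded
source_gdb = r"C:\data\source.gdb"  # geodatabase holding the tiles still to be copied

# Reclaim free space left behind by loads/edits/deletes
arcpy.management.Compact(target_gdb)

# Compress the geodatabase contents to reduce size on disk (data becomes read-only)
arcpy.management.CompressFileGeodatabaseData(target_gdb)

# Copy the remaining feature classes into the target geodatabase
arcpy.env.workspace = source_gdb
for fc in arcpy.ListFeatureClasses():
    arcpy.conversion.FeatureClassToGeodatabase(fc, target_gdb)
```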

So my question to the user community: is there a sensible number of feature classes one should have in a file geodatabase, beyond which performance degrades? Are people thinking "2,500 feature classes in a single file geodatabase, that's a crazy number, no wonder copying into it is taking forever"? Has anyone had similar experiences, and how did they resolve them? An obvious solution is to split the feature classes out into a set of geodatabases.

3 Replies
Ed_
MVP Regular Contributor

I am not a FGDB expert, but speed may vary between SDE and local disk (SDE is slower than local). Moreover, locally, a fast NVMe SSD may also help speed things up.

VinceAngelo
Esri Esteemed Contributor

Put me in the "crazy number" camp. It's also suboptimal for an enterprise geodatabase. The tools are optimized for rows, not for tables. When a Windows "list directory contents" request takes three minutes to complete, there's no chance that row access will be fast.

2,500 feature classes is too many, even spread across 100 file geodatabases. Far better to have 25 tables with 100 million rows each.
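A consolidation pass along those lines could be scripted roughly like this with arcpy (a sketch only; the paths, output names, and batch size are assumptions, and it presumes all the fishnet tiles share the same schema):

```python
# Sketch only: assumes every tile feature class shares the same schema.
# Paths, output names, and the batch size are hypothetical.
import os
import arcpy

src_gdb = r"C:\data\grids.gdb"         # geodatabase with the ~2,500 fishnet tiles
out_gdb = r"C:\data\grids_merged.gdb"  # target geodatabase for the consolidated tables
batch_size = 100                       # 100 tiles x 1M rows -> ~100M rows per table, 25 tables

# Create the target geodatabase if it doesn't already exist
if not arcpy.Exists(out_gdb):
    arcpy.management.CreateFileGDB(os.path.dirname(out_gdb), os.path.basename(out_gdb))

arcpy.env.workspace = src_gdb
tiles = sorted(arcpy.ListFeatureClasses())

for i in range(0, len(tiles), batch_size):
    batch = tiles[i:i + batch_size]
    out_name = f"cells_{i // batch_size:02d}"

    # Create each consolidated feature class once, using the first tile as a schema template
    arcpy.management.CreateFeatureclass(
        out_gdb, out_name, "POLYGON",
        template=batch[0],
        spatial_reference=arcpy.Describe(batch[0]).spatialReference)

    # Load the whole batch into the consolidated feature class in one Append call
    arcpy.management.Append(batch, os.path.join(out_gdb, out_name), "NO_TEST")
```

Note that NO_TEST skips Append's field-matching checks, so it is only safe when the tiles really do share an identical schema.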

- V

DuncanHornby
MVP Notable Contributor

Your advice is very much appreciated!
