Geocoding a large list with no address - only city and state

12-14-2012 12:09 PM
SteveCline
Occasional Contributor
I am trying to geocode 58,000+ addresses which only contain city and state, no street address.  After a few hundred records, I get an error that says "There was an error trying to process this table."  It does not always happen at the same point.  Any suggestions for troubleshooting?
7 Replies
BradNiemand
Esri Regular Contributor
There are usually a few explanations for this issue.

1. You are getting a timeout from the server if you are hitting a geocoding service.
2. You have a bad value (or values) in your input table, or one of the input table's column names contains a space.
3. Your output is a shapefile and the 2 GB size limit was exceeded.
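Cause 2 can be checked mechanically before geocoding. A minimal sketch in plain Python (the field names and sample rows are made up for illustration; adapt to your own table):

```python
import csv
import io

def check_table(rows, required=("CITY", "STATE")):
    """Flag column names containing spaces and rows with blank required values."""
    problems = []
    for name in rows[0].keys():
        if " " in name:
            problems.append(f"column name contains a space: {name!r}")
    for i, row in enumerate(rows, start=1):
        for field in required:
            if not (row.get(field) or "").strip():
                problems.append(f"row {i}: blank {field}")
    return problems

# Example table with one bad column name and one blank CITY value.
data = io.StringIO("CITY,STATE,Zip Code\nDenver,CO,80202\n,WA,98101\n")
rows = list(csv.DictReader(data))
print(check_table(rows))  # reports the 'Zip Code' header and the blank CITY in row 2
```

Running something like this before the geocode run at least rules the input data in or out as the culprit.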

Brad
LarsHansen
New Contributor
I am getting the same issue; most of the time it bombs out around the 40K mark.  I am not hitting the file size limit, and I have scrubbed the data to ensure there are no bad values.  Any suggestions?  I need to geocode around 1.4M addresses for my current project.
JoeBorgione
MVP Emeritus
SteveCline wrote:
> I am trying to geocode 58,000+ addresses which only contain city and state, no street address.  After a few hundred records, I get an error that says "There was an error trying to process this table."  It does not always happen at the same point.  Any suggestions for troubleshooting?

LarsHansen wrote:
> I am getting the same issue; most of the time it bombs out around the 40K mark.  I am not hitting the file size limit, and I have scrubbed the data to ensure there are no bad values.  Any suggestions?  I need to geocode around 1.4M addresses for my current project.

I'd suggest that both of you split your files and geocode the subsets; if every subset bails out, it's a memory issue or something else not related to the data.  However, if some subsets geocode fine and others bail, that points to the data.

With all due respect, 1.4 million records is a lot of records for one bad one to hide in, and so is 58k for that matter....
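The split-and-isolate approach can be sketched in plain Python. Here `geocode_batch` is a stand-in for whatever geocoder you actually call (a locator, a service request, etc.), and the stub simply fails on a planted bad record:

```python
def geocode_in_chunks(records, geocode_batch, chunk_size=10000):
    """Geocode `records` in fixed-size chunks; collect results and
    record the index range of any chunk that fails, so a bad record
    can be narrowed down by re-splitting just that range."""
    results, failed_chunks = [], []
    for start in range(0, len(records), chunk_size):
        chunk = records[start:start + chunk_size]
        try:
            results.extend(geocode_batch(chunk))
        except Exception:
            failed_chunks.append((start, start + len(chunk)))
    return results, failed_chunks

# Stub geocoder that chokes on a known-bad record.
def fake_geocode(chunk):
    if "BAD" in chunk:
        raise ValueError("bad value in batch")
    return [(rec, (0.0, 0.0)) for rec in chunk]

records = ["Denver, CO"] * 25 + ["BAD"] + ["Seattle, WA"] * 25
ok, failed = geocode_in_chunks(records, fake_geocode, chunk_size=10)
print(len(ok), failed)  # → 41 [(20, 30)]
```

Re-running only the failed range with a smaller chunk size homes in on the offending record in a handful of passes.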

Hope this helps-
That should just about do it....
KimOllivier
Occasional Contributor III
LarsHansen wrote:
> I am getting the same issue; most of the time it bombs out around the 40K mark.  I am not hitting the file size limit, and I have scrubbed the data to ensure there are no bad values.  Any suggestions?  I need to geocode around 1.4M addresses for my current project.

I can happily geocode 1.6M addresses (full addresses) in 25 minutes with no problems in one process on a local PC.

Maybe avoid a network share?
I hope you are using a 10.x locator?
Since you have scrubbed the data, it must be something else.
At 10.1 you can tune the locator, memory allocation, etc.
You might get better performance by sorting on the required fields first.
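The sorting tip can be illustrated in plain Python (the field names are hypothetical). The idea is simply that grouping rows by the match fields lets the locator hit the same reference data for consecutive rows instead of jumping around:

```python
from operator import itemgetter

# Unsorted (state, city) rows as they might arrive in the input table.
records = [
    {"STATE": "WA", "CITY": "Seattle"},
    {"STATE": "CO", "CITY": "Denver"},
    {"STATE": "WA", "CITY": "Spokane"},
    {"STATE": "CO", "CITY": "Boulder"},
]

# Sorting groups identical and nearby keys, so the locator revisits the
# same reference-data pages consecutively rather than at random.
records.sort(key=itemgetter("STATE", "CITY"))
print([(r["STATE"], r["CITY"]) for r in records])
# → [('CO', 'Boulder'), ('CO', 'Denver'), ('WA', 'Seattle'), ('WA', 'Spokane')]
```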
jp
New Contributor II
Hi Kim,

I too have a table with just village, district, and state, in India.  I can search these locations manually in ArcMap and was able to find some of them that way.  But when I try to geocode the table, I get this error: "There was an error trying to process this table. Token required."

Do we need a license to use the online geocoding service?

Thanks!
KimOllivier
Occasional Contributor III
Esri have changed their online geocoding service from free to paid. If we are all moving to the Cloud, we will have to pay.
The individual search to find a single location remains free, which is why it worked in your test.
Google does much the same: searching for a few locations is free, but bulk geocoding is cut off at 50,000 addresses.

My performance figure was quoted using my own reference data to build a private locator on a local computer.
If you have access to a reference dataset you can build your own locator, but it is difficult enough that most people would rather buy the service.
SteveCline
Occasional Contributor
JoeBorgione wrote:
> I'd suggest that both of you split your files and geocode the subsets; if every subset bails out, it's a memory issue or something else not related to the data.  However, if some subsets geocode fine and others bail, that points to the data.
>
> With all due respect, 1.4 million records is a lot of records for one bad one to hide in, and so is 58k for that matter....
>
> Hope this helps-


It has been a while since I saw this thread, but I did take this advice.  I was able to break the data set down from the national level to the state level, and each state geocoded just fine.  Thanks for the advice.