Updating SQL Server table with Geocodes

06-22-2017 01:07 PM
EricAnderson14
New Contributor III

In our environment we have ArcGIS 10.5 in a federated configuration. For security reasons, we deliver our web maps through Portal for ArcGIS instead of ArcGIS Online. Some of our maps are based on SQL Server database tables. One of those tables, which comes from an outside source, has columns for latitude and longitude, but the source is only able to geocode about 60 percent of the addresses. Through some experimentation, we have figured out that we can geocode the remainder using the Geocode Addresses tool in ArcMap. The process we currently go through is as follows:

  1. Using SSIS, select all records with a NULL value in the Lat_Long field and write them to a CSV file.
  2. Using a Python script, run the Geocode Addresses (Geocoding) tool to geocode the selected addresses against a composite address locator. This produces a shapefile.
  3. Using another Python script, extract the coordinates from the shapefile and write them to a CSV file with the Export Data tool.
  4. Using SSIS and the CSV file from step 3, update the SQL table rows that were missing geocodes so they now have values.
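For anyone following along, the round trip in steps 1 and 4 can be sketched in a single Python script. This is a minimal sketch only: sqlite3 stands in for the SQL Server connection (in production it would be pyodbc), the table and column names (Addresses, Id, Address, Lat, Lon) are hypothetical, and a placeholder function stands in for the ArcMap geocoding in steps 2-3:

```python
import csv
import os
import sqlite3
import tempfile

# Stand-in for SQL Server; in production this would be a pyodbc connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Addresses (Id INTEGER PRIMARY KEY, "
             "Address TEXT, Lat REAL, Lon REAL)")
conn.executemany(
    "INSERT INTO Addresses (Id, Address, Lat, Lon) VALUES (?, ?, ?, ?)",
    [(1, "100 Main St", 45.50, -122.60),
     (2, "200 Oak Ave", None, None),    # needs geocoding
     (3, "300 Pine Rd", None, None)])   # needs geocoding

# Step 1: export rows with NULL coordinates to a CSV for the geocoder.
rows = conn.execute(
    "SELECT Id, Address FROM Addresses WHERE Lat IS NULL").fetchall()
csv_path = os.path.join(tempfile.mkdtemp(), "to_geocode.csv")
with open(csv_path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Id", "Address"])
    writer.writerows(rows)

# Steps 2-3 would run arcpy.geocoding.GeocodeAddresses() against the
# composite locator here; this hypothetical lookup stands in for it.
def geocode(address):
    return {"200 Oak Ave": (45.51, -122.61),
            "300 Pine Rd": (45.52, -122.62)}[address]

# Step 4: write the new coordinates back to the table.
with open(csv_path, newline="") as f:
    for row in csv.DictReader(f):
        lat, lon = geocode(row["Address"])
        conn.execute("UPDATE Addresses SET Lat = ?, Lon = ? WHERE Id = ?",
                     (lat, lon, row["Id"]))
conn.commit()

remaining = conn.execute(
    "SELECT COUNT(*) FROM Addresses WHERE Lat IS NULL").fetchone()[0]
print(remaining)  # prints 0
```

With pyodbc in place of sqlite3, the SSIS packages on both ends would no longer be needed.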

I'm wondering if anyone has come up with an easier way to do this.

3 Replies
MicahBabinski
Occasional Contributor III

Hi Eric,

How many records have null Lat_Long values? If it's not too many, I recommend using pyodbc to select the address column(s) from the records in your SQL table with null Lat_Long values. Then, for each row returned, write the address fields to an in_memory table with a single row. Geocode that table, outputting to an in_memory feature class with one point feature. Finally, read the Lat/Long values from the feature class with an arcpy.da.SearchCursor and update your SQL Server table with those values.

The advantage is that all the work happens in memory, so you don't have to deal with an actual CSV file and shapefile sitting on disk.
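The select-geocode-update loop above might look roughly like this. A sketch under stated assumptions: sqlite3 stands in for the pyodbc connection, the table/column names (Addr, Id, Address, Lat, Lon) are hypothetical, and the arcpy in_memory work is represented by a placeholder function with the real calls noted in comments:

```python
import sqlite3

def geocode_in_memory(address):
    # In ArcGIS this would write `address` to an in_memory table, run
    # arcpy.geocoding.GeocodeAddresses() against the composite locator
    # into an in_memory feature class, and read the point back with
    # arcpy.da.SearchCursor(fc, ["SHAPE@XY"]). Here a lookup stands in.
    results = {"200 Oak Ave": (45.51, -122.61)}
    return results.get(address)  # None if the address failed to geocode

# sqlite3 stands in for pyodbc.connect(...) to SQL Server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Addr (Id INTEGER PRIMARY KEY, "
             "Address TEXT, Lat REAL, Lon REAL)")
conn.executemany("INSERT INTO Addr VALUES (?, ?, ?, ?)",
                 [(1, "100 Main St", 45.50, -122.60),
                  (2, "200 Oak Ave", None, None)])

# Select only the un-geocoded rows, geocode each, write the result back.
for rec_id, address in conn.execute(
        "SELECT Id, Address FROM Addr WHERE Lat IS NULL").fetchall():
    point = geocode_in_memory(address)
    if point:
        conn.execute("UPDATE Addr SET Lat = ?, Lon = ? WHERE Id = ?",
                     (point[0], point[1], rec_id))
conn.commit()
```

Nothing touches the file system, and rows the locator can't match are simply left NULL for a later pass.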

Hope this helps!

Micah

EricAnderson14
New Contributor III

Micah,

Thank you so much for your response. I'll be dealing with anywhere from 30,000 to 80,000 rows, but your concept definitely bears looking into.
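At that row count, geocoding one address per call may be slow, so one variation worth considering is grouping the NULL-coordinate rows into batches and feeding each batch to the geocoder in one call. A minimal chunking helper, as a sketch (the batch size of 4 is just for illustration):

```python
def chunked(rows, size):
    """Yield successive lists of at most `size` items from `rows`."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # emit the final, possibly short, batch
        yield batch

batches = list(chunked(range(10), 4))
print([len(b) for b in batches])  # prints [4, 4, 2]
```

Each batch would then become one in_memory table and one geocoding call, instead of one call per row.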

I really appreciate your input.

- Eric -

GiacomoFavaron
New Contributor III

Eric,

Would you mind giving an update on whether the in-memory solution worked for you? I have to do something similar with a similar number of rows.

Thank you
