Preserving original FID values when using Table to Table python tool

02-25-2019 11:13 AM
JacobHolcombe
New Contributor

I am trying to move a table from an ODBC database (from which I can extract tables but do nothing else) into a file geodatabase so that I can join it to an existing layer. The problem is that when I move the table from the ODBC source to my file geodatabase, the field "Cust_No", which is the unique identifier field in the source table, gets rewritten during the transfer. Rather than preserving the existing values, the tool renumbers them so the final record's value equals the number of records transferred. So if the last value was, say, 18341 but there were only 16000 records, the field is reset to run 1-16000 in order. Unfortunately, this is the field I need to do my table join on, and as you can imagine that messes things up.

When I run the Table to Table tool from the toolbox, I can correct for this issue in the Field Map by adding a new field and mapping the original "Cust_No" field into it, thereby preserving the values I need. If I export this as a Python snippet, I can still get the result I need.

Here is my question, though: I am being picky, and the resulting string is long and ugly. The real question is, from a Python script, how do I move the table from its source into my geodatabase for future manipulation while preserving the original values in the "Cust_No" field? I saw some material on field mappings, but I couldn't get it to work; I kept getting an error stating "No such file or directory: '(working file path here)'".
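For what it's worth, here is a sketch of how the exported field-map string can be replaced with an `arcpy.FieldMappings` object built in code. The paths, table name, and output field name below are placeholders for illustration, not values from the original post; this assumes an ArcGIS environment where `arcpy` is available.

```python
import arcpy

# Hypothetical paths -- substitute your ODBC connection/table and geodatabase.
in_table = r"C:\connections\odbc_source\Customers"
out_gdb = r"C:\data\Project.gdb"
out_name = "Customers"

# Start from the default field mappings for the source table.
fms = arcpy.FieldMappings()
fms.addTable(in_table)

# Add an extra output field fed by the original Cust_No column,
# so its values are copied instead of being renumbered.
fm = arcpy.FieldMap()
fm.addInputField(in_table, "Cust_No")
out_field = fm.outputField
out_field.name = "Cust_No_orig"       # hypothetical output field name
out_field.aliasName = "Cust_No_orig"
fm.outputField = out_field
fms.addFieldMap(fm)

# Pass the FieldMappings object directly as the field_mapping parameter.
arcpy.TableToTable_conversion(in_table, out_gdb, out_name,
                              field_mapping=fms)
```

On the "No such file or directory" error: that message usually means one of the paths handed to the tool does not exist as written (for example, a misspelled workspace or a connection file that is not where the script expects). Checking each path with `arcpy.Exists(path)` before running the tool is a quick way to narrow that down.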
