Add Data Store database hosted on Amazon Web Services

11-07-2023 02:12 PM
DavidEvans5
New Contributor III

Hi,

I'm trying to do something that may not be possible in our Enterprise Portal (version 11.0). We have a feature service that I would like to publish with a table join to a table containing live customer billing data hosted in an Amazon Web Services (AWS) database. This AWS database is not managed by me or my organization; I just have a URL and credentials that I've used to connect via the Microsoft OLE DB connection for SQL Server.

I'm trying to add the database as a data store to our Portal in ArcGIS Pro, registering it with a machine we have set up for hosting that is federated with our Portal, but when I do this I get an error informing me there was an error validating with the server (see attached image).

I've read through the documentation for adding a database data store item on Esri's website, which says that I need to have an .sde database connection file. As mentioned above, I don't have direct access to the AWS site, and before I go down a rabbit hole with the AWS/billing software's IT department, I wanted to see if anyone here could advise me on whether adding an AWS database as a data store is even possible in Enterprise, and, if not, whether there's a different method we could employ to allow us to publish a feature service linked to that billing data.

Thank you all for any assistance/expertise you can provide

8 Replies
Scott_Tansley
MVP Regular Contributor

It sounds like your other databases and ArcGIS Enterprise are not in AWS?  If that's the case then technically it would be possible if you create an SDE file.  However, it's not good architectural practice.  Being very blunt, it's one of those things where it may look fun to jump into the river from a bridge, but it's most likely very dangerous.

What would happen is that your ArcGIS Server would receive a request, get the data from your local feature class, and then have to go across the internet to AWS to get the other data.  While ArcGIS Server waits for that response, it's tied up, potentially causing a bottleneck for other requests.  

From a performance and reliability perspective you'd be much better having all the data as close to the ArcGIS Server as possible to maximise throughput.  Can you build a process that refreshes the data from AWS into your local SQL Server on a regular basis?  You'd get better performance and an easier Esri config if you can.
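For what it's worth, a nightly refresh like this is usually a "stage then swap" load: pull everything from the remote table into a staging copy, then rename it into place so readers never see a half-loaded table. Here's a minimal sketch of the pattern, using Python's standard-library sqlite3 as a stand-in for both the AWS source and the local SQL Server (a real job would use a SQL Server driver such as pyodbc instead, and the `billing` table and column names here are invented for illustration):

```python
import sqlite3

def refresh_billing_table(src_conn, dst_conn):
    """Nightly refresh: pull all rows from the source billing table,
    rebuild a local staging copy, then swap it into place so readers
    never see a half-loaded table. Returns the number of rows copied."""
    rows = src_conn.execute(
        "SELECT account_id, latitude, longitude, balance FROM billing"
    ).fetchall()

    cur = dst_conn.cursor()
    cur.execute("DROP TABLE IF EXISTS billing_staging")
    cur.execute(
        "CREATE TABLE billing_staging "
        "(account_id INTEGER PRIMARY KEY, latitude REAL, "
        " longitude REAL, balance REAL)"
    )
    cur.executemany("INSERT INTO billing_staging VALUES (?, ?, ?, ?)", rows)
    # Swap: replace the live table with the freshly loaded copy.
    cur.execute("DROP TABLE IF EXISTS billing")
    cur.execute("ALTER TABLE billing_staging RENAME TO billing")
    dst_conn.commit()
    return len(rows)
```

The swap step is what keeps the load safe to run while the service is in use; on SQL Server you'd do the same with a rename or partition switch rather than DROP/ALTER.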

 

Scott Tansley
https://www.linkedin.com/in/scotttansley/
DavidEvans5
New Contributor III

Thank you Scott for your candid and very illuminating response. I suspected that the solution might be to develop some kind of script to take an image of the AWS table every night and write that to the SQL geodatabase within our enterprise. I have not had any experience with creating this type of process myself. You wouldn't happen to know of any resources (tutorials, sample code, etc.) that might help me get started on this track? I'm hoping I might find something within the Esri Community forums, though it can sometimes be a task trying to narrow down search terms to find what you're looking for!

Scott_Tansley
MVP Regular Contributor

Hi,  Sorry, I don't develop very much these days as I'm focussed on the infrastructure side of things.  My go-to at this point would actually be something like Data Interop or FME rather than a script.  The visual design tools are a lot easier to comprehend for non-developers and are effectively self-documenting, so that's the route I recommend to most clients.  🙂  

Scott Tansley
https://www.linkedin.com/in/scotttansley/
DavidEvans5
New Contributor III

Thanks again Scott. I have had very limited exposure to both the Data Interoperability extension and FME but would love to learn more about their capabilities. I have been trying out the approach you recommended and have created a ModelBuilder model that can copy the table from the AWS server to our local geodatabase. My next challenge/question to the community is how do I automate that ModelBuilder model to make it run every night? Is that even possible with ModelBuilder in ArcGIS Enterprise?

I'm also abandoning my original plan to publish a point feature service with a table join (to the table that's now been copied into our SQL database) in favor of a database view, something I also have very little experience with, leading to a second community posting here. I think this makes more sense since the table will always have the lat/long coordinates for each account location, while the point feature would require constant updating.

Again anyone who created a process like this that can share their thoughts/recommendations will have my sincere appreciation and kudos.

Scott_Tansley
MVP Regular Contributor

Assuming you've made the move to Pro then you can schedule models:

https://pro.arcgis.com/en/pro-app/latest/help/analysis/geoprocessing/modelbuilder/scheduling-a-model... 

If you're saying the table will always have lat/longs, then it sounds like you're talking about using an X/Y Event layer.  These have no spatial indexing and perform very badly.  It would likely perform worse than the AWS route you suggested originally.  If the lat/longs are in a geometry/geography column in the database, then you'll be fine as long as you have a database indexing strategy in place.
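To make the distinction concrete: the geometry value is just the two doubles combined into a single spatial object that the database can index. A one-line Python sketch, with WKT text standing in for a real geometry/geography value (note WKT coordinate order is X Y, so longitude comes first):

```python
def point_wkt(longitude: float, latitude: float) -> str:
    """Combine plain lat/long doubles into an OGC WKT point string,
    the kind of value a geometry/geography column would hold.
    WKT coordinate order is X Y, i.e. longitude before latitude."""
    return f"POINT ({longitude} {latitude})"
```

Once the values live in a geometry/geography column instead of two doubles, the database can build a spatial index over them, which is what the X/Y Event layer approach can never give you.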

Scott Tansley
https://www.linkedin.com/in/scotttansley/
DavidEvans5
New Contributor III

Yikes, and I was thinking that I had hit on a better solution, not a worse one. It doesn't look like the table being copied from AWS to our geodatabase has spatial indexing (at least not for the lat/long coordinate fields), since I'm using the "Export Table" geoprocessing tool and the lat/long coordinates are being stored in regular Double format, and there's no "*" next to the field name like there is for the OBJECTID.

Is there another workaround, such as adding spatial indexing to the table after the fact (if that's possible), or even adding duplicate lat/long fields to the table that the coordinates could be copied into?

Thank you, Thank you, Thank you for all the advice so far.

MarceloMarques
Esri Regular Contributor

If you are working with an Esri geodatabase, the easy solution is to take a database backup and restore it on premises, or use ArcGIS tools to copy the feature class or table from geodatabase A to geodatabase B. You can even use a file geodatabase as an intermediate if the source data is too large to be copied over the internet directly, in the case where your geodatabase is in AWS. There are plenty of tools in ArcGIS to do this; look in the Geoprocessing tools in ArcGIS Pro. To automate the process you can use ArcGIS Pro ModelBuilder, a Python ArcPy script, the ArcGIS Pro Data Interoperability extension, or Safe Software FME Workbench.

| Marcelo Marques | Principal Product Engineer | Esri |
| Cloud & Database Administrator | OCP - Oracle Certified Professional |
I work with Enterprise Geodatabases since 1997.
"I do not fear computers. I fear the lack of them." — Isaac Asimov
Scott_Tansley
MVP Regular Contributor

What you have described to me is an aspatial table in a database (not a geodatabase) that has attributes and spatial properties in the form of two double fields, one for lat and one for long.  This needs to be consumed in the GIS.  In a pure desktop world, we would have introduced this to the Table of Contents as an X/Y Event layer, and then, to get some form of performance, you would right-click on the layer and 'Export As'.

In logical steps, that's what you need to do here.  Two doubles cannot be indexed spatially; they're literally just numbers, not shapes or geometries.  There are many ways of doing all of this.  

If you have someone skilled in SQL then you could create a table that has the same schema as AWS, but has a geography/geometry column.  Inserts/Updates would then update the 'shape' with the attributes of your Lat/Long.  It's just SQL.  You can then use a Query Layer.
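A minimal sketch of that trigger idea, using Python's standard-library sqlite3 as a stand-in (the text 'shape' column holding WKT stands in for a real geography/geometry type, the table and column names are invented, and SQL Server trigger syntax would differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Same schema as the source table, plus a 'shape' column
    -- (TEXT here, standing in for a real geometry/geography type).
    CREATE TABLE billing (
        account_id INTEGER PRIMARY KEY,
        latitude   REAL,
        longitude  REAL,
        shape      TEXT
    );
    -- Keep 'shape' in sync on insert: WKT order is X Y, longitude first.
    CREATE TRIGGER billing_shape_ins AFTER INSERT ON billing
    BEGIN
        UPDATE billing
        SET shape = 'POINT (' || NEW.longitude || ' ' || NEW.latitude || ')'
        WHERE account_id = NEW.account_id;
    END;
""")

# Rows arrive with only the two doubles; the trigger fills in the shape.
conn.execute("INSERT INTO billing (account_id, latitude, longitude) "
             "VALUES (1, 40.0, -75.0)")
row = conn.execute("SELECT shape FROM billing WHERE account_id = 1").fetchone()
```

In a real SQL Server setup you'd add a matching AFTER UPDATE trigger too, so edits to the lat/long also refresh the shape, and then point a Query Layer at the table.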

If you've got a developer, you could look to use Python to do the same thing.  Theoretically, you can do it with ModelBuilder as well, but I haven't touched it in years...  Sorry.

Otherwise, and the way I always recommend is to use Data Interop or FME.  

My skillset these days is architecture and IaaS/Cloud and deploying Esri into that, rather than writing scripts etc.  😞  As much as I miss that.  I'm probably not the best person to help you here, but X/Y Event layers do not perform well in ArcGIS Server and I cannot recommend them.  They create bottlenecks and performance issues.  

 

Scott Tansley
https://www.linkedin.com/in/scotttansley/