Is there a way to customize or extend Collector to geoprocess newly added features?

10-19-2016 05:08 PM
WillAllender3
New Contributor II

I don't want my field users to waste time, or introduce typing errors, by entering data that is already available in other layers in my enterprise GDB. I also want to process some if-then scenarios.

Agricultural use case work flow:

1. User draws a polygon and enters 12 attributes, all from drop-down lists (domains), except for acres, which can be hand-entered in special cases.

2. For each new polygon feature, I want to know the County that contains the centroid of the polygon (for reporting purposes).  I can pull that from my Counties layer by using a spatial join or select-by-location and the arcpy.da module.

3. The polygon may or may not be inside some jurisdictional boundaries.  If it is IN, do geoprocess 1.  If it is OUT, do geoprocess 2.

4. The user may have entered a specific number of acres as an attribute of the polygon.  If they entered it, use that value.  If they didn't, calculate acres from the polygon's shape area.

etc., etc. etc.

I currently have a Python script running every 10 minutes directly against the enterprise feature class in SQL Server, but I worry about race conditions if Collector activity scales up (it currently takes 6 minutes to process 3,400 features).

9 Replies
ScottFierro2
Occasional Contributor III

Depending on your SQL abilities, this shouldn't be overly difficult to accomplish, done as either a trigger or a stored procedure depending on your needs and performance requirements. I'd lean toward a trigger, but we have seen mixed results with those in conjunction with Collector usage.

In the trigger scenario, build it as an ON INSERT trigger so the actions are performed as soon as the user commits the record to the DB, eliminating the batch-processing concerns. https://msdn.microsoft.com/en-us/library/ms189799.aspx
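
A minimal sketch of that trigger shell, assuming a non-versioned feature class with a geometry-type SHAPE column (the names dbo.FIELD_POLYGONS, OBJECTID, and PROCESSED_DATE are all hypothetical):

```sql
-- Hypothetical AFTER INSERT trigger on a non-versioned feature class.
-- All table/column names are assumptions for illustration.
CREATE TRIGGER trg_FieldPolygons_Insert
ON dbo.FIELD_POLYGONS
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Join on the "inserted" pseudo-table rather than assuming a
    -- single-row insert; triggers fire once per statement, not per row.
    UPDATE fp
    SET    fp.PROCESSED_DATE = GETDATE()
    FROM   dbo.FIELD_POLYGONS fp
    INNER JOIN inserted i ON fp.OBJECTID = i.OBJECTID;
END;
```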

For the trigger action, use STCentroid() to resolve the centroid piece (https://msdn.microsoft.com/en-us/library/bb933847.aspx), and then it seems like you'd want STIntersects() for the rest, which solves #2 and #3. https://msdn.microsoft.com/en-us/library/bb933899.aspx
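
Something along these lines, strictly as a sketch (the COUNTIES and JURISDICTIONS tables, the Geoprocess1/Geoprocess2 procedures, and all column names are made up; SHAPE is assumed to be the SQL Server geometry type, since STCentroid() isn't available on geography):

```sql
-- #2: stamp each new polygon with the county containing its centroid.
UPDATE fp
SET    fp.COUNTY_NAME = c.NAME
FROM   dbo.FIELD_POLYGONS fp
INNER JOIN inserted i ON fp.OBJECTID = i.OBJECTID
INNER JOIN dbo.COUNTIES c
        ON c.SHAPE.STIntersects(i.SHAPE.STCentroid()) = 1;

-- #3: branch on whether the new polygon falls inside a jurisdiction.
IF EXISTS (SELECT 1
           FROM inserted i
           INNER JOIN dbo.JURISDICTIONS j
                   ON j.SHAPE.STIntersects(i.SHAPE) = 1)
    EXEC dbo.Geoprocess1;   -- hypothetical stored procedure
ELSE
    EXEC dbo.Geoprocess2;   -- hypothetical stored procedure
```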

#4 is just an IF statement that checks the field. Assuming the user is unable to enter a blank space (the field should use a numeric, integer, etc. data type, not varchar), you can accept any entry they make, so all you actually need is an IS NULL check; if that is true, compute the value with STArea(). https://msdn.microsoft.com/en-us/library/bb933923.aspx
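
For #4 the check could look like this (again a sketch with hypothetical names; the 43,560 divisor assumes the coordinate system's linear unit is feet, so use 4,046.86 instead if it is meters):

```sql
-- #4: if the user left ACRES null, derive it from the shape area.
UPDATE fp
SET    fp.ACRES = i.SHAPE.STArea() / 43560.0   -- sq ft per acre (assumed units)
FROM   dbo.FIELD_POLYGONS fp
INNER JOIN inserted i ON fp.OBJECTID = i.OBJECTID
WHERE  i.ACRES IS NULL;
```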

JadeFreeman
Occasional Contributor III

We've implemented many triggers and this can work, but you need to be really careful if you are using archiving or versioning.  We've had less success in those scenarios and are starting to look at SOIs (Server Object Interceptors).

ScottFierro2
Occasional Contributor III

Right, our use of triggers has been great, and we use archiving on all our editable tables within Collector. The only hiccups we hit were trying to use triggers to perform system calls for things like pulling the system time.

JadeFreeman
Occasional Contributor III

I don't think that would be possible in Collector itself, but I imagine it would be with a Server Object Interceptor:

  • Post-processing responses—Additional information from separate business systems not supported by ArcGIS Server could be added to outgoing responses in order to join spatial data with other types of business intelligence data.

I'll be interested to hear what others say!

ScottFierro2
Occasional Contributor III

We've been reviewing these, but thus far we haven't pursued using them: in part because we haven't done the back-end version upgrade needed to make use of them, and in part because our experience has shown the DB will always process far faster when it can do things natively. We have some instances where a Windows Scheduled Task calls a Python script that performs some tasks, submits a call to a DB stored procedure, gets the returns, and performs final tasks back in Python, just because of the performance gains from letting the DB do what it's designed to do.
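
As a rough illustration of that division of labor (the names here are entirely hypothetical, not our actual procedure), the DB side might be a stored procedure like this, which the scheduled Python script calls and then post-processes:

```sql
-- Hypothetical stored procedure called by a scheduled Python script.
-- The set-based work stays in the database, where it runs fastest.
CREATE PROCEDURE dbo.ProcessNewPolygons
AS
BEGIN
    SET NOCOUNT ON;

    -- Fill in anything the clients left unset (assumed column names).
    UPDATE dbo.FIELD_POLYGONS
    SET    ACRES = COALESCE(ACRES, SHAPE.STArea() / 43560.0),
           PROCESSED_DATE = GETDATE()
    WHERE  PROCESSED_DATE IS NULL;

    -- Hand the processed rows back so Python can do its final tasks.
    SELECT OBJECTID, COUNTY_NAME, ACRES
    FROM   dbo.FIELD_POLYGONS
    WHERE  PROCESSED_DATE >= DATEADD(MINUTE, -10, GETDATE());
END;
```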

Again, there are lots of variables, and it all depends on need, architecture, expertise, etc., but it does sound like an SOI could handle the task.

JadeFreeman
Occasional Contributor III

I do agree; I would prefer to use the database if possible so that, among other reasons, the same business logic is enforced regardless of client.

JadeFreeman
Occasional Contributor III

The biggest reason we are looking at them is that different ArcGIS clients (Collector, ArcMap, AGOL/Portal web maps) seem to handle sending new features and updates to the database differently.  For instance, ArcMap and the web maps seem to create a new blank record in the DB as soon as you add the feature to the map, whereas Collector only does this once the user hits Submit, so the record will contain data for any attribute fields the user entered.  Depending on what you are trying to do, it can be tricky to trap specific scenarios.

It would be nice if there were documentation out there on how different clients handle CRUD operations, but unfortunately I haven't been able to find any!

ScottFierro2
Occasional Contributor III

Couldn't agree more. It drives us nuts: with some web apps we had tried to enforce a NOT NULL scenario, which is impossible since the client writes a geometry-only record and the DB kicks it out. We hadn't considered SOIs as a workaround for that, but we'll have to look at them now.
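
One trigger-side workaround, strictly a sketch (CROP_TYPE and GROWER_NAME are made-up columns), would be to tolerate the geometry-only stub row at insert and enforce required fields on update instead, once the client has actually written attributes:

```sql
-- Hypothetical AFTER UPDATE trigger: let the geometry-only stub row in,
-- then enforce required fields once attribute values start arriving.
CREATE TRIGGER trg_FieldPolygons_Update
ON dbo.FIELD_POLYGONS
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Reject rows that now carry attributes but still lack a required one.
    IF EXISTS (SELECT 1
               FROM inserted
               WHERE GROWER_NAME IS NOT NULL
                 AND CROP_TYPE IS NULL)
    BEGIN
        RAISERROR('CROP_TYPE is required once attributes are entered.', 16, 1);
        ROLLBACK TRANSACTION;
    END;
END;
```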

WillAllender3
New Contributor II

Thanks guys!  Good discussion.  And yes, our internal discussions have revolved around 1) Python batch processing, 2) database triggers, 3) Server Object Interceptors, and even a wild idea, 4) making use of our under-utilized GeoEvent Server.  We're launching next week with Python batch processing but will take a closer look at DB triggers as a scalability upgrade. We are archiving our feature classes, so I will do a deep dive on testing this.  Thanks.