POST
I spent a couple of weeks with Esri Technical Support to determine what the problem was.

Short answer: You MUST install the corresponding ArcGIS Desktop (ArcMap) on the same server, but you don't need to license it. The issue has been resolved in ArcGIS Server/Enterprise 10.5.1.

Long answer: We couldn't duplicate it until I created a test VM and ran through many iterations of installing, uninstalling, upgrading, and VM snapshotting. I eventually found that everything works if ArcMap is installed and fails if ArcMap is uninstalled. Apparently the ArcMap installation includes some DLLs and/or registry settings needed by script/custom evaluators that the ArcGIS Server install does not include. I tried doing some filesystem and registry comparisons before and after the installs to find the differences and create a small "patch", but that was taking too much time. For now, a full 3GB install of unnecessary 32-bit DLLs and other files is still needed. They said they would respond to this thread, but I guess it fell off their radar.

Answer from Esri after I identified the problem (July 20, 2017): The defect that we identified, BUG-000101482 ("A network analysis service fails to solve routes when the cost attribute uses a script evaluator with AttributeValueByName, if the server machine only has ArcGIS for Server installed"), is the root cause of the issue according to my investigation. Although the synopsis is titled differently and the attribute involved is a cost attribute, the problem caused by this defect occurs when you use your script evaluator for ZDefaultNoTurns. The resolution to this defect is in 2 parts: 1.) install ArcMap on the same machine as the ArcGIS Server machine, OR, 2.) upgrade ArcGIS Server on the production machine to version 10.5.1.
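For anyone attempting the same before/after comparison, here is a minimal sketch of the filesystem side of it in Python. The snapshot directories are hypothetical, and the registry half of the comparison (which would need winreg on Windows) is not shown:

```python
import os

def snapshot(root):
    """Record every file path under root, relative to root."""
    files = set()
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            files.add(os.path.relpath(os.path.join(dirpath, name), root))
    return files

def new_files(before, after):
    """Files present in the 'after' snapshot but not in 'before' --
    i.e., what an install added."""
    return sorted(after - before)
```

Take one snapshot before installing ArcMap and another after, then diff; in practice the added-file list was large enough that building a small patch from it wasn't worth the effort.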
01-18-2018 07:43 AM

POST
I have a complex multi-modal network dataset with some scripted evaluators defined for edges and default turns. The route generation and evaluators work perfectly on our development ArcGIS Server 10.3.1. However, when deployed to a hardened production server, the default turn evaluators fail miserably with the following error in the MapServer log:

System.Runtime.InteropServices.COMException (0x80042270): The evaluator failed to return a value. [Attribute: NoBargeToTruck, Default Turns, OID = -1, EID = -1]. Network element evaluator error. [Script Control Error -2147352319].

In the Server/framework/etc/service/logs/service.log, I am seeing this Java stack trace:

:ExecGroup-1488:err:Exception in thread "SOThread" java.lang.RuntimeException: java.lang.NoSuchFieldError: a
:ExecGroup-1488:err: at com.esri.arcgis.discovery.servicelib.impl.SOThread.run(SOThread.java:476)
:ExecGroup-1488:err:Caused by: java.lang.NoSuchFieldError: a
:ExecGroup-1488:err: at com.esri.arcgis.interop.StdObjRef.b(Unknown Source)
:ExecGroup-1488:err: at com.esri.arcgis.interop.Cleaner.releaseAllInCurrentThread(Unknown Source)
:ExecGroup-1488:err: at com.esri.arcgis.discovery.servicelib.impl.SOThread.b(SOThread.java:1178)
:ExecGroup-1488:err: at com.esri.arcgis.discovery.servicelib.impl.SOThread.run(SOThread.java:414)

I've turned off all restrictions except for one and, through various trials, I have found that this works:

Body:
Dim prohibit
prohibit = false
Value:
prohibit

But this fails:

Body:
Dim prohibit
prohibit = false
Dim test
test = Turn.EID
Value:
prohibit
Accessing ANY of the network elements (Turn, fromEdge, toEdge) fails with the COMException, but only for default turns. Scripts on edges work just fine. The server is Windows 2008 R2 64-bit, and the network dataset is served from a file geodatabase. Any thoughts?
06-22-2017 03:24 PM

POST
If anyone is looking at this post with a similar problem, I have discussed it with Esri Technical Support at the Dev Summit and confirmed that there is a bug that appears to be fixed in 10.4. If any of the layers/tables you are exporting to Runtime Content has a GlobalID column, then the export may fail and/or result in an incomplete runtime geodatabase. The only workaround is to delete the GlobalID columns before exporting. I have not yet confirmed that the export succeeds in 10.4 or any versions of ArcGIS Pro.
03-14-2016 07:47 AM

POST
Al, I wasn't directly involved in the development, but I took it upon myself to run the app through the instrumentation. No leaks were reported by the Leak module. However, running through the Allocations module showed that out of the 250+MB application footprint, more than 150MB was consumed by persistent UILabels. It turns out that the UIPicker we were using was maintaining strong references to the labels as they were scrolled into view and for each instance of the view that was shown. Since we were using this to allow the user to pick from coded domain values, this racked up pretty quickly when creating or modifying attribute information. We changed some of the code around the picker and have removed that specific cause of memory pressure warnings. So far, no more issues have been reported. Thanks!
11-13-2014 09:56 AM

POST
We have been working on a native iOS app for quite a while and have become frustrated trying to track down the causes of memory pressure warnings. We are not able to replicate the issue consistently enough to effectively trace/snapshot the application.

One of our clients was using the app for a couple of weeks, creating over 200 detailed inspections with attachments. Then, suddenly, they started getting memory purge warnings every 2-5 inspections. This is the only application running on the device .. all other applications have been terminated. The memory warnings persist even after the application is uninstalled and reinstalled .. which should have removed any data associated with the app. When the memory warnings occur, some devices have 750MB (out of 1GB) of system memory available, others only have 50-200MB. We can understand the warnings when there is only 50MB free, but 750MB?

We can log in as this user on a device with identical specs, view the same data, and perform identical workflows with no issues. We are fairly certain that it's not related to the amount of data or the specific workflow. We are not forcibly caching anything in memory or using large collections. What we can't identify is whether there is anything the Runtime SDK is doing behind the scenes that may be caching data or storing some other information that is loaded into memory.

The application specs are:
- Runtime SDK 10.2.4 against ArcGIS Server 10.2.2
- 4 AGSFeatureLayers, 1 AGSLocalTiledLayer, 1 AGSDynamicMapServiceLayer, 1 AGSTiledLayer
- search for features using AGSQueryTask
- AGSGDBFeatureTable and a SQLite geodatabase with about 30,000 point features
- syncs data to AGSFeatureLayer using applyEditsWithFeaturesToAdd and AGSAttachmentManager
- iOS 7.1.2 and iOS 8+
11-04-2014 08:10 AM

POST
All of the shapes (point features) are valid. I haven't found a reason yet for the incomplete exports. However, I did find a temporary work-around.

Through some developer sleuthing and curiosity, I found a reference to an SDE connection to the SQLite .geodatabase file. To my utter surprise, it worked. I opened a connection to the .geodatabase file in ArcCatalog as if it were an enterprise geodatabase. I simply deleted the 3 incomplete feature classes and copied them from the source database. All features are accounted for and working without issues so far on the devices.

It appears Esri is still in the development stages for this type of connection, which is why they haven't made it public .. if they plan on doing so at all. While my trials did work, plenty of errors popped up when trying some typical "enterprise" operations just for giggles. I would only recommend this option to those who are very technical and understand the internal geodatabase structure and system tables. I hesitate to post the details of the method I used unless someone at Esri gives the "OK" to do so. As with any undocumented method, there is a real danger of corrupting data.
07-02-2014 11:27 AM

POST
No. In fact, I had to copy the 3 feature classes to a new geodatabase before I could even get this far. They were participating in a few relationship classes. The source file-geodatabase has only the 3 point feature classes. No domains, relationships, or other special data. Just a bunch of points and attributes. I've been using and developing against Esri products since 1989. I've managed to find a few bugs and work around them, but this problem has me temporarily stumped.
06-19-2014 10:31 AM

POST
Unfortunately, that resulted in the same truncated export. I watched it more closely this time. The process seems to go something like this:

1. Create the schema (around 80KB)
2. Fill in the feature rows one feature class at a time (grows to around 70MB)
3. Create a new MXD named "CacheTemp.mxd"
4. Pause for a few seconds
5. The .geodatabase then grows to about 100MB and immediately shrinks to the amount it grew by (around 35MB)

Some kind of post-processing step is clearing out 50% of my features. The REALLY strange part is the feature counts of the original compared to the export:

Layer       Original  Exported
Meters      61184     30184
Lights      15414     7414
Structures  85498     42498

The export feature count is exactly: (INT(numFeatures / 2000) * 1000) + MOD(numFeatures, 1000). Something is very odd about that.
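The counts in the table line up exactly with that formula; a couple of lines of Python confirm it against all three feature classes:

```python
def exported_count(num_features):
    """Observed truncation pattern from the export: 1000 surviving features
    per full block of 2000 source features, plus the count modulo 1000."""
    return (num_features // 2000) * 1000 + num_features % 1000

# Source counts vs. exported counts from the table above.
for source, exported in [(61184, 30184), (15414, 7414), (85498, 42498)]:
    assert exported_count(source) == exported
```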
06-19-2014 09:53 AM

POST
I have 3 feature classes with approximately 162,000 features between them. I'm using ArcGIS Desktop/Server 10.2.2. No matter what method I use, I get a different number of features in the runtime .geodatabase:

- Using ArcMap "Share as ... Runtime Content", I end up with about 90% of the source features.
- Using the ArcToolbox "Create Runtime Content" tool, I only get about 50% of the features.
- Publishing to ArcGIS Server 10.2.2 and using the "Create Replica" REST endpoint, I get about 98% of the features exported.

I've zoomed in to show a subset of the features, and the export is limited to those extents (somewhat as expected). However, zooming out to "the world", I still end up missing between 2-10% of the source features. What is going on???

I've tried rebuilding the spatial index and exporting from both a file geodatabase and a SQL Server geodatabase. I've also noticed that during the export, the .geodatabase will shoot up to 90MB before being reduced to 35MB. If I make a copy before the "shrink", I see that one feature class has all 85498 features; afterwards it only has 42498. What is happening? I'm at a loss. Short of writing my own SQLite exporter, is there something I can do to RELIABLY export this content?
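Since the runtime .geodatabase is a SQLite file under the hood, one way to check what actually landed in an export is to count rows directly. This is a rough sketch using Python's sqlite3 module; the table names passed in are hypothetical stand-ins for whatever your feature classes are called:

```python
import sqlite3

def feature_counts(geodatabase_path, tables):
    """Count rows in each named table of a runtime .geodatabase
    (a SQLite file), returning {table: row_count}."""
    conn = sqlite3.connect(geodatabase_path)
    counts = {}
    for table in tables:
        # Table names can't be bound as parameters; they must come
        # from a trusted, caller-supplied list.
        counts[table] = conn.execute(
            'SELECT COUNT(*) FROM "%s"' % table).fetchone()[0]
    conn.close()
    return counts
```

Comparing these counts against the source feature classes before and after the "shrink" step would pinpoint exactly which post-processing pass drops the rows.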
06-19-2014 08:59 AM

POST
While you can't modify the Definition Query that the layer was published with, you can still set a definition expression, which gets appended to the default. We have an iOS app that works against 10.1 and 10.2 REST services and sets additional filters on the layer based on the user's inputs. The results will be only those features which satisfy both the published definition expression (which you can't retrieve in the 10.2 REST API) and the definition expression set on the feature layer in client code. If the MapService allows layers to be added dynamically, then you might be able to create a new layer based off the one you want to change, set the definition expression, and add it to the map. I haven't tried this method yet.
01-03-2014 12:15 PM

POST
I have not gotten an answer from anyone. We do need an answer, if nothing else, to clarify WHEN the definition expression is overridden and WHEN it is appended to. There seem to be conflicts in the documentation.

I believe it is Esri's intent to always keep the base definition expression as the publisher intended, while client-defined expressions are appended to the default .. never overriding it. This behavior is preferable from a security standpoint, where a layer's definition expression may be used to filter out sensitive data (which is not really the best approach) and could otherwise expose additional tables or functions that are internal to the database. An exception would be using dynamic layers, where the client/application is aware of the available datasets and fields. In that case, a new layer is created and dynamically added to the map, and it can have a new expression defined. I haven't tested any of these assumptions ...
01-02-2014 06:51 AM

POST
I have an iOS Runtime SDK application which was running against 10.1.1. The app would read the definition query from the layer so that additional filters could be added and the original query restored. The documentation was a little confusing as to when the query would override the definition query in the map/feature service and when it would append, but in any case, we got it working.

I set up a new 10.2 server and spent an entire day wondering why I was getting "Database error: invalid query" in my server logs and why the feature services were not working properly. Some tracing with WireShark showed that the layer definitions were being set to strings like "((null) or (null)) and (status = 'Complete')". I know I need to change the app code to not produce queries like this, but that is not the source of the problem. The real problem is that the REST endpoints no longer return the JSON "definitionExpression" parameter, and the HTML format says "Definition Expression: N/A".

Has 10.2 changed so that definition queries can no longer be read or overridden? Are REST/Runtime SDK-supplied layer definitions now always evaluated in addition to the published definition query? Thanks for any input ...
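On the client side, the immediate fix for those "(null)" fragments is to skip empty filters before composing the expression. A tiny sketch of the idea (in Python rather than Objective-C, and combine_filters is my own hypothetical helper, not an SDK call):

```python
def combine_filters(filters):
    """Join non-empty filter clauses with 'and', parenthesizing each one.
    None and empty entries are skipped, so the result never contains
    literal '(null)' fragments like the ones captured with WireShark."""
    parts = ["(%s)" % f for f in filters if f]
    return " and ".join(parts)
```

For example, combining [None, None, "status = 'Complete'"] yields just "(status = 'Complete')" instead of "((null) or (null)) and (status = 'Complete')".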
08-15-2013 09:56 AM

POST
I have a map service in 9.3.1 SP2 which contains a feature class (parcels) joined on "pid" to a table (pdata). All data is contained in the same file gdb, and the relevant fields are indexed. I'm using 2 different Query operations, each with a different problem ...

The first query is attribute-only, with a where clause like this:

parcels.pid in (select pdata.pid from pdata where upper(pdata.owner) like upper('%[value]%'))

The above clause works if I choose the default join option of "Keep all records", but it takes 15-20 seconds. However, when using "Keep only matching records", the query fails with an "invalid SQL statement was used" error. If I remove the first "upper" function, the query works and returns the correct results almost instantly:

parcels.pid in (select pdata.pid from pdata where pdata.owner like upper('%[value]%'))

Can I not use the upper function on a table field in a subquery?

The second query is a spatial query with a simple envelope for the query geometry. The query should return 19 parcels. If I use the join option of "Keep all records", the operation works correctly. However, when "Keep only matching records" is used, there are some problems ... The query returns the first 500 records of the parcels layer and places them in Africa instead of Florida .. no matter what the spatial query is. All of the returned geometries appear to have the correct coordinates and spatial reference. So, that's strange ...

The really strange part: if I use the REST Query page directly, with HTML or JSON output, I get the first 500 parcels. BUT, if I change the output to KMZ, I get the correct 19 parcels. Nothing else about the query changes except the output format! It appears that something in the Query handler is correct when the output format is KMZ, but broken with other output formats, when a feature class is joined to a table with the "keep only matching" join option.

Any ideas on how I can get both queries to work correctly and efficiently while staying with 9.3.1 SP2 and keeping the table dynamically joined to the feature class?
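As a sanity check on the SQL shape itself (outside ArcGIS), here is the same subquery pattern run against an in-memory SQLite stand-in. This only demonstrates the query pattern, not the file-geodatabase SQL engine; note that in SQLite, LIKE is already case-insensitive for ASCII, so upper() on the column isn't needed at all:

```python
import sqlite3

# In-memory stand-in for the parcels layer joined to the pdata table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parcels (pid INTEGER);
    CREATE TABLE pdata   (pid INTEGER, owner TEXT);
    INSERT INTO parcels VALUES (1), (2), (3);
    INSERT INTO pdata   VALUES (1, 'Smith Family Trust'),
                               (2, 'JONES LLC'),
                               (3, 'Acme Corp');
""")

def find_parcels(term):
    """Parcels whose joined owner value contains term (case-insensitive
    in SQLite's default ASCII LIKE semantics)."""
    sql = ("SELECT pid FROM parcels WHERE pid IN "
           "(SELECT pid FROM pdata WHERE owner LIKE ?)")
    return [row[0] for row in conn.execute(sql, ("%" + term + "%",))]
```

If the file-gdb engine supported a case-insensitive LIKE the same way, the failing upper(pdata.owner) call could simply be dropped; the second (working) clause from the post suggests its LIKE already matches regardless of case on the left-hand side.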
09-30-2010 08:29 AM

POST
I've created a Mosaic Dataset for which I manually adjusted the histogram settings of one of the rasters to generally match the color of the other. Then I found out about the Color Correction tab; it corrected the color better than my manual settings did.

The problem is that hitting the "Correct Color" button only works once. After that, the mosaic preview disappears and shows all white. The only way to get it back is to turn off "Apply Color Correction" in Mosaic Properties -> Defaults .. a process that gets annoying when experimenting with the various settings. The other problem is that when I Build Overviews with the color correction applied, the overviews are solid black. This doesn't happen with the color correction off.
02-16-2010 07:18 AM