POST
Does anyone have a copy of the script that used to be found here: Multiprocess Geocoding | ArcGIS Blog? It was a very handy script and a good example of how to implement multiprocessing. Thanks.
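The original blog script is gone, but the general pattern it demonstrated — splitting a table of addresses into batches and geocoding each batch in a worker process — can be sketched with the standard library. Everything below (the function names, the dummy geocode_batch body) is a placeholder of mine, not the original Esri code; in practice the batch function would call an actual geocoding tool.

```python
from multiprocessing import Pool

def geocode_batch(addresses):
    # Placeholder for real geocoding work (e.g. an arcpy geocode call).
    # Here it just returns dummy coordinates so the pattern is runnable.
    return [(addr, (0.0, 0.0)) for addr in addresses]

def chunk(items, n):
    # Split the address list into at most n roughly equal batches.
    size = (len(items) + n - 1) // n
    return [items[i:i + size] for i in range(0, len(items), size)]

def geocode_parallel(addresses, workers=4):
    batches = chunk(addresses, workers)
    with Pool(workers) as pool:
        results = pool.map(geocode_batch, batches)
    # Flatten the per-batch results back into one list.
    return [row for batch in results for row in batch]

if __name__ == "__main__":
    addrs = ["100 Main St", "200 Elm St", "300 Oak Ave"]
    print(geocode_parallel(addrs, workers=2))
```

The win comes from each worker holding its own locator instance, so batches geocode concurrently instead of serially.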
Posted 08-04-2016 12:03 PM
POST
I am trying to populate intersection information for our facilities data. I am initially running simple points against the latest TomTom data that we purchase annually (MultiNet Enterprise). Using the "Reverse Geocode" tool with the "INTERSECTION" option in ArcGIS Desktop 10.1 SP1, all of the >65,000 points are placed on top of each other, in places where a single facility exists. This is obviously incorrect, and I cannot figure out why.

I then tried ArcGIS Pro 10.3, where running the Reverse Geocode tool simply crashes the application ("ArcGIS Pro has encountered a serious application error and is unable to continue..."). I have tried multiple different copies of the point data, as well as several different address locators built on different TeleAtlas reference data, and I have run Check Geometry and Repair Geometry on both the point feature class and the address locator reference data, with no luck. The application crashes every time. Does anybody have any ideas why this is happening?
Posted 04-27-2016 08:24 AM
POST
SELECT UDI, Channel, Value, DateTime_UTC
FROM PTData4 t1
WHERE EXISTS (SELECT 1
              FROM PTData4 t2
              WHERE t2.UDI = t1.UDI
                AND t2.Channel = t1.Channel
              GROUP BY t2.UDI, t2.Channel
              HAVING t1.DateTime_UTC = MAX(t2.DateTime_UTC))

This seems to work in ArcCatalog, but the results are not as expected (I am getting readings from different dates for each device and channel):

DateTime_UTC IN (SELECT MAX(DateTime_UTC)
                 FROM PT_RO_Pressure_Tracker_Interval_Data
                 GROUP BY UDI, Channel)

I have not tried any automation in Python yet; I would first like to get the extract working in Table to Table before attempting automation. Thanks.
Posted 11-16-2015 06:47 AM
POST
Hello all. While we are waiting for GeoEvent Processor to be rolled out in my organization, I am attempting to create a temporary solution to get pressure-tracking data into our GIS system so it can be made available to our end users. I have the workflow planned out; the only thing in my way is getting the most recent readings into a separate table.

We have a SQL Server database that holds all of the readings in a single table containing over 2 million rows. Each row holds a reading date, UDI (device identifier), channel (3 channels per device), and value (the actual reading). Since the process will be scripted in Python, I am trying to automate copying the most recent readings from this table into a new table. I am using the Table to Table tool with a SQL expression. Here is the code I am trying to use, which seems to work well inside SQL Server:

SELECT UDI, Channel, Value, DateTime_UTC
FROM PTData4 t1
WHERE EXISTS (SELECT 1
              FROM PTData4 t2
              WHERE t2.UDI = t1.UDI
                AND t2.Channel = t1.Channel
              GROUP BY t2.UDI, t2.Channel
              HAVING t1.DateTime_UTC = MAX(t2.DateTime_UTC))

However, in ArcGIS I get an error that states: "There was an error with the expression. Failed to parse the where clause." Any ideas how to convert this code to be compatible with ArcGIS?
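One form worth trying in the Table to Table expression is a plain correlated subquery, which avoids aliasing the outer table in the WHERE clause; whether the geodatabase's expression parser accepts it depends on the underlying data source, so treat this as a candidate to test rather than a guaranteed fix. A sqlite3 stand-in (made-up sample rows) checks that the expression keeps exactly the latest reading per UDI and channel:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE PTData4 (UDI TEXT, Channel INTEGER, Value REAL, DateTime_UTC TEXT)"
)
conn.executemany("INSERT INTO PTData4 VALUES (?,?,?,?)", [
    ("dev1", 1, 10.0, "2015-11-01"),
    ("dev1", 1, 11.0, "2015-11-02"),
    ("dev2", 1, 20.0, "2015-11-01"),
])

# Only the WHERE clause would go into the Table to Table expression:
# the correlated subquery compares each row to its own group's maximum.
latest = conn.execute("""
    SELECT UDI, Channel, Value, DateTime_UTC
    FROM PTData4
    WHERE DateTime_UTC = (SELECT MAX(t2.DateTime_UTC)
                          FROM PTData4 t2
                          WHERE t2.UDI = PTData4.UDI
                            AND t2.Channel = PTData4.Channel)
""").fetchall()
print(latest)
```

Exactly one row per device/channel pair survives, which is the behavior the EXISTS version produces in SQL Server.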
Posted 11-13-2015 12:07 PM
POST
Josh, if you had read my original post, you would have seen that I am being told I cannot edit the DEFAULT version. That was the whole point of my question.
Posted 09-14-2015 05:14 AM
POST
Thanks Asrujit - it looks like I will have to create a version first, make the edits and then post. However, existing versions will not be able to see the edits, correct?
Posted 09-11-2015 10:51 AM
POST
Hello all. We are trying to create a nightly script that will set attachment flags, indicating that a feature has an attachment available. This is not possible using the ArcGIS front-end due to issues related to ArcFM AutoUpdaters (and also for performance reasons), so we are attempting to develop a solution using SQL Developer (we cannot use MS Access or SQL Server due to incompatibilities with ST_GEOMETRY). I am told that running an UPDATE query on a view will take care of updating all of the delta tables as well as the base table. The first feature I am running this on is a versioned view. Here is my code snippet:

EXECUTE sde.version_util.set_current_version ('SDE.DEFAULT');
EXECUTE sde.version_user_ddl.edit_version ('SDE.DEFAULT', 1);
UPDATE GASVALVE_VW
SET ATTACH_FLAG = 'N';
COMMIT;
EXECUTE sde.version_user_ddl.edit_version ('SDE.DEFAULT', 2);

However, I am being told I cannot edit the DEFAULT version:

Error starting at line 3 in command:
EXECUTE sde.version_user_ddl.edit_version ('SDE.DEFAULT', 1)
Error report:
ORA-20500: Cannot edit the DEFAULT version in STANDARD transaction mode.
ORA-06512: at "SDE.VERSION_USER_DDL", line 941
ORA-06512: at line 1

I am able to start an edit session on the DEFAULT version inside ArcMap against the same geodatabase. SQL Developer then tells me I cannot edit without being in an edit session (because the edit session could not be started by SQL Developer):

Error starting at line 4 in command:
UPDATE GASVALVE_VW SET ATTACH_FLAG = 'N'
Error report:
SQL Error: ORA-20504: Editing the DEFAULT version is not supported because the spatial attribute is not a spatial type or the table is registered as versioned with the option to move edits to base. The session must call edit_version to start an edit session before editing the view.
ORA-06512: at "PSEG_GAS.V87_UPDATE", line 1
ORA-04088: error during execution of trigger 'PSEG_GAS.V87_UPDATE'

Not sure what I am doing wrong. Any help would be greatly appreciated!
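ORA-20500 suggests the session is in STANDARD transaction mode, in which DEFAULT cannot be edited directly; the usual workaround is to edit a named child version and reconcile/post it back to DEFAULT afterwards. A rough, untested sketch follows — the create_version procedure, its argument list, and the version name here are assumptions to verify against the SDE documentation for your release, not confirmed calls:

-- Assumption: a create_version procedure taking (parent, name, independence,
-- access, description) exists in this SDE release; check your docs.
EXECUTE sde.version_util.create_version ('SDE.DEFAULT', 'ATTACH_FLAGS', 2, 2, 'nightly attachment-flag edits');
EXECUTE sde.version_util.set_current_version ('OWNER.ATTACH_FLAGS');
EXECUTE sde.version_user_ddl.edit_version ('OWNER.ATTACH_FLAGS', 1);  -- start edit session
UPDATE GASVALVE_VW
SET ATTACH_FLAG = 'N';
COMMIT;
EXECUTE sde.version_user_ddl.edit_version ('OWNER.ATTACH_FLAGS', 2);  -- stop edit session
-- Then reconcile and post OWNER.ATTACH_FLAGS to SDE.DEFAULT (e.g. in ArcMap
-- or a scheduled geoprocessing script).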
Posted 09-11-2015 07:57 AM
POST
Is it possible to utilize 3D Analyst to import LiDAR data for the purposes of modeling a Metering and Regulating station in a gas distribution system? I did some searching and could not find any documentation for this application of the software. This is something we are interested in pursuing. Thanks.
Posted 07-28-2015 06:35 AM
POST
What do we have to type into the command line to get this to work?
Posted 07-13-2015 06:14 AM
POST
Thank you for the suggestions everyone, although still no luck. Every avenue ends up with 9999 general function failure, topoengine errors, or the tool simply does not finish after allowing over 3 days of processing time. I have now merged the entire street centerline feature class into one single feature and am attempting to run the buffer tool on that without the dissolve option checked, although it is still at 0% after a half hour. If this does not work I will open a ticket with esri.
Posted 01-06-2015 11:58 AM
POST
We require support from esri on this issue. Who can I contact?
Posted 01-02-2015 07:40 AM
POST
Still struggling with this. Chopping up the feature class did not work; I am still getting topoengine errors and 9999 general function failures. Also, this method is creating gaps between features all over the state, which is unacceptable.
Posted 12-29-2014 05:52 AM
POST
I was able to run the dissolve on 4 of the 5 feature classes. The one that fails has 170,000 features in it.
Posted 12-26-2014 09:24 AM
POST
I tried breaking up the input feature class and running dissolve on each part, then merging back together. This worked (except for one of the parts, which throws error 99999, even after repairing geometry), but now running dissolve on the merged feature class gives an "out of memory" error. Does anybody else have any suggestions?
Posted 12-26-2014 08:25 AM
POST
We need to create 25-foot buffers for the entire state of New Jersey for our land base. We are using 2013 TeleAtlas centerline data as the input. The output does not have to be multi-part. I am open to the idea of doing it in batches, but would there not then be overlap that would require another dissolve? And why does it matter how many features are run through the tool? The software should be able to handle large sets of data; it is almost 2015 and 4 TB hard drives are less than $150. I have made posts in the past pleading for esri to rework their core product's most basic tools to bring them into the 21st century, but we still continually face these types of issues when working with large datasets.
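On the overlap question: yes, batched buffers will overlap wherever batches meet, and a final dissolve over the combined output is what removes it. A toy one-dimensional analogy (coordinates are made up, and real buffers are polygons rather than intervals) sketches the pattern of batch-buffer, combine, then one final merge:

```python
def buffer_points(points, radius):
    # 1-D "buffer": each point becomes an interval.
    return [(p - radius, p + radius) for p in points]

def dissolve(intervals):
    # Merge overlapping intervals -- the 1-D analogue of a dissolve.
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

# Buffer in two batches, then dissolve the combined output once.
batch1 = buffer_points([0, 10], radius=6)   # (-6, 6), (4, 16)
batch2 = buffer_points([20, 12], radius=6)  # (14, 26), (6, 18)
combined = dissolve(batch1 + batch2)
print(combined)  # the overlapping pieces collapse into (-6, 26)
```

The final merge only touches batch boundaries, so it is far cheaper than dissolving the raw feature set in one pass.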
Posted 12-24-2014 10:58 AM