POST
Basically I have a large user base, all of whom run downloaded Collector maps. We have run into a situation in which one of our core maps needs its map filter adjusted to account for new data scenarios. However, we have found there is no way to ensure the change applies to already-downloaded maps without deleting and re-downloading the map. We have a series of work packages and have been isolating the data with filters to avoid confusion over work scope; as new work comes on and old data is retired, the map filter is adjusted to add and remove values. It is a simple process for managing a large amount of data that minimises the reliance on a less experienced workforce constantly downloading new maps. The problem is that the applied filter never updates until the map is removed and re-downloaded. Is this a bug?
Posted 10-08-2016 06:59 PM

IDEA
With the differing symbology support across several platforms, and their conflicting requirements, I see a need for a symbology Validation Tool. I envision the input would be a layer or group layer in the map view, plus a simple drop-down for the target symbology type. The output would be a table listing each layer that violates the targeted symbol rules and which feature is not supported, e.g.:

Layer X has a cartographic line with offset enabled, which is not supported in Collector.
Layer Y has a non-simple polygon fill, which is not supported in vector tiles.
Posted 06-05-2016 06:44 PM

POST
I have been trying to optimize the cartographic presentation of dynamic labels, and Maplex is generating weird placement options. I am trying to place labels at the midpoint of a series of lines, and I am getting random, really long leaders that require significant manual correction. I have tried all manner of placement options and feature weights to correct the issue, but I am still getting very unusual leader behavior. I assume the label engine does not include the leader in its weightings for placement, as I have leaders going through several features that have weights of 1000. I can see several locations where the label could be placed quite simply, yet the engine is essentially wigging out and missing the midpoint by a long way. Running ArcGIS 10.2.2 Standard. What I want (I placed all of these labels manually, as the generated result below was bad). What I get (I know the scales differ; I just removed the earlier annotation groups, changed scale, and turned the labels back on).
Posted 07-30-2014 08:23 PM

POST
Just a word of warning: in a file geodatabase the date and time values come through to the Field Calculator as unicode strings, and you need to account for that when manipulating them with Python, e.g.:

datetime.datetime.strptime(time_value.encode('utf-8'), "%d/%m/%Y %I:%M:%S %p")
Posted 02-09-2014 06:04 PM

POST
After lots of hunting and an Esri support ticket, I have found out that the Field Calculator, and by extension the Calculate Field geoprocessing tool, accesses the date and time values as strings. In a file geodatabase these are all unicode strings, though you may find they are not in the case of a personal geodatabase.

We can use datetime.strptime to convert the two strings of text into datetime objects. Then we can access the properties of each of these objects to pull out the day / month / year, in the case of our field with only the date in it, or the hour / minutes / seconds, in the case of our field containing both a date and time. When Calculate Field goes to fill in the time information in the combined field, though, it wants this data provided as a datetime object. So our best bet is to take the information we are interested in from each of our two existing datetime objects and create a new datetime object with datetime.datetime(year, month, day, hour, minute, second).

Expression:

combine_datefields(!Time_!, !DateTime_!)

Codeblock:

import datetime

def combine_datefields(time_value, date_value):
    t = datetime.datetime.strptime(
        time_value.encode('utf-8'), "%d/%m/%Y %I:%M:%S %p")
    d = datetime.datetime.strptime(
        date_value.encode('utf-8'), "%d/%m/%Y")
    combined_dt = datetime.datetime(d.year, d.month, d.day,
                                    t.hour, t.minute, t.second)
    return combined_dt

So, as a result, datetime.combine was not the right tool for the job.
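Outside the Field Calculator, the same combination can be sketched in plain Python 3, where the .encode call is unnecessary because strptime takes str directly. The sample strings are the formats quoted in this thread:

```python
import datetime

def combine_datefields(time_value, date_value):
    # Parse the full "date + time" string and the date-only string.
    t = datetime.datetime.strptime(time_value, "%d/%m/%Y %I:%M:%S %p")
    d = datetime.datetime.strptime(date_value, "%d/%m/%Y")
    # Build a new datetime from the correct date and the correct time.
    return datetime.datetime(d.year, d.month, d.day,
                             t.hour, t.minute, t.second)

print(combine_datefields("23/01/2014 8:52:03 AM", "20/01/2014"))
# 2014-01-20 08:52:03
```

The wrong date carried by the Time field is simply discarded, since only its hour, minute, and second attributes are read.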
Posted 01-31-2014 04:29 PM

POST
My first attempt:

def Calc (Operator, PointID):
    if (Operator == None):
        if (re.search ("DWS", PointID) is TRUE):
            return "Don"
        elif (re.search ("GMS", PointID) is TRUE):
            return "Gordon"
        elif (re.search ("JZL", PointID) is TRUE):
            return "Julian"
        elif (re.search ("AWM", PointID) is TRUE):
            return "Anthony"
        else:
            return (Operator)
    else:
        return (Operator)

I finally got it all to work, thanks for pointing me in the right direction. All I needed was to import the re module, as it is not native in ArcPy, and the TRUE test is not required:

import re

def Calc (Operator, PointID):
    if (Operator == None):
        if (re.search ("AWM", PointID)):
            return "Anthony"
        elif (re.search ("AZS", PointID)):
            return "Andrea"
        elif (re.search ("ERT", PointID)):
            return "Earl"
        else:
            return (PointID)
    else:
        return (Operator)
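The working version above can be sketched as a standalone function, with a lookup dictionary in place of the hard-coded elif chain. The codes and names here are just the examples from this thread, not a real mapping:

```python
import re

# Example operator codes from the thread; a real mapping would come
# from the actual PointID naming convention.
OPERATORS = {"AWM": "Anthony", "AZS": "Andrea", "ERT": "Earl"}

def calc(operator, point_id):
    # Keep the existing value when the field is already populated.
    if operator is not None:
        return operator
    # Otherwise look for a known 3-letter code inside the PointID text.
    for code, name in OPERATORS.items():
        if re.search(code, point_id):
            return name
    # No code matched: fall back to the PointID, as in the working version.
    return point_id

print(calc(None, "P-AWM-0042"))    # Anthony
print(calc("Earl", "P-AWM-0042"))  # Earl
```

Note that re.search returns a match object or None, so it can be tested directly for truthiness; comparing it with `is TRUE` would fail.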
Posted 01-31-2014 04:22 PM

POST
Your code is not right. You did not feed the PointID into the code block through the expression, so the code block has no idea what PointID means. Your test for Null values is not valid Python. You don't need a while loop in the Field Calculator, since it iterates over every record of the calculated field for you. Finally, you have to return a value for every record calculated, not just the Null records, which your code does not do. To succeed, the code needs to read:

Expression:

Calc (!Operator!, !PointID!)

Code Block:

def Calc (Operator, PointID):
    if (Operator == None):
        if (re.search ("DWS", PointID)):
            return "Don"
        elif (re.search ("GMS", PointID)):
            return "Gordon"
        elif (re.search ("JZL", PointID)):
            return "Julian"
        elif (re.search ("AWM", PointID)):
            return "Anthony"
        else:
            return (Operator)
    else:
        return (Operator)

(As originally posted, the else statements on line 11 were missing their colons, which produced the reported syntax error; they are corrected above.)
Posted 01-26-2014 05:51 PM

POST
I am trying to write a script to calculate values where there are Null values in a table; however, it never calculates any values, and I am pretty sure my code is right. (Operator) is a field with values and Null values, and (PointID) is a field with coded values, where the operator name is stored as a 3-letter code within the text.

Expression:

Calc (!Operator!)

Code Block:

def Calc (Operator):
    while (Operator is null):
        if (re.search ("DWS", PointID) is TRUE):
            return "Don"
        elif (re.search ("GMS", PointID) is TRUE):
            return "Gordon"
        elif (re.search ("JZL", PointID) is TRUE):
            return "Julian"
        elif (re.search ("AWM", PointID) is TRUE):
            return "Anthony"
        else return (Operator)
Posted 01-26-2014 01:44 PM

POST
Quoting an earlier reply: "... are the !DateTime! and !Time! fields strings, or are they actual date fields in the table? If the dates are date objects with 00:00:00 time components, you might be able to add the Time field value to the DateTime field value."

The fields are both dates. DateTime has values like 20/01/2014, and Time has values like 23/01/2014 8:52:03 AM. The problem is that the Time field should have no date in it, but as part of the export process it gets a date added and is incorrectly given the date the file was processed. Essentially what I want is a field with 20/01/2014 8:52:03 AM, generated from the two fields. I tried the addition, but I can't get it to ignore the date in the Time field.
Posted 01-25-2014 08:08 PM

POST
Just wondering if anyone can tell me why my expression is not working to calculate some values in a table. Below, !DateTime! and !Time! are fields in a file geodatabase. Basically I need to combine the two fields: DateTime currently has the correct date for the record, and Time has a random date plus the correct time, which I need combined with the first field.

datetime.combine (datetime.strptime ( !DateTime!, %Y %m %d), datetime.strptime (!Time!, %H:%M:%S))
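For reference, outside the Field Calculator the intended merge can be sketched in plain Python: parse both strings (with the format codes quoted), then combine the date part of one with the time part of the other. The formats here are assumptions based on the sample values given elsewhere in the thread:

```python
import datetime

def merge(date_str, time_str):
    # Parse the date-only field and the field carrying the correct time.
    d = datetime.datetime.strptime(date_str, "%d/%m/%Y")
    t = datetime.datetime.strptime(time_str, "%d/%m/%Y %I:%M:%S %p")
    # combine() takes a date and a time, so t's wrong date part is dropped.
    return datetime.datetime.combine(d.date(), t.time())

print(merge("20/01/2014", "23/01/2014 8:52:03 AM"))
# 2014-01-20 08:52:03
```

Within the Field Calculator itself, the string-typed field tokens and unquoted format strings were the stumbling blocks, as the later posts in this thread work out.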
Posted 01-22-2014 09:25 PM

POST
I am after a way to snap Z values in ArcMap. Essentially I have several thousand point features that need converting to different lines or polygons. I can snap the XY easily enough, but I was after Z-value snapping as well; editing in ArcScene and ArcGlobe is problematic at best. Yes, I could use Feature To Line / Feature To Polygon etc., but that would also take longer, given the selections and definition queries needed to get it all right. I was thinking that an update-vertex-from-point process or script would be ideal.
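The update-vertex-from-point idea can be sketched in plain Python with simple (x, y, z) tuples, leaving arcpy out of it: for each line vertex, find the nearest source point in XY within a tolerance and copy its Z. This only illustrates the logic, not an ArcMap tool:

```python
import math

def snap_z(vertices, points, tol=1.0):
    """Return vertices with Z copied from the nearest point within tol (XY)."""
    snapped = []
    for x, y, z in vertices:
        best, best_d = None, tol
        for px, py, pz in points:
            d = math.hypot(px - x, py - y)
            if d <= best_d:
                best, best_d = pz, d
        # Keep the original Z when no point lies within the tolerance.
        snapped.append((x, y, best if best is not None else z))
    return snapped

pts = [(0.0, 0.0, 10.0), (5.0, 5.0, 20.0)]
line = [(0.1, 0.0, 0.0), (5.0, 5.2, 0.0), (9.0, 9.0, 3.0)]
print(snap_z(line, pts))
# [(0.1, 0.0, 10.0), (5.0, 5.2, 20.0), (9.0, 9.0, 3.0)]
```

A real tool would read the point Zs with a search cursor and write the line vertices back with an update cursor, but the per-vertex nearest-neighbour test is the core of it.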
Posted 12-08-2013 12:55 PM

POST
With these different functions for defining the neighborhood, could we have a means to specify our own parameters or formula? I understand the zone of indifference is actually very helpful, but being able to change the formula from 1/d to, say, 1 - (1/d) or 1/(d/2) would allow more flexibility, as there are many more cases in which the relationships in the data cannot be analysed with the standard functions. Ideally the Gi* function could also have a zone of indifference with a Gaussian falloff starting from the zone boundary rather than at the central point.
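The kind of pluggable weighting being asked for can be sketched by treating the conceptualization of spatial relationships as a plain function of distance. The functions below are just the ones mentioned above, plus a hypothetical zone-of-indifference variant whose Gaussian decay is measured from the zone boundary; the zone and sigma values are illustrative:

```python
import math

def inverse_distance(d):
    return 1.0 / d

def one_minus_inverse(d):
    return 1.0 - 1.0 / d

def half_inverse_distance(d):
    return 1.0 / (d / 2.0)  # equivalently 2/d

def zoi_gaussian(d, zone=100.0, sigma=50.0):
    # Full weight inside the zone of indifference; Gaussian decay
    # measured from the zone boundary (not the central point) beyond it.
    if d <= zone:
        return 1.0
    return math.exp(-((d - zone) ** 2) / (2.0 * sigma ** 2))

for w in (inverse_distance, one_minus_inverse, half_inverse_distance, zoi_gaussian):
    print(w.__name__, round(w(50.0), 4), round(w(150.0), 4))
```

A tool that accepted any such callable, rather than a fixed list of conceptualizations, would cover all of these cases at once.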
Posted 08-13-2011 05:28 AM
Kudos | Posted
---|---
1 | 01-22-2014 09:25 PM
1 | 01-31-2014 04:29 PM
7 | 06-05-2016 06:44 PM
Online Status: Offline
Date Last Visited: 11-11-2020 02:24 AM