POST
Thanks for replying. Below is an example of the data I'm working with. Each dot is at a centroid of a postcode unit, and represents one service pipe failure at that unit (so there are stacks of dots where there's been more than one failure in that postcode). I've also done a standard hotspot analysis with the failures summarised so that there is only one dot at each location with a count of the total number of failures over a five year period. I could instead use polygons of each postcode unit and use those with no gas meters as the locations where failures cannot exist, if that sounds sensible to you?
Posted 05-31-2018 09:08 AM

POST
I'm looking at analysing numbers of gas service pipe failures to find emerging hotspots. Because there could be different numbers of properties with gas meters in whichever polygon aggregation I use (postcodes, hexagons etc.) I think that ideally the data should be normalised, i.e. failure rate = number of failures / number of gas meters. Is there a way to create a space time cube that will calculate this failure rate rather than just use the non-normalised count of failures? Or is there some workaround to get the same result to use with the emerging hot spot analysis tool?
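As far as I know, the Create Space Time Cube tool aggregates raw point counts, so one possible workaround is to compute the normalised failure rate per polygon yourself first and carry it through as an attribute. Here is a minimal pure-Python sketch of that rate calculation (the postcode strings and counts are entirely made up for illustration):

```python
# Hypothetical sketch: computing a normalised failure rate per polygon
# before building the cube. Postcodes and counts below are illustrative.
def failure_rate(failures, meters):
    """Failures per gas meter; polygons with no meters get no rate."""
    if not meters:                 # zero or missing meter count
        return None                # rate is undefined where gas cannot be supplied
    return failures / float(meters)

counts = {
    "AB1 2CD": (4, 80),
    "AB1 2CE": (0, 120),
    "AB1 2CF": (2, 0),             # no gas meters: rate undefined
}
rates = {pc: failure_rate(f, m) for pc, (f, m) in counts.items()}
```

Polygons returning None (no gas meters) could then be excluded from the analysis entirely, which matches the idea of treating meter-less postcodes as locations where failures cannot exist.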
Posted 05-31-2018 08:44 AM

POST
I'm writing a Python toolbox implementing several different areal interpolation methods, from the most basic areal weighting up to methods that use some statistical techniques. I'm planning on running some Monte Carlo simulations to assess the accuracy of each method with a particular set of input data, so I need to make the execution as quick as possible.

My current implementation of areal weighting is below. I've used a SearchCursor nested within an UpdateCursor, which I'm led to believe is not the most efficient way. Basically, the task is to interpolate values from source zones (census polygons) to arbitrary target zone polygons, in this case based simply on the ratio of the area of each source zone that intersects a target zone to the total area of that source zone. To do this, I loop through the target zones, select just the source zones that intersect the current target zone, then loop through those source zones. Where a source zone is entirely contained within the target zone, I just take its total value (I'm not sure this optimisation is worth it, as all I save is the ratio calculation); otherwise, I calculate the ratio and multiply it by the value to be interpolated.

I've been trying to think of a way to avoid the inner SearchCursor by creating a dictionary outside the UpdateCursor. The difficulty is that this dictionary does not need to include all the source zones, just those that intersect each target zone. Ideally I'd have a dictionary with the ID of each target zone as the key, whose value would be another dictionary for each intersecting source zone, with the interpolation and geometry fields as keys and the values of those fields as values. But it seems that building this dictionary would require running an Intersect or Spatial Join first, or another nested loop, so I'm unsure whether I would really gain much speed this way.

I know I can use cProfile to see exactly how long each part is taking, but it's not clear how best to set it up within a Python toolbox - can anyone provide some examples of how best to do this? Anyway, here is the execute code:

def execute(self, parameters, messages):
    """The source code of the tool."""
    # get parameters
    inSourceFeatures = parameters[0].valueAsText
    inInterpolationFields = parameters[1].valueAsText.split(";")
    inTargetFeatures = parameters[2].valueAsText
    outTargetFeatures = parameters[3].valueAsText
    # use the in_memory workspace for quicker results
    arcpy.env.workspace = 'in_memory'
    # make a feature layer of the source zone feature class
    arcpy.MakeFeatureLayer_management(inSourceFeatures, "source_zones")
    # take a copy of the target zones
    arcpy.CopyFeatures_management(inTargetFeatures, "target_zones")
    # add fields for the interpolated variables
    for field in inInterpolationFields:
        arcpy.AddField_management("target_zones", field, "DOUBLE")
    # get the spatial references of the zones
    sourceSR = arcpy.Describe("source_zones").spatialReference
    targetSR = arcpy.Describe("target_zones").spatialReference
    # for each target zone...
    with arcpy.da.UpdateCursor("target_zones", inInterpolationFields + ['SHAPE@'], spatial_reference=targetSR) as targetCursor:
        for targetRow in targetCursor:
            # create a dictionary for the row for easier field access
            targetRowDict = dict(zip(inInterpolationFields + ['SHAPE@'], targetRow))
            # get the target geometry
            targetGeom = targetRowDict['SHAPE@']
            # select the source zones that intersect the current target zone
            arcpy.SelectLayerByLocation_management("source_zones", "INTERSECT", targetGeom)
            # initialise the output interpolated value fields
            for field in inInterpolationFields:
                targetRowDict[field] = 0.0
            # for each intersecting source zone...
            with arcpy.da.SearchCursor("source_zones", inInterpolationFields + ['SHAPE@'], spatial_reference=sourceSR) as sourceCursor:
                for sourceRow in sourceCursor:
                    # create a dictionary for the row for easier field access
                    sourceRowDict = dict(zip(inInterpolationFields + ['SHAPE@'], sourceRow))
                    # get the source geometry
                    sourceGeom = sourceRowDict['SHAPE@']
                    if sourceGeom.within(targetGeom):
                        # source zone entirely within the target zone: add the full value
                        for field in inInterpolationFields:
                            if sourceRowDict[field]:
                                targetRowDict[field] += sourceRowDict[field]
                    else:
                        # ratio of the intersection area to the source zone area
                        ratio = sourceGeom.intersect(targetGeom, 4).area / sourceGeom.area
                        for field in inInterpolationFields:
                            if sourceRowDict[field]:
                                # add the areally weighted value to the output field
                                targetRowDict[field] += ratio * sourceRowDict[field]
            targetCursor.updateRow([targetRowDict[field] for field in inInterpolationFields + ['SHAPE@']])
    # copy the finished target zones to the output
    arcpy.CopyFeatures_management("target_zones", outTargetFeatures)
    return
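The dictionary idea described in the post can be sketched in plain Python, assuming a one-off Intersect (or spatial join) has already produced (target_id, source_id) pairs and the source attributes have been read into memory once. All names and data below are hypothetical stand-ins, not arcpy calls:

```python
from collections import defaultdict

# Hypothetical sketch of the dictionary approach: assume a single Intersect
# or spatial join has already yielded (target_id, source_id) pairs, and the
# source rows have been read once into a plain dict.
def build_lookup(pairs, source_rows):
    """Map each target ID to the list of source rows that intersect it.

    pairs       -- iterable of (target_id, source_id)
    source_rows -- dict of source_id -> {"POP": ..., "area": ...}
    """
    lookup = defaultdict(list)
    for target_id, source_id in pairs:
        lookup[target_id].append(source_rows[source_id])
    return lookup

source_rows = {1: {"POP": 100, "area": 10.0},
               2: {"POP": 50, "area": 5.0}}
pairs = [("A", 1), ("A", 2), ("B", 2)]
lookup = build_lookup(pairs, source_rows)
# the inner SearchCursor becomes a dictionary lookup per target zone
weighted = sum(r["POP"] for r in lookup["A"])
```

Whether this beats the nested cursor depends on how expensive the up-front Intersect is for the data, which is exactly the trade-off the post raises. On the cProfile question: the standard-library pattern of creating a `cProfile.Profile()`, calling `enable()` at the top of `execute` and `disable()` at the end, then reporting with `pstats.Stats`, should work inside a Python toolbox as well, since that much is plain stdlib.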
Posted 04-24-2018 03:07 AM

POST
In ArcMap I frequently apply unique values/category symbology to layers with several tens of thousands of unique values. This is very fast in ArcMap and has never caused me any issues; in ArcGIS Pro, however, the same thing seems to take forever and locks up the interface for up to an hour while it applies the symbology. If I then want to edit the properties of all the symbols, that's another long wait, and when it's finally finished the whole application becomes so slow it's unusable. Is there any workaround, and/or is this something that's being looked at?
Posted 03-24-2017 04:26 AM

POST
That does sound like a similar issue, but I am definitely using 10.2.1 and it has worked correctly in the past!
Posted 11-22-2016 08:03 AM

POST
Why does it take literally hours for server-side post processing to complete when publishing a feature service to ArcGIS Online from ArcMap? I'm only trying to publish a single feature class which is only showing as 17MB (although there are several hundred thousand features) - why on earth does it take so long?
Posted 11-22-2016 08:00 AM

POST
I've just spent hours trying to figure out why my feature class in British National Grid (OSGB) coordinates wouldn't line up with any of the Esri Basemaps despite (apparently) having set up the appropriate transformation, attempting to project the feature class etc. etc. It seems that the transformations I was selecting were not being applied. If I went back to the Data Frame Properties to try a different transformation it had reverted to 'none'. Finally I tried copying down the details of the transformation I wanted, and creating a custom transformation with those details, and it worked! Does anyone know why the built in transformations weren't being applied? Even when I tried to reproject the feature class using the built in transformation the result was still misaligned, but with my custom transformation (specified identically to the built in one), it worked! Is this a known issue? I'm using ArcMap 10.2.1. Any ideas how to fix this in future?
Posted 11-22-2016 07:27 AM

POST
You can write scripts in Python using the ArcPy library to do exactly the sort of thing you mention: What is ArcPy?—Help | ArcGIS for Desktop

There is also ModelBuilder if you prefer to do things graphically or don't feel confident writing Python code: What is ModelBuilder?—Help | ArcGIS for Desktop
Posted 10-21-2016 07:55 AM

POST
Hi Dan, Sorry but I can't share any of the actual data. I can give you an idea of the schema, though, if that helps?

It's a pipe network (just a standard line feature class, no geometric network stuff); each feature is a section of pipe (split at fairly arbitrary points). Each section has an 8-digit unique ID and a number of attributes such as material (2-letter text code), diameter (floating point number), diameter units (either inches or millimetres), pressure tier (Low, Medium or High), etc. What I want is to combine contiguous sections of pipe with the same attributes into multi-part features. I will then assign each one a unique 'subsystem' ID and spatial join this back onto the original features.

The Dissolve tool isn't suitable even with 'create multipart features' or 'unsplit lines' checked: the former combines ALL features with the same values in the selected attributes (not just those that are connected), and the latter will only combine lines that can be merged into single-part features.

The main issue I'm having trying to do it with dicts etc. is building a SQL query to select out each unique combination of attributes. It gets very messy when I have to start dealing with different data types. Don't know if that helps or makes sense. Let me know if you'd like any more info. Thanks for helping. Dan
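The "contiguous sections with the same attributes" grouping described above is essentially a connected-components problem. Here is an illustrative pure-Python sketch (not the poster's code, and the section IDs, attributes and coordinates are invented): sections connect only if they share an endpoint and have identical attributes, and a flood fill then labels each connected run with a subsystem number.

```python
from collections import defaultdict

# Illustrative sketch: group pipe sections into 'subsystems' -- connected
# runs of sections sharing identical attributes. Each section is
# (section_id, attributes_tuple, endpoint_a, endpoint_b); endpoints are
# assumed snapped, so equal coordinates mean physically connected.
def find_subsystems(sections):
    # index sections by (attributes, endpoint) so only like-for-like connect
    by_node = defaultdict(list)
    for sid, attrs, a, b in sections:
        by_node[(attrs, a)].append(sid)
        by_node[(attrs, b)].append(sid)

    # adjacency between sections sharing both an endpoint and attributes
    neighbours = defaultdict(set)
    for sids in by_node.values():
        for sid in sids:
            neighbours[sid].update(s for s in sids if s != sid)

    # flood fill to label each connected group with a subsystem number
    subsystem, label = {}, 0
    for sid, _, _, _ in sections:
        if sid in subsystem:
            continue
        label += 1
        stack = [sid]
        while stack:
            cur = stack.pop()
            if cur in subsystem:
                continue
            subsystem[cur] = label
            stack.extend(neighbours[cur])
    return subsystem

sections = [
    ("00000001", ("PE", 63.0, "mm", "Low"), (0, 0), (1, 0)),
    ("00000002", ("PE", 63.0, "mm", "Low"), (1, 0), (2, 0)),    # touches 1, same attrs
    ("00000003", ("ST", 4.0, "in", "Medium"), (2, 0), (3, 0)),  # touches 2, different attrs
]
groups = find_subsystems(sections)
```

Grouping by attribute tuple up front like this also sidesteps the per-combination SQL queries entirely, which may avoid the data-type mess mentioned in the post.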
Posted 03-08-2016 04:06 AM

POST
I know it's inefficient (I got it working and it took 13 hours to run on my data!) but I spent several days last week tearing my hair out trying to get it to work the efficient way, without success!
Posted 03-07-2016 01:05 AM

POST
Would that not add 'dissolveFields' as a list element within the list?
Posted 03-04-2016 05:16 AM

POST
Thanks, managed to solve it though. Turns out converting to a tuple did work, I just had to keep the square brackets around multipartPolyline - although actually what I ended up doing was instead converting 'row' to a list, as I had a few other fields to add and it made sense to do one conversion to list rather than multiple to tuple! Now my code looks like this:

# Insert the current superstring into the output feature class
insertCursor2 = arcpy.da.InsertCursor('SuperStrings_temp_out', dissolveFields + ["FREQUENCY", "SUM_SHAPE_Length", "SHAPE@"])
multipartPolyline = arcpy.Polyline(superstringPartsArray)
insertCursor2.insertRow(list(row) + [count] + [multipartPolyline.length] + [multipartPolyline])
del insertCursor2

I just don't understand how 'row', which I treat like a list with no problems earlier on in the code, has suddenly become a tuple!
Posted 03-04-2016 05:14 AM

POST
I am trying to use an insert cursor to populate several fields with attributes, and also the SHAPE field with a multi-part polyline, but I keep getting a "can only concatenate tuple (not "list") to tuple" error. I don't understand, because it works to insert just the SHAPE surrounded by square brackets, and my row from an outer SearchCursor works as a list everywhere else! (Converting to a tuple doesn't work either!) Here is my code:

# Insert the current superstring into the output feature class
insertCursor2 = arcpy.da.InsertCursor('SuperStrings_temp_out', dissolveFields + ["SHAPE@"])
multipartPolyline = arcpy.Polyline(superstringPartsArray)
insertCursor2.insertRow(row + [[multipartPolyline]])
del insertCursor2

'dissolveFields' is the list of fields whose attributes I want to update with 'row', which is the current row of a table of unique combinations (from the Summary Statistics tool) with the same fields as in the dissolveFields list. Please help!
Posted 03-04-2016 04:12 AM

POST
Thanks all. Looks like I've solved it by using Summary Statistics to get the unique combinations, then looping over them with a SearchCursor. Within this SearchCursor I have another SearchCursor looping over all my input features, and an InsertCursor to write only the matches into a temporary feature class. Thanks again for the help. Dan
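For what it's worth, the nested-cursor pattern described above (one pass per unique combination) can often be replaced by a single pass that groups rows by the tuple of dissolve-field values. A hedged pure-Python sketch, with invented rows where the first two values are the dissolve fields:

```python
from collections import defaultdict

# Illustrative alternative to nested cursors: one pass over the rows,
# grouping by the tuple of dissolve-field values (data below is made up).
def group_by_combination(rows, key_len):
    """rows are tuples; the first key_len values are the dissolve fields."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[:key_len]].append(row)
    return groups

rows = [
    ("PE", "Low", "00000001"),
    ("ST", "Low", "00000002"),
    ("PE", "Low", "00000003"),
]
groups = group_by_combination(rows, 2)
```

Each value in `groups` is then exactly the set of features for one combination, with no Summary Statistics step, no SQL query, and no inner cursor needed.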
Posted 03-04-2016 03:26 AM

POST
Thanks, that'll work for getting the unique combinations, but I'm having some difficulty using those to select out one combination at a time to apply my process to. At the moment I'm using the SearchCursor to loop through each combination and build a SQL selection query for that combination by concatenating strings together. This isn't ideal, though, as I need lots of if statements to handle the different data types, and it can't cope when it runs into NULL values either - I could write more string concatenations to deal with NULLs, but surely there must be a better way? Here's what I have so far (is there a 'code' tag in this new forum??):

# Use a SearchCursor to loop through each combination, build a selection
# query, and apply the algorithm to that selection
with arcpy.da.SearchCursor(inputLines, dissolveFields) as cursor:
    for row in cursor:
        # build the selection query for this combination
        selectionQuery = ''
        count = 0
        for field in row:
            arcpy.AddMessage(field)
            delimitedField = arcpy.AddFieldDelimiters(arcpy.env.scratchGDB, dissolveFields[count])
            # isinstance, not type(field) == 'str': type() returns a type
            # object, never a string, so those comparisons always failed
            if isinstance(field, str):
                selectionQuery += delimitedField + " = '" + field + "'"
            else:  # int or float
                selectionQuery += delimitedField + " = " + str(field)
            if count < len(row) - 1:
                selectionQuery += " AND "
            count += 1
        arcpy.AddMessage(selectionQuery)
        # Apply the selection query and create a working copy
        arcpy.FeatureClassToFeatureClass_conversion(inputLines, arcpy.env.scratchGDB, 'SuperStrings_temp_in', selectionQuery)
Posted 03-03-2016 02:18 AM