POST
Interestingly though, regarding the brackets, Con(Float("tmp" > 8.64), 0, Con((Float("tmp" <= 8.64)) & (Float("tmp" >= 4.31)), (Float(8.64 - "tmp")) / (8.64 - 4.31), 1)) produced the same result as Con(("tmp" > 8.64), 0, Con(("tmp" <= 8.64) & ("tmp" >= 4.31), (8.64 - "tmp") / (4.33), 1)), even though the brackets in the first one seem wrong.
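A minimal NumPy sketch (with a hypothetical "tmp" array standing in for the raster) of why both bracketings behave the same: Float() of a boolean raster just yields a 0.0/1.0 raster, Con treats any nonzero cell as true, and 4.33 is simply 8.64 - 4.31 precomputed, so both expressions select and compute the same cells.

```python
import numpy as np

tmp = np.array([2.0, 5.0, 9.0])  # hypothetical "tmp" raster values

# Bracketing A: Float() wraps the boolean test, as in Float("tmp" > 8.64)
mask_a = (tmp > 8.64).astype(np.float32)  # 0.0 / 1.0 raster
# Bracketing B: plain boolean test, as in ("tmp" > 8.64)
mask_b = tmp > 8.64                       # False / True raster

# Con treats any nonzero cell as "true", so both masks pick the same cells
out_a = np.where(mask_a != 0, 0.0,
                 np.where((tmp <= 8.64) & (tmp >= 4.31),
                          (8.64 - tmp) / (8.64 - 4.31), 1.0))
out_b = np.where(mask_b, 0.0,
                 np.where((tmp <= 8.64) & (tmp >= 4.31),
                          (8.64 - tmp) / 4.33, 1.0))
print(np.allclose(out_a, out_b))  # True
```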
Posted 07-05-2017 04:10 AM

POST
Dan - I tried them both and it was indeed only taking one of the elements which was > 0.9, and therefore low values were given a wrong suitability, so much appreciated. Jayanta - I went back and tried the expression and it worked just fine and gave me the result I wanted as well, so it seems that both ways work fine, but I'm surprised it worked without the Float because originally it was producing an integer result. Either way it produced the NDVI layer that I wanted, so a big thank you to you both.
Posted 07-05-2017 03:48 AM

POST
Hey, forgive my lack of insight, but say for NDVI, I thought the statement Con((Float("spainndvi") < 0.13) & (Float("spainndvi") > 0.9), 0 would specify that any number that is lower than 0.13 and higher than 0.9 will be attributed a value of 0?
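For what it's worth, no cell can be below 0.13 and above 0.9 at the same time, so the & ("and") version never selects anything; | ("or") is what flags both tails. A tiny NumPy sketch with a hypothetical ndvi array:

```python
import numpy as np

ndvi = np.array([0.05, 0.2, 0.5, 0.95])  # hypothetical NDVI cells

# "and": no cell can satisfy both conditions, so nothing is selected
both = (ndvi < 0.13) & (ndvi > 0.9)
# "or": the low tail and the high tail are selected, as intended
either = (ndvi < 0.13) | (ndvi > 0.9)

print(both.any())    # False: the & test never fires
print(either.sum())  # 2: the 0.05 and 0.95 cells
```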
Posted 07-04-2017 05:41 PM

POST
Hmm, I had that before, but it was having issues because it was decimal, which is the reason I added Float; that appears to have fixed it, so thanks a lot Dan, very helpful as always.
Posted 07-04-2017 10:40 AM

POST
Messages
Executing: RasterCalculator Con((Float("spainndvi" < 0.13)) & (Float("spainndvi" > 0.9)), 0, Con((Float("spainndvi" >= 0.13)) & (Float("spainndvi" <= 0.32)), (Float("spainndvi" - 0.13)) / (0.32 - 0.13)), Con((Float("spainndvi" >= 0.6)) & (Float("spainndvi" <= 0.9)), (Float(0.9 - "spainndvi")) / (0.9 - 0.6), 1)) "E:\Users\GTAV\My Documents\ArcGIS\Default.gdb\rastercalc"
Start Time: Tue Jul 04 17:25:11 2017
Con((Float(Raster(r"spainndvi") < 0.13)) & (Float(Raster(r"spainndvi") > 0.9)), 0, Con((Float(Raster(r"spainndvi") >= 0.13)) & (Float(Raster(r"spainndvi") <= 0.32)), (Float(Raster(r"spainndvi") - 0.13)) / (0.32 - 0.13)), Con((Float(Raster(r"spainndvi") >= 0.6)) & (Float(Raster(r"spainndvi") <= 0.9)), (Float(0.9 - Raster(r"spainndvi"))) / (0.9 - 0.6), 1))
ERROR 000539: Error running expression: rcexec()
Traceback (most recent call last):
  File "<expression>", line 1, in <module>
  File "<string>", line 5, in rcexec
  File "e:\program files (x86)\arcgis\desktop10.5\arcpy\arcpy\sa\Functions.py", line 263, in Con
    where_clause)
  File "e:\program files (x86)\arcgis\desktop10.5\arcpy\arcpy\sa\Utils.py", line 53, in swapper
    result = wrapper(*args, **kwargs)
  File "e:\program files (x86)\arcgis\desktop10.5\arcpy\arcpy\sa\Functions.py", line 257, in Wrapper
    where_clause)
  File "e:\program files (x86)\arcgis\desktop10.5\arcpy\arcpy\geoprocessing\_base.py", line 510, in <lambda>
    return lambda *args: val(*gp_fixargs(args, True))
ExecuteError: ERROR 999999: Error executing function.
An invalid SQL statement was used.
An invalid SQL statement was used. [VAT_boole_ras]
An invalid SQL statement was used. [SELECT * FROM VAT_boole_ras WHERE E:\Users\GTAV\My Documents\ArcGIS\Default.gdb\ifthe_ras1]
Failed to execute (Con).
The table was not found. [VAT_ifthe_ras1]
Failed to execute (RasterCalculator).
Failed at Tue Jul 04 17:25:24 2017 (Elapsed Time: 12.36 seconds)
Posted 07-04-2017 08:32 AM

POST
Hey, so, having been frustrated over this for ages, I give up and will ask. I have a conditional statement I applied to an NDVI layer to apply a double-sided membership function to it. It is in 32-bit float as it is decimal. The following statement was used: Con((Float("spainndvi" < 0.13)) & (Float("spainndvi" > 0.9)), 0, Con((Float("spainndvi" >= 0.13)) & (Float("spainndvi" <= 0.32)), (Float("spainndvi" - 0.13)) / (0.32 - 0.13)), Con((Float("spainndvi" >= 0.6)) & (Float("spainndvi" <= 0.9)), (Float(0.9 - "spainndvi")) / (0.9 - 0.6), 1)) The odd thing is it worked when I last used it, but having misnamed the file I deleted it by accident, went to re-run the statement, and now it just gives me an error. Is there something obviously wrong with my statement? Thanks
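Two things stand out in the pasted expression (a hedged reading, going only by the text and the error log). First, the outer test uses & where | is needed: no cell is both < 0.13 and > 0.9. Second, a closing bracket is misplaced: the second Con is closed right after its rise term, so the third Con lands as a fourth argument to the second Con, which arcpy treats as a where_clause, hence the "invalid SQL statement" error. A NumPy sketch of the intended double-sided membership, with ndvi a hypothetical array standing in for the raster:

```python
import numpy as np

def ndvi_membership(ndvi):
    """Double-sided membership: 0 outside [0.13, 0.9], linear rise
    over [0.13, 0.32], plateau of 1, linear fall over [0.6, 0.9]."""
    return np.where(
        (ndvi < 0.13) | (ndvi > 0.9), 0.0,   # "or", not "and"
        np.where(
            (ndvi >= 0.13) & (ndvi <= 0.32),
            (ndvi - 0.13) / (0.32 - 0.13),   # rising edge
            np.where(
                (ndvi >= 0.6) & (ndvi <= 0.9),
                (0.9 - ndvi) / (0.9 - 0.6),  # falling edge
                1.0,                         # plateau between 0.32 and 0.6
            ),
        ),
    )

print(ndvi_membership(np.array([0.05, 0.225, 0.45, 0.75, 0.95])))
# expect 0, 0.5, 1, 0.5, 0
```

In Raster Calculator terms: the falling-edge Con has to sit inside the second Con's brackets as its false branch, rather than after them.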
Posted 07-04-2017 08:30 AM

POST
I suppose using Focal Statistics to calculate the sum, then Raster Calculator to multiply by 100 and divide by the number of cells.
Posted 05-30-2017 05:45 AM

POST
So I have been thinking hard about this... because although that would give me somewhat of a land cover suitability, it doesn't quite manage it in terms of habitat use. Even taking into account the statistics, I need to know more about what thresholds of land composition are most suitable. So I've decided to break them down into individual land covers in a simplified version containing grassland, cropland, forest etc. I need to work out what % of each land cover is considered most suitable within the home ranges of the kites, because saying one type of land cover is more suitable doesn't really cover the issue. Basically the best way I can see to do it is to create a custom filter whose size is determined by the average home range of my birds, so about 60 sq km. I want it to pass over, classifying each cell based on the surrounding cells, to give it a kind of % cover. So say it's a 3x3 pixel filter and 5 out of the 9 cells surrounding the centre cell are forest, it will give it a 100*5/9 value, doing that for each cell in the raster. Then I can plot it and fit a function to it. Any input on the best way of creating the filter? I hope it's not too confusing how I have written it.
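A rough NumPy sketch of that filter (class_id, radius and the edge padding are my assumptions, not from the thread): a binary mask of the target cover, a moving-window sum, then *100 / number of cells, the same arithmetic Focal Statistics (SUM) plus Raster Calculator would give.

```python
import numpy as np

def percent_cover(landcover, class_id, radius=1):
    """Percent of cells equal to class_id within a (2*radius+1)^2
    square moving window, computed for every cell of the raster."""
    mask = (landcover == class_id).astype(float)
    padded = np.pad(mask, radius, mode="edge")  # edge handling is a choice
    n = 2 * radius + 1
    out = np.zeros_like(mask)
    for dy in range(n):                         # accumulate the window sum
        for dx in range(n):
            out += padded[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out * 100.0 / (n * n)

lc = np.array([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 0]])                      # toy raster, 1 = forest
print(percent_cover(lc, 1)[1, 1])               # 3 of 9 cells -> 33.33...
```

For a ~60 sq km window on a 50 m grid that would be on the order of 150 cells across; in ArcGIS itself, Focal Statistics with an NbrRectangle or NbrCircle neighbourhood does the same thing natively.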
Posted 05-30-2017 04:53 AM

POST
Oh no, I meant that if you see another way that I am overlooking, I am fully open to suggestions, as stats is not my forte.
Posted 05-19-2017 09:14 AM

POST
Well, for that element I was thinking to take each variable and do them individually. Say altitude: plotting alt on the y-axis and cell count or area on the x-axis, and then creating a membership function for each parameter in that way... although I'm very open to criticism.
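One simple way to turn that plot into a membership function (a sketch under my own assumptions; the bin count and the max-normalisation are arbitrary choices, and a regression curve could be fitted to the result afterwards) is to histogram the altitude values inside the home ranges and rescale the frequencies to [0, 1]:

```python
import numpy as np

# hypothetical altitude values extracted from within the home ranges
alt = np.random.default_rng(0).normal(800.0, 150.0, 10_000)

counts, edges = np.histogram(alt, bins=30)
membership = counts / counts.max()       # frequency of use scaled to [0, 1]
bin_mid = (edges[:-1] + edges[1:]) / 2   # x: altitude, y: membership value
```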
Posted 05-19-2017 07:27 AM

POST
Basically I am trying to generate a fuzzy suitability map for habitat using data derived from GPS. The GPS points have been summarised into a "home range" using kernel density estimators and then merged to form a blotchy mask, which I used to extract the relevant data. What I have now is simply the data within the home ranges. For elements like DEM, NDVI etc. there are just values from the one parameter, and with those I will create membership functions using regression analysis (if there is a good method you know of then feel free to share), which I still need to work out. The Corine land cover layer is a different issue altogether. From masking off the data I can see that there is a strong preference for crop areas, but I cannot negate the other land covers as they are still significant. I cannot partition them into, say, 20x20 km squares, because an area might become more suitable taking into account both areas in the middle... I just don't really know the best way to approach this in order to give it suitability values.
Posted 05-19-2017 06:43 AM

POST
So I am in the process of creating a fuzzy suitability map for a species of bird, and for the most part it's fine and I have the values to create my functions. The only dataset that is causing me an issue is that of land cover. What I have done is create buffers to denote home range and then mask off the rest of the map so I only have the values of interest. Within these values I have simplified a Corine land cover layer to "forest", "cropland" etc. and have statistics regarding these covers. The part I am struggling with is how best to approach generating a fuzzy surface. Should I take each cover individually? Would a good approach be to take a "window" based on the mean home range and somehow classify the surface using that? Any insight would be greatly appreciated. John
Posted 05-19-2017 05:59 AM

POST
Unless it is the Z factor in the Slope tool that is causing the issue, but I am not so sure where to find out what the Z units are.
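For what it's worth, the Z factor only matters when the horizontal units differ from the elevation units: after projecting to UTM 30N both are metres, so it should simply be 1. It is only when running Slope on a DEM still in decimal degrees that a conversion is needed. A hedged sketch of the usual approximation (roughly 111,320 m per degree of longitude at the equator; the function name is mine):

```python
import math

def z_factor(lat_deg):
    """Approximate Slope z-factor when x,y are in decimal degrees but
    elevation is in metres: 1 / (metres per degree at this latitude)."""
    return 1.0 / (111320.0 * math.cos(math.radians(lat_deg)))

print(z_factor(40.0))  # ~1.17e-5 for central Spain
```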
Posted 04-15-2017 09:49 AM

POST
So the data I'm using is the GTOPO30 from EarthExplorer, via Get Data | The Long Term Archive. I am sure it happens somewhere during the projection from WGS to ETRS UTM 30N. I reworked the process from the original DEMs and got the same result. I don't really understand what's causing this, but perhaps there is another source of data that can provide me a reliable 50 m DEM of Spain...
Posted 04-15-2017 09:37 AM

POST
Oh sorry, to clarify, that's what I did: I didn't resample, I just used the Project tool and specified the output cells to be 50. I got the DEM from lta.cr.usgs.gov/gtopo30, which gave me two DEMs with a 30m resolution in WGS_1984, which I then mosaicked together and projected to UTM 30N using Project Raster with a specified cell size of 50 m. The originals seem to be fine, which is why I am baffled by the output.
Posted 04-14-2017 09:49 AM