IDEA
Hi all involved, it would be great if ArcMap had windows that are expandable, as in the image below. See the picture above of the Data Frame window, which is not expandable in all four directions. In some cases I need to expand this window: e.g. I need to scroll to see the coordinate systems of the three layers in the TOC, and I need to see them in full to compare whether any parameter or text differs for one of those three layers. If I had a window that I could expand upward and downward, comparing would be easy and I could catch subtle changes in the projection parameters while toggle-selecting the three layers. As it is, I have to copy the projection parameters for each layer into Notepad and compare them manually! Many software packages implement this feature to ease user interaction, e.g. eCognition by Trimble, as below:
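As a stop-gap for the Notepad workflow described above, the projection strings can be diffed programmatically rather than compared by eye. A minimal sketch using the standard-library difflib module; the two WKT strings here are hypothetical stand-ins for text copied from two layers' property dialogs:

```python
import difflib

# Hypothetical projection strings copied from two layers' property dialogs
wkt_a = 'PROJCS["UTM_Zone_45N",UNIT["Meter",1.0]]'
wkt_b = 'PROJCS["UTM_Zone_46N",UNIT["Meter",1.0]]'

# Diff token by token; only the changed parameters show up with -/+ markers
for line in difflib.unified_diff(wkt_a.split(','), wkt_b.split(','), lineterm=''):
    print(line)
```

Splitting on commas makes each projection parameter its own diff line, so a single changed zone or datum stands out immediately.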
02-26-2017 06:39 AM | 3 | 1 | 294 | POST
Hi, have you found your solution yet? I am facing the same problem. My setup is Windows 8 (64-bit), ArcMap 10.4.1, and Visual Studio 2013 Professional. I see the same error dialog while trying to install the .NET SDK.
10-02-2016 05:03 AM | 0 | 1 | 565 | POST
Thanks Dan and Xander, at last I solved the problem! My understanding of what solved it:
- Use separate scratch and output workspaces.
- Multiprocessing is imperative when there are many rasters to process (I am obliged to Dan for this concept). It is really helpful, though slow in comparison to other multiprocessing tasks; Esri engineers can tell more about it.
- Be careful about locks.
- Always clean the garbage in each workspace.
- Given many rasters to process, arcpy.gp.Times_sa can handle more than arcpy.sa.Times can. It is really absurd that the deprecated tool is more powerful than the new one (I call it a nascent babe, or at least an apple of Sodom).
My understanding about arcpy in raster processing, at least for the GRID format: arcpy cannot handle a task that involves more than about 3000 rasters at a stretch. Below is my script so far:
#!/usr/bin/python
# -*- coding: utf-8 -*-
# Author: Shariful Islam
# Contact: msi_g@yahoo.com
import arcpy, os, shutil, multiprocessing, re
from arcpy.sa import Con
from arcpy.sa import Raster

arcpy.env.overwriteOutput = True
arcpy.CheckOutExtension("spatial")
try:
    from openpyxl import load_workbook
except ImportError:
    raise Exception("install the openpyxl module in your Python system")

#================================================================================================#
# Change the settings below to suit your system. Try to keep the paths short,
# e.g. C:\OutputRaster, D:\InputTempFile.xlsx.
# Just change the paths, i.e. the right side of the assignments below, nothing else.
INPUT_TEMP_EXCEL_PATH = r"C:\test\Temprt1.xlsx"
INPUT_DEM_RASTER_PATH = r"C:\test\dem_clip_11"
OUTPUT_TEMP_RASTER_FOLDER = r"C:\test\myout1"
TEMP_FOLDER_PATH = r"C:\test\mytemp"
# Do not change anything below this line.
#================================================================================================#

Temp_Data = []
# Loading temperature data
temp_wb = load_workbook(filename=INPUT_TEMP_EXCEL_PATH, read_only=True)
temp_ws = temp_wb[temp_wb.sheetnames[0]]
for row in temp_ws.rows:
    d = []
    if len(row) > 3:
        for cell in row:
            if cell.value is None:
                pass
            elif cell.value == 0:
                d.append(0.000000)
            elif isinstance(cell.value, float):
                d.append(round(cell.value, 6))
            else:
                d.append(cell.value)
        Temp_Data.append(d)

# Process collected Excel data
Temp_Data = Temp_Data[1:]  # drop the header row
seen = set()
# Remove rows with duplicate dates, keeping the first occurrence
Temp_Data = [x for x in Temp_Data if x[0] not in seen and not seen.add(x[0])]

# Folder content deleter
def folder_content_deleter(folder_path):
    for the_file in os.listdir(folder_path):
        file_path = os.path.join(folder_path, the_file)
        try:
            if os.path.isfile(file_path):
                os.unlink(file_path)
            elif os.path.isdir(file_path):
                shutil.rmtree(file_path)
        except Exception:
            pass

# Folder deleter
def purge(dirpth, pattern):
    for f in os.listdir(dirpth):
        if re.search(pattern, f):
            pth = os.path.join(dirpth, f)
            shutil.rmtree(pth, ignore_errors=True)

# GDB content deleter
def gdb_content_deleter(wrkspc):
    for r, d, fls in arcpy.da.Walk(wrkspc, datatype="FeatureClass"):
        for f in fls:
            print f
            try:
                arcpy.Delete_management(os.path.join(r, f))
            except:
                pass

# Copy rasters and group them by year
def grouperByYear(input_folder_path, output_folder_path):
    for dirpath, dirnames, filenames in arcpy.da.Walk(input_folder_path, topdown=True, datatype="RasterDataset", type="GRID"):
        for filename in filenames:
            out_folder_name = re.findall(r'(?<=g)\d{4}', filename)[0]
            out_folder_path = os.path.join(output_folder_path, out_folder_name)
            if not os.path.exists(out_folder_path):
                print "Creating and populating folder for year %s ......" % out_folder_name
                os.mkdir(out_folder_path)
            in_data = os.path.join(dirpath, filename)
            ou_feature_name = 'g' + re.findall(r'(?<=g\d{4})\d{4}$', filename)[0]
            out_data = os.path.join(out_folder_path, ou_feature_name)
            arcpy.Copy_management(in_data, out_data)

# Processing second part
def times_worker(times_range_list):
    # Set temporary workspaces; the GRID format needs a GDB for intermediate data
    #folder_content_deleter(TEMP_FOLDER_PATH)
    scratch_db_name = "Scratch_" + str(times_range_list[0][0])
    arcpy.CreateFileGDB_management(out_folder_path=TEMP_FOLDER_PATH, out_name=scratch_db_name, out_version="CURRENT")
    scr_db = os.path.join(TEMP_FOLDER_PATH, scratch_db_name + ".gdb")
    arcpy.env.scratchWorkspace = scr_db
    # Set the output workspace
    out_db_name = "RData_" + str(times_range_list[0][0])
    #arcpy.CreateFileGDB_management(out_folder_path=OUTPUT_TEMP_RASTER_FOLDER, out_name=out_db_name, out_version="CURRENT")
    #out_db = os.path.join(OUTPUT_TEMP_RASTER_FOLDER, out_db_name + ".gdb")
    out_db = os.path.join(OUTPUT_TEMP_RASTER_FOLDER, out_db_name)
    if not os.path.exists(out_db):
        os.mkdir(out_db)
    arcpy.env.workspace = out_db
    for tdata in times_range_list:
        T1 = float('%.6f' % tdata[1])
        T2 = float('%.6f' % tdata[2])
        T3 = float('%.6f' % tdata[3])
        out_path = os.path.join(out_db, 'g' + str(tdata[0]))
        outRast_name = "in_memory\\%s" % out_db_name
        arcpy.MakeRasterLayer_management(INPUT_DEM_RASTER_PATH, outRast_name)
        output_second = Con(Raster(outRast_name) < 2573, T1, Con(Raster(outRast_name) <= 2754, T2, T3))
        final_temp_raster = output_second
        # Save
        final_temp_raster.save(out_path)
    # Cleaning
    gdb_content_deleter(scr_db)

def main(cu, worker, d_range):
    pool = multiprocessing.Pool(cu)
    pool.map(worker, d_range, 1)
    pool.close()
    pool.join()

if __name__ == '__main__':
    core_usage = 5
    chunk_size = 1000
    needed_cpu = int(round(len(Temp_Data) / chunk_size, 0) + 1)
    offsetter = list(divmod(needed_cpu, core_usage))
    cpu_distribution = [core_usage] * offsetter[0] + [offsetter[1]]
    cpu_distribution = [cp for cp in cpu_distribution if cp != 0]  # just remove zeros
    temp_data_range = [Temp_Data[i:i + chunk_size] for i in range(0, len(Temp_Data), chunk_size)]
    print ("Doing raster math. It may take up to 3-7 hours and may use your CPU at its highest. "
           "So stop using your CPU for this time. Go and enjoy elsewhere; let me do the job for you!..........")
    loopcnt = 0
    for cpu in cpu_distribution:
        temp_data_range_splitted = temp_data_range[loopcnt:loopcnt + cpu]
        if len(temp_data_range_splitted) > 0:
            main(cpu, times_worker, temp_data_range_splitted)
        loopcnt += cpu
    # Cleaning
    if arcpy.Exists("in_memory"):
        arcpy.Delete_management("in_memory")
    folder_content_deleter(TEMP_FOLDER_PATH)
    # Group by year
    print "\nGrouping raster math output by year for you. It may take 1-2 hours at best. So stay tuned!........\n"
    grouperByYear(OUTPUT_TEMP_RASTER_FOLDER, OUTPUT_TEMP_RASTER_FOLDER)
    # Delete unnecessary folders
    print "\nCleaning all unnecessary files........\n"
    purge(OUTPUT_TEMP_RASTER_FOLDER, r'RData_[0-9]{8}')
    print "All jobs finished! Now you are ready for the processing :)"
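The chunking step in the script above can be illustrated in isolation. This is a minimal sketch of splitting a long row list into fixed-size batches, one batch per multiprocessing worker invocation, using plain integers as stand-ins for the real temperature rows:

```python
def make_batches(rows, batch_size):
    # Slice the row list into consecutive batches of at most batch_size items,
    # so each worker receives one self-contained batch.
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

rows = list(range(10))          # stand-in for ~15,000 Excel rows
batches = make_batches(rows, 4)
print(batches)                  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

The last batch is simply shorter when the row count is not a multiple of the batch size, which is why the script filters out zero-sized CPU allocations.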
09-11-2016 08:41 AM | 1 | 0 | 1575 | POST
Thanks Dan! At last I tried to implement some of your advice. Is there any problem with reading the Excel file into a list and using that list later in the geoprocessing? I moved everything into the root folder and tried to delete all the intermediate files after they were used. I tried saving the rasters as TIFF files and checked that they align with the naming convention; in fact each raster name is 8 characters long, e.g. 19120205. But all with no success.
import arcpy, os, sys, shutil, time, csv
from arcpy.sa import Con
from arcpy.sa import Raster
from openpyxl import load_workbook

arcpy.env.overwriteOutput = True
arcpy.CheckOutExtension("spatial")

INPUT_TEMP_EXCEL_PATH = r"C:\test\Temprt.xlsx"        # arcpy.GetParameterAsText(0)
INPUT_DEM_RASTER_PATH = r"C:\test\dem_clip_11"        # arcpy.GetParameterAsText(1)
Second_Discrete_variable = 1                          # arcpy.GetParameterAsText(2)
OUTPUT_TEMP_RASTER_FOLDER = r"C:\test\myout"          # arcpy.GetParameterAsText(3)
TEMP_FOLDER_PATH = r"C:\test\mytemp"                  # arcpy.GetParameterAsText(4)

Second_Discrete_variable = float('%.6f' % Second_Discrete_variable)
Temp_Data = []
# Loading temperature data
temp_wb = load_workbook(filename=INPUT_TEMP_EXCEL_PATH, read_only=True)
temp_ws = temp_wb[temp_wb.sheetnames[0]]
for row in temp_ws.rows:
    d = []
    for cell in row:
        if cell.value is None:
            pass
        elif cell.value == 0:
            d.append(0.000000)
        elif isinstance(cell.value, float):
            d.append(round(cell.value, 6))
        else:
            d.append(cell.value)
    Temp_Data.append(d)

# Process collected Excel data
Temp_Data = Temp_Data[1:]  # drop the header row
seen = set()
# Remove rows with duplicate dates, keeping the first occurrence
Temp_Data = [x for x in Temp_Data if x[0] not in seen and not seen.add(x[0])]

# Folder content deleter
def folder_content_deleter(folder_path):
    for the_file in os.listdir(folder_path):
        file_path = os.path.join(folder_path, the_file)
        try:
            if os.path.isfile(file_path):
                os.unlink(file_path)
            elif os.path.isdir(file_path):
                shutil.rmtree(file_path)
        except Exception:
            pass

# GDB content deleter
def gdb_content_deleter(wrkspc):
    for r, d, fls in arcpy.da.Walk(wrkspc, datatype="FeatureClass"):
        for f in fls:
            print f
            try:
                arcpy.Delete_management(os.path.join(r, f))
                print "deleted scratch"
            except:
                pass

# Set temporary workspaces; the GRID format needs a GDB for intermediate data
folder_content_deleter(TEMP_FOLDER_PATH)
arcpy.CreateFileGDB_management(out_folder_path=TEMP_FOLDER_PATH, out_name="SDB_SR", out_version="CURRENT")
scr_db = os.path.join(TEMP_FOLDER_PATH, "SDB_SR.gdb")
arcpy.env.scratchWorkspace = scr_db

# Processing second part
counter = 0
for tdata in Temp_Data:
    T1 = float('%.6f' % tdata[1])
    T2 = float('%.6f' % tdata[2])
    T3 = float('%.6f' % tdata[3])
    out_folder_name = str(tdata[0])[0:4]
    out_folder = os.path.join(OUTPUT_TEMP_RASTER_FOLDER, out_folder_name)
    if not os.path.exists(out_folder):
        os.mkdir(out_folder)
    arcpy.env.workspace = out_folder
    out_path = os.path.join(out_folder, str(tdata[0]) + '.tif')
    output_second = Con(Raster(INPUT_DEM_RASTER_PATH) < 2573, T1, Con(Raster(INPUT_DEM_RASTER_PATH) <= 2754, T2, T3))
    # Apply map algebra; fall back to the Con result if both Times attempts fail
    final_temp_raster = output_second
    try:
        result = arcpy.gp.Times_sa(output_second, str(Second_Discrete_variable), "in_memory\\test_" + str(counter))
        final_temp_raster = Raster(result.getOutput(0))
    except:
        try:
            # Check whether a short wait helps with workspace locks
            time.sleep(15)
            result = arcpy.gp.Times_sa(output_second, str(Second_Discrete_variable), "in_memory\\test_" + str(counter))
            final_temp_raster = Raster(result.getOutput(0))
        except:
            # Log the rows that failed both attempts
            with open(os.path.join(TEMP_FOLDER_PATH, 'error.csv'), 'ab') as error_file:
                writr = csv.writer(error_file)
                writr.writerow(tdata)
    counter += 1
    # Save
    final_temp_raster.save(out_path)
    # Clean intermediates
    if arcpy.Exists("in_memory"):
        arcpy.Delete_management("in_memory")
    gdb_content_deleter(scr_db)

# Final cleaning
if arcpy.Exists("in_memory"):
    arcpy.Delete_management("in_memory")
folder_content_deleter(TEMP_FOLDER_PATH)
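The wait-and-retry pattern used around Times_sa above can be factored into a small helper. A hedged sketch, with a made-up flaky function standing in for the arcpy call:

```python
import time

def call_with_retry(func, retries=1, delay=0.1):
    # Try func(); on failure wait `delay` seconds and retry up to `retries`
    # more times. Return None if every attempt fails.
    for attempt in range(retries + 1):
        try:
            return func()
        except Exception:
            if attempt < retries:
                time.sleep(delay)
    return None

# Hypothetical flaky operation: fails on the first call, succeeds on the second
state = {'calls': 0}
def flaky():
    state['calls'] += 1
    if state['calls'] == 1:
        raise RuntimeError("transient lock")
    return "saved"

print(call_with_retry(flaky))  # saved
```

A helper like this keeps the retry bookkeeping out of the main raster loop, so the loop body only decides what to do when the result is None (e.g. append the row to error.csv).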
09-10-2016 01:59 PM | 0 | 2 | 1575 | POST
I am struggling to resolve the error RuntimeError: ERROR 010240: Could not save raster dataset, as shown in the attached screenshot. I am working with a DEM file and trying to generate rasters based on conditional values from an Excel file. As the Excel file has about 15,000 rows, I need to generate about 15,000 rasters. I am very disappointed that the script takes a long time and then fails, every time at a different point; this has been happening since yesterday. I am just reading a date and its three associated temperature values from the Excel file and applying a Con operation on the DEM (ArcGIS GRID) raster to generate another raster. This process is repeated for all the dates, i.e. the rows in the attached Excel file. My script is as below:
#!/usr/bin/python
# -*- coding: utf-8 -*-
import arcpy
import os
import sys
import shutil
from arcpy.sa import Con
from arcpy.sa import Raster
from openpyxl import load_workbook

arcpy.env.overwriteOutput = True
arcpy.CheckOutExtension('spatial')

INPUT_TEMP_EXCEL_PATH = \
    r"C:\Users\Winrock\Desktop\Ryan\Sept02ModularApproach\Temperature Model Data.xlsx"  # arcpy.GetParameterAsText(0)
INPUT_DEM_RASTER_PATH = \
    r"C:\Users\Winrock\Desktop\Ryan\Sept02ModularApproach\DEM\dem_clip_11"  # arcpy.GetParameterAsText(1)
Second_Discrete_variable = 10  # arcpy.GetParameterAsText(2)
OUTPUT_TEMP_RASTER_FOLDER = \
    r"C:\Users\Winrock\Desktop\Ryan\Sept02ModularApproach\OutputRaster"  # arcpy.GetParameterAsText(3)
TEMP_FOLDER_PATH = \
    r"C:\Users\Winrock\Desktop\Ryan\Sept02ModularApproach\Temp"  # arcpy.GetParameterAsText(4)
Second_Discrete_variable = float(Second_Discrete_variable)

Temp_Data = []
# Loading temperature data
temp_wb = load_workbook(filename=INPUT_TEMP_EXCEL_PATH, read_only=True)
temp_ws = temp_wb[temp_wb.sheetnames[0]]
for row in temp_ws.rows:
    rw = [cell.value for cell in row]
    Temp_Data.append(rw)
Temp_Data = Temp_Data[1:]  # drop the header row

# Folder content deleter
def folder_content_deleter(folder_path):
    for the_file in os.listdir(folder_path):
        file_path = os.path.join(folder_path, the_file)
        try:
            if os.path.isfile(file_path):
                os.unlink(file_path)
            elif os.path.isdir(file_path):
                shutil.rmtree(file_path)
        except Exception:
            pass

# Set temporary workspaces; the GRID format needs a GDB for intermediate data
folder_content_deleter(TEMP_FOLDER_PATH)
arcpy.CreateFileGDB_management(out_folder_path=TEMP_FOLDER_PATH,
                               out_name='ScratchData_solRaster',
                               out_version='CURRENT')
arcpy.env.workspace = arcpy.env.scratchWorkspace = \
    os.path.join(TEMP_FOLDER_PATH, 'ScratchData_solRaster.gdb')

# Processing second part
for tdata in Temp_Data:
    T1 = tdata[1]
    T2 = tdata[2]
    T3 = tdata[3]
    output_second = Con(Raster(INPUT_DEM_RASTER_PATH) < 2573, T1,
                        Con(Raster(INPUT_DEM_RASTER_PATH) <= 2754, T2,
                            T3))
    final_temp_raster = output_second + Second_Discrete_variable
    # Save
    out_path = os.path.join(OUTPUT_TEMP_RASTER_FOLDER, str(tdata[0]))
    final_temp_raster.save(out_path)

# Cleaning
if arcpy.Exists('in_memory'):
    arcpy.Delete_management('in_memory')
folder_content_deleter(TEMP_FOLDER_PATH)
My error in gist is: RuntimeError: ERROR 010240: Could not save raster dataset to C:\Users\Winrock\Desktop\Ryan\Sept02ModularApproach\Temp\ScratchData_solRaster.gdb\ifthe_ras with output format FGDBR. The Excel file I am using is https://www.dropbox.com/s/qacfhipo4ry7o2b/Temperature%20Model%20Data.xlsx?dl=0
N.B. I tried several threads, among them:
- What causes RuntimeError: ERROR 010240 saving after CellStatistics?
- Why does CON statement give ERROR 010240: Could not save raster dataset to (value) with output format GRID?
- error 010240 with output format grid arcgis 10.0
- python.multiprocessing and "FATAL ERROR (INFADI) MISSING DIRECTORY" - Geographic Information Systems Stack…
System specification:
Update:
- I tried with different scratch and current workspaces.
- I tried with arcpy.gp.Times_sa (it stops after processing almost 3000 rasters) and arcpy.sa.Times (it stops after processing almost 1070 rasters).
- I tried with arcpy.TestSchemaLock.
- I tried setting the output to TIFF format too.
I see that it stops and raises the error exactly when it has processed and output 1070 GRID files. It has been giving me pain for several days; please help!
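Outside of arcpy, the nested Con expression in the script can be sanity-checked on a small array. This is a sketch with NumPy standing in for the Spatial Analyst Con/Raster objects; the thresholds 2573 and 2754 come from the script, while the array values are made up:

```python
import numpy as np

def banded_temperature(dem, t1, t2, t3):
    # Same logic as Con(dem < 2573, t1, Con(dem <= 2754, t2, t3)):
    # below 2573 -> t1, from 2573 through 2754 -> t2, above 2754 -> t3
    return np.where(dem < 2573, t1, np.where(dem <= 2754, t2, t3))

dem = np.array([2000.0, 2600.0, 3000.0])
print(banded_temperature(dem, 1.5, 2.5, 3.5))  # [1.5 2.5 3.5]
```

Verifying the banding logic this way separates the map-algebra question from the workspace/locking question, which is where ERROR 010240 actually lives.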
09-09-2016 08:16 PM | 0 | 8 | 3619 | POST
Hi all, it may be a general problem, by the way. Does the problem of outputting geoprocessing results into a "Feature Dataset" still remain? I am failing to output the result of the arcpy.JSONToFeatures_conversion tool into a Feature Dataset in ArcMap 10.3 (Advanced). When I run the script, the output goes directly into the parent geodatabase, even though the projection of that Feature Dataset exactly matches the output of the tool. Thanks, Shariful
08-15-2015 10:25 AM | 0 | 0 | 125 | POST
Thanks, but what I need is only the selection order, because I have many consecutive numbers to populate the field with. I want this done automatically based on the selection. Are there any programming hints for C#? Please guide me; I am really bored of doing this manually. Thanks again...
02-18-2013 07:01 AM | 0 | 0 | 306 | POST
Thanks Anthony. What do you mean by "calculate value"? Would you clarify a bit? I want to populate "Parcel ID" in the order in which I select polygons in an editing session; I am in ArcMap 10.
02-15-2013 02:54 AM | 0 | 0 | 306 | POST
I have parcel maps where each polygon represents a parcel. These polygons were made by running "Feature To Polygon" on polylines produced by ArcScan auto-vectorization of TIFF files. The TIFFs contain parcel IDs in the form of several sets of consecutive numbers. Now I want to populate a field named "Parcel ID" in the polygon feature's attribute table based on my selection order in an editing session. In a nutshell, I want to select polygons following the serial of the parcel IDs scribed on the TIFFs (e.g. 555, 556, 557, 558) and then apply a field calculator script to populate the field with the consecutive numbers (555-558) via Python. The problem is that when I select polygons following the parcel ID serial, they are not recorded in the attribute table in that order but rather in ascending order of their FIDs/OIDs. I have to work on many such TIFFs; what can I do? Thanks in advance...
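The numbering step itself is straightforward once the OIDs are captured in click order. A minimal sketch, assuming a hypothetical list of OIDs recorded in the order the polygons were selected (capturing that order is the hard part, which is what the selection-order question is really about):

```python
def assign_parcel_ids(oids_in_click_order, start):
    # Map each OID to a consecutive parcel ID, following selection order
    # rather than ascending OID order.
    return {oid: start + i for i, oid in enumerate(oids_in_click_order)}

# Hypothetical OIDs clicked in the order of the IDs scribed on the TIFF
print(assign_parcel_ids([12, 7, 31, 4], 555))  # {12: 555, 7: 556, 31: 557, 4: 558}
```

With a mapping like this in hand, an update cursor (or field calculator) can look up each row's OID and write the matching parcel ID regardless of the order rows come back in.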
02-15-2013 01:34 AM | 0 | 5 | 410 | POST
Title | Kudos | Posted
---|---|---
 | 1 | 09-11-2016 08:41 AM
 | 3 | 02-26-2017 06:39 AM

Online Status: Offline
Date Last Visited: 11-11-2020 02:24 AM