POST | 07-21-2016 09:52 AM
Thanks for your help! I’m quite a novice with ModelBuilder, so it took me a while to figure out how to implement the iteration and to write the results to a table, but now I’m happy with the result (attached file: 1_Decommission_Monitoring_Stations). So far I’ve tested the model on 12 of the 137 iterations, and from that I can estimate that performing all 137 iterations will take about 3 hours (mainly due to the prediction of 21,000 points per iteration).

Implementing the tool “Create Geostatistical Layer” is indeed a little tricky: unfortunately, the input dataset can’t be connected directly but has to be chosen from a drop-down menu, where the required dataset “Input_XY_REF” appears 4 times (attached file: 2_Create_Geostatistical_Layer_Screenshot)! By trial and error I determined the right one: it includes the required selection, so that the candidate station is removed from the dataset. (Choosing one of the wrong ones results in the whole dataset being used, and therefore in the standard error staying the same…)

I’ve decided NOT to check the option “Always reset input dataset when the geostatistical model source changes”, as I think my Template Geostatistical Layer won’t have to change at all. Am I right?

Best regards, Maurice
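A side note on the drop-down ambiguity described above: in a Python/arcpy script, the feature layer that carries the selection can be handed to “Create Geostatistical Layer” directly, so there is no need to pick among four identically named entries. This is only a minimal sketch; the model-source XML, the paths, and the StationID field are assumptions for illustration.

    import arcpy

    model_source = r"C:\thesis\kriging_model.xml"     # model exported from the Geostatistical Wizard (assumed path)
    stations_fc = r"C:\thesis\data.gdb\Input_XY_REF"  # assumed path to the station dataset

    # A feature layer carries its selection into geoprocessing tools, so only
    # the selected (non-candidate) stations feed the geostatistical layer.
    lyr = arcpy.MakeFeatureLayer_management(stations_fc, "Input_XY_REF_sel").getOutput(0)
    arcpy.SelectLayerByAttribute_management(lyr, "NEW_SELECTION",
                                            "StationID <> 42")  # field name and ID are hypothetical

    ds = arcpy.GeostatisticalDatasets(model_source)
    ds.dataset1 = lyr  # the layer object itself, not a name chosen from a drop-down
    arcpy.GACreateGeostatisticalLayer_ga(model_source, ds, "ga_reduced")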
POST | 07-19-2016 06:22 AM
Hi Eric, thank you very much for your quick and helpful answer. I could now figure out the problem: unfortunately, I performed kriging as an exact interpolator and therefore didn’t use a nugget at all. So there was no kriging standard error at the measurement locations, despite the “Measurement Error” being set to 100% (100% of a zero nugget is still zero). As the tool “Densify Sampling Network” needs a measurement error, I altered my isotropic semivariogram model (Type: Stable; Parameter: 1.4; Range: 1,200 m; Partial Sill: 5.86 m) by adding:
- (I) a small nugget of 0.01 (which deteriorates the cross-validation results), and
- (II) an extremely small nugget of 0.00001 (with negligible effects on the cross-validation results).

The tool then ran successfully for both variants, creating two point feature classes, each containing the 137 measurement stations in order of decreasing StdErr values (which I interpreted as decreasing importance). As expected, variant (II) results in smaller StdErr values than variant (I). However, the rankings of the 137 stations are similar but not identical: there are minor differences between the two rankings, with a maximum difference of 5 ranks. So regarding my goal of identifying stations that could be decommissioned, the tool “Densify Sampling Network” gives me good hints, but I’m not fully satisfied.

Originally, I thought of performing the monitoring network reduction in another way: I intended to use a cross-validation procedure, sequentially removing each of the 137 stations and registering the increase of the mean kriging standard error over all prediction point locations (nearly 21,000 points on a regular grid). The station leading to the smallest increase of the mean kriging standard error would be decommissioned, and I would restart the procedure with the remaining 136 stations based on the new Geostatistical Layer. However, the manual procedure is too time-consuming.

There would also be a more straightforward way to the same result, using the mean kriging weights of the measurement stations: firstly, the kriging weights of the stations would need to be determined for every single prediction point location (which can be done manually for any single prediction point location on the Search Neighborhood page of the Geostatistical Wizard). Secondly, each station’s kriging weight would be averaged over all prediction point locations. The station with the smallest mean kriging weight would be decommissioned, and the procedure would then be repeated based on the new Geostatistical Layer. Unfortunately, again I see no way other than the completely unrealistic manual procedure…

Is there any chance to get one of my two intended procedures automated? Do you have any advice for my decommissioning goal? Cheers!
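Procedure (I) above, the leave-one-out loop, can in principle be scripted with arcpy rather than run by hand. Below is a rough sketch of a single elimination round under assumed names (the exported model-source XML, a StationID field, the prediction grid, and the StdError field that GA Layer To Points writes are all assumptions to verify); a full reduction would wrap this in an outer loop going from 137 to 136 stations and so on.

    import arcpy

    arcpy.env.overwriteOutput = True

    model_source = r"C:\thesis\kriging_model.xml"     # exported kriging model (assumed path)
    stations_fc = r"C:\thesis\data.gdb\Input_XY_REF"  # the 137 monitoring stations (assumed)
    grid_fc = r"C:\thesis\data.gdb\prediction_grid"   # ~21,000 regular grid points (assumed)

    def mean_std_error(ga_layer):
        """Predict onto the grid and return the mean kriging standard error."""
        arcpy.GALayerToPoints_ga(ga_layer, grid_fc, "", r"in_memory\pred")
        errs = [r[0] for r in arcpy.da.SearchCursor(r"in_memory\pred", ["StdError"])]
        return sum(errs) / len(errs)

    station_ids = [r[0] for r in arcpy.da.SearchCursor(stations_fc, ["StationID"])]

    scores = {}
    for sid in station_ids:
        # Leave one station out via a selection, as in the ModelBuilder model.
        lyr = arcpy.MakeFeatureLayer_management(stations_fc, "minus_one").getOutput(0)
        arcpy.SelectLayerByAttribute_management(lyr, "NEW_SELECTION",
                                                "StationID <> {0}".format(sid))
        ds = arcpy.GeostatisticalDatasets(model_source)
        ds.dataset1 = lyr
        arcpy.GACreateGeostatisticalLayer_ga(model_source, ds, "ga_minus_one")
        scores[sid] = mean_std_error("ga_minus_one")
        arcpy.Delete_management(lyr)

    # The station whose removal raises the mean standard error the least is
    # the next candidate for decommissioning; repeat with the reduced network.
    print(min(scores, key=scores.get))

With roughly 21,000 predictions per candidate this would still run for hours per elimination round, so a coarser prototype grid may help while testing the script.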
POST | 07-18-2016 10:04 AM
Hello, I am working on optimizing an existing monitoring well network in ArcGIS as part of my master’s thesis. I successfully used the tool “Densify Sampling Network” to add measurement locations in parts of the study area with a high kriging standard error. However, I can’t manage to use the tool to reduce the measurement locations in those parts of the study area where they have the least influence on the predictions. I keep receiving "Error 040043: Measurement error for data with coincident samples should not be equal to zero." (As parameters I used number_output_points: 5 and in_candidate_point_features: a point feature class containing all 137 monitoring wells.) What am I doing wrong? Thanks in advance for your help. Best regards, Maurice
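For reference, the failing call looks roughly like this in Python. The keyword names number_output_points and in_candidate_point_features are taken from the post itself; the output parameter name, the layer name, and the paths are assumptions to be checked against the Densify Sampling Network tool help.

    import arcpy

    # "kriging_layer" stands in for the geostatistical layer built in the Wizard.
    arcpy.DensifySamplingNetwork_ga(
        in_geostat_layer="kriging_layer",
        number_output_points=5,
        out_optimized_points=r"C:\thesis\data.gdb\ranked_wells",     # parameter name assumed
        in_candidate_point_features=r"C:\thesis\data.gdb\wells_137")
    # Error 040043 indicates that the kriging model carries a zero measurement
    # error / nugget; the follow-up post above resolves this by adding a nugget.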