POST
Is it possible, in the TimeSlider object of the JavaScript API, to lock the range between the two thumbs? Basically, to prevent the user from moving only one of the two thumbs? Thanks!
04-06-2016 03:15 AM | 0 | 0 | 1795
POST
Thanks Adam, that simplified things a lot. You actually don't need the JSON To Features tool: you can save the feature set directly to a shapefile. The code is below. However, I am still frustrated that I have to go through the service to get to the file instead of doing it in the background; GeoServer has that functionality built in. In any case, anyone who wants to use the script below will probably want to add a return string with the file name, according to their needs.

```python
import arcpy
import zipfile
import os
import glob
from random import randint

baseURL = arcpy.GetParameterAsText(0)
where = '1=1'
fields = '*'
token = ''

# The variables above construct the query
query = "?where={}&outFields={}&returnGeometry=true&f=json&token={}".format(where, fields, token)
fsURL = baseURL + query

fs = arcpy.FeatureSet()
fs.load(fsURL)
fs.save(os.path.join(arcpy.env.scratchFolder, "download_file.shp"))

# Randomize the file name
def random_with_N_digits(n):
    range_start = 10 ** (n - 1)
    range_end = (10 ** n) - 1
    return randint(range_start, range_end)

# Zip the shapefile's component files
def zipShapefile(inShapefile, newZipFN):
    if not os.path.exists(inShapefile):
        print(inShapefile + ' Does Not Exist')
        return False
    if os.path.exists(newZipFN):
        os.remove(newZipFN)
    if os.path.exists(newZipFN):
        return False
    zipobj = zipfile.ZipFile(newZipFN, 'w')
    for infile in glob.glob(inShapefile.lower().replace(".shp", ".*")):
        zipobj.write(infile, os.path.basename(infile), zipfile.ZIP_DEFLATED)
    zipobj.close()
    return True

randomname = str(random_with_N_digits(8))
InputShapeFile = os.path.join(arcpy.env.scratchFolder, "download_file.shp")
# Public folder. You will probably want to change this
OutputZipFile = "C:\\inetpub\\wwwroot\\Shapefiles\\" + randomname + "_file.zip"
zipShapefile(InputShapeFile, OutputZipFile)
print("done!")
```
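If you do add a return value, the query string can also be factored into a small helper. This is a sketch (the function name is mine, not from the script above), and the `arcpy.SetParameterAsText` line is shown as a comment because it only makes sense inside a published GP task:

```python
def build_query_url(base_url, where="1=1", fields="*", token=""):
    """Build the REST 'query' URL used above, returning all features as JSON."""
    return "{}?where={}&outFields={}&returnGeometry=true&f=json&token={}".format(
        base_url, where, fields, token)

# Inside the GP script, after zipping, the zip name could be returned with e.g.:
# arcpy.SetParameterAsText(1, randomname + "_file.zip")
```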
07-29-2015 08:06 AM | 1 | 1 | 276
POST
Hi all, as I wrote in the original post, the limitation of the above solutions is that you have to predefine the layers to clip/download. To work around this limitation, I wrote a small script on the server (I used C#, but anything will work) that provides a single service. The service takes a WFS link as input, builds a shapefile using the GDAL library, wraps it up in a zip file, puts it on the server, and returns a link to the file. Not the most efficient way to do it, and it might time out for large datasets, but I haven't found anything better. To avoid the timeouts you can, instead of returning a link to the file, send the user an email with the link when the zip file is ready. I have tested JS libraries that build the shapefile on the client, but they were slower and timed out a lot more. It would be great (and surely easy to develop) if this could be done with a built-in GP service. Cheers
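The service flow described above can be sketched in Python with the standard library only. This is not the author's C# code: the function and file names are illustrative, and the conversion step is injected as a callable because the real service would shell out to GDAL (e.g. ogr2ogr), which may not be installed everywhere:

```python
import os
import zipfile
from random import randint

def wfs_to_zip(wfs_url, out_dir, convert, n_digits=8):
    """Convert a WFS link to a zipped shapefile; return the zip's file name.

    `convert` is a callable (wfs_url, shp_path) -> list of written file paths;
    in a real service it would invoke GDAL. Injecting it keeps the
    wrap-and-link step independent of GDAL being available.
    """
    # Random digits in the name so one user cannot guess another's download link
    name = str(randint(10 ** (n_digits - 1), (10 ** n_digits) - 1)) + "_file.zip"
    shp_path = os.path.join(out_dir, "download.shp")
    written = convert(wfs_url, shp_path)
    zip_path = os.path.join(out_dir, name)
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
        for f in written:  # .shp, .shx, .dbf, .prj ...
            z.write(f, os.path.basename(f))
    return name
```

The returned name is what the service would hand back as a link (or put in the email mentioned above).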
07-29-2015 02:14 AM | 0 | 0 | 1136
POST
Thank you David, you helped me put things in order. I managed to solve it. There were a number of things going wrong:

- AGS kept locking the schema, so converting the model to Python helped.
- "in_memory" never worked for me, so I used the scratch folder instead.
- arcpy.env.workspace didn't work properly when I set it to the SDE either, so I used UNC path names with the awkward "//".
- The user running the AGS service had permission problems, even though he is an administrator of the machine running AGS and the owner of the SDE database. The SQL Server logs showed that no attempt was even made to connect to the database while the GP tool was running, yet the same user had no problems writing to local folders. This still remains a mystery to me; the solution was to run the process as another user. It was probably caused by our internal structure, but it would be nice to know exactly what permissions this user needs.
- To access tables in the SDE, the database name and the schema were necessary (check the script below).

Here is the script that worked. It used to be a lot prettier, but every time I try to beautify it, something goes wrong...
```python
import arcpy
import zipfile

# Script arguments
Input_zip = arcpy.GetParameterAsText(0)
arcpy.env.overwriteOutput = True

# Extract the uploaded zip into the scratch folder and find the .shp
fh = open(Input_zip, 'rb')
z = zipfile.ZipFile(fh)
for name in z.namelist():
    z.extract(name, arcpy.env.scratchFolder)
    if name.endswith('.shp'):
        result = arcpy.env.scratchFolder + "\\" + name
fh.close()

# Connect to the SDE geodatabase (SQL Server, OS authentication)
sdecon = arcpy.CreateDatabaseConnection_management(
    arcpy.env.scratchFolder, "tempcon", "SQL_SERVER", "xxxxxx.xxxx.xxxxx",
    "OPERATING_SYSTEM_AUTH", "", "*****", "SAVE_USERNAME", "GDB", "",
    "TRANSACTIONAL", "sde.DEFAULT", "")
outpath = arcpy.env.scratchFolder + "\\" + "tempcon.sde"

# Load the shapefile, merge it with the existing feature class,
# then swap the merged result in under the original name
arcpy.FeatureClassToFeatureClass_conversion(result, outpath, "NEWFILE")
newfile = outpath + "\\" + "GDB.SDE.NEWFILE"
oldfile = outpath + "\\" + "GDB.SDE.MAIN"
renamefile = outpath + "\\" + "RENAMEME"
arcpy.Merge_management([oldfile, newfile], renamefile)
fromname = outpath + "\\" + "GDB.SDE.RENAMEME"
arcpy.Delete_management(oldfile, "FeatureClass")
arcpy.Delete_management(newfile, "FeatureClass")
arcpy.Rename_management(fromname, "MAIN")
```

Thanks again. I hope some of the above will help other users.
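The unzip-and-find-the-shapefile step at the top of the script can be isolated into a small stdlib-only helper. A sketch, with a name of my choosing; note it returns None when no .shp is present, whereas the original would leave the variable unbound:

```python
import os
import zipfile

def extract_shapefile(zip_path, dest_folder):
    """Extract every entry of the zip; return the path of the .shp, if any."""
    shp_path = None
    with zipfile.ZipFile(zip_path) as z:
        for name in z.namelist():
            z.extract(name, dest_folder)
            if name.endswith('.shp'):
                shp_path = os.path.join(dest_folder, name)
    return shp_path
```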
06-26-2015 05:22 AM | 1 | 1 | 1079
POST
I very happily tried to publish the following model as a service and very quickly ran into trouble. It basically takes a zip file containing a shapefile, imports it into an SDE database (SQL Server), merges it with an existing feature class, and deletes the intermediate products. It works perfectly on my desktop, but after publishing the result as a service, the "Import to database" step gives the following error: "ERROR 000210: Cannot create output C:\arcgisserver\directories\arcgissystem\arcgisinput\TestFolder\Model5.GPServer\extracted\v101\sql06.sde\temp1 Failed to execute (Import to Database). Failed to execute (Model3). Failed to execute (merge_layer)." The database is registered with the AGS (10.3), and when I publish I don't get prompted to copy anything. Any ideas why AGS can't write to the database?
06-22-2015 03:56 AM | 0 | 5 | 5533
POST
As ArcGIS for Server has no built-in "download shapefile" (or any other format) functionality, the "clip and ship" geoprocessing example is suggested by many users and developers. However, I find it quite limiting. As far as I understand, the "Extract Data" tool first has to be run with a pre-defined set of layers as a parameter, just like the "Zion" example. What I (and many other developers) need is a script/service that takes an ArcGIS Server service as an input (or a relevant JS object, like a FeatureSet) and returns a shapefile (or any other format) as an output. Of course, I am referring to a client web application. I am using the JavaScript API, but if I am not mistaken, Flex developers face the same issue. Thanks.
02-09-2015 01:39 AM | 1 | 12 | 6546
POST
According to the Geoportal documentation, if the file identifier field is not filled in, it gets the value of DOCUUID: Database Tables · Esri/geoportal-server Wiki · GitHub. However, when I publish a record through the GPT publish client (from ArcCatalog), the file identifier field (FILEIDENTIFIER) remains blank. This is rather important, because if there is no file identifier, the CSW Client for ArcGIS cannot read the metadata record from the geoportal. Is there a way to fix this and have the FILEIDENTIFIER field assigned a code automatically?
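For illustration, the fallback behaviour the documentation describes could be sketched like this. This is not Geoportal code: the function is mine, it assumes FILEIDENTIFIER should inherit DOCUUID when blank, and the curly-brace format only mimics how DOCUUID values typically look:

```python
import uuid

def ensure_file_identifier(file_identifier, docuuid=None):
    """Return FILEIDENTIFIER, falling back to DOCUUID, then to a fresh UUID."""
    if file_identifier:
        return file_identifier
    if docuuid:
        # The documented behaviour: a blank file identifier takes DOCUUID
        return docuuid
    return "{" + str(uuid.uuid4()).upper() + "}"
```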
10-14-2014 06:07 AM | 0 | 0 | 3859
POST
I am also interested in this. Did you find any solution?
10-01-2014 05:18 AM | 0 | 0 | 338
POST
Dear all, I am trying to make a simple geocoder widget with autocomplete enabled, but I always get at most 5 suggested results. Is there a way to tweak that and get more? I am using code similar to this example: http://developers.arcgis.com/javascript/samples/locator_widget/ Thanks in advance! Periklis
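For reference, the 5-result cap matches the documented default of the geocoding service's `maxSuggestions` REST parameter; whether the widget exposes it is a separate question. A stdlib-only sketch of the request parameters (the helper name is mine):

```python
def suggest_params(text, max_suggestions=10):
    """Query-string parameters for a geocoding 'suggest' request.

    maxSuggestions caps the autocomplete results server-side; its
    documented default of 5 matches the behaviour described above.
    """
    return {"text": text, "f": "json", "maxSuggestions": max_suggestions}
```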
06-04-2014 09:03 AM | 0 | 2 | 3358
Kudos | Posted
---|---
1 | 07-29-2015 08:06 AM
1 | 06-26-2015 05:22 AM
1 | 02-09-2015 01:39 AM
Online Status: Offline
Date Last Visited: 02-09-2024 05:02 AM