# Import python modules
import arcpy
import os
import sys
import logging
import zipfile
from datetime import date
import time
import json
import configparser


def getConfigVariables():
    # Read the values from the configuration file
    # Initialise parser
    config = configparser.ConfigParser()
    config.read('config.ini')
    # Return dictionary of environment variables
    return config


def defineLogger(logFolder, loggerName, logFileName, level=logging.INFO):
    # Create log file
    logFilePath = os.path.join(logFolder, logFileName)
    logger = logging.getLogger(loggerName)
    if len(logger.handlers) == 0:
        hdlr = logging.FileHandler(logFilePath)
        formatter = logging.Formatter('[%(lineno)d - %(funcName)s()] %(asctime)s %(levelname)s %(message)s')
        hdlr.setFormatter(formatter)
        logger.addHandler(hdlr)
        logger.setLevel(level)
    return logger


def getDBconnectionParams(serverConfig):
    # Return the connection environment variables from the configuration file.
    return {"DatabasePlatform": serverConfig["Platform"],
            "Instance": serverConfig["Instance"],
            "AuthType": serverConfig["AuthType"],
            "Username": serverConfig["Username"],
            "Password": serverConfig["Password"]}


def buildSDEfile(serverConfig, out_folder):
    """If it does not exist yet, create a .sde connection with a name of the form SILVIA_<current date>."""
    out_name = serverConfig["Schema"] + ".sde"
    full_name = os.path.join(out_folder, out_name)
    if not os.path.exists(full_name):
        DBconnectionParams = getDBconnectionParams(serverConfig)
        arcpy.CreateDatabaseConnection_management(out_folder,
                                                  out_name,
                                                  DBconnectionParams["DatabasePlatform"],
                                                  DBconnectionParams["Instance"],
                                                  DBconnectionParams["AuthType"],
                                                  DBconnectionParams["Username"],
                                                  DBconnectionParams["Password"],
                                                  'SAVE_USERNAME')
    return full_name
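
# The layout below is a minimal sketch of the config.ini this script expects, inferred from the
# keys read by getDBconnectionParams() and buildSDEfile(). The [ServerConfig] section name and the
# keys come from the code; the values are illustrative placeholders only, not real settings.
#
#   [ServerConfig]
#   Platform = ORACLE
#   Instance = <tns alias or host/service name>
#   AuthType = DATABASE_AUTH
#   Username = <db user>
#   Password = <db password>
#   Schema = SILVIA2018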

def getWorkingFolders(wp):
    """Return the paths of the working folders used by the procedure. The procedure checks whether
    the folders exist on the AGS file system and, if they do not, creates them. The logs folder is
    where the procedure writes its log file. The shapefile folder is where the procedure generates
    the shapefiles and stores them temporarily."""
    return {
        "log_dir": os.path.join(wp, 'extract', 'logs', 'SILVIA'),
        "shapefile_dir": os.path.join(wp, 'extract', 'logs', 'SILVIA', 'shapefile'),
        "zipfile_dir": os.path.join(wp, 'netapp02trm', 'iit_share_pre', 'SITERLPRE', 'directories', 'arcgisoutput', 'silvia_zipfile'),
        "url_file": "https://sire.rtil1s.it/arcgis3/rest/directories/arcgisoutput/silvia_zipfile"
    }


def createWorkingFolder(folder):
    """Check whether a folder exists at the given path; if it does not, create it."""
    if not os.path.exists(folder):
        os.makedirs(folder)


# Define SDE connection file name
def SDE_cnn_name():
    """Return the name of the SDE connection registered with AGS."""
    return 'SILVIA2018.sde'


def VS_names(schema_name, vs_names):
    """Return the view names (vs_names) qualified with the Oracle user/schema name (schema_name)."""
    vs_list_name = []
    for name in vs_names:
        test = schema_name + '.' + name
        logger.debug(test)
        vs_list_name.append(test)
    logger.info(vs_list_name)
    return vs_list_name


def sendStepMessage(nStep, nTotal, msg):
    arcpy.AddMessage("({}/{}) {}".format(nStep, nTotal, msg))


def getTimeStamp():
    ts = time.gmtime()
    return time.strftime("%Y%m%d_%H%M%S", ts)


def getInMemoryWorkspace(table):
    '''
    Build the in_memory path for a table
    :param table:
    :return:
    '''
    in_memory_wk = ('in_memory',)
    in_memory_wk += (table,)
    return '\\'.join(in_memory_wk)


def checkIfFeatures(in_features, where_clause=None):
    cursor = arcpy.da.SearchCursor(in_features, 'OID@', where_clause)
    return any(cursor)


def makeFeatureLayer(in_features, out_layer, where_clause=None, workspace=None, field_info=None):
    arcpy.MakeFeatureLayer_management(in_features, out_layer, where_clause, workspace, field_info)


def copyFeature(in_features, out_feature_class, config_keyword=None, spatial_grid_1=None, spatial_grid_2=None, spatial_grid_3=None):
    arcpy.CopyFeatures_management(in_features, out_feature_class, config_keyword, spatial_grid_1, spatial_grid_2, spatial_grid_3)
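
# Illustrative behaviour of the two path helpers above; the schema and view names here are made up
# for the example and do not come from this script or its configuration:
#   VS_names('SILVIA2018', ['VS_AREE', 'VS_PUNTI'])  -> ['SILVIA2018.VS_AREE', 'SILVIA2018.VS_PUNTI']
#   getInMemoryWorkspace('SILVIA2018.VS_AREE')       -> 'in_memory\SILVIA2018.VS_AREE'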

def zipShp(in_folder, in_names, out_folder, archive_name, delete_shp=True, make_one_zip=True):
    '''Return the full path of the zip containing the shapefiles.
    If delete_shp=True, delete the shapefiles after zipping them.
    If make_one_zip=True, create a single archive of zips and delete the individual zips after
    moving them into the archive.'''
    # List of allowed extensions
    extensions = [".shp", ".shx", ".dbf", ".sbn", ".sbx", ".fbn", ".fbx",
                  ".ain", ".aih", ".atx", ".ixs", ".mxs", ".prj", ".xml"]
    # Write to arcgisoutput or to a folder exposed on the internet
    for in_name in in_names:
        logger.debug(in_name)
        zipfl = os.path.join(out_folder, in_name + '.zip')
        logger.info(out_folder)
        logger.info('Zip creation path => {}'.format(zipfl))
        logger.debug(zipfl)
        ZIP = zipfile.ZipFile(zipfl, "w")
        # Iterate over the files in in_folder
        for fl in os.listdir(in_folder):
            # Iterate over the extensions
            for extension in extensions:
                # Check whether the file is one of the allowed shapefile components
                if fl == in_name + extension:
                    inFile = os.path.join(in_folder, fl)
                    # Add the file to the zip
                    ZIP.write(inFile, fl)
                    break
        ZIP.close()
        logger.info('Zipped shp {0}'.format(in_name))
    if delete_shp:
        arcpy.env.workspace = os.path.join(in_folder)
        for in_name in in_names:
            shp_deleted = os.path.join(in_name + '.shp')
            arcpy.Delete_management(shp_deleted)
            logger.info('Deleted shp {}'.format(shp_deleted))
    if not make_one_zip:
        return zipfl
    else:
        archfl = os.path.join(out_folder, archive_name + '.zip')
        ARCH = zipfile.ZipFile(archfl, 'w', zipfile.ZIP_DEFLATED)
        for in_name in in_names:
            shpzip_path = os.path.join(out_folder, in_name + '.zip')
            shpzip_name = os.path.join(in_name + '.zip')
            ARCH.write(shpzip_path, shpzip_name)
            os.remove(shpzip_path)
            logger.info('Deleted zip {}'.format(shpzip_path))
        ARCH.close()
        return archfl


def removeSDEfilename(serverConfig, top):
    """Delete every .sde connection whose name differs from the current SILVIA connection."""
    current_sde = os.path.basename(buildSDEfile(serverConfig, top)).lower()
    for root, dirs, files in os.walk(top):
        for f in files:
            a, b = os.path.splitext(f)
            if b.lower() == '.sde' and f.lower() != current_sde:
                full_path = os.path.join(root, f)
                try:
                    os.remove(full_path)
                except Exception as ex:
                    logger.error(ex)
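
# The block below is the script-tool entry point. Parameter 0 is the ID_PRCDR value used in the
# where clause; parameter 1 is set to the JSON result. The JSON maps the archive name to its
# download URL (or to an error message when no view returns features), roughly of the form shown
# here; the id and timestamp values are only illustrative:
#   {"shapefile_export_123_20240101_120000":
#    "https://sire.rtil1s.it/arcgis3/rest/directories/arcgisoutput/silvia_zipfile/shapefile_export_123_20240101_120000.zip"}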

if __name__ == "__main__":

    # Get parameters from the user
    id = int(arcpy.GetParameterAsText(0))

    # Number of steps
    n = 13

    # Working Folder
    sendStepMessage(1, n, "Set base working directory")
    wp = arcpy.env.scratchFolder

    # Retrieve environment variables
    sendStepMessage(2, n, "Retrieve environment variables")
    config = getConfigVariables()

    # Set the working directories
    sendStepMessage(3, n, "Create path to working directories")
    workingFolders = getWorkingFolders(wp)

    # Create the working directories in case they are missing
    sendStepMessage(4, n, "Create working directories (if missing)")
    createWorkingFolder(workingFolders['log_dir'])
    createWorkingFolder(workingFolders['shapefile_dir'])
    createWorkingFolder(workingFolders['zipfile_dir'])

    # Define logs
    sendStepMessage(5, n, "Create logger of the processing tool")
    logger = defineLogger(workingFolders['log_dir'], 'silvia_tool', 'silvia_tool.log', level=logging.INFO)
    logger.info('Scratch Workspace => {}'.format(wp))

    # Build the connection to the SDE
    sendStepMessage(6, n, "Define and build the configuration of the connection to the db")
    wks_filename = buildSDEfile(config["ServerConfig"], wp)
    arcpy.env.workspace = wks_filename
    logger.info('SDE connection file => {}'.format(wks_filename))

    # Define the connection to the views
    sendStepMessage(7, n, "Define the paths to the views")
    fcs = VS_names(config['ServerConfig']['Schema'], config['ServerConfig']['Schema'])

    # Define the query
    sendStepMessage(8, n, "Define the query to the views")
    where_clause_ID_PRCDR = "ID_PRCDR = {}".format(id)
    logger.info('Where clause for the export => {}'.format(where_clause_ID_PRCDR))

    # Extraction procedure
    zip_file_url = {}
    shapefile_names = []
    try:
        if id > 0:
            sendStepMessage(9, n, "Create container for the extracted shapefiles")
            archive_name = "{0}_export_{1}_{2}".format('shapefile', id, getTimeStamp())
            i = 0
            for fc in fcs:
                i += 1
                sendStepMessage(11, n, "Extracting features from views ({}/{})".format(i, len(fcs)))
                # Path definition to the views
                fc_fullpath = os.path.join(os.path.join(wks_filename, 'SILVIA2018.sde'), fc)
                logger.info('FC => {}'.format(fc_fullpath))
                # Build the in_memory path for the feature layer
                fc_in_memory = getInMemoryWorkspace(fc)
                # Check whether any features fulfill the query
                feature = checkIfFeatures(fc_fullpath, where_clause=where_clause_ID_PRCDR)
                logger.info('FC {0} contains features that satisfy the export where clause => {1}'.format(fc_fullpath, str(feature)))
                # Define the output path for the generated shapefile
                shapefile_name = "{0}_export_{1}_{2}".format(fc.split('.')[1], id, getTimeStamp())
                shapefile_output_folder = workingFolders['shapefile_dir']
                shapefile_output_fullpath = os.path.join(shapefile_output_folder, shapefile_name)
                # Create a feature layer with the matching features and copy it to a shapefile
                if feature:
                    makeFeatureLayer(fc_fullpath, fc_in_memory, where_clause=where_clause_ID_PRCDR)
                    logger.info('Created in-memory feature layer => {}'.format(fc_in_memory))
                    copyFeature(fc_in_memory, shapefile_output_fullpath)
                    logger.info('Created temporary shp at path => {}'.format(shapefile_output_fullpath))
                    shapefile_names.append(shapefile_name)

            # Create the zip archive and URL for the different views used in the query
            sendStepMessage(12, n, "Creating url file path")
            out_zip_url = workingFolders['url_file']
            out_zip_folder = workingFolders['zipfile_dir']
            logger.debug(shapefile_names)
            if shapefile_names:
                # The make_one_zip and archive_name parameters were added to zipShp to handle a zip
                # of zips; shapefile_name is replaced by shapefile_names.
                zip_file_url[archive_name] = out_zip_url + '/' + os.path.basename(
                    zipShp(shapefile_output_folder, shapefile_names, out_zip_folder, archive_name, True, True))
                logger.info('Zipped shp archive {0}. Available at path {1} and at url {2}'.format(
                    archive_name, out_zip_folder, zip_file_url[archive_name]))
                returnJSON = json.dumps(zip_file_url)
            else:
                zip_file_url[archive_name] = 'Error. No feature for ID_PRCDR={}'.format(id)
                returnJSON = json.dumps(zip_file_url)
            logger.info(json.dumps(zip_file_url))

            # Return the link url as an output parameter
            sendStepMessage(13, n, "Created file url")
            arcpy.SetParameter(1, returnJSON)
    except Exception as ex:
        logger.error(ex)
        arcpy.AddError(str(ex))

    removeSDEfilename(config["ServerConfig"], wp)