def archive():
    # DCA - this should really be broken out into inputs vs outputs folders
    # so it's not all lumped together
    log("Archiving WHI inputs and outputs")

    # create new geodatabase
    archive_gdb = "WHI_archive_" + datetime.datetime.now().strftime('%Y%m%d')
    full_path = os.path.join(config.archive_loc, archive_gdb + ".gdb")
    if not arcpy.Exists(full_path):
        arcpy.CreateFileGDB_management(config.archive_loc, archive_gdb)

    # copy input files into the geodatabase
    log("...archiving inputs")
    # vector sources - verify every source exists, then copy the full list once
    # (the original returned inside the loop, which skipped archiving outputs)
    for fc in config.vect_archive_list:
        if not arcpy.Exists(fc):
            return str(fc) + " not found"
    log("......vectors")
    arcpy.FeatureClassToGeodatabase_conversion(config.vect_archive_list, full_path)

    # exclude rasters for now - they are going to be pretty static and take forever to copy
##    for fc in config.rast_archive_list:
##        if arcpy.Exists(fc):
##            log("......rasters")
##            arcpy.RasterToGeodatabase_conversion(config.rast_archive_list, full_path)
##        else:
##            return str(fc) + " not found"

    # copy output files into the geodatabase
    log("...archiving outputs")
    # table outputs
    log("......tables")
    arcpy.env.workspace = config.primary_output
    tables = arcpy.ListTables()
    arcpy.TableToGeodatabase_conversion(tables, full_path)
    # feature class outputs
    log("......feature class(es)")
    fcs = arcpy.ListFeatureClasses()
    arcpy.FeatureClassToGeodatabase_conversion(fcs, full_path)
    log("Archiving complete")
def write_gdb(self):
    # todo: clean this up a bit
    # create a list of feature classes in the workspace to
    # transfer over to the newly created geodatabase
    arcpy.CreateFileGDB_management(self.__folderout, self.__outname)
    #print arcpy.ListFeatureClasses('', 'All', arcpy.ListDatasets()[0])
    geodbpath = self.__folderout + "\\" + self.__outname
    # copy feature classes held inside feature datasets
    for f in arcpy.ListDatasets():
        fcList = arcpy.ListFeatureClasses('', 'All', f)
        if len(fcList) > 0:
            arcpy.FeatureClassToGeodatabase_conversion(fcList, geodbpath)
    # copy stand-alone feature classes at the workspace root
    fcList = arcpy.ListFeatureClasses()
    arcpy.FeatureClassToGeodatabase_conversion(fcList, geodbpath)
def create_layer_minsk_ate(self):
    """
    Create layer ATE, which contains the boundary of Minsk, from layer rb.shp
    (GCS Pulkovo-1942).
    :return: ATE (boundary of Minsk), WGS-84
    """
    shp = r'{0}\{1}1.shp'.format(self.path_to_layer_ate, self.name_district)
    shp_sk = r'{0}\ATE.shp'.format(self.path_to_layer_ate)
    tempEnvironment0 = arcpy.env.outputCoordinateSystem
    arcpy.env.outputCoordinateSystem = sk_42
    arcpy.Select_analysis(
        os.path.join(self.path_to_layer_ate, self.name_layer_ate), shp,
        "REGN = 17030")
    arcpy.env.outputCoordinateSystem = tempEnvironment0
    arcpy.Project_management(shp, shp_sk, wgs84, "CK42_to_ITRF2005", sk_42)
    # remove the intermediate selection shapefile
    for root, dirs, files in os.walk(self.path_to_layer_ate):
        for file in files:
            if file.find('1') > -1:
                os.remove(r'{0}\{1}'.format(self.path_to_layer_ate, file))
    try:
        arcpy.FeatureClassToGeodatabase_conversion(shp_sk, self.nameDataSet)
    except:
        print "This layer is already in the database"
    # delete the ATE shapefiles
    for root, dirs, files in os.walk(self.path_to_layer_ate):
        for file in files:
            if file.find('ATE') > -1:
                os.remove(r'{0}\{1}'.format(self.path_to_layer_ate, file))
def ConvertMDBtoGDB(mdb, gdb):
    tables = []
    features = []
    FCs = ListFCinGDBorMDB(mdb)
    #print(FCs)
    for fc in FCs:
        ft = arcpy.Describe(mdb + '\\' + fc).dataElementType
        print(ft)
        if ft == 'DEFeatureClass':
            features.append(mdb + '\\' + fc)
        if ft == 'DETable':
            tables.append(mdb + '\\' + fc)
    if os.path.exists(gdb):
        try:
            shutil.rmtree(gdb)
        except Exception as e:
            print(e)
    # create the target file GDB; name it after the target path
    # (the original named it after the source mdb, which only works when
    # both happen to share the same basename)
    arcpy.CreateFileGDB_management(os.path.dirname(gdb),
                                   os.path.basename(gdb).split('.')[0])
    if len(features) > 0:
        arcpy.FeatureClassToGeodatabase_conversion(features, gdb)
    if len(tables) > 0:
        arcpy.TableToGeodatabase_conversion(tables, gdb)
    outL = features
    outL.extend(tables)
    return outL
def ImportZonalLayer(InZoneLayer, UPlanGDB, LongName):
    '''
    *Imports a zonal layer to the GDB
    *Adds a record to the upc_layers table for this zonal layer

    Calls: AddToUPC_Layers
    Called by: Import Zone Layer Toolbox

    Arguments:
    InZoneLayer: The layer to be added to the GDB as a zonal layer
    UPlanGDB: The UPlan GDB (where the layer will be imported)
    LongName: The descriptive name of the layer

    Returns: None
    '''
    # Set workspace
    env.workspace = UPlanGDB

    # strip a trailing .shp extension, if present
    if os.path.basename(InZoneLayer)[-4:] == ".shp":
        SHPName = os.path.basename(InZoneLayer)[:-4]
    else:
        SHPName = os.path.basename(InZoneLayer)

    # Add layer to geodatabase
    arcpy.FeatureClassToGeodatabase_conversion(InZoneLayer, UPlanGDB)

    # add zonal layer to layer tracker table
    AddToUPC_Layers(UPlanGDB, SHPName, LongName, 'ZonalSummary')
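The `.shp`-stripping pattern above recurs in several of these import helpers; `ntpath.splitext` expresses it without the fixed `[-4:]` slice and handles any extension. A minimal standalone sketch (the helper name `derive_fc_name` is illustrative, not part of UPlan; `ntpath` keeps Windows path semantics on any OS):

```python
import ntpath  # Windows path rules regardless of the OS running the snippet

def derive_fc_name(path):
    # basename without its extension; works for .shp or any other suffix
    return ntpath.splitext(ntpath.basename(path))[0]

print(derive_fc_name(r"C:\data\parcels.shp"))  # parcels
print(derive_fc_name("roads"))                 # roads
```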
def shpToGDB():
    """
    Copies shapefiles from a folder into a file GDB.
    Returns: None (feature classes are written to the GDB)
    """
    # set local environment for this function
    arcpy.env.workspace = r'C:\Users\JBurton_AOR\Documents\ArcGIS\Projects\Oahu\daily_shapefiles'
    arcpy.env.overwriteOutput = True

    # create list to use in for loop
    shpList = arcpy.ListFeatureClasses()
    print(shpList)

    out_GDB = r'C:\Users\JBurton_AOR\Documents\ArcGIS\Projects\Oahu\Daily_Coverage.gdb'
    arcpy.Delete_management(out_GDB)
    print('Daily_Coverage.gdb deleted')
    path = r'C:\Users\JBurton_AOR\Documents\ArcGIS\Projects\Oahu'
    name = r'Daily_Coverage.gdb'
    arcpy.CreateFileGDB_management(path, name)
    print('Daily_Coverage.gdb created')

    # for loop runs the tool on each shapefile in the folder
    for sf in shpList:
        arcpy.FeatureClassToGeodatabase_conversion(sf, out_GDB)
        print(sf + ' was exported')
    print('Shapefiles were exported to the GDB')
def import_model_shapefile(shapefile, output_geodatabase):
    """
    Import a model result shapefile into a geodatabase
    :param shapefile: path to the source shapefile
    :type shapefile: str
    :param output_geodatabase: path to the target geodatabase
    :type output_geodatabase: str
    :return: path of the imported feature class
    :rtype: str
    """
    try:
        _logger.info("asserting values")
        assert shapefile is not None
        assert output_geodatabase is not None
        arcpy.env.overwriteOutput = True
        shape = shapefile
        _logger.info("Adding {} to {}".format(shape, output_geodatabase))
        shapefile_name = str(shape).split("\\")
        out_feature = "{}\\{}".format(
            output_geodatabase,
            shapefile_name[-1].replace(".shp", ""))
        _logger.debug(out_feature)
        if arcpy.Exists(out_feature):
            _logger.info("feature exists - deleting old feature")
            arcpy.Delete_management(out_feature)
        arcpy.FeatureClassToGeodatabase_conversion(shape, output_geodatabase)
        return out_feature
    except Exception, e:
        _logger.error(e.message)
def ImportGPLayer(InGPLayer, UPlanGDB, LongName):
    '''
    *Imports a General Plan layer to the GDB
    *Adds a record to the upc_layers table for this attractor layer

    Calls: AddToUPC_Layers
    Called by: Import General Plan Toolbox

    Arguments:
    InGPLayer: The layer to be added to the GDB as a general plan
    UPlanGDB: The UPlan GDB (where the layer will be imported)
    LongName: The descriptive name of the layer
    currently disabled...
    Timesteps: What timestep(s) to assign this general plan layer to

    Returns: None
    '''
    # Import a general plan layer, record its role, and record the gp class field name.
    arcpy.env.workspace = UPlanGDB

    # strip a trailing .shp extension, if present
    if os.path.basename(InGPLayer)[-4:] == ".shp":
        SHPName = os.path.basename(InGPLayer)[:-4]
    else:
        SHPName = os.path.basename(InGPLayer)

    # Add layer to geodatabase
    arcpy.FeatureClassToGeodatabase_conversion(InGPLayer, UPlanGDB)

    # add general plan layer to layer tracker table
    AddToUPC_Layers(UPlanGDB, SHPName, LongName, 'GeneralPlan')
def preProcess(datetime, province, infiles):
    # create the working directory based on datetime
    cwd = os.getcwd()
    workpath = ''.join([cwd, u"/temp/", province, '/', datetime])
    if not os.path.exists(workpath):
        os.makedirs(workpath)
    workspace = ''.join([workpath, '/GDB.gdb'])
    if not arcpy.Exists(workspace):
        arcpy.CreateFileGDB_management(workpath, 'GDB.gdb')
    arcpy.env.overwriteOutput = True
    text_table = ''.join([workspace, "/", u"data", datetime, u".txt"])
    writeOriginData(infiles, text_table)
    database_path = generateDatabase(text_table, province)

    # build a personal geodatabase so the data can be queried with SQL
    if not arcpy.Exists(''.join([workpath, '/SQL.mdb'])):
        arcpy.CreatePersonalGDB_management(workpath, "SQL.mdb")
    # import the feature class from the file GDB into the personal GDB
    arcpy.FeatureClassToGeodatabase_conversion(
        database_path, ''.join([workpath, '/SQL.mdb']))

    # read the area of each prefecture-level city in the province; the areas
    # are precomputed in the map data - this only reads them
    province_area = {}
    province_feature = ''.join(
        [cwd, u'/data/LightningBulletin.gdb/', province, u'_分区'])
    with SearchCursor(province_feature, ["Region", "area"]) as cursor:
        for row in cursor:
            province_area[row[0]] = row[1]
    f = open(os.path.join(workspace, 'province_area.pkl'), 'wb')
    pickle.dump(province_area, f, pickle.HIGHEST_PROTOCOL)
    f.close()
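The function ends by persisting the per-city area dictionary with `pickle`. That round trip can be sketched on its own (the sample values and temp-file location are illustrative, not real data):

```python
import os
import pickle
import tempfile

province_area = {u"CityA": 348.8, u"CityB": 145.0}  # sample values only

path = os.path.join(tempfile.gettempdir(), "province_area.pkl")
with open(path, "wb") as f:
    # HIGHEST_PROTOCOL gives the most compact binary encoding
    pickle.dump(province_area, f, pickle.HIGHEST_PROTOCOL)

with open(path, "rb") as f:
    restored = pickle.load(f)

print(restored == province_area)  # True
```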
def extractFeatureServices():
    # Prep GDB
    if not arcpy.Exists(outputGDB):
        arcpy.CreateFileGDB_management(root, "YourGDB.gdb")

    # Delete feature classes that will be extracted
    cleanGDB()

    # Build the list of feature classes to extract
    for feature in serviceList:
        dataList.append(serviceDictionary[feature])

    # Export from SDE to local GDB
    arcpy.FeatureClassToGeodatabase_conversion(dataList, outputGDB)

    # Features to Proper Case
    # ********** This part doesn't work **************
    if "Parks" in serviceList:
        renameFeature("parks", "Parks")
    if "AddressPoints" in serviceList:
        renameFeature("address", "AddressPts")
    if "AgSecurity" in serviceList:
        renameFeature("Agsecurity", "AgSecurity")
    if "UGA" in serviceList:
        renameFeature("Uga_poly", "UGA")

    # Rename parcel_poly to parcels and add Parcel_Name
    if "Parcels" in serviceList:
        renameFeature("parcel_poly", "Parcels")
        modifyParcelData()
        if postOwnerName == "Yes":
            processParcelDetails()
def import_shapefiles_to_gdb(self, wildcard=None):
    shplist = get_path.pathFinder.get_shapefile_path_wildcard(
        self.input_path, wildcard)
    print("\nI found {} files to import!!!".format(len(shplist)))
    try:
        for x in shplist:
            name = os.path.split(x)[1]
            # os.path.splitext avoids the str.strip(".shp") pitfall:
            # strip removes any of the characters '.', 's', 'h', 'p'
            # from both ends, mangling names like "ships.shp"
            fc_name = os.path.splitext(name)[0]
            output = os.path.join(self.outputGDB, fc_name)
            print(output)
            logging.info("Importing: {}".format(fc_name))
            if arcpy.Exists(output):
                print("exists, passing over this fc")
                logging.warning(
                    "{} exists, passing over this fc".format(fc_name))
            else:
                arcpy.FeatureClassToGeodatabase_conversion(x, self.outputGDB)
                print(arcpy.GetMessages(0))
                logging.info(arcpy.GetMessages(0))
    except:
        tb = sys.exc_info()[2]
        tbinfo = traceback.format_tb(tb)[0]
        pymsg = ("PYTHON ERRORS:\nTraceback info:\n" + tbinfo +
                 "\nError Info:\n" + str(sys.exc_info()[1]))
        msgs = "ArcPy ERRORS:\n" + arcpy.GetMessages(2) + "\n"
        arcpy.AddError(pymsg)
        arcpy.AddError(msgs)
        print(pymsg)
        print(msgs)
        logging.error(pymsg)
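The snippet above derives the feature-class name from the shapefile name with `name.strip(".shp")`. `str.strip` removes a *set* of characters from both ends, not a suffix, so some names get mangled. A quick demonstration of the pitfall and the `os.path.splitext` fix:

```python
import os

name = "ships.shp"
# strip treats ".shp" as the character set {'.', 's', 'h', 'p'} and
# peels those characters off both ends of the string
print(name.strip(".shp"))         # i
# splitext removes only the final extension
print(os.path.splitext(name)[0])  # ships
```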
def multipleGdb(self, name):
    arcpy.env.workspace = self.path
    arcpy.env.overwriteOutput = True
    arcpy.MakeFeatureLayer_management(os.path.join(self.path, layer), name)
    arcpy.SelectLayerByAttribute_management(name, "NEW_SELECTION",
                                            self.filter_string)
    if arcpy.Exists(os.path.join(self.gdb_path, name)):
        arcpy.Delete_management(os.path.join(self.gdb_path, name))
    arcpy.FeatureClassToGeodatabase_conversion(name, self.gdb_path)
def replicateDatabase(dbConnection, targetGDB):
    log = logging.getLogger("backup_log")
    startTime = time.time()

    if arcpy.Exists(dbConnection):
        featureSDE, cntSDE = getDatabaseItemCount(dbConnection)
        log.info("Geodatabase being copied: %s -- Feature Count: %s" %
                 (dbConnection, cntSDE))
        if arcpy.Exists(targetGDB):
            featureGDB, cntGDB = getDatabaseItemCount(targetGDB)
            log.info("Old Target Geodatabase: %s -- Feature Count: %s" %
                     (targetGDB, cntGDB))
            try:
                shutil.rmtree(targetGDB)
                log.info("Deleted Old %s" % (os.path.split(targetGDB)[-1]))
            except Exception as e:
                log.exception(e)

        GDB_Path, GDB_Name = os.path.split(targetGDB)
        log.info("Now Creating New %s" % (GDB_Name))
        arcpy.CreateFileGDB_management(GDB_Path, GDB_Name)

        arcpy.env.workspace = dbConnection
        log.info("Start fetching the feature class list \n")
        print("Start fetching the feature class list \n")

        for fc in arcpy.ListFeatureClasses():
            if "GIS_ALL" not in fc:
                print("Skipped a feature that is not from GIS_ALL")
                log.info("Skipped a feature that is not from GIS_ALL")
                continue
            try:
                print("Attempting to Copy %s to %s" % (fc, targetGDB))
                log.info("Attempting to Copy %s to %s" % (fc, targetGDB))
                #arcpy.FeatureClassToGeodatabase_conversion(featureClasses, targetGDB)
                arcpy.FeatureClassToGeodatabase_conversion(fc, targetGDB)
                print("Finished copying %s to %s \n" % (fc, targetGDB))
                log.info("Finished copying %s to %s \n" % (fc, targetGDB))
            except Exception as e:
                print(e.message)
                print("Unable to copy %s to %s" % (fc, targetGDB))
                log.info("Unable to copy %s to %s" % (fc, targetGDB))
                log.exception(e)
                continue

        featGDB, cntGDB = getDatabaseItemCount(targetGDB)
        print("Completed replication of %s -- Feature Count: %s" %
              (dbConnection, cntGDB))
        log.info("Completed replication of %s -- Feature Count: %s" %
                 (dbConnection, cntGDB))
    else:
        print("{0} does not exist or is not supported! "
              "Please check the database path and try again.".format(dbConnection))
        log.info("{0} does not exist or is not supported! "
                 "Please check the database path and try again.".format(dbConnection))
def DodanieDoBazywarstwzshp():
    # Polish: "add layers from shapefiles to the geodatabase";
    # raw strings keep the Windows backslashes literal
    inFeaturesdrogi = [
        r'D:\mgr\M33033CD\droga\ZL101.shp',
        r'D:\mgr\M33033CD\droga\ZL103.shp',
        r'D:\mgr\M33033CD\droga\ZL104.shp',
        r'D:\mgr\M33033CD\droga\ZL105.shp',
        r'D:\mgr\M33033CD\droga\ZL106.shp',
        r'D:\mgr\M33033CD\droga\ZL107.shp',
        r'D:\mgr\M33033CD\droga\ZL108.shp',
        r'D:\mgr\M33033CD\droga\ZL018.shp',
        r'D:\mgr\M33033CD\Rzeka_L.shp',
        r'D:\mgr\M33033CD\Las.shp'
    ]
    arcpy.FeatureClassToGeodatabase_conversion(inFeaturesdrogi, outLocation)
def execute(self, parameters, messages):
    # create GDB
    folder = parameters[0].valueAsText
    name = parameters[1].valueAsText
    arcpy.CreateFileGDB_management(folder, name)
    gdb_path = folder + '\\' + name

    # create garages layer from the X/Y table, add to GDB
    garage_location = parameters[3].valueAsText
    garage_shp_name = parameters[4].valueAsText
    garages = arcpy.MakeXYEventLayer_management(garage_location, 'X', 'Y',
                                                garage_shp_name)
    arcpy.FeatureClassToGeodatabase_conversion(garages, gdb_path)
    garage_path = gdb_path + '\\' + garage_shp_name

    # create buildings feature class given the Structures layer in Campus
    campus_gdb_path = parameters[2].valueAsText
    structures = campus_gdb_path + '\\Structures'
    campus_buildings = gdb_path + '\\campus_buildings'
    arcpy.Copy_management(structures, campus_buildings)

    # reproject garages to the spatial reference of campus buildings
    projection = arcpy.Describe(campus_buildings).spatialReference
    arcpy.Project_management(garage_path, gdb_path + '\\garage_projected',
                             projection)
    garage_projected = gdb_path + '\\garage_projected'

    # get garage to buffer and buffer distance
    garage_selection = parameters[5].valueAsText
    buffer_distance = float(parameters[6].valueAsText)

    # make sure the garage exists
    where = "Name = '%s'" % garage_selection
    cursor = arcpy.SearchCursor(garage_projected, where_clause=where)
    shouldProceed = False
    for row in cursor:
        if row.getValue('Name') == garage_selection:
            shouldProceed = True

    if shouldProceed:
        # generate the name for the buffer layer
        garage_buff = r'\garage_%s_buffed_%s' % (garage_selection,
                                                 buffer_distance)
        # get a reference to the selected garage (the original omitted the
        # path separator before 'building_%s')
        garageFeature = arcpy.Select_analysis(
            garage_projected, gdb_path + r'\building_%s' % garage_selection,
            where)
        # buffer the selected garage
        garage_buffered = arcpy.Buffer_analysis(
            garageFeature, gdb_path + garage_buff, buffer_distance)
        # intersect the garage buffer with campus buildings
        arcpy.Intersect_analysis(
            [gdb_path + garage_buff, campus_buildings],
            gdb_path + '\\garage_building_intersection', 'ALL')
        # convert to csv (tables inside a file GDB carry no .dbf extension)
        arcpy.TableToTable_conversion(
            gdb_path + '\\garage_building_intersection',
            'C:\\Users\\Eileen\\Documents\\lab 5', 'nearbyBuildings.csv')
    else:
        messages.addErrorMessage('garage not found')
        raise arcpy.ExecuteError
    return
def createFC(path, shpFile):
    import arcpy
    import os
    from arcpy import env
    env.workspace = path
    # check against shpFile (the original tested an undefined rasterFile)
    if arcpy.Exists(shpFile):
        return shpFile + ' already exists!'
    else:
        arcpy.FeatureClassToGeodatabase_conversion(shpFile, path)
        return shpFile
def singleGdb(self, name, year_attr, year_code):
    arcpy.env.workspace = os.path.dirname(self.path)
    arcpy.env.overwriteOutput = True
    # quote the year value only when it is a string
    quote = "'" if isinstance(year_code, basestring) else ""
    filter_year = '"{0}" = {2}{1}{2}'.format(year_attr, year_code, quote)
    filter_string_yr = " AND ".join([self.filter_string, filter_year])
    arcpy.MakeFeatureLayer_management(self.path, name)
    arcpy.SelectLayerByAttribute_management(name, "NEW_SELECTION",
                                            filter_string_yr)
    if arcpy.Exists(os.path.join(self.gdb_path, name)):
        arcpy.Delete_management(os.path.join(self.gdb_path, name))
    arcpy.FeatureClassToGeodatabase_conversion(name, self.gdb_path)
def export_DDPIndex(sde_offline_floorplans, file_GDB_floorplans):
    """Given the location of the offline SDE and the local floorplans file GDB,
    exports the DDPINDEX view as a feature class from the offline SDE to the
    local GDB. Views cannot be replicated, so we have to convert the view into
    a feature class each time."""
    arcpy.AddMessage("DDPINDEX")
    arcpy.env.workspace = file_GDB_floorplans
    # drop the stale copy before re-exporting
    for fc in arcpy.ListFeatureClasses():
        if fc == "DDPINDEX":
            arcpy.Delete_management(fc)
    arcpy.env.workspace = sde_offline_floorplans
    arcpy.FeatureClassToGeodatabase_conversion(
        "FLOORPLANSOFFLINE.DBO.DDPINDEX", file_GDB_floorplans)
    arcpy.AddMessage("Done DDPINDEX")
def convertShapefiles():
    env.workspace = 'hurricane_data'
    env.overwriteOutput = True
    fcList = arcpy.ListFeatureClasses()
    print("shapefiles: {0}".format(fcList))
    stormsGDB = 'storms.gdb'
    if not arcpy.Exists(stormsGDB):
        arcpy.CreateFileGDB_management(env.workspace, stormsGDB)
    arcpy.FeatureClassToGeodatabase_conversion(fcList, stormsGDB)
    # delete downloaded data
    for fc in fcList:
        arcpy.Delete_management(fc)
def create_scratch_workspace_import_lrs():
    print "Creating scratch fgdb workspace..."
    arcpy.CreateFileGDB_management(
        "C:/Users/gbunce/Documents/projects/UTRANS/UpdateMilepostsInRoads_Edit/",
        "UpdateUtransMileposts_" + formatted_date + ".gdb")
    print "Importing LRS data into scratch workspace"
    file.write("\n" + "\n" +
               "Began importing sgid lrs into scratch workspace at: " +
               str(datetime.datetime.now()))
    arcpy.FeatureClassToGeodatabase_conversion(
        Input_Features="Database Connections/DC_agrc@[email protected]/SGID10.TRANSPORTATION.UDOTRoutes_LRS",
        Output_Geodatabase=local_workspace)
def ReprojectHARNDataset(OutputGdb, CalcXY):
    arcpy.AddMessage("\nFinal Step: Reprojecting dataset to NAD 1983")

    # create separate objects from the user-supplied path
    SplitGdbPath = OutputGdb.split('\\')
    Folder = '\\'.join(SplitGdbPath[:-1])
    GdbName = '\\'.join(SplitGdbPath[-1:]).split('.')[0]
    GdbNameNAD83 = GdbName + '_NAD83.gdb'
    OutputGdbNAD83 = Folder + '\\' + GdbNameNAD83

    # create spatial reference object, new gdb, new feature dataset
    sr = arcpy.SpatialReference('NAD 1983')
    arcpy.CreateFileGDB_management(Folder, GdbNameNAD83, 'CURRENT')
    arcpy.CreateFeatureDataset_management(OutputGdbNAD83, 'CadastralReference',
                                          sr)

    # input/output, import HARN features into the NAD83 dataset
    infcs = [
        '{}\\CadastralReference\\PLSSTownship'.format(OutputGdb),
        '{}\\CadastralReference\\PLSSSpecialSurvey'.format(OutputGdb),
        '{}\\CadastralReference\\PLSSSecondDivision'.format(OutputGdb),
        '{}\\CadastralReference\\PLSSPoint'.format(OutputGdb),
        '{}\\CadastralReference\\PLSSFirstDivision'.format(OutputGdb),
        '{}\\CadastralReference\\MetadataGlance'.format(OutputGdb),
        '{}\\CadastralReference\\MeanderedWater'.format(OutputGdb),
        '{}\\CadastralReference\\ConflictedAreas'.format(OutputGdb),
        '{}\\CadastralReference\\Control'.format(OutputGdb),
        '{}\\CadastralReference\\SurveySystem'.format(OutputGdb)
    ]
    outdataset = '{}\\CadastralReference'.format(OutputGdbNAD83)
    arcpy.FeatureClassToGeodatabase_conversion(infcs, outdataset)

    PLSSPointReProject = '{}\\PLSSPoint'.format(outdataset)
    if CalcXY == 'true':
        arcpy.AddMessage("Calculating XCOORD, YCOORD, COORDSYS AND HDATUM")
        arcpy.CalculateField_management(PLSSPointReProject, "XCOORD",
                                        "!Shape.Centroid.X!", "Python_9.3", "")
        arcpy.CalculateField_management(PLSSPointReProject, "YCOORD",
                                        "!Shape.Centroid.Y!", "Python_9.3", "")
        spatialRef = arcpy.Describe(PLSSPointReProject).SpatialReference
        srType = "'{}'".format(spatialRef.type)
        # srName = "'{}'".format(spatialRef.name)  # Possibly to be used later.
        # Right now the HDATUM is hard coded to NAD83 to fit the CadNSDI standard.
        arcpy.CalculateField_management(PLSSPointReProject, "COORDSYS", srType,
                                        "PYTHON", "")
        arcpy.CalculateField_management(PLSSPointReProject, "HDATUM", "'NAD83'",
                                        "PYTHON", "")
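The manual `split('\\')` bookkeeping at the top of `ReprojectHARNDataset` can be expressed with `ntpath`, which applies Windows path semantics on any OS. A minimal sketch with an illustrative path:

```python
import ntpath  # Windows path rules regardless of the OS running the snippet

output_gdb = r"C:\data\cadastral\HARN.gdb"  # illustrative path
folder, gdb_file = ntpath.split(output_gdb)
gdb_name = ntpath.splitext(gdb_file)[0]

print(folder)                    # C:\data\cadastral
print(gdb_name + "_NAD83.gdb")   # HARN_NAD83.gdb
```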
def select_and_copy_feature_class(self, gdb_full_path_name, id_list,
                                  input_data, rename_basename, test_field):
    feature_layer_name = "feature_layer_selection"
    feature_layer_selection = arcpy.MakeFeatureLayer_management(
        input_data, feature_layer_name,
        "{0} in ({1})".format(
            test_field, self.utility.format_list_for_where_clause(id_list)))
    # GetCount returns a Result object; compare the actual count
    count = int(arcpy.GetCount_management(feature_layer_selection).getOutput(0))
    if count > 0:
        arcpy.AddMessage("...Copying fc to gdb")
        arcpy.FeatureClassToGeodatabase_conversion(feature_layer_name,
                                                   gdb_full_path_name)
        arcpy.Rename_management(
            os.path.join(gdb_full_path_name, feature_layer_name),
            rename_basename)
    del feature_layer_selection
def transferToGDB_old(in_folder):
    ##
    ## This is incomplete. Beware.
    ##
    # build a semicolon-delimited string of all files to transfer to the GDB,
    # as expected by the arcpy tool (the original appended an undefined 'item')
    listString = ''
    for batch_file in inList:
        new_name = findNewName(batch_file)
        listString = listString + os.path.join(in_folder, batch_file) + ';'
    if listString[-1] == ';':
        listString = listString[:-1]

    ## Transfer the specified files to the GDB
    arcpy.FeatureClassToGeodatabase_conversion(Input_Features=listString,
                                               Output_Geodatabase=out_GDB)
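The semicolon-delimited string this helper assembles (with the trailing-`;` cleanup) can be built in one step with `";".join`. A standalone sketch with illustrative names; `ntpath.join` keeps Windows separators regardless of host OS:

```python
import ntpath

in_folder = r"C:\data\shapefiles"  # illustrative path
batch_files = ["roads.shp", "rivers.shp", "parcels.shp"]

# join inserts the separator only between items, so there is no
# trailing ';' to strip afterwards
list_string = ";".join(ntpath.join(in_folder, f) for f in batch_files)
print(list_string)
```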
def transfer(self, dictList=None):
    '''
    Transfer shapefiles in our directories to scratch.gdb for ease of use.
    :return: None
    '''
    # default to None rather than a mutable [] default argument
    print('Transferring input files...\n')
    shps = []
    for d in (dictList or []):
        for shp, path in d.items():
            if '_path' in shp:
                shps.append(path)
    for shp in shps:
        arcpy.FeatureClassToGeodatabase_conversion(shp, self.scratch)
    print('\n')
def exportFeatureClassesByShapeType(input_geodatabase, shapeType,
                                    output_geodatabase):
    # set the arcpy workspace to the input geodatabase
    arcpy.env.workspace = input_geodatabase
    # get the list of feature classes from the geodatabase
    fcList = arcpy.ListFeatureClasses('*')
    # loop over the feature classes in the list
    for fc in fcList:
        # use Describe to get the properties of the feature class
        desc = arcpy.Describe(fc)
        # find the feature classes with a type of shapeType
        if desc.shapeType == shapeType:
            # copy the matching feature class into the output geodatabase
            # (the original created a new GDB literally named 'name' on
            # every iteration and passed the Describe object to the tool)
            arcpy.FeatureClassToGeodatabase_conversion(fc, output_geodatabase)
def create_layer_ate(self):
    """
    Create layers ATE (boundaries of settlements) and Selsovets (boundaries
    of Selsovets) from layer .shp (rb.shp, coordinate system - geographic
    CS Pulkovo-1942).

    :param path_to_layer_ate: path where we put layer rb.shp in GCS
        Pulkovo-1942, which contains boundaries of Selsovets and settlements
    :param name_layer_ate: name of the layer (.shp) which contains boundaries
        of Selsovets and settlements
    :return: two layers, ATE and Selsovets, in the database, WGS-84
    """
    shp = r'{0}\{1}1.shp'.format(self.path_to_layer_ate, self.name_district)
    shp_sk = r'{0}\{1}_sk.shp'.format(self.path_to_layer_ate,
                                      self.name_district)
    shp_city = r'{0}\ATE.shp'.format(self.path_to_layer_ate)
    shp_ss = r'{0}\Selsovets.shp'.format(self.path_to_layer_ate)
    tempEnvironment0 = arcpy.env.outputCoordinateSystem
    arcpy.env.outputCoordinateSystem = sk_42
    arcpy.Select_analysis(
        os.path.join(self.path_to_layer_ate, self.name_layer_ate), shp,
        "\"SOATO\" LIKE '{0}' OR \"SOATO\" LIKE '{1}'".format(
            district_soato[self.name_district][0],
            district_soato[self.name_district][1]))
    arcpy.env.outputCoordinateSystem = tempEnvironment0
    arcpy.Project_management(shp, shp_sk, wgs84, "CK42_to_ITRF2005", sk_42)
    arcpy.Select_analysis(
        shp_sk, shp_city,
        "CATEGORY = 111 OR CATEGORY = 112 OR CATEGORY = 121 OR "
        "CATEGORY = 113 OR CATEGORY = 123 OR CATEGORY = 213 OR "
        "CATEGORY = 221 OR CATEGORY = 223 OR CATEGORY = 222 OR "
        "CATEGORY = 231 OR CATEGORY = 232 OR CATEGORY = 234 OR "
        "CATEGORY = 235 OR CATEGORY = 239")
    arcpy.Select_analysis(shp_sk, shp_ss, "\"CATEGORY\" = 103")
    # remove intermediate shapefiles
    for root, dirs, files in os.walk(self.path_to_layer_ate):
        for file in files:
            if file.find('1') > -1 or file.find('_sk') > -1:
                os.remove(r'{0}\{1}'.format(self.path_to_layer_ate, file))
    try:
        arcpy.FeatureClassToGeodatabase_conversion(
            "{0};{1}".format(shp_city, shp_ss), self.nameDataSet)
    except:
        print "This layer is already in the database"
    # delete the ATE and Selsovets shapefiles
    for root, dirs, files in os.walk(self.path_to_layer_ate):
        for file in files:
            if file.find('ATE') > -1 or file.find('Selsovets') > -1:
                os.remove(r'{0}\{1}'.format(self.path_to_layer_ate, file))
def dynamicFeatureImport(workspace, overwriteSetting, outputGDB):
    arcpy.env.workspace = workspace
    arcpy.env.overwriteOutput = overwriteSetting
    in_features = arcpy.ListFeatureClasses()

    # print the list of shapefiles in the current workspace
    pp.pprint(
        "This folder contains the following shapefiles or feature classes: " +
        str(in_features))

    # ask the user for feedback
    boolPrompt = input("Would you like to import the following data? ")

    # check whether the user wishes to proceed with importing the data
    if boolPrompt == "Yes":
        for feature in in_features:
            arcpy.FeatureClassToGeodatabase_conversion(feature, outputGDB)
    else:
        print("The data cannot be imported.")
def ImportConstraintLayer(InConstraint, UPlanGDB, LongName):
    '''
    *Imports a Constraint layer to the GDB
    *Creates a field with the same name as the layer and calculates it equal to 1
    *Adds a record to the upc_layers table for this constraint layer

    Calls: CalculateFieldTo1
           AddToUPC_Layers
    Called by: Import Constraint Toolbox

    Arguments:
    InConstraint: The layer to be added to the GDB as a constraint
    UPlanGDB: The UPlan GDB (where the layer will be imported)
    LongName: The descriptive name of the layer

    Returns: None
    '''
    # Import a feature class layer to the existing geodatabase as a constraint
    # Set workspace
    env.workspace = UPlanGDB

    # strip a trailing .shp extension, if present
    if os.path.basename(InConstraint)[-4:] == ".shp":
        SHPName = os.path.basename(InConstraint)[:-4]
    else:
        SHPName = os.path.basename(InConstraint)

    # Add layer to geodatabase
    arcpy.FeatureClassToGeodatabase_conversion(InConstraint, UPlanGDB)

    # calculate a new field = 1
    CalculateFieldTo1(UPlanGDB, SHPName)

    # add constraint layer to layer tracker table
    AddToUPC_Layers(UPlanGDB, SHPName, LongName, 'Constraint')
        row[2] = referenceno
        ucursor.updateRow(row)
        arcpy.AddMessage("Outline updated.")

        # Update the progressor label for the current feature class
        arcpy.SetProgressorLabel("Merged {0}...".format(fc))
        completeCount = completeCount + 1
        # Update the progressor position
        arcpy.SetProgressorPosition()

    arcpy.AddMessage("{0} outlines complete.".format(completeCount))
    arcpy.ResetProgressor()

    arcpy.AddMessage("Copying outlines to output geodatabase....")
    outlineList = [
        ftr for ftr in arcpy.ListFeatureClasses("*", "polygon")
        if ftr.endswith('_outline')
    ]
    try:
        for f in outlineList:
            arcpy.FeatureClassToGeodatabase_conversion(f, outGDB)
    except:
        arcpy.AddMessage(
            "Outlines {0} could not be copied to export geodatabase.".format(f))
    arcpy.AddMessage("Outlines copied to export geodatabase.")
def execute(self, parameters, messages):
    """The source code of the tool."""
    # local variables and env
    arcpy.CreateFileGDB_management("E:/gina/poker/gdb",
                                   parameters[0].valueAsText)
    arcpy.env.workspace = ("E:/gina/poker/gdb/" +
                           parameters[0].valueAsText + ".gdb")
    arcpy.env.overwriteOutput = True

    adnr_lo_shp = "E:/gina/poker/shp/wip/land_ownership_data/adnr_gls_dls_merge_20170823_v1.shp"
    pfrr_popn_places = "E:/gina/poker/shp/wip/popn_places_data/pokerflat_popn_places_gcs_wgs84_to_akalbers_2.shp"
    afs_known_sites = "E:/gina/poker/shp/asf_data/asf_known_sites_20180629_3338.shp"
    pipTable = "E:/gina/poker/dbf/predicted_impact_xy.dbf"
    pip_point_shp = "E:/gina/poker/pip/pip_point.shp"
    pip_point_3338 = "E:/gina/poker/pip/pip_point_3338.shp"
    pip_buffer_shp = "E:/gina/poker/pip/pip_buffer.shp"
    pip_range_rings_shp = "E:/gina/poker/pip/pip_range_rings.shp"
    pip_lo_in_buffer_shp = "E:/gina/poker/pip/pip_lo_in_buffer.shp"
    pip_lo_in_buf_sum_dbf = "E:/gina/poker/pip/pip_lo_in_buf_sum.dbf"
    pip_lo_in_buf_sum_csv = "E:/gina/poker/pip/pip_lo_in_buf_sum.csv"
    pip_popn_places_in_buffer_shp = "E:/gina/poker/pip/pip_popn_places_in_buffer.shp"
    pip_known_sites_in_buffer_shp = "E:/gina/poker/pip/pip_known_sites_in_buffer.shp"

    x = parameters[2].valueAsText
    y = parameters[3].valueAsText
    r = parameters[10].valueAsText + " NauticalMiles"
    rr1 = float(parameters[10].valueAsText) / 3
    rr2 = rr1 * 2
    rrs = str(rr1) + ";" + str(rr2) + ";" + r.split(" ")[0]
    pipLayer = "pipLayer1"
    srs = arcpy.SpatialReference("Alaska Albers Equal Area Conic")
    intersect_fc1 = [adnr_lo_shp, pip_buffer_shp]
    intersect_fc2 = [pfrr_popn_places, pip_buffer_shp]
    intersect_fc3 = [afs_known_sites, pip_buffer_shp]
    mxd = arcpy.mapping.MapDocument("current")
    dataframe = arcpy.mapping.ListDataFrames(mxd)[0]
    sourceLoSymbologyLayer = arcpy.mapping.Layer("E:/gina/poker/lyr/lo2.lyr")
    sourcePipSymbologyLayer = arcpy.mapping.Layer("E:/gina/poker/lyr/pip2.lyr")
    sourceRrsSymbologyLayer = arcpy.mapping.Layer("E:/gina/poker/lyr/rrs.lyr")
    sourcePopSymbologyLayer = arcpy.mapping.Layer("E:/gina/poker/lyr/pop.lyr")
    sourceAfsSymbologyLayer = arcpy.mapping.Layer("E:/gina/poker/lyr/afs2.lyr")

    # Process: Calculate Lon Field
    arcpy.CalculateField_management(pipTable, "Lon", x, "PYTHON", "")
    # Process: Calculate Lat Field
    arcpy.CalculateField_management(pipTable, "Lat", y, "PYTHON", "")

    # Process: Make XY Event Layer
    arcpy.MakeXYEventLayer_management(
        pipTable, "Lon", "Lat", pipLayer,
        "GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]];-400 -400 1000000000;-100000 10000;-100000 10000;8.98315284119522E-09;0.001;0.001;IsHighPrecision",
        "")

    # Process: Copy Features
    arcpy.CopyFeatures_management(pipLayer, pip_point_shp, "", "0", "0", "0")
    # Process: Project pip point
    arcpy.Project_management(pip_point_shp, pip_point_3338, srs)
    # Process: Buffer pip point
    arcpy.Buffer_analysis(pip_point_3338, pip_buffer_shp, r, "FULL", "ROUND",
                          "NONE", "", "PLANAR")
    # Process: Multiple Ring Buffer
    arcpy.MultipleRingBuffer_analysis(pip_point_3338, pip_range_rings_shp, rrs,
                                      "NauticalMiles", "", "NONE", "FULL")
    # Process: Intersect pip buffer with land ownership
    arcpy.Intersect_analysis(intersect_fc1, pip_lo_in_buffer_shp, "ALL", "",
                             "INPUT")
    # Process: Intersect pip buffer with popn places
    arcpy.Intersect_analysis(intersect_fc2, pip_popn_places_in_buffer_shp,
                             "ALL", "", "INPUT")
    # Process: Intersect pip buffer with afs known sites
    arcpy.Intersect_analysis(intersect_fc3, pip_known_sites_in_buffer_shp,
                             "ALL", "", "INPUT")

    # Process: Make feature layers and add them to the map
    ## pip feature class list
    fclist = arcpy.ListFeatureClasses()
    ## pip layer
    arcpy.MakeFeatureLayer_management(pip_point_3338, "Predicted Impact Point")
    ## land ownership layer
    arcpy.MakeFeatureLayer_management(
        pip_lo_in_buffer_shp,
        "Land Ownership within 3sigma of Predicted Impact Point")
    ## range rings
    arcpy.MakeFeatureLayer_management(pip_range_rings_shp, "Range Rings")
    ## populated places layer
    popn_places_records = int(
        arcpy.GetCount_management(pip_popn_places_in_buffer_shp).getOutput(0))
    if popn_places_records > 0:
        arcpy.MakeFeatureLayer_management(
            pip_popn_places_in_buffer_shp,
            "Populated Places within 3sigma of Predicted Impact Point")
        addPipPopnPlacesLayer = arcpy.mapping.Layer(
            "Populated Places within 3sigma of Predicted Impact Point")
        arcpy.mapping.AddLayer(dataframe, addPipPopnPlacesLayer)
    ## known sites layer
    known_sites_records = int(
        arcpy.GetCount_management(pip_known_sites_in_buffer_shp).getOutput(0))
    if known_sites_records > 0:
        arcpy.MakeFeatureLayer_management(
            pip_known_sites_in_buffer_shp,
            "AFS Known Sites within 3sigma of Predicted Impact Point")
        addPipKnownSitesLayer = arcpy.mapping.Layer(
            "AFS Known Sites within 3sigma of Predicted Impact Point")
        arcpy.mapping.AddLayer(dataframe, addPipKnownSitesLayer)
    addPipPointLayer = arcpy.mapping.Layer("Predicted Impact Point")
    arcpy.mapping.AddLayer(dataframe, addPipPointLayer)
    add3sigmaLoLayer = arcpy.mapping.Layer(
        "Land Ownership within 3sigma of Predicted Impact Point")
    arcpy.mapping.AddLayer(dataframe, add3sigmaLoLayer)
    addRangeRings = arcpy.mapping.Layer("Range Rings")
    arcpy.mapping.AddLayer(dataframe, addRangeRings)

    # Add and calc Acres field for intersected land ownership
    arcpy.AddField_management(pip_lo_in_buffer_shp, "Acres", "DOUBLE")
    arcpy.CalculateField_management(pip_lo_in_buffer_shp, "Acres",
                                    "!shape.area@acres!", "PYTHON_9.3", "")

    # Summarize intersected land ownership by owner and total acres
    arcpy.Statistics_analysis(pip_lo_in_buffer_shp, pip_lo_in_buf_sum_dbf,
                              "Acres SUM", "OWNER")
    arcpy.MakeTableView_management(pip_lo_in_buf_sum_dbf)
    add3sigmaLoSumTbl = arcpy.mapping.TableView(pip_lo_in_buf_sum_dbf)
    arcpy.mapping.AddTableView(dataframe, add3sigmaLoSumTbl)

    # Symbolize and refresh
    lo_layer = arcpy.mapping.ListLayers(
        mxd, "*Land Ownership within 3sigma of Predicted Impact Point*",
        dataframe)[0]
    arcpy.mapping.UpdateLayer(dataframe, lo_layer, sourceLoSymbologyLayer,
                              True)
    lo_layer.symbology.addAllValues()
    pip_layer = arcpy.mapping.ListLayers(mxd, "*Predicted Impact Point*",
                                         dataframe)[0]
    arcpy.mapping.UpdateLayer(dataframe, pip_layer, sourcePipSymbologyLayer,
                              True)
    rr_layer = arcpy.mapping.ListLayers(mxd, "*Range Rings*", dataframe)[0]
    arcpy.mapping.UpdateLayer(dataframe, rr_layer, sourceRrsSymbologyLayer,
                              True)
    pop_layer = arcpy.mapping.ListLayers(mxd, "*Populated Places*",
                                         dataframe)[0]
    arcpy.mapping.UpdateLayer(dataframe, pop_layer, sourcePopSymbologyLayer,
                              True)
    afs_layer = arcpy.mapping.ListLayers(mxd, "*Known Sites*", dataframe)[0]
    arcpy.mapping.UpdateLayer(dataframe, afs_layer, sourceAfsSymbologyLayer,
                              True)
    arcpy.RefreshTOC()
    arcpy.RefreshActiveView()

    # Populate mission GDB
    mission_layers = [
        pip_point_3338, pip_lo_in_buffer_shp, pip_popn_places_in_buffer_shp,
        pip_range_rings_shp, pip_known_sites_in_buffer_shp
    ]
    arcpy.FeatureClassToGeodatabase_conversion(mission_layers,
                                               arcpy.env.workspace)
    return
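The range-ring radii in `execute` are one third, two thirds, and the full 3-sigma distance, joined into the distance string that `MultipleRingBuffer_analysis` expects. That arithmetic in isolation (the helper name `ring_radii` is illustrative, not part of the tool):

```python
def ring_radii(three_sigma_nm):
    """Return the ring-distance string: 1/3, 2/3 and the full radius."""
    rr1 = three_sigma_nm / 3.0
    rr2 = rr1 * 2
    return "{0};{1};{2}".format(rr1, rr2, three_sigma_nm)

print(ring_radii(9.0))  # 3.0;6.0;9.0
```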