Here’s a quick one. I’ve used this script many times as part of a data inventory workflow. It’s also been handy for starting project databases with data from a variety of sources – I can pluck data out of my source GIS databases into specific project databases. Used in conjunction with some other data cleanup scripts, it can be very useful in intersection analyses. I’ve used it in the initial steps of creating clean databases for analysis, ones that protect the source data from accidental deletion or irreversible edits.
The script is below. I’ve commented it line by line so you can follow along, but the gist is that it lets you add multiple layers from different sources and then export every layer in a map to a single location.
```python
# import arcpy to access the Esri Python module
import arcpy

# set the processing target to the current mxd
mxd = arcpy.mapping.MapDocument("CURRENT")

# output geodatabase for feature classes; this can be a folder
# if you want to output shapefiles instead
out = r"D:\GIS_Temp\Map_Consolidation.gdb"

# loop to gather the source path of each layer in the current mxd
# and export each to the output location defined above
for lyr in arcpy.mapping.ListLayers(mxd):
    # check that the layer has a data source attribute
    if lyr.supports("DATASOURCE"):
        # grab the layer's data source
        ds = lyr.dataSource
        # grab the layer's name from the table of contents
        nm = lyr.name
        # replace spaces in the layer name with underscores
        nm = nm.replace(" ", "_")
        # print the current layer's output filename
        print nm
        # perform the export of the current layer to the output gdb
        arcpy.FeatureClassToFeatureClass_conversion(ds, out, nm)
        print "Layer: " + lyr.name + " Source: " + lyr.dataSource
```
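One caveat: the script only swaps spaces for underscores, but layer names in a table of contents can also contain hyphens, parentheses, or other characters that aren’t valid in a geodatabase feature class name. If you run into failed exports, a slightly more thorough cleanup step could replace the single `replace()` call. Here’s a minimal sketch of a hypothetical `sanitize_name` helper (the function name and rules are my own, not part of the original script):

```python
import re

def sanitize_name(layer_name):
    """Turn a table-of-contents layer name into a safer
    geodatabase feature class name."""
    # replace anything that isn't a letter, digit, or underscore
    name = re.sub(r"[^A-Za-z0-9_]", "_", layer_name)
    # feature class names can't start with a digit, so prefix
    # an underscore if needed
    if name and name[0].isdigit():
        name = "_" + name
    return name

print(sanitize_name("Roads - 2019 (draft)"))  # Roads___2019__draft_
print(sanitize_name("2019 Parcels"))          # _2019_Parcels
```

In the loop above you would call `nm = sanitize_name(lyr.name)` in place of the `replace()` line.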
Here’s what it looks like. I ran the script on a map with this table of contents.
And this is what ends up in the out gdb.