arcgis.learn module
Functions for calling the Deep Learning Tools.
detect_objects

learn.detect_objects(input_raster, model, model_arguments=None, output_name=None, run_nms=False, confidence_score_field=None, class_value_field=None, max_overlap_ratio=0, context=None, *, gis=None, **kwargs)

Generates a feature service containing polygons on the objects detected in the imagery data using the designated deep learning model. Note that the deep learning library needs to be installed separately, in addition to the server's built-in Python 3.x library.
Argument
Description
input_raster
Required. Raster layer that contains the objects that need to be detected.
model
Required model object.
model_arguments
Optional dictionary. Name-value pairs of arguments and their values that can be customized by the clients.
eg: {"name1": "value1", "name2": "value2"}
output_name
Optional. If not provided, a feature layer is created by the method and used as the output. You can pass in an existing Feature Service Item from your GIS to use that instead. Alternatively, you can pass in the name of the output Feature Service that should be created by this method to be used as the output for the tool. A RuntimeError is raised if a service by that name already exists.
run_nms
Optional bool. Default value is False. If set to True, runs the Non Maximum Suppression tool.
confidence_score_field
Optional string. The field in the feature class that contains the confidence scores as output by the object detection method. This parameter is required when run_nms is set to True.
class_value_field
Optional string. The class value field in the input feature class. If not specified, the function will use the standard class value fields Classvalue and Value. If these fields do not exist, all features will be treated as the same object class. Set only if run_nms is set to True.
max_overlap_ratio
Optional integer. The maximum overlap ratio for two overlapping features, defined as the ratio of intersection area over union area. Set only if run_nms is set to True.
context
Optional dictionary. Context contains additional settings that affect task execution. The dictionary can contain values for the following keys:
cellSize - Set the output raster cell size, or resolution
extent - Sets the processing extent used by the function
parallelProcessingFactor - Sets the parallel processing factor. Default is “80%”
processorType - Sets the processor type. “CPU” or “GPU”
Eg: {"processorType": "CPU"}
Setting the context parameter will override the values set using the arcgis.env variable for this particular function.
gis
Optional GIS. The GIS on which this tool runs. If not specified, the active GIS is used.
- Returns
The output feature layer item containing the detected objects
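A minimal usage sketch is shown below; the portal URL, item names, and the "Confidence" field name are placeholders rather than values from this reference.

from arcgis.gis import GIS
from arcgis.learn import Model, detect_objects

# Placeholder portal URL and credentials.
gis = GIS("https://example.org/portal", "username", "password")

# Hypothetical items: an imagery layer and an uploaded deep learning package (*.dlpk).
imagery = gis.content.search("naip_imagery", "Imagery Layer")[0].layers[0]
dlpk_item = gis.content.search("tree_detection", "Deep Learning Package")[0]

model = Model(dlpk_item)
model.install(gis=gis)

# Detect objects and publish the result as a hosted feature layer.
detected_trees = detect_objects(input_raster=imagery,
                                model=model,
                                output_name="detected_trees",
                                run_nms=True,
                                confidence_score_field="Confidence",
                                context={"processorType": "GPU"},
                                gis=gis)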
classify_pixels

learn.classify_pixels(input_raster, model, model_arguments=None, output_name=None, context=None, *, gis=None, **kwargs)

Classifies the input imagery data using a deep learning model. Note that the deep learning library needs to be installed separately, in addition to the server's built-in Python 3.x library.
Argument
Description
input_raster
Required. Raster layer that needs to be classified.
model
Required model object.
model_arguments
Optional dictionary. Name-value pairs of arguments and their values that can be customized by the clients. eg: {"name1": "value1", "name2": "value2"}
output_name
Optional. If not provided, an imagery layer is created by the method and used as the output. You can pass in an existing Image Service Item from your GIS to use that instead. Alternatively, you can pass in the name of the output Image Service that should be created by this method to be used as the output for the tool. A RuntimeError is raised if a service by that name already exists.
context
Optional dictionary. Context contains additional settings that affect task execution. The dictionary can contain values for the following keys:
outSR - (Output Spatial Reference) Saves the result in the specified spatial reference
snapRaster - Function will adjust the extent of output rasters so that they match the cell alignment of the specified snap raster.
cellSize - Set the output raster cell size, or resolution
extent - Sets the processing extent used by the function
parallelProcessingFactor - Sets the parallel processing factor. Default is “80%”
processorType - Sets the processor type. “CPU” or “GPU”
Eg: {"outSR": {spatial reference}}
Setting the context parameter will override the values set using the arcgis.env variable for this particular function.
gis
Optional GIS. The GIS on which this tool runs. If not specified, the active GIS is used.
- Returns
The classified imagery layer item
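A usage sketch, assuming a land-cover classification model; the item names and output spatial reference are hypothetical, and `gis` is an already authenticated GIS.

from arcgis.learn import Model, classify_pixels

# Hypothetical items published to the GIS.
imagery = gis.content.search("landsat_scene", "Imagery Layer")[0].layers[0]
model = Model(gis.content.search("land_cover_dlpk", "Deep Learning Package")[0])
model.install(gis=gis)

classified = classify_pixels(input_raster=imagery,
                             model=model,
                             output_name="classified_land_cover",
                             context={"outSR": {"wkid": 3857},
                                      "processorType": "GPU"},
                             gis=gis)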
export_training_data

learn.export_training_data(input_raster, input_class_data=None, chip_format=None, tile_size=None, stride_size=None, metadata_format=None, classvalue_field=None, buffer_radius=None, output_location=None, context=None, *, gis=None, **kwargs)

Generates training sample image chips from the input imagery data with labeled vector data or classified images. The output of this service tool is the data store string where the output image chips, labels, and metadata files are stored.
Argument
Description
input_raster
Required. Raster layer that needs to be exported for training.
input_class_data
Labeled data, either a feature layer or image layer. Vector inputs should follow a training sample format as generated by the ArcGIS Pro Training Sample Manager. Raster inputs should follow a classified raster format as generated by the Classify Raster tool.
chip_format
Optional String. The raster format for the image chip outputs.
TIFF: TIFF format
PNG: PNG format
JPEG: JPEG format
MRF: MRF (Meta Raster Format)
tile_size
Optional dictionary. The size of the image chips.
Example: {"x": 256, "y": 256}
stride_size
Optional dictionary. The distance to move in the X and Y when creating the next image chip. When stride is equal to the tile size, there will be no overlap. When stride is equal to half of the tile size, there will be 50% overlap.
Example: {"x": 128, "y": 128}
metadata_format
Optional string. The format of the output metadata labels. There are four options for output metadata labels for the training data: KITTI rectangles, PASCAL VOC rectangles, Classified Tiles (a class map), and RCNN Masks. If your input training sample data is a feature class layer, such as a building layer or a standard classification training sample file, use the KITTI or PASCAL VOC rectangles option.
The output metadata is a .txt file or .xml file containing the training sample data contained in the minimum bounding rectangle. The name of the metadata file matches the input source image name. If your input training sample data is a class map, use Classified Tiles as your output metadata format option.
KITTI_rectangles: The metadata follows the same format as the Karlsruhe Institute of Technology and Toyota Technological Institute (KITTI) Object Detection Evaluation dataset. The KITTI dataset is a vision benchmark suite. This is the default. The label files are plain text files. All values, both numerical and string, are separated by spaces, and each row corresponds to one object.
PASCAL_VOC_rectangles: The metadata follows the same format as the Pattern Analysis, Statistical Modeling and Computational Learning, Visual Object Classes (PASCAL VOC) dataset. The PASCAL VOC dataset is a standardized image dataset for object class recognition. The label files are XML files and contain information about the image name, class value, and bounding box(es).
Classified_Tiles: This option outputs one classified image chip per input image chip, with no other metadata for each image chip. Only the statistics output has more information on the classes, such as class names, class values, and output statistics.
RCNN_Masks: This option outputs image chips that have a mask on the areas where the sample exists. The model generates bounding boxes and segmentation masks for each instance of an object in the image. It is based on a Feature Pyramid Network (FPN) with a ResNet101 backbone.
classvalue_field
Optional string. Specifies the field which contains the class values. If no field is specified, the system will look for a 'value' or 'classvalue' field. If the feature layer does not contain a class field, the system will presume all records belong to one class.
buffer_radius
Optional integer. Specifies a buffer radius for point feature classes, used to define the training sample area.
output_location
This is the output location for training sample data. It can be the server data store path or a shared file system path.
Example:
- Server datastore path -
/fileShares/deeplearning/rooftoptrainingsamples
/rasterStores/rasterstorename/rooftoptrainingsamples
/cloudStores/cloudstorename/rooftoptrainingsamples
- File share path -
\\servername\deeplearning\rooftoptrainingsamples
context
Optional dictionary. Context contains additional settings that affect task execution. Dictionary can contain value for following keys:
exportAllTiles - Choose whether image chips that overlap the labeled data will be exported. True - Export all the image chips, including those that do not overlap labeled data. False - Export only the image chips that overlap the labeled data. This is the default.
startIndex - Allows you to set the start index for the sequence of image chips. This lets you append more image chips to an existing sequence. The default value is 0.
cellSize - Sets the cell size, or resolution, used by the function.
extent - Sets the processing extent used by the function.
Setting the context parameter will override the values set using the arcgis.env variable (cellSize, extent) for this particular function.
eg: {"exportAllTiles": False, "startIndex": 0}
gis
Optional GIS. The GIS on which this tool runs. If not specified, the active GIS is used.
- Returns
Output string containing the location of the exported training data
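A sketch of exporting chips for object detection; the layers, class value field, and data store path below are placeholders, and `gis` is an already authenticated GIS.

from arcgis.learn import export_training_data

# Hypothetical imagery layer and training-sample feature layer.
imagery = gis.content.search("naip_imagery", "Imagery Layer")[0].layers[0]
labels = gis.content.search("building_footprints", "Feature Layer")[0].layers[0]

chips_path = export_training_data(input_raster=imagery,
                                  input_class_data=labels,
                                  chip_format="PNG",
                                  tile_size={"x": 256, "y": 256},
                                  stride_size={"x": 128, "y": 128},
                                  metadata_format="PASCAL_VOC_rectangles",
                                  classvalue_field="classvalue",
                                  output_location="/rasterStores/rasterstorename/buildingchips",
                                  context={"exportAllTiles": False, "startIndex": 0},
                                  gis=gis)
print(chips_path)  # data store path where chips, labels, and metadata are written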
list_models

learn.list_models(**kwargs)

Lists all the installed deep learning models.
Argument
Description
gis
Optional GIS. The GIS on which this tool runs. If not specified, the active GIS is used.
- Returns
list of deep learning models installed
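A quick sketch of listing the installed models, assuming `gis` is an already authenticated GIS with Image Server configured.

from arcgis.learn import list_models

installed = list_models(gis=gis)
for model in installed:
    print(model)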
Model

class arcgis.learn.Model(model=None)

from_json(model)

Initializes a Model object from a model definition JSON.
Example usage:

model = Model()
model.from_json({"Framework": "TensorFlow",
                 "ModelConfiguration": "DeepLab",
                 "InferenceFunction": "[functions]System\\DeepLearning\\ImageClassifier.py",
                 "ModelFile": "\\\\folder_path_of_pb_file\\frozen_inference_graph.pb",
                 "ExtractBands": [0, 1, 2],
                 "ImageWidth": 513,
                 "ImageHeight": 513,
                 "Classes": [{"Value": 0, "Name": "Evergreen Forest", "Color": [0, 51, 0]},
                             {"Value": 1, "Name": "Grassland/Herbaceous", "Color": [241, 185, 137]},
                             {"Value": 2, "Name": "Bare Land", "Color": [236, 236, 0]},
                             {"Value": 3, "Name": "Open Water", "Color": [0, 0, 117]},
                             {"Value": 4, "Name": "Scrub/Shrub", "Color": [102, 102, 0]},
                             {"Value": 5, "Name": "Impervious Surface", "Color": [236, 236, 236]}]})
from_model_path(model)

Initializes a Model object from the URL of a model package item or the path of a model definition file.

Example usage:

model = Model()
model.from_model_path("https://xxxportal.esri.com/sharing/rest/content/items/<itemId>")

or

model = Model()
model.from_model_path("\\\\sharedstorage\\sharefolder\\findtrees.emd")
install(*, gis=None, **kwargs)

Installs the uploaded model package (*.dlpk). Optionally, after inferencing with the model, it can be uninstalled using uninstall().
Argument
Description
gis
Optional GIS. The GIS on which this tool runs. If not specified, the active GIS is used.
- Returns
Path where model is installed
query_info(*, gis=None, **kwargs)

Extracts the deep learning model-specific settings from the model package item or model definition file.
Argument
Description
gis
Optional GIS. The GIS on which this tool runs. If not specified, the active GIS is used.
- Returns
The key model information, in dictionary format, describing the settings that are essential for this type of deep learning model.
uninstall(*, gis=None, **kwargs)

Uninstalls the model package that was installed using install(). This deletes the installed deep learning model from the server but not the portal item.
Argument
Description
gis
Optional GIS. The GIS on which this tool runs. If not specified, the active GIS is used.
- Returns
itemId of the uninstalled model package item
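A sketch of the install / query_info / uninstall lifecycle; the model package item is hypothetical and `gis` is an already authenticated GIS.

from arcgis.learn import Model

# Hypothetical *.dlpk item previously uploaded to the GIS.
dlpk_item = gis.content.search("tree_detection", "Deep Learning Package")[0]

model = Model(dlpk_item)
install_path = model.install(gis=gis)       # install the package on the server
settings = model.query_info(gis=gis)        # inspect model-specific settings
print(install_path, settings)

# ... run detect_objects or classify_pixels with the model ...

removed_item_id = model.uninstall(gis=gis)  # remove the installed model; the portal item remains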
prepare_data

learn.prepare_data(path, class_mapping, chip_size=224, val_split_pct=0.1, batch_size=64, transforms=None, collate_fn=<function _bb_pad_collate>, seed=42)

Prepares a Fast.ai DataBunch from the Pascal VOC image chips exported by the Export Training Data tool in ArcGIS Pro or Image Server. This DataBunch consists of training and validation DataLoaders with the specified transformations, chip size, batch size, and split percentage.
Argument
Description
path
Required string. Path to data directory.
class_mapping
Required dictionary. Mapping from PascalVOC id to its string label.
chip_size
Optional integer. Size of the image to train the model.
val_split_pct
Optional float. Percentage of training data to keep as validation.
batch_size
Optional integer. Batch size for mini-batch gradient descent (reduce it if you get CUDA out-of-memory errors).
transforms
Optional tuple. Fast.ai transforms for data augmentation of the training and validation datasets, respectively (the defaults work well for satellite imagery).
collate_fn
Optional function. Passed to PyTorch to collate data into batches (the default usually works).
seed
Optional integer. Random seed for reproducible train-validation split.
- Returns
fastai DataBunch object
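A sketch, assuming chips were exported in the PASCAL VOC rectangles format to a local folder; the path and class mapping below are placeholders.

from arcgis.learn import prepare_data

data = prepare_data(path=r"C:\data\building_chips",
                    class_mapping={1: "building"},
                    chip_size=448,
                    val_split_pct=0.1,
                    batch_size=32)

data.show_batch()  # visually inspect a few labeled chips (fastai DataBunch method)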
SingleShotDetector

class arcgis.learn.SingleShotDetector(data, grids=[4, 2, 1], zooms=[0.7, 1.0, 1.3], ratios=[[1.0, 1.0], [1.0, 0.5], [0.5, 1.0]], backbone=None, drop=0.3, bias=-4.0, focal_loss=False, pretrained_path=None)

Creates a Single Shot Detector with the specified grid sizes, zoom scales, and aspect ratios. Based on Fast.ai MOOC Version 2, Lesson 9.
Argument
Description
data
Required fastai Databunch. Returned data object from prepare_data function.
grids
Required list. Grid sizes used for creating anchor boxes.
zooms
Optional list. Zooms of anchor boxes.
ratios
Optional list of tuples. Aspect ratios of anchor boxes.
backbone
Optional function. Backbone CNN model to be used for creating the base of the SingleShotDetector, which is resnet34 by default.
drop
Optional float. Dropout probability. Increase it to reduce overfitting.
bias
Optional float. Bias for SSD head.
focal_loss
Optional boolean. Uses Focal Loss if True.
pretrained_path
Optional string. Path where pre-trained model is saved.
- Returns
SingleShotDetector Object
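A construction sketch using the DataBunch returned by prepare_data; enabling focal_loss here is an illustrative choice.

from arcgis.learn import SingleShotDetector

# `data` is the DataBunch returned by prepare_data above.
ssd = SingleShotDetector(data,
                         grids=[4, 2, 1],
                         zooms=[0.7, 1.0, 1.3],
                         ratios=[[1.0, 1.0], [1.0, 0.5], [0.5, 1.0]],
                         focal_loss=True)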
fit(epochs=10, lr=slice(0.0001, 0.003, None))

Trains the model for the specified number of epochs using the specified learning rates.
Argument
Description
epochs
Required integer. Number of cycles of training on the data. Increase it if underfitting.
lr
Required float or slice of floats. Learning rate to be used for training the model. Select a value from the lr_find() plot.
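A typical pattern, sketched below, is to run lr_find() and then pass a learning-rate slice to fit(); the epoch count and rates are illustrative.

# Plot loss against learning rate to pick a suitable value.
ssd.lr_find()

# Train for 20 epochs with an illustrative learning-rate slice.
ssd.fit(epochs=20, lr=slice(1e-4, 3e-3))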
classmethod from_emd(data, emd_path)

Creates a Single Shot Detector from an Esri Model Definition (EMD) file.
Argument
Description
data
Required fastai Databunch or None. Returned data object from prepare_data function or None for inferencing.
emd_path
Required string. Path to Esri Model Definition file.
- Returns
SingleShotDetector Object
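A sketch of loading a trained detector for inferencing from its EMD file; the path is a placeholder.

from arcgis.learn import SingleShotDetector

# Pass data=None when the model is only needed for inferencing.
ssd = SingleShotDetector.from_emd(data=None,
                                  emd_path=r"C:\models\building_detector\building_detector.emd")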
load(name_or_path)

Loads a saved model for inferencing or fine-tuning from the specified path or model name.
Argument
Description
name_or_path
Required string. Name of the model to load from the pre-defined location. If a path is passed, the model is loaded from the specified path, with the model name as the directory name. A path to a ".pth" file can also be passed.
lr_find()

Runs the Learning Rate Finder and displays a graph of its output. Helps in choosing the optimum learning rate for training the model.
save(name_or_path)

Saves the model weights and creates an Esri Model Definition and a Deep Learning Package zip for deployment to Image Server or ArcGIS Pro.
Argument
Description
name_or_path
Required string. Name of the model to save. It is stored at the pre-defined location. If a path is passed, the model is stored at the specified path, with the model name as the directory name, and all intermediate directories are created.
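A save and load round trip might look like the following sketch; the model name is a placeholder.

# Save weights plus the EMD and Deep Learning Package under a new directory.
ssd.save("building_detector_v1")

# Later, reload the same weights for fine-tuning or inferencing.
ssd.load("building_detector_v1")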
show_results(rows=5, thresh=0.5, nms_overlap=0.1)

Displays the results of a trained model on a part of the validation set.
unfreeze()

Unfreezes the earlier layers of the detector for fine-tuning.
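A short fine-tuning sequence tying these methods together might look like this sketch; the thresholds, epoch count, and learning rates are illustrative.

# Inspect predictions on the validation set.
ssd.show_results(rows=5, thresh=0.5, nms_overlap=0.1)

# Unfreeze the backbone layers and continue training at lower learning rates.
ssd.unfreeze()
ssd.lr_find()
ssd.fit(epochs=10, lr=slice(1e-5, 1e-4))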