Syntax

averagePrecision = evaluateDetectionPrecision(detectionResults,groundTruthData)
[averagePrecision,recall,precision] = evaluateDetectionPrecision(detectionResults,groundTruthData)
[averagePrecision,recall,precision] = evaluateDetectionPrecision(detectionResults,groundTruthData,threshold)

Description

averagePrecision = evaluateDetectionPrecision(detectionResults,groundTruthData)
returns the average precision of the detectionResults
compared to the groundTruthData. You can use the average
precision to measure the performance of an object detector. For a multiclass
detector, the function returns averagePrecision as a vector
of scores, one for each object class, in the order specified by
groundTruthData.
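
As a minimal sketch, a call for a single-class detector might look like this. The class name, box coordinates, and score below are illustrative values, not output from a real detector:

```matlab
% Hypothetical single-image, single-class example.
% Ground truth: one column per class; each cell holds the boxes for one image.
groundTruthData = table({[100 100 50 50]}, 'VariableNames', {'vehicle'});

% Detection results: bounding boxes and scores for the same image.
detectionResults = table({[102 98 52 49]}, {0.9}, ...
    'VariableNames', {'Boxes', 'Scores'});

averagePrecision = evaluateDetectionPrecision(detectionResults, groundTruthData);
```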

Input Arguments

detectionResults — Object locations and scores
table

Object locations and scores, specified as a two-column table containing the bounding boxes and
scores for each detected object. For multiclass detection, a third column
contains the predicted label for each detection. In each row of the table, the
bounding boxes must be stored as an M-by-4 matrix, the scores
as an M-by-1 vector, and the labels as a categorical vector,
where M is the number of detections in the corresponding image.

When detecting objects, you can create the detection results table by collecting the output of a detector's detect function over a set of images.
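
For example, the table can be assembled from the output of a detector's detect function. The detector choice and image folder below are assumptions for illustration:

```matlab
% Run a pretrained detector over a folder of test images (folder is hypothetical).
detector = vehicleDetectorACF;          % pretrained ACF vehicle detector
imds = imageDatastore("testImages");    % assumed folder of test images

numImages = numel(imds.Files);
Boxes  = cell(numImages, 1);
Scores = cell(numImages, 1);
for i = 1:numImages
    I = readimage(imds, i);
    [Boxes{i}, Scores{i}] = detect(detector, I);
end
detectionResults = table(Boxes, Scores);
```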

groundTruthData — Labeled ground truth
datastore | table

Labeled ground truth, specified as a datastore or a table.

Each bounding box must be in the format [x y width height].

Datastore — A datastore whose read and
readall functions return a cell array or a table with at least
two columns of bounding box and label cell vectors. The bounding boxes must be
in a cell array of M-by-4 matrices in the format
[x,y,width,height].

Table — One or more columns. All columns contain bounding boxes.
Each column must be a cell vector that contains
M-by-4 matrices that represent a single object
class, such as stopSign,
carRear, or carFront. The
columns contain 4-element double arrays of M
bounding boxes in the format
[x,y,width,height].
The format specifies the upper-left corner location and size of the
bounding box in the corresponding image.
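
For instance, a two-image, two-class ground truth table could be built like this. The class names and box values are illustrative:

```matlab
% One column per object class; each cell holds that class's boxes in one image.
stopSign = {[100 100 30 30]; zeros(0, 4)};   % image 2 has no stop signs
carRear  = {zeros(0, 4); [50 60 80 40]};     % image 1 has no car rears
groundTruthData = table(stopSign, carRear);
```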

threshold — Overlap threshold
0.5 (default) | numeric scalar

Overlap threshold for assigning a detection to a ground truth
box, specified as a numeric scalar. The overlap ratio is computed
as the intersection over union.
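
The intersection-over-union criterion can be sketched directly for two boxes in [x y width height] format. boxIoU is an illustrative helper, not part of the toolbox:

```matlab
function iou = boxIoU(a, b)
% Intersection over union of two boxes in [x y width height] format.
xA = max(a(1), b(1));                       % left edge of intersection
yA = max(a(2), b(2));                       % top edge of intersection
xB = min(a(1) + a(3), b(1) + b(3));         % right edge of intersection
yB = min(a(2) + a(4), b(2) + b(4));         % bottom edge of intersection
interArea = max(0, xB - xA) * max(0, yB - yA);
unionArea = a(3)*a(4) + b(3)*b(4) - interArea;
iou = interArea / unionArea;
end
```

A detection counts as a true positive when its overlap ratio with a ground truth box is at least the threshold (0.5 by default).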

Output Arguments

averagePrecision — Average precision
numeric scalar | vector

Average precision over all the detection results, returned as
a numeric scalar or vector. Precision is a
ratio of true positive instances to all positive instances of objects
in the detector, based on the ground truth. For a multiclass detector,
the average precision is a vector of average precision scores for
each object class.

recall — Recall values
M-by-1 vector of numeric scalars | cell array

Recall values from each detection, returned as an M-by-1 vector of numeric
scalars or as a cell array. M equals 1 +
the number of detections assigned to a class. For example, if your detection
results contain 4 detections with the class label 'car', then
recall contains 5 elements. The first value of
recall is always 0.

Recall is a ratio of true positive instances to the
sum of true positives and false negatives in the detector, based on the
ground truth. For a multiclass detector, recall and
precision are cell arrays, where each cell contains
the data points for each object class.

precision — Precision values
M-by-1 vector of numeric scalars | cell array

Precision values from each detection, returned as an M-by-1 vector of
numeric scalars or as a cell array. M
equals 1 + the number of detections assigned to a class. For example, if
your detection results contain 4 detections with the class label
'car', then precision contains
5 elements. The first value of precision is always
1.

Precision is a ratio of true positive instances to
all positive instances of objects in the detector, based on the ground
truth. For a multiclass detector, recall and
precision are cell arrays, where each cell contains
the data points for each object class.
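
The recall and precision outputs together trace a precision-recall curve. A sketch for a single-class detector, assuming detectionResults and groundTruthData already exist:

```matlab
% Plot the precision-recall curve and annotate it with the average precision.
[ap, recall, precision] = evaluateDetectionPrecision(detectionResults, groundTruthData);
figure
plot(recall, precision)
grid on
xlabel('Recall')
ylabel('Precision')
title(sprintf('Average Precision = %.2f', ap))
```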