bob.ip.common.data.utils¶
Common utilities
Functions

invert_mode1_image – Inverts a binary PIL image (mode == “1”)
overlayed_bbox_image – Creates an image showing existing bounding boxes
overlayed_image – Creates an image showing existing labels and masks
subtract_mode1_images – Returns a new image that represents img1 - img2

Classes

SSLDataset – PyTorch dataset wrapper around labelled and unlabelled sample lists
SampleListDataset – PyTorch dataset wrapper around Sample lists
SampleListDetectionDataset – PyTorch dataset wrapper around Sample lists
- bob.ip.common.data.utils.subtract_mode1_images(img1, img2)[source]¶
Returns a new image that represents img1 - img2.
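A minimal usage sketch (the two binary images below are synthetic stand-ins for real label or mask images)::

    from PIL import Image, ImageDraw

    from bob.ip.common.data.utils import subtract_mode1_images

    # Two illustrative mode "1" (binary) images: a large filled square and a
    # smaller square contained inside it
    img1 = Image.new("1", (64, 64), 0)
    ImageDraw.Draw(img1).rectangle([8, 8, 56, 56], fill=1)

    img2 = Image.new("1", (64, 64), 0)
    ImageDraw.Draw(img2).rectangle([24, 24, 40, 40], fill=1)

    # The result represents img1 - img2 (the large square with a hole in it)
    diff = subtract_mode1_images(img1, img2)
    diff.save("difference.png")  # inspect visually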
- bob.ip.common.data.utils.overlayed_image(img, label, mask=None, label_color=(0, 255, 0), mask_color=(0, 0, 255), alpha=0.4)[source]¶
Creates an image showing existing labels and masks
This function creates a new representation of the input image img, overlaying a mask for labelled objects (green, by default), and another mask for parts of the image that should be ignored (negative mask; blue, by default). By looking at this representation, it shall be possible to verify if the dataset/loader is yielding images correctly.
- Parameters
img (PIL.Image.Image) – An RGB PIL image that represents the original image for analysis
label (PIL.Image.Image) – A PIL image in any mode that represents the labelled elements in the image. In case of images in mode “L” or “1”, white pixels represent the labelled object; darker pixels represent the background.
mask (PIL.Image.Image, Optional) – A PIL image in mode “1” that represents the mask for the image. White pixels indicate where content should be used; black pixels indicate content to be ignored.
label_color (tuple, Optional) – A tuple with three integer entries indicating the RGB color to be used for labels. Only used if label.mode is “1” or “L”.
mask_color (tuple, Optional) – A tuple with three integer entries indicating the RGB color to be used for the mask-negative (black parts in the original mask).
alpha (float, Optional) – A float that indicates how much blending should be performed between the label, the mask and the original image.
- Returns
image – A new image overlaying the original image, the object labels (in green, by default) and what is to be considered parts to be masked-out (i.e. a representation of the negative of the mask).
- Return type
PIL.Image.Image
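A minimal usage sketch (the image, label and mask below are synthetic stand-ins for what a dataset loader would yield)::

    from PIL import Image, ImageDraw

    from bob.ip.common.data.utils import overlayed_image

    # A gray RGB image, a mode "1" label with a small square object, and a
    # mode "1" mask that ignores a border region around the image
    img = Image.new("RGB", (128, 128), (128, 128, 128))

    label = Image.new("1", (128, 128), 0)
    ImageDraw.Draw(label).rectangle([40, 40, 80, 80], fill=1)

    mask = Image.new("1", (128, 128), 0)
    ImageDraw.Draw(mask).rectangle([10, 10, 117, 117], fill=1)

    overlay = overlayed_image(img, label, mask=mask, alpha=0.4)
    overlay.save("overlay-check.png")  # inspect visually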
- bob.ip.common.data.utils.overlayed_bbox_image(img, box, box_color=(0, 255, 0), width=1)[source]¶
Creates an image showing existing bounding boxes
This function creates a new representation of the input image img, overlaying a green bounding box for labelled objects. By looking at this representation, it shall be possible to verify if the dataset/loader is yielding images correctly.
- Parameters
img (PIL.Image.Image) – An RGB PIL image that represents the original image for analysis
box (list) – A list of bounding box coordinates.
box_color (tuple, Optional) – A tuple with three integer entries indicating the RGB color to be used for the bounding box.
width (int, Optional) – An integer indicating the width of the rectangle line, in pixels.
- Returns
image – A new image overlaying the original image and the bounding boxes (in green, by default).
- Return type
PIL.Image.Image
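A minimal usage sketch; note that the bounding-box format below ([xmin, ymin, xmax, ymax] pixel coordinates) is an assumption, so check what your dataset loader actually provides::

    from PIL import Image

    from bob.ip.common.data.utils import overlayed_bbox_image

    img = Image.new("RGB", (128, 128), (128, 128, 128))

    # Assumed coordinate convention: [xmin, ymin, xmax, ymax]
    box = [20, 30, 90, 100]

    overlay = overlayed_bbox_image(img, box, box_color=(0, 255, 0), width=2)
    overlay.save("bbox-check.png")  # inspect visually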
- class bob.ip.common.data.utils.SampleListDataset(samples, transforms=[])[source]¶
Bases: Dataset
PyTorch dataset wrapper around Sample lists
A transform object can be passed that will be applied to the image, ground truth and mask (if present).
It supports indexing such that dataset[i] can be used to get the i-th sample.
- Parameters
samples (list) – A list of bob.ip.common.data.sample.Sample objects
transforms (list, Optional) – A list of transformations to be applied to both image and ground-truth data. Notice a last transform (bob.ip.common.data.transforms.ToTensor) is always applied - you do not need to add that.
- property transforms¶
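A hedged usage sketch; real sample lists normally come from one of the package's dataset loaders, and the way the Sample objects are built below (a data dictionary plus a key keyword) is an assumption made only to keep the example self-contained::

    from PIL import Image
    from torch.utils.data import DataLoader

    from bob.ip.common.data.sample import Sample
    from bob.ip.common.data.utils import SampleListDataset

    # Illustrative samples only; the constructor call is an assumption
    samples = [
        Sample(
            {"data": Image.new("RGB", (64, 64)), "label": Image.new("1", (64, 64))},
            key=f"sample-{i}",
        )
        for i in range(4)
    ]

    dataset = SampleListDataset(samples)  # ToTensor is appended automatically
    print(len(dataset))  # 4
    item = dataset[0]    # indexing returns the i-th (transformed) sample

    # The wrapper is a regular torch.utils.data.Dataset, so it can feed a DataLoader
    loader = DataLoader(dataset, batch_size=2)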
- class bob.ip.common.data.utils.SampleListDetectionDataset(samples, transforms=[])[source]¶
Bases: Dataset
PyTorch dataset wrapper around Sample lists
A transform object can be passed that will be applied to the image, ground truth and mask (if present).
It supports indexing such that dataset[i] can be used to get the i-th sample.
- Parameters
samples (list) – A list of bob.ip.common.data.sample.Sample objects
transforms (list, Optional) – A list of transformations to be applied to both image and ground-truth data. Notice a last transform (bob.ip.common.data.transforms.ToTensor) is always applied - you do not need to add that.
- property transforms¶
- class bob.ip.common.data.utils.SSLDataset(labelled, unlabelled)[source]¶
Bases: Dataset
PyTorch dataset wrapper around labelled and unlabelled sample lists
Yields elements of the form:
[key, image, ground-truth, [mask,] unlabelled-key, unlabelled-image]
The size of the dataset is the same as the labelled dataset.
Indexing works by selecting the right element from the labelled dataset, and randomly picking another one from the unlabelled dataset.
- Parameters
labelled (torch.utils.data.Dataset) – Labelled dataset (must have “mask” and “label” entries for every sample)
unlabelled (torch.utils.data.Dataset) – Unlabelled dataset (may have “mask” and “label” entries for every sample, but they are ignored)
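A hedged sketch of combining the two wrappers for semi-supervised training; labelled_samples and unlabelled_samples are placeholders for sample lists produced by the package's dataset loaders (see the SampleListDataset sketch above)::

    from torch.utils.data import DataLoader

    from bob.ip.common.data.utils import SampleListDataset, SSLDataset

    # Placeholder lists of bob.ip.common.data.sample.Sample objects; the
    # labelled ones must carry "label" and "mask" entries
    labelled = SampleListDataset(labelled_samples)
    unlabelled = SampleListDataset(unlabelled_samples)

    ssl = SSLDataset(labelled, unlabelled)
    print(len(ssl))  # same as len(labelled)

    # Each item pairs a labelled sample with a randomly drawn unlabelled one:
    # [key, image, ground-truth, [mask,] unlabelled-key, unlabelled-image]
    item = ssl[0]

    loader = DataLoader(ssl, batch_size=2, shuffle=True)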