salad.datasets.transforms package
salad.datasets.transforms.tensor(data, dtype=None, device=None, requires_grad=False) → Tensor

    Constructs a tensor with data.

    Warning: torch.tensor() always copies data. If you have a Tensor data and want to avoid a copy, use torch.Tensor.requires_grad_() or torch.Tensor.detach(). If you have a NumPy ndarray and want to avoid a copy, use torch.from_numpy().

    Parameters:
    - data (array_like) – Initial data for the tensor. Can be a list, tuple, NumPy ndarray, scalar, and other types.
    - dtype (torch.dtype, optional) – the desired data type of the returned tensor. Default: if None, infers data type from data.
    - device (torch.device, optional) – the desired device of the returned tensor. Default: if None, uses the current device for the default tensor type (see torch.set_default_tensor_type()). device will be the CPU for CPU tensor types and the current CUDA device for CUDA tensor types.
    - requires_grad (bool, optional) – If autograd should record operations on the returned tensor. Default: False.

    Example:

    >>> torch.tensor([[0.1, 1.2], [2.2, 3.1], [4.9, 5.2]])
    tensor([[ 0.1000,  1.2000],
            [ 2.2000,  3.1000],
            [ 4.9000,  5.2000]])

    >>> torch.tensor([0, 1])  # Type inference on data
    tensor([ 0,  1])

    >>> torch.tensor([[0.11111, 0.222222, 0.3333333]],
    ...              dtype=torch.float64,
    ...              device=torch.device('cuda:0'))  # creates a torch.cuda.DoubleTensor
    tensor([[ 0.1111,  0.2222,  0.3333]], dtype=torch.float64, device='cuda:0')

    >>> torch.tensor(3.14159)  # Create a scalar (zero-dimensional tensor)
    tensor(3.1416)

    >>> torch.tensor([])  # Create an empty tensor (of size (0,))
    tensor([])
Submodules
salad.datasets.transforms.digits module
Standard Transformations for Digit datasets
salad.datasets.transforms.digits.default_normalization(key)

salad.datasets.transforms.digits.default_transforms(key)
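Judging from the signatures above, these helpers look up a dataset-specific transform and normalization by a string key. The sketch below is illustrative only; the 'mnist' key and the torchvision hand-off are assumptions, not documented behaviour.

    # Hypothetical usage sketch; the 'mnist' key and the torchvision integration
    # are assumptions based on the documented signatures.
    from salad.datasets.transforms import digits

    transform = digits.default_transforms('mnist')       # assumed dataset key
    normalize = digits.default_normalization('mnist')    # assumed dataset key

    # A typical pattern would be to hand the transform to a dataset, e.g.:
    # from torchvision.datasets import MNIST
    # dataset = MNIST('./data', train=True, download=True, transform=transform)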
salad.datasets.transforms.digits.tensor(data, dtype=None, device=None, requires_grad=False) → Tensor

    Same function as salad.datasets.transforms.tensor above (the torch.tensor() documentation); see that entry for the full parameter description and examples.
salad.datasets.transforms.ensembling module
class salad.datasets.transforms.ensembling.Augmentation(dataset, n_samples=1)

    Bases: object
class salad.datasets.transforms.ensembling.ImageAugmentation(hflip, xlat_range, affine_std, rot_std=0.0, intens_flip=False, intens_scale_range_lower=None, intens_scale_range_upper=None, intens_offset_range_lower=None, intens_offset_range_upper=None, scale_x_range=None, scale_y_range=None, scale_u_range=None, gaussian_noise_std=0.0, blur_range=None)

    Bases: object

    augment(X)

    augment_pair(X)
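A possible usage sketch follows. The constructor arguments come from the signature above, but the chosen values are arbitrary, and the assumption that augment_pair(X) returns two independently augmented views of a batch (a common pattern in self-ensembling) is inferred from the method name rather than documented.

    # Illustrative only; parameter values are arbitrary and the behaviour of
    # augment_pair is assumed, not documented.
    import numpy as np
    from salad.datasets.transforms.ensembling import ImageAugmentation

    aug = ImageAugmentation(
        hflip=True,          # random horizontal flips
        xlat_range=2.0,      # translation range in pixels
        affine_std=0.1,      # std of random affine perturbation
        rot_std=0.1,         # std of random rotation (radians)
        gaussian_noise_std=0.1,
    )

    X = np.random.rand(32, 1, 28, 28).astype('float32')  # batch of images (format assumed)
    Xa, Xb = aug.augment_pair(X)  # two augmented views, e.g. for a consistency loss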
salad.datasets.transforms.ensembling.cat_nx2x3(a, b)

    Multiply the N 2x3 transformations stored in a with those in b.

    Parameters:
    - a – transformation matrices, (N,2,3) array
    - b – transformation matrices, (N,2,3) array

    Returns: a . b
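The product "a . b" is composition of affine maps. A minimal NumPy sketch of the underlying computation, not necessarily the library's exact implementation:

    # Pad each (N,2,3) affine with a [0, 0, 1] row, matrix-multiply, and drop
    # the bottom row again. By matrix convention, the result applies b first, then a.
    import numpy as np

    def compose_nx2x3(a, b):
        n = a.shape[0]
        bottom = np.tile(np.array([[[0.0, 0.0, 1.0]]]), (n, 1, 1))  # (N,1,3)
        a3 = np.concatenate([a, bottom], axis=1)                    # (N,3,3)
        b3 = np.concatenate([b, bottom], axis=1)                    # (N,3,3)
        return np.matmul(a3, b3)[:, :2, :]                          # back to (N,2,3)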
salad.datasets.transforms.ensembling.centre_xf(xf, size)

    Centre the transformations in xf around (0,0), where the current centre is assumed to be at the centre of an image of shape size.

    Parameters:
    - xf – transformation matrices, (N,2,3) array
    - size – image size

    Returns: centred transformation matrices, (N,2,3) array
salad.datasets.transforms.ensembling.identity_xf(N)

    Construct N identity 2x3 transformation matrices.

    Returns: array of shape (N,2,3)
salad.datasets.transforms.ensembling.inv_nx2x2(X)

    Invert the N 2x2 transformation matrices stored in X, a (N,2,2) array.

    Parameters:
    - X – transformation matrices to invert, (N,2,2) array

    Returns: inverse of X
salad.datasets.transforms.ensembling.inv_nx2x3(m)

    Invert the N 2x3 transformation matrices stored in m, a (N,2,3) array.

    Parameters:
    - m – transformation matrices to invert, (N,2,3) array

    Returns: inverse of m
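For reference, inverting a 2x3 affine [A | t] amounts to [A^-1 | -A^-1 t]. A NumPy sketch of that math, offered only as an illustration of what the function computes:

    # Invert the 2x2 linear part and map the translation accordingly.
    import numpy as np

    def invert_nx2x3(m):
        A = m[:, :, :2]                      # (N,2,2) linear part
        t = m[:, :, 2:]                      # (N,2,1) translation
        A_inv = np.linalg.inv(A)             # batched 2x2 inverse
        t_inv = -np.matmul(A_inv, t)         # transformed translation
        return np.concatenate([A_inv, t_inv], axis=2)   # (N,2,3)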
salad.datasets.transforms.ensembling.rotation_matrices(thetas)

    Generate rotation matrices.

    Parameters:
    - thetas – rotation angles in radians, a (N,) array

    Returns: rotation matrices, (N,2,3) array
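Taken together, these helpers build and manipulate batches of 2x3 affine transforms. The snippet below strings a few of them together; the argument values, the composition order, and the format of the size argument are assumptions for illustration, not documented usage.

    import numpy as np
    from salad.datasets.transforms.ensembling import (
        identity_xf, rotation_matrices, cat_nx2x3, inv_nx2x3, centre_xf)

    N = 8
    xf = identity_xf(N)                                   # (N,2,3) identities
    rot = rotation_matrices(np.random.uniform(-0.1, 0.1, size=(N,)))
    xf = cat_nx2x3(rot, xf)                               # compose with small random rotations
    xf = centre_xf(xf, (28, 28))                          # centre on a 28x28 image (size format assumed)
    xf_inv = inv_nx2x3(xf)                                # matrices that undo the transform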
salad.datasets.transforms.noise module
class salad.datasets.transforms.noise.DomainConfusion(transform_list, intermediate)

    Bases: object

    Given x and a set of possible transforms, applies a randomly chosen transform to x and returns the pair (x, d).
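The idea can be sketched in a few lines. The snippet below is a conceptual illustration of "apply a random transform and return (x, d)", with d assumed to be the index of the chosen transform; it is not the DomainConfusion implementation itself.

    import random

    def domain_confusion(x, transforms):
        d = random.randrange(len(transforms))   # pick a transform at random
        return transforms[d](x), d              # transformed input plus its identifier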
class salad.datasets.transforms.noise.DomainLabel(domain, n_domains)

    Bases: object

    Concatenates a domain label to the dataset.
class salad.datasets.transforms.noise.Gaussian(mu=0.0, sigma=0.1)

    Bases: object

    Adds Gaussian noise.
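A conceptual sketch of additive Gaussian noise with the documented defaults; the actual Gaussian transform may clamp or otherwise post-process the result.

    import torch

    def add_gaussian_noise(x, mu=0.0, sigma=0.1):
        # shift by mu and add zero-mean noise scaled by sigma
        return x + mu + sigma * torch.randn_like(x)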
class salad.datasets.transforms.noise.InvertContrast

    Bases: object
class salad.datasets.transforms.noise.SaltAndPepper(p=0.05)

    Bases: object

    Adds salt and pepper noise with probability p to a given image or batch of images.
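A conceptual sketch of salt-and-pepper noise: with probability p each pixel is set to the minimum or maximum value. The value range and the 50/50 salt/pepper split are assumptions; this is not necessarily the SaltAndPepper implementation.

    import torch

    def salt_and_pepper(x, p=0.05, low=0.0, high=1.0):
        mask = torch.rand_like(x) < p        # pixels to corrupt
        salt = torch.rand_like(x) < 0.5      # half salt, half pepper
        out = x.clone()
        out[mask & salt] = high
        out[mask & ~salt] = low
        return out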
class salad.datasets.transforms.noise.Shift(w=5, h=5)

    Bases: object
class salad.datasets.transforms.noise.Uniform(p=0.05)

    Bases: object

    Adds uniform noise.
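As an illustration of how the noise transforms could be combined, the sketch below composes them with torchvision. It assumes each class is callable on a tensor image, which is inferred from the module's role as a transforms package and is not explicitly documented here.

    from torchvision import transforms
    from salad.datasets.transforms.noise import Gaussian, SaltAndPepper

    noisy_pipeline = transforms.Compose([
        transforms.ToTensor(),
        Gaussian(mu=0.0, sigma=0.1),   # additive Gaussian noise
        SaltAndPepper(p=0.05),         # salt-and-pepper corruption
    ])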