salad.models.digits package

Collection of models to reproduce results from the original publications.

Submodules

salad.models.digits.adv module

class salad.models.digits.adv.AdvModel

Bases: torch.nn.modules.module.Module

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class salad.models.digits.adv.Feature

Bases: torch.nn.modules.module.Module

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class salad.models.digits.adv.Predictor(prob=0.5)

Bases: torch.nn.modules.module.Module

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
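
A minimal usage sketch. It assumes that Feature maps 32x32 RGB digit images (NCHW layout) to feature vectors, that Predictor consumes those features, and that prob is a dropout probability; none of this is stated above, so treat shapes and semantics as assumptions:

>>> import torch
>>> from salad.models.digits.adv import Feature, Predictor
>>> feats = Feature()
>>> clf = Predictor(prob=0.5)       # prob assumed to be a dropout probability
>>> x = torch.randn(8, 3, 32, 32)   # hypothetical batch of eight 32x32 RGB digits
>>> z = feats(x)                    # call the module, not forward(), so hooks run
>>> y = clf(z)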

salad.models.digits.assoc module

class salad.models.digits.assoc.FrenchModel

Bases: torch.nn.modules.module.Module

Model used in “Self-Ensembling for Visual Domain Adaptation” by French et al.

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
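
A brief sketch of calling the model; the 32x32 RGB input shape is an assumption based on the digit benchmarks used in the paper:

>>> import torch
>>> from salad.models.digits.assoc import FrenchModel
>>> model = FrenchModel()
>>> x = torch.randn(8, 3, 32, 32)   # assumed input: batch of 32x32 RGB digits
>>> out = model(x)                  # invoke the module so registered hooks run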

class salad.models.digits.assoc.SVHNmodel

Bases: torch.nn.modules.module.Module

Model for application on SVHN data (32x32x3). Architecture identical to https://github.com/haeusser/learning_by_association

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

salad.models.digits.assoc.conv2d(m, n, k, act=True)
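
A short sketch of the helper and the SVHN model. Interpreting m, n, and k as input channels, output channels, and kernel size is an assumption, as is the 32x32x3 input (NCHW in PyTorch):

>>> import torch
>>> from salad.models.digits.assoc import SVHNmodel, conv2d
>>> block = conv2d(3, 32, 3)        # assumed: 3 -> 32 channels, 3x3 kernel, with activation
>>> model = SVHNmodel()
>>> x = torch.randn(8, 3, 32, 32)   # SVHN-sized batch
>>> out = model(x)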

salad.models.digits.corr module

class salad.models.digits.corr.FrenchModel

Bases: torch.nn.modules.module.Module

Model used in “Self-Ensembling for Visual Domain Adaptation” by French et al.

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

class salad.models.digits.corr.SVHNmodel

Bases: torch.nn.modules.module.Module

Model for application on SVHN data (32x32x3). Architecture identical to https://github.com/haeusser/learning_by_association

forward(x)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

salad.models.digits.corr.conv2d(m, n, k, act=True)
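
The corr variant exposes the same helper; a hedged example of disabling the trailing activation via the act flag, assuming the parameters mirror the assoc version above:

>>> from salad.models.digits.corr import conv2d
>>> head = conv2d(128, 10, 1, act=False)   # assumed: 1x1 convolution with no activation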

salad.models.digits.dirtt module

class salad.models.digits.dirtt.ConditionalBatchNorm(*args, n_domains=1, bn_func=<class 'torch.nn.modules.batchnorm.BatchNorm2d'>, **kwargs)

Bases: torch.nn.modules.module.Module

forward(x, d)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

parameters(d=0)

Returns an iterator over module parameters.

This is typically passed to an optimizer.

Yields: Parameter – module parameter

Example:

>>> for param in model.parameters():
...     print(type(param.data), param.size())
<class 'torch.FloatTensor'> (20L,)
<class 'torch.FloatTensor'> (20L, 1L, 5L, 5L)
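
A hedged usage sketch of ConditionalBatchNorm. Passing the number of feature maps through *args to the underlying BatchNorm2d, and selecting per-domain statistics via the second argument of forward, are assumptions about the implementation:

>>> import torch
>>> from salad.models.digits.dirtt import ConditionalBatchNorm
>>> cbn = ConditionalBatchNorm(64, n_domains=2)   # assumed: 64 channels, two domains
>>> x = torch.randn(8, 64, 16, 16)
>>> y = cbn(x, 0)                                 # normalize with the domain-0 statistics
>>> d0_params = list(cbn.parameters(d=0))         # parameters associated with domain 0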

class salad.models.digits.dirtt.SVHN_MNIST_Model(n_classes=10, n_domains=2)

Bases: torch.nn.modules.module.Module

conditional_params(d=0)

forward(x, d=0)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

parameters(d=0, yield_shared=True, yield_conditional=True)

Returns an iterator over module parameters.

This is typically passed to an optimizer.

Yields: Parameter – module parameter

Example:

>>> for param in model.parameters():
...     print(type(param.data), param.size())
<class 'torch.FloatTensor'> (20L,)
<class 'torch.FloatTensor'> (20L, 1L, 5L, 5L)
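
A hedged end-to-end sketch; the 32x32 RGB input shape and the interpretation of the yield_* flags (restricting the iterator to shared or domain-conditional parameters) are assumptions:

>>> import torch
>>> from salad.models.digits.dirtt import SVHN_MNIST_Model
>>> model = SVHN_MNIST_Model(n_classes=10, n_domains=2)
>>> x = torch.randn(8, 3, 32, 32)                             # assumed SVHN/MNIST-style RGB batch
>>> logits = model(x, d=1)                                    # forward pass conditioned on domain 1
>>> shared = model.parameters(d=1, yield_conditional=False)   # assumed: shared weights only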

salad.models.digits.ensemble module

class salad.models.digits.ensemble.ConditionalBatchNorm(*args, n_domains=1, bn_func=<class 'torch.nn.modules.batchnorm.BatchNorm2d'>, **kwargs)

Bases: torch.nn.modules.module.Module

forward(x, d)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

parameters(d=0)

Returns an iterator over module parameters.

This is typically passed to an optimizer.

Yields: Parameter – module parameter

Example:

>>> for param in model.parameters():
...     print(type(param.data), param.size())
<class 'torch.FloatTensor'> (20L,)
<class 'torch.FloatTensor'> (20L, 1L, 5L, 5L)

class salad.models.digits.ensemble.SVHN_MNIST_Model(n_classes=10, n_domains=2)

Bases: torch.nn.modules.module.Module

conditional_params(d=0)

forward(x, d=0)

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

parameters(d=0, yield_shared=True, yield_conditional=True)

Returns an iterator over module parameters.

This is typically passed to an optimizer.

Yields: Parameter – module parameter

Example:

>>> for param in model.parameters():
...     print(type(param.data), param.size())
<class 'torch.FloatTensor'> (20L,)
<class 'torch.FloatTensor'> (20L, 1L, 5L, 5L)
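
The ensemble variant mirrors the dirtt model; a brief sketch of the conditional_params accessor, assuming it yields only the domain-specific (conditional batch-norm) parameters for the given domain index:

>>> from salad.models.digits.ensemble import SVHN_MNIST_Model
>>> model = SVHN_MNIST_Model(n_classes=10, n_domains=2)
>>> cond = list(model.conditional_params(d=0))   # assumed: domain-0 conditional parameters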

salad.models.digits.fan module