hopwise.model.init

Functions

| Function | Description |
|---|---|
| xavier_normal_initialization(module) | Uses PyTorch's xavier_normal_ to initialize the parameters in nn.Embedding and nn.Linear layers. |
| xavier_uniform_initialization(module) | Uses PyTorch's xavier_uniform_ to initialize the parameters in nn.Embedding and nn.Linear layers. |

Module Contents
- hopwise.model.init.xavier_normal_initialization(module)
Uses PyTorch's xavier_normal_ to initialize the parameters in nn.Embedding and nn.Linear layers. Biases in nn.Linear layers are initialized to the constant 0.
Examples
>>> self.apply(xavier_normal_initialization)
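A minimal sketch of an initializer with the behavior described above (hopwise's actual implementation may differ in details): it checks the module type, applies xavier_normal_ to the weight of nn.Embedding and nn.Linear, and zeroes nn.Linear biases.

```python
import torch.nn as nn
from torch.nn.init import constant_, xavier_normal_


def xavier_normal_initialization(module):
    """Sketch consistent with the docstring above, not necessarily
    hopwise's exact code: xavier-normal weights for Embedding/Linear,
    zero biases for Linear."""
    if isinstance(module, nn.Embedding):
        xavier_normal_(module.weight.data)
    elif isinstance(module, nn.Linear):
        xavier_normal_(module.weight.data)
        if module.bias is not None:
            constant_(module.bias.data, 0)
```

Passing this function to Module.apply (as in the example above) recursively visits every submodule, so all Embedding and Linear layers in the model are initialized in one call.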
- hopwise.model.init.xavier_uniform_initialization(module)
Uses PyTorch's xavier_uniform_ to initialize the parameters in nn.Embedding and nn.Linear layers. Biases in nn.Linear layers are initialized to the constant 0.
Examples
>>> self.apply(xavier_uniform_initialization)
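To show the uniform variant end to end, here is a hedged sketch that applies it from a model's constructor via self.apply, as the example above suggests. ToyRecModel and its layer names are hypothetical, used only for illustration; the initializer body mirrors the described behavior rather than hopwise's exact code.

```python
import torch.nn as nn
from torch.nn.init import constant_, xavier_uniform_


def xavier_uniform_initialization(module):
    # Sketch of the described behavior: xavier-uniform weights for
    # Embedding/Linear, zero biases for Linear.
    if isinstance(module, (nn.Embedding, nn.Linear)):
        xavier_uniform_(module.weight.data)
    if isinstance(module, nn.Linear) and module.bias is not None:
        constant_(module.bias.data, 0)


class ToyRecModel(nn.Module):
    """Hypothetical model illustrating the self.apply(...) idiom."""

    def __init__(self, n_items=100, dim=16):
        super().__init__()
        self.item_embedding = nn.Embedding(n_items, dim)
        self.predict_layer = nn.Linear(dim, 1)
        # Recursively initializes every submodule, including the ones above.
        self.apply(xavier_uniform_initialization)
```

Calling self.apply in __init__ is convenient because any layer added to the model is initialized without listing it explicitly; xavier_uniform_ draws each weight from U(-b, b) with b = sqrt(6 / (fan_in + fan_out)).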