
Neural network: is a bias of zero the same as a layer without bias?

Data Science Asked by Kyle_397 on August 4, 2021

Question as in the title. Is setting the bias to zero the same as removing the bias from the layer? Here's a PyTorch implementation to show what I mean.

import torch

class MLP_without_bias(torch.nn.Module):
    def __init__(self):
        super().__init__()  

        # Bias set as False
        self.linear = torch.nn.Linear(5, 3, bias = False)

        # Xavier initialization 
        torch.nn.init.xavier_uniform_(self.linear.weight)

    def forward(self, x):
        return self.linear(x)

class MLP_with_bias_zero(torch.nn.Module):
    def __init__(self):
        super().__init__()  

        # Default bias set as True
        self.linear = torch.nn.Linear(5, 3)

        # Xavier initialization 
        torch.nn.init.xavier_uniform_(self.linear.weight)

        # Bias initialized as zero
        torch.nn.init.zeros_(self.linear.bias)     

    def forward(self, x):
        return self.linear(x)

One Answer

No, they are not the same:

  • In MLP_without_bias there is no bias parameter at all (self.linear.bias is None), because of bias=False, so the output stays bias-free throughout training.

  • In MLP_with_bias_zero the bias is zero only at initialization; it is a trainable parameter, so gradient updates will generally move it away from zero during training.
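A minimal sketch of the difference, using bare torch.nn.Linear layers (standing in for the two classes above): the two models produce identical outputs while the bias is exactly zero, but a single gradient step on an illustrative toy loss makes the zero-initialized bias non-zero.

```python
import torch

torch.manual_seed(0)

no_bias = torch.nn.Linear(5, 3, bias=False)
zero_bias = torch.nn.Linear(5, 3)
torch.nn.init.zeros_(zero_bias.bias)
# Copy the weights so the only difference is the bias handling
with torch.no_grad():
    zero_bias.weight.copy_(no_bias.weight)

x = torch.randn(8, 5)

# With bias=False there is no bias parameter at all
print(no_bias.bias)  # None

# While the bias is exactly zero, the outputs are identical
assert torch.allclose(no_bias(x), zero_bias(x))

# One SGD step on a toy loss pushes the trainable bias away from zero
opt = torch.optim.SGD(zero_bias.parameters(), lr=0.1)
loss = zero_bias(x).pow(2).mean()
loss.backward()
opt.step()
print(zero_bias.bias)  # no longer all zeros
```

After the update step the two models no longer agree: the bias in zero_bias now contributes to the output, while no_bias can never learn one.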

Correct answer by noe on August 4, 2021
