Is there something that prevents a from exploding or vanishing?
I've been trying this in my own implementation and seem to be getting NaN values from snake (the cause appears to be division or multiplication by a very large or very small a).
Our code looks very similar on the surface, and I couldn't find anything that prevents this in your code.
This isn't really an issue, but I didn't know how else to ask.
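For concreteness, here is a tiny repro (hypothetical values, just to illustrate) of how the NaN appears once a collapses to exactly zero: the `1/a` term becomes `inf`, the `sin^2` term becomes `0`, and their product is `nan`.

```python
import torch

a = torch.tensor(0.0)  # a learnable a that has shrunk to zero during training
x = torch.tensor(1.5)

# (1/0) * sin(0)^2  ->  inf * 0  ->  nan
y = x + (1 / a) * torch.sin(a * x) ** 2
print(y)  # tensor(nan)
```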
Here is what my implementation looks like. The only difference I see is that you're accounting for a being None, but I'm not (a is never None in my case):
import torch
import torch.nn as nn

class SnakeActivation(nn.Module):
    '''
    Defines the snake activation function with learnable parameter a;
    returns x + (1/a) * sin^2(a*x).
    '''
    def __init__(self, a, in_features):
        super().__init__()
        self.a = nn.Parameter(
            torch.ones(in_features) * a, requires_grad=True
        )

    def snake(self, x):
        return x + (1 / self.a) * torch.pow(torch.sin(self.a * x), 2)

    def forward(self, x):
        return self.snake(x)
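One common way to guard against this (a minimal sketch of an assumed safeguard, not necessarily what this repository does) is to add a small epsilon to the denominator, so the forward pass stays finite even if a reaches zero:

```python
import torch
import torch.nn as nn


class SnakeActivation(nn.Module):
    '''
    Snake activation x + (1/a) * sin^2(a*x) with a learnable per-feature a.

    A small eps in the denominator guards against division by a near-zero a,
    which otherwise produces inf/nan values. (eps is a hypothetical safeguard
    added here for illustration.)
    '''
    def __init__(self, a, in_features, eps=1e-9):
        super().__init__()
        self.a = nn.Parameter(torch.ones(in_features) * a)
        self.eps = eps

    def forward(self, x):
        return x + torch.pow(torch.sin(self.a * x), 2) / (self.a + self.eps)
```

With this guard, a == 0 simply reduces the activation to the identity (the sin^2 term is zero), instead of producing 0/0 = nan.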