
I am looking for a simple way to use an activation function that exists in the PyTorch library, but with some sort of parameter. For example:

Tanh(x/10)

The only way I came up with while looking for a solution was implementing the custom function completely from scratch. Is there a better/more elegant way to do this?

Edit:

I am looking for a way to append the function Tanh(x/10) to my model rather than plain Tanh(x). Here is the relevant code block:

    self.model = nn.Sequential()
    for i in range(len(self.layers) - 1):
        self.model.add_module("linear_layer_" + str(i), nn.Linear(self.layers[i], self.layers[i + 1]))
        if activations is None:
            self.model.add_module("activation_" + str(i), nn.Tanh())
        else:
            if activations[i] == "T":
                self.model.add_module("activation_" + str(i), nn.Tanh())
            elif activations[i] == "R":
                self.model.add_module("activation_" + str(i), nn.ReLU())
            else:
                # no activation
                pass
  • I guess you can take a look at lambda functions, if I understand correctly. For example, Tanh(x / 10) can be implemented as new_tanh = lambda x: torch.tanh(x / 10). Then you can call it with new_tanh(y), which will return the value of Tanh(y / 10). Commented Jan 13, 2019 at 23:13
  • Or more generally, you can just implement a new function that delegates the computation to torch.tanh: def new_tanh(x): return torch.tanh(x / 10). (sorry about the indentation) Commented Jan 13, 2019 at 23:17
  • When using lambdas I get an error which says that a lambda isn't a PyTorch module. Lambdas worked for me in some cases, but in others I get this error. Commented Jan 13, 2019 at 23:30
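
As the last comment notes, a bare lambda is not an nn.Module, so it cannot be registered inside an nn.Sequential. A minimal sketch of the failure and a workaround follows; the Lambda wrapper here is a hypothetical helper written for illustration, not part of torch.nn:

    import torch
    import torch.nn as nn

    new_tanh = lambda x: torch.tanh(x / 10)
    new_tanh(torch.ones(3))  # works fine as a plain function

    # nn.Sequential registers its children as submodules, so each one must be
    # an nn.Module instance; adding the bare lambda raises a TypeError:
    # nn.Sequential().add_module("act", new_tanh)  # TypeError: not a Module subclass

    # wrapping the callable in a tiny module works around this:
    class Lambda(nn.Module):
        def __init__(self, fn):
            super().__init__()
            self.fn = fn

        def forward(self, x):
            return self.fn(x)

    model = nn.Sequential(nn.Linear(4, 4), Lambda(new_tanh))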

2 Answers


Instead of defining it as a separate function, you could inline it in a custom layer.

For instance, your solution could look like:


    import torch
    import torch.nn as nn

    class Net(nn.Module):
        def __init__(self):
            super(Net, self).__init__()
            self.fc1 = nn.Linear(4, 10)
            self.fc2 = nn.Linear(10, 3)
            self.fc3 = nn.Softmax(dim=1)  # pass dim explicitly to avoid the implicit-dim warning

        def forward(self, x):
            # the scaled tanh is inlined here rather than added as a separate layer
            return self.fc3(self.fc2(torch.tanh(self.fc1(x) / 10)))

where torch.tanh(self.fc1(x) / 10) is inlined in the forward method of your module.
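
A quick usage sketch, assuming the toy sizes above (4 input features, 3 output classes):

    net = Net()
    x = torch.randn(8, 4)  # a batch of 8 samples with 4 features each
    out = net(x)
    print(out.shape)       # torch.Size([8, 3]); rows sum to 1 thanks to the softmax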


You can create a custom layer that takes the multiplier as a parameter:

    import torch
    import torch.nn as nn

    class CustomTanh(nn.Module):

        # the __init__ method takes the parameter:
        def __init__(self, multiplier):
            super().__init__()  # required, otherwise PyTorch cannot register the module
            self.multiplier = multiplier

        # the forward method applies it:
        def forward(self, x):
            x = self.multiplier * x
            return torch.tanh(x)

Add it to your models with CustomTanh(1/10) instead of nn.Tanh().
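
For instance, here is a sketch of the loop from the question with the scaled tanh plugged in (self.layers and activations are assumed to be defined as in the question):

    self.model = nn.Sequential()
    for i in range(len(self.layers) - 1):
        self.model.add_module("linear_layer_" + str(i), nn.Linear(self.layers[i], self.layers[i + 1]))
        if activations is None or activations[i] == "T":
            # CustomTanh(1/10) computes tanh(x / 10)
            self.model.add_module("activation_" + str(i), CustomTanh(1 / 10))
        elif activations[i] == "R":
            self.model.add_module("activation_" + str(i), nn.ReLU())
        # otherwise: no activation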
