I have three series of observations, Y, T, and X, and I would like to compare the predicted values of two models. The first model learns g such that Y = g(T, X). The second model learns L and f such that Y = L(T) * f(X). I have no trouble fitting the first model with the PyTorch or TensorFlow package. However, I am not sure how to learn L and f. In PyTorch, I can set up two feedforward MLPs with different hidden layers and inputs. For simplicity, I define a feedforward MLP class as follows:
import torch as t

class Feedforward(t.nn.Module):  # a simple feedforward neural network
    # Constructor: one hidden layer with ReLU, sigmoid output
    def __init__(self, input_size, hidden_size):
        super(Feedforward, self).__init__()
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.fc1 = t.nn.Linear(self.input_size, self.hidden_size)
        self.relu = t.nn.ReLU()
        self.fc2 = t.nn.Linear(self.hidden_size, 1)
        self.sigmoid = t.nn.Sigmoid()

    # Forward pass
    def forward(self, x):
        hidden = self.fc1(x)
        relu = self.relu(hidden)
        output = self.fc2(relu)
        output = self.sigmoid(output)
        return output
Suppose L = Feedforward(2, 10) and f = Feedforward(3, 9). From my understanding, I can only learn either L or f, but not both simultaneously. Is it possible to learn L and f simultaneously using Y, T, and X?
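For concreteness, here is a minimal runnable sketch of the kind of joint training loop I have in mind (the data is synthetic and the shapes are hypothetical: T with 2 features, X with 3, matching the Feedforward sizes above; a single optimizer is given the parameters of both networks so that gradients from the product flow into each):

```python
import torch as t

# The Feedforward class from above, reproduced so this sketch runs standalone.
class Feedforward(t.nn.Module):
    def __init__(self, input_size, hidden_size):
        super(Feedforward, self).__init__()
        self.fc1 = t.nn.Linear(input_size, hidden_size)
        self.relu = t.nn.ReLU()
        self.fc2 = t.nn.Linear(hidden_size, 1)
        self.sigmoid = t.nn.Sigmoid()

    def forward(self, x):
        return self.sigmoid(self.fc2(self.relu(self.fc1(x))))

t.manual_seed(0)

# Synthetic data with hypothetical shapes: T has 2 features, X has 3.
n = 256
T = t.rand(n, 2)
X = t.rand(n, 3)
# Toy target of the multiplicative form Y = L(T) * f(X).
Y = t.sigmoid(T.sum(dim=1, keepdim=True)) * t.sigmoid(X.sum(dim=1, keepdim=True))

L_net = Feedforward(2, 10)  # plays the role of L
f_net = Feedforward(3, 9)   # plays the role of f

# One optimizer over the parameters of BOTH networks: backpropagating
# through the product y_hat = L_net(T) * f_net(X) updates L and f jointly.
optimizer = t.optim.Adam(
    list(L_net.parameters()) + list(f_net.parameters()), lr=0.01
)
loss_fn = t.nn.MSELoss()

first_loss = None
for epoch in range(500):
    optimizer.zero_grad()
    y_hat = L_net(T) * f_net(X)  # elementwise product of the two scalar outputs
    loss = loss_fn(y_hat, Y)
    loss.backward()              # gradients flow into both networks at once
    optimizer.step()
    if first_loss is None:
        first_loss = loss.item()

print(first_loss, loss.item())
```

Whether this identifies L and f uniquely is a separate question (any constant can be moved between the two factors), but mechanically nothing stops autograd from training the two networks together through the product.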