
I am trying to apply a function to a 3D torch tensor, where the function itself operates on the 2D tensors obtained by slicing along axis 1 of the 3D tensor.

For example, I have a torch tensor of shape (51, 128, 20100) (a variable named autoencode_logprobs), and the function (rawids2sentence) takes an input of shape (51, 20100).

Right now the code runs with a naive for loop, iterating one slice at a time over range(128).

However, it's too slow. The following is the relevant part of the code.

autoencode_logprobs is the 3D tensor, and I need to apply the rawids2sentence function along its second axis. Any help vectorizing it?

for i in range(128):
    # Argmax over the vocabulary dimension, then decode the ids to a sentence.
    output_sent = self.dictionary.rawids2sentence(
        autoencode_logprobs[:, i].max(1)[1].data.cpu().numpy(),
        oov_dicts[i],
    )
    output_sent_encoding = ifst_model.encode([output_sent])
  • With only 128 iterations I'd focus on the speed of the task itself, not on eliminating the iterations. Commented May 7, 2019 at 19:36
  • There are millions of batches to process with the same task. Eliminating iteration is basic and fundamental in deep learning. Commented May 7, 2019 at 19:47

1 Answer


Since I do not know what the rawids2sentence or encode functions do, I can only help with the max operation.

In the following statement,

autoencode_logprobs[:, i].max(1)[1]

you identify the indices of the maximum values along dim=1 of the 51 x 20100 slice, so the output is a vector of size 51.

You can perform the same operation on your full tensor of shape 51 x 128 x 20100 and get the output as a 128 x 51 tensor.

autoencode_logprobs.transpose(0, 1).max(2)[1] # 128 x 51

So, if your rawids2sentence and encode methods can handle batched inputs, the change above should remove the loop entirely.
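To see that the two formulations agree, here is a minimal, runnable sketch that compares the looped per-slice argmax with the single batched call. It uses small stand-in dimensions (5, 4, 7) instead of the real (51, 128, 20100) so it runs instantly; the logic is identical.

```python
import torch

# Stand-in for autoencode_logprobs with shape (seq_len, batch, vocab).
autoencode_logprobs = torch.randn(5, 4, 7)

# Loop version: argmax over the vocab dimension, one batch column at a time.
loop_ids = torch.stack(
    [autoencode_logprobs[:, i].max(1)[1] for i in range(4)]
)  # shape: (4, 5) after stacking the (5,) index vectors

# Vectorized version: one argmax over the whole tensor.
batch_ids = autoencode_logprobs.transpose(0, 1).max(2)[1]  # shape: (4, 5)

# Both produce the same indices.
assert torch.equal(loop_ids, batch_ids)
print(batch_ids.shape)  # torch.Size([4, 5])
```

The transpose is cheap (it only swaps strides, no copy), and a single max call lets the backend parallelize over the whole batch instead of launching 128 small kernels.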
