
I want to do two things with a PyTorch convolution which aren't mentioned in the documentation or code:

  1. I want to create a convolution with a fixed kernel like this:

    000010000
    000010000
    100010001
    000010000
    000010000
    

    The horizontal aspect is like dilation, I guess, but the vertical part is different. I see that dilation is available as a parameter in the code, but it has to be a scalar or single-element tuple (not one element per dimension), so I don't think it can do what I want here.

  2. I would like my convolutions to "wrap around" like a toroid, rather than use padding.

    EDIT TO ADD: I see that there is an open issue for this, which also provides a suboptimal workaround. So I guess there's no "right" way to do it yet.


1 Answer

  1. Unlike torch.nn.Conv2d() (which instantiates its own trainable kernel), torch.nn.functional.conv2d() takes both your input matrix and your kernel as arguments, so you can pass it whatever fixed custom kernel you want, for example as sketched below.
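
A rough sketch, building the 5x9 kernel from the question and applying it with torch.nn.functional.conv2d() (the random input and the single-channel, batch-of-one shapes are just assumptions for illustration):

import torch
import torch.nn.functional as F

# Build the fixed 5x9 kernel from the question: a vertical bar in the
# middle column plus two horizontal "wings" on the middle row.
kernel = torch.zeros(5, 9)
kernel[:, 4] = 1.0
kernel[2, 0] = 1.0
kernel[2, 8] = 1.0
kernel = kernel.view(1, 1, 5, 9)   # (out_channels, in_channels, kH, kW)

x = torch.rand(1, 1, 20, 20)       # (batch, channels, H, W) -- illustrative input
out = F.conv2d(x, kernel)          # no bias, no padding; the kernel is not trainable
print(out.shape)                   # torch.Size([1, 1, 16, 12])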

  2. As suggested by @zou3519 in a GitHub issue (linked from the issue you mentioned), you could implement 2D circular padding yourself by "repeating the tensor in a nxn grid, then cropping out the part you need":

import torch

def circular_pad_2d(x, pad=(1, 1)):
    # Workaround suggested by @zou3519 (https://github.com/zou3519):
    # tile the tensor in a 3x3 grid, then crop out the padded region.
    # (Assumes pad[i] <= x.shape[i] in each dimension.)
    return x.repeat(3, 3)[
        (x.shape[0] - pad[0]):(2 * x.shape[0] + pad[0]),
        (x.shape[1] - pad[1]):(2 * x.shape[1] + pad[1])
    ]

# Example:
x = torch.tensor([[1, 2, 3],
                  [4, 5, 6]])
y = circular_pad_2d(x, pad=(2, 3))
print(y)
#     1     2     3     1     2     3     1     2     3
#     4     5     6     4     5     6     4     5     6
#     1     2     3     1     2     3     1     2     3
#     4     5     6     4     5     6     4     5     6
#     1     2     3     1     2     3     1     2     3
#     4     5     6     4     5     6     4     5     6
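
Putting the two pieces together (still only a sketch, reusing the 5x9 kernel and the single-channel shapes assumed above): circular padding followed by an unpadded convolution behaves like a convolution on a torus:

img = torch.rand(20, 20)
padded = circular_pad_2d(img, pad=(2, 4))   # half the kernel size: (5//2, 9//2)
padded = padded.view(1, 1, *padded.shape)   # back to (batch, channels, H, W)
out = F.conv2d(padded, kernel)              # padding=0: the wrap-around comes from the pad
print(out.shape)                            # torch.Size([1, 1, 20, 20])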

  2. (previous answer) Also in the torch.nn.functional module, torch.nn.functional.pad() can take mode='reflect', which I believed was what you wanted. You could use it to manually pad your input matrix before performing the convolution. (Note: there is also the torch.nn.ReflectionPad2d layer, specifically tailored for fixed 2D padding by reflection.)
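
For completeness, a small sketch of that reflection-based padding (as the comments below point out, reflection is not the toroidal behaviour asked for):

import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.arange(6.).view(1, 1, 2, 3)          # (batch, channels, H, W)
a = F.pad(x, (1, 1, 1, 1), mode='reflect')     # functional form: (left, right, top, bottom)
b = nn.ReflectionPad2d(1)(x)                   # equivalent layer form
print(torch.equal(a, b))                       # True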
  • Thanks! 1. I didn't realise this distinction between trainable and not. That helps a lot. In torch.nn.functional.conv2d(), if I understand right, `filters` is the custom kernel? 2. Toroidal is different from reflecting, see eg https://github.com/pytorch/pytorch/issues/6124 - it seems to be an open request, not currently possible except with a workaround. – jmmcd Jun 01 '18 at 10:32
    1. Yes, pass your kernel as the filter (the `weight` parameter). 2. Ah indeed, my bad. What about the workaround mentioned in this issue, i.e. implementing 2D circular padding yourself, to apply it to your matrix before convolution? (I updated my answer accordingly) – benjaminplanche Jun 01 '18 at 11:10
  • Yes, I had linked to it in my edit -- it does seem to be regarded as a workaround. It'll do for now. – jmmcd Jun 02 '18 at 18:01
    It looks like [`torch.nn.functional.pad`](https://pytorch.org/docs/stable/nn.functional.html#pad) now support `circular` padding. – Vaelus Dec 13 '19 at 17:12
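
For reference, a minimal sketch of that circular mode, assuming a recent enough PyTorch (the padding tuple applies to the last two dimensions):

import torch
import torch.nn.functional as F

x = torch.arange(6.).view(1, 1, 2, 3)            # (batch, channels, H, W)
y = F.pad(x, (1, 1, 1, 1), mode='circular')      # (left, right, top, bottom), wraps around
print(y.shape)                                   # torch.Size([1, 1, 4, 5])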