Masked Convolution


A PyTorch implementation of a thin wrapper for masked convolutions.

What are masked convolutions?

Similarly to partial convolutions, masked convolutions mask part of the kernel, essentially ignoring data at specific locations. For example, consider

a = [1, 2, 3, 4, 5]

assuming we have a convolution kernel

kernel = [1, 1, 1]

convolving over a would give us

a_conv = [6, 9, 12]

However, if we were to mask the convolution kernel with a mask

mask = [1, 0, 1]

then the masked convolution over a would return

a_masked_conv = [4, 6, 8]

One use of masked convolutions is emulating skip-grams.
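To make the example concrete, here is a minimal sketch of the same computation in plain PyTorch. This illustrates only the underlying idea (the mask is multiplied into the kernel, so masked positions contribute nothing); it is not this package's API.

```python
import torch
import torch.nn.functional as F

# Shapes follow PyTorch's conv1d convention: (batch, channels, length).
a = torch.tensor([1., 2., 3., 4., 5.]).reshape(1, 1, -1)
kernel = torch.tensor([1., 1., 1.]).reshape(1, 1, -1)
mask = torch.tensor([1., 0., 1.]).reshape(1, 1, -1)

a_conv = F.conv1d(a, kernel)                # tensor([[[ 6.,  9., 12.]]])
a_masked_conv = F.conv1d(a, kernel * mask)  # tensor([[[ 4.,  6.,  8.]]])
```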

Installation

First, make sure you have PyTorch installed. The package was tested with Python 3.8 and PyTorch 1.7.1; other setups have not been verified, but chances are they work as well. The recommended way to install the package is from PyPI:

pip install masked-convolution

Alternatively, clone this repository and, from its root directory (where setup.py is located), run

pip install .

Benchmarks

On every build, automatic benchmarks are run to determine how much overhead the implementation adds. Ordinary convolutions serve as the baseline, and the performance of each masked convolution is reported as a percentage of its baseline's throughput.

Keep in mind that these benchmarks are not rigorous; they only serve to give users a general idea. Their results vary considerably between runs, so they should be taken with a grain of salt.

  • Masked Convolution 1D: 79.90 % Convolution 1D throughput
  • Masked Convolution 2D: 86.83 % Convolution 2D throughput
  • Masked Convolution 3D: 96.78 % Convolution 3D throughput
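For illustration, the sketch below shows how such a comparison can be made with plain PyTorch, using a hand-masked kernel as a stand-in for a masked convolution. This is only an assumption about the general approach, not the project's actual CI benchmark.

```python
import time
import torch
import torch.nn.functional as F

x = torch.randn(16, 8, 64, 64)              # dummy input batch
weight = torch.randn(8, 8, 3, 3)            # ordinary conv2d kernel
mask = (torch.rand(3, 3) > 0.5).float()     # arbitrary kernel mask for illustration

def throughput(fn, iters=200):
    fn()                                     # warm-up
    start = time.perf_counter()
    for _ in range(iters):
        fn()
    return iters / (time.perf_counter() - start)

baseline = throughput(lambda: F.conv2d(x, weight, padding=1))
masked = throughput(lambda: F.conv2d(x, weight * mask, padding=1))
print(f"Masked Convolution 2D: {100 * masked / baseline:.2f} % Convolution 2D throughput")
```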
