This repository contains a custom Keras layer implementing the Neural Arithmetic Logic Unit (NALU) described in arXiv:1808.00508.
See basic_experiments.ipynb for basic examples of how to use the code, as well as a small collection of experiments (being expanded).
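As a rough guide to what the layer computes, here is a minimal NumPy sketch of the NALU forward pass from the paper: a gated mix of an additive path (a plain linear transform with constrained weights) and a multiplicative path (the same transform applied in log-space). This is an illustrative sketch, not the repository's Keras layer; the function and variable names are my own.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nalu_forward(x, W_hat, M_hat, G, eps=1e-7):
    """Sketch of a NALU cell's forward pass (arXiv:1808.00508).

    x:            input vector, shape (in_dim,)
    W_hat, M_hat: unconstrained weight matrices, shape (out_dim, in_dim)
    G:            gate matrix, shape (out_dim, in_dim)
    """
    # Constrained weights, biased towards {-1, 0, 1}
    W = np.tanh(W_hat) * sigmoid(M_hat)
    # Additive path: plain linear transform (the NAC sub-unit)
    a = W @ x
    # Multiplicative path: the same linear transform in log-space
    m = np.exp(W @ np.log(np.abs(x) + eps))
    # Learned gate interpolates between the two paths
    g = sigmoid(G @ x)
    return g * a + (1.0 - g) * m
```

With large positive entries in W_hat and M_hat the constrained weights saturate near 1, so the additive path sums its inputs and the multiplicative path multiplies them; the gate G then selects between the two operations.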
To do, in no particular order:
- Working version of the code.
- Expand this Readme to include more info about the idea behind NALU.
- Replicate the experiments found in the paper -- currently done: identity and addition.
- Investigate the extrapolation error of NALU.
- Add callback so we can use tensorboard.
- Following this, investigate and implement any tweaks to NALU.
Some interesting resources about NALU:
- arXiv:1808.00508 ;)
- This discussion on Reddit, which includes one of the authors.
- This blog is a good introduction to NALU.
Some implementations:
- faizan2786/nalu_implementation: Tensorflow.
- bharathgs/NALU: PyTorch.
- kevinzakka/NALU-pytorch: PyTorch.
- kgrm/NALU: Keras.
- titu1994/keras-neural-alu: Keras.
- This blog post presents an x86 assembly implementation of NALU.
- AtriSaxena/Neural_Arithmetic_Logic_Units-Keras: Keras.