Automatic Data-Regularized Actor-Critic (Auto-DrAC)


Factored World Model for Generalization

The original Auto-DrAC code and instructions follow below.

Experiments to Run

  • Vanilla PPO
  • Encoder, MLP Transition + Reward Models
  • Encoder, Conv Transition + Reward Models
  • Encoder, Conv Transition + Factored Reward Models
  • Graph Neural Networks via this paper and this code
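The factored variants above predict reward as a sum of per-factor terms rather than with a single monolithic head. A minimal sketch of that idea (hypothetical structure for illustration only, not the repository's actual model code, which uses PyTorch modules): the latent state is split into equal-sized factor slices, each slice gets its own reward head, and the predicted reward is their sum.

```python
# Illustrative factored reward model: split the latent into K factor
# slices, score each with its own head, and sum the per-factor rewards.
# (Linear heads stand in for the small MLPs a real model would use.)

def split_factors(latent, num_factors):
    """Split a flat latent vector into equal-sized factor slices."""
    size = len(latent) // num_factors
    return [latent[i * size:(i + 1) * size] for i in range(num_factors)]

def factor_reward(factor, weights):
    """Linear reward head for one factor (stand-in for a small MLP)."""
    return sum(x * w for x, w in zip(factor, weights))

def factored_reward(latent, weight_sets):
    """Total predicted reward is the sum over per-factor predictions."""
    factors = split_factors(latent, len(weight_sets))
    return sum(factor_reward(f, w) for f, w in zip(factors, weight_sets))

latent = [1.0, 2.0, 3.0, 4.0]          # toy 4-dim latent, 2 factors of size 2
weights = [[0.5, 0.5], [1.0, -1.0]]    # one linear head per factor
print(factored_reward(latent, weights))  # 0.5*1 + 0.5*2 + 1*3 - 1*4 = 0.5
```

The appeal of the factored form is that each head only sees its own slice of the state, so a change in one factor cannot silently alter another factor's reward contribution.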

Auto-DrAC: Automatic Data-Regularized Actor-Critic

This is a PyTorch implementation of the methods proposed in Automatic Data Augmentation for Generalization in Deep Reinforcement Learning by Roberta Raileanu, Max Goldstein, Denis Yarats, Ilya Kostrikov, and Rob Fergus.

Requirements

The code was run on a GPU with CUDA 10.2. To install all the required dependencies:

conda create -n auto-drac python=3.7
conda activate auto-drac

git clone git@github.com:rraileanu/auto-drac.git
cd auto-drac
pip install -r requirements.txt

git clone https://github.com/openai/baselines.git
cd baselines 
python setup.py install 

pip install procgen

Instructions

cd auto-drac

Train DrAC with crop augmentation on BigFish

python train.py --env_name bigfish --aug_type crop
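The crop augmentation pads the observation and then crops back to the original size at a random offset, slightly shifting the image content. A simplified, pure-Python sketch of that operation (the repository's actual implementation works on batched image tensors, using kornia for some augmentations):

```python
import random

# Pad-and-random-crop on a single 2D "image" (nested lists), purely
# for illustration of what --aug_type crop does to each observation.

def pad_crop(img, pad=2, rng=random):
    """Zero-pad a 2D image by `pad` pixels, then crop back to the
    original size at a random offset."""
    h, w = len(img), len(img[0])
    padded = [[0.0] * (w + 2 * pad) for _ in range(h + 2 * pad)]
    for r in range(h):
        for c in range(w):
            padded[r + pad][c + pad] = img[r][c]
    top = rng.randrange(2 * pad + 1)
    left = rng.randrange(2 * pad + 1)
    return [row[left:left + w] for row in padded[top:top + h]]

img = [[1.0, 2.0], [3.0, 4.0]]
aug = pad_crop(img, pad=1)
print(len(aug), len(aug[0]))  # 2 2  (size preserved, content shifted)
```

Because the output size matches the input size, the augmented observations can be fed through the same policy network as the originals.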

Train UCB-DrAC on BigFish

python train.py --env_name bigfish --use_ucb
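UCB-DrAC treats each candidate augmentation as an arm of a multi-armed bandit and picks the one maximizing recent return plus an upper-confidence exploration bonus. A hedged, simplified sketch of that selection rule (the constants and bookkeeping here are illustrative, not the repository's exact ones):

```python
import math

# UCB arm selection over augmentations: mean return per augmentation
# plus an exploration bonus that shrinks as an arm is tried more often.

def ucb_select(counts, mean_returns, total, c=0.1):
    """Return the arm maximizing mean + c * sqrt(log(total) / count).
    Any arm that has never been tried (count 0) is selected first."""
    best, best_score = 0, float("-inf")
    for arm, (n, mu) in enumerate(zip(counts, mean_returns)):
        if n == 0:
            return arm
        score = mu + c * math.sqrt(math.log(total) / n)
        if score > best_score:
            best, best_score = arm, score
    return best

# With equal trial counts, the augmentation with the higher mean return wins.
print(ucb_select(counts=[5, 5], mean_returns=[1.0, 2.0], total=10))  # 1
```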

Train RL2-DrAC on BigFish

python train.py --env_name bigfish --use_rl2


Train Meta-DrAC on BigFish

python train.py --env_name bigfish --use_meta

Procgen Results

UCB-DrAC achieves state-of-the-art performance on the Procgen benchmark (easy mode), significantly improving the agent's generalization ability over standard RL methods such as PPO.
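Underlying all of these variants, DrAC augments the PPO objective with two regularizers: a KL term keeping the policy consistent between an observation and its augmented copy, and a squared-error term doing the same for the value function. A scalar, pure-Python sketch of those two terms (the actual implementation computes them over batched torch tensors; the function names here are illustrative):

```python
import math

# The two DrAC regularizers, in toy scalar form:
#   G_pi: KL divergence between the policy's action distribution on the
#         original observation and on its augmented copy.
#   G_V:  squared difference between the two value estimates.
# Their sum is scaled and subtracted from the PPO objective.

def kl_divergence(p, q):
    """KL(p || q) for discrete action distributions given as lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def drac_regularizer(pi_obs, pi_aug, v_obs, v_aug):
    """G_pi + G_V: zero when the agent is invariant to the augmentation."""
    g_pi = kl_divergence(pi_obs, pi_aug)
    g_v = (v_obs - v_aug) ** 2
    return g_pi + g_v

# Identical policy and value under augmentation -> no penalty.
print(drac_regularizer([0.5, 0.5], [0.5, 0.5], 1.0, 1.0))  # 0.0
```

Minimizing this penalty pushes the agent toward representations that ignore augmentation-induced nuisance variation, which is the mechanism behind the generalization gains reported above.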

Test Results on Procgen

Procgen Test

Train Results on Procgen

Procgen Train

Acknowledgements

This code is based on an open-source PyTorch implementation of PPO.

We also used kornia for some of the augmentations.
