
Welcome to Archai

Archai is a platform for Neural Architecture Search (NAS) that allows you to generate efficient deep networks for your applications. Archai aspires to accelerate NAS research by enabling easy mix-and-match between different techniques while ensuring reproducibility, self-documented hyper-parameters and fair comparison. To achieve this, Archai uses a common code base that unifies several algorithms. Archai is extensible and modular, allowing rapid experimentation with new research ideas and development of new NAS algorithms. Archai also hopes to make NAS research more accessible to non-experts by providing a powerful configuration system and easy-to-use tools.

Extensive feature list

Installation

Prerequisites

Archai requires Python 3.6+ and PyTorch 1.2+. To install Python, we highly recommend Anaconda. Archai works on both Linux and Windows.
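If you are unsure which versions you have, a quick check from the command line (assuming python and the torch package are already on your PATH) is:

python --version
python -c "import torch; print(torch.__version__)"  # prints the installed PyTorch version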

Install from source code

We recommend installing from the source code:

git clone https://github.com/microsoft/archai.git
cd archai
bash install.sh  # on Windows, use install.bat
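If you prefer to manage dependencies yourself, an editable pip install from the repository root (which contains setup.py) is a common alternative; this is only a sketch, and the install script above remains the documented path and may perform additional setup:

pip install -e .  # editable install using the dependencies declared in setup.py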

For more information, please see the Install guide.

Quick Start

Running Algorithms

To run a specific NAS algorithm, specify it with the --algos switch:

python scripts/main.py --algos darts --full
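As a rough sketch, dropping the --full switch is assumed here to give a smaller, faster run that is handy as a smoke test before launching a full search; this behavior is an assumption, so check the running algorithms page for the authoritative switch semantics:

python scripts/main.py --algos darts  # assumed quick run; add --full for the complete search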

For more information on available switches and algorithms, please see running algorithms.

Tutorials

The best way to familiarize yourself with Archai is to take a quick tour through our 30 Minute tutorial.

We also have a tutorial for the Petridish algorithm, which was developed at Microsoft Research and is now available through Archai.

Visual Studio Code

We highly recommend Visual Studio Code to take advantage of predefined run configurations and interactive debugging.

From the archai directory, launch Visual Studio Code. Select the Run button (Ctrl+Shift+D), choose the run configuration you want, and click the Play icon.
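If the code command-line launcher for Visual Studio Code is installed, you can open the repository from a terminal with:

code .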

Running experiments on Azure AML

To run NAS experiments at scale, you can use Archai on Azure.

Documentation

Docs and an API reference are available for browsing and searching.

Contribute

We would love community contributions, feedback, questions, algorithm implementations and feature requests! Please file a GitHub issue or send us a pull request. Please review the Microsoft Code of Conduct and learn more.

Contact

Join the Archai group on Facebook to stay up to date or ask any questions.

Team

Archai has been created and maintained by Shital Shah and Debadeepta Dey in the Reinforcement Learning Group at Microsoft Research AI, Redmond, USA. Archai has benefited immensely from discussions with John Langford, Rich Caruana, Eric Horvitz and Alekh Agarwal.

We look forward to Archai becoming more community driven and including major contributors here.

Credits

Archai builds on several open source codebases. These include: Fast AutoAugment, pt.darts, DARTS-PyTorch, DARTS, petridishnn, PyTorch CIFAR-10 Models, NVidia DeepLearning Examples, PyTorch Warmup Scheduler, NAS Evaluation is Frustratingly Hard, NASBench-PyTorch. Please see the install_requires section in setup.py for an up-to-date list of dependencies. If you feel credit for any material is missing, please let us know by filing a GitHub issue.

License

This project is released under the MIT License. Please review the License file for more details.
