machine_learning

I use this repository to collect all of my crazy ideas for machine learning (neural networks).

Dependencies

All programs will be written with the following dependencies (all available with pip):

Knowledge Distillation

One of the first things I am exploring is knowledge distillation (KD). The basic idea is to take a very accurate but large and computationally expensive model (the teacher) and transfer the knowledge it has learned to a much smaller model (the student) that is easier to deploy on small devices such as cell phones, or even smaller ones like hearing aids. Many papers have been written on the subject, but the heart of the matter is explained and investigated very well in Distilling the Knowledge in a Neural Network.
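
As a concrete illustration, here is a minimal sketch of the soft-target loss from that paper. PyTorch is assumed as the framework, and the function name, temperature `T`, and mixing weight `alpha` are illustrative choices rather than anything fixed by this repository:

```python
# Minimal sketch of the distillation loss from "Distilling the Knowledge
# in a Neural Network" (Hinton et al.). PyTorch is assumed; T and alpha
# are hypothetical hyperparameters you would tune per task.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend cross-entropy on hard labels with a KL term on softened logits."""
    # Soften both distributions with temperature T so that small logit
    # differences (the teacher's "dark knowledge") carry more weight.
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # The T^2 factor rescales the gradient magnitude, as recommended
    # in the paper.
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

The temperature spreads out the teacher's softmax so that the near-zero probabilities it assigns to plausible wrong classes still contribute to the student's gradient.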

Extracting Knowledge

In most current approaches (like the one in Improved Knowledge Distillation via Teacher Assistant: Bridging the Gap Between Student and Teacher), the objective is to take the softmax output (probability-like answers) of the expensive model and use it to help train the smaller model. It is believed, and backed up experimentally in Relational Knowledge Distillation and elsewhere, that training a student to learn the relationships between data examples produces some of the best models.
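
To make the relational idea concrete, here is a rough sketch of the distance-wise loss from Relational Knowledge Distillation, again assuming PyTorch; the helper names and the use of a plain batch of embeddings are illustrative assumptions:

```python
# Rough sketch of the distance-wise loss from "Relational Knowledge
# Distillation" (Park et al.). Instead of matching outputs directly,
# the student matches the teacher's pairwise distances between the
# examples in a batch.
import torch
import torch.nn.functional as F

def pairwise_distances(embeddings):
    # Euclidean distance between every pair of embeddings in the batch,
    # normalized by the mean nonzero distance so the overall scale of
    # the embedding space does not matter.
    d = torch.cdist(embeddings, embeddings, p=2)
    return d / d[d > 0].mean()

def rkd_distance_loss(student_emb, teacher_emb):
    # The teacher's relations are fixed targets, so no gradient is needed.
    with torch.no_grad():
        t = pairwise_distances(teacher_emb)
    s = pairwise_distances(student_emb)
    # Smooth L1 (Huber) penalizes mismatched relations between examples.
    return F.smooth_l1_loss(s, t)
```

The point of the normalization is that only the relative geometry of the batch is matched, so the student's embedding space can live at a completely different scale from the teacher's.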
