alexander-rakhlin/kaggle_otto
Kaggle Otto Group Product Classification Challenge

My solution scored 0.42232 and finished the Otto competition in 218th place out of 3514 teams.
It is a 1:1 blend of an XGBoost model and the average of 20 neural nets. Model hyperparameters, the NN architecture, and the blend weights were chosen manually.
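The blend described above can be sketched as follows. This is a minimal illustration, not code from this repository: the function name, array shapes, and the assumption that each model outputs per-class probability rows are mine.

```python
import numpy as np

def blend(xgb_probs, nn_probs_list, w_xgb=0.5):
    """Blend XGBoost class probabilities 1:1 with the average of several NNs.

    xgb_probs:     (n_samples, n_classes) probability matrix (hypothetical)
    nn_probs_list: list of (n_samples, n_classes) matrices, one per net
    w_xgb:         manually chosen blend weight, 0.5 for a 1:1 blend
    """
    nn_avg = np.mean(nn_probs_list, axis=0)              # average the nets
    blended = w_xgb * xgb_probs + (1.0 - w_xgb) * nn_avg
    return blended / blended.sum(axis=1, keepdims=True)  # renormalize rows
```

Renormalizing keeps each row a valid probability distribution, which matters because the competition metric (multiclass log loss) is computed on normalized probabilities.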

Requires:

Other work

Other Kagglers' insights I found particularly interesting. For the most part they relate to blending. I list them here for further study:

  1. Triskelion. Competition 62nd. Blending
    forum link 1
    forum link 2
    Ensemble Selection from Libraries of Models
    In turn, he refers to another Kaggler, Emanuele Olivetti (forked code)

  2. Hoang Duong. Competition 6th. Blending
    forum link
    documentation

  3. Adam Harasimowicz. Competition 66th. Blending, Hyperopt
    forked code
    Blog post

  4. Mike Kim. Competition 8th. t-SNE features and meta bagging
    forum link
    code
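The "Ensemble Selection from Libraries of Models" technique referenced above (Caruana et al.) can be sketched as greedy forward selection with replacement. This sketch is my illustration of the general method, not code from any of the linked threads; the function names and the use of multiclass log loss as the selection metric are assumptions.

```python
import numpy as np

def log_loss(y_true, probs, eps=1e-15):
    """Multiclass log loss; y_true holds integer class labels."""
    p = np.clip(probs, eps, 1 - eps)
    p = p / p.sum(axis=1, keepdims=True)
    return -np.mean(np.log(p[np.arange(len(y_true)), y_true]))

def ensemble_selection(preds, y_true, n_rounds=10):
    """Greedily add (with replacement) the model that most improves
    the ensemble's log loss on a hillclimbing/validation set.

    preds: list of (n_samples, n_classes) probability matrices
    Returns the chosen model indices and the averaged ensemble prediction.
    """
    chosen = []
    current_sum = np.zeros_like(preds[0])
    for _ in range(n_rounds):
        # score each candidate as if added to the current ensemble
        scores = [log_loss(y_true, (current_sum + p) / (len(chosen) + 1))
                  for p in preds]
        best = int(np.argmin(scores))
        chosen.append(best)
        current_sum += preds[best]
    return chosen, current_sum / len(chosen)
```

Selection with replacement lets a strong model enter the ensemble multiple times, which implicitly weights it more heavily than weaker models.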
