Example 1
File: ner1.py Project: framr/ml
    # Load the dev set (for tuning hyperparameters)
    docs = du.load_dataset('data/ner/dev')
    X_dev, y_dev = du.docs_to_windows(docs, word_to_num, tag_to_num, wsize=windowsize)

    # Load the test set (dummy labels only)
    docs = du.load_dataset('data/ner/test.masked')
    X_test, y_test = du.docs_to_windows(docs, word_to_num, tag_to_num, wsize=windowsize)


    # To avoid re-inventing the wheel, we provide a base class that handles a lot of the drudgery of
    # managing parameters and running gradient descent. It's based on the classifier API used by
    # [`scikit-learn`](http://scikit-learn.org/stable/), so if you're familiar with that library it should be easy to use.
    # We'll be using this class for the rest of this assignment, so it helps to get acquainted with a simple
    # example that should be familiar from Assignment 1.
    # To keep this notebook uncluttered, we've put the code in `softmax_example.py`; take a look at it there, then run the cell below.
    
    sr = SoftmaxRegression(wv=zeros((10,100)), dims=(100,5))

    # Automatic gradient checker!
    # this checks anything you add to self.grads or self.sgrads
    # using the method of Assignment 1
    sr.grad_check(x=5, y=4)


    # In order to implement a model, you need to subclass `NNBase`, then implement the following methods:
    # 
    # - `__init__()` (initialize parameters and hyperparameters)
    # - `_acc_grads()` (compute and accumulate gradients)
    # - `compute_loss()` (compute loss for a training example)
    # - `predict()`, `predict_proba()`, or other prediction method (for evaluation)
    # 
    # `NNBase` also provides a few other methods that will be helpful, such as the
    # automatic gradient checker `grad_check()` used above.
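    # For orientation, here is a minimal sketch of such a subclass. It is illustrative
    # only: the `nn.base` import path, the NNBase constructor signature, and the
    # parameter initialization are assumptions; `softmax_example.py` is the
    # authoritative version of this pattern.
    from numpy import exp, log, outer, argmax, random
    from nn.base import NNBase   # assumed import path from the starter code

    class TinySoftmax(NNBase):
        def __init__(self, wv, dims=(100, 5)):
            # dense params W, b; sparse (row-indexed) params L, initialized from wv
            NNBase.__init__(self,
                            dict(W=(dims[1], dims[0]), b=(dims[1],)),
                            dict(L=wv.shape))
            self.sparams.L = wv.copy()
            self.params.W = 0.01 * random.randn(*self.params.W.shape)

        def _softmax(self, z):
            e = exp(z - z.max())
            return e / e.sum()

        def _acc_grads(self, idx, label):
            x = self.sparams.L[idx]                        # input word vector
            p = self._softmax(self.params.W.dot(x) + self.params.b)
            p[label] -= 1.0                                # dJ/dz for cross-entropy
            self.grads.W += outer(p, x)
            self.grads.b += p
            self.sgrads.L[idx] = self.params.W.T.dot(p)    # sparse row gradient

        def compute_loss(self, idx, label):
            x = self.sparams.L[idx]
            p = self._softmax(self.params.W.dot(x) + self.params.b)
            return -log(p[label])

        def predict(self, idx):
            x = self.sparams.L[idx]
            return argmax(self.params.W.dot(x) + self.params.b)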
Example 2
# Load the dev set (for tuning hyperparameters)
docs = du.load_dataset('data/ner/dev')
X_dev, y_dev = du.docs_to_windows(docs,
                                  word_to_num,
                                  tag_to_num,
                                  wsize=windowsize)

# Load the test set (dummy labels only)
docs = du.load_dataset('data/ner/test.masked')
X_test, y_test = du.docs_to_windows(docs,
                                    word_to_num,
                                    tag_to_num,
                                    wsize=windowsize)

from numpy import zeros
from softmax_example import SoftmaxRegression
sr = SoftmaxRegression(wv=zeros((10, 100)), dims=(100, 5))

##
# Automatic gradient checker!
# this checks anything you add to self.grads or self.sgrads
# using the method of Assignment 1
sr.grad_check(x=5, y=4)

#from nerwindow import WindowMLP
from nerwindow_msushkov import WindowMLP
clf = WindowMLP(wv,
                windowsize=windowsize,
                dims=[None, 100, 5],
                reg=0.001,
                alpha=0.01)
clf.grad_check(X_train[0], y_train[0])  # gradient check on single point
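# Once the gradient check passes, a typical next step is to train on the window
# data and predict on the dev set. Hedged sketch only: `train_sgd` is the training
# method name from the course's NNBase starter code, and `predict` assumes you
# implemented it in WindowMLP; adjust if your class differs.
import random as pyrandom

ntrain = len(y_train)
idxiter = (pyrandom.randrange(ntrain) for _ in range(5 * ntrain))  # ~5 epochs of random SGD steps
clf.train_sgd(X_train, y_train, idxiter=idxiter)

y_pred = clf.predict(X_dev)   # predicted tag index for each dev window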
Example 3
print(wv.shape)
print(len(word_to_num))
print(type(word_to_num))
print(docs[0][:])
print(X_train[0])
print(y_train[0])


# To avoid re-inventing the wheel, we provide a base class that handles a lot of the drudgery of managing parameters and running gradient descent. It's based on the classifier API used by [`scikit-learn`](http://scikit-learn.org/stable/), so if you're familiar with that library it should be easy to use. 
# 
# We'll be using this class for the rest of this assignment, so it helps to get acquainted with a simple example that should be familiar from Assignment 1. To keep this notebook uncluttered, we've put the code in `softmax_example.py`; take a look at it there, then run the cell below.


from numpy import zeros
from softmax_example import SoftmaxRegression
sr = SoftmaxRegression(wv=zeros((10, 100)), dims=(100, 5))

##
# Automatic gradient checker!
# this checks anything you add to self.grads or self.sgrads
# using the method of Assignment 1

sr.grad_check(x=5, y=0)

# sr._acc_grads(idx=5,label=4)
# print sr.params.W.shape
# print sr.params.b.shape
# print sr.sparams.L.shape
# # print sr.sgrads.L[1].shape

# sr.compute_loss(idx=5,label=4)
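# The commented-out calls above can be run directly once `sr` is constructed; a
# short demonstration (argument and attribute names are taken from those comments,
# so double-check them against softmax_example.py):
sr._acc_grads(idx=5, label=4)            # accumulate gradients for one (word, tag) pair
print(sr.params.W.shape)                 # dense weight matrix
print(sr.params.b.shape)                 # dense bias vector
print(sr.sparams.L.shape)                # word-vector matrix, one row per word
print(sr.compute_loss(idx=5, label=4))   # cross-entropy loss for the same example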