dit

dit is a Python package for information theory.


Documentation:

http://docs.dit.io

Downloads:

Coming soon.

Dependencies:
  • Python 2.7, 3.2, 3.3, or 3.4
  • numpy
  • iterutils
  • six
  • contextlib2
  • prettytable
  • networkx
Optional Dependencies:
  • cython
Install:

Until dit is available on PyPI, the easiest way to install is:

pip install git+https://github.com/dit/dit/#egg=dit

Alternatively, you can clone this repository, move into the newly created dit directory, and then install the package. Be sure to include the period (.) in the install command:

git clone https://github.com/dit/dit.git
cd dit
pip install .
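
To verify the install, import the package from a fresh shell; printing dit.__version__ (assuming the attribute is defined, as it is in most packages) shows which version you got:

python -c "import dit; print(dit.__version__)"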
Mailing list:

None

Code and bug tracker:

https://github.com/dit/dit

License:

BSD 2-Clause, see LICENSE.txt for details.

Quickstart

Basic usage of dit consists of creating distributions, modifying them if need be, and then computing properties of those distributions. First, we import:

>>> import dit

Suppose we have a really thick coin, one so thick that there is a reasonable chance of it landing on its edge. Here is how we might represent the coin in dit.

>>> d = dit.Distribution(['H', 'T', 'E'], [.4, .4, .2])
>>> print(d)
Class:          Distribution
Alphabet:       ('E', 'H', 'T') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 1
RV Names:       None

x   p(x)
E   0.2
H   0.4
T   0.4

Calculate the probability of H and also of the combination H or T.

>>> d['H']
0.4
>>> d.event_probability(['H','T'])
0.8
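
An event's probability is just the sum of its outcomes' probabilities, so the value above can be reproduced directly from the individual outcomes:

>>> d['H'] + d['T']   # 0.8, the same value as event_probability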

Calculate the Shannon entropy and the extropy of the distribution.

>>> dit.shannon.entropy(d)
1.5219280948873621
>>> dit.other.extropy(d)
1.1419011889093373
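
Both values are easy to reproduce by hand from the definitions H = -Σ p log2(p) and, for the extropy, J = -Σ (1-p) log2(1-p); a minimal sketch with numpy (already a dit dependency):

>>> import numpy as np
>>> p = np.array([0.4, 0.4, 0.2])
>>> h = -np.sum(p * np.log2(p))            # ≈ 1.5219, matches dit.shannon.entropy(d)
>>> j = -np.sum((1 - p) * np.log2(1 - p))  # ≈ 1.1419, matches dit.other.extropy(d)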

Create a distribution where Z = xor(X, Y).

>>> import dit.example_dists
>>> d = dit.example_dists.Xor()
>>> d.set_rv_names(['X', 'Y', 'Z'])
>>> print(d)
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       ('X', 'Y', 'Z')

x     p(x)
000   0.25
011   0.25
101   0.25
110   0.25
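
This example distribution is nothing special: it can be rebuilt by hand with the same Distribution constructor used for the coin above, listing the four outcomes consistent with Z = xor(X, Y):

>>> outcomes = ['000', '011', '101', '110']   # all (x, y, xor(x, y)) triples
>>> d_manual = dit.Distribution(outcomes, [0.25] * 4)
>>> d_manual.set_rv_names(['X', 'Y', 'Z'])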

Calculate the Shannon mutual informations I[X:Z], I[Y:Z], and I[X,Y:Z].

>>> dit.shannon.mutual_information(d, ['X'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['Y'], ['Z'])
0.0
>>> dit.shannon.mutual_information(d, ['X', 'Y'], ['Z'])
1.0
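
The last value can be cross-checked with the entropy identity I[X,Y:Z] = H[X,Y] + H[Z] - H[X,Y,Z], computing each entropy from the appropriate marginal:

>>> h_xy = dit.shannon.entropy(d.marginal(['X', 'Y']))
>>> h_z = dit.shannon.entropy(d.marginal(['Z']))
>>> h_xyz = dit.shannon.entropy(d)
>>> h_xy + h_z - h_xyz   # 2 + 1 - 2 = 1.0, matching I[X,Y:Z] above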

Calculate the marginal distribution P(X,Z). Then print its probabilities as fractions, showing the mask.

>>> d2 = d.marginal(['X', 'Z'])
>>> print(d2.to_string(show_mask=True, exact=True))
Class:          Distribution
Alphabet:       ('0', '1') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 2 (mask: 3)
RV Names:       ('X', 'Z')

x     p(x)
0*0   1/4
0*1   1/4
1*0   1/4
1*1   1/4

Convert the distribution probabilities to log (base 3.5) probabilities, and access its probability mass function.

>>> d2.set_base(3.5)
>>> d2.pmf
array([-1.10658951, -1.10658951, -1.10658951, -1.10658951])
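
Each entry is log base 3.5 of 1/4, which the standard library confirms:

>>> import math
>>> math.log(0.25, 3.5)   # ≈ -1.10659, matching each pmf entry above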

Draw 5 random samples from this distribution.

>>> dit.math.prng.seed(1)
>>> d2.rand(5)
['01', '10', '00', '01', '00']

Enjoy!
