LostTech.TensorFlow : API Documentation

Type Mixture

Namespace tensorflow.contrib.distributions

Parent Distribution

Interfaces IMixture

Mixture distribution.

The `Mixture` object implements batched mixture distributions. The mixture model is defined by a `Categorical` distribution (the mixture) and a python list of `Distribution` objects.
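Concretely, if the mixing `Categorical` has probabilities \\( c_1, \ldots, c_K \\) (with \\( \sum_i c_i = 1 \\)) and the component distributions have densities \\( q_1(x), \ldots, q_K(x) \\), the resulting mixture density is:

\\( q(x) = \sum_{i=1}^{K} c_i \, q_i(x) \\)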

Methods supported include `log_prob`, `prob`, `mean`, `sample`, and `entropy_lower_bound`.

#### Examples
    # Create a mixture of two Gaussians:
    import tensorflow as tf
    import tensorflow_probability as tfp
    tfd = tfp.distributions

    mix = 0.3
    bimix_gauss = tfd.Mixture(
        cat=tfd.Categorical(probs=[mix, 1. - mix]),
        components=[
            tfd.Normal(loc=-1., scale=0.1),
            tfd.Normal(loc=+1., scale=0.5),
        ])

    # Plot the PDF (assumes a TF1-style default session for .eval()).
    import matplotlib.pyplot as plt
    x = tf.linspace(-2., 3., int(1e4)).eval()
    plt.plot(x, bimix_gauss.prob(x).eval())
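Once constructed, the supported methods can be called directly on the mixture. The sketch below continues the example above and likewise assumes a TF1-style default session for `.eval()`:

    # Draw samples, evaluate the log-density, and bound the entropy.
    samples = bimix_gauss.sample(5, seed=42)    # five scalar draws
    log_p = bimix_gauss.log_prob(0.)            # log-density at x = 0
    h_lb = bimix_gauss.entropy_lower_bound()    # sum_i c_i H[q_i], see below
    print(samples.eval(), log_p.eval(), h_lb.eval())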


Public instance methods

Tensor entropy_lower_bound(string name)

A lower bound on the entropy of this mixture model.

The bound below is not always very tight, and its usefulness depends on the mixture probabilities and the components in use.

A lower bound is useful for the ELBO when the `Mixture` is the variational distribution:

\\( \log p(x) \geq \text{ELBO} = \int q(z) \log p(x, z) dz + H[q] \\)

where \\( p \\) is the prior distribution, \\( q \\) is the variational distribution, and \\( H[q] \\) is the entropy of \\( q \\). If there is a lower bound \\( G[q] \\) such that \\( H[q] \geq G[q] \\), then it can be used in place of \\( H[q] \\).

For a mixture of distributions \\( q(Z) = \sum_i c_i q_i(Z) \\) with \\( \sum_i c_i = 1 \\), by the concavity of \\( f(x) = -x \log x \\), a simple lower bound is:

\\( \begin{align} H[q] & = - \int q(z) \log q(z) dz \\\\ & = - \int \left( \sum_i c_i q_i(z) \right) \log \left( \sum_i c_i q_i(z) \right) dz \\\\ & \geq - \sum_i c_i \int q_i(z) \log q_i(z) dz \\\\ & = \sum_i c_i H[q_i] \end{align} \\)

This is the term we calculate below for \\( G[q] \\).
Parameters
string name
A name for this operation (optional).
Returns
Tensor
A lower bound on the Mixture's entropy.
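For the two-Gaussian example above, \\( G[q] \\) can be computed in closed form, since the entropy of a \\( \mathcal{N}(\mu, \sigma^2) \\) distribution is \\( \tfrac{1}{2}\log(2 \pi e \sigma^2) \\). A minimal NumPy sketch:

    import numpy as np

    # Closed-form entropy of a univariate Normal: 0.5 * log(2*pi*e*sigma^2).
    def normal_entropy(sigma):
        return 0.5 * np.log(2. * np.pi * np.e * sigma ** 2)

    c = np.array([0.3, 0.7])        # mixture weights from the example
    sigma = np.array([0.1, 0.5])    # component scales from the example

    # G[q] = sum_i c_i H[q_i], the quantity entropy_lower_bound computes.
    g = np.sum(c * normal_entropy(sigma))
    print(g)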

object entropy_lower_bound_dyn(ImplicitContainer<T> name)

A lower bound on the entropy of this mixture model.

The bound below is not always very tight, and its usefulness depends on the mixture probabilities and the components in use.

A lower bound is useful for the ELBO when the `Mixture` is the variational distribution:

\\( \log p(x) \geq \text{ELBO} = \int q(z) \log p(x, z) dz + H[q] \\)

where \\( p \\) is the prior distribution, \\( q \\) is the variational distribution, and \\( H[q] \\) is the entropy of \\( q \\). If there is a lower bound \\( G[q] \\) such that \\( H[q] \geq G[q] \\), then it can be used in place of \\( H[q] \\).

For a mixture of distributions \\( q(Z) = \sum_i c_i q_i(Z) \\) with \\( \sum_i c_i = 1 \\), by the concavity of \\( f(x) = -x \log x \\), a simple lower bound is:

\\( \begin{align} H[q] & = - \int q(z) \log q(z) dz \\\\ & = - \int \left( \sum_i c_i q_i(z) \right) \log \left( \sum_i c_i q_i(z) \right) dz \\\\ & \geq - \sum_i c_i \int q_i(z) \log q_i(z) dz \\\\ & = \sum_i c_i H[q_i] \end{align} \\)

This is the term we calculate below for \\( G[q] \\).
Parameters
ImplicitContainer<T> name
A name for this operation (optional).
Returns
object
A lower bound on the Mixture's entropy.

Public properties

object allow_nan_stats get;

object allow_nan_stats_dyn get;

TensorShape batch_shape get;

object batch_shape_dyn get;

Categorical cat get;

object cat_dyn get;

IList<object> components get;

object components_dyn get;

object dtype get;

object dtype_dyn get;

TensorShape event_shape get;

object event_shape_dyn get;

string name get;

object name_dyn get;

object num_components get;

object num_components_dyn get;

IDictionary<object, object> parameters get;

object parameters_dyn get;

object PythonObject get;

object reparameterization_type get;

object reparameterization_type_dyn get;

object validate_args get;

object validate_args_dyn get;
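In the Python example above, these getters correspond to same-named attributes on the distribution object. A minimal sketch of inspecting the mixture's structure:

    # Inspect the mixture built in the example above.
    print(bimix_gauss.num_components)   # 2
    print(bimix_gauss.cat.probs)        # the Categorical's mixture weights
    print(bimix_gauss.components)       # the list of Normal components
    print(bimix_gauss.event_shape)      # () -- scalar events
    print(bimix_gauss.batch_shape)      # () -- a single (unbatched) mixture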