# LostTech.TensorFlow : API Documentation

Type: Mixture

Namespace: tensorflow.contrib.distributions

Parent: Distribution

Interfaces: IMixture

Mixture distribution.

The Mixture object implements batched mixture distributions. The mixture model is defined by a Categorical distribution (the mixture weights) and a Python list of Distribution objects (the components).

Methods supported include log_prob, prob, mean, sample, and entropy_lower_bound.

#### Examples
```python
# Create a mixture of two Gaussians:
import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions

tf.InteractiveSession()  # provides the default session the .eval() calls below need
mix = 0.3
bimix_gauss = tfd.Mixture(
    cat=tfd.Categorical(probs=[mix, 1. - mix]),
    components=[
        tfd.Normal(loc=-1., scale=0.1),
        tfd.Normal(loc=+1., scale=0.5),
    ])

# Plot the PDF.
x = tf.linspace(-2., 3., int(1e4)).eval()
plt.plot(x, bimix_gauss.prob(x).eval())
plt.show()
```
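
Beyond prob, the other methods listed above follow the usual Distribution interface. A minimal sketch continuing from the session and bimix_gauss set up in the example (output values are illustrative):

```python
# Continuing from the example above (requires the same default session).
samples = bimix_gauss.sample(5).eval()         # five draws from the mixture
log_p = bimix_gauss.log_prob([0., 1.]).eval()  # log-density at two points
mean = bimix_gauss.mean().eval()               # 0.3 * (-1.) + 0.7 * (+1.) = 0.4
```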

### Public instance methods

#### Tensor entropy_lower_bound(string name)

A lower bound on the entropy of this mixture model.

The bound below is not always very tight, and its usefulness depends on the mixture probabilities and the components in use.

A lower bound is useful for the ELBO when the Mixture is the variational distribution:

$$\log p(x) \geq \mathrm{ELBO} = \int q(z) \log p(x, z)\, dz + H[q]$$

where $p$ is the prior distribution, $q$ is the variational distribution, and $H[q]$ is the entropy of $q$. If there is a lower bound $G[q]$ such that $H[q] \geq G[q]$, then $G[q]$ can be used in place of $H[q]$.

For a mixture of distributions $q(Z) = \sum_i c_i q_i(Z)$ with $\sum_i c_i = 1$, the concavity of $f(x) = -x \log x$ gives a simple lower bound:

$$
\begin{aligned}
H[q] &= -\int q(z) \log q(z)\, dz \\
&= -\int \Big(\sum_i c_i q_i(z)\Big) \log\Big(\sum_i c_i q_i(z)\Big)\, dz \\
&\geq -\sum_i c_i \int q_i(z) \log q_i(z)\, dz \\
&= \sum_i c_i H[q_i]
\end{aligned}
$$

This is the term computed below for $G[q]$.
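
As a concrete check, $G[q]$ for the two-Gaussian mixture from the example above can be computed by hand, since a Gaussian has the closed-form entropy $H[\mathcal{N}(\mu, \sigma)] = \tfrac{1}{2}\log(2\pi e \sigma^2)$. A minimal sketch against the Python TensorFlow Probability API (TF1-style session, as in the example):

```python
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions

tf.InteractiveSession()  # so .eval() has a default session
mix = 0.3
q = tfd.Mixture(
    cat=tfd.Categorical(probs=[mix, 1. - mix]),
    components=[tfd.Normal(loc=-1., scale=0.1),
                tfd.Normal(loc=+1., scale=0.5)])

# G[q] = sum_i c_i H[q_i], using H[N(mu, sigma)] = 0.5 * log(2*pi*e*sigma^2)
by_hand = sum(c * 0.5 * np.log(2 * np.pi * np.e * s ** 2)
              for c, s in [(mix, 0.1), (1. - mix, 0.5)])

print(q.entropy_lower_bound().eval())  # ~ 0.243
print(by_hand)                         # ~ 0.243
```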
##### Parameters
- `string name`: A name for this operation (optional).

##### Returns
- `Tensor`: A lower bound on the Mixture's entropy.

#### object entropy_lower_bound_dyn(ImplicitContainer&lt;T&gt; name)

A lower bound on the entropy of this mixture model.

The bound below is not always very tight, and its usefulness depends on the mixture probabilities and the components in use.

A lower bound is useful for the ELBO when the Mixture is the variational distribution:

$$\log p(x) \geq \mathrm{ELBO} = \int q(z) \log p(x, z)\, dz + H[q]$$

where $p$ is the prior distribution, $q$ is the variational distribution, and $H[q]$ is the entropy of $q$. If there is a lower bound $G[q]$ such that $H[q] \geq G[q]$, then $G[q]$ can be used in place of $H[q]$.

For a mixture of distributions $q(Z) = \sum_i c_i q_i(Z)$ with $\sum_i c_i = 1$, the concavity of $f(x) = -x \log x$ gives a simple lower bound:

$$
\begin{aligned}
H[q] &= -\int q(z) \log q(z)\, dz \\
&= -\int \Big(\sum_i c_i q_i(z)\Big) \log\Big(\sum_i c_i q_i(z)\Big)\, dz \\
&\geq -\sum_i c_i \int q_i(z) \log q_i(z)\, dz \\
&= \sum_i c_i H[q_i]
\end{aligned}
$$

This is the term computed below for $G[q]$.
##### Parameters
- `ImplicitContainer<T> name`: A name for this operation (optional).

##### Returns
- `object`: A lower bound on the Mixture's entropy.