
Commit db4690d

Merge pull request lisa-lab#1 from yosinski/master

Some tutorial tweaks.

2 parents d879697 + 887d84e

3 files changed: 6 additions & 6 deletions

doc/lenet.txt

Lines changed: 1 addition & 1 deletion

@@ -216,7 +216,7 @@ one of Figure 1. The input consists of 3 features maps (an RGB color image) of s
     # but also to insert new ones along which the tensor will be
     # broadcastable;
     # dimshuffle('x', 2, 'x', 0, 1)
-    # This will work on 3d tensors whith no broadcastable
+    # This will work on 3d tensors with no broadcastable
     # dimensions. The first dimension will be broadcastable,
     # then we will have the third dimension of the input tensor as
     # the second of the resulting tensor, etc. If the tensor has
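The comment being corrected above describes the `dimshuffle('x', 2, 'x', 0, 1)` pattern. As a quick sanity check of what that pattern does, here is a numpy analogue (a sketch, not the tutorial's Theano code; the function name is made up for illustration): output axes 0 and 2 are new broadcastable length-1 axes, and the remaining output axes are the input's axes 2, 0, 1 in that order.

```python
import numpy as np

# Sketch of Theano's dimshuffle('x', 2, 'x', 0, 1) in plain numpy.
# 'x' inserts a new length-1 (broadcastable) axis; the integers pick
# existing input axes in the given order.
def dimshuffle_pattern(t):
    # reorder existing axes to (2, 0, 1), then insert the two new axes
    return t.transpose(2, 0, 1)[np.newaxis, :, np.newaxis, :, :]

t = np.zeros((4, 5, 6))    # a 3d tensor with no broadcastable dimensions
out = dimshuffle_pattern(t)
print(out.shape)           # (1, 6, 1, 4, 5)
```

The shape confirms the comment's description: the first dimension is broadcastable, the input's third dimension becomes the output's second, and so on.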

doc/mlp.txt

Lines changed: 1 addition & 1 deletion

@@ -175,7 +175,7 @@ both upward (activations flowing from inputs to outputs) and backward
     self.b = theano.shared(value= b_values, name ='b')


-Note that we used a given non linear function as the activation function of the hidden layer. By default this is ``tanh``, but in many cases we might want
+Note that we used a given non-linear function as the activation function of the hidden layer. By default this is ``tanh``, but in many cases we might want
to use something else.

.. code-block:: python
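The sentence being edited in this hunk makes the point that the hidden layer's non-linearity is a parameter with ``tanh`` as the default. A minimal numpy sketch of that idea (hypothetical names, not the tutorial's Theano-based `HiddenLayer` class):

```python
import numpy as np

# Sketch: the hidden layer's non-linearity is passed in as a parameter,
# defaulting to tanh, so callers can swap in something else.
def hidden_output(x, W, b, activation=np.tanh):
    return activation(x @ W + b)

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 3))
W = rng.standard_normal((3, 4))
b = np.zeros(4)

h_tanh = hidden_output(x, W, b)  # default tanh, values in (-1, 1)
h_relu = hidden_output(x, W, b, activation=lambda z: np.maximum(z, 0.0))
```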

doc/rbm.txt

Lines changed: 4 additions & 4 deletions

@@ -663,17 +663,17 @@ log-PL:
 where the expectation is taken over the uniform random choice of index :math:`i`,
 and :math:`N` is the number of visible units. In order to work with binary
 units, we further introduce the notation :math:`\tilde{x}_i` to refer to
-:math:`x` with bit-i being flipped (1->0, 0->1). The log-PL for an RBM with binary unit is
+:math:`x` with bit-i being flipped (1->0, 0->1). The log-PL for an RBM with binary units is
 then written as:

 .. math::
     \log PL(x) &\approx N \cdot \log
         \frac {e^{-FE(x)}} {e^{-FE(x)} + e^{-FE(\tilde{x}_i)}} \\
     &\approx N \cdot \log[ sigm (FE(\tilde{x}_i) - FE(x)) ]

-We therefore return this cost as well as the RBM updates in the `get_cost_updates` function of the `RBM` class.
+We therefore return this cost as well as the RBM updates in the ``get_cost_updates`` function of the ``RBM`` class.
 Notice that we modify the updates dictionary to increment the
-index of bit :math:`i`. This will result in bit i cycling over all possible
+index of bit :math:`i`. This will result in bit :math:`i` cycling over all possible
 values :math:`\{0,1,...,N\}`, from one update to another.

 Note that for CD training the cross-entropy cost between the input and the

@@ -721,7 +721,7 @@ himself with the function ``tile_raster_images`` (see :ref:`how-to-plot`). Since
 RBMs are generative models, we are interested in sampling from them and
 plotting/visualizing these samples. We also want to visualize the filters
 (weights) learnt by the RBM, to gain insights into what the RBM is actually
-doing. Bare in mind however, that this does not provide the entire story,
+doing. Bear in mind however, that this does not provide the entire story,
 since we neglect the biases and plot the weights up to a multiplicative
 constant (weights are converted to values between 0 and 1).
