Commit 750673c

Restored doc/convolutional_mlp.txt to doc/lenet.txt and doc/logistic_sgd.txt to doc/logreg.txt to respect existing bookmarks.
1 parent 3007d07 commit 750673c

9 files changed: +17 -17 lines changed

doc/DBN.txt (2 additions, 2 deletions)

@@ -4,7 +4,7 @@ Deep Belief Networks
 ====================
 
 .. note::
-    This section assumes the reader has already read through :doc:`logistic_sgd`
+    This section assumes the reader has already read through :doc:`logreg`
     and :doc:`mlp` and :doc:`rbm`. Additionally it uses the following Theano
     functions and concepts : `T.tanh`_, `shared variables`_, `basic arithmetic
     ops`_, `T.grad`_, `Random numbers`_, `floatX`_. If you intend to run the
@@ -164,7 +164,7 @@ hidden bias with its corresponding sigmoid layer.
 
 All that is left is to stack one last logistic regression layer in order to
 form an MLP. We will use the ``LogisticRegression`` class introduced in
-:ref:`logistic_sgd`.
+:ref:`logreg`.
 
 .. literalinclude:: ../code/DBN.py
     :start-after: # We now need to add a logistic layer on top of the MLP

doc/SdA.txt (2 additions, 2 deletions)

@@ -4,7 +4,7 @@ Stacked Denoising Autoencoders (SdA)
 ====================================
 
 .. note::
-    This section assumes the reader has already read through :doc:`logistic_sgd`
+    This section assumes the reader has already read through :doc:`logreg`
     and :doc:`mlp`. Additionally it uses the following Theano functions
     and concepts : `T.tanh`_, `shared variables`_, `basic arithmetic ops`_, `T.grad`_, `Random numbers`_, `floatX`_. If you intend to run the code on GPU also read `GPU`_.
 
@@ -95,7 +95,7 @@ bias of the encoding part with its corresponding sigmoid layer.
 
 All we need now is to add the logistic layer on top of the sigmoid
 layers such that we have an MLP. We will
-use the ``LogisticRegression`` class introduced in :ref:`logistic_sgd`.
+use the ``LogisticRegression`` class introduced in :ref:`logreg`.
 
 .. literalinclude:: ../code/SdA.py
     :start-after: self.dA_layers.append(dA_layer)

doc/contents.txt (2 additions, 2 deletions)

@@ -11,9 +11,9 @@ Contents
     LICENSE
     intro
     gettingstarted
-    logistic_sgd
+    logreg
     mlp
-    convolutional_mlp
+    lenet
     dA
     SdA
     rbm

doc/dA.txt (1 addition, 1 deletion)

@@ -4,7 +4,7 @@ Denoising Autoencoders (dA)
 ===========================
 
 .. note::
-    This section assumes the reader has already read through :doc:`logistic_sgd`
+    This section assumes the reader has already read through :doc:`logreg`
     and :doc:`mlp`. Additionally it uses the following Theano functions
     and concepts : `T.tanh`_, `shared variables`_, `basic arithmetic ops`_, `T.grad`_, `Random numbers`_, `floatX`_. If you intend to run the code on GPU also read `GPU`_.
 

doc/intro.txt (2 additions, 2 deletions)

@@ -29,9 +29,9 @@ read through our :ref:`gettingstarted` chapter -- it introduces the notation, an
 
 The purely supervised learning algorithms are meant to be read in order:
 
-#. :ref:`Logistic Regression <logistic_sgd>` - using Theano for something simple
+#. :ref:`Logistic Regression <logreg>` - using Theano for something simple
 #. :ref:`Multilayer perceptron <mlp>` - introduction to layers
-#. :ref:`Deep Convolutional Network <convolutional_mlp>` - a simplified version of LeNet5
+#. :ref:`Deep Convolutional Network <lenet>` - a simplified version of LeNet5
 
 The unsupervised and semi-supervised learning algorithms can be read in any
 order (the auto-encoders can be read independently of the RBM/DBN thread):

doc/convolutional_mlp.txt renamed to doc/lenet.txt (3 additions, 3 deletions)

@@ -1,10 +1,10 @@
-.. _convolutional_mlp:
+.. _lenet:
 
 Convolutional Neural Networks (LeNet)
 =====================================
 
 .. note::
-    This section assumes the reader has already read through :doc:`logistic_sgd` and
+    This section assumes the reader has already read through :doc:`logreg` and
     :doc:`mlp`. Additionally, it uses the following new Theano functions and concepts:
     `T.tanh`_, `shared variables`_, `basic arithmetic ops`_, `T.grad`_,
     `floatX`_, `downsample`_ , `conv2d`_, `dimshuffle`_. If you intend to run the
@@ -406,7 +406,7 @@ layer.
 Notice that when initializing the weight values, the fan-in is determined by
 the size of the receptive fields and the number of input feature maps.
 
-Finally, using the LogisticRegression class defined in :doc:`logistic_sgd` and
+Finally, using the LogisticRegression class defined in :doc:`logreg` and
 the HiddenLayer class defined in :doc:`mlp` , we can
 instantiate the network as follows.
 

doc/logistic_sgd.txt renamed to doc/logreg.txt (1 addition, 1 deletion)

@@ -1,6 +1,6 @@
 .. index:: Logistic Regression
 
-.. _logistic_sgd :
+.. _logreg :
 
 
 Classifying MNIST digits using Logistic Regression

doc/mlp.txt (3 additions, 3 deletions)

@@ -7,7 +7,7 @@ Multilayer Perceptron
 =====================
 
 .. note::
-    This section assumes the reader has already read through :doc:`logistic_sgd`.
+    This section assumes the reader has already read through :doc:`logreg`.
     Additionally, it uses the following new Theano functions and concepts:
     `T.tanh`_, `shared variables`_, `basic arithmetic ops`_, `T.grad`_,
     :ref:`L1_L2_regularization`, `floatX`_. If you intend to run the
@@ -80,7 +80,7 @@ extension to vectors and tensors consists in applying them element-wise
 
 The output vector is then obtained as: :math:`o(x) = G(b^{(2)} + W^{(2)} h(x))`.
 The reader should recognize the form we already used for
-:doc:`logistic_sgd`. As before,
+:doc:`logreg`. As before,
 class-membership probabilities can be obtained by choosing :math:`G` as the
 :math:`softmax` function (in the case of multi-class classification).
 
@@ -129,7 +129,7 @@ to use something else.
 If you look into theory this class implements the graph that computes
 the hidden layer value :math:`h(x) = \Phi(x) = s(b^{(1)} + W^{(1)} x)`.
 If you give this as input to the ``LogisticRegression`` class,
-implemented in the previous tutorial :doc:`logistic_sgd`, you get the output
+implemented in the previous tutorial :doc:`logreg`, you get the output
 of the MLP. You can see this in the following short implementation of
 the ``MLP`` class.
 
doc/rbm.txt (1 addition, 1 deletion)

@@ -5,7 +5,7 @@ Restricted Boltzmann Machines (RBM)
 
 
 .. note::
-    This section assumes the reader has already read through :doc:`logistic_sgd`
+    This section assumes the reader has already read through :doc:`logreg`
     and :doc:`mlp`. Additionally it uses the following Theano functions
     and concepts : `T.tanh`_, `shared variables`_, `basic arithmetic ops`_, `T.grad`_, `Random numbers`_, `floatX`_ and `scan`_. If you intend to run the code on GPU also read `GPU`_.
 
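Taken together, the commit is a pair of file renames plus a mechanical rewrite of every Sphinx cross-reference (`:doc:` / `:ref:`) that pointed at the old names. A hedged sketch of how such a change might be reproduced locally (assuming a Git checkout and GNU sed; these exact commands are not part of the commit):

```shell
# Hypothetical reproduction of this commit's mechanics, not taken from the
# commit itself. First rename the two source files so Git records the moves:
git mv doc/logistic_sgd.txt doc/logreg.txt
git mv doc/convolutional_mlp.txt doc/lenet.txt

# Then rewrite every cross-reference target that used the old names.
# GNU sed's -i flag edits the reST sources in place.
sed -i 's/logistic_sgd/logreg/g; s/convolutional_mlp/lenet/g' doc/*.txt
```

Because the old label names appear only as link targets and labels, a plain textual substitution is sufficient here; in docs where the old name also occurs as ordinary prose, a per-role pattern would be safer.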
