Commit bf266e5

Merge branch 'master' of git@github.com:lisa-lab/DeepLearningTutorials

2 parents: cc97220 + dc00dcc

4 files changed: 15 additions & 15 deletions

doc/gettingstarted.txt

Lines changed: 3 additions & 3 deletions

@@ -23,7 +23,7 @@ Datasets
 MNIST Dataset
 +++++++++++++

-(`mnist.pkl.gz <http://www.iro.umontreal.ca/~lisa/deep/data/mnist/mnist.pkl.gz>`_)
+(`mnist.pkl.gz <http://deeplearning.net/data/mnist/mnist.pkl.gz>`_)

 The `MNIST <http://yann.lecun.com/exdb/mnist>`_ dataset consists of handwritten
 digit images and it is divided in 60 000 examples for the training set and
@@ -48,7 +48,7 @@ MNIST Dataset
 .. |5| image:: images/mnist_5.png

 For convenience we pickled the dataset to make it easier to use in python.
-It is available for download `here <http://www.iro.umontreal.ca/~lisa/deep/data/mnist/mnist.pkl.gz>`_.
+It is available for download `here <http://deeplearning.net/data/mnist/mnist.pkl.gz>`_.
 The pickled file represents a tuple of 3 lists : the training set, the
 validation set and the testing set. Each of the three lists is a pair
 formed from a list of images and a list of class labels for each of the
@@ -667,7 +667,7 @@ another machine in a distributed job.
 Read more about `serialization in Theano`_, or Python's `pickling`_.

 .. _pickling: http://docs.python.org/library/pickle.html
-.. _serialization in Theano: http://deeplearning.net/theanodoc/tutorial/loading_and_saving.html
+.. _serialization in Theano: http://deeplearning.net/software/theano/tutorial/loading_and_saving.html

 Plotting Intermediate Results
 ++++++++++++++++++++++++++++++

doc/intro.txt

Lines changed: 2 additions & 2 deletions

@@ -41,7 +41,7 @@ The unsupervised and semi-supervised learning algorithms are less co-dependent,
 * :ref:`encoder_decoder`


-.. _Theano: http://deeplearning.net/theanodoc
+.. _Theano: http://deeplearning.net/software/theano

-.. _Theano basic tutorial: http://deeplearning.net/theanodoc/tutorial
+.. _Theano basic tutorial: http://deeplearning.net/software/theano/tutorial

doc/logreg.txt

Lines changed: 5 additions & 5 deletions

@@ -13,13 +13,13 @@ Classifying MNIST digits using Logistic Regression
 .. note::
     The code for this section is available for download `here`_.

-.. _here: http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/logistic_sgd.py
+.. _here: http://deeplearning.net/tutorial/code/logistic_sgd.py

-.. _shared variables: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#using-shared-variables
+.. _shared variables: http://deeplearning.net/software/theano/tutorial/examples.html#using-shared-variables

-.. _basic arithmetic ops: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/adding.html#adding-two-scalars
+.. _basic arithmetic ops: http://deeplearning.net/software/theano/tutorial/adding.html#adding-two-scalars

-.. _T.grad: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#computing-gradients
+.. _T.grad: http://deeplearning.net/software/theano/tutorial/examples.html#computing-gradients

 In this section, we show how Theano can be used to implement the most basic
 classifier: the logistic regression. We start off with a quick primer of the
@@ -389,7 +389,7 @@ error of 7.489%. On the GPU the code does almost 10.0 epochs/sec.

 .. [#f1] For smaller datasets and simpler models, more sophisticated descent
     algorithms can be more effective. The sample code
-    `logistic_cg.py <http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/logistic_cg.py>`_
+    `logistic_cg.py <http://deeplearning.net/tutorial/code/logistic_cg.py>`_
     demonstrates how to use SciPy's conjugate gradient solver with Theano
     on the logistic regression task.

doc/mlp.txt

Lines changed: 5 additions & 5 deletions

@@ -14,15 +14,15 @@ Multilayer Perceptron
 .. note::
     The code for this section is available for download `here`_.

-.. _here: http://www.iro.umontreal.ca/~lisa/deep/tutorial/code/mlp.py
+.. _here: http://deeplearning.net/tutorial/code/mlp.py

-.. _T.tanh: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html?highlight=tanh
+.. _T.tanh: http://deeplearning.net/software/theano/tutorial/examples.html?highlight=tanh

-.. _shared variables: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#using-shared-variables
+.. _shared variables: http://deeplearning.net/software/theano/tutorial/examples.html#using-shared-variables

-.. _basic arithmetic ops: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/adding.html#adding-two-scalars
+.. _basic arithmetic ops: http://deeplearning.net/software/theano/tutorial/adding.html#adding-two-scalars

-.. _T.grad: http://www.iro.umontreal.ca/~lisa/deep/theanodoc/tutorial/examples.html#computing-gradients
+.. _T.grad: http://deeplearning.net/software/theano/tutorial/examples.html#computing-gradients


 The next architecture we are going to present using Theano is the single-hidden
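All four files apply the same small set of URL prefix rewrites (old LISA-lab paths to their deeplearning.net equivalents). A minimal sketch of that mapping as a reusable helper — the `REWRITES` table and `migrate_url` name are illustrative, not part of the repository:

```python
# Prefix rewrites inferred from this commit's diffs. Order matters:
# the trailing-slash "theanodoc/" entry must be tried before the bare one.
REWRITES = [
    ("http://www.iro.umontreal.ca/~lisa/deep/data/", "http://deeplearning.net/data/"),
    ("http://www.iro.umontreal.ca/~lisa/deep/tutorial/", "http://deeplearning.net/tutorial/"),
    ("http://www.iro.umontreal.ca/~lisa/deep/theanodoc/", "http://deeplearning.net/software/theano/"),
    ("http://deeplearning.net/theanodoc/", "http://deeplearning.net/software/theano/"),
    ("http://deeplearning.net/theanodoc", "http://deeplearning.net/software/theano"),
]

def migrate_url(url: str) -> str:
    """Apply the first matching prefix rewrite; leave other URLs untouched."""
    for old, new in REWRITES:
        if url.startswith(old):
            return new + url[len(old):]
    return url
```

Running every link in the doc/ tree through such a function (rather than editing each occurrence by hand) would make a sweep like this commit mechanical and easy to audit.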
