
Commit 66a1218

cosmetic changes
1 parent: a4031a4 · commit: 66a1218

File tree

6 files changed: +222 −90 lines changed

Chapter1_Introduction/Chapter1_Introduction.ipynb

Lines changed: 56 additions & 31 deletions
Large diffs are not rendered by default.

Chapter2_MorePyMC/MorePyMC.ipynb

Lines changed: 72 additions & 22 deletions
Large diffs are not rendered by default.

Chapter3_MCMC/IntroMCMC.ipynb

Lines changed: 23 additions & 9 deletions
@@ -241,7 +241,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Example: Unsupervised Clustering using Mixture Model\n",
+"##### Example: Unsupervised Clustering using Mixture Model\n",
 "\n",
 "------------\n",
 "\n",
@@ -642,8 +642,10 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"_____\n",
-"### Example: Poisson Regression [Needs work]\n",
+"\n",
+"##### Example: Poisson Regression [Needs work]\n",
+"\n",
+"---\n",
 "\n",
 "Perhaps the most important result from medical research was the *now obvious* link between *smoking and cancer*. We'll try to establish a link using Bayesian methods. We have a decision here: should we include a prior that biases us towards there existing a significant link between smoking and cancer? I think we should act like scientists at the turn of the century, and assume there's is no *a priori* reason to assume a link. \n",
 "\n",
@@ -1035,8 +1037,7 @@
 " margin-right:auto;\n",
 " }\n",
 " h1 {\n",
-" text-align:center;\n",
-" font-family:\"Charis SIL\", serif;\n",
+" font-family: \"Charis SIL\", Palatino, serif;\n",
 " }\n",
 " div.text_cell_render{\n",
 " font-family: Computer Modern, \"Helvetica Neue\", Arial, Helvetica, Geneva, sans-serif;\n",
@@ -1047,21 +1048,34 @@
 " margin-right:auto;\n",
 " }\n",
 " .CodeMirror{\n",
-" font-family: Consolas, monospace;\n",
+" font-family: \"Source Code Pro\", source-code-pro,Consolas, monospace;\n",
 " }\n",
 " .prompt{\n",
 " display: None;\n",
 " }\n",
+" .text_cell_render h5 {\n",
+" font-weight: 300;\n",
+" font-size: 16pt;\n",
+" color: #4057A1;\n",
+" font-style: italic;\n",
+" margin-bottom: .5em;\n",
+" margin-top: 0.5em;\n",
+" display: block;\n",
+" }\n",
+" \n",
+" .warning{\n",
+" color: rgb( 240, 20, 20 )\n",
+" }\n",
 "</style>"
 ],
 "output_type": "pyout",
-"prompt_number": 15,
+"prompt_number": 1,
 "text": [
-"<IPython.core.display.HTML at 0x5e01370>"
+"<IPython.core.display.HTML at 0x82d5dd8>"
 ]
 }
 ],
-"prompt_number": 15
+"prompt_number": 1
 },
 {
 "cell_type": "code",

Chapter4_TheGreatestTheoremNeverTold/LawOfLargeNumbers.ipynb

Lines changed: 28 additions & 13 deletions
@@ -24,21 +24,20 @@
 "metadata": {},
 "source": [
 "#Chapter 4\n",
+"______\n",
 "\n",
 "##The greatest theorem never told\n",
 "\n",
 "\n",
 "\n",
-"> This relatively short chapter focuses on an idea that is always bouncing around our heads, but is rarely made explicit outside books devoted to statistics or Monte Carlo. In fact, we've been used this idea in every example so far. \n",
-"\n",
-"______"
+"> This relatively short chapter focuses on an idea that is always bouncing around our heads, but is rarely made explicit outside books devoted to statistics or Monte Carlo. In fact, we've been used this idea in every example so far. "
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"##The Law of Large Numbers\n",
+"###The Law of Large Numbers\n",
 "\n",
 "Let $Z_i$ be samples from some probability distribution. According to *the Law of Large numbers*, so long as $E[Z]$ is finite, the following holds,\n",
 "\n",
@@ -55,7 +54,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Intution \n",
+"### Intuition \n",
 "\n",
 "If the above Law is somewhat surprising, it can be made more clear be examining a simple example. \n",
 "\n",
@@ -80,8 +79,9 @@
 "\n",
 "Equality holds in the limit, but we can get closer and closer by using more and more samples in the average. This Law holds for *any distribution*, minus some pathological examples that only mathematicians have fun with. \n",
 "\n",
+"##### Example\n",
 "____\n",
-"### Example\n",
+"\n",
 "\n",
 "Below is a diagram of the Law of Large numbers in action for three different sequences of Poisson random variables. \n",
 "\n",
@@ -258,8 +258,11 @@
 "\n",
 "The Law of Large Numbers is only valid as $N$ gets *infinitely* large: the law is treasure at the end of an infinite rainbow. While the law is a powerful tool, it is foolhardy to apply it liberally. Our next example illustrates this.\n",
 "\n",
+"\n",
+"\n",
+"##### Example: Aggregated geographic data\n",
+"\n",
 "--------\n",
-"### Example\n",
 "\n",
 "Often data comes in aggregated form. For instance, data may be grouped by state, county, or city level. Of course, the population numbers vary per geographic area. If included in the data is an average of some characteristic of each the geographic area, we must be concious of the Law of Large Numbers and how it can *fail* for areas with small populations.\n",
 "\n",
@@ -483,7 +486,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Exercises\n",
+"##### Exercises\n",
 "\n",
 "1\\. How would you estimate the quantity $E\\left[ \\cos{X} \\right]$, where $X \\sim \\text{Exp}(4)$? What about $E\\left[ \\cos{X} | X \\lt 1\\right]$, i.e. the expected value *given* we know $X$ is less than 1? Would you need more samples than the original samples size to be equally as accurate?"
 ]
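(One hedged way to attack exercise 1 with the chapter's own Monte Carlo estimator — assuming $\text{Exp}(4)$ means rate 4, i.e. scale 1/4, a parametrization the diff does not pin down:

    import numpy as np

    X = np.random.exponential(scale=1.0 / 4, size=100_000)
    print(np.cos(X).mean())      # Monte Carlo estimate of E[cos X]
    kept = X[X < 1]              # condition on X < 1 by keeping only those draws
    print(np.cos(kept).mean())   # estimate of E[cos X | X < 1]

The conditional estimate uses only the draws below 1, so matching its accuracy requires more raw samples than the unconditional one.)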
@@ -507,7 +510,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"2. The following table was located in the paper \"Going for Three: Predicting the Likelihood of Field Goal Success with Logistic Regression\" [2]. What mistake have the researchers made?\n",
+"2. The following table was located in the paper \"Going for Three: Predicting the Likelihood of Field Goal Success with Logistic Regression\" [2]. The table ranks football field-goal kickers by there percent of non-misses. What mistake have the researchers made?\n",
 "\n",
 "-----\n",
 "\n",
@@ -551,8 +554,7 @@
 " margin-right:auto;\n",
 " }\n",
 " h1 {\n",
-" text-align:center;\n",
-" font-family:\"Charis SIL\", serif;\n",
+" font-family: \"Charis SIL\", Palatino, serif;\n",
 " }\n",
 " div.text_cell_render{\n",
 " font-family: Computer Modern, \"Helvetica Neue\", Arial, Helvetica, Geneva, sans-serif;\n",
@@ -563,17 +565,30 @@
 " margin-right:auto;\n",
 " }\n",
 " .CodeMirror{\n",
-" font-family: Consolas, monospace;\n",
+" font-family: \"Source Code Pro\", source-code-pro,Consolas, monospace;\n",
 " }\n",
 " .prompt{\n",
 " display: None;\n",
 " }\n",
+" .text_cell_render h5 {\n",
+" font-weight: 300;\n",
+" font-size: 16pt;\n",
+" color: #4057A1;\n",
+" font-style: italic;\n",
+" margin-bottom: .5em;\n",
+" margin-top: 0.5em;\n",
+" display: block;\n",
+" }\n",
+" \n",
+" .warning{\n",
+" color: rgb( 240, 20, 20 )\n",
+" }\n",
 "</style>"
 ],
 "output_type": "pyout",
 "prompt_number": 1,
 "text": [
-"<IPython.core.display.HTML at 0x581b050>"
+"<IPython.core.display.HTML at 0x82addd8>"
 ]
 }
 ],

Chapter5_LossFunctions/LossFunctions.ipynb

Lines changed: 28 additions & 12 deletions
@@ -131,8 +131,10 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+"\n",
+"##### Example: Optimizing for the *Showcase* on *The Price is Right*\n",
+"\n",
 "______________________________________\n",
-"### Example: Optimizing for the *Showcase* on *The Price is Right*\n",
 "\n",
 "Bless you if you are ever choosen as a contestant on the Price is Right, for here we will show you how to optimize your final price on the *Showcase*. For those who forget the rules:\n",
 "\n",
@@ -234,7 +236,8 @@
 "input": [
 "_hist = plt.hist( price_trace, bins = 50, normed= True, histtype= \"stepfilled\")\n",
 "plt.title( \"Posterior of the true price estimate\" )\n",
-"plt.vlines( mu_prior, 0, 1.1*np.max(_hist[0] ), label = \"prior's mean\", linestyles=\"--\" )\n",
+"plt.vlines( mu_prior, 0, 1.1*np.max(_hist[0] ), label = \"prior's mean\",\n",
+"            linestyles=\"--\" )\n",
 "plt.vlines( price_trace.mean(), 0, 1.1*np.max(_hist[0] ), \\\n",
 "            label = \"posterior's mean\")\n",
 "plt.legend()"
@@ -518,8 +521,10 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
+"\n",
+"##### Example: Financial prediction\n",
+"\n",
 "____\n",
-"### Example: Financial prediction\n",
 "\n",
 "Suppose the future return of a stock price is very small, say 0.01 (or 1%). We have a model that predicts the stock's future price, and our profit and loss is directly tied to us acting on the prediction. How should be measure the loss associated with the model's predictions, and subsequent future predictions? A squared-error loss is agnogstic to the signage and would penalize a prediction of -0.01 equally as bad a prediction of 0.03:\n",
 "\n",
805810
"\n",
806811
"A good sanity check that our model is still reasonable: as the signal becomes more and more extreme, and we feel more and more confident about the positive/negativeness of returns, our position converges with that of the least-squares line. \n",
807812
"\n",
808-
"The sparse-prediction model is not trying to *fit* the data the best (according to a *squared-error loss* definition of *fit*). That honour would go to the least-squares model. The sparse-prediction model is trying to find the best prediction *with respect to our `stock_loss`-defined loss*. We can turn this reasoning around: the least-squares model is not try to *predict* the best (according to a *`stock-loss`* definition of *predict*). That honour would go the *sparse prediction* model. The least-squares model is trying to find the best fit of the data *with respect to the squared-error loss*.\n",
809-
"\n",
810-
"\n",
811-
"-------\n"
813+
"The sparse-prediction model is not trying to *fit* the data the best (according to a *squared-error loss* definition of *fit*). That honour would go to the least-squares model. The sparse-prediction model is trying to find the best prediction *with respect to our `stock_loss`-defined loss*. We can turn this reasoning around: the least-squares model is not try to *predict* the best (according to a *`stock-loss`* definition of *predict*). That honour would go the *sparse prediction* model. The least-squares model is trying to find the best fit of the data *with respect to the squared-error loss*.\n"
812814
]
813815
},
814816
{
815817
"cell_type": "markdown",
816818
"metadata": {},
817819
"source": [
818-
"### Example: Kaggle contest on *Observing Dark World*\n",
820+
"##### Example: Kaggle contest on *Observing Dark World*\n",
821+
"\n",
822+
"----\n",
819823
"\n",
820824
"A personal motivation for learning Bayesian methods was trying to piece together the winning solution to Kaggle's [*Observing Dark Worlds*](http://www.kaggle.com/c/DarkWorlds) contest. From the contest's website:\n",
821825
"\n",
@@ -1539,8 +1543,7 @@
15391543
" margin-right:auto;\n",
15401544
" }\n",
15411545
" h1 {\n",
1542-
" text-align:center;\n",
1543-
" font-family:\"Charis SIL\", serif;\n",
1546+
" font-family: \"Charis SIL\", Palatino, serif;\n",
15441547
" }\n",
15451548
" div.text_cell_render{\n",
15461549
" font-family: Computer Modern, \"Helvetica Neue\", Arial, Helvetica, Geneva, sans-serif;\n",
@@ -1551,17 +1554,30 @@
 " margin-right:auto;\n",
 " }\n",
 " .CodeMirror{\n",
-" font-family: Consolas, monospace;\n",
+" font-family: \"Source Code Pro\", source-code-pro,Consolas, monospace;\n",
 " }\n",
 " .prompt{\n",
 " display: None;\n",
 " }\n",
+" .text_cell_render h5 {\n",
+" font-weight: 300;\n",
+" font-size: 16pt;\n",
+" color: #4057A1;\n",
+" font-style: italic;\n",
+" margin-bottom: .5em;\n",
+" margin-top: 0.5em;\n",
+" display: block;\n",
+" }\n",
+" \n",
+" .warning{\n",
+" color: rgb( 240, 20, 20 )\n",
+" }\n",
 "</style>"
 ],
 "output_type": "pyout",
 "prompt_number": 1,
 "text": [
-"<IPython.core.display.HTML at 0x586b050>"
+"<IPython.core.display.HTML at 0x825be80>"
 ]
 }
 ],

styles/custom.css

Lines changed: 15 additions & 3 deletions
@@ -9,8 +9,7 @@
     margin-right:auto;
     }
     h1 {
-        text-align:center;
-        font-family:"Charis SIL", serif;
+        font-family: "Charis SIL", Palatino, serif;
     }
     div.text_cell_render{
     font-family: Computer Modern, "Helvetica Neue", Arial, Helvetica, Geneva, sans-serif;
@@ -21,9 +20,22 @@
     margin-right:auto;
     }
     .CodeMirror{
-        font-family: Consolas, monospace;
+        font-family: "Source Code Pro", source-code-pro,Consolas, monospace;
     }
     .prompt{
     display: None;
     }
+    .text_cell_render h5 {
+        font-weight: 300;
+        font-size: 16pt;
+        color: #4057A1;
+        font-style: italic;
+        margin-bottom: .5em;
+        margin-top: 0.5em;
+        display: block;
+    }
+
+    .warning{
+        color: rgb( 240, 20, 20 )
+    }
 </style>
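Taken together, the new `.text_cell_render h5` rule is what makes the headings demoted from `###` to `#####` in the notebook diffs above render as styled italic blue "Example:" banners, and `.warning` adds a red style for cautionary text.

(A minimal sketch of how a notebook would pull this stylesheet in — the `css_styling` helper name and relative path are assumptions, though the `<IPython.core.display.HTML ...>` outputs visible in the diffs suggest a cell of roughly this shape:

    from IPython.core.display import HTML

    def css_styling():
        # read the repo's stylesheet and inject it into the notebook
        styles = open("../styles/custom.css", "r").read()
        return HTML(styles)

    css_styling()
)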
