
Commit fba8204

Merge pull request CamDavidsonPilon#230 from zapakh/master
Use possessive 'its' where appropriate
2 parents 28b8d4d + cb07678 commit fba8204

3 files changed: +3 −3 lines changed


Chapter3_MCMC/IntroMCMC.ipynb

Lines changed: 1 addition & 1 deletion
@@ -1065,7 +1065,7 @@
 "source": [
 "One way to think of autocorrelation is \"If I know the position of the series at time $s$, can it help me know where I am at time $t$?\" In the series $x_t$, the answer is No. By construction, $x_t$ are random variables. If I told you that $x_2 = 0.5$, could you give me a better guess about $x_3$? No.\n",
 "\n",
-"On the other hand, $y_t$ is autocorrelated. By construction, if I knew that $y_2 = 10$, I can be very confident that $y_3$ will not be very far from 10. Similarly, I can even make a (less confident guess) about $y_4$: it will probably not be near 0 or 20, but a value of 5 is not too unlikely. I can make a similar argument about $y_5$, but again, I am less confident. Taking this to it's logical conclusion, we must concede that as $k$, the lag between time points, increases the autocorrelation decreases. We can visualize this:\n"
+"On the other hand, $y_t$ is autocorrelated. By construction, if I knew that $y_2 = 10$, I can be very confident that $y_3$ will not be very far from 10. Similarly, I can even make a (less confident guess) about $y_4$: it will probably not be near 0 or 20, but a value of 5 is not too unlikely. I can make a similar argument about $y_5$, but again, I am less confident. Taking this to its logical conclusion, we must concede that as $k$, the lag between time points, increases the autocorrelation decreases. We can visualize this:\n"
 ]
 },
 {
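
The passage above describes autocorrelation decaying as the lag $k$ grows. A minimal sketch of that decay, in the book's Python idiom but not the notebook's own code (the `autocorr` helper and the random-walk construction of $y_t$ are illustrative assumptions):

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative series, not the notebook's construction:
# x_t is independent noise; y_t is a random walk, so each value
# depends on the previous one and the series is autocorrelated.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(0, 1, n)
y = 10 + np.cumsum(rng.normal(0, 1, n))

def autocorr(series, k):
    """Sample autocorrelation of `series` at lag `k` (assumed helper)."""
    s = series - series.mean()
    return np.dot(s[:-k], s[k:]) / np.dot(s, s)

lags = np.arange(1, 50)
plt.plot(lags, [autocorr(x, k) for k in lags], label="$x_t$: independent")
plt.plot(lags, [autocorr(y, k) for k in lags], label="$y_t$: autocorrelated")
plt.xlabel("lag $k$")
plt.ylabel("sample autocorrelation")
plt.legend()
plt.show()
```

As the text predicts, the $x_t$ curve hovers near zero at every lag, while the $y_t$ curve starts high and falls off as $k$ increases.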

Chapter4_TheGreatestTheoremNeverTold/LawOfLargeNumbers.ipynb

Lines changed: 1 addition & 1 deletion
@@ -72,7 +72,7 @@
 "\n",
 "Below is a diagram of the Law of Large numbers in action for three different sequences of Poisson random variables. \n",
 "\n",
-" We sample `sample_size = 100000` Poisson random variables with parameter $\\lambda = 4.5$. (Recall the expected value of a Poisson random variable is equal to it's parameter.) We calculate the average for the first $n$ samples, for $n=1$ to `sample_size`. "
+" We sample `sample_size = 100000` Poisson random variables with parameter $\\lambda = 4.5$. (Recall the expected value of a Poisson random variable is equal to its parameter.) We calculate the average for the first $n$ samples, for $n=1$ to `sample_size`. "
 ]
 },
 {
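
The changed sentence describes a running-average experiment. A self-contained sketch of that procedure (one sequence rather than the notebook's three; the actual figure code may differ):

```python
import numpy as np
import matplotlib.pyplot as plt

# Law of Large Numbers: the average of the first n Poisson(4.5)
# samples converges to the parameter 4.5 as n grows.
sample_size = 100000
lam = 4.5
samples = np.random.poisson(lam, sample_size)
n = np.arange(1, sample_size + 1)
running_average = np.cumsum(samples) / n  # mean of the first n samples

plt.plot(n, running_average)
plt.axhline(lam, color="k", linestyle="--", label="$\\lambda = 4.5$")
plt.xlabel("$n$")
plt.ylabel("average of first $n$ samples")
plt.legend()
plt.show()
```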

Chapter6_Priorities/Priors.ipynb

Lines changed: 1 addition & 1 deletion
@@ -1272,7 +1272,7 @@
 "source": [
 "(Plots like these are what inspired the book's cover.)\n",
 "\n",
-"What can we say about the results above? Clearly TSLA has been a strong performer, and our analysis suggests that it has an almost 1% daily return! Similarly, most of the distribution of AAPL is negative, suggesting that it's *true daily return* is negative.\n",
+"What can we say about the results above? Clearly TSLA has been a strong performer, and our analysis suggests that it has an almost 1% daily return! Similarly, most of the distribution of AAPL is negative, suggesting that its *true daily return* is negative.\n",
 "\n",
 "\n",
 "You may not have immediately noticed, but these variables are a whole order of magnitude *less* than our priors on them. For example, to put these one the same scale as the above prior distributions:"
