|
444 | 444 | "\n",
|
445 | 445 | " If, in reality, no sudden change occurred and indeed $\\lambda_1 = \\lambda_2$, then the posterior distributions of $\\lambda_1$ and $\\lambda_2$ should look about equal.\n",
|
446 | 446 | "\n",
|
447 |
| - "What would be good prior distributions for $\\lambda_1$ and $\\lambda_2$? Recall that $\\lambda_i, \\; i=1,2,$ can be any positive number. The *exponential* random variable has a density function for any positive number. This would be a good choice to model $\\lambda_i$. But again, we need a parameter for this exponential distribution: call it $\\alpha$.\n", |
| 447 | + "We are interested in inferring the unknown $\\lambda$s. To use Bayesian inference, we need to assign prior probabilities to the different possible values of $\\lambda$. What would be good prior probability distributions for $\\lambda_1$ and $\\lambda_2$? Recall that $\\lambda_i, \\; i=1,2,$ can be any positive number. The *exponential* distribution provides a continuous density over all positive numbers, so it is a good choice for modeling $\\lambda_i$. But the exponential distribution has a parameter of its own, which we must also include in the model: call it $\\alpha$.\n", |
448 | 448 | "\n",
|
449 | 449 | "\\begin{align}\n",
|
450 | 450 | "&\\lambda_1 \\sim \\text{Exp}( \\alpha ) \\\\\n",
|
|
464 | 464 | "& \\Rightarrow P( \\tau = k ) = \\frac{1}{70}\n",
|
465 | 465 | "\\end{align}\n",
|
466 | 466 | "\n",
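"As a preview of how directly these priors translate into code, here is a minimal sketch, assuming the PyMC 2.x API (`pm.Exponential`, `pm.DiscreteUniform`) that this book uses; the `count_data` array below is a hypothetical stand-in for the real message-count dataset introduced shortly:\n",
"\n",
"```python\n",
"import numpy as np\n",
"import pymc as pm\n",
"\n",
"# hypothetical stand-in for the observed daily text-message counts\n",
"count_data = np.array([13, 24, 8, 24, 7, 35, 14, 11, 15, 37])\n",
"n_count_data = len(count_data)\n",
"\n",
"alpha = 1.0 / count_data.mean()  # hyperparameter for the exponential priors\n",
"\n",
"lambda_1 = pm.Exponential(\"lambda_1\", alpha)  # rate before the changepoint\n",
"lambda_2 = pm.Exponential(\"lambda_2\", alpha)  # rate after the changepoint\n",
"tau = pm.DiscreteUniform(\"tau\", lower=0, upper=n_count_data)  # day of the changepoint\n",
"```\n",
"\n",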
|
467 |
| - "So after all this, what does our prior distribution look like? Frankly, *it doesn't matter*. What we should understand is that it would be an ugly, complicated, mess involving symbols only a mathematician would love. And things would only get uglier the more complicated our models become. Regardless, all we really care about is the posterior distribution anyways. We next turn to PyMC, a Python library for performing Bayesian analysis, and is agnostic to the mathematical monster we have created. \n", |
| 467 | + "So after all this, what does our overall prior for the unknown variables look like? Frankly, *it doesn't matter*. What we should understand is that it would be an ugly, complicated mess involving symbols only a mathematician would love. And things would only get uglier the more complicated our models become. Regardless, all we really care about is the posterior distribution. We next turn to PyMC, a Python library for performing Bayesian analysis that is agnostic to the mathematical monster we have created.\n", |
468 | 468 | "\n",
|
469 | 469 | "\n",
|
470 | 470 | "Introducing our first hammer: PyMC\n",
|
471 | 471 | "-----\n",
|
472 | 472 | "\n",
|
473 |
| - "PyMC is a Python library for programming Bayesian analysis [3]. It is a fast, well-maintained library. The only unfortunate part is that documentation can be lacking in areas, especially the bridge between problem to solution. One this book's main goals is to solve that problem, and also to demonstrate why PyMC is so cool.\n", |
| 473 | + "PyMC is a Python library for programming Bayesian analysis [3]. It is a fast, well-maintained library. The only unfortunate part is that its documentation is lacking in certain areas, especially those that bridge the gap from beginner to hacker. One of this book's main goals is to solve that problem, and also to demonstrate why PyMC is so cool.\n", |
474 | 474 | "\n",
|
475 | 475 | "We will model the above problem using the PyMC library. This type of programming is called *probabilistic programming*, an unfortunate misnomer that invokes ideas of randomly-generated code and has likely confused and frightened users away from this field. The code is not random. The title is given because we create probability models using programming variables as the model's components. This will be the last time I use the term *probabilistic programming*. Instead, I'll simply use *programming*, as that is what it really is.\n",
|
476 | 476 | "\n",
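"To make concrete the idea that programming variables serve as the model's components, here is a tiny self-contained example, again assuming the PyMC 2.x API; `lam` is a hypothetical variable name, not part of the model above:\n",
"\n",
"```python\n",
"import pymc as pm\n",
"\n",
"# a random variable is created and named like any other Python object\n",
"lam = pm.Exponential(\"lam\", 1.0)\n",
"\n",
"print(lam.value)     # the variable's current value in the model\n",
"print(lam.random())  # draw a fresh value from its prior distribution\n",
"```\n",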
|
|