Review the prerequisite material from probability and calculus. If there are any topics from probability that you don’t have memorized, please start memorizing them now; this will make the rest of the class go much more smoothly. If you have any questions about this material, don’t hesitate to ask me in office hours or on Piazza; I’m happy to help.
Over the next one or two classes we will cover material from Chapter 6 of the text. Give the relevant sections a brief skim.
Homework 0 due 5 pm Mon, Jan 27
Fri, Jan 24
In class, we will work on:
Wrap up any remaining probability examples from last class
Define the \(t\), \(F\), and \(\chi^2\) distributions; this material is in Chapter 6 of Rice. Lecture notes: pdf. Note that there is a minor error in the lecture notes that we did not make in class: on page 5, the denominator should be the square root of a chi-squared random variable divided by its degrees of freedom. In other words, once you take the square root of the squared denominator, you can see that the ratio has a \(t\) distribution. (The construction is restated after this list.)
Lab 01. See labs page for solutions.
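To make the note about the lecture-notes error concrete, here is the standard construction of the \(t\) distribution (this just restates the corrected denominator; the notation \(Z\), \(W\), \(k\) is mine, not from the notes):
\[
Z \sim \mathrm{N}(0, 1), \quad W \sim \chi^2_k, \quad Z \text{ and } W \text{ independent}
\quad \Longrightarrow \quad
T = \frac{Z}{\sqrt{W / k}} \sim t_k .
\]
In particular, \(\frac{\bar{X} - \mu}{S / \sqrt{n}}\) has this form with \(k = n - 1\) when the \(X_i\) are i.i.d. normal.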
After class, please:
Reading: Over the next few classes we will cover material from Sections 8.1, 8.2, 8.3, and 8.5 of the text. Give those sections in the book a brief skim.
Mon, Jan 27
In class, we will work on:
Lecture notes for maximum likelihood estimation for this example: pdf
We had a few extra minutes, so we also defined an estimator (a random variable) and an estimate (the realized value of an estimator based on an observed sample; a number).
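Since the linked notes aren’t reproduced here, this is just the generic maximum likelihood setup rather than the specific in-class example: for an i.i.d. sample \(X_1, \ldots, X_n\) with density or mass function \(f(x \mid \theta)\),
\[
\ell(\theta) = \sum_{i=1}^n \log f(X_i \mid \theta), \qquad \hat{\theta}_{MLE} = \arg\max_{\theta} \ell(\theta).
\]
Viewed as a function of the random sample, \(\hat{\theta}_{MLE}\) is an estimator; plugging in the observed values \(x_1, \ldots, x_n\) gives an estimate.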
After class, please:
Reading: Over the next few classes we will cover material from Sections 8.1, 8.2, 8.3, and 8.5 of the text. Give those sections in the book a brief skim.
Homework 0 due 5 pm today, Mon, Jan 27
Wed, Jan 29
In class, we will work on:
Quiz on probability background. This will be very similar to the examples from Wednesday Jan 22 and HW 0. Please memorize everything on the “Topics from probability” document linked on the Resources page, and know how to use it. You only need to know the informal statement of the Central Limit Theorem.
Reminder about Taylor’s theorem, from calculus review handout (linked on resources page)
Intro to Newton’s method for optimization. Slides: pdf
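Connecting the two items above (the notation here is generic, not taken from the slides): Newton’s method maximizes a second-order Taylor approximation of the objective at each step. For a log-likelihood \(\ell\) with a scalar parameter,
\[
\ell(\theta) \approx \ell(\theta_t) + \ell'(\theta_t)(\theta - \theta_t) + \tfrac{1}{2}\, \ell''(\theta_t)(\theta - \theta_t)^2 ,
\]
and setting the derivative of the right-hand side to zero gives the update
\[
\theta_{t+1} = \theta_t - \frac{\ell'(\theta_t)}{\ell''(\theta_t)} .
\]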
After class, please:
Homework 1 due 5 pm Wed, Feb 5
Reading: Over the next few days, we’ll explore using Newton’s method to numerically optimize the log-likelihood function. Wikipedia is a reasonably good source; you could give the following a skim (a small code sketch applying Newton’s method to a log-likelihood also appears after this reading list):
Remind yourself about how Taylor’s theorem works by skimming the Wikipedia article and/or looking at the last page of the review of topics from calculus posted on the Resources page of the course website. We’ll review this at the beginning of class on Wednesday too.
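As a companion to this reading, here is a minimal sketch of Newton’s method applied to a log-likelihood. The gamma-shape example, starting value, and tolerance are my own illustrative choices; they are not from the slides or the homework.

```r
# Newton's method for the MLE of a gamma shape parameter (rate fixed at 1).
# Illustrative sketch only; this is not the example used in class.
set.seed(1)
x <- rgamma(100, shape = 3, rate = 1)   # simulated data
n <- length(x)
sum_log_x <- sum(log(x))

# Log-likelihood derivatives in the shape parameter alpha:
#   l(alpha)   = (alpha - 1) * sum(log(x)) - sum(x) - n * lgamma(alpha)
#   l'(alpha)  = sum(log(x)) - n * digamma(alpha)
#   l''(alpha) = -n * trigamma(alpha)
score   <- function(alpha) sum_log_x - n * digamma(alpha)
hessian <- function(alpha) -n * trigamma(alpha)

alpha <- mean(x)                          # crude starting value
for (iter in 1:50) {
  step  <- score(alpha) / hessian(alpha)  # Newton step
  alpha <- alpha - step
  if (abs(step) < 1e-8) break             # stop when the update is tiny
}
alpha                                     # approximate MLE of the shape parameter
```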
We also did the first example on this worksheet: pdf
After class, please:
Wed, Feb 19
In class, we will work on:
More on Bayesian inference:
Lab 6
Finish the examples from last class’s practice worksheet.
After class, please:
Homework 3 due Wed, Feb 26
Fri, Feb 21
In class, we will work on:
Bayesian credible intervals: posterior percentiles, highest posterior density. Partial lecture notes are here: pdf. In class I also discussed highest posterior density intervals, which are not in those notes. (A small code sketch computing an equal-tailed interval appears after this list.)
Start on analysis of normal distribution from a Bayesian perspective.
Lecture notes: pdf. Note that for case 1 (unknown mean, known variance) I mixed up the notation a little toward the end of page 1 of these notes; what I wrote in class was correct.
Plots illustrating MSE for Bayesian estimator of the mean when the variance is known: pdf
Another practice example we didn’t have time for: pdf. Solutions: pdf
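Referring back to the credible-interval item above, here is a minimal sketch of an equal-tailed (posterior percentile) interval for a conjugate Beta posterior. The prior, the data, and the 95% level are all made-up illustrations, not values from the lecture notes.

```r
# Equal-tailed 95% credible interval for a binomial success probability
# with a conjugate Beta prior. All numbers below are hypothetical.
a_prior <- 3; b_prior <- 5          # Beta(3, 5) prior
y <- 18; n <- 40                    # observed successes out of n trials

# Posterior is Beta(a_prior + y, b_prior + n - y)
a_post <- a_prior + y
b_post <- b_prior + n - y

# Posterior percentiles: cut off 2.5% in each tail
qbeta(c(0.025, 0.975), a_post, b_post)
```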
If you want reading for the material we’ll cover in the next few days, you can refer to the following sources (in order of how useful I think they are):
At about minute 4:00 in the video, I incorrectly state that the marginal posterior distribution for the mean in a Bayesian analysis of a normal distribution with unknown variance is normal; under the standard conjugate analysis it is actually a scaled and shifted \(t\) distribution.
At about minute 36 I wrote down the wrong formula for the confidence interval. The formula should be \([\bar{x} - t_{n-1}(1 - \frac{\alpha}{2}) \frac{s}{\sqrt{n}}, \bar{x} - t_{n-1}(\frac{\alpha}{2}) \frac{s}{\sqrt{n}}]\)
The videos were longer than I originally planned, so we’ll do the example/lab next class.
R markdown file used to generate the handout: Rmd, GitHub
Introduction and setup:
Errors and Notes:
At about 6:40, I gave an incorrect formula for the estimated standard error of the sample mean. The correct formula is \(\widehat{SE}(\bar{X}) = S/\sqrt{n} = \left[ \frac{\sum_{i=1}^n (X_i - \bar{X})^2}{n - 1} \right]^{0.5} / \sqrt{n}\)
While making this video, I claimed that we really needed a bootstrap for this case. After looking at the example more, I decided that although I’m most comfortable with a bootstrap-based interval in this example, other approaches would probably also work OK.
Percentile bootstrap confidence intervals (a short code sketch appears after this list):
Bootstrap t confidence intervals when a formula is available to calculate the estimated standard error.
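A minimal sketch of the percentile bootstrap interval mentioned above. The data, the statistic (the sample mean), and the number of resamples are placeholders; the actual code for class is in the linked Rmd/handout.

```r
# Percentile bootstrap confidence interval for the mean.
# Illustrative sketch only; all choices below are placeholders.
set.seed(123)
x <- rexp(50, rate = 1)    # made-up data
B <- 10000                 # number of bootstrap resamples

boot_means <- replicate(B, {
  x_star <- sample(x, size = length(x), replace = TRUE)  # resample with replacement
  mean(x_star)                                           # recompute the statistic
})

# Percentile interval: the 2.5% and 97.5% quantiles of the bootstrap statistics
quantile(boot_means, probs = c(0.025, 0.975))
```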
Wed, Apr 15
In class, we will work on:
Continuing with handout from April 13
Bootstrap t confidence intervals when no formula is available to calculate the estimated standard error. Part 1: concepts
Bootstrap t confidence intervals when no formula is available to calculate the estimated standard error. Part 2: code walk-through (a rough sketch of the nested-bootstrap idea appears after this list)
Lab on bootstrap confidence intervals
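Since the actual walk-through is in the video, the following is only a rough sketch of the nested-bootstrap idea behind a bootstrap t interval when no standard error formula is available. The statistic (the sample median), the numbers of resamples, and the confidence level are my own illustrative choices.

```r
# Bootstrap t interval for the median, using a nested (inner) bootstrap to
# estimate the standard error because no closed-form SE formula is used.
# Illustrative sketch only; all tuning choices are placeholders.
set.seed(42)
x <- rgamma(60, shape = 2, rate = 1)    # made-up data
theta_hat <- median(x)

boot_se <- function(data, B = 200) {
  # bootstrap estimate of the standard error of the median
  stats <- replicate(B, median(sample(data, length(data), replace = TRUE)))
  sd(stats)
}

se_hat <- boot_se(x)                    # SE of the original estimate

B_outer <- 1000
t_star <- replicate(B_outer, {
  x_star <- sample(x, length(x), replace = TRUE)   # outer resample
  theta_star <- median(x_star)
  se_star <- boot_se(x_star)                       # inner (nested) bootstrap SE
  (theta_star - theta_hat) / se_star               # studentized statistic
})

# Use quantiles of the t* distribution, with the tails flipped as usual
# for bootstrap t intervals.
q <- quantile(t_star, probs = c(0.025, 0.975))
c(theta_hat - q[2] * se_hat, theta_hat - q[1] * se_hat)
```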
After class, please:
Fri, Apr 17
In class, we will work on:
Reminder about likelihood ratios:
Notes on the worksheet from 2020-03-04 (we talked about this handout in more detail that day): pdf
Example of likelihood ratio tests for a normal distribution. I suggest giving the problem a try before watching the solutions video or looking at the solutions pdf.
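Before you try the example, here is the general setup restated (the normal-distribution specifics are in the worksheet and video, not here):
\[
\Lambda = \frac{\max_{\theta \in \Theta_0} L(\theta)}{\max_{\theta \in \Theta} L(\theta)},
\qquad
-2 \log \Lambda \overset{\text{approx.}}{\sim} \chi^2_{\nu} \text{ under } H_0 \text{ for large } n,
\]
where \(\nu\) is the difference between the number of free parameters in \(\Theta\) and in \(\Theta_0\).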