For Poisson regression, we have log(E(Y)) = X Beta.
For logistic regression, we have log(p/(1-p)) = X Beta. (6/)
As you can see, the right-hand side (RHS) of the regression equation stays the same; it's the left-hand side (LHS) that changes as we go from LM to GLM. (7/)
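Here's a minimal sketch of those two GLMs in Python, assuming numpy and statsmodels are available (the simulated data and coefficient values are made up for illustration):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)          # design matrix with an intercept column
beta = np.array([0.5, 0.8])

# Poisson GLM: log(E(Y)) = X Beta, so E(Y) = exp(X Beta)
y_pois = rng.poisson(np.exp(X @ beta))
pois_fit = sm.GLM(y_pois, X, family=sm.families.Poisson()).fit()

# Logistic GLM: log(p/(1-p)) = X Beta, so p = 1/(1 + exp(-X Beta))
p = 1.0 / (1.0 + np.exp(-(X @ beta)))
y_bin = rng.binomial(1, p)
logit_fit = sm.GLM(y_bin, X, family=sm.families.Binomial()).fit()

print(pois_fit.params, logit_fit.params)   # both should land near (0.5, 0.8)
```

Note that only the family (and hence the link on the LHS) changes between the two fits; the X Beta part is identical.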
Now we have covered LM --> GLM.
Next, let's go LM --> LMM. (10/)
You know there are person-to-person differences. One (naive) way to account for them is to estimate a separate (fixed-effect) intercept for each person. (11/)
Yikes, now you have a lot of parameters to estimate! That burns through your degrees of freedom. (12/)
Instead of estimating each intercept as a separate fixed effect, we can treat the intercepts as random effects and estimate their variance. (13/)
Thus we have swapped out 99 fixed effects for one (or a few) variances. This estimation problem is nicer! (14/)
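Here's a hedged sketch of that swap in Python with statsmodels (the data frame, column names, and parameter values are all made up for illustration):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_people, n_obs = 100, 10
person = np.repeat(np.arange(n_people), n_obs)
u = rng.normal(0, 2.0, size=n_people)          # person-level intercept shifts
x = rng.normal(size=n_people * n_obs)
y = 1.0 + 0.5 * x + u[person] + rng.normal(size=n_people * n_obs)
df = pd.DataFrame({"y": y, "x": x, "person": person})

# Naive approach: one fixed intercept per person -> ~100 extra parameters
naive = smf.ols("y ~ x + C(person)", data=df).fit()

# Mixed model: a single variance parameter for the person intercepts
lmm = smf.mixedlm("y ~ x", data=df, groups=df["person"]).fit()
print(lmm.summary())   # look for the "Group Var" estimate near 4 (= 2.0**2)
```

The mixed model estimates two fixed effects plus two variances, versus the ~100 intercepts in the naive fit.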
We say we have a linear "mixed model" because there's a combo of fixed effects and random effects. (15/)
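In symbols, a linear mixed model with random intercepts looks like this (standard notation, not from the thread itself):

```latex
% LMM: fixed effects X\beta plus random effects Zu plus noise
y = X\beta + Zu + \varepsilon,
\qquad u \sim N(0,\, \sigma_u^2 I),
\qquad \varepsilon \sim N(0,\, \sigma^2 I)
```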
As that cute kid says, "why not both?"
We can have both random effects and a response from an exponential family. That's a GLMM. (17/)
For example, we can add random effects to a Poisson or a logistic regression. These would both be GLMMs. (18/)
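A sketch of simulating from a Poisson GLMM (random intercept per person; all parameter values made up), plus one way to fit it, assuming I recall the statsmodels Bayesian GLMM API correctly (in R, lme4::glmer is the classic tool):

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import PoissonBayesMixedGLM

rng = np.random.default_rng(2)
n_people, n_obs = 100, 10
person = np.repeat(np.arange(n_people), n_obs)
u = rng.normal(0, 0.5, size=n_people)    # random intercepts, variance 0.25
x = rng.normal(size=n_people * n_obs)

# GLMM: log(E(Y | u)) = beta0 + beta1*x + u_person  (GLM link + random effect)
eta = 0.3 + 0.6 * x + u[person]
y = rng.poisson(np.exp(eta))

df = pd.DataFrame({"y": y, "x": x, "person": person})
vc = {"person": "0 + C(person)"}         # variance component: person intercepts
glmm = PoissonBayesMixedGLM.from_formula("y ~ x", vc, df).fit_vb()
print(glmm.summary())
```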
HALF TIME BREAK.