The proportional odds (ordinal regression) model will figure prominently in my dissertation analysis, and since it is a direct extension of the binary logistic model, I figured (for peace of mind) it would be prudent to review the principles and properties undergirding the logistic model. Let's start at the beginning.

Consider probabilities $\pi_1$ and $\pi_2$ that are functions of the parameters $\theta = (\alpha \ \beta)^T$, where $\pi_1$ is the probability of disease given exposure and $\pi_2$ is the probability of disease given no exposure. Similarly, the complement of $\pi_1$, $(1-\pi_1)$, is the probability of no disease given exposure, and $(1-\pi_2)$ is the probability of no disease given no exposure. Symbolically, we can write:

$\pi_1 = P(D \mid E), \quad \pi_2 = P(D \mid \overline{E}), \quad (1-\pi_1) = P(\overline{D} \mid E), \quad \text{and} \quad (1-\pi_2) = P(\overline{D} \mid \overline{E}).$ Given this characterization, if we define $\log(\frac{\pi_1}{1-\pi_1}) = \alpha + \beta$ and $\log(\frac{\pi_2}{1-\pi_2}) = \alpha$, we can show that the inverse logits are given by the following logistic functions:

$\pi_1 = \frac{e^{\alpha + \beta}}{1 + e^{\alpha + \beta}}, \quad (1-\pi_1) = \frac{1}{1 + e^{\alpha + \beta}}, \quad \pi_2 = \frac{e^{\alpha}}{1 + e^{\alpha}}, \quad \text{and} \quad (1-\pi_2) = \frac{1}{1 + e^{\alpha}}.$
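Before walking through the algebra, here is a quick numerical sanity check of these identities. The values of $\alpha$ and $\beta$ below are arbitrary placeholders, not estimates from any data:

```python
import math

def logistic(x):
    # inverse logit: maps a log-odds value to a probability in (0, 1)
    return math.exp(x) / (1 + math.exp(x))

alpha, beta = -1.5, 0.8  # arbitrary example values

pi1 = logistic(alpha + beta)  # P(D | E)
pi2 = logistic(alpha)         # P(D | no E)

# the complements match the 1 / (1 + e^x) forms above
assert math.isclose(1 - pi1, 1 / (1 + math.exp(alpha + beta)))
assert math.isclose(1 - pi2, 1 / (1 + math.exp(alpha)))

# applying the logit recovers the linear predictors
assert math.isclose(math.log(pi1 / (1 - pi1)), alpha + beta)
assert math.isclose(math.log(pi2 / (1 - pi2)), alpha)
```

Swapping in other values of $\alpha$ and $\beta$ leaves every assertion intact, which is exactly what the derivations below establish in general.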

Each equality is derived below (beginning with the logit and proceeding to the logistic form). First, consider $\pi_1$:

Exponentiating both sides of $\log(\frac{\pi_1}{1-\pi_1}) = \alpha + \beta$ gives

$(\frac{\pi_1}{1 - \pi_1}) = e^{\alpha + \beta}$

$\pi_1 = (e^{\alpha + \beta})(1-\pi_1)$

$\pi_1 = e^{\alpha + \beta} - \pi_1 e^{\alpha + \beta}$

$\pi_1 + \pi_1 e^{\alpha + \beta} = e^{\alpha + \beta}$

$\pi_1 (1 + e^{\alpha + \beta}) = e^{\alpha + \beta}$

$\pi_1 = \frac{e^{\alpha + \beta}}{1 + e^{\alpha + \beta}}$

Now consider $(1 - \pi_1)$:

Exponentiating both sides of $\log(\frac{\pi_1}{1-\pi_1}) = \alpha + \beta$ gives

$(\frac{\pi_1}{1 - \pi_1}) = e^{\alpha + \beta}$

$\pi_1 = (e^{\alpha + \beta})(1-\pi_1)$

$(1 - \pi_1) = \frac{\pi_1}{e^{\alpha + \beta}}.$ Note that from the previous derivation

$\pi_1 = \frac{e^{\alpha + \beta}}{1 + e^{\alpha + \beta}}$, and substituting gives

$(1 - \pi_1) = \frac{(\frac{e^{\alpha + \beta}}{1 + e^{\alpha + \beta}})}{e^{\alpha + \beta}}$

which then reduces to $(1 - \pi_1) = \frac{1}{1 + e^{\alpha + \beta}}$

Now, $\pi_2$:

Exponentiating both sides of $\log(\frac{\pi_2}{1-\pi_2}) = \alpha$ gives

$\frac{\pi_2}{1 - \pi_2} = e^\alpha$

$\pi_2 = e^\alpha (1 - \pi_2)$

$\pi_2 = e^\alpha - \pi_2 e^\alpha$

$\pi_2 + \pi_2 e^\alpha = e^\alpha$

$\pi_2 (1 + e^\alpha) = e^\alpha$

$\pi_2 = \frac{e^\alpha}{1 + e^\alpha}$

And, lastly, $(1 - \pi_2)$:

Exponentiating both sides of $\log(\frac{\pi_2}{1-\pi_2}) = \alpha$ gives

$\frac{\pi_2}{1 - \pi_2} = e^\alpha$

$\pi_2 = e^\alpha (1 - \pi_2)$

$(1 - \pi_2) = \frac{\pi_2}{e^\alpha}.$ Recall that $\pi_2 = \frac{e^\alpha}{1 + e^\alpha}$, and when we substitute, we get

$(1 - \pi_2) = \frac{(\frac{e^\alpha}{1 + e^\alpha})}{e^\alpha}$ which reduces to

$(1 - \pi_2) = \frac{1}{1 + e^\alpha}.$
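One practical payoff of this parameterization is that the definitions can be run in reverse: since $\log(\frac{\pi_2}{1-\pi_2}) = \alpha$ and $\log(\frac{\pi_1}{1-\pi_1}) = \alpha + \beta$, the difference of the two logits recovers $\beta$, and $e^\beta$ is the odds ratio comparing exposed to unexposed. A minimal sketch, using hypothetical probabilities chosen for illustration:

```python
import math

def logit(p):
    # log-odds of a probability
    return math.log(p / (1 - p))

# hypothetical example values, not taken from any data
pi1 = 0.40  # P(D | E)
pi2 = 0.25  # P(D | no E)

alpha = logit(pi2)         # logit(pi2) = alpha
beta = logit(pi1) - alpha  # logit(pi1) = alpha + beta

# e^beta is the odds ratio for exposure
odds_ratio = math.exp(beta)
```

With these particular values the exposed odds are $0.4/0.6 = 2/3$ and the unexposed odds are $0.25/0.75 = 1/3$, so the odds ratio is 2.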

In a forthcoming post, I'll continue with the logistic regression model and present the likelihood as well as the log-likelihood functions and, eventually, the maximum likelihood estimates.
