All models are wrong, but some are useful.
Box (1979), Robustness in the strategy of scientific model building
I believe it might interest a philosopher, one who can think for himself, to read my notes. For even if I have hit the mark only rarely, he would still recognize the targets at which I was ceaselessly aiming.
Wittgenstein, On Certainty, 387
contribution
novelty
"Some of the circles are black."
truth-value judgements for "Some of the circles are black."
When would a cooperative speaker say: "Some of the 10 circles are black"?
no. of black circles | probability of using "some" | salient alternative |
---|---|---|
0 | very, very low | "none" |
1 | very low | "one" |
2 | low | "two" |
3 | meh | "three" |
4-6 | high | ??? |
7-9 | lower | "most" |
10 | low | "all" |
upshot
The pragmatic felicity of a description \(m\) for a situation \(c\) is a measure of how adequate \(m\) is for a given purpose of talk relative to alternative descriptions.
##       c=0 c=1 c=2 c=3 c=4 c=5 c=6 c=7 c=8 c=9 c=10
## none    1   0   0   0   0   0   0   0   0   0    0
## one     0   1   0   0   0   0   0   0   0   0    0
## two     0   0   1   0   0   0   0   0   0   0    0
## three   0   0   0   1   0   0   0   0   0   0    0
## many    0   0   0   0   0   1   1   1   1   1    1
## most    0   0   0   0   0   0   1   1   1   1    1
## all     0   0   0   0   0   0   0   0   0   0    1
## some    0   1   1   1   1   1   1   1   1   1    1
literal listener picks literal interpretation (uniformly at random):
\[ P_{LL}(c \mid m) = \text{Uniform}(c \mid \{ c' \mid m \text{ is true in } c' \} ) \]
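A minimal sketch of these two ingredients in base R (the object names `c_vals`, `msgs`, `sem`, and `P_LL` are mine, not from the source; the truth values are copied from the matrix above):

```r
# states: number of black circles out of 10
c_vals <- 0:10
msgs <- c("none", "one", "two", "three", "many", "most", "all", "some")

# literal truth values, one row per message
sem <- rbind(
  none  = as.numeric(c_vals == 0),
  one   = as.numeric(c_vals == 1),
  two   = as.numeric(c_vals == 2),
  three = as.numeric(c_vals == 3),
  many  = as.numeric(c_vals >= 5),
  most  = as.numeric(c_vals >= 6),
  all   = as.numeric(c_vals == 10),
  some  = as.numeric(c_vals >= 1)
)
colnames(sem) <- paste0("c=", c_vals)

# literal listener: uniform over the states in which m is true
P_LL <- sem / rowSums(sem)
```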
utility for true \(c\) and interpretation \(c'\):
\[ U(c, c' \ ; \ \pi) = \exp(- \pi \ (c - c')^2 ) \]
expected utility:
\[ \text{EU}(m, c \ ; \ \pi) = \sum_{c'} P_{LL}(c' \mid m) \ U(c, c' \ ; \ \pi) \]
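Continuing the sketch above (this assumes `P_LL` and `c_vals` from the previous block; `U` and `EU` are my names):

```r
# utility: peaked around the true state, peakedness controlled by pi
U <- function(c, c_prime, pi) exp(-pi * (c - c_prime)^2)

# expected utility of message m in state c under the literal listener
EU <- function(m, c, pi) sum(P_LL[m, ] * U(c, c_vals, pi))
```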
"Gricean"" speakers choose maximally informative/useful messages:
\[ m \in \arg \max_{m' \in M} \text{EU}(m', c \ ; \ \pi) \]
(cf. Benz 2006, Stalnaker 2006, Franke 2011, Frank & Goodman 2012)
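A sketch of this choice rule on top of `EU` from above; ties are returned as a set:

```r
# Gricean speaker: all messages that maximize expected utility in state c
speaker <- function(c, pi) {
  eu <- sapply(msgs, EU, c = c, pi = pi)
  names(eu)[eu == max(eu)]
}
speaker(c = 4, pi = 1)   # which message(s) best describe 4 black circles?
```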
scaled expected utility given set \(X\) of entertained alternatives:
\[ \text{EU}^*(c , X \ ; \ \pi) = \frac{\text{EU}(\textit{some}, c \ ; \ \pi) - \min_{m \in X} \text{EU}(m, c \ ; \ \pi)}{\max_{m \in X} \text{EU}(m, c \ ; \ \pi) - \min_{m \in X} \text{EU}(m, c \ ; \ \pi)} \]
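The same quantity as a sketch, reusing `EU` from above (note it is undefined when all alternatives in \(X\) tie exactly):

```r
# scaled expected utility of "some" against the entertained alternatives X
# (returns NaN when max and min coincide)
EU_star <- function(c, X, pi) {
  eus <- sapply(X, EU, c = c, pi = pi)
  (EU("some", c, pi) - min(eus)) / (max(eus) - min(eus))
}
```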
salience of alternatives \(m \in M \setminus \{ \textit{some} \}\):
\[ s_m \sim \text{Beta}(1,1) \]
probability of entertaining \(X \subseteq M\) (crudely assume independence!):
\[ P(X \mid \vec{s}) = \prod_{m \in X} s_m \prod_{m \in M \setminus X} \ (1-s_m) \]
expected relative felicity:
\[ \text{F}(c \ ; \ \vec{s}, \pi) = \sum_X P(X \mid \vec{s}) \ \text{EU}^*(c , X \ ; \ \pi) \]
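A sketch of these last two steps, under one simplifying assumption of mine: since salience weights are only defined for the alternatives, I enumerate subsets of \(M \setminus \{\textit{some}\}\) and skip the empty set; all names are mine:

```r
alts <- setdiff(msgs, "some")

# one draw of the salience weights s_m ~ Beta(1,1)
s <- rbeta(length(alts), 1, 1)
names(s) <- alts

# probability of entertaining exactly the set X (independence assumption)
P_X <- function(in_X) prod(s[in_X], 1 - s[!in_X])

# expected relative felicity, for this single draw of s
felicity <- function(c, pi) {
  grid <- expand.grid(rep(list(c(FALSE, TRUE)), length(alts)))
  total <- 0
  for (i in seq_len(nrow(grid))) {
    in_X <- unlist(grid[i, ])
    if (!any(in_X)) next          # skip the empty set: EU* needs alternatives
    total <- total + P_X(in_X) * EU_star(c, alts[in_X], pi)
  }
  total
}
sapply(c_vals, felicity, pi = 1)  # felicity of "some" across all states
```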
data
head(cars)
##   speed dist
## 1     4    2
## 2     4   10
## 3     7    4
## 4     7   22
## 5     8   16
## 6     9   10
model
\[\beta_0, \beta_1 \sim \text{Norm}(0, 1000)\] \[\sigma^2_{\epsilon} \sim \text{Unif}(0, 1000)\]
\[\mu_i = \beta_0 + \beta_1 x_i\] \[y_i \sim \text{Norm}(\mu_i, \sigma^2_{\epsilon})\]
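For orientation, a quick non-Bayesian counterpart of this regression in base R (the Bayesian fit itself would need a sampler such as JAGS or Stan):

```r
fit <- lm(dist ~ speed, data = cars)  # maximum-likelihood analogue
coef(fit)    # point estimates for beta_0 and beta_1
sigma(fit)   # residual standard deviation
```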
type | examples |
---|---|
metric | speed of a car, reading time |
binary | coin flip, truth-value judgement |
nominal | gender, political party |
ordinal | level of education, rating scale judgement |
counts | number of cars passing under a bridge in 1 hour |
type of \(y\) | (inverse) link function | likelihood function |
---|---|---|
metric | \(\mu = \eta\) | \(y \sim \text{Normal}(\mu, \sigma)\) |
binary | \(\mu = \text{logistic}(\eta, \theta, \gamma) = (1 + \exp(-\gamma (\eta - \theta)))^{-1}\) | \(y \sim \text{Binomial}(\mu)\) |
nominal | \(\mu_k = \text{soft-max}(\eta_k, \lambda) \propto \exp(\lambda \eta_k)\) | \(y \sim \text{Multinomial}(\vec{\mu})\) |
ordinal | \(\mu_k = \text{threshold-Phi}(\eta_k, \sigma, \vec{\delta})\) | \(y \sim \text{Multinomial}(\vec{\mu})\) |
count | \(\mu = \exp(\eta)\) | \(y \sim \text{Poisson}(\mu)\) |
(cf. Kruschke (2015), Doing Bayesian Data Analysis, Chapter 15)
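As a small illustration of the table above, base R's `glm()` covers some of these rows through its `family` argument; the datasets and formulas below are my picks, not from the source, and `glm()` uses the standard logistic link without the extra gain and threshold parameters:

```r
# binary y with a logistic link
fit_bin <- glm(am ~ mpg, data = mtcars, family = binomial())

# count y with a log link (dist is not a real count; illustration only)
fit_cnt <- glm(dist ~ speed, data = cars, family = poisson())
```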
\[\text{logistic}(\eta, \theta, \gamma) = \frac{1}{1 + \exp(-\gamma (\eta - \theta))}\]
threshold \(\theta\): the value of \(\eta\) at which \(\mu = 0.5\)
gain \(\gamma\): how steeply \(\mu\) changes around the threshold
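A small sketch of how the two parameters reshape the curve (the function definition and plotting values are mine):

```r
logistic <- function(eta, theta, gamma) 1 / (1 + exp(-gamma * (eta - theta)))

curve(logistic(x, theta = 0, gamma = 1), from = -5, to = 5, ylab = "mu")
curve(logistic(x, theta = 2, gamma = 1), add = TRUE, lty = 2)  # larger threshold: shifts right
curve(logistic(x, theta = 0, gamma = 4), add = TRUE, lty = 3)  # larger gain: gets steeper
```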
(cf. Kruschke (2015), Doing Bayesian Data Analysis, Chapter 23)
general
specific