@puterleat
Last active August 29, 2015 14:05
. binreg compnouc c.wingage , or n(20)
Iteration 1: deviance = 1118.075
Iteration 2: deviance = 1079.917
Iteration 3: deviance = 1079.575
Iteration 4: deviance = 1079.575
Iteration 5: deviance = 1079.575
Generalized linear models                         No. of obs      =        376
Optimization     : MQL Fisher scoring             Residual df     =        374
                   (IRLS EIM)                     Scale parameter =          1
Deviance         =  1079.574935                   (1/df) Deviance =   2.886564
Pearson          =  1080.320194                   (1/df) Pearson  =   2.888557
Variance function: V(u) = u*(1-u/20)              [Binomial]
Link function    : g(u) = ln(u/(20-u))            [Logit]
                                                  BIC             =  -1138.091
------------------------------------------------------------------------------
             |               EIM
    compnouc | Odds Ratio   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
     wingage |      1.145       0.008   20.45   0.000         1.130      1.160
       _cons |      0.110       0.020  -12.33   0.000         0.077      0.156
------------------------------------------------------------------------------
. margins, at((p25) wingage) at((p50) wingage) at((p75) wingage)

Adjusted predictions                              Number of obs   =        376
Model VCE    : EIM

Expression   : Predicted mean compnouc, predict()
1._at        : wingage         =          25 (p25)
2._at        : wingage         =          29 (p50)
3._at        : wingage         =          33 (p75)
------------------------------------------------------------------------------
             |            Delta-method
             |     Margin   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         _at |
           1 |     15.301       0.118  129.43   0.000        15.069     15.533
           2 |     16.969       0.090  188.74   0.000        16.793     17.145
           3 |     18.118       0.090  201.95   0.000        17.942     18.294
------------------------------------------------------------------------------
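These adjusted predictions can be recovered by hand from the binreg coefficient table: on the response scale the model is 20 * invlogit(ln(_cons OR) + ln(wingage OR) * wingage). A quick sketch in Python, using the rounded odds ratios printed above, so the results agree with the margins output only to about two decimal places:

```python
import math

# Rounded estimates copied from the binreg table above (odds-ratio scale)
OR_CONS, OR_AGE = 0.110, 1.145
N_TRIALS = 20  # binomial denominator from n(20)

def predicted_mean(wingage):
    """Predicted count: N * invlogit(b0 + b1*wingage), with b = ln(OR)."""
    xb = math.log(OR_CONS) + math.log(OR_AGE) * wingage
    return N_TRIALS / (1 + math.exp(-xb))

for q, age in [("p25", 25), ("p50", 29), ("p75", 33)]:
    print(f"{q}: wingage={age} -> {predicted_mean(age):.2f}")
```

With the full-precision coefficients this reproduces the margins column (15.301, 16.969, 18.118) exactly; with the 3-decimal odds ratios it lands within about 0.01 of each value.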
.
. meqrlogit compnouc c.wingage ||child:, bin(20) or
Refining starting values:
Iteration 0: log likelihood = -834.99187
Iteration 1: log likelihood = -826.86355 (not concave)
Iteration 2: log likelihood = -824.52117
Performing gradient-based optimization:
Iteration 0: log likelihood = -824.52117
Iteration 1: log likelihood = -823.96019
Iteration 2: log likelihood = -823.95862
Iteration 3: log likelihood = -823.95862
Mixed-effects logistic regression               Number of obs     =        376
                                                Binomial trials   =         20
Group variable: child                           Number of groups  =        376

                                                Obs per group: min =         1
                                                               avg =       1.0
                                                               max =         1

Integration points =   7                        Wald chi2(1)      =     158.02
Log likelihood = -823.95862                     Prob > chi2       =     0.0000
------------------------------------------------------------------------------
    compnouc | Odds Ratio   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
     wingage |      1.167       0.014   12.57   0.000         1.139      1.195
       _cons |      0.081       0.028   -7.33   0.000         0.041      0.159
------------------------------------------------------------------------------
------------------------------------------------------------------------------
  Random-effects Parameters  |   Estimate   Std. Err.     [95% Conf. Interval]
-----------------------------+------------------------------------------------
child: Identity              |
                  var(_cons) |      0.772       0.099         0.601      0.992
------------------------------------------------------------------------------
LR test vs. logistic regression: chibar2(01) = 309.77 Prob>=chibar2 = 0.0000
. margins, at((p25) wingage) at((p50) wingage) at((p75) wingage) predict(mu fixedonly)

Adjusted predictions                              Number of obs   =        376

Expression   : Predicted mean, fixed portion only, predict(mu fixedonly)
1._at        : wingage         =          25 (p25)
2._at        : wingage         =          29 (p50)
3._at        : wingage         =          33 (p75)
------------------------------------------------------------------------------
             |            Delta-method
             |     Margin   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         _at |
           1 |     15.844       0.224   70.64   0.000        15.404     16.283
           2 |     17.519       0.138  127.32   0.000        17.249     17.788
           3 |     18.579       0.119  155.60   0.000        18.345     18.813
------------------------------------------------------------------------------
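The fixed-portion-only margins can be checked the same way from the meqrlogit fixed effects: predict(mu fixedonly) sets the random intercept to zero, and because the inverse logit is nonlinear in that intercept, these conditional means differ from the marginal binreg predictions above. A rough check with the rounded odds ratios from the table (so expect agreement only to roughly one decimal place):

```python
import math

# Rounded fixed-effect odds ratios copied from the meqrlogit table above
OR_CONS, OR_AGE = 0.081, 1.167
N_TRIALS = 20  # binomial denominator from bin(20)

def fixed_only_mean(wingage):
    """predict(mu fixedonly): inverse-logit of the fixed linear predictor,
    with the child-level random intercept held at zero."""
    xb = math.log(OR_CONS) + math.log(OR_AGE) * wingage
    return N_TRIALS / (1 + math.exp(-xb))

for q, age in [("p25", 25), ("p50", 29), ("p75", 33)]:
    print(f"{q}: wingage={age} -> {fixed_only_mean(age):.2f}")
```

Note the conditional (fixed-only) predictions sit above the marginal binreg ones at every quartile, which is the usual attenuation pattern when the predicted proportion is above 0.5.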