
THEORY AND METHODS OF INFERENCE
Prerequisites:

First-year Master's courses, especially Probability Theory and Statistics (Advanced). 
Examination methods:

1/3 homework, 1/3 final written exam, 1/3 written and oral presentation reviewing one or two recent research papers. 
Course unit contents:

 Statistical models and uncertainty in inference. Statistical models. Paradigms of inference: the Bayesian and frequentist paradigms. Prior specification. Model specification (data variability). Levels of model specification. Problems of distribution (variability of statistics). Simulation. Asymptotic approximations and delta method.
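The delta method listed above can be illustrated by a short simulation. The Exponential(2) model and the transformation g(x) = log x below are illustrative choices, not prescribed course material (the course itself works such examples in R; the sketch is in Python).

```python
import math
import random

# Delta method sketch (illustrative model, not course material): if
# sqrt(n) * (Xbar - mu) -> N(0, sigma^2), then for smooth g,
# sqrt(n) * (g(Xbar) - g(mu)) -> N(0, g'(mu)^2 * sigma^2).
# Here X_i ~ Exponential(rate = 2), so mu = sigma = 1/2, and g(x) = log x.

random.seed(1)
n, reps, rate = 200, 5000, 2.0
mu = sigma = 1.0 / rate

vals = []
for _ in range(reps):
    xbar = sum(random.expovariate(rate) for _ in range(n)) / n
    vals.append(math.log(xbar))

mean_v = sum(vals) / reps
sim_sd = (sum((v - mean_v) ** 2 for v in vals) / reps) ** 0.5
delta_sd = abs(1.0 / mu) * sigma / math.sqrt(n)   # |g'(mu)| * sigma / sqrt(n)

print(round(sim_sd, 4), round(delta_sd, 4))  # the two should nearly agree
```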
 Generating functions, moment approximations, transformations. Moments, cumulants and their generating functions. Generating functions of sums of independent random variables. Edgeworth and Cornish-Fisher expansions. Notations Op(·) and op(·). Approximations of moments and transformations. Laplace approximation.
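The link between the cumulant generating function and the first cumulants can be checked numerically; the Poisson example below is an illustrative choice, not course material.

```python
import math

# Sketch: for X ~ Poisson(lam), the moment generating function is
# M(t) = exp(lam * (e^t - 1)), so the cumulant generating function
# K(t) = log M(t) = lam * (e^t - 1), and every cumulant equals lam.
# We recover the first two cumulants by numerical differentiation at t = 0.

lam = 3.0
K = lambda t: lam * (math.exp(t) - 1.0)

h = 1e-4
k1 = (K(h) - K(-h)) / (2 * h)            # K'(0)  = first cumulant (mean)
k2 = (K(h) - 2 * K(0.0) + K(-h)) / h**2  # K''(0) = second cumulant (variance)

print(round(k1, 4), round(k2, 4))  # both approximately lam = 3
```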
 Likelihood: observed and expected quantities, exact properties. Dominated statistical models. Sufficiency. Likelihood: observed quantities. Examples: a two-parameter model, grouped data, censored data, sequential sampling, Markov chains, Poisson processes. Likelihood and sufficiency. Invariance properties. Expected likelihood quantities and exact sampling properties. Reparameterizations.
 Likelihood inference: first-order asymptotics. Likelihood inference procedures. Consistency of the maximum likelihood estimator. Asymptotic distribution of the maximum likelihood estimator. Asymptotic distribution of the log-likelihood ratio: simple null hypothesis, likelihood confidence regions, asymptotically equivalent forms, non-null asymptotic distributions, composite null hypothesis (nuisance parameters), profile likelihood, asymptotically equivalent forms and one-sided versions, testing constraints on the components of the parameter. Non-regular models.
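The asymptotic chi-squared distribution of the log-likelihood ratio under a simple null hypothesis can be checked by simulation; the Poisson model and the sample sizes below are illustrative assumptions, not course material.

```python
import math
import random

# Sketch of first-order likelihood asymptotics (illustrative Poisson model,
# simple null): with lam0 the true value, W = 2 * {l(lam_hat) - l(lam0)} is
# asymptotically chi-squared with 1 degree of freedom, so E[W] -> 1.

random.seed(7)

def poisson(lam):
    # Knuth's method, adequate for small lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def loglik(lam, xs):
    # Poisson log-likelihood up to a constant not involving lam
    return sum(x * math.log(lam) - lam for x in xs)

lam0, n, reps = 4.0, 100, 2000
ws = []
for _ in range(reps):
    xs = [poisson(lam0) for _ in range(n)]
    lam_hat = sum(xs) / n                 # Poisson MLE is the sample mean
    ws.append(2 * (loglik(lam_hat, xs) - loglik(lam0, xs)))

mean_w = sum(ws) / reps
print(round(mean_w, 2))  # approximately 1, the chi-squared(1) mean
```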
 Bayesian Inference. Non-informative priors. Inference based on the posterior distribution. Point estimation and credibility regions. Hypothesis testing and the Bayes factor. Linear models.
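Posterior inference and the Bayes factor can be made concrete in a conjugate example; the binomial data and the Beta(1, 1) prior below are hypothetical choices, not course material.

```python
import math

# Conjugate sketch: s successes in n binomial trials, Beta(a, b) prior on
# theta.  The posterior is Beta(a + s, b + n - s); the Bayes factor for
# H0: theta = 1/2 against H1: theta ~ Beta(1, 1) compares the two marginal
# likelihoods of the data.

def log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

n, s = 20, 14
a, b = 1.0, 1.0

post_mean = (a + s) / (a + b + n)        # posterior mean under Beta(1, 1)

# Marginal likelihoods (the binomial coefficient cancels in the ratio)
log_m0 = n * math.log(0.5)               # point null theta = 1/2
log_m1 = log_beta(a + s, b + n - s) - log_beta(a, b)
bf01 = math.exp(log_m0 - log_m1)

print(round(post_mean, 3), round(bf01, 3))
```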
 Likelihood and Bayesian inference: numerical and graphical aspects in R. Scalar and vector parameter examples. EM algorithm.
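A minimal EM iteration, assuming a two-component normal mixture with unit variances (an illustrative choice; the course develops such examples in R, and the Python sketch below only mirrors the E-step/M-step structure).

```python
import math
import random

# EM sketch for a two-component normal mixture with known unit variances:
# the E-step computes responsibilities, the M-step updates the mixing
# weight and the two component means.

random.seed(3)
data = [random.gauss(0.0, 1.0) for _ in range(150)] + \
       [random.gauss(4.0, 1.0) for _ in range(150)]

def phi(x, m):
    # N(m, 1) density
    return math.exp(-0.5 * (x - m) ** 2) / math.sqrt(2 * math.pi)

w, m1, m2 = 0.5, -1.0, 1.0              # crude starting values
for _ in range(100):
    # E-step: responsibility of component 1 for each observation
    r = [w * phi(x, m1) / (w * phi(x, m1) + (1 - w) * phi(x, m2))
         for x in data]
    # M-step: weighted updates of the weight and the means
    w = sum(r) / len(data)
    m1 = sum(ri * x for ri, x in zip(r, data)) / sum(r)
    m2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - sum(r))

print(round(w, 2), round(m1, 2), round(m2, 2))
```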
 Estimating equations and pseudo-likelihoods. Misspecification. Estimating equations. Quasi-likelihood. Composite likelihood. Empirical likelihood.
 Data and model reduction by marginalization and conditioning. Distribution constant statistics. Completeness. Ancillary statistics. Data and model reduction with nuisance parameters: lack of information with nuisance parameters, pseudo-likelihoods. Marginal likelihood. Conditional likelihood. Profile and integrated likelihoods.
 The frequency-decision paradigm. Statistical decision problems. Optimality in estimation: Cramér-Rao lower bound, asymptotic efficiency, Godambe efficiency, Rao-Blackwell-Lehmann-Scheffé theorem. Optimal tests: Neyman-Pearson lemma, composite hypotheses: families with monotone likelihood ratio, locally most powerful tests, two-sided alternatives, other constraint criteria. Optimal confidence regions.
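The Cramér-Rao lower bound mentioned above, stated for a regular model with a scalar parameter:

```latex
% For an unbiased estimator T of theta in a regular model,
\[
  \operatorname{Var}_\theta(T) \;\ge\; \frac{1}{i(\theta)},
  \qquad
  i(\theta) = \operatorname{E}_\theta\!\left[
    \left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{\!2}
  \right],
\]
% where i(theta) is the expected Fisher information; the maximum
% likelihood estimator attains the bound asymptotically (asymptotic
% efficiency).
```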
 Exponential families, Exponential dispersion families, Generalized linear models. Exponential families of order 1. Mean value mapping and variance function. Multiparameter exponential families. Marginal and conditional distributions. Sufficiency and completeness. Likelihood and exponential families: likelihood quantities, conditional likelihood, profile likelihood and mixed parameterization. Procedures with finite sample optimality properties. First-order asymptotic theory. Exponential dispersion families. Generalized linear models.
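The order-1 exponential family structure, mean value mapping, and variance function listed above can be summarised as:

```latex
% Natural (canonical) form of an exponential family of order 1:
\[
  p(x;\theta) = \exp\{\theta\, t(x) - K(\theta)\}\, h(x),
\]
% with mean value mapping and variance function
\[
  \mu = \tau(\theta) = K'(\theta), \qquad
  \operatorname{Var}_\theta\{t(X)\} = K''(\theta) = V(\mu);
\]
% e.g. for the Poisson family, t(x) = x, K(theta) = exp(theta),
% and V(mu) = mu.
```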
 Group families. Groups of transformations. Orbits and maximal invariants. Simple group families and conditional inference. Composite group families and marginal inference. 

