THEORY AND METHODS OF INFERENCE

Second cycle degree in STATISTICAL SCIENCES

Campus: PADOVA

Language: English

Teaching period: Second Semester

Lecturer: ALESSANDRA SALVAN

Number of ECTS credits allocated: 9


Syllabus
Prerequisites: First-year Master's courses, especially Probability Theory and Statistics (Advanced).
Examination methods: 1/3 homework, 1/3 final written exam, 1/3 written and oral presentation reviewing one or two recent research papers.
Course unit contents: - Statistical models and uncertainty in inference. Statistical models. Paradigms of inference: the Bayesian and frequentist paradigms. Prior specification. Model specification (data variability). Levels of model specification. Problems of distribution (variability of statistics). Simulation. Asymptotic approximations and the delta method (a brief illustration follows the contents list).
- Generating functions, moment approximations, transformations. Moments, cumulants and their generating functions. Generating functions of sums of independent random variables. Edgeworth and Cornish-Fisher expansions. Notations Op(·) and op(·). Approximations of moments and transformations. Laplace approximation (stated after the contents list).
- Likelihood: observed and expected quantities, exact properties. Dominated statistical models. Sufficiency. Likelihood: observed quantities. Examples: a two-parameter model, grouped data, censored data, sequential sampling, Markov chains, Poisson processes. Likelihood and sufficiency. Invariance properties. Expected likelihood quantities and exact sampling properties. Reparameterizations.
- Likelihood inference: first-order asymptotics. Likelihood inference procedures. Consistency of the maximum likelihood estimator. Asymptotic distribution of the maximum likelihood estimator. Asymptotic distribution of the log-likelihood ratio (see the statement after the contents list): simple null hypothesis, likelihood confidence regions, asymptotically equivalent forms, non-null asymptotic distributions, composite null hypothesis (nuisance parameters), profile likelihood, asymptotically equivalent forms and one-sided versions, testing constraints on the components of the parameter. Non-regular models.
- Bayesian Inference. Non-informative priors. Inference based on the posterior distribution. Point estimation and credibility regions. Hypothesis testing and the Bayes factor. Linear models.
- Likelihood and Bayesian inference: numerical and graphical aspects in R. Scalar and vector parameter examples. EM algorithm (a sketch in R follows the contents list).
- Estimating equations and pseudo-likelihoods. Misspecification. Estimating equations. Quasi-likelihood. Composite likelihood. Empirical likelihood.
- Data and model reduction by marginalization and conditioning. Distribution constant statistics. Completeness. Ancillary statistics. Data and model reduction with nuisance parameters: lack of information with nuisance parameters, pseudo-likelihoods. Marginal likelihood. Conditional likelihood. Profile and integrated likelihoods.
- The frequency-decision paradigm. Statistical decision problems. Optimality in estimation: Cramér-Rao lower bound (stated after the contents list), asymptotic efficiency, Godambe efficiency, Rao-Blackwell-Lehmann-Scheffé theorem. Optimal tests: Neyman-Pearson lemma; composite hypotheses: families with monotone likelihood ratio, locally most powerful tests, two-sided alternatives, other constraint criteria. Optimal confidence regions.
- Exponential families, exponential dispersion families, generalized linear models. Exponential families of order 1. Mean value mapping and variance function. Multiparameter exponential families. Marginal and conditional distributions. Sufficiency and completeness. Likelihood and exponential families: likelihood quantities, conditional likelihood, profile likelihood and mixed parameterization. Procedures with finite-sample optimality properties. First-order asymptotic theory. Exponential dispersion families. Generalized linear models.
- Group families. Groups of transformations. Orbits and maximal invariants. Simple group families and conditional inference. Composite group families and marginal inference.
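
As a brief illustration of the delta method named in the first content item (standard scalar version): if \sqrt{n}\,(\hat{\theta}_n - \theta) converges in distribution to N(0, \sigma^2(\theta)) and g is differentiable at \theta with g'(\theta) \neq 0, then

\sqrt{n}\,\{g(\hat{\theta}_n) - g(\theta)\} converges in distribution to N(0, g'(\theta)^2 \sigma^2(\theta)).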
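
The Laplace approximation mentioned above, in its leading-order scalar form (stated here only for orientation): if h has a unique interior maximum at \hat{x} with h''(\hat{x}) < 0, then, as n \to \infty,

\int e^{n h(x)}\,dx \approx e^{n h(\hat{x})} \sqrt{2\pi / \{n\,|h''(\hat{x})|\}}.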
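
The first-order asymptotic result for the log-likelihood ratio under a simple null hypothesis (the standard Wilks statement): in a regular model with \theta \in R^p, log-likelihood \ell(\theta) and maximum likelihood estimate \hat{\theta}, the statistic

W(\theta_0) = 2\{\ell(\hat{\theta}) - \ell(\theta_0)\}

converges in distribution to \chi^2_p under H_0: \theta = \theta_0, so that \{\theta : W(\theta) \le \chi^2_{p; 1-\alpha}\} is an approximate (1 - \alpha) likelihood confidence region.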
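
The R code below is a minimal sketch of the EM algorithm for a two-component normal mixture, intended only to show the E and M steps; the simulated data, starting values and variable names are hypothetical.

set.seed(1)
y <- c(rnorm(100, 0, 1), rnorm(100, 4, 1))               # simulated sample (hypothetical)
p <- 0.5; mu <- c(min(y), max(y)); s <- c(sd(y), sd(y))  # crude starting values
for (it in 1:200) {
  # E-step: posterior probability that each observation comes from component 1
  d1 <- p * dnorm(y, mu[1], s[1])
  d2 <- (1 - p) * dnorm(y, mu[2], s[2])
  w  <- d1 / (d1 + d2)
  # M-step: weighted updates of mixing proportion, means and standard deviations
  p  <- mean(w)
  mu <- c(sum(w * y) / sum(w), sum((1 - w) * y) / sum(1 - w))
  s  <- c(sqrt(sum(w * (y - mu[1])^2) / sum(w)),
          sqrt(sum((1 - w) * (y - mu[2])^2) / sum(1 - w)))
}
c(p, mu, s)   # fitted mixing proportion, component means and standard deviations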
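
The Cramér-Rao lower bound referenced in the frequency-decision item, in its scalar-parameter form: in a regular model for data X with density f(x; \theta) and Fisher information i(\theta) = E_\theta[\{\partial \log f(X; \theta) / \partial \theta\}^2], any unbiased estimator T of \theta satisfies

Var_\theta(T) \ge 1 / i(\theta).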