Foundations of Statistics
This course introduces the fundamental principles and techniques underlying modern statistical and data analysis. It covers the core foundations of statistical theory, consisting of:
- probability distributions and techniques;
- statistical concepts and methods;
- linear models.
The course highlights the importance of computers, and in particular statistical packages, in performing modern statistical analysis. You will be introduced to the statistical package R as a statistical and programming tool and will gain hands-on experience in interpreting and communicating its output.
Frequentist Statistical Inference
This module is concerned with frequentist (classical) statistical inference, both its theory and its applications, and builds on the fundamental ideas of statistics introduced in the module “Foundations of Statistics”.
The following topics are explored, along with the Delta Method:
- maximum likelihood estimation
- properties of estimators
- confidence intervals
- likelihood ratio tests
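The first topic above, maximum likelihood estimation, can be illustrated with a small sketch. The course itself uses R; the Python example below, with hypothetical simulated Poisson data, simply compares a crude numerical maximisation of the log-likelihood with the closed-form estimate (the sample mean).

```python
import math
import random
import statistics

# Illustrative sketch only: maximum likelihood for a Poisson rate.
# The data, rate, and grid are hypothetical choices for this example.

random.seed(3)

def poisson_sample(lam):
    # Knuth's algorithm for drawing a Poisson random variate
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

data = [poisson_sample(4.0) for _ in range(200)]

def log_lik(lam):
    # Poisson log-likelihood, up to the additive constant -sum(log k!)
    return sum(k * math.log(lam) - lam for k in data)

# Crude grid search over candidate rates in (0, 10]
grid = [0.01 * i for i in range(1, 1001)]
lam_hat = max(grid, key=log_lik)

print(f"numerical MLE: {lam_hat:.2f}, sample mean: {statistics.mean(data):.2f}")
```

Because the Poisson log-likelihood is concave, the grid maximiser lands next to the analytic MLE (the sample mean), which is the kind of agreement between theory and computation the module emphasises.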
There is emphasis on the exponential family of distributions, which includes many standard distributions such as the normal, Poisson, binomial and gamma. You will also explore how computers can be utilised to perform statistical inference for non-standard (i.e. analytically intractable) problems by applying innovative statistical and numerical methods.
Optimisation methods, the bootstrap algorithm and simulation techniques including Monte Carlo methods will be introduced in relation to problems of statistical inference. You will gain experience of linking the underlying statistical concepts to practical applications of the methodology.
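As a taste of the bootstrap algorithm mentioned above: the course works in R, but the idea can be sketched in Python. The data below are hypothetical, and the percentile interval shown is only one of several bootstrap confidence-interval constructions.

```python
import random
import statistics

# Illustrative sketch of the nonparametric bootstrap (hypothetical data).
random.seed(42)
data = [random.gauss(10, 2) for _ in range(100)]  # simulated sample

def bootstrap_ci(sample, stat=statistics.mean, n_boot=2000, alpha=0.05):
    """Percentile bootstrap confidence interval for a statistic."""
    boots = []
    for _ in range(n_boot):
        # resample the data with replacement, same size as the original
        resample = [random.choice(sample) for _ in sample]
        boots.append(stat(resample))
    boots.sort()
    lo = boots[int(alpha / 2 * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot)]
    return lo, hi

lo, hi = bootstrap_ci(data)
print(f"95% bootstrap CI for the mean: ({lo:.2f}, {hi:.2f})")
```

No distributional formula for the standard error is needed here; the resampling itself approximates the sampling distribution, which is exactly the computational escape route for analytically intractable problems.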
You will gain experience of using statistical software and interpreting its output.
Statistical Modelling of Discrete and Survival Data
This module develops the theory of the generalised linear model and its practical implementation. It builds upon and extends the linear model introduced in the Foundations of Statistics module.
The teaching extends the understanding and application of statistical methodology to the analysis of discrete (count and binary) data and survival models, which frequently occur in diverse applications. You will gain experience of using statistical software to perform exploratory data analysis and to apply generalised linear model methodology to a wide range of applications.
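To give a flavour of the generalised linear model for binary data: the module uses dedicated statistical software, but the sketch below fits a logistic regression (binomial response, logit link) to hypothetical simulated data by gradient ascent on the log-likelihood, in plain Python.

```python
import math
import random

# Illustrative sketch only: the data, true coefficients, learning rate,
# and iteration count are hypothetical choices for this example.
random.seed(0)
xs = [random.uniform(-3, 3) for _ in range(200)]
# Simulate binary responses from a true model: logit(p) = 0.5 + 1.5 * x
ys = [1 if random.random() < 1 / (1 + math.exp(-(0.5 + 1.5 * x))) else 0
      for x in xs]

b0, b1 = 0.0, 0.0     # intercept and slope estimates
lr = 0.5              # step size for gradient ascent
for _ in range(1000):
    g0 = g1 = 0.0
    for x, y in zip(xs, ys):
        p = 1 / (1 + math.exp(-(b0 + b1 * x)))  # fitted probability
        g0 += y - p                             # score for the intercept
        g1 += (y - p) * x                       # score for the slope
    b0 += lr * g0 / len(xs)
    b1 += lr * g1 / len(xs)

print(f"estimated coefficients: b0={b0:.2f}, b1={b1:.2f}")
```

Statistical packages fit the same model far more efficiently (typically by iteratively reweighted least squares) and also report standard errors and diagnostics, which is why the module pairs the theory with software practice.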
You will develop key statistical skills in interpreting and communicating your statistical analyses.
Bayesian Data Analysis: Theory, Applications and Computational Methods
This module is concerned with the second main theory of statistical inference, Bayesian inference. It complements the frequentist approach introduced in the Frequentist Statistical Inference module.
This module will provide a full description of Bayesian analysis and cover popular models, such as the normal distribution and inference for categorical data. Topics include:
- prior elicitation
- conjugate models
- marginal and predictive inference
- hierarchical models and model choice
Well-known classical procedures, such as point estimation and confidence intervals, will be compared with their Bayesian counterparts.
You will also explore how computers allow the easy implementation of standard but computationally intensive statistical methods, such as Markov chain Monte Carlo (MCMC), to obtain samples from a posterior distribution.
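The MCMC idea can be sketched briefly. The example below, a random-walk Metropolis sampler written in Python rather than the course's R, targets the posterior of a normal mean with known variance under a vague normal prior; the data, prior, and tuning choices are all hypothetical.

```python
import math
import random
import statistics

# Illustrative sketch only: random-walk Metropolis for the posterior of a
# normal mean mu, with known sigma = 1 and a vague N(0, 10^2) prior.
random.seed(1)
data = [random.gauss(2.0, 1.0) for _ in range(50)]  # hypothetical sample

def log_post(mu):
    # log prior N(0, 100) + log likelihood N(mu, 1), up to a constant
    log_prior = -mu * mu / (2 * 100)
    log_lik = -sum((x - mu) ** 2 for x in data) / 2
    return log_prior + log_lik

samples, mu = [], 0.0
for _ in range(5000):
    prop = mu + random.gauss(0, 0.5)          # random-walk proposal
    accept_prob = math.exp(min(0.0, log_post(prop) - log_post(mu)))
    if random.random() < accept_prob:
        mu = prop                             # accept the move
    samples.append(mu)                        # otherwise keep the old value

post_mean = statistics.mean(samples[1000:])   # discard burn-in
print(f"posterior mean estimate: {post_mean:.2f}")
```

Only the unnormalised posterior is ever evaluated; the normalising constant cancels in the acceptance ratio, which is what makes the method so broadly applicable.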
You will gain experience of linking the underlying statistical concepts to practical applications of the methodology and benefit from hands-on experience of using statistical software and interpreting its output.
There is an exit point at the end of the taught modules in year one.
If you leave after successfully completing 60 credits in year one, you will gain a Postgraduate Certificate qualification (PGCert).