Modelling and Analytics for Medicine and Life sciences Doctoral Training Centre
The MAML doctoral training programme focuses on innovative modelling, simulation and data analysis approaches for the biomedical sciences, working across disciplines to study real-world problems in medicine and biology.
Providing sufficient nutritious food to a growing population and maintaining a healthy society create major challenges across the life sciences, in areas including agriculture and food security, biotechnology, ageing, obesity and nutrition, cancer, drug resistance, chronic disease and mental health. Addressing such challenges necessitates the continuing development and implementation of a raft of new mathematical approaches and their integration with experimental and clinical science.
The programme will equip a cohort of graduate students with fit-for-purpose methodologies to tackle these applications.
Students will apply mathematical approaches (from areas such as dynamic modelling, informatics, network theory, scientific computation and uncertainty quantification) to research projects at the forefront of biomedical and life sciences identified through well-established collaborations with both academic and industrial partners.
MAML students will be provided with an excellent training environment within the Centre for Mathematical Medicine and Biology and their collaborative departments. Students will undertake tailored training, complemented by broadening, soft-skills, wet-lab (where appropriate) and student-led activities. There will also be opportunities for training and exchanges with world-leading partners.
Students will be based within the School of Mathematical Sciences, and co-supervised by one or more academics from partner schools.
To apply for a place on the programme, please consult the project list below.
Applicants for the MAML programme should have at least a 2:1 degree in mathematics, statistics or a similarly quantitative discipline (such as physics, engineering, or computer science).
For queries in relation to a particular project, please contact the supervisors associated with that project.
Modelling cellular spatial and lineage trajectories to predict mechanisms of germ cell and endoderm specification during gastrulation
Supervisors: Professor Markus Owen (School of Mathematical Sciences), Dr Ramiro Alberio (School of Biosciences) and Dr Matt Loose (School of Life Sciences)
Based on single-cell transcriptome data (scRNA-Seq), spatial cell trajectories from light-sheet microscopy, and in situ hybridisation to visualise the spatial distribution of morphogens, predictive models of cell specification can be used to understand the evolution of cellular identity during gastrulation. This project will develop an individual-based model for the movement of cells through the embryo, subject to defined ActivinA and BMP gradients, with internal dynamics modelling the relevant signalling pathways, and with the key output of interest being the individual cellular level of SOX17 expression. Importantly, we will simulate the process of obtaining scRNA-Seq data, to ascertain the extent to which the spatio-temporal history of the cells can be reconstructed. The model will predict the abundance of network components for a population of cells with different spatial trajectories and experiencing different morphogen levels, and will therefore predict the distribution of abundances across the population and correlations between different components, directly comparable to scRNA-Seq data.
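As an illustration of the kind of individual-based simulation the project would build, the sketch below moves a population of cells by an unbiased random walk along one embryonic axis, drives a toy "SOX17" variable using assumed exponential ActivinA/BMP profiles, and mimics an scRNA-Seq snapshot by discarding positions and adding noise. Every profile, rule and parameter here is illustrative, not fitted to data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed morphogen profiles along one embryonic axis (illustrative only)
def activin(x):
    return np.exp(-x)          # ActivinA high near x = 0

def bmp(x):
    return np.exp(-(1.0 - x))  # BMP high near x = 1

n_cells, n_steps, dt = 200, 500, 0.01
x = rng.uniform(0.0, 1.0, n_cells)   # cell positions
sox17 = np.zeros(n_cells)            # internal SOX17 level per cell

for _ in range(n_steps):
    # unbiased random walk with reflecting boundaries
    x = np.clip(x + np.sqrt(2.0 * 0.05 * dt) * rng.standard_normal(n_cells), 0.0, 1.0)
    # toy internal dynamics: SOX17 driven up by ActivinA, down by BMP
    production = activin(x) / (1.0 + bmp(x))
    sox17 += dt * (production - 0.5 * sox17)

# mimic an scRNA-Seq snapshot: positions discarded, multiplicative noise added
observed = sox17 * rng.lognormal(0.0, 0.1, n_cells)
```

In the project itself, the internal dynamics would be replaced by a model of the relevant signalling pathways, and the morphogen profiles constrained by in situ hybridisation data.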
Modelling single-cell heterogeneity to enhance bio-production by microorganisms
Supervisors: Professor John King (School of Mathematical Sciences), Dr Jamie Twycross (School of Computer Science) and Professor Simon Avery (School of Life Sciences)
Microbial synthesis is increasingly important for industrial production of useful chemicals and animal feedstocks, but strong phenotypic heterogeneity within cell populations compromises efficiency. Engineering-in homogeneity of metabolic pathways, for optimal activity in all cells, would significantly increase yields.
The MAML student will be trained to develop and employ both constraint-based metabolic and dynamical-systems models to identify target pathway steps for manipulation. Exemplar systems of study will be the production of ethanol and flavour compounds in yeast, and of proteins from bacterial cultures. The student will be trained in key aspects of microbiology as well as the above broadly applicable mathematical approaches.
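For readers unfamiliar with constraint-based modelling, a minimal flux-balance calculation can be posed as a linear programme. The toy network below (an uptake reaction, one conversion step, and competing product/biomass drains) is invented for illustration and bears no relation to a real yeast or bacterial reconstruction:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network -- metabolites: A (substrate), B (intermediate).
# Reactions: r0 uptake -> A; r1 A -> B; r2 B -> product; r3 B -> biomass.
S = np.array([
    [1, -1,  0,  0],   # steady-state balance of A
    [0,  1, -1, -1],   # steady-state balance of B
])
bounds = [(0, 10), (0, None), (0, None), (1, None)]  # uptake cap; minimum biomass

# maximise the product flux r2 (linprog minimises, hence the minus sign)
res = linprog(c=[0, 0, -1, 0], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
v = res.x   # optimal flux distribution
```

Here the optimum routes all available uptake through the pathway while reserving the minimum biomass flux; candidate pathway steps for manipulation can be explored by changing bounds or stoichiometry.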
New analytical and simulation tools in clinical oncology - fully funded by BAST Inc Ltd
Supervisors: Dr Gilles Stupfler (School of Mathematical Sciences), Dr Chris Brignell (School of Mathematical Sciences)
External partner: Joachim Grevel (BAST, Loughborough).
Cancer drug developers seek to show that satisfactory drug exposure increases the rate of beneficial patient responses, but without accounting for the risk of adverse effects. The student will develop fully parametric sub-distribution hazard models that account for such competing events, handle covariate information such as body weight and are more flexible than the standard proportional hazards model, so that a patient's individual risk of wasting limited life expectancy during treatment can easily be quantified and drug exposure optimised. Weekly supervisory meetings with BAST, with at least two years based there, will provide excellent training and career development opportunities.
Outcomes & impact: tools for patients, regulators and developers to better differentiate treatments based on the balance of efficacy and adverse effects.
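The competing-events setting can be illustrated with a simple simulation in which a beneficial response and an adverse event race against each other, each with an exposure-dependent (here constant, i.e. exponential) hazard; the hazard forms and all numbers are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_patients(n, exposure):
    """Competing risks with constant, exposure-dependent hazards (illustrative)."""
    h1 = 0.02 * exposure        # hazard of beneficial response (event 1)
    h2 = 0.005 * exposure**2    # hazard of adverse event (event 2), rising faster
    t = rng.exponential(1.0 / (h1 + h2), n)               # time to first event
    cause = np.where(rng.uniform(size=n) < h1 / (h1 + h2), 1, 2)
    return t, cause

t, cause = simulate_patients(10_000, exposure=2.0)

# nonparametric cumulative incidence of each event type by day 30
cif1 = np.mean((t <= 30) & (cause == 1))
cif2 = np.mean((t <= 30) & (cause == 2))
```

A parametric sub-distribution hazard model of the kind described above would fit smooth, covariate-dependent versions of these cumulative incidence functions, allowing the efficacy/adverse-event balance to be compared across exposure levels.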
Weight-dependent BCM
Supervisors: Professor Stephen Coombes (School of Mathematical Sciences), Professor Mark van Rossum (Chair and Director, Neural Computation Research Group)
The synaptic connections in neuronal networks determine to a large extent the computation that a network performs, and during learning and development synapses are modified to adapt the computation. For instance, during development of primary visual cortex, synapses adapt to the statistics of the visual environment, creating the well-known bar-like feature detectors. Given its importance for biology as well as for biologically inspired computing, modelling this process of synaptic plasticity has received a lot of attention. One of the best-known models, which has also received some experimental support, is the so-called BCM model (Bienenstock, Cooper and Munro 1982). Using the BCM model one can stimulate neurons with a set of patterns (for instance visual patterns) and study the development of the synaptic weights as a dynamical system (e.g. Udeigwe and Ermentrout 2017). The fixed points of the dynamics and the neuron's computation can then be compared to published data.
Recently we modified the BCM model so that it includes the experimental observation that strong synapses are harder to strengthen than weak ones (Debanne 1999). Using simulation of single neurons we discovered that this leads to interesting novel fixed points in the weight dynamics. This ran counter to earlier models where inclusion of saturation simplified the dynamics. The project will examine the consequences of this on the network level models of visual development using simulation and analytical work (non-linear dynamics and stability theory).
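A minimal sketch of BCM-type learning with a weight-dependent modification, assuming a linear neuron, two input patterns, and a soft bound that makes strong synapses harder to strengthen. The specific form of the bound and all parameters are illustrative, not the modification studied in the project:

```python
import numpy as np

rng = np.random.default_rng(1)

# two input patterns (e.g. two "visual" stimuli) presented at random
patterns = np.array([[1.0, 0.2],
                     [0.2, 1.0]])

w = np.array([0.5, 0.5])   # synaptic weights
theta = 1.0                # sliding modification threshold
dt, tau_theta = 0.01, 1.0

for _ in range(20_000):
    x = patterns[rng.integers(2)]
    y = w @ x                                   # linear neuron output
    phi = y * (y - theta)                       # BCM plasticity function
    # weight dependence: potentiation scaled down for strong synapses
    gain = np.where(phi > 0.0, 1.0 - w, 1.0)    # assumed soft bound at w = 1
    w = np.clip(w + dt * gain * phi * x, 0.0, 1.0)
    theta += (dt / tau_theta) * (y**2 - theta)  # threshold tracks E[y^2]
```

The fixed points of such weight dynamics, and how they change when the weight dependence is switched on, are exactly the objects the project would analyse at the network level.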
Candidates are expected to have a background in maths, computer science or physics, and an affinity to computational neuroscience.
The project will be supervised by Mark van Rossum for his expertise in plasticity and Steve Coombes for his expertise in dynamical systems.
Chemokine gradient development around lymphatic vessels during immune and inflammatory responses
Supervisors: Dr Bindi Brook (Mathematical Sciences), Professor Markus Owen (Mathematical Sciences)
The precisely orchestrated migration of leukocytes (white blood cells of the immune system) is a key feature of all immune and inflammatory responses, including those that occur in infectious diseases. Rapid leukocyte transport around the body is facilitated by fluid delivery in the blood and lymphatic vessels.
However, their guidance to key destinations in tissues, lymph nodes or other tissue spaces is driven by gradients in a family of small secreted proteins called chemokines. Despite major advances in understanding chemokine function, it is still unclear how chemokine gradients are formed, maintained and regulated in tissues.
In addition to molecular diffusion, chemokine binding to extra-cellular matrix (ECM) components is likely to play a key role. Interstitial fluid flow will also contribute to gradient formation, and in the case of chemokine production near blood or lymphatic vessels, the transmural movement of fluid is likely to advect chemokines further into tissues than would be possible by pure diffusion. ‘Atypical’ chemokine receptors (ACKRs), a small family of molecules that scavenge and destroy extracellular chemokines, are also likely to play a critical role in establishing, stabilizing and regulating chemokine gradients. The type of leukocyte migration induced depends on chemokine context, with soluble chemokine gradients directing chemotactic cell movement (migration up concentration gradients), while immobilized chemokine gradients induce integrin-dependent haptotaxis (migration up adhesion gradients).
The mechanisms that set up these gradients therefore include diffusion, advection (fluid movement), cell-mediated scavenging, and selective binding to extracellular matrix (ECM), some of which may be modified during inflammation. The aim of this project will be to develop mathematical models of chemokine gradient development during an immune or inflammatory response. The models will be developed in collaboration with immunologists based at the University of Glasgow (Profs Nibbs and Graham) and a bioengineer at Imperial College London (Prof James Moore), who will be quantifying chemokine transport dynamics using a novel microfluidic platform, to obtain a better understanding of chemokine transport and distribution in interstitial tissues around lymphatic vessels.
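The transport mechanisms listed above can be combined in a one-dimensional caricature: chemokine secreted at a vessel wall spreads by diffusion, is carried by transmural flow, and is scavenged at a constant rate (standing in for ACKR activity). All parameter values are illustrative:

```python
import numpy as np

# 1D tissue domain with chemokine secreted at a vessel wall (x = 0)
L, nx = 1.0, 101
dx = L / (nx - 1)
D, v, k = 1e-3, 5e-3, 0.1      # diffusion, transmural advection, scavenging (illustrative)
dt = 0.2 * dx**2 / D           # respects the explicit-scheme stability limit
c = np.zeros(nx)
source = np.zeros(nx)
source[0] = 1.0                # secretion at the vessel wall

for _ in range(5000):
    lap = np.zeros(nx)
    lap[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    adv = np.zeros(nx)
    adv[1:] = v * (c[1:] - c[:-1]) / dx   # upwind advection, flow away from the vessel
    c = c + dt * (D * lap - adv - k * c + source)
    c[-1] = 0.0                           # far-field sink
```

The quasi-steady profile decays with distance from the vessel; ECM binding would add a second, immobilised chemokine species to such a model, giving the haptotactic gradient discussed above.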
- Immune regulation by atypical chemokine receptors. Nibbs and Graham. Nature Reviews Immunology 13:815-829, 2013.
Analysing and interpreting neuroimaging data using mathematical frameworks for network dynamics
Supervisors: Professor Stephen Coombes (School of Mathematical Sciences), Dr Rachel Nicks (School of Mathematical Sciences), Dr Matthew Brookes (Sir Peter Mansfield Imaging Centre)
Modern non-invasive probes of human brain activity, such as magneto-encephalography, give high temporal resolution and increasingly improved spatial resolution. With such a detailed picture of the workings of the brain, it becomes possible to use mathematical modelling to establish increasingly complete mechanistic theories of spatio-temporal neuroimaging signals. There is an ever-expanding toolkit of mathematical techniques for addressing the dynamics of oscillatory neural networks allowing for the analysis of the interplay between local population dynamics and structural network connectivity in shaping emergent spatial functional connectivity patterns. This project will be primarily mathematical in nature, making use of notions from nonlinear dynamical systems and network theory, such as coupled-oscillator theory and phase-amplitude network dynamics. Using experimental data and data from the output of dynamical systems on networks with appropriate connectivities, we will obtain insights on structural connectivity (the underlying network) versus functional connectivity (constructed from similarity of real time series or from time-series output of oscillator models on networks). The project will focus in particular on developing techniques for the analysis of dynamics on “multi-layer networks” to better understand functional connectivity within and between frequency bands of neural oscillations.
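A standard starting point for the structural-versus-functional connectivity question is a network of coupled phase oscillators: the sketch below runs Kuramoto dynamics on an assumed two-module structural network and computes functional connectivity as the correlation between oscillatory signals (network size and all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# assumed structural network: two modules of three nodes, weakly linked
n = 6
A = np.zeros((n, n))
A[:3, :3] = 1.0
A[3:, 3:] = 1.0
A[2, 3] = A[3, 2] = 0.2
np.fill_diagonal(A, 0.0)

omega = rng.normal(1.0, 0.05, n)          # natural frequencies
theta = rng.uniform(0.0, 2.0 * np.pi, n)  # initial phases
dt, K = 0.01, 0.5

signals = []
for _ in range(20_000):
    coupling = (A * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta = theta + dt * (omega + K * coupling)   # Kuramoto dynamics
    signals.append(np.sin(theta))                 # oscillatory signal per node

X = np.array(signals[10_000:])   # discard the transient
FC = np.corrcoef(X.T)            # functional connectivity from signal similarity
```

Comparing FC with the structural matrix A, and extending the dynamics to phase-amplitude and multi-layer descriptions, is the kind of analysis the project would develop with mathematical rigour.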
- P Ashwin, S Coombes and R Nicks (2016) Mathematical frameworks for network dynamics in neuroscience. Journal of Mathematical Neuroscience. 6:2.
- J Hlinka and S Coombes (2012) Using Computational Models to Relate Structural and Functional Brain Connectivity, European Journal of Neuroscience, Vol 36, 2137—2145
- M J Brookes, P K Tewarie, B A E Hunt, S E Robson, L E Gascoyne, E B Liddle, P F Liddle and P G Morris (2016) A multi-layer network approach to MEG connectivity analysis, NeuroImage 132, 425-438
Optimising experiments for developing ion channel models
Supervisors: Dr Gary Mirams and Dr Simon Preston (School of Mathematical Sciences)
Background: in biological systems ion channel proteins sit in cell membranes and selectively allow the passage of particular types of ions, creating currents. Ion currents are important for many biological processes, for instance: regulating ionic concentrations within cells; passing signals (such as nerve impulses); or co-ordinating contraction of muscle (skeletal muscle and also the heart, diaphragm, gut, uterus etc.). Mathematical ion channel electrophysiology models have been used for thousands of studies since their development by Hodgkin & Huxley in 1952, and are the basis for whole research fields, such as cardiac modelling and brain modelling. It has been suggested that there are problems in identifying which set of equations is most appropriate as an ion channel model. Often it appears different structures and/or parameter values could fit the training data equally well, but may make different predictions in new situations.
Aim: we have been developing novel experimental designs to provide more information about ion channel behaviour from shorter experiments. We would like to improve our techniques – to describe the ion current and also to characterise drug binding to ion channels (which can physically block them and reduce the current that flows to zero, sometimes leading to fatal heart rhythm changes). It is difficult to measure the rate at which drug/ion channel binding occurs and whether it occurs when the channels are open, closed, or both. These factors may be crucial in determining whether novel pharmaceutical compounds are likely to have side effects or not, and there is a need to develop efficient ways to measure them.
Approach: this project will involve computational biophysical modelling (efficient numerical solution of nonlinear ODE systems); the application of statistical techniques to quantify our uncertainty in model parameters and model equations/structure; and some wet-lab electrophysiology experiments. We will design more information-rich experiments to reduce our uncertainty in the models we develop, and work closely with labs to test the experiments we design and improve them.
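As a concrete caricature of the biophysical modelling involved, the sketch below integrates a two-state (closed/open) channel gating model under a step voltage protocol. The rate expressions, conductance and reversal potential are invented for illustration, not a fitted ion channel model:

```python
import numpy as np

# two-state (closed <-> open) gating with illustrative, unfitted rate laws
def rates(V):
    k_open = 0.1 * np.exp(0.03 * V)     # opening rate (per ms)
    k_close = 0.05 * np.exp(-0.02 * V)  # closing rate (per ms)
    return k_open, k_close

g_max, E_rev = 1.0, -85.0   # conductance (nS) and reversal potential (mV), assumed
dt = 0.01                   # ms

def simulate(protocol):
    """Integrate open probability under a protocol of (voltage mV, duration ms) steps."""
    O, current = 0.0, []
    for V, dur in protocol:
        k1, k2 = rates(V)
        for _ in range(int(round(dur / dt))):
            O += dt * (k1 * (1.0 - O) - k2 * O)
            current.append(g_max * O * (V - E_rev))
    return np.array(current)

# hold at -80 mV, step to +40 mV, return to -40 mV
I = simulate([(-80.0, 50.0), (40.0, 100.0), (-40.0, 50.0)])
```

Optimal experimental design then amounts to choosing the protocol (the voltage steps) so that the simulated current is maximally informative about the rate parameters; drug binding would add further states to the gating scheme.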
- A. L. Hodgkin and A. F. Huxley, “A quantitative description of membrane current and its application to conduction and excitation in nerve,” J. Physiol., vol. 117, pp. 500–544, 1952.
- D. Noble, A. Garny, and P. J. Noble, “How the Hodgkin–Huxley equations inspired the Cardiac Physiome Project,” J. Physiol., vol. 590, no. 11, pp. 2613–2628, 2012.
- M. Fink and D. Noble, “Markov models for ion channels: versatility versus identifiability and speed,” Philos. Trans. A. Math. Phys. Eng. Sci., vol. 367, no. 1896, pp. 2161–79, Jun. 2009.
- G. R. Mirams, P. Pathmanathan, R. A. Gray, P. Challenor, and R. H. Clayton, “White paper: Uncertainty and variability in computational and mathematical models of cardiac physiology.,” J. Physiol., Mar. 2016.
Mathematical modelling of macromolecular capillary permeability
Supervisors: Dr Kenton Arkill (Medicine), Dr Reuben O’Dea (Maths), Professor David Bates (Medicine), Dr Matthew Hubbard (Maths)
The primary function of blood vessels is to transport molecules to tissues. In diseases such as cancer and diabetes this transport, particularly of large molecules such as albumin, can be an order of magnitude higher than normal.
The project is to model transient flow of macromolecules across the vascular wall in physiology and pathology. The doctoral student will join a team that includes medical researchers, biophysicists and mathematicians acquiring structural and functional data.
Detailed microscale models of vascular wall hydrodynamics and transport properties will be employed; in addition, powerful multiscale homogenisation techniques will be exploited that enable permeability and convection parameters on the nanoscale to be linked through the microscale into translatable information on the tissue scale. Computational simulations will be used to investigate and understand the model behaviour, including, for example, stochastic and multiphysics effects in the complex diffusion-convection nanoscale environment. The project will afford a great opportunity to form an information triangle where modelling outcomes will determine physiological experiments to feedback to the model. Furthermore, the primary results will inform medical researchers on potential molecular therapeutic targets.
Mathematical modelling of fatty liver disease
Supervisors: Dr Jonathan Wattis (Maths) and Professor Andy Salter (Biosciences).
The aim of this project is to produce detailed models of certain fat metabolism pathways in the liver. Metabolic Syndrome (MetSyn) represents a group of metabolic abnormalities associated with insulin resistance and leading to increased risk of developing cardiovascular disease (CVD) and type 2 diabetes. A further consequence of MetSyn is accumulation of lipid inside liver cells (Non-alcoholic Fatty Liver Disease, NAFLD) which can ultimately lead to cirrhosis, cancer or liver failure.
A greater understanding of the pathophysiology of MetSyn should help provide potential interventions to prevent NAFLD and more serious liver degeneration. There are many pathways which influence hepatic lipid accumulation. Existing models of fluxes through these pathways in the fasted and fed states have been fitted to healthy subjects, as well as a range of patients with varying degrees of insulin resistance. However, the longer term influences of the amount and type of food consumed on the accumulation of liver fat remain to be effectively modelled.
We will start with the production of VLDL in the liver and the role of specific enzymes, including stearoyl coenzyme A desaturase (SCD) and diacylglycerol acyltransferase enzymes (DGAT). SCD regulates the conversion of saturated to monounsaturated fatty acids and appears critical to the subsequent formation of triacylglycerol (TAG). DGAT are involved in the synthesis of TAG from diacylglycerol in the cytosol and endoplasmic reticulum, and the relative activity of different isoforms may be a key factor in whether TAG is stored intracellularly or secreted within VLDL. Other areas where more detailed modelling is required are the breakdown of chylomicrons into remnant particles and the delivery of dietary TAG to the liver within these remnants. This means that ingested fats are absorbed slowly and give rise to an input of fat over a prolonged period after eating. As these particles are digested their size decreases and their composition also changes, as TAG and cholesterol are removed separately. A main challenge in the project is to convert results on the dynamics which occur between one meal and the next into an understanding of the longer timescale of decades over which fat accumulates in the liver. Thus we will aim to investigate the effects of allowing certain parameters to vary slowly over significantly longer timescales. Such modelling may help to identify key points in the development of NAFLD and suggest potential nutritional/pharmaceutical interventions to slow, or even reverse, the process.
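The fast/slow structure described above (meal-to-meal dynamics with parameters drifting over much longer times) can be sketched with a single liver-TAG variable and an assumed slow drift standing in for, e.g., a changing DGAT isoform balance; everything here is illustrative:

```python
import numpy as np

dt = 0.01                # fraction of a day
liver_tag = 1.0          # fast variable: liver TAG (arbitrary units)
storage_bias = 0.0       # slow parameter, e.g. drifting DGAT isoform balance

daily = []
for day in range(200):
    for step in range(100):                 # one "day" of fast dynamics
        meal = 1.0 if step < 10 else 0.0    # brief dietary TAG input after eating
        secretion = 0.5 * liver_tag / (1.0 + storage_bias)  # VLDL export
        liver_tag += dt * (meal - secretion)
    storage_bias += 0.001                   # slow drift between days
    daily.append(liver_tag)
daily = np.array(daily)
```

The fast dynamics settle into a periodic meal-to-meal cycle whose level creeps upward as the slow parameter drifts, which is precisely the separation of timescales the project would exploit analytically.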
Modelling the alternative splicing of tissue growth regulators and its implications for tumour growth
Supervisors: Professor Markus Owen (School of Mathematical Sciences), Professor David Bates (Division of Cancer and Stem Cells, School of Medicine)
Normal and pathological tissue growth is regulated by diverse growth factors and related molecules, many of which are produced in cells via the transcription of associated genes and translation of mRNA to protein. In many cases, alternative splicing, regulated by splicing factors, leads to different isoforms of proteins, which can have different effects. This is particularly pertinent to angiogenesis, the process whereby new blood vessels are produced from existing ones, which is crucial in cancer and also diseases such as diabetic retinopathy.
Different isoforms of Vascular Endothelial Growth Factor (VEGF), whose balance is regulated by alternative splicing, can promote or inhibit angiogenesis. In fact, the relevant splicing factors seem to regulate alternative splicing of families of genes controlling cell death, growth factor signaling, the cell cycle, invasion and immune responses. Thus it is important to consider the overall effect of splicing factors in the context of a whole tissue where all these processes are modulated.
This project will focus on mathematical modelling of the various aspects of alternative growth factor splicing, regulation of angiogenesis, and tumour growth, with the following objectives:
- O1: model splicing control at the network level;
- O2: model the implications for tissue growth of altered splicing control;
- O3: couple O1 and O2 to predict the efficacy of interventions that modulate alternative splicing in cancer.
This will require the development and application of advanced mathematical and computational techniques to make the link from molecules to cells to tissues. A significant challenge is to use a blend of mathematical and statistical approaches to allow the translation of varied experimental data and knowledge into tractable parameterised mathematical frameworks that combine dynamics over a range of scales.
This project would also involve co-operation with Exonate, a biopharmaceutical company focussed on the discovery and development of small molecule drugs that modulate alternative mRNA splicing to address diseases of high unmet medical need. Exonate will provide relevant data and scientific input, and also contribute to the student's training, for example by hosting the student within the company on secondment.
- M R Owen et al. Cancer Res 71(8) 2826-37 (2011)
Understanding how variability in cellular oxidative stress networks impacts tissue sensitivity
Supervisors: Dr Etienne Farcot, Dr Simon Preston (School of Mathematical Sciences) and Dr Alistair Middleton (Unilever)
Unilever is a large multinational company which produces a wide range of personal care, homecare, food and refreshment products. In toxicological science, there has been a significant shift towards developing a mechanistic understanding of how certain compounds cause toxicity, in order to determine safe levels of exposure.
For many ingredients, cells can adapt to alterations in biological pathways provided the exposure level is sufficiently low. Examples include biological stress response pathways, of which there are approximately ten in humans, including oxidative stress, ER stress and DNA damage [2-5]. Models of these pathways have appeared in the literature, and are often composed of systems of nonlinear ordinary differential equations which are parameterised using in vitro data (i.e. data measured in the lab) from a single cell type. For example, HepG2 is a cell line that is often used in vitro as a surrogate for liver cells. However, it is likely (1) that responses will vary between different tissue types (liver cells may be more sensitive than skin cells, for example), and (2) that this variability is underpinned in large part by variations in the abundance of different stress response network components. While data are available on the variability between different tissue types, both in terms of sensitivity to different compounds and abundance, a systematic attempt at understanding whether one can combine abundance data and mathematical models to explain variations in sensitivity is still lacking. To this end we propose to use an existing mathematical model of oxidative stress, together with the available experimental data, to explore this question. A key aspect of the work will be to draw on techniques from Bayesian statistics to explore uncertainties in the data and model predictions.
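As a sketch of the Bayesian workflow envisaged, the example below fits a two-parameter caricature of a stress-response pathway (a production/degradation ODE with an analytic solution) to synthetic noisy data using random-walk Metropolis; the model, noise level and priors are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# toy pathway: dA/dt = k_prod - k_deg * A, with analytic solution A(t)
def model(k_prod, k_deg, t):
    return (k_prod / k_deg) * (1.0 - np.exp(-k_deg * t))

t = np.linspace(0.0, 10.0, 11)
data = model(2.0, 0.5, t) + rng.normal(0.0, 0.1, t.size)  # synthetic "in vitro" data

def log_post(theta):
    k_prod, k_deg = theta
    if k_prod <= 0.0 or k_deg <= 0.0:
        return -np.inf                            # flat prior on positive parameters
    resid = model(k_prod, k_deg, t) - data
    return -0.5 * np.sum(resid**2) / 0.1**2       # Gaussian likelihood

# random-walk Metropolis over (k_prod, k_deg)
theta = np.array([1.0, 1.0])
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples)
```

The posterior samples quantify how well the data constrain each parameter; in the project, tissue-specific abundance data would enter as priors or covariates in a far richer oxidative stress model.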
This project will be based in the School of Mathematical Sciences, with regular interactions with the industrial supervisor.
- Shah, I., Setzer, R. W., Jack, J., Houck, K. A., Judson, R. S., Knudsen, T. B., ... & Thomas, R. S. (2016). Using ToxCast™ data to reconstruct dynamic cell state trajectories and estimate toxicological points of departure. Environmental health perspectives, 124(7), 910
- Khalil, H. S., Goltsov, A., Langdon, S. P., Harrison, D. J., Bown, J., & Deeni, Y. (2015). Quantitative analysis of NRF2 pathway reveals key elements of the regulatory circuits underlying antioxidant response and proliferation of ovarian cancer cells. Journal of biotechnology, 202, 12-30
- Erguler, K., Pieri, M., & Deltas, C. (2013). A mathematical model of the unfolded protein stress response reveals the decision mechanism for recovery, adaptation and apoptosis. BMC systems biology, 7(1), 16
- Li, Z., Sun, B., Clewell, R. A., Andersen, M. E., & Zhang, Q. (2013). Dose response modeling of etoposide-induced DNA damage response. Toxicological sciences, kft259
- Simmons, S. O., Fan, C. Y., & Ramabhadran, R. (2009). Cellular stress response pathway system as a sentinel ensemble in toxicological screening. Toxicological sciences, kfp140
- Girolami, M. (2008). Bayesian inference for differential equations. Theoretical Computer Science, 408(1), 4-16
Learning the associations between brain connections and function/dysfunction
Supervisors: Professor Theo Kypraios (School of Mathematical Sciences), Professor Stam Sotiropoulos (School of Medicine)
Understanding the workings of the human brain is one of the most outstanding challenges of our time. In particular, determining factors that contribute to the individual signature of integrated cognitive function is of genuine interest to neuroscience, but also of paramount importance for neurological applications; characterizing the “normal” brain structure and function is key for characterizing abnormalities and approaching disease mechanisms. Non-invasive and in-vivo magnetic resonance imaging (MRI), as well as Magneto-encephalography (MEG), can uniquely shed light on these questions.
This project will capitalise on advances and data offered by the cornerstone Human Connectome Project (HCP) (www.humanconnectome.org), to which the principal supervisor (SS) has been a major contributor. We will build novel computational methodology for estimating connections using complementary MRI and MEG through state-of-the-art inference techniques. In particular, we will develop models for estimating network structure from multimodal data and we will explore causal interactions. Using data-driven exploratory analysis, we will then identify latent associations between brain organisation and function. This will further allow the extraction of summary imaging-derived measures with contextual relevance that could serve as potential markers for subsequently exploring pathology-induced abnormalities and dysfunction. For instance, we will identify behavioural traits predictive of psychiatric disorders and explore their associations with estimated connectivity.
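A simple baseline for the network-estimation component is partial correlation via the precision (inverse covariance) matrix; the sketch below recovers a sparse "structural" pattern planted in simulated multivariate data (the network and sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

# encode a sparse network directly in a precision matrix
P_true = np.eye(5)
P_true[0, 1] = P_true[1, 0] = -0.4   # direct connection between nodes 0 and 1
P_true[2, 3] = P_true[3, 2] = -0.4   # and between nodes 2 and 3
X = rng.multivariate_normal(np.zeros(5), np.linalg.inv(P_true), size=5000)

P_hat = np.linalg.inv(np.cov(X.T))   # estimated precision matrix
d = np.sqrt(np.diag(P_hat))
partial_corr = -P_hat / np.outer(d, d)   # off-diagonals: partial correlations
```

Large off-diagonal entries of the estimated partial correlation matrix flag direct connections; the project would go well beyond this baseline, combining MRI and MEG modalities and exploring causal structure.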
Quantitative multi-scale modelling of the human auditory cortex
Supervisors: Professor Steve Coombes (School of Mathematical Sciences), Dr Chris Sumner (MRC Institute of Hearing Research), Patrick May (Leibniz Institute, Magdeburg), Dr Katrin Krumbholz (MRC Institute of Hearing Research)
Understanding how the underlying sensory processing by neural networks in the human brain gives rise to sensory perception is a difficult problem.
It is straightforward to measure human perception via behavioural experiments, but even accessing, let alone understanding, the underlying signals in the brain is difficult. Non-invasive measures such as EEG and fMRI allow monitoring of aggregate responses, but they are dramatically limited in their resolution, either spatially or temporally. Relating non-invasive responses back to underlying neural activity necessitates solving an inverse problem.
Moreover, our knowledge about how individual neurons behave can only be drawn from animal experiments. There is currently no principled way for inferring the behaviour of underlying neural circuits from non-invasive measurements.
This project will address the problem of sensory perception by developing a novel computational model of one part of the brain: the auditory cortex. This model will have the power to simulate individual spiking neurons, large populations of neurons, and far-field electrical signals (EEG, MEG) that are normally accessible in humans. This forward-model will allow the testing of hypotheses about the possible ways in which neural activity could give rise to the non-invasive observations, which in turn, are linked to results from behavioural experiments.
This will be achieved by bringing together two existing models: a large scale firing rate model of multiple auditory cortical fields, constrained by all the known anatomy (May et al. 2015); and a modelling framework which allows a rigorous abstraction from spiking models of neurons to neural field models (Byrne et al. in press). These field models can capture the dynamics of extended regions of the brain (Coombes 2010), be projected onto surfaces, and folded in the manner of the cortical surface. From these, far-field potentials (EEG, MEG) can be predicted. Thus for the first time we aim to provide a computational model that can predict, in a principled way, non-invasive measurements from the responses of single neurons. This will function as a platform for theoretically linking various measures of neural activity to sound perception.
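A toy version of such a forward model: a small chain of firing-rate "fields" with slow adaptation, driven by a tone-like input at one end, whose summed activity stands in for a far-field (EEG/MEG-like) signal. The architecture and parameters are illustrative and far simpler than the May et al. (2015) model:

```python
import numpy as np

def f(u):
    return 1.0 / (1.0 + np.exp(-4.0 * (u - 0.5)))   # sigmoidal firing-rate function

n, dt = 5, 0.001
# local recurrence plus nearest-neighbour coupling between fields
W = 0.5 * np.eye(n) + 0.2 * np.eye(n, k=1) + 0.2 * np.eye(n, k=-1)
u = np.zeros(n)    # mean activity per field
a = np.zeros(n)    # slow adaptation variable (cf. synaptic adaptation, May et al. 2015)
tau_u, tau_a, g_a = 0.01, 0.3, 0.5

rates = []
for step in range(3000):
    stim = np.zeros(n)
    stim[0] = 1.0 if 500 <= step < 1500 else 0.0   # tone on at 0.5 s, off at 1.5 s
    du = (-u + W @ f(u) - g_a * a + stim) / tau_u
    da = (-a + f(u)) / tau_a
    u = u + dt * du
    a = a + dt * da
    rates.append(f(u))
rates = np.array(rates)
eeg = rates.sum(axis=1)   # crude far-field proxy: summed activity across fields
```

The project would replace this caricature with anatomically constrained cortical fields and a principled projection from neural activity to EEG/MEG sensors.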
- Coombes S (2010). Large-scale neural dynamics: Simple and complex, NeuroImage, Vol 52, 731–739.
- May PJ, Westö J, Tiitinen H. (2015). Computational modelling suggests that temporal integration results from synaptic adaptation in auditory cortex. Eur J Neurosci. 41:615-30.
- Á Byrne, M J Brookes and S Coombes 2016 (in press). A mean field model for movement induced changes in the beta rhythm, Journal of Computational Neuroscience.
Systems modelling of the coagulation cascade
Supervisors: Professor John King (School of Mathematical Sciences), Professor Alison Goodall (University of Leicester)
The coagulation cascade is a complex network of protein reactions that is central to the formation of blood clots. While clots are essential to prevent bleeding, when formed inappropriately they can lead to heart attacks and strokes. The assessment of coagulation is key in diagnosing thrombotic conditions, but routine assays do not presently correlate with an individual’s risk of thrombosis. The aims of this project are to develop a definitive mathematical model of the coagulation cascade, validate the model against experimental data describing coagulation profiles in both healthy individuals and patients with cardiovascular disease, and use the model to understand the key drivers of the differences seen in the data, providing insights into therapeutic markers.
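A minimal caricature illustrates the intended modelling style: mass-action kinetics in which activated factor X generates thrombin, thrombin feeds back on its own production, and inhibitors clear both species, producing the characteristic thrombin burst. The network and rate constants below are invented, not a validated coagulation model:

```python
import numpy as np

def rhs(y, k):
    X, Xa, II, IIa = y   # factor X / activated X, prothrombin / thrombin
    v_act = k['tf'] * X + k['fb'] * IIa * X   # trigger plus thrombin feedback
    return np.array([
        -v_act,
        v_act - k['inh'] * Xa,                 # inhibition of Xa
        -k['gen'] * Xa * II,
        k['gen'] * Xa * II - k['anti'] * IIa,  # antithrombin-like clearance
    ])

k = {'tf': 0.005, 'fb': 0.5, 'inh': 0.1, 'gen': 1.0, 'anti': 0.05}
y = np.array([1.0, 0.0, 1.0, 0.0])   # normalised initial zymogen levels
dt = 0.01
traj = [y.copy()]
for _ in range(6000):
    y = y + dt * rhs(y, k)           # explicit Euler is adequate at this dt
    traj.append(y.copy())
traj = np.array(traj)
thrombin = traj[:, 3]                # burst-and-decay thrombin curve
```

A definitive model would track the dozens of factors in the real cascade, with rate constants constrained by the experimental coagulation profiles described above.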
Data-driven reconstruction algorithms for Magnetic Resonance Elastography
Supervisors: Dr Marco Iglesias (School of Mathematical Sciences), Dr Daniele Avitabile (School of Mathematical Sciences) and Dr Deirdre McGrath (School of Medicine)
Magnetic Resonance Elastography (MRE) (1,2) is a powerful diagnostic imaging technique that measures changes in the biomechanical properties of biological tissue caused by disease. MRE research has recently begun at the Sir Peter Mansfield Imaging Centre, University of Nottingham, with the installation of an MRE system on the Philips 3-Tesla Ingenia Magnetic Resonance Imaging (MRI) scanner. MRE works by delivering mechanical waves to the tissue, which are measured using MRI, and these wave measurements are converted into estimated biomechanical properties using specialised reconstruction algorithms.
These algorithms solve an inverse problem: starting from MRI data, they estimate tissue biomechanical properties, thereby allowing the differentiation of healthy and diseased tissue. Accurate identification of the location and boundaries of diseased tissue is a main challenge for current reconstruction algorithms (3), which must assimilate large amounts of noisy MRI data.
We aim to develop novel reconstruction algorithms for MRE data, using Bayesian inversion approaches (4,5). These techniques make it possible to quantify how uncertainties in the data and in the modelling assumptions affect the quality of the reconstruction of tissue properties. The algorithms will be informed by, and validated with, data acquired at the Sir Peter Mansfield Imaging Centre.
The development of these methods has the potential to significantly improve MRE-based diagnosis by assimilating MRI data into a general class of heterogeneous and anisotropic biomechanical models.
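To give a cartoon of the Bayesian inversion idea, the sketch below applies a single perturbed-observation ensemble Kalman update to estimate a shear modulus from one noisy wave-speed measurement, loosely in the spirit of ensemble Kalman methods such as (5). The forward model, density and all numerical values are assumptions for illustration; real MRE inversion works with full spatial wave fields and PDE forward models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy shear-wave forward model (assumed): wave speed c = sqrt(mu / rho).
rho = 1000.0                          # tissue density, kg/m^3 (placeholder)
def forward(mu):
    return np.sqrt(mu / rho)          # predicted wave speed, m/s

mu_true = 3000.0                      # "true" shear modulus, Pa (made up)
noise_sd = 0.05
y_obs = forward(mu_true) + rng.normal(0.0, noise_sd)

# One perturbed-observation ensemble Kalman update on a prior ensemble.
J = 200
ens = rng.lognormal(mean=np.log(2000.0), sigma=0.5, size=J)  # prior draws
preds = forward(ens)
C_mg = np.cov(ens, preds)[0, 1]           # parameter-prediction covariance
C_gg = preds.var(ddof=1) + noise_sd**2    # prediction covariance + noise
gain = C_mg / C_gg                        # scalar Kalman gain
ens_post = ens + gain * (y_obs + rng.normal(0, noise_sd, J) - preds)
mu_est = ens_post.mean()                  # posterior-mean estimate of mu
```

The posterior ensemble both shifts towards the true modulus and tightens relative to the prior, which is exactly the uncertainty quantification the project would exploit on imaging-scale problems.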
Figure: example inversion from an in silico study simulating MRE in the brain (see ref. 6).
- Muthupillai R, Lomas DJ, Rossman PJ, Greenleaf JF, Manduca A, Ehman RL. Magnetic resonance elastography by direct visualization of propagating acoustic strain waves. Science (New York, NY) 1995;269(5232):1854-1857.
- Mariappan YK, Glaser KJ, Ehman RL. Magnetic resonance elastography: a review. Clinical anatomy (New York, NY) 2010;23(5):497-511.
- Doyley MM. Model-based elastography: a survey of approaches to the inverse elasticity problem. Physics in medicine and biology 2012;57(3):R35-73.
- Iglesias M, Lu Y, Stuart A. A Bayesian level set method for geometric inverse problems. Interfaces and Free Boundaries 2016;18:181-217.
- Iglesias M. A regularizing iterative ensemble Kalman method for PDE-constrained inverse problems. Inverse Problems 2016;32:025002.
- McGrath DM, Ravikumar N, Wilkinson ID, Frangi AF, Taylor ZA. Magnetic resonance elastography of the brain: An in silico study to determine the influence of cranial anatomy. Magnetic resonance in medicine 2016;76(2):645-662.
Multiscale modelling of signalling microdomains
Supervisors: Dr Ruediger Thul (School of Mathematical Sciences), Professor Stephen Coombes (School of Mathematical Sciences)
A key role of cells is to translate external signals into appropriate cellular responses. For example, when cells that line blood vessels experience weak stimulation, they initiate the expression of certain genes, while for strong stimuli, they begin to move. How cells accomplish such diverse responses is still an open question. What has become clear, though, is that so-called microdomains are vital for cellular decision-making.
Microdomains are small parts of a cell where molecular mediators and switches are concentrated in close proximity. This is advantageous since cell signalling intrinsically relies on molecules interacting, and when they are close together, signal transduction is more likely to succeed. In many cases, these signalling pathways rely on small molecules that diffuse through the microdomain and hence can carry information from one molecular partner to the next. To appreciate the full potential of signalling microdomains, it is crucial to have a comprehensive understanding of the dynamics of these diffusible messengers.
In this project, we will use a combination of semi-analytical and numerical techniques to develop three-dimensional models of signalling microdomains. In particular, we will investigate how the intracellular calcium concentration changes in space and time within microdomains, and how these changes affect signal transduction. Gaining deeper insights into microdomains is key to understanding healthy physiology, such as fertilisation and muscle contraction, as well as diseases such as immunodeficiency and neurological disorders. The model will be informed by experiments conducted at Oxford and Penn State University.
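A minimal sketch of the spatial side of such models is given below: one-dimensional diffusion of calcium released from an open channel at one end of a microdomain, solved by explicit finite differences. The geometry, diffusivity and boundary conditions are placeholder assumptions; the project itself targets three-dimensional domains with reaction terms.

```python
import numpy as np

# Illustrative 1-D calcium diffusion in a microdomain (all values assumed).
D = 220.0                     # Ca2+ diffusivity, um^2/s (order of magnitude)
L = 0.5                       # microdomain length, um
N = 51
dx = L / (N - 1)
dt = 0.2 * dx**2 / D          # respects the explicit stability limit

c = np.zeros(N)               # calcium concentration (arbitrary units)
c[0] = 1.0                    # open channel clamps Ca2+ high at x = 0

for _ in range(2000):
    # Forward-time, centred-space update of the interior points.
    c[1:-1] += dt * D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c[0], c[-1] = 1.0, 0.0    # channel source / absorbing far end

profile_mid = c[N // 2]       # concentration at the domain midpoint
```

The resulting profile decays monotonically away from the channel, illustrating how a microdomain can maintain steep local calcium gradients between molecular partners.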
Linking epidemiological and genomic data for infectious diseases
Supervisors: Prof Philip O'Neill (School of Mathematical Sciences), Dr Theodore Kypraios (School of Mathematical Sciences)
In the past few years, advances in sequencing technology and the reduction in associated costs have enabled scientists to obtain highly detailed genomic data on disease-causing pathogens on a scale never seen before. In addition to the inherent phylogenetic information contained in such data, combining genomic data with traditional epidemiological data (such as time series of case incidence) also provides an opportunity to perform microbial source attribution, i.e. determining the actual transmission pathway of the pathogen through a population.
These advances have prompted a corresponding surge of activity in the modelling and statistical analysis community, so that a number of methods and associated computer packages now exist to carry out source attribution, i.e. estimating who infected whom in a particular outbreak. All of these methods have limitations; a very common issue is that the models used to perform estimation are conditional upon the observed data, which can create estimation biases and lead to misleading results. In contrast, the method developed by Worby, Kypraios and O'Neill involves a model that explains how the data arose, overcoming such problems. This project is concerned with developing this approach further, both to (i) extend the idea to more complex model settings, relaxing certain technical assumptions, and (ii) to improve computational efficiency. A highly detailed data set on MRSA, provided by collaborators at Guy's and St Thomas' hospital trust, London, provides one opportunity for applying such methods.
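As a cartoon of genomic source attribution, the sketch below simulates a small outbreak in which each transmission adds a few unique mutations to the infector's genome, then guesses each case's infector as the genetically closest earlier case. This is purely illustrative and far simpler than the model-based approach of Worby, Kypraios and O'Neill; all parameters are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a toy transmission tree: case 0 is the index case, and each
# later case is infected by a uniformly chosen earlier case, inheriting
# that case's genome plus a few unique new mutations.
n_cases = 8
mut_rate = 3                          # new mutations per transmission (assumed)
genomes = [set()]                     # index case carries the reference genome
true_infector = [None]
counter = 0
for i in range(1, n_cases):
    src = int(rng.integers(0, i))     # infector chosen among earlier cases
    new = set(range(counter, counter + mut_rate))
    counter += mut_rate
    genomes.append(genomes[src] | new)
    true_infector.append(src)

def dist(a, b):
    return len(a ^ b)                 # symmetric-difference (SNP) distance

# Naive source attribution: pick the genetically closest earlier case.
guessed = [None] + [
    min(range(i), key=lambda j: dist(genomes[i], genomes[j]))
    for i in range(1, n_cases)
]
accuracy = np.mean([g == t for g, t in zip(guessed[1:], true_infector[1:])])
```

In this idealised setting the nearest-genome rule recovers the tree exactly; the research challenge arises precisely because real outbreaks violate these assumptions, with unobserved cases, within-host diversity and partial sampling.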
- Worby, C. J., O'Neill, P. D., Kypraios, T., Robotham, J. V., De Angelis, D., Cartwright, E. J. P., Peacock, S. J. and Cooper, B. S. (2016) Reconstructing transmission trees for communicable diseases using densely sampled genetic data. Annals of Applied Statistics 10(1), 395-417.