This is a list of research opportunities for PhD Mathematics applicants interested in doing a PhD project in the School of Mathematical Sciences. Some projects are joint with other Schools.
Find out how to apply
Clicking on a member of staff's name will take you to their personal home page whereas clicking on a project title will show more details about that particular project.
A network of globally coupled quadratic integrate-and-fire neurons with conductance-based synapses has recently been shown to admit an exact mean field description using the Ott-Antonsen (OA) ansatz [1,2]. The resulting neuronal population model is ideally suited as a model for understanding local field potentials. This PhD project will initially treat finite-size effects around the asynchronous state using a Bogolyubov-Born-Green-Kirkwood-Yvon (BBGKY) hierarchy with an appropriate moment-closure approximation. It will then move on to consider a path-integral formalism to derive a perturbation expansion in the inverse system size to determine the evolution of system covariances. The final part of the project will consider generalisations of the OA ansatz to allow the treatment of more general nonlinear integrate-and-fire neurons, and in particular piecewise linear models with adaptation.
[1] S Coombes and Á Byrne 2019 Next generation neural mass models, Nonlinear Dynamics in Computational Neuroscience, PoliTO Springer Series, Ed. A Torcini and F Corinto, Springer, 1-16
[2] Á Byrne, M J Brookes and S Coombes 2017 A mean field model for movement induced changes in the beta rhythm, Journal of Computational Neuroscience, Vol 43, 143-158
[3] E J Hildebrand, M A Buice and C C Chow 2007 Kinetic theory of coupled oscillators, Physical Review Letters, 98:054101
[4] S Qiu and C C Chow 2018 Finite-size effects for spiking neural networks with spatially dependent coupling, Physical Review E, Vol 98, 062414
[5] R Nicks, L Chambon and S Coombes 2018 Clusters in nonsmooth oscillator networks, Physical Review E, Vol 97, 032213
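As a concrete starting point, the mean-field equations derived via the OA ansatz can be integrated numerically. The sketch below uses the Montbrió-Pazó-Roxin (current-based, rather than conductance-based) form of the next-generation neural mass model, with illustrative parameter values and a simple Euler scheme to locate an asynchronous (steady) state.

```python
import numpy as np

# Minimal sketch (illustrative parameters): the next-generation neural mass
# model obtained from the OA ansatz for quadratic integrate-and-fire neurons,
# in the Montbrio-Pazo-Roxin (current-based) form:
#   dr/dt = Delta/pi + 2*r*v
#   dv/dt = v**2 + eta_bar + J*r - (pi*r)**2
Delta, eta_bar, J = 1.0, -5.0, 15.0   # heterogeneity width, mean drive, coupling

def rhs(state):
    r, v = state
    return np.array([Delta / np.pi + 2.0 * r * v,
                     v**2 + eta_bar + J * r - (np.pi * r)**2])

# Euler integration towards an asynchronous (steady) state
dt, steps = 1e-3, 50_000
state = np.array([0.1, -1.0])
for _ in range(steps):
    state = state + dt * rhs(state)

r_star, v_star = state
print(f"asynchronous state: r* = {r_star:.4f}, v* = {v_star:.4f}")
```

Finite-size corrections to this deterministic description are precisely what the BBGKY and path-integral parts of the project would quantify.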
Hawking's 1974 prediction of black hole radiation continues to inspire the search for novel quantum phenomena associated with global properties of spacetime and with motion of observers in spacetime, as well as the search for laboratory systems that exhibit similar phenomena. At a fundamental level, a study of these phenomena provides guidance for developing theories of the quantum mechanical structure of spacetime, including the puzzle of the microphysical origin of black hole entropy. At a more practical level, a theoretical control of the phenomena may have applications in quantum information processing in situations where gravity and relative motion are significant, such as quantum communication via satellites.
Specific areas for a PhD project may include:
Magnetic materials can be described by a variational principle for the magnetisation vector, which is of fixed length in ferromagnetic materials. The micromagnetic energy also includes the stray field generated by the magnetisation. The resulting model is non-convex and non-local, so that even this comparatively simple model yields extremely rich multi-scale pattern formation. The geometry of thin films sometimes allows reduced theories to be derived. In this project, we study some thin film scalings and determine the possible ground states and their ranges of validity (using Gamma-convergence or Gamma-expansions), as well as studying the magnetisation dynamics.
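For orientation, one common dimensionless form of the micromagnetic energy, keeping only the exchange and stray-field terms mentioned above, is

```latex
E(m) \;=\; d^2 \int_{\Omega} |\nabla m|^2 \,\mathrm{d}x
\;+\; \int_{\mathbb{R}^3} |\nabla u_m|^2 \,\mathrm{d}x,
\qquad |m| = 1 \text{ in } \Omega,
```

where the stray-field potential $u_m$ solves $\Delta u_m = \operatorname{div}(m\,\chi_\Omega)$ distributionally; anisotropy and external-field (Zeeman) contributions have been omitted from this sketch.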
I have several projects in the following broad areas of mathematics:
If you are interested in any of these areas, please get in touch with me (email@example.com).
The aim of this project is to develop computational Bayesian techniques for the solution of geometric inverse problems that arise in a wide range of applications such as subsurface geophysics, manufacturing engineering and the built environment. Examples of geometric inverse problems that will be addressed with the techniques developed in this project are: (i) inference of fracture networks and/or conduits in Karst aquifers during the injection of CO2 for its geologic storage; (ii) detection of defects in reinforced preform during the resin infusion process in the fabrication of composite materials; (iii) inference of internal structures (e.g. cavities) in building structures such as walls with the aim of improving estimates of energy consumption. These problems have an underlying (forward) model described by Partial Differential Equation(s) (PDE) with input parameters associated with some (unknown) physical property of interest; the inverse problem is to infer this property from noisy observations of the solution of the PDE. In the context of problems (i)-(iii), sophisticated geometric parameterisations are often required to enable an accurate and realistic characterisation of these properties (e.g. channelised structures in Karst networks). This project will develop computational hierarchical Bayesian methodologies to infer those geometry-constrained properties within an infinite-dimensional Bayesian framework for PDE-constrained inverse problems.
Data analysis for real-life epidemics offers many challenges; one of the key issues is that infectious disease data are usually only partially observed. For example, although numbers of cases of a disease may be available, the actual pattern of spread between individuals is rarely known. This project is concerned with the development and application of methods for dealing with these problems, and involves using the latest methods in computational statistics (e.g. Markov chain Monte Carlo (MCMC), approximate Bayesian computation and sequential Monte Carlo methods).
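The partial-observation problem can be illustrated in a few lines: below, a rejection-sampling approximate Bayesian computation (ABC) scheme infers the infection rate of a stochastic SIR epidemic from nothing but the final outbreak size. All numerical values (population size, recovery rate, observed final size, tolerance) are illustrative.

```python
import random

# ABC rejection sampler for the infection rate beta of a stochastic SIR
# epidemic, using only the final size of the outbreak as data -- the
# actual pattern of spread is never observed.
N, gamma, observed_final_size, tol = 100, 1.0, 60, 5
random.seed(1)

def simulate_final_size(beta):
    s, i = N - 1, 1
    while i > 0:
        infection_rate = beta * s * i / N
        total = infection_rate + gamma * i
        if random.random() < infection_rate / total:
            s, i = s - 1, i + 1      # infection event
        else:
            i -= 1                   # recovery event
    return N - 1 - s                 # number infected beyond the index case

accepted = []
while len(accepted) < 200:
    beta = random.uniform(0.0, 5.0)          # flat prior on beta
    if abs(simulate_final_size(beta) - observed_final_size) <= tol:
        accepted.append(beta)

posterior_mean = sum(accepted) / len(accepted)
print(f"approximate posterior mean of beta: {posterior_mean:.2f}")
```

MCMC with data augmentation would target the same posterior exactly; the ABC version is shown here only because it fits in a short sketch.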
This project will be based at the University of Nottingham in the School of Mathematical Sciences and the Faculty of Engineering.
The use of fibre-reinforced composite materials in aerospace and automotive industries and other areas has seen significant growth over the last two decades. One of the main manufacturing processes for producing advanced composites is resin transfer moulding (RTM). The crucial stage of RTM is injection of resin into the mould cavity to fill empty spaces between fibres; the corresponding process is described by an elliptic PDE with moving boundaries. Imperfections of the preform result in uncertainty of its permeability, which can lead to defects in the final product. Consequently, uncertainty quantification (UQ) of composites' properties is essential for optimal RTM. One important UQ problem is the quantification of this uncertain permeability. The objectives of this PhD project include (i) constructing, justifying and testing efficient algorithms for the Bayesian inverse problem within the moving boundary setting and (ii) applying the algorithms to real data from composite laboratory experiments.
Eligibility/Entry Requirements: We require an enthusiastic graduate with a 1st class degree in Mathematics, preferably at MMath/MSc level (in exceptional circumstances a 2:1 degree, or equivalent, can be considered). We expect the successful applicant to have a background in PDEs, probability and statistics, and exceptional computational skills.
For any enquiries please email: Marco.Iglesias@nottingham.ac.uk or Michael.Tretyakov@nottingham.ac.uk or Mikhail.Matveev@nottingham.ac.uk.
This project will be jointly supervised by Dr Mikhail Matveev in the Faculty of Engineering.
Hypothesis testing is an important way to draw scientific conclusions from experimental data. However, models relevant in industrial settings (for example models that characterise manufacturing processes) are almost invariably non-linear in the model parameters, and hypotheses of interest often involve parameters that lie on the boundary of the parameter space; both features pose challenges for standard (asymptotic) approaches to hypothesis testing. We will develop methods based on the "bootstrap" -- a powerful approach in computational statistics that involves computing null distributions using simulated data -- to address hypothesis testing in challenging non-linear settings.
Motivated by the pursuit of providing combinatorially computable 4D topological field theories that lift existing 3D theories obtained from quantum groups, categorification has grown into an active area of research combining methods from topology, representation theory, and geometry. These applications have prompted efforts to develop a theory of higher representations, called 2-representation theory, for such structures. Since 2010, Mazorchuk-Miemietz and collaborators have developed a program that categorifies the representation theory of finite-dimensional algebras. Building on this program, new directions are to be explored linking to categorification of algebraic structures involving roots of unity or derived algebraic geometry through differentially graded structures.
Cell signalling effects have crucial roles to play in a vast range of biological processes, such as in controlling the virulence of bacterial infections or in determining the efficacy of treatments of many diseases. Moreover, they operate over a wide range of scales, from subcellular (e.g. in determining how a particular drug affects a specific type of cell) to organ or population (such as through the quorum sensing systems by which many bacteria determine whether or not to become virulent). There is therefore an urgent need to gain greater quantitative understanding of these highly complex systems, which are well-suited to mathematical study. Experience with the study of nonlinear dynamical systems would provide helpful background for such a project.
Heisenberg's uncertainty principle states that momentum and position cannot be sharp at the same time because there is a lower bound for the product of their uncertainties. Coherent states can be defined as the states that minimise this uncertainty -- in this sense they are as close as quantum mechanics allows to a classical point particle. When a quantum system starts in a coherent state, its expectation values follow the classical equations of motion while the shape of the wave function often changes only very slowly. Coherent states are an important tool for understanding the correspondence between quantum and classical dynamics.
In this project this correspondence will be analysed for a generalised quantum dynamics in which the Hamilton operator is not required to be Hermitian. Such dynamics can arise in practice as an effective description of an open quantum system with either decay or gain. Accordingly, the energy eigenvalues may have an imaginary part that describes the loss or gain. Recently there have also been suggestions that non-Hermitian Hamilton operators could play a fundamental role in quantum mechanics if the Hamilton operator remains symmetric with respect to a combined operation of parity P and time reversal T. Such PT-symmetric dynamics have a balance between gain and loss which can lead to real energy eigenvalues. The classical-to-quantum correspondence for such systems remains an open research topic, and this project will aim at getting a clear understanding of the underlying classical dynamics using coherent states as the main tool.
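For the standard Hermitian harmonic oscillator, the statements above can be made explicit: coherent states saturate the uncertainty bound and their position expectation follows the classical trajectory,

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
a\,|\alpha\rangle \;=\; \alpha\,|\alpha\rangle,
\qquad
\langle x\rangle_t \;=\; \sqrt{\frac{2\hbar}{m\omega}}\;|\alpha|\cos(\omega t - \varphi),
```

with $\alpha = |\alpha| e^{i\varphi}$. The project asks how much of this picture survives when the Hamilton operator is no longer Hermitian.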
After the groundbreaking works of V. Voevodsky, it became possible to work with algebraic varieties by completely topological methods. An important role in this context is played by the so-called Generalized Cohomology Theories. This includes classical algebraic K-theory, but also a rather modern (and more universal) Algebraic Cobordism theory. The study of such theories and cohomological operations on them is a fascinating subject. It has many applications to the classical questions from algebraic geometry, quadratic form theory, and other areas. One can mention, for example: the Rost degree formula, the problem of smoothing algebraic cycles, and u-invariants of fields. This is a new and rapidly developing area that offers many promising directions of research.
The aim of this project is to further develop the theory and numerical methods for compensated convex transforms introduced by the proposer, and to apply these tools to approximation, interpolation, reconstruction, image processing and singularity extraction problems arising from applied sciences and engineering.
Computational Finance is the key element for successful risk management at investment banks and hedge funds, and it is also a growing area on the interface between finance, computational mathematics and applied probability. Pricing and hedging financial derivatives, evaluating risks of default for financial products and firms, satisfying requirements of the Basel Accord, etc. - all require sophisticated modelling and reliable calibration of the models. These aims cannot be achieved without efficient numerical techniques, which form the area of computational finance. The project will aim at developing new, efficient computational techniques related to finance.
Eligibility/Entry Requirements: We require an enthusiastic graduate with a 1st class degree in Mathematics, preferably at MMath/MSc level (in exceptional circumstances a 2:1 degree, or equivalent, can be considered). We expect the successful applicant to have a good background in probability and stochastic analysis, some knowledge of finance, and exceptional computational skills.
For any enquiries please email: Michael.Tretyakov@nottingham.ac.uk
Computational Number Theory is a fairly recent part of pure mathematics, even though computations in number theory are a very old subject. Over the last few decades the field has changed dramatically with the arrival of modern, powerful and cheap computers. In the area of explicit computations on elliptic curves, two subjects have undergone great development recently: elliptic curves over finite fields (which are used for cryptography) and 'descent' methods on elliptic curves over global fields, such as the field of rational numbers.
For a given elliptic curve over a number field, it is a difficult question to decide whether there are infinitely many solutions over this field and, if so, to determine the rank of the Mordell-Weil group. Currently, only two algorithms for finding this rank are implemented: one is the descent method that goes back to Mordell, Selmer, Cassels,... and the other is based on the work of Gross, Zagier, Kolyvagin... using the link between elliptic curves and modular forms. While the first approach works very well over number fields of small degree, it becomes almost impossible to determine the rank of elliptic curves over number fields of larger degree. The second method is unfortunately not always applicable; in particular, the field must be either the field of rational numbers or a quadratic extension thereof.
There is another way of exploiting the relation between elliptic curves and modular forms by using the p-adic theory of modular forms and the so-called Iwasawa theory for elliptic curves. Results by Kato, Urban, Skinner give us a completely new algorithm for computing the rank and other invariants of the elliptic curve, but not much of this has actually been implemented. Possible PhD projects could concern the further development of these new methods and their implementation.
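On the computational side, the most accessible of these objects is an elliptic curve over a finite field. The sketch below brute-force counts the points on an illustrative curve y^2 = x^3 + 2x + 3 over F_101; serious implementations use far better algorithms (e.g. Schoof's), but the brute-force count already lets one check the Hasse bound.

```python
import math

# Brute-force point count on an elliptic curve y^2 = x^3 + a*x + b over the
# finite field F_p (the setting used in elliptic-curve cryptography).
# The curve parameters below are illustrative.
p, a, b = 101, 2, 3
assert (4 * a**3 + 27 * b**2) % p != 0   # curve must be non-singular

# Precompute, for each residue r, the number of y in F_p with y^2 = r
roots = {}
for y in range(p):
    r = y * y % p
    roots[r] = roots.get(r, 0) + 1

count = 1                                 # start with the point at infinity
for x in range(p):
    rhs = (x**3 + a * x + b) % p
    count += roots.get(rhs, 0)            # y-values lying on the curve at x

print(f"#E(F_{p}) = {count}")
```

By Hasse's theorem the count must satisfy |#E(F_p) - (p + 1)| <= 2*sqrt(p), which the brute-force answer can be checked against directly.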
Despite recent advances in the development of computational methods for fitting epidemic models to data, many of these methods work best in small-scale settings where the study population is not especially big or the models have relatively few parameters. There is a need to develop methods which are appropriate to large-scale settings. Furthermore, nearly all existing methods rely on parametric approaches (e.g. models based on specific underlying assumptions), but recent work has shown that Bayesian nonparametric approaches can be successfully adapted to this area. This project involves developing novel computationally efficient methods to fit both parametric and non-parametric models to data in situations where the existing methods are infeasible.
Man-made materials play a major role in healthcare, including extensive use as implants, which unfortunately have significant rates of infection. The rise of antimicrobial resistance makes this problem pressing, in that the infections arising are often untreatable by antibiotics and therefore often fatal. Modification of the surface texture or topography has been found to control bacterial surface colonisation, but we do not know why. This project will access a near-infinite range of shapes formed at planar surfaces using two-photon printing of polymers from a range of monomers. Through mining of this data, we aim to build a better understanding of the relationship between bacterial colonisation and shape that will enable better materials to be designed from first principles for use in healthcare. The student will develop and apply ideas and techniques from a variety of areas, including statistical shape analysis and machine learning, to address the questions of interest.
This project will be jointly supervised by Prof Morgan Alexander in the School of Pharmacy and Prof Ricky Wildman in the Faculty of Engineering.
In cognitive psychology, the eyes have been said to provide a window to the mind. Eye fixations, assessed with eye-tracking technology, are used to investigate reading times (RTs) in natural reading. In particular, eye-tracking allows researchers to probe the dependence of RTs for a given target word as a function of its context. Given that RTs are used as a proxy for the cognitive demands of reading – words that are easy to understand are read more quickly than those that are harder to process – these studies shed light on how different contexts impact on the understanding of a text. For example, how do people read and understand articles on the BBC, legal documents and letters from the NHS?
Traditionally, researchers have only focussed on RTs for particular target words that have been manipulated in a study, ignoring the RTs of the remaining words. This leaves an extremely rich dataset of RTs completely untapped. In this project, we will process the RTs from a large number of eye-tracking experiments collected at the University of Nottingham. Crucially, the project will deliver a principled data-cleaning algorithm. This is particularly important since different labs pre-process their data differently, which poses a substantial challenge to reproducing their results and to comparing results from different labs.
Once EMNED (English Mathematics Nottingham Eye-tracking Database) is created, the data will be analysed with several reading models including the EZ reader to gain insight into reading in a natural context. The project will be undertaken in close collaboration with Prof Kathy Conklin in the School of English.
Polymers are very long chain molecules and many of their unique properties depend upon their long chain nature. Like simple fluids, many polymer fluids crystallise when cooled. However, the crystallisation process is complicated by the way the constituent chains are connected, leading to many curious and unexplained phenomena. Furthermore, if a polymer fluid is placed under flow, this strongly affects both the ease with which the polymer crystallises and the arrangement of the polymer chains within the resulting crystal. This project will develop and solve models for polymer dynamics and phase transitions using a range of analytical, numerical and stochastic techniques, with the ultimate aim of improving our understanding of polymer crystallisation. The project offers the opportunity to collaborate with a wide range of scientists working in the field, including several world-leading experimental groups.
Project in collaboration with Prof Tom Hudson (University of Warwick), Prof Lorenzo Rovigatti (Roma La Sapienza) and Dr Nicodemo Di Pasquale (University of Manchester).
Many theoretical tools have recently been developed to reduce the complexity of high-dimensional non-linear ODEs or highly resolved multiscale PDEs. These are now of enormous importance in computational chemistry, continuum mechanics, fluid dynamics, and dynamical systems in general. One of these bottom-up formal approaches is the Mori-Zwanzig projection formalism for dynamical systems. At the same time, data-driven top-down methods have been widely studied in machine learning and in numerical analysis. In this project, we aim to connect these theoretical and numerical tools to make them applicable to practical problems, such as the molecular dynamics simulation of complex molecule chains or the relaxation to equilibrium of non-linear reaction-diffusion equations. In the first case, we can rely on the Hamiltonian structure of the full-resolution model, while the latter can be analysed through model decomposition or spectral analysis. The objective is to develop and implement flexible numerical approaches to the model reduction of different model problems, by combining analytical derivations with numerical simulation data.
Pre-requisites: Statistical mechanics, ODEs/PDEs, Programming, Numerical methods
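The Mori-Zwanzig idea can be seen in miniature on a linear two-variable system, where the projected equation for the resolved variable is exactly closed by a memory (convolution) term. The sketch below compares the full model with this reduced equation; the coefficients are illustrative and the discretisations (Euler plus a rectangle rule for the memory integral) are deliberately crude.

```python
import numpy as np

# Mori-Zwanzig reduction of a linear two-variable system dz/dt = A z:
# projecting onto the resolved variable x gives the exact closed equation
#   dx/dt = A11*x + integral_0^t K(t - s) x(s) ds,
# with memory kernel K(t) = A12 * exp(A22 * t) * A21 (and no fluctuating
# force here, since the unresolved variable starts at y(0) = 0).
A11, A12, A21, A22 = -1.0, 1.0, 1.0, -3.0   # illustrative coefficients
dt, T = 1e-3, 5.0
n = int(T / dt)

# Full model, explicit Euler
x_full, y = np.empty(n + 1), 0.0
x_full[0] = 1.0
for k in range(n):
    x, yk = x_full[k], y
    x_full[k + 1] = x + dt * (A11 * x + A12 * yk)
    y = yk + dt * (A21 * x + A22 * yk)

# Reduced model with memory, Euler + rectangle rule for the convolution
x_red = np.empty(n + 1)
x_red[0] = 1.0
for k in range(n):
    s = np.arange(k + 1) * dt
    memory = dt * np.sum(A12 * np.exp(A22 * (k * dt - s)) * A21 * x_red[:k + 1])
    x_red[k + 1] = x_red[k] + dt * (A11 * x_red[k] + memory)

err = np.max(np.abs(x_full - x_red))
print(f"max |x_full - x_reduced| = {err:.2e}")
```

In realistic applications the kernel is not known in closed form; approximating or learning it (e.g. with a Markovian closure or from data) is exactly where the bottom-up and top-down approaches of the project meet.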
The aim of this project is to understand the transition from healthy airways to diseased (asthmatic) airways and to predict disease progression. This will be achieved by developing models that combine mechanistic and data-driven components, trained using in-vivo/in-vitro datasets, resulting in a multi-scale organ-level network model.
Introduction
Asthma is a chronic disease affecting ~300 million people worldwide. The annual financial burden in the UK is £2.3 billion, with 80% of this spent on the 20% of people with the most severe and poorly controlled asthma. The disease is characterized by inflammation, airway hyper-responsiveness (causing rapid bronchoconstriction) and airway remodelling (structural changes to the epithelium, extra-cellular matrix and airway smooth muscle within the airway wall). However, it is not clear how these mechanisms are linked, and whether the latter two are causes or symptoms of the disease.
Previously, we have undertaken a joint experimental and theoretical study to better understand the links. We developed multiphase morphoelastic-growth partial differential equation (PDE) models linking inflammation to bronchoconstriction and remodelling for the first time. In parallel, we performed in vitro and in vivo studies to obtain an unprecedented spatio-temporal dataset of airway geometry, mechanics and constituent changes in ~2000 airways from a chronic mouse model of asthma. However, the mechanistic models in their current form rely on empirical descriptions of how cell processes such as proliferation and apoptosis respond to tissue stress and inflammatory mediators, that are not fully informed by the animal data and fail to account for biological variability in vivo.
In this project, we will augment existing mechanistic models with data-driven components to develop a biophysically-informed machine learning/mechanistic model hybrid to reveal non-linear dependencies of cell processes on stress/inflammation. This will help us better understand the emergent system dynamics and identify underlying pathogenic processes.
Outcomes
Through multidisciplinary collaboration between applied mathematicians, statisticians, imaging experts and respiratory medicine clinicians, we will develop a biologically- and physically-informed model that will provide insight into disease pathogenesis and progression, as well as predict the effect of biological/mechanical interventions.
Scientific Machine Learning is a rapidly growing discipline within the Computational and Data Sciences. It combines Scientific Computation, which focusses on the numerical simulation of mathematical models from the applied sciences, and Machine Learning, which focusses on algorithms for mathematical models that are data driven. While large-scale numerical simulations are extremely important for predictions in real-world problems, they can be computationally excessive, requiring tremendous computing power. This is where machine-learning algorithms can provide a much-needed solution: reduced-order models can be learned that allow huge savings in computational cost while remaining accurate in relevant quantities of interest. In this project, the student will explore the above Scientific Machine Learning paradigm by utilising the power of Deep Neural Networks to systematically construct optimal reduced-order models for Partial Differential Equations.
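As a minimal non-neural stand-in for this paradigm, the sketch below builds a projection-based reduced-order model for the 1D heat equation from solution snapshots (POD via an SVD); a deep-network approach would aim to learn a comparable low-dimensional model. All discretisation sizes are illustrative.

```python
import numpy as np

# POD-Galerkin reduced-order model for the 1D heat equation u_t = u_xx
# with homogeneous Dirichlet boundary conditions (illustrative sizes).
nx, nt, dt = 100, 500, 2e-5
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]

# Second-difference operator; zero rows pin the boundary values
L = (np.diag(-2.0 * np.ones(nx)) + np.diag(np.ones(nx - 1), 1)
     + np.diag(np.ones(nx - 1), -1)) / dx**2
L[0, :] = 0.0
L[-1, :] = 0.0

# Full-order simulation, collecting snapshots
u = np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x)
snapshots = [u.copy()]
for _ in range(nt):
    u = u + dt * (L @ u)
    snapshots.append(u.copy())
S = np.column_stack(snapshots)

# POD: leading left-singular vectors of the snapshot matrix span the basis
U, sing, _ = np.linalg.svd(S, full_matrices=False)
r = 2                                  # two modes suffice for this solution
V = U[:, :r]
L_red = V.T @ L @ V                    # r x r reduced operator

# Run the reduced model and lift back to the full space
a = V.T @ S[:, 0]
for _ in range(nt):
    a = a + dt * (L_red @ a)
u_rom = V @ a

err = np.linalg.norm(u_rom - S[:, -1]) / np.linalg.norm(S[:, -1])
print(f"relative ROM error with {r} modes: {err:.2e}")
```

The reduced system has 2 degrees of freedom instead of 100 yet reproduces the final state accurately, because the solution lives in a low-dimensional subspace; the project's question is how to find such models systematically with deep networks when no simple linear subspace exists.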
Derived algebraic geometry is a powerful geometric framework which plays an increasingly important role in both the foundations of algebraic geometry and in mathematical physics. It introduces a refined concept of "space", the so-called derived stacks, which is capable of correctly describing geometric situations that are problematic in traditional approaches, such as non-transversal intersections and quotients by non-free group actions.
This project is about applying the novel techniques of derived algebraic geometry to simple problems in mathematical physics, with a particular focus on shifted Poisson geometry and deformation quantization.
Rechargeable batteries and other energy storage technologies are key elements for reaching a sustainable carbon-free energy market. The increasing usage of batteries needs to be supported by more accurate and faster mathematical models to be integrated into online control units. State-of-the-art battery models for control applications are typically Ordinary Differential Equations (ODEs), built by analogy with equivalent electrical circuits. Despite their computational simplicity, these models fail to capture the physical meaning and the dependence of the parameters on the underlying chemical-physical processes. Another mostly unresolved issue is the simulation of irreversible and complex non-linear phenomena such as fast (dis)charge and degradation.
In this project, we aim to derive new efficient system-scale models, as an alternative to classical equivalent circuit models, to enable the fast, yet accurate, simulation of short and long-term behaviour of lithium-ion cells. By projecting the PDE continuum models of the porous electrode theory onto a low-dimensional manifold, we aim to derive reduced equations that can retain the interesting features of the full model (e.g., memory, non-linearities). Since a full first-principles characterisation of these models is out of the question, the identifiability of the parameters from real data will also be investigated.
This project will see the participation of several academic and industrial partners in the UK and overseas.
We are living in the big data era, and the new challenge for a statistician is to provide efficient tools to extract valid information from a large volume of data. An accurate statistical analysis can give answers that help the practitioner to improve the existing methodologies used to address a particular problem. New technologies and networks allow the collection of datasets that exhibit increasingly complex features. Nowadays many datasets are huge and heterogeneous: data heterogeneity appears when the sample comes from at least two different populations. An important consequence of heterogeneity is model uncertainty. If the observations in the sample can be generated by different models, this will increase the uncertainty of the forecast of a future observation.

The standard Bayesian setting implicitly assumes that the data are exchangeable. Roughly speaking, this means that the probability of observing a finite set of observations is not affected by the order of appearance of the observations. Exchangeability can be a restrictive assumption for heterogeneous data. A more general and appropriate probability model which can describe data heterogeneity is one that assumes that the observations are partially exchangeable, that is, the observations are exchangeable only within a certain group.

This project will propose novel models to deal with model uncertainty and data heterogeneity. This will be achieved by proposing a Bayesian nonparametric approach in a partially exchangeable setting. From a technical point of view, partial exchangeability can be expressed in terms of a vector of random probability measures. This project will focus on a recent construction introduced by the PI in which the vector is defined by normalising Compound Random Measures (CoRMs). CoRMs are a flexible and tractable framework for many dependent random measures, including many of the superposition and Lévy copula approaches.
The project will develop the theory and application of vectors of random probability measures in Bayesian nonparametrics with a special focus on CoRMs. Some parts of this project will be developed in collaboration with Prof. Jim Griffin (UCL) and Dr Alan Riva-Palacio (Universidad Nacional Autónoma de México).
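CoRMs themselves are beyond a short example, but the exchangeable building block they generalise can be illustrated in a few lines: the Chinese restaurant process below samples the random partition induced by a Dirichlet process, the canonical Bayesian nonparametric model for a single exchangeable group (the concentration parameter is illustrative).

```python
import random

# Chinese restaurant process: sequentially assign n observations to
# clusters ("tables"); observation i + 1 joins an existing cluster with
# probability proportional to its size, or starts a new one with
# probability proportional to alpha. The induced partition is exchangeable.
random.seed(42)
alpha = 2.0          # concentration parameter (illustrative)
n = 1000             # number of observations

table_sizes = []
for i in range(n):
    u = random.uniform(0.0, i + alpha)
    acc = 0.0
    for k, size in enumerate(table_sizes):
        acc += size
        if u < acc:
            table_sizes[k] += 1      # join existing cluster k
            break
    else:
        table_sizes.append(1)        # open a new cluster

# The expected number of clusters grows like alpha * log(n)
print(f"{len(table_sizes)} clusters among {n} observations")
```

Partially exchangeable models replace this single random measure with a vector of dependent random measures, one per group, which is where the CoRM construction enters.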
Many physical and chemical processes, typified by those related to fluid flow, can be modelled mathematically using partial differential equations. These can usually only be solved in the simplest of situations, but solutions in far more complex cases can be approximated using numerical and computational techniques. Traditional approaches to providing these computational simulations have typically modelled the evolution of the system by approximating the equations on a uniform mesh of points covering a domain with a fixed boundary. However, many situations (consider the spreading of a droplet, for example), naturally suggest a domain which evolves with the flow, while the main focus of interest in others (say the movement of a shock wave up and down an aeroplane wing) is in following the motion of a sharp internal feature. For accuracy and efficiency a computational method should not only approximate the partial differential equations appropriately, but also move the computational mesh in a manner which follows such features.
Recent research has developed a finite element approach to the adaptive approximation of time-dependent physical problems involving moving boundaries or interfaces. It has been deliberately designed to preserve inherent properties (such as conservation principles and invariances) of the underlying partial differential equations and hence of the system the mathematics is intended to represent. Extremely promising results have been obtained for a wide range of problems in one and two space dimensions, but the applicability of the approach is still limited (as are all moving mesh methods) by the potential for the computational mesh to "tangle".
The aim of this project will be to develop an alternative approach, derived within the same framework, which takes advantage of the additional flexibility inherent in the discontinuous Galerkin finite element framework. This has the potential to reduce the occurrence of mesh tangling and to greatly improve the robustness of the method when modelling problems involving complex, interacting features and when using different monitor functions to govern the movement of the mesh.
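A basic ingredient shared by most moving-mesh methods is equidistribution with respect to a monitor function. The sketch below (with an illustrative monitor) places mesh points so that each cell carries an equal share of the monitor's integral, clustering points near a sharp feature.

```python
import numpy as np

# Equidistribute mesh points with respect to a monitor function M(x):
# choose cell boundaries so every cell contains the same integral of M.
# The monitor below concentrates points near a sharp feature at x = 0.5
# and is purely illustrative.
def monitor(x):
    return 1.0 + 50.0 * np.exp(-100.0 * (x - 0.5) ** 2)

# Cumulative integral of the monitor on a fine background grid (trapezoidal)
fine = np.linspace(0.0, 1.0, 10_001)
M = monitor(fine)
cum = np.concatenate([[0.0],
                      np.cumsum(0.5 * (M[1:] + M[:-1]) * np.diff(fine))])

# Invert the cumulative integral at equally spaced target values
n_cells = 20
targets = np.linspace(0.0, cum[-1], n_cells + 1)
mesh = np.interp(targets, cum, fine)

widths = np.diff(mesh)
print(f"smallest cell {widths.min():.4f}, largest cell {widths.max():.4f}")
```

In a time-dependent moving-mesh method the monitor tracks the evolving solution, and the mesh points move continuously to maintain (approximate) equidistribution; tangling occurs when neighbouring points attempt to cross.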
Dynamic Survival Analysis (DSA) combines classical dynamical systems theory with survival analysis. Given a dynamical system, often described in terms of mean-field ordinary/partial differential equations representing proportions/concentrations in a large population/volume, the DSA method interprets those differential equations as probabilistic quantities such as survival functions, density functions, cumulative hazards etc. This interpretation is very useful for parameter inference of the dynamical system based on standard survival analysis tools.
On the methodology side, the project will further develop the theory of dynamical survival analysis and study its statistical properties. There will be plenty of opportunity to employ the method in particular applications in epidemiology and systems biology.
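A minimal illustration of the DSA viewpoint: solve the mean-field SIR equations and reread the scaled susceptible curve as a survival function (all parameter values are illustrative).

```python
import numpy as np

# Dynamic Survival Analysis in its simplest form: solve the mean-field SIR
# equations and interpret S(t)/S(0) as the survival function of a randomly
# chosen individual's infection time; its density is -d/dt [S(t)/S(0)].
beta, gamma = 1.5, 1.0            # illustrative infection / recovery rates
dt, T = 1e-3, 30.0
n = int(T / dt)

s, i = 0.99, 0.01                 # initial proportions susceptible / infectious
S = np.empty(n + 1)
S[0] = s
for k in range(n):
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    s, i = s + dt * ds, i + dt * di
    S[k + 1] = s

survival = S / S[0]               # P(infection time > t); the mass remaining
density = -np.gradient(survival, dt)   # at t = T is the chance of escaping infection

# The density integrates (over [0, T]) to the probability of ever being infected
mass = float(np.sum(0.5 * (density[1:] + density[:-1])) * dt)
print(f"P(ever infected) = {1 - survival[-1]:.3f}, "
      f"integral of density = {mass:.3f}")
```

Once the ODE output is packaged as survival and density functions, standard likelihood-based survival-analysis tools can be applied directly to infection-time data, which is the core of the DSA inference machinery.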
Some physical problems can be modelled by a function or vector field with a near discontinuity at a point. Specific examples include boundary vortices in thin magnetic films, and some types of dislocations in crystals. Typical static configurations can be found by minimizing certain energy functionals. As the core size of the singularity tends to zero, these energy functionals are usually well described by a limiting functional defined on point singularities.
This project investigates how to obtain dynamical laws for singularities (typically in the form of ordinary differential equations) from the partial differential equations that describe the evolution of the vector field. For some such problems, results for interior singularities are known, but their boundary counterparts are still lacking.
This project requires some background in the calculus of variations and the theory of partial differential equations.
Polymers are extraordinarily long molecules, made out of chains of simpler molecules. They occur everywhere in our everyday lives, including in the DNA chains that make up our genetics, in many high-tech consumer products and in the simple plastic bag. Often these applications depend crucially on the way that the polymer chains move. This is especially true in concentrated polymer liquids, where the chain dynamics are controlled by how the chains become entangled with each other. A powerful mathematical framework for describing these entangled systems has been under development for some time now, but the ideas have yet to be fully developed, tested and exploited in practical applications. Working on this PhD project will give the opportunity to train in a wide range of mathematical techniques including analytical work, numerical computations and stochastic simulation and to apply these to problems of real practical impact. This lively research field involves mathematicians, scientists and engineers and a keenness to learn from and co-operate with researchers from a range of backgrounds would be a real asset in this project.
Nonlinear PDE of Ginzburg-Landau type can be used to model a wide range of phenomena, from ferromagnetic materials and superconductors to quantum field theory. For certain ranges of the Ginzburg-Landau parameter (for self-duality and for point vortices), the equations can be reduced to ODEs. The focus of the present project will be to study the equations of motion in a wider setting, with the aim to extend the range of validity of the reductions and to compare almost singular solutions of the PDEs with simplified ODEs that describe the motion of the singularities. The project will use mostly rigorous analysis and possibly numerical simulation.
Electromagnetic systems and devices are often complicated, irregular in their geometry and heterogeneous in their electrical characteristics. Such a system could be a PC, a mobile phone, or even an airplane cockpit. The prediction of the energy distribution becomes hard when using traditional analytical and numerical tools, especially if the wavelength is small compared to the size of the structure. Statistical methods are often better suited to describing the physical process under investigation in such cases. Appropriately chosen, such methods can lead to a surprisingly simple and physically understandable characterisation of the problem, which can be used to exploit complexity and turn collective behaviour into beneficial engineering technology.
This PhD project uses a phase-space representation of wave fields, the so-called Wigner distribution function (WDF), to unveil transport properties of fields using tools from dynamical systems theory. An exact evolution operator for the transport of these Wigner functions can be derived, and approximation schemes are obtained by using ray families that include reflections from irregular boundaries. The project will explore the possibility of linking the WDF operator to existing semiclassical approximations of quantum mechanics, used to transport densities of quantum particles. The challenge lies in constructing a phase space picture of those operators through the WDF before including the source operator.
Dr Gabriele Gradoni is also based at George Green Institute of Electromagnetics Research in the Faculty of Engineering.
A ferrofluid is a colloidal magnetic fluid made by mixing ferromagnetic nanoparticles and a surfactant into a viscous liquid. The surfactant coats the nanoparticles and hence prevents their agglomeration. Ferrofluids have a variety of applications in medicine (e.g. in drug delivery) and in technology. Because these fluids can be controlled by applying a suitable magnetic field, their study is both an interesting and a challenging fluid mechanics subject.
This project will investigate the effects of a magnetic field and soluble surfactants on a thin ferromagnetic fluid flowing down an inclined plane. Progress will be made theoretically using linear stability theory, numerically using simulations of appropriate reduced models, and experimentally (if time permits). The project will first explore a simpler case in which the fluid is only weakly susceptible to the magnetic field, and will later consider extensions to cases in which the magnetic field changes with the motion of the fluid (as is the case for a ferrofluid).
Compact endomorphisms of commutative, semisimple Banach algebras have been extensively studied since the seminal work of Kamowitz dating back to 1978. More recently the theory has expanded to include power compact, Riesz and quasicompact endomorphisms of commutative, semiprime Banach algebras.
This project concerns the classification of the various types of endomorphism for specific algebras, with the aid of the general theory. The algebras studied will include algebras of differentiable functions on compact plane sets, and related algebras such as Lipschitz algebras.
In order to facilitate a high penetration of renewable energy into the grid, energy storage is needed to better manage supply and demand. Hydrogen offers a high energy density solution and, rather than storing the hydrogen as a gas at high pressures, solid state storage of hydrogen in a metal like magnesium offers a low pressure and low cost technology. The hydrogenation of magnesium is very exothermic (74.5 kJ mol⁻¹) and the material is also being investigated as a thermal energy store (i.e. using the exotherm of hydrogenation to liberate the stored thermal energy back as heat at 400°C).
One fear was that cycling a magnesium bed at high temperatures would lead to sintering and a loss of void space. However, the startling result was that the powdered magnesium bed, when cycled at temperatures of 350-400°C, gained rather than lost porosity. The form of the bed had changed from a loose powder to a porous metal plug which had swelled to fill the available head space in the vessel. Further cycling at temperatures below 350°C results in the bed reverting to a more densely packed loose powder.
The intriguing question is to uncover the fundamental mechanism(s) behind this process and to develop a predictive model based on the physical and chemical processes occurring. For the application, understanding these processes will enable optimisation of the porous structure for heat and mass flow; moreover, there is also concern that the expanding bed may exert significant stress on the wall of the storage vessel, eventually leading to failure of the vessel.
This challenging research project will develop new mathematical models based on the chemical and physical processes occurring in order to develop a model that simulates the expanding porous bed phenomenon. Some of these processes include: nucleation, growth of the metal hydride phase, crystal lattice expansion leading to defect formation, decrepitation, atomic diffusion and surface energy minimisation, annealing. The models developed will thus need to encompass a wide range of physical phenomena; the focus will be on partial-differential-equation/moving-boundary formulations, building on the established sintering literature but, for the reasons described above (specifically, to generate increased, rather than decreased, porosity), of necessity raising significant additional challenges. The project will accordingly equip the student with an unusually wide experience of experimental and modelling questions and of mathematical techniques, as applied in a context with clear energy and sustainability implications.
This project will be jointly supervised by Prof Gavin Walker in the Faculty of Engineering and Dr Richard Wheatley in the School of Chemistry.
A prolific research area is emerging at the interface between quantum information and gravity. While deep conceptual links have been unveiled between e.g. entanglement and holography, there remains significant scope to unlock their operational value for future endeavours towards a consistent quantum gravity theory.
This project aims to identify, via quantum-information techniques, the entanglement features of finite regions of space leading to the emergence of horizon-like surfaces, with the vision of achieving a cross-cutting information-theoretic modelling of black hole horizons in quantum gravity.
The project will exploit formal connections between spin networks – graphs decorated with quantum-geometric data which model finite regions of space in quantum gravity – and tensor networks – powerful quantum-informational structures with widespread applications in many-body physics.
A bilayer flow is a multi-phase flow of two immiscible and superposed liquids of different viscosities and densities. Multilayer flows are useful in a wide range of practical applications including oil recovery and transport. This project will investigate the development of interfacial instabilities in bilayer flows, both theoretically and experimentally. The theoretical component of the project will combine analytical and numerical techniques with the aim of developing, analysing and solving appropriate mathematical models. The successful candidate will also be involved in the development, build, validation and use of a modified Couette cell, suitable for studying instabilities in bilayer flows. In particular, we will look at the influence of surfactants and particles on the stability of these flows and use imaging techniques to monitor fluctuations at the interface between the two liquid layers. The theoretical models will be developed alongside the experiments, with theory and experiment informing each other throughout the project.
Full funding is available for Home (UK) students only, to include stipend, fees and travel to at least one overseas conference. The project will be co-supervised by Dr Anna Kalogirou (School of Mathematical Sciences) and Dr James Sharp (School of Physics).
Networks of interacting dynamical systems occur in a huge variety of applications including gene regulation networks, food webs, power networks and neural networks where the interacting units can be individual neurons or brain centres. The challenge is to understand how emergent network dynamics results from the interplay between local dynamics (the behaviour of each unit on its own), and the nature and structure of the interactions between the units.
Recent work has revealed that real complex networks can exhibit a large number of symmetries. Network symmetries can be used to catalogue the possible patterns of synchrony which could be present in the network dynamics; however, which of these exist and are stable depends on the local dynamics and the nature of the interactions between units. Additionally, the more symmetry a network has, the more possible patterns of synchrony it may possess. Computational group theory can be used to automate the process of identifying the spatial symmetries of synchrony patterns, resulting in a catalogue of possible network cluster states.
This project will extend current methods for analysing dynamics on networks of (neural) oscillators through automating the process of determining possible phase relations between oscillators in large networks in addition to spatial symmetries. This will be used to investigate dynamics on coupled networks of simplified (phase-amplitude reduced or piecewise-linear) neuron and neural population models. We will also consider the effect on the network dynamics of introducing delays in the coupling between oscillators which will give a more realistic representation of interactions in real world networks.
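As an illustration of how identifying network symmetries can be automated, the toy computation below (a hypothetical five-node network, not one from the project) enumerates the automorphisms of a small coupling graph by brute force and groups nodes into orbits, the structural candidates for synchronous clusters.

```python
from itertools import permutations

# Hypothetical example: an undirected ring of 4 nodes plus a hub (node 4)
# coupled to every ring node. Edges are stored as unordered pairs.
edges = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0),
                                (4, 0), (4, 1), (4, 2), (4, 3)]}
nodes = range(5)

def is_automorphism(p):
    """Check that relabelling nodes by permutation p preserves every edge."""
    return {frozenset((p[a], p[b])) for a, b in edges} == edges

autos = [p for p in permutations(nodes) if is_automorphism(p)]

# Orbits of the automorphism group: nodes in the same orbit are structurally
# interchangeable, so they can support synchronous cluster states.
orbits = {n: frozenset(p[n] for p in autos) for n in nodes}
print(sorted({tuple(sorted(o)) for o in orbits.values()}))  # → [(0, 1, 2, 3), (4,)]
```

For realistic networks the brute-force search would be replaced by computational group theory software, but the orbit structure it produces is the same object used to catalogue synchrony patterns.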
Objects of study in algebraic geometry are called varieties. They are studied using various tools from both algebra and geometry. Typically two external behaviours of a variety are interesting: how it deforms, and what other varieties it is isomorphic to. The latter has given rise to birational classification and the former has appeared in several new lines of research. In this project, we will be looking at Fano varieties, their deformations and their birational classification. Roughly speaking, Fano varieties are those defined by polynomials of low degree, or equivalently geometric shapes with positive curvature. New developments in the field, such as the finiteness of families of Fano varieties proved by Birkar (Fields Medal 2018) and the development of the theory of K-stability, have opened up several questions to be answered.
The ability to manipulate, control and measure quantum systems is a central issue in Quantum Technology applications such as quantum computation, cryptography, and high precision metrology. Most realistic systems interact with an environment, and it is important to understand how this affects the performance of quantum protocols and how it can be used to improve it. The input-output theory of quantum open systems offers a clear conceptual understanding of quantum dynamical systems and continuous-time measurements, and has been used extensively for interpreting experimental data in quantum optics. Mathematically, we deal with an extension of the classical filtering theory used in control engineering for estimating an unobservable signal of interest from available noisy data.
This project aims to investigate the identification and control of quantum dynamical systems in the framework of the input-output formalism. As an example, consider a quantum system (atom) interacting with an incoming "quantum noise" (electromagnetic field); the output fields (emitted photons) emerging from the interaction can be measured in order to learn about the system's dynamical parameters (e.g. its Hamiltonian). The goal is to find optimal system identification strategies, which may involve input state preparation, output measurement design, and quantum feedback control. An interesting related question is to understand the information-disturbance trade-off, which in the context of quantum dynamical systems becomes an identification-control trade-off.
The first steps in this direction introduced the concept of asymptotic quantum Fisher information for "non-linear" quantum Markov processes, and investigated system identification for linear quantum systems using transfer function techniques from control theory. A further goal is to develop a general Central Limit theory for quantum output processes as a probabilistic underpinning of the asymptotic estimation theory. Another direction is the recently found connection between dynamical phase transitions in many-body open systems and high precision metrology for dynamical parameters (see arXiv:1411.3914).
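The classical filtering theory mentioned above, which this project extends to the quantum setting, can be illustrated with a minimal scalar Kalman filter; the dynamics and noise parameters below are purely illustrative and not tied to any quantum model in the project.

```python
import random

# A minimal sketch: estimate a hidden AR(1) signal x_k from noisy
# observations y_k = x_k + noise, using the scalar Kalman filter.

def kalman_1d(ys, a=0.95, q=0.1, r=1.0):
    """a: signal dynamics; q: process noise variance; r: observation noise variance."""
    x_hat, p = 0.0, 1.0                 # initial estimate and its variance
    estimates = []
    for y in ys:
        x_pred, p_pred = a * x_hat, a * a * p + q   # predict
        k = p_pred / (p_pred + r)                   # Kalman gain
        x_hat = x_pred + k * (y - x_pred)           # update with the data point
        p = (1.0 - k) * p_pred
        estimates.append(x_hat)
    return estimates

# Usage: simulate the hidden signal and check that the filter tracks it
# better than the raw observations do.
rng = random.Random(0)
xs, x = [], 0.0
for _ in range(5000):
    x = 0.95 * x + rng.gauss(0.0, 0.1 ** 0.5)
    xs.append(x)
ys = [x + rng.gauss(0.0, 1.0) for x in xs]
est = kalman_1d(ys)
mse_filter = sum((e - x) ** 2 for e, x in zip(est, xs)) / len(xs)
mse_raw = sum((y - x) ** 2 for y, x in zip(ys, xs)) / len(xs)
print(mse_filter < mse_raw)  # → True: filtering beats the raw measurements
```

The quantum version replaces the hidden Gaussian signal with the state of an open quantum system and the observation process with a continuous-time measurement record.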
Click here to find more information on this topic and some illustrations of different types of estimators. For more about my research interests you can visit my homepage.
This is a unique and exciting opportunity to undertake research that spans across the disciplines of energy engineering and mathematical sciences. With additional supervision from Prof Mark Gillott and Dr Parham Mirzaei Ahrnjani (Architecture & Built Environment), the doctoral student will be joining a strong interdisciplinary team from academia and industry who are currently working on the delivery of the Energy Research Accelerator (ERA) Community Energy System (CES) demonstrator at the 15 acre Trent Basin site in Nottingham.
The project will investigate the energy challenges and complexity science issues associated with heat and electrical power generation, storage and use, arising from the connections between micro-generation output, grid/heat loads, weather and energy/power demands (including occupant behaviour), combined with variable-load energy storage devices, in order to provide energy stability and a reduction in cost and in the carbon emissions associated with fossil fuel use. The PhD research will develop new multi-vector CES models that utilise 'big data' obtained from a dedicated onsite monitoring platform at the housing development, applied to a heterogeneous network of users. The work will ultimately help inform the design, implementation and operation of local community energy schemes in the UK.
About the Energy Research Accelerator: The Energy Research Accelerator (ERA) is a cross-disciplinary energy innovation hub which brings together capital assets, data and intellectual leadership to foster collaboration between academia and business to accelerate the development of solutions to the global energy challenge. It will provide new buildings and cutting-edge demonstrators, develop highly skilled people and jobs, as well as new products and services to ultimately transform the UK's energy sector. Building on existing programmes and academic expertise across the partnership, universities within ERA have committed over £2m for doctoral students as a critical part of the ERA skills agenda. Delivered through Innovate UK, the government has committed an initial capital investment of £60m, and ERA has secured private sector co-investment of £120m. ERA's initial priorities of Geo-Energy Systems, Integrated Energy Systems and Thermal Energy will help deliver the new technologies and behaviours that will open the avenues for its future development and demonstrate the transformative effect ERA can have across the energy spectrum.
Foundations of adaptive finite element methods for PDEs (or: why do adaptive methods work so well?)
Adaptive finite element methods allow the computation of solutions to partial differential equations (PDEs) as efficiently as possible. In particular, these methods require the fewest degrees of freedom to obtain a solution to a desired accuracy. In recent years a theory has emerged that explains this behaviour. It relies on classical a posteriori error estimation, the Banach contraction principle, and nonlinear approximation theory. Unfortunately, the theory so far applies only to specific model problems.
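The flavour of the solve-estimate-mark-refine loop behind adaptive methods can be conveyed in a toy setting. The sketch below is a deliberately simplified illustration (adaptive piecewise-linear interpolation rather than a true finite element solver), with the target function and the error indicator chosen for illustration only.

```python
import math

# Toy adaptivity loop: approximate f(x) = sqrt(x) on [0, 1] by piecewise-linear
# interpolation, repeatedly bisecting the interval with the largest local error
# indicator. Points should concentrate near the singular derivative at x = 0.

f = math.sqrt

def indicator(a, b):
    """Local error proxy: deviation of f at the midpoint from the chord."""
    mid = 0.5 * (a + b)
    return abs(f(mid) - 0.5 * (f(a) + f(b)))

def adapt(n_cycles=25):
    mesh = [0.0, 1.0]
    for _ in range(n_cycles):
        etas = [indicator(a, b) for a, b in zip(mesh, mesh[1:])]   # estimate
        worst = max(range(len(etas)), key=etas.__getitem__)        # mark
        a, b = mesh[worst], mesh[worst + 1]
        mesh.insert(worst + 1, 0.5 * (a + b))                      # refine
    return mesh

mesh = adapt()
# Most mesh points cluster in the left quarter of the domain, near x = 0.
print(sum(x <= 0.25 for x in mesh) > len(mesh) // 2)  # → True
```

This automatic concentration of degrees of freedom where the solution is least smooth is exactly the behaviour whose optimality the theory described above seeks to explain.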
Challenges for students:
Depending on the interest of the student, several of these issues (or others) can be addressed. Also, the student is encouraged to suggest a second supervisor, possibly from another group!
The project belongs to the following areas of Mathematics: Stochastic Analysis, Applied Probability, Numerical Analysis
For many applications (especially in molecular dynamics and Bayesian statistics), it is of interest to compute the mean of a given function with respect to the invariant law of a diffusion process, i.e. the ergodic limit. To evaluate these mean values in situations of practical interest, one has to integrate high-dimensional systems of stochastic differential equations over long time intervals. Computationally, this is a challenging problem. Stochastic geometric integrators play an important role in the long-time simulation of dynamical systems with high accuracy and relatively low cost. The project can also move into the rapidly growing area of numerical approximation of SDEs on manifolds.
The project involves construction of new efficient numerical methods for ergodic stochastic differential equations and stochastic numerical analysis of properties of the methods.
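As a minimal illustration of the ergodic-averaging task (using the basic Euler-Maruyama scheme, not one of the new methods the project would develop), the sketch below integrates an Ornstein-Uhlenbeck equation, whose invariant law is the standard Gaussian, so the exact ergodic limit of x² is 1. All numerical parameters are illustrative.

```python
import math
import random

# Estimate the ergodic mean of f(x) = x^2 for dX = -X dt + sqrt(2) dW
# by time-averaging a single long Euler-Maruyama trajectory.

def ergodic_average(f, drift, sigma, x0=0.0, dt=2e-3, n_steps=1_000_000,
                    burn_in=50_000, seed=0):
    rng = random.Random(seed)
    x, total, count = x0, 0.0, 0
    root_dt = math.sqrt(dt)
    for step in range(n_steps):
        x += drift(x) * dt + sigma * root_dt * rng.gauss(0.0, 1.0)  # EM step
        if step >= burn_in:              # discard the transient before averaging
            total += f(x)
            count += 1
    return total / count

est = ergodic_average(f=lambda x: x * x, drift=lambda x: -x, sigma=math.sqrt(2.0))
print(abs(est - 1.0) < 0.2)  # the time average is close to the exact limit 1
```

The point of the project is that better-designed (geometric, high-order) integrators achieve the same accuracy in the ergodic limit with a far lower cost than this naive scheme.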
We require an enthusiastic graduate with a 1st class degree in Mathematics, preferably at MMath/MSc level (in exceptional circumstances a 2:1 degree, or equivalent, can be considered). We expect the successful applicant to have a very good background in Probability and good computational skills.
What is the average shape of a brain tumour in a cohort of patients? How might this evolve over time? Is it possible to associate the shape and its evolution with certain genetic and clinical characteristics of a patient? The advent of high-resolution imaging technologies has enabled meaningful answers to such questions based on continuous representations of tumours and organs as parametric curves and surfaces. These data objects typically reside on manifolds equipped with non-trivial geometries and symmetries (invariances). The project will focus on developing statistical methods for such data using tools from stochastic processes, differential geometry and group theory.
Polytopes are among the most fundamental shapes in geometry, and include polygons and the Platonic solids (tetrahedron, cube, octahedron, etc.). They can be characterised as shapes in Euclidean space enclosed by planes, and admit notions such as vertices and edges. Lattice polytopes are polytopes whose vertices have integral coordinates. They provide a fascinating bridge between geometry and number theory that has been studied since the ancient Greeks. They are central in modern mathematics, and their study has led to key discoveries and recent breakthroughs in areas such as algebraic geometry, optimisation theory, representation theory, and statistics. New aspects are constantly being explored, such as the ‘spanning property’ of lattice polytopes. Initial work with my collaborators shows that the implications of this property are extremely powerful.
In this project, we will study fundamental geometric and combinatorial features of spanning lattice polytopes of ‘bounded combinatorial complexity’. Due to its links to algebraic geometry and combinatorics, this topic can either lead to a hands-on thesis crunching millions of examples with a computer, or a theoretical piece of work, exploring the underlying theory, depending on the student’s preference.
How can we define an average closed curve on the plane from a random sample of many such curves, such that it is invariant to certain transformations (e.g. rotations)? Answers to such questions have far-reaching impact on the analysis of images arising in numerous disciplines (e.g. images of brain tumours). Extending statistical methodology from finite- to infinite-dimensional linear and nonlinear settings requires an improved understanding of probability distributions on constrained function spaces. Employing stochastic processes as a tool to study the interplay between probability and geometry, the project will address some fundamental issues that arise in the development of statistical theory and methodology for data in the form of functions and curves.
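One elementary way to make such an average rotation-invariant is Procrustes-style alignment. The sketch below is an illustrative toy, not the methodology the project would develop: it averages closed plane curves, represented as sequences of complex points, after centring each one and rotating it optimally onto the current mean.

```python
import cmath
import math

def centre(curve):
    """Remove the centroid so the average is translation-invariant."""
    m = sum(curve) / len(curve)
    return [z - m for z in curve]

def align(curve, ref):
    """Rotate 'curve' to best fit 'ref'; the optimal angle is the phase of the cross-term."""
    cross = sum(r * z.conjugate() for r, z in zip(ref, curve))
    rot = cmath.exp(1j * cmath.phase(cross))
    return [rot * z for z in curve]

def mean_curve(curves, iters=10):
    curves = [centre(c) for c in curves]
    mean = curves[0]
    for _ in range(iters):                       # alternate align and average
        aligned = [align(c, mean) for c in curves]
        mean = [sum(zs) / len(zs) for zs in zip(*aligned)]
    return mean

# Usage: three rotated copies of one ellipse average back to that ellipse
# rather than smearing out, because each copy is aligned before averaging.
ellipse = [complex(2 * math.cos(t), math.sin(t))
           for t in [2 * math.pi * k / 64 for k in range(64)]]
rotated = [[cmath.exp(1j * a) * z for z in ellipse] for a in (0.0, 0.7, 2.1)]
avg = mean_curve(rotated)
print(round(max(abs(z) for z in avg), 3))  # semi-major axis preserved, ≈ 2.0
```

The research questions start where this toy ends: making such averages well-defined and statistically tractable for infinite-dimensional curve spaces with non-trivial geometry.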
Various projects are available on the interplay between any of the following areas: quantum gravity, alternative theories of gravity, strong gravity and black holes.
The description of phenomena for which gravity is important and which also lie in the realm of quantum physics requires a quantum gravity theory. Developing candidate theories to the extent that they can be confronted with observations is a very challenging task. In their classical limit these theories are mostly expected to deviate from General Relativity. In this sense, classical alternative gravity theories can be the interface between quantum gravity theory and classical phenomenology.
The gravitational interaction is much less explored in regimes where gravity is strong, such as in the vicinity of black holes or very compact stars. These systems can be thought of as natural laboratories for gravity.
The overall scope is to follow a synthetic approach which will combine results about the behaviour of gravity at all different scales - from the quantum to astrophysical and cosmological systems - in order to provide new insights.
Higher category theory plays an increasingly important role in the mathematical formulation of quantum field theory (QFT). It provides powerful tools to investigate and understand subtle aspects associated with gauge symmetries, and thereby opens up new avenues towards designing refined axiomatic frameworks for QFT that are capable of describing quantum gauge theories such as Yang-Mills theory.
This is an interdisciplinary PhD project at the intersection of mathematical physics, algebra and topology. The project could focus either on new developments in higher categorical algebraic QFT, or on the construction of new examples in this framework.
We are currently carrying out an experiment to study the effects occurring around effective horizons in an analogue gravity system. In particular, the scientific goals are to explore black hole ringdown, superradiant scattering and the black hole evaporation process. To address this issue experimentally, we utilize the analogy between waves on the surface of stationary draining fluid/superfluid flows and the behaviour of classical and quantum field excitations in the vicinity of rotating black holes.
This project will be based at the University of Nottingham at the School of Mathematical Sciences. The two external collaborators are Prof. Josef Niemela (ICTP, Trieste in Italy) and Prof. Stefano Liberati (SISSA, Trieste in Italy). The external consultant for the experiment is Prof. Bill Unruh, who will be a regular visitor.
The PhD student will be involved in all aspects of the experiment, theoretical as well as experimental. We require an enthusiastic graduate with a 1st class degree in Mathematics/Physics/Engineering (in exceptional circumstances a 2(i) class degree can be considered), preferably at MMath/MSc level. Candidates would need to be keen to work in an interdisciplinary environment and interested in learning about quantum field theory in curved spacetimes, fluid dynamics, analogue gravity, and experimental techniques such as flow visualisation (e.g. particle image velocimetry or laser Doppler velocimetry) and surface measurements (e.g. profilometry methods).
Complex dynamics is the study of iteration of analytic functions on the complex plane. A rich mathematical structure is seen to emerge amidst the chaotic behaviour. Its appeal is enhanced by the intricate nature of the Julia sets that arise, and fascinating images of these fractal sets are widely admired.
Quasiregular mappings of n-dimensional real space generalise analytic functions on the complex plane. Roughly, a mapping is called quasiregular if it locally distorts space by only a bounded amount, so that small spheres are mapped to small ellipsoids. This is more flexible than the situation with analytic functions, where the Cauchy-Riemann equations tell us that infinitesimally small circles are mapped to small circles.
There are many similarities between the behaviour of analytic functions and quasiregular mappings. One can therefore attempt to develop a theory of quasiregular iteration parallel to the results of complex dynamics. Such a theory is just beginning to emerge, lying between the well-studied analytic case (where many powerful tools from complex analysis are available) and general iteration in several real variables, which is much less well-understood.
The problems studied will be inspired and guided by existing results in complex dynamics. For example, we can ask questions about the ‘escaping set’ of a function – this is the set of all starting points from which the sequence of iterates tends to infinity. One of the challenges we encounter is that as we increase the number of iterations of a quasiregular mapping, the amount of local distortion may become increasingly large.
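In the classical complex-analytic case the escaping set can be explored numerically. The sketch below (with an arbitrary illustrative parameter c) tests whether orbits of f(z) = z² + c leave a disc of radius 2; for |c| ≤ 2, any orbit that leaves this disc is guaranteed to tend to infinity.

```python
def escapes(z, c=complex(-1.0, 0.0), max_iter=100, radius=2.0):
    """Return True if the orbit of z under z -> z^2 + c appears to escape.

    The parameter c = -1 is purely illustrative. For |c| <= 2, once |z| > 2
    the modulus grows monotonically, so the orbit tends to infinity.
    """
    for _ in range(max_iter):
        if abs(z) > radius:
            return True
        z = z * z + c
    return False

# Usage: a far-away starting point escapes; the origin falls onto the
# period-2 cycle 0 -> -1 -> 0 for c = -1 and never escapes.
print(escapes(complex(2.5, 0.0)))   # → True
print(escapes(complex(0.0, 0.0)))   # → False
```

For quasiregular mappings the analogous experiment is harder to justify rigorously, precisely because the local distortion can grow under iteration, which is one of the challenges described above.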
This is very much a pure mathematics project and will appeal to someone who enjoys topics such as real analysis, complex analysis, metric spaces or discrete dynamical systems.
Language change is ubiquitous, and it is primarily driven by the input that we encounter. A key source for this is conversations that we have with e.g. family, friends and colleagues. Many of the changes that occur in language begin with teens and young adults. As young people interact with others their own age, their language grows to include words, phrases, and constructions that are different from those of the older generation. Some of these changes have a short life span (have you heard ‘groovy’ lately?), but others stick around to affect the language as a whole. Studying language change makes it apparent that language evolution is a network phenomenon.
This project will investigate how language change progresses through networks that initially consist of two classes of speakers. Each class possesses their own language characteristics such as the frequencies with which speakers produce certain vowels. Through interactions with other speakers, these characteristics can change. This leads to a host of intriguing questions including how the network topology impacts on language change or how irregular words (such as ‘went’) persist while others regularise. The project will be undertaken in close collaboration with Prof Kathy Conklin in the School of English.
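A minimal simulation of this kind of process might look like the following sketch, in which the network topology (a ring), the update rule, and all parameters are illustrative assumptions: each speaker holds a continuous linguistic variable (say, a vowel formant frequency), the population starts as two distinct classes, and each interaction nudges a speaker toward a random neighbour.

```python
import random

def simulate(n=40, steps=500_000, rate=0.05, seed=1):
    """Two speaker classes on a ring network, mixing through pairwise interactions."""
    rng = random.Random(seed)
    # First half of speakers start at 0.0, second half at 1.0 (two classes).
    value = [0.0] * (n // 2) + [1.0] * (n - n // 2)
    neighbours = [((i - 1) % n, (i + 1) % n) for i in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)                      # pick a random speaker
        j = rng.choice(neighbours[i])             # pick one of their neighbours
        value[i] += rate * (value[j] - value[i])  # accommodate to the neighbour
    return value

final = simulate()
spread = max(final) - min(final)
print(spread < 0.2)  # → True: the two classes have largely converged
```

Questions of the kind the project raises start from here: how convergence (or persistent variation, as with irregular words) depends on the network topology and on more realistic interaction rules.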
Summary: In the past few years, advances in sequencing technology and the reduction in associated costs have enabled scientists to obtain highly detailed genomic data on disease-causing pathogens on a scale never seen before. In addition to the inherent phylogenetic information contained in such data, combining genomic data with traditional epidemiological data (such as time series of case incidence) also provides an opportunity to perform microbial source attribution, i.e. determining the actual transmission pathway of the pathogen through a population.
These advances have seen a corresponding surge of activity in the modelling and statistical analysis community, so that now a number of methods and associated computer packages exist to carry out source-attribution, i.e. estimating who-infected-whom in a particular outbreak. All the methods have their own limitations; a very common issue is that the models used to perform estimation are conditional upon the observed data, which can create estimation biases and lead to misleading results. In contrast, the method developed by Worby, Kypraios and O'Neill involves a model that can explain how the data arose, overcoming such problems. This project is concerned with developing this approach to both (i) extend the idea to more complex model settings, relaxing certain technical assumptions and (ii) improve computational efficiency. A highly-detailed data set on MRSA provided by collaborators at Guy's and St Thomas' hospital trust, London, provides one opportunity for applying such methods.
We live in a world that is, by nature, non-linear. Although linearity is often assumed, this is in general a convenient, yet forced, simplification. Overall, the project looks into improving the implementation of a statistical tool suitable for representing non-linear phenomena: the Bayesian Additive Regression Tree (BART) model. In detail, this project aims to enhance the applicability of BART models through the delivery of two key outputs. First, we will develop a novel prior distribution for the structure of the trees in the BART. Second, we will develop a prior distribution to estimate the number of trees in the BART. The project will propose a novel loss-based approach to solve the above problems, and will be developed in collaboration with Dr Cristiano Villa (Newcastle University).
Asthma is characterised by airway hyper-responsiveness, chronic inflammation and remodelling. Airway narrowing during an asthmatic attack is caused by rapid and excessive contraction of airway smooth muscle (ASM) cells lining the conducting airways. Over longer periods (months or years), repeated episodes of intense inflammation cause ASM proliferation, leading to marked thickening (or remodelling) of the airway wall. While each of these three features contributes to asthma severity, how they interact is poorly understood. Most importantly, it is not clear whether airway hyper-responsiveness or remodelling are causes or consequences of the disease.
We hypothesize that while airway remodelling is initiated by inflammatory mediators, it is perpetuated by mechanical factors. We know that cells respond to their physical environment through mechanotransduction, the translation of mechanical forces into biochemical signals. The cell changes that arise from this can lead to an altered cell microenvironment, creating a developmental feedback. Interplay between such mechanosensitive pathways and other inter- and intra-cellular signalling mechanisms is therefore potentially responsible for maintaining a healthy airway. In this project we will use extensive data from a mouse model of asthma to develop computational biomechanical models of airway tissue, combined with regulatory biochemical signalling, to establish a mathematical description of the homoeostatic state (i.e. for healthy airways). This will then allow us to understand how perturbations to this homoeostatic state could drive airways into an asthmatic state, and ultimately to understand which processes are the cause of the disease.
I am looking to supervise PhD students interested in either gravitational physics / general relativity, or machine learning applied to scientific problems. Projects will be in the area of gravitational-wave (GW) astronomy, with focus on modeling and/or data analysis for gravitational systems such as black hole binaries. We aim to develop fast and accurate models and inference techniques, which we will then use to analyze real data from the LIGO-Virgo GW observatories and to make predictions for future detectors (LISA, ET). We can thereby precisely characterize sources, derive constraints on alternative gravity theories, and map out the GW universe.
To this end, my main research has been in developing probabilistic deep-learning techniques for Bayesian inference with GW data. These techniques train neural networks to analyze data in a fraction of the time of classical methods, with comparable levels of accuracy. Example projects could be either on the machine-learning side (developing faster and more accurate techniques, which could in principle be applied more broadly in science) or on applications (to future detectors, to new sources, and to populations of sources). On the theory side, I am interested primarily in modeling nonlinear effects using black hole perturbation theory, to study the behavior of black holes as they equilibrate following a merger, but I am happy to supervise interesting projects more broadly in general relativity.
The physical properties of all substances are determined by the interactions between the molecules that make up the substance. The energy surface corresponding to these interactions can be calculated from first-principles, in theory allowing physical properties to be derived ab-initio from a molecular simulation; that is by theory alone and without the need for any experiments. Recently we have focussed on applying these techniques to model carbon dioxide properties, such as density and phase separation, for applications in Carbon Capture and Storage. However, there is enormous potential to exploit this approach in a huge range of applications. A significant barrier is the computational cost of calculating the energy surface quickly and repeatedly, as a simulation requires. In collaboration with the School of Chemistry we have recently developed a machine-learning technique that, by using a small number of precomputed ab-initio calculations as training data, can efficiently calculate the entire energy surface. This project will involve extending the approach to more complicated molecules and testing its ability to predict macroscopic physical properties.
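As a toy illustration of the machine-learning idea (not the group's actual code: the squared-exponential kernel, the length scale, and the 1D Lennard-Jones curve standing in for an expensive ab-initio surface are all assumptions made here for brevity), a Gaussian-process surrogate can reproduce an energy curve from a handful of precomputed points:

```python
import numpy as np

# Toy sketch: a Gaussian-process surrogate fitted to a few precomputed
# energies. A 1D Lennard-Jones curve stands in for the expensive
# ab-initio surface; kernel and length scale are illustrative choices.

def lj(r):
    """Stand-in for an expensive ab-initio energy calculation."""
    return 4.0 * (r ** -12 - r ** -6)

def rbf(a, b, ell=0.3):
    """Squared-exponential kernel matrix between point sets a and b."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * ell ** 2))

r_train = np.linspace(0.95, 2.5, 12)    # a few precomputed geometries
e_train = lj(r_train)

K = rbf(r_train, r_train) + 1e-8 * np.eye(len(r_train))  # jitter for stability
weights = np.linalg.solve(K, e_train)

def predict(r_new):
    """GP mean prediction of the energy at new separations."""
    return rbf(np.asarray(r_new), r_train) @ weights

r_test = np.array([1.2, 1.5, 2.0])
e_pred = predict(r_test)                # cheap surrogate evaluations
```

Once the weights are computed, each surrogate evaluation costs only a kernel row and a dot product, which is what makes repeated evaluation inside a molecular simulation feasible.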
This project will be jointly supervised by Dr Richard Wheatley in the School of Chemistry.
Partial differential equations (PDEs) on complex domains (e.g., “swiss cheese” domains with holes or inclusions) are hugely relevant in the applied sciences, but they are notoriously difficult to solve numerically! Mimicking concepts from a well-known mathematical description of fluid interfaces, the diffuse-interface (or phase-field) model, the method of diffuse domains reformulates PDEs posed on complex domains over a simpler homogeneous domain, imposing the boundary conditions weakly through a smoothed indicator function. This has several numerical and modelling advantages, and the reformulation has been rigorously proved to converge to the solution on the exact (perforated) domain.
This project will explore this idea for heterogeneous materials and porous media, develop an implementation for advection-diffusion-reaction PDEs, and study the macroscopic limit as the inclusion diameters tend to zero, thereby providing a promising new approach to the homogenisation of PDEs in complex domains.
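As a sketch of the idea (following the diffuse-domain literature; the tanh smoothing profile below is one common choice, not necessarily the one the project will adopt), a Poisson-type problem with no-flux boundary conditions on a complex domain can be approximated on a simple enclosing box:

```latex
% Original problem: \nabla\cdot(D\nabla u) = f in \Omega,
% with D\,\partial_n u = 0 on \partial\Omega.
% Diffuse-domain reformulation on a simple box B \supset \Omega:
\nabla\cdot\bigl(\phi_\varepsilon\, D\, \nabla u\bigr) = \phi_\varepsilon f
\quad \text{in } B,
\qquad
\phi_\varepsilon(x) = \frac{1}{2}\Bigl(1 - \tanh\frac{d(x)}{\varepsilon}\Bigr),
```

where d(x) is the signed distance to the boundary of Ω; as ε → 0 the smoothed indicator φ_ε tends to the characteristic function of Ω, and the solution converges to that of the original problem.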
Glioblastoma is the most aggressive adult brain tumour. Despite current treatment, the disease invariably returns after surgery, with survival sadly measured in months. Scientific evidence indicates that glioblastoma cells at the tumour’s edge, which infiltrate the healthy brain (termed ‘infiltrative margin’), closely resemble the eventual recurrent tumour. This is a crucial clue, as the glioblastoma infiltrative margin reflects disease which cannot be safely removed by surgery, and which ultimately causes the tumour to return. This project will work with local collaborators in the School of Medicine as well as international partners at the Mayo Clinic (Arizona, USA) to develop predictive models for glioblastoma recurrence.
Existing mathematical models for GBM have been shown to be useful in predicting patient survival based on image-derived model parameters. Complementary approaches to these mechanistic models have included machine-learning-based radiomics to fuse spatially localized biopsy and spatially resolved MRI data into predictive maps of molecular heterogeneity in unsampled regions. Subsequent model developments have included tissue-state transitions in physiological features such as vascularity and hypoxia [1,3], and cellular features such as amplifications in certain driver genes (e.g., EGFR, PDGFRA).
A key research question is whether we can build on this progress to make meaningful predictions of tissue-state transitions that occur prior to tumour recurrence, with the aim to substantially extend the quality and quantity of life for patients. Earlier prediction of tissue transitions preceding recurrence, using mechanistic mathematical models supported by patient data, will permit earlier initiation of second line treatment or patient-tailored experimental therapies post-surgery.
This project will develop mathematical models for glioblastoma growth, building on our improved understanding and ability to characterise heterogeneity using biopsy and imaging data. Models will typically take the form of systems of partial differential equations, although there will be scope to explore alternative formalisms such as individual-based and hybrid models. Model analysis will require a blend of mathematical analysis of reduced submodels (where tractable) and computer simulation, parameter inference and sensitivity analysis. As such, this will require a willingness and enthusiasm to engage with complex biological and mathematical concepts, as well as a significant component of computer programming and data analysis.
The primary function of blood vessels is to transport molecules to tissues. In diseases such as cancer and diabetes this transport, particularly of large molecules such as albumin, can be an order of magnitude higher than normal.
The project is to model transient flow of macromolecules across the vascular wall in physiology and pathology. With additional supervision from Dr Kenton Arkill and Professor David Bates (Medicine), the doctoral student will join a team that includes medical researchers, biophysicists and mathematicians acquiring structural and functional data.
Detailed microscale models of vascular wall hydrodynamics and transport properties will be employed; in addition, powerful multiscale homogenisation techniques will be exploited that enable permeability and convection parameters on the nanoscale to be linked through the microscale into translatable information on the tissue scale. Computational simulations will be used to investigate and understand the model behaviour, including, for example, stochastic and multiphysics effects in the complex diffusion-convection nanoscale environment. The project will afford a great opportunity to form an information triangle where modelling outcomes will determine physiological experiments to feedback to the model. Furthermore, the primary results will inform medical researchers on potential molecular therapeutic targets.
Many quantum physical systems (for example superconductors, superfluids, Bose-Einstein condensates) exhibit vortex states that can be described by Ginzburg-Landau type functionals. For various equations of motion for the physical systems, the dynamical behaviour of finite numbers of vortices has been rigorously established. We are interested in studying systems with many vortices (this is the typical situation in a superconductor). In the hydrodynamic limit, one obtains an evolution equation for the vortex density. Typically, these equations are relatives of the Euler equations of incompressible fluids: for the Gross-Pitaevskii equation (a nonlinear Schrödinger equation), one obtains Euler, for the time-dependent Ginzburg-Landau equation (a nonlinear parabolic equation), one obtains a dissipative variant of the Euler equations.
The goal of the project is to study the dissipative equations and to understand instabilities and low regularity solutions.
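For orientation (one commonly derived form; the exact coefficients and signs depend on the chosen scaling and on the sign of the vortices), the limiting vortex-density equations alluded to above take the following shape:

```latex
% Gross-Pitaevskii (conservative) limit:
% incompressible Euler in vorticity form
\partial_t \omega + v\cdot\nabla\omega = 0,
\qquad v = \nabla^{\perp}(-\Delta)^{-1}\omega ;
% Time-dependent Ginzburg-Landau (gradient-flow) limit,
% for vortices of a single sign:
\partial_t \omega = \nabla\cdot\bigl(\omega\,\nabla h\bigr),
\qquad -\Delta h = \omega .
```

The second equation is the dissipative relative of Euler referred to above: the vortex density is advected by the gradient of its own potential rather than by a divergence-free velocity.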
The project involves the development of mechanical models of peatland growth and restoration. Peat is a soft multiphase (solid, liquid, gas) material that stores one third of the Earth's terrestrial carbon. Current models combine mass balance and hydrology, but none considers the mechanical stability of the peat. This is a huge oversight, as the extremely weak multiphase peat body should deform with ease, and this deformation must influence gas emissions and long-term stability. The project will develop novel numerical models of peat growth and of the mechanical response of peat to changes in loading, mass balance and hydrology. The student will have the opportunity to visit peatlands in the UK and Malaysia and to link their work to geospatial observations.
This project is jointly supervised by Dr David Large in the School of Chemical & Environmental Engineering, Dr Bagus Muljadi in the School of Chemical & Environmental Engineering and Professor Neil Crout in the School of Biosciences.
Lung inflammation and airway hyperresponsiveness (AHR) are hallmarks of asthma, but their interrelationship is unclear. Excessive shortening of airway smooth muscle (ASM) in response to bronchoconstrictors is likely an important determinant of AHR. Hypercontractility of ASM could stem from a change in the intrinsic properties of the muscle, or it could be due to extrinsic factors such as chronic exposure of the muscle to inflammatory mediators in the airways, with the latter being a possible link between lung inflammation and AHR. The aim of this project will be to investigate the influence of chronic exposure to a contractile agonist on the force-generating capacity of ASM via a cell-level model. Previous experimental studies have suggested that the muscle adapts to basal tone in response to application of agonist and, over time, is able to regain its contractile ability in response to a second stimulus. This is thought to be due to a transformation in the cytoskeletal components of the cell enabling it to bear force, thus freeing up subcellular contractile machinery to generate more force. Force adaptation in ASM as a consequence of prolonged exposure to the many spasmogens found in asthmatic airways could be a mechanism contributing to the AHR seen in asthma. We will develop and use a cell model in an attempt to either confirm this hypothesis or determine other mechanisms that may give rise to the observed phenomenon of force adaptation.
Fano varieties are one of the most fundamental spaces studied in algebraic geometry. They play an essential role in the Minimal Model Program, and their classification has been an open problem for centuries. Recent advances coming from birational geometry (for which Caucher Birkar - a former Nottingham PhD student - won the Fields Medal in 2018) and from Mirror Symmetry (an area of modern mathematics with its roots in theoretical physics) suggest that a general approach to classification may finally be possible.
Mirror Symmetry predicts a remarkable phenomenon: the numbers given by Gromov-Witten theory for a Fano variety - that is, the counts of the different paths a string can trace as it moves through space - can be reproduced by seemingly unrelated mathematical objects called Laurent polynomials. By understanding how to interpret the mathematics of Laurent polynomials in terms of geometry, we are beginning to learn to see the structure of Fano varieties in new ways. In turn this is revealing previously unexplored commonalities between different areas of mathematics. The main aim of this project is to develop the mathematics behind how Laurent polynomials can be used to classify three-dimensional terminal Fano varieties.
This is an amazingly exciting area of modern algebraic geometry. The ideas are new and the mathematics involved is diverse and beautiful. One of the appealing aspects of this project is that you can get started straight away, producing results quickly and learning the material as you go. There is also the scope to adjust your approach depending on how your interests develop: this can be a very technical, abstract piece of work in deformation theory; it can focus on geometry or on combinatorics; and it can have large computational aspects.
Modern mobile and wireless communication uses ever more sophisticated antenna arrays (such as in massive MIMO (Multiple-Input Multiple-Output) systems) and smart reflecting surfaces to boost communication rates. Ray-tracing is a core tool in the physical modelling of these technologies – and in the very large and complex environments typically encountered, it’s often the only game in town.
This project centres around developing mathematical techniques that leverage ray-tracing simulations to gain as much information as possible about the communication rates that are possible in a given physical environment. This typically involves using the geometry of flows in a corresponding phase space to approximate intrinsically wave-based models of communication. The setting for this activity is directly analogous to “semiclassical” treatments of quantum mechanical wave phenomena, in which we aim to quantify eigenvalues of operators in terms of the properties of classical solutions and the geometry of their phase space.
In particular, communication channels and their signal strengths for a massive MIMO setup can be obtained from an eigenvalue problem that is in some ways analogous to the problem of determining eigenvectors and eigenvalues of a quantum observable, such as the Hamiltonian. We borrow techniques developed in the quantum-mechanical context to glean useful information about these eigenvalues, which then allows us to develop direct approximations of the overall communication rate. It should be noted that, although the foundation of the approach lies in quantum mechanics, an extensive background in this subject is not necessary to start the project. The main ingredient will be an interest in wave phenomena generally and in methods of dealing with the corresponding ray or mechanical phase spaces.
The output of the project will be directly relevant to an EU consortium RISE-6G, in which members of the School collaborate to create technologies in which Reconfigurable Intelligent Surfaces are used to control and direct mobile communication signals.
Project in collaboration with Dr. Bagus Muljadi (Engineering)
Porous media are ubiquitous in natural and engineered transport processes. When colloids or diffusive particles flow through their complex geometrical structure, non-trivial couplings arise between advection, diffusion, particle-particle and particle-wall interactions. These processes can be modelled and simulated with computationally intensive three-dimensional simulations. In this project, a combination of rigorous multiscale analytical and multiscale numerical techniques (such as multiscale Finite Elements) will be used to derive and calibrate faster, simpler models for filtration and adsorption processes. Extensions to include non-linear adsorption isotherms and non-linear reactions will be considered.
The project is part of a wider research effort within the GeoEnergy Research Centre that sees the collaboration of several UK and international academic partners, and industrial partners in the Automotive and Oil & Gas sector.
Whilst the dynamics of the DNA double helix are extremely complicated, a number of well-defined modes of vibration, such as twisting and bending, have been identified. At present the only accurate models of DNA dynamics involve large-scale simulations of molecular dynamics. Such approaches suffer two major drawbacks: they are only able to simulate short strands of DNA, and only for extremely short periods (nanoseconds). The aim of this project is to develop simpler models that describe vibrations of the DNA double helix. The resulting systems of equations will be used to simulate the dynamics of longer chains of DNA over long timescales and, hence, allow larger-scale dynamics, such as the unzipping of the double helix, to be studied.
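As a minimal sketch of the kind of reduced model intended (in the spirit of the well-known Peyrard-Bishop base-pair-opening model; the parameter values and units here are purely illustrative and are not the project's actual model), consider a chain in which each base pair has an opening coordinate in a Morse potential, coupled harmonically to its neighbours:

```python
import numpy as np

# Sketch in the spirit of the Peyrard-Bishop model (illustrative
# parameters): each base pair n has an opening coordinate y[n] in a
# Morse potential V(y) = D * (exp(-a*y) - 1)**2, coupled harmonically
# (stacking constant k) along a periodic chain; unit masses assumed.

N, dt, steps = 100, 1e-3, 5000
D, a, k = 0.04, 4.45, 0.06

def force(y):
    morse = 2.0 * D * a * np.exp(-a * y) * (np.exp(-a * y) - 1.0)
    stacking = k * (np.roll(y, 1) - 2.0 * y + np.roll(y, -1))
    return morse + stacking

rng = np.random.default_rng(0)
y = 0.01 * rng.standard_normal(N)       # small initial openings
v = np.zeros(N)
for _ in range(steps):                  # velocity-Verlet time stepping
    v += 0.5 * dt * force(y)
    y += dt * v
    v += 0.5 * dt * force(y)
```

Large-amplitude solutions of such lattices, in which a run of consecutive y[n] escape the Morse well, are the simplest caricature of local unzipping of the double helix.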
Polymers are very long chain molecules and many of their unique properties depend upon their long chain nature. Like simple fluids many polymer fluids crystallise when cooled. However, the crystallisation process is complicated by the way the constituent chains are connected, leading to a multitude of unexplained phenomena. Furthermore, if a polymer fluid is placed under flow, this strongly affects both the ease with which the polymer crystallises and the arrangement of the polymer chains within the resulting crystal. This project will develop molecular models and simulations for polymer dynamics and phase transitions using a range of analytical, numerical and stochastic techniques, with the ultimate aim of improving our understanding of polymer crystallisation.
Biofilms are communities of microorganisms living in a protective layer of slime, known as EPS (extracellular polymeric substances). They can be both problematic (e.g. contamination of medical implants) and beneficial (e.g. waste water treatment). This project will use mathematical modelling to understand why biofilms expand at different rates when cultured on agar of different water content. Our hypothesis is that EPS production can assist biofilm expansion by pulling water into the biofilm from the agar by osmosis, and that this may also increase nutrient delivery to the biofilm. A background in applied mathematics, physics or engineering will be helpful.
This PhD project will be undertaken jointly between the University of Nottingham and the University of Adelaide, Australia, and will involve spending time at both Universities. Supervision will be joint between Dr Leah Band and Dr Reuben O'Dea at Nottingham and Dr Edward Green and Dr Ben Binder in Adelaide. Please contact us for further information about how to apply!
Food security is the most pressing issue of the century – we desperately need to increase crop production to feed the growing world population in a sustainable manner. Understanding how hormones regulate a plant's responses to environmental conditions represents a crucial target for accelerating the development of high-yielding crops that can withstand climate change.
This PhD will involve developing and analysing ODE models to investigate how post-translational modifications affect the outputs of hormone signalling pathways within plants. These post-translational modifications play a major role in adaptation to environmental conditions, such as plant root branching in response to water availability (as published in Science by our collaborators - see details below). We will first model both the mechanisms of the modification and the signalling network dynamics within single cells, using coupled systems of ODEs. These networks will then be embedded within multicellular geometries to create models to understand how the network dynamics affect patterns within the biological tissue, such as those that control root branching. These models can be studied either numerically or analytically (for example, using asymptotic analysis).
The mathematical models will help interpret data generated through the SumoCode project (www.sumocode.org) in collaboration with molecular biologists Prof Ari Sadanadom at Durham University and Prof Malcolm Bennett at the University of Nottingham.
The spread of so-called superbugs such as MRSA and other Antimicrobial Resistant pathogens within healthcare settings provides one of the major challenges to patient welfare within the UK. However, many basic questions regarding the transmission and control of such pathogens remain unanswered. This project involves stochastic modelling and data analysis using highly detailed data sets from studies carried out in hospital, addressing issues such as the effectiveness of patient isolation, the impact of different antibiotics, the way in which different strains interact with each other, and the information contained in data on high-resolution data (e.g. whole genome sequences).
The efficacy of lymphatic vessels for collecting and pumping lymph from interstitial tissues is key in returning fluid back to the cardiovascular system. Impaired pumping can lead to debilitating conditions such as lymphedema (often as a result of lymph node removal during breast cancer surgery). Upscaling current models of lymphatic pumping to large networks is currently computationally inefficient. This project aims to develop network models of lymphatic vessels that account for the complex active mechanics as well as the valve behaviour of these vessels, to provide deeper insights into how mechanical therapies can be developed to alleviate lymphedema.
When new infections emerge in populations (e.g. Covid-19; SARS; new strains of influenza), no vaccine is available and other control measures must be adopted. This project is concerned with addressing questions of interest in this context, e.g. What are the most effective control measures? How can they be assessed? The project involves the development and analysis of new classes of stochastic models, including intervention models, appropriate for the early stages of an emerging disease.
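For context, the simplest stochastic model of this kind is the Markovian SIR epidemic, which can be simulated exactly with Gillespie's direct method. The sketch below is standard textbook material rather than the new model classes the project will develop; the rates and population size are illustrative.

```python
import numpy as np

# Exact simulation of a Markovian SIR epidemic via Gillespie's direct
# method (standard textbook material; illustrative parameters).
# beta is the infection rate, gamma the removal rate.

def sir_gillespie(n=200, i0=2, beta=1.5, gamma=1.0, seed=1):
    rng = np.random.default_rng(seed)
    s, i, t = n - i0, i0, 0.0
    while i > 0:
        inf_rate = beta * s * i / n       # frequency-dependent mixing
        rem_rate = gamma * i
        total = inf_rate + rem_rate
        t += rng.exponential(1.0 / total) # time to next event
        if rng.random() < inf_rate / total:
            s, i = s - 1, i + 1           # infection event
        else:
            i -= 1                        # removal event
    return n - s                          # final size of the epidemic

final_sizes = [sir_gillespie(seed=k) for k in range(50)]
```

With R0 = beta/gamma above one, repeated runs show the characteristic bimodality of final sizes (early extinction versus a major outbreak), which is exactly the kind of behaviour that matters when assessing control measures for an emerging disease.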
The vibro-acoustic response of mechanical structures (cars, airplanes...) can in general be well approximated in terms of linear wave equations. Standard numerical solution methods comprise the finite or boundary element method (FEM, BEM) in the low frequency regime and so-called Statistical Energy Analysis (SEA) in the high-frequency limit. Major computational challenges are posed by so-called mid-frequency problems - that is, composite structures where the local wave length may vary by orders of magnitude across the components.
Recently, I proposed a set of new methods based on ideas from wave chaos (also known as quantum chaos) theory. Starting from the phase space flow of the underlying - generally chaotic - ray dynamics, the new method, called Dynamical Energy Analysis (DEA), interpolates between SEA and ray tracing, containing both of these methods as limiting cases. Within the new theory, SEA is identified as a low-resolution ray tracing algorithm, and typical SEA assumptions can be quantified in terms of the properties of the ray dynamics. I have furthermore developed a hybrid SEA/FEM method based on random wave model assumptions for the short-wavelength components. This makes it possible to tackle mid-frequency problems under certain constraints on the geometry of the structure.
The PhD project will deal with extending these techniques towards a DEA/FEM hybrid method as well as considering FEM formulations of the method. The work will comprise a mix of analytic and numerical skills and will be conducted in close collaboration with our industrial partners CDH AG, Germany and Jaguar Land Rover, Gaydon, UK.
A meta-material exhibits exotic properties, such as negative refraction, wave cloaking and non-reciprocal response, amongst others. These properties allow one to manipulate the propagation of waves in such a way as to, for example, realise an “invisible cloak”. Constructing meta-materials is not a trivial task, as one needs to judiciously design the material “atom-by-atom” in a periodic or aperiodic fashion using multiple-coupled geometrical and physical material parameters.
The fascinating properties of meta-materials occur at the interface between the regime where a continuum model can be used for the periodic structure (the limit of large wavelength) and the discrete “atomic” limit, where wave interference dominates (the limit of small wavelength). The aim of the project is to study this critical wavelength region using wave models on graph networks. The PhD project will develop such graph network models to study the dispersion relations of periodic meta-materials. The student will be introduced to the relevant electromagnetics theory and graph network techniques, and will study the fundamental dynamics of rays and waves propagating through 1D and 2D meta-material structures. These model electromagnetic meta-surfaces, which are widely used for manipulating electromagnetic wave-fronts for fast signal processing and which can be realised in the laboratory. At a later stage of the project, the model parameters will be adjusted to mimic properties of meta-materials as they are produced in the Centre for Additive Manufacturing in Nottingham. Those structures are relevant for the next generation of electronics components and for the cloaking of 3D objects.
Rechargeable batteries and other energy storage technologies are key elements for reaching a sustainable carbon-free energy market. The design and optimisation of battery packs and cells is still largely based on lengthy and costly experimental campaigns. In this project we will develop a new design approach, based on multiscale analysis, to find the optimal porous electrode structure, including phenomena like ageing and degradation. To achieve this, we will consider a fully resolved multiphase continuum PDE model and analyse the appropriate macroscopic limits. Microscopic electrochemical models usually approximate the complex non-linear electrochemical reaction between electrode and electrolyte with the so-called Butler-Volmer kinetics, an effective law based on local equilibrium assumptions. In this project we will consider simplified geometries and the Poisson-Nernst-Planck system, including double-layer and short-range electrical interactions. Different diffusion models will be considered to describe transport in the solid electrode phase, where recent studies suggest a phase change can happen, causing complex pattern formation and affecting the dynamics of the Solid-Electrolyte Interphase layer.
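The Butler-Volmer kinetics mentioned above take the standard form (with exchange current density j_0, anodic/cathodic transfer coefficients α_a and α_c, Faraday constant F, gas constant R, temperature T, and surface overpotential η):

```latex
j = j_0\left[\exp\!\left(\frac{\alpha_a F \eta}{RT}\right)
- \exp\!\left(-\frac{\alpha_c F \eta}{RT}\right)\right],
\qquad
\eta = \phi_s - \phi_e - U_{\mathrm{eq}},
```

where φ_s and φ_e are the solid and electrolyte potentials and U_eq is the open-circuit potential. It is precisely the local-equilibrium assumptions behind this effective law that the project proposes to revisit via the Poisson-Nernst-Planck system.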
Stochastic (chemical) reaction networks are models for biophysical systems in which each reaction describes a conversion of one or more chemical species into one or more other chemical species. Examples include Michaelis–Menten enzyme kinetic reactions, Togashi–Kaneko autocatalytic reactions, and transcription/translation reactions. At the ecological scale, susceptible-infected-removed (SIR) epidemic models and predator-prey models can also be considered examples of stochastic reaction networks. The goal of this project is to provide rigorous probabilistic justification for various multiscale approximations, such as the quasi-steady-state approximations (QSSAs) derived from deterministic ordinary/partial differential equations. We will also devise efficient simulation algorithms and statistical inference methodologies based on those multiscale approximations.
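As a small worked example of the kind of multiscale approximation in question (the rate constants and concentrations here are illustrative), the full deterministic mass-action enzyme kinetics E + S ⇌ C → E + P can be compared against its quasi-steady-state (Michaelis-Menten) reduction:

```python
# Deterministic Michaelis-Menten QSSA sketch (illustrative parameters):
# integrate the full mass-action ODEs for E + S <-> C -> E + P with a
# small Euler step, alongside the reduced one-variable QSSA model
# ds/dt = -Vmax * s / (Km + s).

k1, km1, k2 = 10.0, 1.0, 1.0          # binding, unbinding, catalysis rates
e0, s0 = 0.05, 5.0                    # total enzyme << substrate: QSSA regime
Km, Vmax = (km1 + k2) / k1, k2 * e0   # Michaelis constant, maximal rate

dt, T = 1e-4, 20.0
s, c = s0, 0.0                        # full model: free substrate, complex
s_q = s0                              # reduced (QSSA) model: substrate only
for _ in range(int(T / dt)):
    ds = -k1 * (e0 - c) * s + km1 * c
    dc = k1 * (e0 - c) * s - (km1 + k2) * c
    s, c = s + dt * ds, c + dt * dc
    s_q += dt * (-Vmax * s_q / (Km + s_q))   # Michaelis-Menten rate law
```

In the regime e0 ≪ s0 the two substrate trajectories remain close after a fast initial transient; making such closeness rigorous for the stochastic counterparts of these models is the kind of question the project addresses.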
Cells respond to their physical environment through mechanotransduction, the translation of mechanical forces into biochemical signals; evoked cell phenotypic changes can lead to an altered cell microenvironment, creating a developmental feedback. Interplay between such mechanosensitive pathways and other inter- and intra-cellular signalling mechanisms determines cell differentiation and, ultimately, tissue development. Such developmental mechanisms have key relevance to the initiation and development of cancer, a disease of such inherent complexity (involving the interaction of a variety of processes across disparate spatio-temporal scales, from intracellular signalling cascades to tissue-level mechanics) that, despite a wealth of theoretical and experimental studies, it remains a leading cause of mortality and morbidity: in the UK, more than one in three people will develop some form of cancer. There is therefore an urgent need to gain greater quantitative understanding of these highly complex systems, which are well-suited to mathematical study.
This project will develop a predictive framework, coupling key signalling pathways to cell- and tissue-level mechanics, to elucidate key developmental mechanisms and their interaction. Investigations will include both multiscale computational approaches, and asymptotic methods for model reduction and analysis. Importantly, model development, analysis and experimental validation will be enabled via close collaboration with Dr Robert Jenkins (Francis Crick Institute, a multidisciplinary biomedical discovery institute dedicated to understanding the scientific mechanisms of living things), thereby ensuring the relevance of the investigations undertaken.
Experience of mathematical/numerical techniques for ODEs and PDEs, the study of nonlinear dynamical systems, or mathematical biology more generally would be an advantage; prior knowledge of the relevant biology is not required.
Multiscale biological models involve coupling cellular and subcellular processes to understand how they impact the emergent growth and development of the organ or organism. These models typically involve coupling multiple smaller models, for example, coupling ODE network models, multicellular models, fluid dynamics or biomechanical growth models. This project will involve developing a novel multiscale model to understand how hormones influence plant growth. This understanding is of fundamental importance to determine how crop yields are affected by environmental stresses, such as drought. Given food security is one of the most pressing issues of this century, such knowledge is essential to developing crops that can sustain high yields despite climate change.
It is well established that plant hormones, such as auxin and GA, orchestrate growth and responses to environmental stress. However, despite its importance, we lack fundamental knowledge of how the overall hormone distribution is regulated, and how this impacts plant growth and responses to environmental conditions such as drought. The relative importance of local synthesis, cell-to-cell transport, and long-distance transport (through advection with water) is unknown. This project will address this question by developing novel multiscale models, including processes such as multicellular models, fluid dynamics, advection-diffusion equations and ODE network modelling.
The models will be developed in collaboration with biologists within Plant Science at the University of Nottingham and with Dr Alexander Jones at the University of Cambridge. Experience with ODEs/PDEs and mathematical biology would be beneficial, although prior knowledge of plants or the specific biology is not required.
Most human tissues are perfused by an evolving network of blood vessels which supply nutrients to (and remove waste products from) the cells. The growth of this network (via vasculogenesis and angiogenesis) is crucial for normal embryonic and postnatal development, and its maintenance is essential throughout our lives (e.g. wound healing requires the repair of damaged vessels). However, abnormal remodelling of the vasculature is associated with several pathological conditions including diabetic retinopathy, rheumatoid arthritis and tumour growth.
The phenomena underlying tissue vascularisation operate over a wide range of time and length scales. These features include blood flow in the existing vascular network, transport within the tissue of blood-borne nutrients, cell division and death, and the expression by cells of growth factors such as VEGF, a potent angiogenic factor. We have developed a multiscale model framework for studying such systems, based on a hybrid cellular automaton which couples cellular and subcellular dynamics with tissue-level features such as blood flow and the transport of growth factors. This project will extend and specialise our existing model to focus on particular applications in one of the following areas: wound healing, retinal angiogenesis, placental development, and corpus luteum growth. This work would require a significant element of modelling, numerical simulation and computer programming.
The number of neurons in the brain is immense (of the order of 100 billion). A popular approach to modelling such cortical systems is to use neural field models which are mathematically tractable and which capture the large scale dynamics of neural tissue without the need for detailed modelling of individual neurons. Neural field models have been used to interpret EEG and brain imaging data as well as to investigate phenomena such as hallucinogenic patterns, short-term (working) memory and binocular rivalry.
A typical formulation of a neural field equation is an integro-differential equation for the evolution of the activity of populations of neurons within a given domain. Neural field models are nonlinear spatially extended pattern forming systems. That is, they can display dynamic behaviour including spatially and temporally periodic patterns beyond a Turing instability in addition to localised patterns of activity. The majority of research on neural field models has been restricted to the line or planar domains; however, the cortical white matter system is topologically close to a sphere. It is relevant to study neural field models as pattern forming systems on spherical domains, particularly as the periodic boundary conditions allow for natural generation (via interference) of the standing waves observed in EEG signals.
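In its simplest (Amari-type) form, such an integro-differential equation for the activity \(u(\mathbf{x},t)\) reads

\[
\frac{\partial u(\mathbf{x},t)}{\partial t} = -u(\mathbf{x},t) + \int_{\Omega} w(\mathbf{x},\mathbf{x}')\, f\big(u(\mathbf{x}',t)\big)\, \mathrm{d}\mathbf{x}',
\]

where \(w\) is the synaptic connectivity kernel, \(f\) a (typically sigmoidal) firing-rate function, and \(\Omega\) the spatial domain, which for this project would be the sphere.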
This project will build on recent developments in neural field theory, focusing in particular on extending to spherical geometry the neural field equations arising from “Next generation neural mass models” (which incorporate a description of the evolution of synchrony within the system). Techniques from dynamical systems theory, including linear stability analysis, weakly nonlinear analysis, symmetric bifurcation theory and numerical simulation will be used to consider the global and local patterns of activity that can arise in these models.
Many wave guides (such as optical fibres) show a Kerr-type effect that leads to nonlinear wave propagation. If th wave guides are coupled at junctions then there is an additional element of complexity due to the non-trivial connectivity of wave guides. In this project the impact of the structure and topology of the network on wave propagation will be studied starting from simple geometries such as a Y-junctions (three waveguides coupled at one junction), a star (many waveguides at one junction), or a lasso (a waveguide that forms a loop and is connected at one point to a second waveguide).
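On each edge of such a network, a Kerr nonlinearity leads, in the standard reduction, to a nonlinear Schrödinger equation for the field envelope \(\psi_e\),

\[
i\,\partial_t \psi_e = -\partial_x^2 \psi_e + g\,|\psi_e|^2 \psi_e,
\]

supplemented by matching conditions at each junction, typically continuity of \(\psi\) together with a Kirchhoff condition \(\sum_e \partial_x \psi_e = 0\) on the derivatives directed into the junction, which conserves flux through the network.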
Tensor products play an important role in representation theory and in several of its applications to mathematical physics. From an algebraic point of view, representations of Hopf algebras, such as quantum groups, give interesting examples of tensor categories. Certain classes of tensor categories, called modular categories, are of particular interest as they can be used to construct 3D topological quantum field theories (TQFTs). In most of the literature, modular categories are fusion, i.e. semisimple. Recently, several of the applications of modular categories have been extended to the non-semisimple case, including constructions of modular functors, mapping class group actions, and 3D TQFTs. Therefore, it is an interesting question to construct more examples of such modular categories that are non-semisimple. Several questions remain to be explored in this direction.
Secondary Supervisors: Sian Bray (Biosciences), Cyril Rauch (Vet School)
With modern genetic sequencing, there is a mass of genetic data available. In determining the effects of genetic variations on physical characteristics (phenotypes), it is appealing to think there is a gene for 'X'. However, in reality, many phenotypic traits depend rather weakly on many genes: small gene effects linked to complex phenotypic traits. The aim of this project is to develop mathematical and statistical tools to extract, from real data, these weak associations between phenotype and genotype. A range of statistical and mathematical modelling tools will be used to analyse the shape of distributions and correlations, using ideas from Shannon information theory. Initial data comes from the Bray lab, who are working on ion uptake and equilibration in Arabidopsis, a model plant whose genome is well-sequenced. (This project is available as part of the BBSRC DTP.)
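As a small illustration of the information-theoretic approach, the sketch below (with a hypothetical toy contingency table, not the Bray-lab data) computes the Shannon mutual information between a discrete genotype class and a binned phenotype; a weak association shows up as a small but non-zero value.

```python
import numpy as np

def mutual_information(counts):
    """Shannon mutual information (in bits) between two discrete
    variables, given their joint contingency table of counts."""
    p = counts / counts.sum()              # joint distribution
    px = p.sum(axis=1, keepdims=True)      # marginal over rows (genotype)
    py = p.sum(axis=0, keepdims=True)      # marginal over columns (phenotype)
    nz = p > 0                             # avoid log(0) on empty cells
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

# Toy table: rows = genotype classes, columns = phenotype bins.
counts = np.array([[30, 20],
                   [20, 30]])
mi = mutual_information(counts)            # small positive value: weak association
```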
The project belongs to the following areas of Mathematics: Stochastic Analysis, Applied Probability, Numerical Analysis.
Numerics for stochastic partial differential equations (SPDEs) is one of the central topics in modern numerical analysis, motivated both by applications and by theoretical study. SPDEs originated in filtering theory and are now widely used to model spatially distributed systems from physics, chemistry, biology and finance acting in the presence of fluctuations. The primary objectives of this project include the construction, analysis and testing of new numerical methods for SPDEs.
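As a minimal illustration of the kind of method involved, the following sketch applies a finite-difference Euler-Maruyama scheme to the stochastic heat equation \(\mathrm{d}u = u_{xx}\,\mathrm{d}t + \sigma\,\mathrm{d}W\) on \([0,1]\) with homogeneous Dirichlet boundary conditions; the grid sizes, noise scaling and parameters are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50                      # interior grid points
dx = 1.0 / (N + 1)
dt = 0.4 * dx**2            # within the parabolic stability limit dt <= dx^2/2
sigma = 0.1
steps = 200

x = np.linspace(dx, 1 - dx, N)
u = np.sin(np.pi * x)       # initial condition

for _ in range(steps):
    lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)
    lap[0] = u[1] - 2 * u[0]        # Dirichlet ghost value u(0) = 0
    lap[-1] = u[-2] - 2 * u[-1]     # Dirichlet ghost value u(1) = 0
    # naive discretisation of space-time white noise
    noise = rng.standard_normal(N) * np.sqrt(dt / dx)
    u = u + dt * lap / dx**2 + sigma * noise
```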
Network data is now routinely available from a variety of applications, including social media, corpus linguistics and neuroimaging. Less common is the study of samples of networks, for example collected over time or at random from a population. The project will take an object oriented data analysis approach, where the first questions of interest are what the data objects are, what space they lie in, and how they are represented in feature space. Networks can be compared by using metrics on the space of graph Laplacians, with the Frobenius norm being used most commonly. We will develop statistical methodology using other metrics, and also develop statistical procedures in the resulting manifolds. Motivating applications include large social and financial datasets from developing economies. Such datasets are often very sparse and very noisy, and so the appropriate handling of uncertainty in the analysis of the networks is paramount.
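A minimal sketch of the baseline comparison mentioned above: two networks on a shared vertex set are compared via the Frobenius norm of the difference of their graph Laplacians (the toy graphs here are purely illustrative).

```python
import numpy as np

def graph_laplacian(A):
    """Combinatorial Laplacian L = D - A of an undirected graph
    given its adjacency matrix A."""
    return np.diag(A.sum(axis=1)) - A

def frobenius_distance(A1, A2):
    """Distance between two networks on a common vertex set, measured
    as the Frobenius norm of the difference of their Laplacians."""
    return np.linalg.norm(graph_laplacian(A1) - graph_laplacian(A2), "fro")

# Two small networks on 3 nodes: a path 0-1-2 and a triangle.
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
tri = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])
d = frobenius_distance(path, tri)
```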
This project will be jointly supervised by James Goulding (N/LAB, Business School).
Asthma is a complex, multifactorial chronic airway disease affecting ~300 million people worldwide. It is characterised by acute episodes of airway hyperresponsiveness (a rapid constriction of the bronchial airways at much lower doses of contractile agonist than in non-asthmatic subjects) and bronchoconstriction. Treatment of these acute events often involves the inhalation of aerosolised pharmaceuticals, such as corticosteroids; however, delivery of these therapies to the target airways is difficult to predict due to the complex dynamics of airway constriction. We have previously developed multi-scale mathematical models of airway smooth muscle (ASM) contraction which show that under certain conditions deep inspirations (DIs) can counteract this bronchoconstriction and allow airways to reopen, but that in others DIs may have a detrimental effect and enhance the constriction [2,3]. The lung, however, consists of tens of thousands of airways, and the interactions between these airways within a branching network can lead to complex emergent phenomena that make predicting outcomes of particular treatments or interventions for specific asthmatic individuals extremely difficult. Thus, to better understand under what conditions deep inspirations, and the subsequent delivery of inhaled treatment, may have a beneficial effect, we propose to develop a computational model that incorporates the detailed active mechanical responses of each individual airway into a coupled model of the whole airway tree.
Working with Dr Carl Whitfield at the University of Manchester, we will develop this coupled airway-network model.
In collaboration with Prof Chris Brightling (a respiratory clinician at the University of Leicester), there will be the opportunity to test and validate these models against existing breath measurement datasets (e.g. forced oscillometry, multiple breath washout, and/or measurements of exhaled volatile compounds) from patient cohorts in longitudinal studies.
Strains of yeast used in the wine fermentation process can be enhanced through directed evolution in lab biostats. Starting strains are exposed to poor nutritional or alcohol-rich environments: those that survive these harsh environments can then be used in subsequent experiments, reinforcing desirable genetic traits. In this project, we will develop stochastic and deterministic mathematical models of the dynamics of yeast strains bred in these conditions. The aim is to optimise the experimental conditions that maximise the evolution of yeast suitable for use in the wine fermentation process. A background in applied mathematics and some knowledge of computer programming will be required.
This PhD project will be undertaken jointly between the University of Nottingham and the University of Adelaide, Australia, and will involve spending time at both Universities. Supervision will be joint between Dr Leah Band and Dr Reuben O'Dea at Nottingham and Dr Ben Binder and Dr Edward Green in Adelaide. Please contact us for further information about how to apply!
L-functions are complex analytic functions that have become central in modern number theory, bridging across arithmetic, geometry, algebra and analysis. They are the starring players in two of the 1-million-dollar Millennium problems: The Riemann Hypothesis and The Birch and Swinnerton-Dyer (BSD) Conjecture.
The Riemann zeta function is the meromorphic function of a complex variable \(s\) defined by \(\zeta(s) = \sum_{n \ge 1} n^{-s}\) for \(\mathrm{Re}(s) > 1\) and extended by analytic continuation. It turns out that the values of this highly transcendental function at negative integers are rational numbers that satisfy beautiful, deep congruence properties modulo powers of any prime number p. This is the starting point for, and the 1-dimensional case of, the study of p-adic L-functions and Iwasawa theory.
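Concretely, for even integers \(n \ge 2\) one has \(\zeta(1-n) = -B_n/n\), with \(B_n\) the \(n\)th Bernoulli number, and the classical Kummer congruences state that for a prime \(p\) and even integers \(m \equiv n \not\equiv 0 \pmod{p-1}\),

\[
(1-p^{m-1})\,\zeta(1-m) \equiv (1-p^{n-1})\,\zeta(1-n) \pmod{p}.
\]

These congruences are exactly what allows the values \(\zeta(1-n)\) to be interpolated by a continuous function of a p-adic variable, the Kubota-Leopoldt p-adic L-function.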
At its heart, Iwasawa theory seeks to find mysterious connections between algebraic objects (e.g. the solutions to algebraic equations) and analytic ones (L-functions). It has led to some of the most remarkable results in modern number theory, for example Kolyvagin's proof of BSD in analytic ranks 0 and 1.
Our understanding of p-adic L-functions is fairly limited beyond dimensions 1 and 2. This project would seek to study p-adic L-functions in higher-dimensional settings.
Interested students are strongly encouraged to get in touch with me via email; I can share more details about Iwasawa theory and p-adic L-functions.
Inflammation, airway hyperresponsiveness and airway remodelling are key characteristics of asthma, but it is unclear how they are interconnected. A recent comprehensive experimental in vivo asthma mouse study, quantifying structural changes and how they relate to the inflammatory state in the airway, has generated an unprecedented amount of data to inform a mechanistic model of airway remodelling in asthma. Some parameter sets are well-defined from experimental data, but others carry high levels of uncertainty in parameter values and model selection. This project seeks to understand how uncertainties in parameter estimation and model selection influence prediction, adding significant value to ongoing experiments and dramatically increasing the predictive power of the mechanistic model.
Phase-field modelling of evolving interfaces (or: how does one effectively model and simulate interfacial phenomena?)
Evolving interfaces are ubiquitous in nature: think of the melting of the polar ice caps, the separation of oil and water, or the growth of cancerous tumours. Two mathematical descriptions exist to model evolving interfaces: those with sharp-interface descriptions, such as parametric and level-set methods, and those with diffuse-interface descriptions, commonly referred to as phase-field models.
Depending on the interest of the student, one of the directions described below (or others) can be addressed. The student is also encouraged to suggest a second supervisor, possibly from another group!
Phase-field modelling is emerging as a promising tool for the treatment of problems with interfaces. The classical description of interface problems requires the numerical solution of partial differential equations on moving domains in which the domain motions are also unknowns, thereby requiring a computational treatment that includes moving meshes. Phase-field modelling may be understood as a methodology to reformulate interface problems as equations posed on fixed domains.
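A canonical example is the Allen-Cahn equation for a phase field \(\phi\),

\[
\partial_t \phi = \Delta \phi - \varepsilon^{-2}\, W'(\phi), \qquad W(\phi) = \tfrac{1}{4}\,(1-\phi^2)^2,
\]

in which \(\phi\) transitions smoothly between the bulk values \(\pm 1\) across a diffuse interface of width of order \(\varepsilon\); in the sharp-interface limit \(\varepsilon \to 0\) the level set \(\{\phi = 0\}\) evolves by mean-curvature flow.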
This project will explore the application of phase-field modelling to interfaces coated with surfactants in multi-phase systems. Surfactants are surface active agents that are known to affect the surface tension of interfaces, and their presence can significantly alter the dynamics of flows. They are used in a range of processes and applications such as foam fabrication, emulsification or mixing. Of particular interest is the case when the surfactant is soluble in the bulk of the fluid, especially when the bulk concentration exceeds the critical micelle concentration, beyond which the molecules start forming multi-molecule aggregates called micelles. One of the aims of this project is to perform a formal asymptotic analysis in order to connect phase-field models to sharp-interface models in the appropriate limit, thereby discovering new model descriptions and computational approaches to multi-phase flows with soluble surfactants. Special cases might include the study of thin films and/or film rupture.
Phase-reduced models, where oscillator dynamics are reduced to the dynamics of their phase on the limit cycle, have been extremely successful in describing the dynamical behaviours of networks of coupled oscillators in the case where individual oscillators possess a strongly attracting limit cycle and coupling between oscillators in the network is weak. However, for many biological (and particularly neural) oscillator networks these modelling assumptions are not appropriate; numerical simulations and results from other modelling techniques have revealed dynamics that cannot be captured in a framework where only the phases of oscillators are considered.
The failure of phase-reduced models to describe the known dynamics raises the question of whether an alternative framework, one that also tracks some notion of distance from the limit cycle along with the phase on it, can more accurately describe observed dynamics. Such a framework, using phase-isostable coordinates, has recently been proposed [3,4], but its ability to describe a range of observed network behaviours has yet to be thoroughly investigated. The aims of this project will be to first investigate conditions under which the phase-isostable framework can accurately characterise the response of single oscillators to external forcing. After making any required refinements to the framework, network equations in phase-isostable coordinates will be developed and used to investigate network dynamics beyond the weak coupling limit. The network analysis will utilise any symmetries of the network dynamical system, allowing for prediction of when certain cluster and chimera states may exist. It is anticipated that the phase-isostable paradigm will also be able to reveal a rich variety of more complex dynamical network behaviour.
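Schematically, in the phase-isostable framework an oscillator subject to weak forcing \(\varepsilon g(t)\) is described, to leading order, by

\[
\dot\theta = \omega + \varepsilon Z(\theta)\,g(t), \qquad \dot\psi = \kappa\,\psi + \varepsilon I(\theta)\,g(t),
\]

where \(\theta\) is the phase, \(\psi\) the isostable coordinate measuring distance from the limit cycle, \(Z\) and \(I\) the phase and isostable response curves, and \(\kappa\) the slowest non-trivial Floquet exponent of the cycle.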
Applied mathematics and statistics traditionally use very different types of modelling framework. In applied maths, models tend to be based upon physical laws of nature and first principles: our ability to model relies upon having a sound scientific understanding of the phenomena being modelled. In contrast, statistical models tend to be based on data and observed correlations: our ability to build accurate models relies upon having enough high-quality data.
This project will focus on developing machine learning models that combine scientific knowledge and empirical learning. In the first instance we will focus on incorporating differential equations into Gaussian process models (which are a key non-parametric modelling framework widely used in statistics and machine learning). This is an area of rapidly growing interest within machine learning, particularly as the tech companies (Google, Microsoft, Uber etc) realise that to make progress in some problems requires models that can meld data and expert knowledge.
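The flavour of the baseline framework can be seen in a few lines of standard Gaussian process regression with a squared-exponential kernel (the data and hyperparameters below are illustrative). Physical knowledge then enters by replacing this generic kernel with one derived by pushing a base kernel through the differential operator of interest, as in latent force models.

```python
import numpy as np

def rbf(x1, x2, ell=0.2, var=1.0):
    """Squared-exponential covariance between two 1-D input arrays."""
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

# Standard GP regression: condition a zero-mean prior on noisy data.
rng = np.random.default_rng(1)
X = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * X) + 0.05 * rng.standard_normal(10)

K = rbf(X, X) + 0.05**2 * np.eye(10)        # training covariance + noise
Xs = np.linspace(0.0, 1.0, 50)              # test inputs
mean = rbf(Xs, X) @ np.linalg.solve(K, y)   # posterior predictive mean
```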
The project will develop the underlying mathematics, using exemplar problems to guide this development.
Banach function algebras are complete normed algebras of bounded continuous, complex-valued functions defined on topological spaces. There are very many different examples with a huge variety of properties. Two contrasting examples are the algebra of all continuous complex-valued functions on the closed unit disc, and the subalgebra of this algebra consisting of those functions which are continuous on the closed disc and analytic on the interior of the disc. In the second of these algebras, any function which is zero throughout some non-empty open set must be constantly zero. This is very much not the case in the bigger algebra: indeed Urysohn’s lemma shows that for any two disjoint closed subsets of the closed disc, there is a continuous, complex-valued function defined on the disc which is constantly 0 on one closed set and constantly 1 on the other (algebras of this type are called regular algebras).
Most Banach function algebras have some features in common with one or the other of these two algebras. The aim of this project is to investigate a variety of conditions (including regularity conditions) for Banach function algebras, to relate these conditions to each other, and to other important conditions that Banach function algebras may satisfy, and to investigate the preservation or introduction of these conditions when you form various types of extension of the algebras (especially ‘algebraic’ extensions such as Arens-Hoffman or Cole extensions).
Quantum graphs are a paradigm model for understanding and analysing the effect of complexity on wave propagation and excitations in a network of wires. They have also been used to understand topics in quantum and wave chaos, where the complexity has a different origin while the mathematical framework is to a large extent analogous.
Many properties of the waves that propagate through such a network can be described in terms of trajectories of a point particle that propagates through the network. The idea is to write a property of interest as a sum over amplitudes (complex numbers) associated with all possible trajectories of the point particle. These sums remain challenging objects for explicit evaluation. Recently a number of advanced methods for their summation have been introduced, built on so-called pseudo-orbits. In this project these methods will be developed further and applied to questions related to quantum chaos and random-matrix theory.
Dr. Pumplün currently studies forms of higher degree over fields, i.e. homogeneous polynomials of degree d greater than two (mostly over fields of characteristic zero or greater than d). The theory of these forms is much more complex than the theory of homogeneous polynomials of degree two (also called quadratic forms). Partly this can be explained by the fact that not every form of degree greater than two can be “diagonalized”, as is the case for quadratic forms over fields of characteristic not two. (Every quadratic form over a field of characteristic not two can be represented by a matrix which only has non-zero entries on its diagonal, i.e. is diagonal.)
A modern uniform theory for these forms, such as exists for quadratic and symmetric bilinear forms (cf. the standard reference books by Scharlau or Lam), seems to be missing, or only exists to some extent. Many questions which were settled for quadratic forms quite some time ago are still open as soon as one looks at forms of higher degree. It would be desirable to obtain a better understanding of the behaviour of these forms. First results have been obtained. Another related problem would be whether one can describe forms of higher degree over algebraic varieties, for instance over curves of genus zero or one.
Dr. Pumplün is also studying nonassociative algebras over rings, fields, or algebraic varieties. Over rings, these algebras are finitely generated as modules over the base ring. Their algebra structure, i.e. the multiplication, is given by any bilinear map such that the distributive laws are satisfied. In other words, the multiplication is no longer required to be associative, as is usually the case when one talks about algebras. Her techniques for investigating certain classes of nonassociative algebras (e.g. octonion algebras) include elementary algebraic geometry. One of her next projects will be the investigation of octonion algebras and of exceptional simple Jordan algebras (also called Albert algebras) over curves of genus zero or one. Results on these algebras would also imply new insights into certain algebraic groups related to them.
Another interesting area is the study of quadratic or bilinear forms over algebraic varieties. There are only a few varieties of dimension greater than one for which the Witt ring is known. One well-known result is due to Arason (1980): the Witt ring of projective space is always isomorphic to the Witt ring of the base field.
If you want to investigate algebras or forms over algebraic varieties, this will always involve the study of vector bundles on that variety. However, even for algebraically closed base fields it is very rare to have an explicit classification of the vector bundles. Hence, most known results on quadratic (or symmetric bilinear) forms are about the Witt ring of quadratic forms, e.g. the Witt ring of affine space, of projective space, or of elliptic or hyperelliptic curves. An explicit classification of symmetric bilinear spaces is in general impossible because it would involve an explicit classification of the corresponding vector bundles (which admit a form). There are still lots of interesting open problems in this area, both easier and very difficult ones.
From the beginning of the 20th century it was observed that quadratic forms over a given field carry a lot of information about that field. This led to the creation of rich and beautiful Algebraic Theory of Quadratic Forms that gave rise to many interesting problems. But it became apparent that quite a few of these problems can hardly be approached by means of the theory itself. In many cases, solutions were obtained by invoking arguments of a geometric nature. It was observed that one of the central questions on which quadratic form theory depends is the so-called "Milnor Conjecture". This conjecture, as we now understand it, relates quadratic forms over a field to the so-called motivic cohomology of this field. Once proven, this would provide a lot of information about quadratic forms and about motives (algebro-geometric analogues of topological objects) as well. The Milnor Conjecture was finally settled affirmatively by V. Voevodsky in 1996 by means of creating a completely new world, where one can work with algebraic varieties with the same flexibility as with topological spaces. Later, this was enhanced by F. Morel, and now we know that quadratic forms compute not just the cohomology of a point in the "algebro geometric homotopic world", but also the so-called stable homotopy groups of spheres as well. It is thus no wonder that these objects indeed have nice properties.
Therefore, by studying quadratic forms, one actually studies the stable homotopy groups of spheres, which should shed light on the classical problem of computing such groups (one of the central questions in mathematics as a whole). So it is fair to say that the modern theory of quadratic forms relies heavily on the application of motivic topological methods. On the other hand, the Algebraic Theory of Quadratic Forms provides a possibility to view and approach the motivic world from a rather elementary point of view, and to test the new techniques developed there. This makes quadratic form theory an invaluable and easy access point to the forefront of modern mathematics.
Graphs consist of \(V\) vertices connected by \(B\) bonds (or edges). They are used in many branches of science as simple models for complex structures. In mathematics and physics one is strongly interested in the eigenvalues of the \(V \times V\) connectivity matrix \(C\) of a graph. The matrix element \(C_{ij}\) is defined to be the number of bonds that connect the \(i\)th vertex to the \(j\)th vertex. In this PhD project the statistical properties of the connectivity spectra of (generally large) graph structures will be analysed using methods known from quantum chaos. These methods have only recently been extended to combinatorial graphs (Smilansky, 2007) and allow one to represent the density of states and similar spectral functions of a graph as a sum over periodic orbits. The same methods have been applied successfully to metric graphs and quantum systems in the semiclassical regime for more than two decades.
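A minimal numerical experiment in this direction: compute the spectrum of the connectivity matrix of a random (Erdős-Rényi) graph and its nearest-neighbour eigenvalue spacings, the basic objects whose statistics are compared with random-matrix predictions. The graph size and edge probability below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
V, p = 200, 0.1                      # number of vertices, edge probability

# Symmetric 0/1 connectivity matrix with no self-loops.
A = (rng.random((V, V)) < p).astype(float)
C = np.triu(A, 1)
C = C + C.T

eigs = np.linalg.eigvalsh(C)         # real spectrum of the symmetric matrix
spacings = np.diff(np.sort(eigs))    # nearest-neighbour level spacings, the
                                     # statistic studied in quantum chaos
```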
Photons do not directly interact with each other, but effective interactions between them can be obtained by exploiting the mediation of matter. Perhaps the most common systems in which these effective interactions are achieved are nonlinear crystals. The sophistication of modern experiments, however, allows us to consider the mediation of quantum systems such as single atoms and optomechanical devices, where light is coupled to the vibrational degrees of freedom of a mesoscopic mirror via radiation pressure.
For example, in typical experimental conditions, the interaction between light and a moving mirror (both of them being harmonic oscillators) is approximated well by "bilinear" Hamiltonians, such that when a photon is absorbed (emitted) an elementary excitation (phonon) of the oscillating mirror is created (annihilated). Truly nonlinear contributions that describe e.g. the conversion between two photons and one phonon are typically negligible. Such non-linear interactions (i.e. non-bilinear Hamiltonians) are, however, of great importance for many applications of photonics. To give a paradigmatic example, it has been shown that purely quadratic Hamiltonians are not sufficient to achieve universal quantum computation: a form of ‘nonlinearity’ is always required.
By exposing a quantum system to a periodic control field one can change its properties substantially. For example, if a vibrating mirror or atom is manipulated appropriately, then its interactions with other systems may be modified and controlled to a significant extent. In turn, this may permit us to effectively enhance the non-linear photonic processes mediated by these systems. The goal of the project is indeed to identify optimally designed control sequences that realise robust non-linear interactions between photons.
You would start the project by studying an interaction between two modes of a resonator (cavity) mediated by an atom that is modelled by a ground state and two excited states (known as a V-system). Each of the atomic transitions is close to resonance with a cavity mode. The goal of this sub-project would be to identify time-dependent control fields for the atom that result in nonlinear effective Hamiltonians for the light fields. While the project will involve some numerical work, this will be complemented and guided by various analytical techniques for the approximate description of time-dependent Hamiltonians.
In the rest of your PhD project you will investigate a variety of quantum systems and assess their capability of mediating photonic interactions. Optomechanical systems will likely be your object of study, and it is probable that other promising research directions will have arisen during the first few months. Since oscillating mirrors are mesoscopic objects, they typically feature substantial dissipation that cannot be neglected. You would thus need to study the quantum control of dissipative systems, which requires a more general theoretical framework than the purely Hamiltonian dynamics discussed earlier [3, 4]. Depending on your preliminary results and inclinations, you may identify a variety of long-term goals for this project. These may include the creation, verification and exploitation of light-matter entanglement, or the use of the achieved non-linear interactions for fundamental tests of quantum mechanics (e.g. the study of macroscopic superpositions and their implications for gravitational collapse models) [3, 4].
This project will involve close collaboration with Dr. Florian Mintert of the Controlled Quantum Dynamics theory group, Imperial College London.
Statistical inference and learning play an increasing role in Quantum Engineering and Quantum Metrology. The efficient statistical reconstruction of quantum states is a crucial enabling tool for current quantum engineering experiments in which multiple qubits can be prepared in exotic states. However, standard estimation methods such as maximum likelihood become practically unfeasible for systems of merely 10 qubits, due to the exponential growth in size of the Hilbert space.
The aim of this project is to develop mathematical theory and investigate new methods for learning quantum states of large-dimensional quantum systems. This stems from ongoing collaborations with Theo Kypraios and Ian Dryden (Statistics group, Nottingham), Cristina Butucea (Université Paris-Est), Michael Nussbaum (Cornell), Jonas Kahn (Toulouse) and Richard Kueng (Caltech). In [1,2] we proposed and analysed faster estimation methods with close to optimal accuracy. The first goal is to better understand the behaviour of the estimators with respect to different measurement scenarios. Next, we would like to equip them with reliable confidence regions (error bars), which are crucial for experimental applications. Going beyond "full state tomography", new methods are needed which are able to "learn" the structure of the quantum state by making use of prior information encoded in physically relevant low-dimensional models. Possible directions to be explored include models based on matrix product states, neural networks, quantum time series, compressed sensing, and the study of the asymptotic structure of the statistical models. The project will involve both theoretical and computational work at the overlap between quantum information theory and modern statistical inference.
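The exponential obstruction mentioned above is easy to make concrete: an \(n\)-qubit state is a \(2^n \times 2^n\) density matrix, so full tomography must estimate \(4^n - 1\) real parameters.

```python
def state_parameters(n_qubits):
    """Number of real parameters of an n-qubit density matrix: a d x d
    Hermitian matrix with unit trace, d = 2**n, has d**2 - 1 free real
    parameters, i.e. 4**n - 1."""
    d = 2 ** n_qubits
    return d * d - 1

# Already over a million numbers to estimate at 10 qubits.
params = {n: state_parameters(n) for n in (1, 5, 10)}
```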
More information on this topic, including illustrations of different types of estimators, together with details of my research interests, can be found on my homepage.
Optomechanics investigates the interaction of quantised light (photons) with microscopic vibrating objects such as mirrors, dielectric membranes or levitated nanoparticles. Such interaction takes place via radiation pressure, a phenomenon initially predicted by Johannes Kepler in 1619 in the context of astronomy, and nowadays observed even at the single-photon level. Among other applications, optomechanics embodies a promising experimental platform to probe quantum effects in massive objects, and hence investigate the classical/quantum boundary.
The student will initially review a widely used effective Hamiltonian for cavity optomechanics (the "linear model"), which is analytically solvable and predicts the generation of nonclassical states of light as well as light-matter entanglement. Subsequently, they will delve into the more rigorous canonical quantisation of an optomechanical system and assess the crucial limitations of the more basic model. Unfortunately, the more rigorous "microscopic" Hamiltonian is currently intractable both analytically and numerically, so that novel approximation techniques will need to be developed to improve the basic model while retaining computability. A preliminary study in this direction can be found in the literature.
The broad objective of the project will be to develop new optomechanical models that strike an optimal balance between reliability and tractability. These will then be used to verify and refine a number of theoretical predictions that have been made in the literature, typically based on the linear model alone. Such predictions pertain to a variety of applications of optomechanical systems, ranging from quantum information science to gravitational wave detection and Planck-scale physics. At the same time, an improved theoretical description will give us an opportunity to explore new physical effects and applications of these systems.
A further ambitious goal of the project will be to develop a rigorous open quantum system model for optomechanics, improving on the phenomenological approaches that are currently used in the literature.
Depending on the inclinations of the student, more emphasis can be put on either analytical or numerical work (e.g. via Matlab, Python or Mathematica).
The emergence of quantum information theory in the last three decades has led to a crucial reassessment of quantum effects such as superposition and entanglement: from poorly understood and even paradoxical concepts, these are now regarded as fundamental ingredients to achieve tasks otherwise impossible within the realm of classical physics, thus enabling a wealth of innovative technologies. At the core of this revolution lies the formalisation and characterisation of such phenomena as physical resources. Initiated with quantum entanglement, and successfully applied e.g. to purity, coherence, and informational nonequilibrium in thermodynamics, this application-driven viewpoint motivates the formulation of resource theories, that is, quantitative theories capturing the resource character of physical traits in a mathematically rigorous fashion. General studies on resource theories have shown that all quantum states which do not belong to a convex subset of "free" states can be regarded as resources, in the sense that they provide a quantifiable advantage in operational tasks such as channel discrimination.
This project aims to advance the current frontiers of knowledge on resource theories for quantum phenomena and physics more broadly. In particular, possible directions include:
The project deals with properties of quantum networks, that is, of networks on which unitary (wave-) evolution takes place along edges with scattering at the vertices. Such systems have been studied in the context of quantum information as well as in quantum chaos. It has been noted that a quadratic speed-up of quantum random walks on these networks over classical random walks can be found on certain graphs, with applications for search algorithms and search engines. The speed-up has often been traced back to wave interference effects due to symmetries in the quantum propagation.
Recently, it has been shown that quantum searching can also be undertaken on random graphs, that is, graphs for which connections between vertices are present only with a certain probability - so-called Erdős-Rényi graphs. We will explore this new set-up for quantum searching and develop statistical models for the arrival times and success probabilities, as well as extend the model to realistic graph set-ups.
The project will be a good mix of graph theory and quantum mechanics, and will require both analytical and numerical skills.
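As a minimal numerical illustration of the setting (graph size, edge probability and evolution time below are arbitrary), the sketch samples an Erdős-Rényi adjacency matrix and evolves a continuous-time quantum walk exp(-iAt) on it via an eigendecomposition:

```python
import numpy as np

rng = np.random.default_rng(1)

def erdos_renyi_adjacency(n, p):
    """Symmetric adjacency matrix of a G(n, p) random graph, no self-loops."""
    upper = rng.random((n, n)) < p
    A = np.triu(upper, k=1)
    return (A + A.T).astype(float)

def ctqw_probabilities(A, t, start):
    """Vertex occupation probabilities of a continuous-time quantum walk:
    apply exp(-iAt) to the basis state localised at `start`."""
    w, V = np.linalg.eigh(A)                      # A is real symmetric
    U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T
    psi = U[:, start]                             # evolved initial state
    return np.abs(psi) ** 2

A = erdos_renyi_adjacency(50, 0.2)
p = ctqw_probabilities(A, t=1.0, start=0)
print(p.sum())  # unitarity: probabilities sum to 1
```

Tracking such probabilities at a marked vertex, over many graph realisations, is one way the arrival-time and success-probability statistics mentioned above could be explored numerically.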
A primary limitation on the resolution of all optical devices is imposed by diffraction at the rim of the objective lens. Although this limit has been known for 150 years and appeared insurmountable, modern optics has shown that it can be beaten by extracting not just the intensity distribution in the image plane, as in conventional direct imaging, but the correlation of electromagnetic field amplitudes at different transverse positions.
The main goal of this PhD project is to discover the ultimate physical limits of resolution and develop an imaging technology able to approach them in realistic conditions and at all scales. The project will utilise theoretical tools from quantum estimation theory to investigate fundamental bounds on the precision with which a discrete or continuous set of parameters characterising an image can be simultaneously estimated by means of optimal detection strategies, and investigate to what extent artificial intelligence methods can further enhance the image reconstruction process in the presence of external noise and imperfections.
Quantum Technology is a fast developing field which aims to harness quantum phenomena such as entanglement and superposition in a broad array of applications ranging from secure communication and faster computation to high precision metrology and imaging.
Building a successful quantum device relies on the ability to prepare quantum systems in specifically designed states, and to accurately manipulate and measure the different components of the device.
Since quantum measurements are intrinsically stochastic, statistical inference plays a key role, enabling the experimenter to interpret the measurement data and validate the functioning of the device.
An important component of the quantum engineering toolbox is quantum tomography: the estimation of unknown quantum states based on random outcomes of measurements performed on identically prepared quantum systems. Although many estimation methods have been explored theoretically and experimentally, there is currently a need for new techniques to deal with inference for high dimensional quantum states.
This PhD project aims to develop efficient methods for computing point estimators and confidence regions for multipartite quantum states. In particular, we will be interested in statistical models which take into account prior information about the state, in the form of correlation structure, rank, or symmetry. The project involves both theoretical and computational work; prior knowledge of quantum mechanics is beneficial but is not an absolute requirement. The PhD student will work together with Dr Madalin Guta and Dr Theo Kypraios, who have leading expertise in statistical aspects of quantum theory and have developed a range of computational tools for quantum tomography, see [1,2,3,4]. The project builds on the group's prior work in the field and will benefit from external collaborations on both theoretical and practical aspects.
Banach function algebras are complete normed algebras of bounded, continuous, complex-valued functions defined on topological spaces. There are very many different examples with a huge variety of properties. Two contrasting examples are the algebra of all continuous complex-valued functions on the closed unit disc, and the subalgebra of this algebra consisting of those functions which are continuous on the closed disc and analytic on the interior of the disc. In the second of these algebras, any function which is zero throughout some non-empty open set must be constantly zero. This is very much not the case in the bigger algebra: indeed Urysohn’s lemma shows that for any two disjoint closed subsets of the closed disc, there is a continuous, complex-valued function defined on the disc which is constantly 0 on one closed set and constantly 1 on the other (algebras of this type are called regular algebras).
Most Banach function algebras have some features in common with one or the other of these two algebras. The aim of this project is to investigate a variety of conditions, especially regularity conditions, for Banach function algebras, and to relate these conditions to each other, and to other important conditions that Banach function algebras may satisfy.
Regularity conditions have important applications in several areas of functional analysis, including automatic continuity theory and the theory of Wedderburn decompositions. There is also a close connection between regularity and the theory of decomposable operators on Banach spaces.
Understanding the mechanisms underlying patterns of SARS-CoV-2 transmission is important for informing strategic public health measures against respiratory pathogens more widely. Universities were identified as likely to support rapid transmission of a new pathogen such as SARS-CoV-2 in their usual mode of operation. This project will build on expertise developed during the SARS-CoV-2 pandemic to address key outstanding questions about the degree to which outbreaks can be mitigated while maintaining in-person activities, the extent to which institutional outbreaks are coupled to the epidemic in the surrounding community, and the role of demographic turnover and other seasonal effects.
In contrast to dynamics at the population-level, transmission dynamics in a university setting may be strongly influenced by the local contact network. This project will take a data-driven approach to motivate and construct a network or multitype epidemic model for a university, with opportunity to calibrate this model based on a variety of epidemiological data related to SARS-CoV-2 testing. This modelling may then be extended to explore a number of related open questions, such as the role of waning immunity on dynamics in endemic phases of viral circulation and/or the dynamics of respiratory disease transmission in other institutional settings.
The project belongs to the following areas of Mathematics: Stochastic Analysis, Applied Probability, Statistics, Machine Learning and Numerical Analysis.
State-of-the-art applications in molecular dynamics (e.g., simulations of constrained behaviour between bonds), biomedical imaging (e.g., predicting tumour trajectory), language models in Natural Language Processing (e.g., optimising over constrained matrices), and many more, demand development of scalable, efficient computational and statistical tools that exploit the geometry of parameter spaces. At the core of such development is the ability to sample from probability distributions on geometric spaces such as manifolds.
This project is devoted to the development of novel numerical methods for ergodic stochastic differential equations on manifolds (e.g., 3D rotations) and more general non-Euclidean spaces (e.g., stratified spaces), together with their stochastic numerical analysis, with a view towards using them to sample from distributions on such spaces, thus facilitating statistical inference and optimisation.
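As a minimal sketch of sampling on a manifold (a projected Euler-Maruyama scheme for Brownian motion on the unit sphere; the step size and horizon are illustrative, and real schemes for this project would be more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(2)

def tangent_projection(x):
    """Orthogonal projection onto the tangent plane of the unit sphere at x."""
    return np.eye(3) - np.outer(x, x)

def projected_euler_sphere(x0, n_steps, dt, sigma=1.0):
    """Euler-Maruyama for Brownian motion on S^2: take a tangential Euler
    step, then retract back onto the sphere by normalisation."""
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for _ in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=3)
        x = x + sigma * tangent_projection(x) @ dw
        x = x / np.linalg.norm(x)      # retraction keeps iterates on the manifold
        path.append(x.copy())
    return np.array(path)

path = projected_euler_sphere([0.0, 0.0, 1.0], n_steps=1000, dt=1e-3)
print(np.abs(np.linalg.norm(path, axis=1) - 1).max())  # stays on the sphere
```

The project-relevant questions (order of convergence, ergodicity of the numerical scheme, behaviour on stratified spaces) start where this toy example ends.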
We require an enthusiastic graduate with a 1st class degree in Mathematics, preferably at MMath/MSc level (in exceptional circumstances a 2:1 class degree, or equivalent, can be considered). We are expecting that the successful applicant has a very good background in Probability, good computational skills and some knowledge of differential geometry.
For any enquiries please email: Karthik.Bharath@nottingham.ac.uk and Michael.Tretyakov@nottingham.ac.uk
Relevant publications: Milstein, G.N. and Tretyakov, M.V. (2021) Stochastic Numerics for Mathematical Physics. Series: Scientific Computation, Springer. Web pages: https://karthikbharath.github.io/ and http://www.maths.nott.ac.uk/personal/pmzmt.
Molecular Beam Epitaxy is a process by which single atoms are slowly deposited on a surface. These atoms diffuse around the surface until they collide with a cluster or another atom and become part of a cluster. Clusters remain stationary. The distribution of cluster sizes can be measured, and is observed to exhibit self-similarity. Various systems of equations have been proposed to explain this scaling behaviour. The purpose of this project is to analyse these systems of differential equations to verify the observed scaling laws and predict the shape of the size distribution. The relationship of the equations with other models of deposition, such as reactions on catalytic surfaces and polymer adsorption onto DNA, will also be explored.
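A minimal sketch of rate equations of this type (a point-island model with made-up parameter values, integrated by forward Euler; the project's models are richer, resolving the full cluster-size distribution):

```python
def point_island_model(F=1.0, D=1e4, t_end=1.0, dt=1e-5):
    """Mean-field rate equations for irreversible island nucleation,
    with n1 = monomer density and N = total island density:
        dn1/dt = F - 2*D*n1**2 - D*n1*N   (deposition, nucleation, capture)
        dN/dt  = D*n1**2                  (two monomers nucleate an island)
    Integrated with forward Euler; a stiff ODE solver would be preferable."""
    n1, N = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        dn1 = F - 2 * D * n1**2 - D * n1 * N
        dN = D * n1**2
        n1 += dt * dn1
        N += dt * dN
    return n1, N

n1, N = point_island_model()
print(n1, N)  # monomer density saturates while island density grows
```

Extending such systems to track the density of islands of each size, and extracting the self-similar scaling form of the solution, is the analytical core of the project.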
A class of semi-parametric discrete time series models of infinite order, in which we are able to specify the marginal distribution of the observations in advance and then build their dependence structure around it, can be constructed via an artificial process termed a Latent Branching Tree (LBT). Such a class of models can be very useful in cases where data are collected over a long period, and it might be relatively easy to indicate their marginal distribution but much harder to infer their correlation structure. The project is concerned with the development of such models in continuous time, as well as with developing efficient methods for making Bayesian inference for the latent structure and the model parameters. The application of such models to real data would also be of great interest.
The random deposition of particles onto a surface is a process which arises in many subject areas, and determining its efficiency in terms of the coverage attained is a difficult problem.
In one dimension the problem can be viewed as asking how many cars can be parked along a road of a certain length; this problem is similar to one arising in gene therapy, in which polymers need to be designed to package and deliver DNA into cells.
Here one wishes to know the coverage obtained when one uses a variety of polymer lengths to bind to strands of DNA.
The project will involve the solution of recurrence relations, and differential equations, by a mixture of asymptotic techniques and stochastic simulations.
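A minimal stochastic simulation of the one-dimensional parking problem (the road length below is illustrative; the jammed coverage approaches Rényi's parking constant, approximately 0.7476) could look like:

```python
import random

def parking_coverage(road_length, car_length=1.0, seed=0):
    """Random sequential adsorption in 1D ("car parking"): cars of unit
    length park at uniform random admissible positions until no remaining
    gap can accept another car. Returns the jammed coverage fraction."""
    random.seed(seed)
    gaps = [(0.0, road_length)]            # intervals that can still fit a car
    parked = 0
    while gaps:
        # choose a gap with probability proportional to its admissible length,
        # equivalent to uniform arrivals conditioned on acceptance
        weights = [b - a - car_length for a, b in gaps]
        r = random.uniform(0.0, sum(weights))
        for i, w in enumerate(weights):
            if r <= w:
                break
            r -= w
        a, b = gaps.pop(i)
        x = a + min(r, b - a - car_length)  # left end of the new car
        parked += 1
        for lo, hi in ((a, x), (x + car_length, b)):
            if hi - lo > car_length:        # sub-gap can still fit a car
                gaps.append((lo, hi))
    return parked * car_length / road_length

coverage = parking_coverage(1000.0)
print(coverage)  # close to Rényi's parking constant 0.7476
```

Comparing such simulations with the asymptotic solution of the corresponding recurrence relations is exactly the kind of analysis the project combines.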
The localisation of energy and its transport is of great physical interest in many applications. The mechanisms by which this occurs have been widely studied in one-dimensional systems; however, in two- and three-dimensional systems a greater variety of waves and wave phenomena can be observed; for example, waves can be localised in one or both directions.
This project will start with an analysis of the nonlinear Schrödinger equation (NLS) in higher space dimensions, and with more general nonlinearities (that is, not just $\gamma=1$). Current interest in the Bose-Einstein condensates being investigated in the School of Physics and Astronomy at Nottingham makes this topic particularly timely and relevant.
The NLS equation also arises in the study of astrophysical gas clouds, and in the reduction of other nonlinear wave equations using small-amplitude asymptotic expansions, for example the reduction of the equations of motion for atoms in a crystal lattice. This application is particularly intriguing since the lattice structure defines special directions, which numerical simulations show are favoured by travelling waves; the motion of a wave through a hexagonal arrangement of atoms will also differ from that through a square array. The project will apply a combination of theoretical and numerical techniques to the study of such systems.
A number of fascinating and important biological processes involve various kinds of spatial patterns: for instance, patterns on animal skins, or the very regular organ arrangements found in plants (called phyllotaxis). These patterns often originate at very small scales, and their onset can only be seen using very recent microscopy and image analysis techniques. Among several families of models for biological patterning, one of the simplest is based on the idea that mobile substances (called morphogens) act upstream of their targets, which respond locally to a globally defined gradient pattern. In this project one will consider models where the targets are themselves mobile morphogens, potentially regulating their own input, and study the effect of such spatial feedback on patterning. To do so, one will rely on a class of models which are biologically relevant, tractable analytically, and not much studied yet in a context with spatial interactions. A class of models meeting all these criteria is provided by piecewise-linear differential equations.
Mirror symmetry, which originated in theoretical physics, is an exciting area of modern algebraic geometry today. It has many applications within mathematics (e.g., classification of Fano varieties) and beyond (e.g., in theoretical physics). Mirror symmetry predicts a fascinating duality for certain geometric shapes (Calabi-Yau manifolds): Calabi-Yau manifolds seem to come in “mirror pairs” allowing mathematical notions and formulas on one side to be translated into corresponding counterparts on its mirror partner. It is expected that this deep duality will yield new insights into the geometry of algebraic varieties and produce new mathematics. However, the question of finding a rigorous explanation of this remarkable duality remains unsolved.
Finding explicit constructions that produce mirror pairs has turned out to be incredibly fruitful, allowing questions to be solved that previously seemed impossibly difficult. Batyrev and Borisov pioneered explicit constructions of mirror pairs by utilising toric geometry. A toric variety is a geometric shape (algebraic variety) with many symmetries. The symmetries of toric varieties prevent them from being Calabi-Yau varieties; however, Calabi-Yau varieties can be embedded into toric varieties. The remarkable idea of Batyrev was to utilise a famous duality in combinatorics (polar duality) to produce, from a given pair of toric ambient variety and Calabi-Yau subvariety, a “toric mirror pair” with the predicted properties.
The aim of this project is to study extensions of the combinatorial Batyrev-Borisov mirror constructions for toric varieties in the context of a massively larger class of highly symmetric algebraic varieties: spherical varieties. Spherical varieties form a remarkable class of algebraic varieties containing toric varieties, flag varieties, and symmetric spaces. This project will allow you to get started from day 1, producing many examples of spherical mirror pairs and learning the theory as you go. There is also the possibility to adapt your approach according to your interests and strengths: you could focus on developing new theory on the topology of spherical varieties; do a mix of theory and explicit computations exploiting the combinatorial nature of spherical varieties; or decide that your thesis should contain a large computational component.
Mathematical Neuroscience is increasingly being recognised as a powerful tool that complements neurobiology in understanding aspects of the human central nervous system. The research activity in our group is concerned with developing a sound mathematical description of sub-cellular processes in synapses and dendritic trees. In particular we are interested in models of dendritic spines, which are typically the synaptic contact point for excitatory synapses. Previous work in our group has focused on the voltage dynamics of spine-heads. We are now keen to broaden the scope of this work to include developmental models for spine growth and maintenance, as well as models for synaptic plasticity. Aberrations in spine morphology and density are well known to underlie certain brain disorders, including Fragile X syndrome (which can lead to attention deficit and developmental delay) and depression. Computational modelling is an ideal method for in-silico studies of drug treatments for brain disorders, by modelling their action on spine development and plasticity. This is an important complementary tool for drug discovery in an area which is struggling to make headway with classical experimental pharmaceutical tools.
The mathematical tools relevant for this project will be drawn from dynamical systems theory, biophysical modelling, statistical physics, and scientific computation.
Neural field models describe the coarse-grained activity of populations of interacting neurons. Because of the laminar structure of real cortical tissue they are often studied in 2D, where they are well known to generate rich patterns of spatio-temporal activity. Typical patterns include localised solutions in the form of travelling spots as well as spiral waves. These patterns are naturally defined by the interface between low and high states of neural activity. This project will derive the dimensionally reduced equations of motion for such interfaces from the full nonlinear integro-differential equation defining the neural field. Numerical codes for the evolution of the interface will be developed, and embedded in a continuation framework for performing a systematic bifurcation analysis. Weakly nonlinear theory will be developed to understand the scattering of multiple spots that behave as auto-solitons, whilst strong scattering solutions will be investigated using the scattor theory that has previously been developed for multi-component reaction-diffusion systems.
S Coombes, H Schmidt and I Bojak 2012 Interface dynamics in planar neural field models, Journal of Mathematical Neuroscience, 2:9
A core-annular flow is a multi-phase flow of two immiscible liquids of different viscosities and/or densities, where one liquid moves through the core of a cylindrical pipe and another liquid forms an annular ring along the wall. Core-annular flows are useful in a wide range of practical applications including oil recovery and transport. In a steady flow the interface separating the two phases is concentric with the pipe wall, but interfacial waves are often required in order to enhance transport. It is possible to control the flow to a desired state by using various techniques, for example by adding chemical compounds called surfactants that affect the interfacial tension. This project will consider a two-layer core-annular flow in a three-dimensional cylindrical pipe. The main objective will be to examine the effect of soluble surfactants on the flow stability and explore the underlying nonlinear dynamics. The project will combine analytical and numerical techniques with the aim of developing, analysing and solving appropriate mathematical models for the study of two-fluid core-annular flows with soluble surfactants.
Infectious diseases spread in continuous time through interaction between individuals, and many epidemic models are defined in continuous time. However, data are often collected in discrete time (e.g. daily case numbers), and it can be hard to accurately model real-time interactions (e.g. contacting different people at work/home/school at different times of the day). Discrete time epidemic models offer a way to model transmission on a natural time-scale and also have potential advantages for statistical inference. This project will explore the use of discrete time models for epidemics and develop efficient methods for fitting them to data.
Estimating population size is a common problem in statistics with many well-established methods. However, these methods rely on strict assumptions about the structure of the population. These assumptions are often unrealistic and may result in faulty population estimates. Methodology to estimate population size is commonly applied to epidemic and ecological data sets, but is being increasingly applied to social good applications, estimating the number of individuals facing human rights abuses (Silverman, 2020).
The project will advance methods for estimating population size, for example by including population dynamics, or by stratifying the population into subgroups. The project will exploit advances in Bayesian computation and functional data analysis to develop novel and efficient computational algorithms, allowing new models to be successfully implemented.
This project will be jointly supervised with Dr Rowland Seymour (Rights Lab, UoN)
In statistical shape analysis, objects are often represented by a configuration of landmarks, and in order to compare the shapes of objects, their configurations must first be aligned as closely as possible. When the landmarks are unlabelled (that is, the correspondence between landmarks on different objects is unknown) the problem becomes much more challenging, since both the correspondence and alignment parameters need to be inferred simultaneously.
An example of the unlabelled problem comes from the area of structural bioinformatics, where we wish to compare the 3-d shapes of protein molecules. This is important, since the shape of a protein is vital to its biological function. The landmarks could be, for example, the locations of particular atoms, and the correspondence between atoms on different proteins is unknown. This project will explore methods for unlabelled shape alignment, motivated by the problem of protein structure alignment. Possible topics include the development of new methods for simultaneously inferring the correspondence and the alignment parameters.
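For the labelled case, the classical alignment step can be sketched as ordinary Procrustes analysis via the SVD; this is the baseline that the unlabelled problem generalises (the landmark data below are synthetic):

```python
import numpy as np

def procrustes_align(X, Y):
    """Optimal rotation R and translation t mapping labelled landmarks Y
    onto X (rows are landmarks), minimising sum ||x_i - (R y_i + t)||^2."""
    xc, yc = X.mean(axis=0), Y.mean(axis=0)
    A = (X - xc).T @ (Y - yc)              # cross-covariance of centred configs
    U, _, Vt = np.linalg.svd(A)
    D = np.diag([1.0] * (A.shape[0] - 1) + [np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt                         # proper rotation (det R = +1)
    t = xc - R @ yc
    return R, t

rng = np.random.default_rng(3)
X = rng.normal(size=(10, 3))               # synthetic landmark configuration
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Y = (X - X.mean(axis=0)) @ Rz.T + 5.0      # rotated and translated copy of X
R, t = procrustes_align(X, Y)
print(np.abs(Y @ R.T + t - X).max())       # recovers X up to round-off
```

In the unlabelled setting the rows of Y are permuted by an unknown correspondence, so this closed-form solution no longer applies and correspondence and alignment must be inferred jointly.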
I have research interests in a broad range of statistics including:
If you would like to discuss projects further please get in touch (firstname.lastname@example.org)
"Biology is the new physics" -- Philip Hunter
Recent advances in technology have massively improved our ability to conduct experiments and collect data, giving new insights into biological systems. The classical approach to mathematical biology has been to use analytic tools, e.g., ordinary/partial differential equations, perturbation theory etc. In this project, we will develop stochastic models for biological systems. We will start with simple Markovian models and then extend them to more general non-Markovian ones.
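A minimal example of a Markovian stochastic model is a birth-death process simulated with the Gillespie algorithm (the rates below are illustrative):

```python
import random

def gillespie_birth_death(birth=2.0, death=0.1, x0=0, t_end=50.0, seed=0):
    """Gillespie stochastic simulation of a birth-death process:
    X -> X+1 at rate `birth`, X -> X-1 at rate `death`*X.
    Returns the trajectory as a list of (time, population) pairs."""
    random.seed(seed)
    t, x = 0.0, x0
    traj = [(t, x)]
    while t < t_end:
        rates = [birth, death * x]
        total = sum(rates)
        t += random.expovariate(total)   # exponential waiting time to next event
        if t >= t_end:
            break
        # choose which reaction fires, proportionally to its rate
        x += 1 if random.random() < rates[0] / total else -1
        traj.append((t, x))
    return traj

traj = gillespie_birth_death()
print(traj[-1])  # population fluctuates around birth/death at stationarity
```

Non-Markovian extensions replace the exponential waiting times with general distributions, which is where the exact simulation and analysis become genuinely challenging.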
The project will require both theoretical (probability theory, functional analysis) and computational skills (programming).
Large scale studies of spiking neural networks are a key part of modern approaches to understanding the dynamics of biological neural tissue. One approach in computational neuroscience has been to consider the detailed electrophysiological properties of neurons and build vast computational compartmental models. An alternative has been to develop minimal models of spiking neurons with a reduction in the dimensionality of both parameter and variable space that facilitates more effective simulation studies. In this latter case the single neuron model of choice is often a variant of the classic integrate-and-fire model, which is described by a non-smooth dynamical system with a threshold. It has recently been shown that one way to model the variability of neuronal firing is to introduce noise at the threshold level. This project will develop the analysis of networks of synaptically coupled noisy neurons. Importantly it will go beyond standard phase oscillator approaches to treat strong coupling and non-Gaussian noise. One of the main mathematical challenges will be to extend the Master Stability framework for networks of deterministic limit cycle oscillators to the noisy non-smooth case that is relevant to neural modelling. This work will determine the effect of network dynamics and topology on synchronisation, with potential application to psychiatric and neurological disorders. These are increasingly being understood as disruptions of optimal integration of mental processes sub-served by distributed brain networks.
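As a toy illustration of threshold noise (a single leaky integrate-and-fire neuron whose firing threshold is redrawn after each spike; all parameter values are made up, and this is far simpler than the coupled networks the project will study):

```python
import random

def lif_noisy_threshold(mu=2.0, tau=1.0, v_th=1.0, sigma_th=0.1,
                        dt=1e-3, t_end=50.0, seed=0):
    """Leaky integrate-and-fire neuron tau*dv/dt = mu - v with reset to 0,
    where the firing threshold is redrawn from N(v_th, sigma_th) after
    every spike, so variability enters through the threshold alone."""
    random.seed(seed)
    v = 0.0
    thresh = random.gauss(v_th, sigma_th)
    spikes, t = [], 0.0
    while t < t_end:
        v += dt * (mu - v) / tau
        if v >= thresh:                    # non-smooth spike-and-reset event
            spikes.append(t)
            v = 0.0
            thresh = random.gauss(v_th, sigma_th)
        t += dt
    return spikes

spikes = lif_noisy_threshold()
isis = [b - a for a, b in zip(spikes, spikes[1:])]
print(len(spikes))  # irregular inter-spike intervals despite constant drive
```

Even this single-neuron sketch shows the key feature: a deterministic flow interrupted by stochastic non-smooth events, which is precisely what makes the network-level analysis mathematically challenging.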
This broad project is devoted to the construction of new efficient numerical methods for stochastic differential equations and stochastic numerical analysis of properties of the methods. Depending on the interest of the student, it can be focused, e.g. on numerics for stochastic partial differential equations, on methods which are efficient for computing ergodic limits, on stochastic geometric integration, SDEs on manifolds.
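As a minimal example of the kind of computation involved (Euler-Maruyama for an Ornstein-Uhlenbeck process, estimating the ergodic limit E[X^2] = 1 by a long-time average; the step size is illustrative, and quantifying the resulting bias is exactly the sort of question the project addresses):

```python
import math
import random

def euler_maruyama_ergodic(dt=0.01, n_steps=500_000, seed=0):
    """Euler-Maruyama for the ergodic SDE dX = -X dt + sqrt(2) dW,
    whose invariant law is N(0, 1). The long-time average of X^2
    estimates the ergodic limit E[X^2] = 1, up to an O(dt) bias."""
    random.seed(seed)
    x, acc = 0.0, 0.0
    for _ in range(n_steps):
        x += -x * dt + math.sqrt(2 * dt) * random.gauss(0.0, 1.0)
        acc += x * x
    return acc / n_steps

est = euler_maruyama_ergodic()
print(est)  # close to the ergodic limit 1
```

Designing schemes whose invariant measure is closer to the true one, or extending such constructions to SPDEs and manifolds, is the research content behind this simple template.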
Given a random graph G = (V, E), we begin by endowing each vertex v in V with a colour from a set S. As time progresses, the vertices change their colours as they interact with their neighbours. The goal of this project is to understand the large graph limiting behaviour of the system, as the number of vertices of the graph grows arbitrarily large.
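A minimal simulation of such a system (a two-colour voter model on an Erdős-Rényi graph, with illustrative sizes) might look like:

```python
import random

def voter_model(n=100, p=0.1, colours=2, max_steps=200_000, seed=0):
    """Voter model on an Erdős-Rényi graph G(n, p): at each step a
    uniformly chosen vertex copies the colour of a uniformly chosen
    neighbour. Monochromatic consensus is an absorbing state."""
    random.seed(seed)
    nbrs = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if random.random() < p:       # each edge present independently
                nbrs[u].append(v)
                nbrs[v].append(u)
    colour = [random.randrange(colours) for _ in range(n)]
    for step in range(max_steps):
        v = random.randrange(n)
        if nbrs[v]:
            colour[v] = colour[random.choice(nbrs[v])]
        if step % n == 0 and len(set(colour)) == 1:
            break                          # consensus reached
    return colour

final = voter_model()
print(len(set(final)))  # number of surviving colours (1 at consensus)
```

The project's theoretical questions concern what happens to such dynamics as n grows: hydrodynamic or mean-field limits, consensus times, and how these depend on the random graph.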
This is a theoretical project and will require a solid foundation in analysis (both real and functional), probability theory in general and stochastic processes in particular.
Multilink is one of the leading computational models in bilingualism. It simulates the recognition and production of words of different lengths and frequencies in tasks like monolingual and bilingual lexical decision (i.e., deciding whether stimuli presented on the screen are a word or not, e.g. ‘stone’, ‘rcks’), word naming, and word translation production. A core component of Multilink is the so-called resting level activation (RLA) of a word. In general, words that are encountered or used more frequently have higher RLA levels than words that occur less often. An open question, however, concerns the nature of this relationship. Are RLA and word frequency linearly related (as is often assumed), or is the relationship nonlinear? If the latter is true, what shape is the nonlinearity?
To answer this question, we will conduct a comprehensive exploration of Multilink’s predictions using Approximate Bayesian Computation (ABC). This will not only provide us with estimates of the most likely parameter values for Multilink, but also their uncertainty. Moreover, the results will further elucidate the cognitive processes that underlie e.g. visual word recognition, since different shapes of the relationship between RLA and word frequency necessitate different mechanisms. The project will be undertaken in close collaboration with Prof Kathy Conklin in the School of English.
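As a sketch of the ABC idea on a deliberately simple stand-in model (not Multilink; the Gaussian model, prior and tolerance below are made up), rejection ABC keeps prior draws whose simulated summary statistic lies close to the observed one:

```python
import random
import statistics

def abc_rejection(observed, simulate, prior_sample, eps, n_draws, seed=0):
    """Rejection ABC: draw a parameter from the prior, simulate data from
    the model, and keep the parameter if the simulated summary statistic
    is within eps of the observed one. The accepted draws form an
    approximate posterior sample."""
    random.seed(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed) < eps:
            accepted.append(theta)
    return accepted

# Stand-in simulator: the summary is the sample mean of N(theta, 1) data
def simulate(theta, n=50):
    return statistics.fmean(random.gauss(theta, 1.0) for _ in range(n))

obs = 1.5                                   # "observed" summary statistic
post = abc_rejection(obs, simulate, lambda: random.uniform(-5, 5),
                     eps=0.2, n_draws=5000)
print(statistics.fmean(post))               # posterior concentrates near 1.5
```

For Multilink, `simulate` would be replaced by a run of the cognitive model and the summary statistics by behavioural measures (e.g. response times and accuracies), with the RLA-frequency relationship among the parameters being inferred.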
Quantum graphs are a paradigm model for quantum chaos. They consist of a system of wires along which waves can propagate. Many properties of the excitation spectrum and the spatial distribution of standing waves can be mapped exactly onto a supersymmetric field theory on the network. In a mean-field approximation one may derive various universal properties for large quantum graphs. In this project we will focus on deviations from universal behaviour for finite quantum graphs with the field-theoretic approach.
Around 25% of the 50 million epilepsy sufferers worldwide are not responsive to antiepileptic medication; improved understanding of this disorder has the potential to improve diagnosis, treatment and patient outcomes. The idea of modelling the brain as a complex network is now well established. However, the emergence of pathological brain states via the interaction of large interconnected neuronal populations remains poorly understood. Current theoretical study of epileptic seizures is flawed by dynamical simulation on inadequate network models, and by the absence of customised network measures that capture pathological connectivity patterns.
This project aims to address these deficiencies via improved computational models with which to investigate thoroughly the influence of the geometry and connectivity of the human brain on epileptic seizure progression and initiation, and the development of novel network measures with which to characterise epileptic brains. Such investigations will be informed by exhaustive patient datasets (such as recordings of neural activity in epilepsy patients and age-matched controls), and will be used to study (i) improved diagnostic strategies, (ii) the influence of treatment strategies on seizure progression and initiation, and (iii) the identification of key sites of epilepsy initiation.
Most models for the spread of an epidemic assume that the rate of (infectious) contact between any pair of individuals is constant over time. This is clearly unrealistic, with an individual having different interactions with other individuals over the course of a day; for example, the individual could be at work or school at 9am and with family or socialising at 9pm. On longer time scales, interactions vary over the course of a week (weekday versus weekend) or can be seasonal over the course of a year, with the rate of infection possibly varying seasonally. The incorporation of temporal variability in contact between individuals makes the spread of the disease harder to model and analyse, raising the question: is it important to incorporate this feature or not?
This project will use a range of stochastic processes to explore the impact of variation in contact rates on the epidemic and consider suitable continuous and/or discrete time approximations of the epidemic process to study key epidemic quantities such as the probability and size of a major epidemic outbreak.
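A minimal sketch of the kind of stochastic model in question: a discrete-time stochastic SIR process in which the contact rate beta(t) varies periodically (here over a notional week). All parameter values and the functional form of beta are assumptions for illustration, not part of the project description.

```python
import numpy as np

rng = np.random.default_rng(1)

def stochastic_sir(N=1000, I0=5, gamma=0.1, dt=0.1, T=200.0,
                   beta=lambda t: 0.3 * (1 + 0.5 * np.cos(2 * np.pi * t / 7))):
    """Discrete-time stochastic SIR with a periodically varying contact rate.

    beta(t) models e.g. a weekly activity cycle.  Returns the final size
    (total ever infected) and the peak prevalence of the outbreak.
    """
    S, I, R, t = N - I0, I0, 0, 0.0
    peak = I
    while t < T and I > 0:
        p_inf = 1 - np.exp(-beta(t) * I / N * dt)  # per-susceptible infection prob.
        p_rec = 1 - np.exp(-gamma * dt)            # per-infective recovery prob.
        new_inf = rng.binomial(S, p_inf)
        new_rec = rng.binomial(I, p_rec)
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        peak = max(peak, I)
        t += dt
    return R, peak

final_size, peak = stochastic_sir()
```

Comparing many such runs against a constant-beta counterpart with the same time-averaged contact rate is one way to probe whether the temporal variability matters.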
Social media such as microblogs (eg Twitter) and networking platforms (eg Facebook) are increasingly used to communicate breaking news and to connect with others anytime, from anywhere. Social media sites offer various types of services and therefore generate data in different formats, including text, images and video. Among the various formats of data exchanged in social media, text plays an important role. The volume of textual data in social media is increasing exponentially, providing numerous opportunities for detecting the occurrence of events in real time (eg earthquakes, tsunamis).
This project is concerned with developing computational statistics and machine learning methodology for text analytics. In particular, we are interested in detecting the occurrence of major events (eg a major train delay) when the data (for example, tweets) often also contain information about minor events (eg someone tweeting to complain because their train was 3 minutes late).
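As a deliberately simple baseline that any real detector would need to beat, the sketch below flags minutes in which a keyword count spikes far above its recent history; the synthetic counts, the injected event, and the z-score rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic counts of an event-related keyword (e.g. "delay") per minute,
# with a major event injected at minutes 120-129 on top of background chatter.
counts = rng.poisson(3.0, size=240).astype(float)
counts[120:130] += rng.poisson(40.0, size=10)

def detect_bursts(x, window=60, z=4.0):
    """Flag time bins whose count exceeds a z-score threshold computed
    over a trailing window of recent history."""
    flags = []
    for t in range(window, len(x)):
        hist = x[t - window:t]
        mu, sd = hist.mean(), hist.std() + 1e-9
        if (x[t] - mu) / sd > z:
            flags.append(t)
    return flags

bursts = detect_bursts(counts)
```

Distinguishing a major event from a background of minor ones is precisely where such a threshold rule breaks down and richer statistical models are needed.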
The thalamus is a body of neural cells that relays impulses to the cerebral cortex from the sensory pathways. Feedback from the cortex gives rise to thalamo-cortical loops that generate emergent brain rhythms from the interplay of single cell ionic currents and network mechanisms. This PhD will develop models of thalamo-cortical loops using recent advances in mean-field modelling developed at Nottingham. Spontaneous spatio-temporal patterning will be explored using tools from nonlinear dynamical systems theory and scientific computation. An important application of the work will be an extension of this (spontaneous) analysis to treat the system's response to external sensory drive in the form of median nerve stimulation. This has been shown to have a therapeutic effect in the treatment of Tourette's syndrome, and its understanding will pave the way for the design of improved healthcare treatments utilising wearable devices. This latter part of the project will investigate the effect of various forms of sensory stimulation on the Arnol'd tongue response structure of the thalamo-cortical mean-field model, as well as model fitting to human neuroimaging data and optimisation of median nerve stimulation protocols.
This project is in collaboration with Professor Stephen Jackson (Psychology).
When a thin layer of liquid flows down a wall, its surface can deform due to the action of physical forces such as gravity and surface tension, or in response to externally imposed factors such as electric fields, wall surface irregularities etc. The latter can be a useful tool to engineer desired liquid film structures (e.g. smooth or rippled films) depending on particular practical applications, for instance coating technologies or heat exchangers. Flow-manipulation techniques also include the use of chemical additives known as surfactants, which can greatly affect the behaviour of film flows, thereby providing a means of achieving appropriate surface shapes.
This project will consider a thin film flow over a patterned wall, and the primary aim will be to investigate the deforming influence of surfactants on the profile of the film surface. Previous research analysed the flow of a clean (surfactant-free) liquid down a corrugated wall, or surfactant-laden film flow over a smooth wall; here, the interacting effects of surfactant and bottom topography on the deformation of the free surface will be examined. The project will combine analytical and numerical techniques with the aim of developing, analysing and solving appropriate mathematical models for the study of liquid film flow with surfactants over a topographically structured wall.
If a membrane vibrates at one of its resonance frequencies there are certain parts of the membrane that remain still. These are called nodal points and the collection of nodal points forms the nodal set. Building on earlier work, this project will look at the statistical properties of the nodal set -- e.g. for 3-dimensional waves the nodal set consists of a collection of surfaces, and one may ask how the area of the nodal set is distributed for an ensemble of membranes or for an ensemble of different resonances of the same membrane. This project will involve a strong numerical component, as wavefunctions of irregular membranes need to be found and analysed on the computer. Effective algorithms to find the area of the nodal set, or the number of domains in which the sign does not change (nodal domains), will need to be developed and implemented.
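Counting nodal domains can be prototyped with a simple flood fill over the sign of a sampled wavefunction. The sketch below uses the rectangle-membrane mode sin(mπx)·sin(nπy), whose nodal lines divide the square into exactly m·n domains; the grid resolution and test mode are illustrative.

```python
import numpy as np
from collections import deque

def count_nodal_domains(u):
    """Count connected same-sign regions of a wavefunction sampled on a grid,
    using 4-connected breadth-first search (zeros are treated as boundaries)."""
    sign = np.sign(u)
    seen = np.zeros(u.shape, bool)
    nx, ny = u.shape
    domains = 0
    for i in range(nx):
        for j in range(ny):
            if seen[i, j] or sign[i, j] == 0:
                continue
            domains += 1
            seen[i, j] = True
            q = deque([(i, j)])
            while q:
                x, y = q.popleft()
                for a, b in ((x+1, y), (x-1, y), (x, y+1), (x, y-1)):
                    if 0 <= a < nx and 0 <= b < ny and not seen[a, b] \
                            and sign[a, b] == sign[i, j]:
                        seen[a, b] = True
                        q.append((a, b))
    return domains

# Rectangle-membrane mode u_{m,n} = sin(m pi x) sin(n pi y): exactly m*n domains.
m, n = 3, 4
x = np.linspace(0, 1, 201)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(m * np.pi * X) * np.sin(n * np.pi * Y)
n_domains = count_nodal_domains(u)  # 12
```

For irregular membranes the same counter would be applied to numerically computed eigenfunctions instead of a closed-form mode.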
About 50 years ago Wigner and Dyson proposed a three-fold symmetry classification for quantum mechanical systems -- these symmetry classes consisted of time-reversal invariant systems with integer spin which can be described by real symmetric matrices, time-reversal invariant systems with half-integer spin which can be described by real quaternion matrices, and systems without any time-reversal symmetry which are described by complex hermitian matrices. These three symmetry classes had their immediate application in the three classical Gaussian ensembles of random-matrix theory: the Gaussian orthogonal ensemble GOE, the Gaussian symplectic ensemble GSE, and the Gaussian unitary ensemble GUE. In the 1990s this classification was extended by adding charge conjugation symmetries -- symmetries which relate the positive and negative part of a spectrum and which are described by anti-commutators.
The classification was completed by Altland and Zirnbauer, who showed that there are essentially only seven further symmetry classes on top of the Wigner-Dyson classes, leading to what is now known as the 'ten-fold way'. All symmetry classes have applications in physics. The new symmetry classes are realised by various cases of the Dirac equation and the Bogoliubov-de Gennes equation. For a long time these symmetries were considered only in the context of many-body physics or quantum field theory. However, there are simple quantum mechanical realisations of all ten symmetry classes in terms of two coupled spins, where the classification follows from properties of the coupling parameters and of the irreducible SU(2) representations on which the spin operators act. This project will explore these simple representations in the quantum mechanical and semiclassical context. One goal will be to understand the implications of the quantum mechanical symmetries for the corresponding classical dynamics which appears in the semiclassical limit of large spins.
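One standard numerical diagnostic that distinguishes these ensembles is the consecutive eigenvalue spacing ratio. The sketch below compares a GOE sample against uncorrelated (Poisson-like) levels; the matrix size and sample counts are illustrative, and the quoted asymptotic means are the known literature values.

```python
import numpy as np

rng = np.random.default_rng(0)

def goe(n):
    """Draw one matrix from the Gaussian Orthogonal Ensemble."""
    a = rng.standard_normal((n, n))
    return (a + a.T) / 2

def mean_spacing_ratio(spectra):
    """Mean of r_i = min(s_i, s_{i+1}) / max(s_i, s_{i+1}) over all spectra,
    where s_i are consecutive level spacings."""
    rs = []
    for ev in spectra:
        s = np.diff(np.sort(ev))
        rs.append(np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:]))
    return np.mean(np.concatenate(rs))

goe_r = mean_spacing_ratio([np.linalg.eigvalsh(goe(200)) for _ in range(20)])
poisson_r = mean_spacing_ratio([np.sort(rng.uniform(size=200)) for _ in range(20)])
# Known asymptotic means: <r> ~ 0.536 for GOE, <r> ~ 0.386 for Poisson levels.
```

The ratio statistic needs no spectral unfolding, which makes it a convenient first test when exploring new symmetry classes numerically.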
The built environment is responsible for 45% of all UK carbon emissions with approximately 27% attributed to the domestic sector and 18% to non-domestic buildings. Reducing the energy demand in the built-environment is thus essential for the UK decarbonisation policy which legislates an 80% reduction of its 1990 greenhouse gas emissions by 2050. The existing housing stock is a primary target for reductions of the energy demand since it is estimated that up to 85% of existing buildings will be standing by 2050. An accurate characterisation of the thermal performance of the existing housing stock in the UK is thus needed to inform large-scale cost-effective policies for retrofit intervention that can effectively contribute towards achieving those decarbonisation targets. Unfortunately, existing approaches for the in-situ characterisation of the building fabric (including ISO standards) cannot accurately characterise the thermal performance of buildings in the presence of thermal bridge effects that arise from heterogeneities, irregularities and/or abrupt changes and discontinuities in the thermophysical properties of the building fabric. In particular, these approaches cannot capture thermal bridge effects due to fabric degradation and moisture condensation which are likely to be found in existing dwellings.
This challenging research will develop novel thermal imaging algorithms capable of characterising, with an accurate measure of uncertainty, the thermal performance of the building fabric in the presence of a general class of thermal bridge effects. This project will build upon state-of-the-art Bayesian algorithms for inverse problems that have been successfully applied for tomographic inversions in the context of groundwater flow, electrical impedance tomography, resin transfer moulding, and the characterisation of thermophysical properties of walls [3,4]. The techniques developed in this project will be validated with real experiments. Although highly ambitious, this proposed research has enormous potential to revolutionise current approaches for in-situ characterisation of the thermal performance of buildings thereby enhancing the predictive capabilities of existing housing stock models.
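A minimal caricature of the Bayesian approach, assuming a deliberately simplified forward model (steady one-dimensional heat flux q = ΔT/R through a wall of thermal resistance R, which ignores precisely the thermal-bridge effects the project targets): infer R, with uncertainty, by random-walk Metropolis. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic in-situ data: noisy heat-flux readings for known temperature
# differences across a wall with "true" thermal resistance R_true.
R_true, noise = 2.0, 0.5
dT = rng.uniform(5, 15, size=100)     # indoor-outdoor temperature differences
q = dT / R_true + rng.normal(0, noise, 100)

def log_post(R):
    """Gaussian likelihood for the 1D forward model, flat prior on R > 0."""
    if R <= 0:
        return -np.inf
    return -0.5 * np.sum((q - dT / R) ** 2) / noise**2

# Random-walk Metropolis sampler for the posterior over R.
samples, R_cur, lp_cur = [], 1.0, log_post(1.0)
for _ in range(20000):
    R_prop = R_cur + rng.normal(0, 0.05)
    lp_prop = log_post(R_prop)
    if np.log(rng.uniform()) < lp_prop - lp_cur:
        R_cur, lp_cur = R_prop, lp_prop
    samples.append(R_cur)
post = np.array(samples[5000:])       # discard burn-in
```

The posterior spread of `post` is the "accurate measure of uncertainty"; the project replaces this scalar model with spatially resolved forward models capturing thermal bridges.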
This project will be jointly supervised by Dr Yupeng Wu in the Faculty of Engineering.
We are announcing three PhD positions to join the ‘Quantum Simulators for Fundamental Physics’ (qSimFP) initiative. qSimFP is one of seven proposals funded through the UK Quantum Technologies for Fundamental Physics (QTFP) programme. We are looking for:
• two PhD students to join our experimental work, to build quantum simulators for black hole physics. This work involves the development of hybrid superfluid optomechanical devices at low temperature;
• one PhD student to join our theoretical work, developing the field theoretic description of quantised wave-modes around the simulated quantum black holes.
Candidates are expected to have a strong background in areas relevant for the project, and to contribute to the project through both collaborative and individual work.
The qSimFP consortium is an interactive network of scientists from seven UK-based research organisations: St Andrews, Cambridge, King's College London, Newcastle, Nottingham, University College London and Royal Holloway, University of London. The three PhD students will benefit from all network activities and are expected to collaborate closely with Royal Holloway and King's College London.
Starting date October 2021.
A light wave in a resonator between two almost perfect mirrors shows resonance if the wavelength is commensurate with the distance between the two mirrors. If this condition is satisfied, the wave decays much more slowly than at other, incommensurate wavelengths. This is one of the simplest mechanisms for a resonance in a wave system. There are other well-known mechanisms that rely on complexity and disorder. It has recently been observed that a network of wires may have a further mechanism that leads to resonances. This mechanism relies on cycles in the network and leads to various signatures which cannot be explained using the other well-known mechanisms for resonances. In this project these signatures will be analysed in detail.
The advent of mega studies such as the British Lexicon Project (BLP), the English Lexicon Project (ELP) and the Dutch Lexicon Project (DLP) has allowed researchers to analyse single-trial response times (RTs) in lexical decision tasks (i.e., decisions about whether stimuli presented on the screen are words or not, e.g. ‘stone’, ‘rcks’) with unprecedented detail. As a consequence, there is now a large body of literature contrasting parameter fits for various RT models, including the seminal drift-diffusion model and parametric descriptions such as the Ex-Gaussian distribution. Because of their reliance on European languages, an open question remains whether these results generalise to non-European languages such as Arabic.
The present study aims at rectifying this gap in our knowledge. The first part of the project consists of collecting RTs from lexical decision tasks in Arabic, which will be a first step towards establishing an Arabic Lexicon Project. In the second part, this new data set will be analysed using process models such as the drift-diffusion model, established probability distributions including the Wald, Ex-Gaussian and log-normal distribution as well as the Bayesian Reader. The goal of this analysis is two-fold. Firstly, do these models, which have been primarily tested and validated for European languages, provide convincing fits to Arabic words? Secondly, if they do, how do the parameters of the models compare with those obtained for European languages? The results from the second question will be particularly interesting for ascertaining differences and commonalities in the cognitive processes that underpin lexical decisions in different languages with different scripts. The project will be undertaken in close collaboration with Prof Kathy Conklin in the School of English.
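As a sketch of the second part, the Ex-Gaussian can be fitted by maximum likelihood using scipy's `exponnorm` parameterisation (tau = K * scale). The simulated RTs below stand in for the Arabic lexical decision data the project will collect; all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulate lexical-decision RTs (in seconds) from an Ex-Gaussian:
# a Gaussian(mu, sigma) component plus an Exponential(tau) tail.
mu, sigma, tau = 0.50, 0.05, 0.15
rts = rng.normal(mu, sigma, 5000) + rng.exponential(tau, 5000)

# scipy parameterises the Ex-Gaussian as exponnorm(K, loc, scale) with
# mu = loc, sigma = scale and tau = K * scale, so recover (mu, sigma, tau)
# from the maximum-likelihood fit.
K, loc, scale = stats.exponnorm.fit(rts)
mu_hat, sigma_hat, tau_hat = loc, scale, K * scale
```

Comparing the fitted (mu, sigma, tau) across languages is exactly the kind of parameter-level contrast the project envisages, here shown on synthetic data only.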
Over the past decade, geoelectrical imaging has become the leading technology for continuously monitoring the shallow subsurface volumetrically and in real time. This technology plays a crucial role in assisting industrial and governmental stakeholders in addressing some of the most pressing societal challenges that impact on the subsurface, such as unconventional energy sources, carbon sequestration, waste management and groundwater contamination. However, current geoelectrical imaging techniques do not allow appropriate quantification of the uncertainty intrinsic to (1) the subsurface and (2) conventional image reconstruction methods based on deterministic inversion. The absence of uncertainty quantification in the subsurface has profound detrimental effects on evidence-based decision-making, the assessment and management of risks associated with subsurface hazards, the design of cost-effective remediation strategies and the improvement of stakeholder and public acceptance in the context of potentially controversial uses of the subsurface (e.g. unconventional hydrocarbons, CO2 storage). This project will develop Bayesian methodologies for geoelectrical imaging with the ultimate aim of inferring and quantifying uncertainty in subsurface properties in the presence of realistic geologies. The project will be focused on applications that include carbon capture and storage, unconventional hydrocarbons and underground gas storage.
Despite the recent significant developments in Digital Rock Physics (DRP), two-phase flows in complex pore geometries are still not fully predictable, understood, and quantitatively reproducible. This is due to a number of factors including incorrect physical models, insufficient mesh resolution, unknown parameters and pore heterogeneity. The qualitative and quantitative effects of these uncertainties have not yet been studied. In this project, we aim to develop a modelling and simulation workflow to quantify uncertainty and assess the validity of simplified multiphase flow models in digitalised porous media images. Deterministic and Monte-Carlo techniques, together with two-phase flow solvers, will be used to perform a global sensitivity analysis of the problem. The project will see a collaboration between an industrial partner and the Geoenergy Research Centre in Nottingham.
Classical automorphic forms are a powerful tool for handling difficult number theoretic problems. They provide links between analytic, algebraic and geometric aspects of the study of arithmetic problems and, as such, they are at the heart of the major research programmes in Number Theory, e.g., Langlands programme. Crucial for these links are certain functions associated to automorphic forms, called L-functions, which are the subject of some of the most important conjectures of Mathematics.
In recent years, investigations into the theory of automorphic forms have led into the study of variants of automorphic forms and of their L-functions, such as quasi-modular forms, harmonic Maass forms, mock modular forms, higher order modular forms and multiple Dirichlet series. In most cases, the motivation for introducing these objects was not just to generalise the classical automorphic forms and their L-functions, but to obtain novel tools to address existing number theoretic problems. The techniques associated with these new objects in turn raise interesting new questions and highlight connections beyond the original motivating problems. For example, the theory of harmonic Maass forms and modular forms has been used to resolve problems in partitions of numbers, and higher order modular forms have been applied to Percolation Theory problems in Physics.
As these techniques have only recently been discovered, they lead to several very interesting open questions, e.g., how to attach appropriate L-series to harmonic Maass forms, how to determine the arithmetic nature of high-order forms or how to explore foundational aspects of multiple Dirichlet series. Questions of this type are highly relevant both for the outstanding problems in classical automorphic forms and for the further development of the new subjects themselves. Therefore, many of them are appropriate for a PhD project.
Bayesian inference can be computationally challenging, and we often have to resort to random sampling methods such as Markov Chain Monte Carlo (MCMC) in order to find posterior distributions. Variational methods are an alternative approach to sampling methods: they seek to turn the problem of finding posteriors into an optimisation problem. They work by specifying a parametric form for the posterior distribution (e.g. assuming it is Gaussian), and then seek to find the parameters in the approximation by minimising a cost function. Variational methods are widely used in machine learning, and have proven to be a fast and effective alternative to MCMC.
In this project, we will look at using the variational autoencoder (VAE) framework to do Bayesian inference for differential equation models. The VAE seeks to make use of either automatic differentiation software (such as Tensorflow) or adjoint methods to find derivatives of the cost function, which are a function of the solution to the differential equations. These can then be used to solve the variational inference problem at speed.
The project will focus on developing and testing the methodology to do this, and will use a variety of exemplar problems to guide the development.
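A minimal sketch of the variational idea on a toy conjugate model, where the Gaussian variational family contains the exact posterior and the ELBO has a closed form; in the project, derivatives of such objectives would instead come from automatic differentiation (e.g. Tensorflow) or adjoint solves of the differential equations. All numbers below are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Toy model: y_i ~ N(theta, sigma^2) with prior theta ~ N(0, 1).
sigma = 2.0
y = rng.normal(1.5, sigma, size=50)
n = len(y)

def neg_elbo(params):
    """Negative ELBO for the Gaussian variational family q = N(m, s^2).

    All three expectations are analytic for this conjugate toy problem.
    """
    m, log_s = params
    s2 = np.exp(2 * log_s)
    exp_loglik = (-0.5 * n * np.log(2 * np.pi * sigma**2)
                  - (np.sum((y - m) ** 2) + n * s2) / (2 * sigma**2))
    exp_logprior = -0.5 * np.log(2 * np.pi) - (m**2 + s2) / 2
    entropy = 0.5 * np.log(2 * np.pi * np.e * s2)
    return -(exp_loglik + exp_logprior + entropy)

m_hat, log_s_hat = minimize(neg_elbo, x0=[0.0, 0.0]).x

# Conjugacy makes the exact posterior available for comparison:
post_var = 1 / (1 + n / sigma**2)
post_mean = post_var * np.sum(y) / sigma**2
```

Because the posterior is itself Gaussian here, optimising the ELBO recovers it exactly, which gives a clean correctness check before moving to differential equation models where no closed form exists.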
The aim of this project is to solve problems in vectorial calculus of variations, forward-backward diffusion equations, partial differential inclusions and coercivity problems for elliptic systems. These problems are motivated by variational models for material microstructure, image processing and elasticity theory. Methods involve quasiconvex functions, the quasiconvex envelope, the quasiconvex hull, Young measures, weak convergence in Sobolev spaces, elliptic and parabolic partial differential equations, and other analytic and geometric tools.
Computing the dynamic response of modern aerospace, automotive and civil structures can be a computationally challenging task. Characterising the structural dynamics in terms of waves in a uniform or periodic medium is often an important first step in understanding the principal propagating wave modes.
Real mechanical structures are rarely fully periodic or homogeneous – variations in shape or thickness, boundaries and intersections as well as curvature destroy the perfect symmetry. The aim of the project is to extend periodic structure theory to wave propagation in quasi-periodic and inhomogeneous media such as stiffened structures. The modelling of waves can then be recast in terms of Bloch theory, which will be modified by using appropriate energy or flux conservation assumptions. The information about the propagating modes will then be implemented into modern high-frequency wave methods – such as the so-called Dynamical Energy Analysis developed in Nottingham – making it possible to compute the vibrational response of structures with arbitrary complexity at large frequencies.
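A minimal instance of Bloch theory, assuming the textbook diatomic spring-mass chain (not a structure from the project): the dispersion relation follows from the eigenvalues of the 2×2 Bloch-reduced dynamical matrix D(q) and splits into acoustic and optical branches separated by a band gap.

```python
import numpy as np

# Diatomic chain: alternating masses m1, m2 coupled by identical springs k.
k, m1, m2 = 1.0, 1.0, 2.0

def bloch_frequencies(q):
    """Frequencies at Bloch wavenumber q from the mass-normalised
    Hermitian dynamical matrix D(q)."""
    off = -k * (1 + np.exp(-1j * q)) / np.sqrt(m1 * m2)
    D = np.array([[2 * k / m1, off],
                  [np.conj(off), 2 * k / m2]])
    w2 = np.linalg.eigvalsh(D)          # real eigenvalues, ascending
    return np.sqrt(np.abs(w2))          # abs() guards tiny negative round-off

def exact(q):
    """Closed-form acoustic and optical branches for comparison."""
    s = k * (1 / m1 + 1 / m2)
    d = k * np.sqrt((1 / m1 + 1 / m2) ** 2 - 4 * np.sin(q / 2) ** 2 / (m1 * m2))
    return np.sqrt(s - d), np.sqrt(s + d)

qs = np.linspace(-np.pi, np.pi, 101)
bands = np.array([bloch_frequencies(q) for q in qs])
```

In the project the same reduction idea is pushed beyond perfect periodicity, where D(q) must be modified by energy or flux conservation arguments.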
The human brain has a wonderfully folded cortex with regions of both negative and positive curvature at gyri and sulci respectively. As the state of the brain changes waves of electrical activity spread and scatter through this complicated surface geometry. This project will focus on the mathematical modelling of realistic cortical tissue and the analysis of wave propagation and scattering using techniques from dynamical systems theory and scientific computation.
In more detail the project will consider models of neural activity represented by non-local integro-differential equations posed on both idealised and human realistic cortical structures. The former will allow the development of analytical tools to understand the role of tissue heterogeneity and disorder in sculpting wave dynamics, such as the recently developed interface approach. The latter will extend this so-called neural field approach using cortical meshes from human connectome databases, making extensive use of spectral and finite element methods.
This applied mathematical project will be facilitated by interaction with colleagues from the Sir Peter Mansfield Imaging Centre. As well as exposing the PhD student to rich neuroimaging data-sets collected locally using cutting edge magnetoencephalography techniques, the project will contribute to our understanding of cortical waves in the functioning of the human brain.
It has been known for a decade that some genes controlled by auxin (the most studied plant hormone) exhibit oscillatory activity in plant roots. It is also well known that a key signalling pathway used by auxin to trigger genetic responses includes a negative feedback loop. Hence, this pathway is able to induce spontaneous oscillations, even under steady inputs of auxin. On the other hand, auxin in plants is essentially never steady, being transported from cell to cell via a complex mechanism involving active transport (rather than diffusion). Although the molecular details of auxin transport remain incompletely understood, recently developed fluorescent reporters allow researchers to track auxin distribution in time and space with a resolution never achieved before. In particular, these reporters have been used to produce data sets in which complex spatio-temporal patterns of auxin are recorded with single-cell resolution. Using mathematical modelling and building upon published data, this project will aim to investigate the interplay between auxin signals and the responses they trigger via the signalling pathway. The latter can be seen as a signal processing device, but its nonlinear nature makes it difficult to analyse with traditional techniques; the research will include devising new approaches to uncover relevant effects such as resonance or filtering properties.
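A minimal sketch of how a negative feedback loop can oscillate under a steady input, using a Goodwin-type three-stage model as a generic stand-in for the pathway architecture (not the auxin pathway itself); all parameters are illustrative and chosen so that the feedback is strong enough to destabilise the steady state.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Three-stage loop: x activates y, y activates z, z represses x production
# through a steep Hill function. Steep repression (large n) plus the
# three-stage delay yields a Hopf instability and sustained oscillations.
a, K, n = 2.0, 0.1, 12

def rhs(t, state):
    x, y, z = state
    return [a / (1 + (z / K) ** n) - x,   # production repressed by z
            x - y,
            y - z]

sol = solve_ivp(rhs, (0, 200), [0.1, 0.1, 0.1],
                t_eval=np.linspace(0, 200, 4001), rtol=1e-8)
z_late = sol.y[2, sol.t > 100]            # discard the transient
amplitude = z_late.max() - z_late.min()   # nonzero: sustained oscillation
```

Driving such a loop with a time-varying input instead of the constant production rate `a` is the simplest setting in which to probe the resonance and filtering questions raised above.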
Demyelination has long been associated with diseases such as multiple sclerosis, and more recently with psychiatric disorders including depression and schizophrenia (where structural differences in white matter networks are manifest). It has only relatively recently been established that myelin is also modifiable by experience and can affect information processing by regulating the velocity of signal transmission to produce synchronous arrival of synaptic inputs between distant (and multiple) cortical regions. Indeed, myelin plasticity is increasingly being seen as a complementary partner to synaptic plasticity and, as well as being important to nervous system development, it has a major role to play in complex information processing tasks that involve coupling and synchrony among different brain regions.
This project will build a new mathematical framework for biologically motivated neural networks to help understand the important contribution that activity-dependent regulation of myelination can make to patterns of rhythmic activity known to subserve important aspects of large-scale brain dynamics and its dysfunction. It will
i) combine perspectives from neural mass and network modelling and develop a new set of mathematical tools able to unravel the contributions of space-dependent axonal delays to large-scale spatio-temporal patterning of brain activity;
ii) develop new mathematical models for myelin based plasticity and analyse their consequences for network timing.