
As advertised to Heriot-Watt University and University of Edinburgh students in the following programmes:


  • Computational Mathematical Finance

  • Financial Modelling and Optimisation

  • Operational Research

  • Statistics

  • Financial Mathematics

  • Quantitative Finance and Mathematics

  • Actuarial Science and Management

  • Quantitative Financial Risk Management


Please find below a list of the completed project placements.

Summer 2019 Placements

Hymans Robertson LLP - On climate risk quantification approaches in the financial industry

Supervisor:

Dr Mayukh Gayen, Risk and Modelling Consultant


Two participating students, from the University of Edinburgh and Heriot-Watt University

Climate risk arises from the interaction between the climate's natural evolution, feedback within natural systems, and human action (or inaction) in adopting environmental best practice and policy. Large and uncertain as it is, the financial industry and professional bodies have yet to qualify, quantify and manage it adequately. In this project, we study and analyse different techniques that a financial services firm can adopt to quantify its climate risk exposure. If time permits, we will also touch on the potential accuracy gains from using modern AI/ML algorithms for this purpose.

 

Risks and controls

The Bank of England has classified climate change risks into (1) physical damage due to climate change, (2) increased liability due to climate extremes, and (3) risks arising during the transition to a low-carbon economy. Moreover, climate change is manifesting itself as a major strategic risk, particularly in light of the ESG concerns raised by regulators and governing bodies.

 

Several controls have been proposed, such as moving away from fossil fuels and shifting investments towards environmentally friendly assets. In the emerging regime of ethical and sustainable investment, it is important to model quantitatively the impact of these systemic changes on financial markets.

 

Modelling stages

There are three stages of modelling: (1) physical damage, (2) the macroeconomic implications of (1), and (3) the resulting microeconomic changes due to (1) and (2). The entire modelling chain is also affected by policy-induced shocks to the system.

 

Modelling output

The modelling finally translates into quantifying the risk via (1) stress tests, (2) scenario analysis, or (3) full 'funnels' of projected outcomes. The question is which one to choose.
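
As an illustration of the difference between a single stress test and a scenario-analysis 'funnel', the following Python sketch uses entirely hypothetical numbers (it is not the Hymans Robertson methodology): it applies a deterministic climate shock to a toy portfolio and, separately, computes a tail risk measure over a shifted distribution of simulated returns.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical baseline funnel: 10,000 simulated one-year portfolio returns.
    baseline_returns = rng.normal(loc=0.05, scale=0.12, size=10_000)

    # (1) Stress test: a single deterministic climate shock applied to the expected return.
    climate_shock = -0.08  # assumed transition-risk shock
    stressed_value = 1.0 * (1 + baseline_returns.mean() + climate_shock)

    # (2)/(3) Scenario analysis over a funnel: shift and widen the return distribution.
    scenario_returns = baseline_returns + rng.normal(loc=-0.03, scale=0.05, size=10_000)
    var_99 = np.quantile(scenario_returns, 0.01)  # 1st percentile return, i.e. a 99% VaR

    print(f"Stressed expected portfolio value: {stressed_value:.3f}")
    print(f"99% VaR (return) under the climate scenario: {var_99:.2%}")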

 

Modelling approaches

There are different modelling approaches. In order of decreasing risk and increasing uncertainty, they are: (1) continue with existing models and expect that climate risks are captured in the adverse scenarios, (2) recalibrate existing models, (3) assume a new 'central' scenario, or (4) adopt a more complex and more uncertain bespoke approach.

 

Conclusion

By the end of the project we should have a modelling hypothesis, and an implementation that incorporates climate change risks into economic scenarios, ready for industrial use.

Bank of England - Detecting Changes in North Atlantic Hurricane Activity

Supervisor:

Dimitris Papachristou


Participating student from Heriot-Watt University

The problem with the analysis of historical hurricane losses is that the historical data need to be adjusted for inflation and exposure changes, which is not a trivial exercise. Crompton and Klotzbach (2017), in "Normalised hurricane damage in the continental United States 1900-2017" (Nature Sustainability), have carried out these adjustments. The data presented in that paper will be the basis of the analysis.

 

We have observed some increase in hurricane activity in the North Atlantic over the last few decades, and for certain types of hurricane the increase does not appear to be random. The project will investigate how much data we need in order to detect temporal trends in hurricane activity, or, alternatively, how large changes in hurricane activity need to be so that they are observable over, say, a 20-year period.


There are different approaches that someone could take to look at the problem. One of them is the following:

  • Take a database with the losses from the largest US Hurricanes as they have been adjusted for changes in inflation and exposure.

  • Fit appropriate distributions to the frequency and severity of hurricanes.

  • Stress the parameters of these distributions and/or change them gradually over time.

  • Derive statistical tests that could tell us whether the hurricane loss activity is a result of underlying trends.

  • Most likely the analysis will require simulation of losses from the distributions fitted and stressed above.


The output of the project will be appropriate distributions for the frequency and severity of hurricane losses, together with statistical tests for deciding whether changes in hurricane activity are due to random fluctuation or to underlying temporal trends.
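
To illustrate the kind of simulation-based power study the approach above implies, here is a minimal Python sketch with made-up parameter values (it does not use the Crompton and Klotzbach data): it simulates 20-year windows of annual hurricane counts under a gradually increasing Poisson rate and estimates how often a crude trend test would detect the change.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    years = np.arange(20)
    lam0 = 1.7    # assumed baseline annual frequency of major hurricane landfalls
    trend = 0.03  # assumed additive increase in that rate per year

    def detects_trend(counts, years, alpha=0.05):
        # Crude linear trend test on annual counts; a Poisson GLM would be a natural upgrade.
        slope, _, _, p_value, _ = stats.linregress(years, counts)
        return p_value < alpha and slope > 0

    power = np.mean([
        detects_trend(rng.poisson(lam0 + trend * years), years)
        for _ in range(5_000)
    ])
    print(f"Estimated power to detect the trend over 20 years: {power:.1%}")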

Crover Ltd - Reducing risk in the insurance of bulk-stored cereal grains: quantifying the opportunity

Supervisor:

Lorenzo Conti, Managing Director


Participating student from Heriot-Watt University

Cereal grains are the basis of staple foods, yet post-harvest losses during long-term storage are exceptionally high: above 20% in Scotland and worldwide. Pests are to blame, with grain moisture content and temperature being the most significant factors. Crover Ltd is developing the world’s first remote probing device (a ‘Crover’) for the monitoring of stored cereal grains. A Crover is a small robotic device able to move within grain stored in bulk, such as in sheds and silos, using on-board sensors to measure local parameters and build a full map of conditions within the bulk of the grain. Unlike current grain monitoring solutions, which measure only one variable and have limited reach, Crover’s remote monitoring device provides real-time data across a range of measurements, initially temperature and moisture, throughout the whole silo. This gives early detection of potential spoilage, allowing proactive management to reduce losses and maintain quality.

 

Grain merchants and farmers generally have an obligation to meet their contracts for the supply of raw produce even when mass and quality losses occur. One way in which this risk is currently mitigated is through specialised insurance policies on the stored grains. The risk and uncertainty for insurers in the agricultural sector who provide such policies are also currently high, as they are exposed to natural capital risk.

 

The use of a Crover is expected to reduce risk significantly by lowering the occurrence of spoilage and helping the grain storer keep a better eye on the grain and maintain its quality. This has the potential to disrupt the traditional insurance costing model.

 

This could also have significant advantages for Crover users: because premiums are currently high to account for losses that might occur on high-worth stocks, the reduction in risk could result in lower premiums for customers.

 

Ideal tangible aims include:

  • To devise an optimal pricing model, involving grain insurers, for a novel product (a ‘Crover’) that provides a heatmap of conditions within grain stores such as sheds and silos;

  • To quantify the monetary value to grain insurers of the risk reduction from use of a Crover;

  • To quantify the amount by which, if any, grain insurers would be willing to reduce premiums for customers using Crovers in their grain stores (a back-of-the-envelope sketch follows below).
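
The back-of-the-envelope Python sketch below illustrates the shape of the last two aims. All figures (stock value, spoilage probability, severity, monitoring effect and loading) are hypothetical placeholders, not Crover or insurer data.

    # All figures below are hypothetical placeholders, not Crover or insurer data.
    stock_value = 500_000.0        # assumed value of the stored grain (GBP)
    p_spoilage = 0.10              # assumed annual probability of a spoilage event
    severity = 0.20                # assumed fraction of stock lost per event
    monitoring_effect = 0.5        # assumed reduction in spoilage probability with a Crover
    loading = 1.3                  # assumed insurer expense/profit loading

    def fair_premium(p_event, loss_fraction):
        expected_loss = p_event * loss_fraction * stock_value
        return loading * expected_loss

    premium_without = fair_premium(p_spoilage, severity)
    premium_with = fair_premium(p_spoilage * monitoring_effect, severity)
    print(f"Premium without monitoring: £{premium_without:,.0f}")
    print(f"Premium with monitoring:    £{premium_with:,.0f}")
    print(f"Potential premium reduction: £{premium_without - premium_with:,.0f}")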

Moody’s Analytics - Machine learning calibration tool

Supervisor:

Dr David Redfern, Associate Director, Research


Participating student from Heriot-Watt University

At Moody’s Analytics we help our clients understand how the uncertainty of financial markets can impact their business. One of our flagship products is the Economic Scenario Generator (ESG), a software tool which brings together many stochastic models in either a risk-neutral or real-world economic environment. Accurate calibration of these models is vital to ensure that they produce realistic results; however, it can be a challenging and time-consuming process.

 

Although we have calibration tools for all of the models, the purpose of the project is to understand whether machine learning, deep learning or AI techniques can be utilised (in a regression framework) as a replacement for a calibration tool. The project will involve working with large data sets with multiple inputs and multiple outputs.

 

The data set consists of about 200,000 stochastic simulations with varying parameters for one of our interest rate models. This has already been created and will be made available to the student via ftp. The student will have to provide their own computing resources for any subsequent computation that uses the data set.

 

Pricing of interest rate derivatives using the interest rate model in question has some degree of semi-analytical tractability. The details of this are documented and this documentation will be provided, as will calibration tools which attempt to map derivative prices back to appropriate model parameters. The student could obtain a short licence for the ESG software to create more data should they wish.
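
As a rough illustration of the regression framing (not the Moody’s data format, interest rate model or calibration tool), the Python sketch below generates a toy mapping from model parameters to derivative prices and then trains a multi-output regressor to invert it, i.e. to recover parameters from prices.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    n = 5_000

    # Hypothetical training set: each row pairs model parameters (the regression targets)
    # with the derivative prices those parameters would produce (the regression features).
    params = rng.uniform([0.01, 0.1], [0.10, 0.9], size=(n, 2))  # toy (volatility, mean reversion)
    prices = np.column_stack([                                    # stand-in for a real pricing map
        params[:, 0] / (1 + params[:, 1]),
        params[:, 0] * np.exp(-params[:, 1]),
    ]) + rng.normal(scale=1e-4, size=(n, 2))

    X_train, X_test, y_train, y_test = train_test_split(prices, params, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
    print("Held-out R^2 of the price-to-parameter map:", round(model.score(X_test, y_test), 3))

In the actual project the features would be the simulated or semi-analytical derivative prices from the 200,000 ESG simulations, and the targets the interest rate model parameters used to generate them.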

 

References:

  1. https://www.researchgate.net/publication/220505020_Machine_Learning_Vasicek_Model_Calibration_with_Gaussian_Processes

  2. https://www.politesi.polimi.it/bitstream/10589/140264/3/2018_04_Donati.pdf

Moody’s Analytics - The use of robust optimisation in portfolio optimization

Supervisor:

Dr Jiajia Cui, Associate Director, Research


Participating student from Heriot-Watt University

At Moody’s Analytics, we help our clients understand how the uncertainty of financial markets can impact their business.  This project will involve working with our award-winning Economic Scenario Generator (ESG) software.

 

The Moody’s Analytics ESG is a multi-asset class, multi-risk factor, multi-time step scenario generator which combines structural and reduced-form modelling approaches in order to capture physical as well as risk-neutral dynamics. In particular, it projects economic as well as financial variables using advanced time series models whose interdependencies are captured by a correlation matrix.

 

This project will look at the robust portfolio selection approach, which systematically combats the sensitivity of the optimal portfolio to statistical and modelling errors in the estimates of uncertain or unknown parameters.

In traditional mean-variance portfolio optimisation, many uncertainties can affect decision making, such as volatility in financial markets, uncertainty in calibrated parameters, and the choice of model and reward metric.

 

Our objective is to improve the quality of decisions made in this environment by understanding, modelling, quantifying and managing uncertainty using robust optimisation.

 

The project has three primary aims:

  • To quantify the uncertainty in optimisation practice;

  • To build a robust mean-variance portfolio selection model;

  • To compare the performance of our traditional mean-variance optimisation (MVO) with that of the robust optimisation approach (a sketch of this comparison follows below).
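
The Python sketch below is a minimal version of that comparison, using simulated toy data rather than ESG output: classical mean-variance weights versus a simple robust variant in which each expected return is shrunk by a multiple of its standard error (a crude box-uncertainty treatment, not the Goldfarb-Iyengar formulation).

    import numpy as np

    rng = np.random.default_rng(7)
    T = 260

    # Simulated daily returns for four toy assets (all numbers are assumptions).
    returns = rng.multivariate_normal(
        mean=[0.0004, 0.0003, 0.0005, 0.0002],
        cov=np.diag([0.010, 0.012, 0.009, 0.011]) ** 2,
        size=T,
    )

    mu = returns.mean(axis=0)
    cov = np.cov(returns, rowvar=False)
    stderr = returns.std(axis=0, ddof=1) / np.sqrt(T)
    kappa = 1.0  # assumed aversion to estimation error

    def mv_weights(expected_returns, covariance):
        # Unconstrained mean-variance solution, rescaled to sum to one.
        raw = np.linalg.solve(covariance, expected_returns)
        return raw / raw.sum()

    w_classic = mv_weights(mu, cov)
    w_robust = mv_weights(np.maximum(mu - kappa * stderr, 1e-6), cov)  # worst-case means

    print("Classical MVO weights:", np.round(w_classic, 3))
    print("Robust-mean MVO weights:", np.round(w_robust, 3))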

 

References:

  1. Goldfarb, D. & Iyengar, G. (2003). "Robust portfolio selection problems." Mathematics of Operations Research, 28(1), 1–38.

  2. Tütüncü, R. H. & Koenig, M. (2004). "Robust asset allocation." Annals of Operations Research, 132, 157–187.

Moody’s Analytics - The use of diversification metrics during portfolio rebalancing

Supervisor:

Natasha Margariti, Associate Director, Research


Participating student from the University of Edinburgh

At Moody’s Analytics, we help our clients understand how the uncertainty of financial markets can impact their business.  This project will involve working with our award-winning Economic Scenario Generator (ESG) software.

 

The Moody’s Analytics ESG is a multi-asset class, multi-risk factor, multi-time step scenario generator which combines structural and reduced-form modelling approaches in order to capture physical as well as risk-neutral dynamics. In particular, it projects economic as well as financial variables using advanced time series models whose interdependencies are captured by a correlation matrix.

 

This project will look at the quantitative analytics of dynamic asset management, which is essentially the buying and/or selling of financial assets (stocks, bonds, property assets, etc.) in the right way in order to ensure that future financial objectives are met. The focus will be on quantifying diversification and risk and return attribution, utilising state-of-the-art techniques in factor-based asset management and liability-driven investment. Various complications present modelling challenges here, including the multi-time-step and multi-risk-factor nature of the ESG’s stochastic modelling framework, and the dynamic and contingent nature of the investment decision-making process.

 

The project has two primary aims:

  • To define and characterise different approaches to the quantification of diversification in investment portfolios (two common metrics are sketched after this list).

  • To determine how these diversification metrics should be used within the dynamic asset management context.
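
As a starting point for the first aim, the Python sketch below computes two commonly used diversification measures for a toy portfolio: the diversification ratio and an 'effective number of bets' based on risk contributions. The weights, volatilities and correlations are illustrative assumptions, not ESG output.

    import numpy as np

    # Illustrative weights, volatilities and correlations (assumptions, not ESG output).
    weights = np.array([0.4, 0.3, 0.2, 0.1])
    vols = np.array([0.15, 0.20, 0.10, 0.25])
    corr = np.array([
        [1.0, 0.3, 0.2, 0.1],
        [0.3, 1.0, 0.4, 0.2],
        [0.2, 0.4, 1.0, 0.3],
        [0.1, 0.2, 0.3, 1.0],
    ])
    cov = np.outer(vols, vols) * corr

    port_vol = np.sqrt(weights @ cov @ weights)
    # Diversification ratio: weighted average volatility over portfolio volatility.
    diversification_ratio = (weights @ vols) / port_vol

    # Fractional risk contributions and the "effective number of bets" (inverse Herfindahl).
    risk_contrib = weights * (cov @ weights) / port_vol**2
    effective_bets = 1.0 / np.sum(risk_contrib**2)

    print(f"Diversification ratio: {diversification_ratio:.2f}")
    print(f"Effective number of bets: {effective_bets:.2f}")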

Moody’s Analytics - Stress scenario design and reverse stress testing using Moody’s ESG

Supervisor:

Dr Tamás Mátrai, Associate Director, Research


Participating student from Heriot-Watt University

At Moody’s Analytics, we help our clients understand how the uncertainty of financial markets can impact their business.  This project will involve working with our award-winning Economic Scenario Generator (ESG) software.

 

Moody’s ESG is a multi-asset class, multi-risk factor, multi-time step scenario generator which combines structural and reduced-form modelling approaches in order to capture physical as well as risk-neutral dynamics. In particular, it projects economic as well as financial variables using advanced time series models whose interdependencies are captured by a correlation matrix.

 

The project has two closely related purposes:

  • For any given stress level, identify the ESG scenario, together with its economic narrative, that corresponds to that stress level.

  • Given any third-party stress scenario (e.g. a stress scenario from the Prudential Regulation Authority), determine the level of stress in the ESG corresponding to the third-party scenario, and identify the ESG scenarios that are aligned with, and at the level of, that scenario.

 

The project will involve:

  • generating economic scenarios with the ESG;

  • analysing their dependence structure;

  • deciding on a method for measuring the stress level of scenarios (one possible measure is sketched below);

  • giving an economic interpretation of particular scenarios.
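
One possible, deliberately simple, way to measure the stress level of a scenario is its Mahalanobis distance from the central scenario. The Python sketch below applies this to simulated toy scenarios and picks those whose stress level matches that of a hypothetical third-party scenario; it is an assumed measure for illustration, not the method the project must adopt.

    import numpy as np

    rng = np.random.default_rng(3)
    n_scenarios, n_factors = 10_000, 5

    # Toy scenario set: each row is one multi-factor scenario (a stand-in for ESG output).
    scenarios = rng.multivariate_normal(np.zeros(n_factors), np.eye(n_factors), size=n_scenarios)

    centre = scenarios.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(scenarios, rowvar=False))
    dev = scenarios - centre
    # Mahalanobis distance of each scenario from the central scenario = its "stress level".
    stress = np.sqrt(np.einsum("ij,jk,ik->i", dev, cov_inv, dev))

    # Hypothetical third-party stress scenario expressed in the same factor space.
    third_party = np.array([1.5, -2.0, 0.5, 0.0, -1.0])
    d = third_party - centre
    target_level = np.sqrt(d @ cov_inv @ d)

    # ESG-style scenarios at (approximately) the same stress level as the third-party scenario.
    matches = np.where(np.abs(stress - target_level) < 0.05)[0]
    print(f"Target stress level {target_level:.2f}; matching scenarios: {len(matches)}")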

 

References:

  1. "Reverse stress testing approaches based on multivariate normality", Isabelle Niemi

  2. "On the Design of Stress Tests", Matthias Daniel Aepli

Lloyds Banking Group - Interest rate dynamics in historical simulation

Supervisor:

Dr Colin Burke, Head of Market Risk Model Approval


Participating student from the University of Edinburgh

Many Value at Risk (VaR) models employ historical simulation for risk predictions over periods of 1 to 10 days. Interest rates are at or close to historic lows; in some regions policy rates are rising while in others they are static, and the last 10 years or so have been dominated by falling rates. Standard practice in historical simulation is to use either absolute (first) differences or relative differences for interest rate modelling; however, the rationale for the choice is often heuristic and driven by how the variability of rates changes with the level of interest rates. The literature on assessing whether returns are closer to absolute or relative in a VaR context is quite small, although there is a larger literature on assessing how interest rate models expressed as stochastic differential equations (SDEs) fit observed data. The usual thinking is that, 'locally', an absolute process has variability (usually volatility) that is flat as a function of the interest rate level. Parametric models give a framework in which to think about this, but most of these models are unrealistic.

Analysis so far has proceeded as follows: kernel estimation is used to examine how the local, or conditional, moments vary with the local interest rate level, and a jump-diffusion model provides a framework for interpreting the output. Simple diffusion models have zero conditional moments above the second. This provides a way to test diffusion models: over finite samples some local higher moments can be generated by chance, but these should be small. The empirical conditional higher moments are compared with those that a candidate parametric diffusion model generates; if there are excessive higher moments, specifically excess conditional kurtosis, the parametric model is rejected. So far, interpretation of the results is most clearly defined for the second moment (flat volatility being associated with absolute moves and linear volatility with relative moves). However, when higher moments exist, the interpretation framework based on simple diffusions is insufficient, and the goal is to extend the framework to allow interpretation of higher moments in terms of move type (absolute, relative, etc.).
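
The kernel-estimation step described above can be prototyped in a few lines. The Python sketch below uses a simulated square-root-diffusion path (not Lloyds data) and a Nadaraya-Watson estimate of the conditional volatility of daily moves as a function of the rate level; roughly flat estimates would point towards absolute moves, linearly increasing ones towards relative moves.

    import numpy as np

    rng = np.random.default_rng(5)

    # Toy square-root (CIR-like) path: volatility proportional to sqrt(level), i.e.
    # neither purely "absolute" nor purely "relative". All parameters are assumptions.
    n, dt, kappa, theta, sigma = 5_000, 1 / 252, 0.5, 0.03, 0.05
    r = np.empty(n)
    r[0] = 0.02
    for t in range(1, n):
        r[t] = abs(r[t - 1] + kappa * (theta - r[t - 1]) * dt
                   + sigma * np.sqrt(r[t - 1] * dt) * rng.standard_normal())

    levels, moves = r[:-1], np.diff(r)
    bandwidth = 0.003  # kernel bandwidth in rate units (an assumption)

    def conditional_vol(x):
        w = np.exp(-0.5 * ((levels - x) / bandwidth) ** 2)  # Gaussian kernel weights
        return np.sqrt(np.sum(w * moves ** 2) / np.sum(w))  # local second moment of moves

    for x in np.linspace(levels.min(), levels.max(), 5):
        print(f"rate level {x:.3%}: conditional daily vol {conditional_vol(x):.4%}")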


References:

  1. Deguillamme (2009). "Dependence of Rate Move Magnitude on Rate Level with an Application to a Real World Model." MSc thesis, University of Oxford.

  2. Johannes, M. (2004). "The statistical and economic role of jumps in continuous time interest rate models." Journal of Finance.

  3. Burke, C. (2017). "Interest Processes in Low Rate (and Other) Regimes." Presentation slides.

Lloyds Banking Group - Variance and covariance estimation in the presence of autocorrelation

Supervisor:

Dr Colin Burke, Head of Market Risk Model Approval


Participating student from the University of Edinburgh

Many risk simulations require estimates of variances and covariances from historical data, and often the historical data are autocorrelated. An approach known as effective sample size (ESS) ([1] Thiébaux, H. and Zwiers, F. (1984). "The Interpretation and Estimation of Effective Sample Size." Journal of Climate and Applied Meteorology, 23) has been used when the data follow an AR(1) process. The ESS approach boils down to a simple correction factor when calculating variances and covariances. However, for higher-order processes (AR(p), p > 1), the approach can generate counterintuitive results. Moreover, the approach in [1] can be expressed either in terms of measured autocorrelations or in terms of AR coefficients; when the latter route is taken, several methods present themselves for estimating the AR coefficients, such as Yule-Walker, maximum likelihood (MLE) or the Burg method. The project could usefully examine the problem when p > 1 and consider the optimality of the different AR estimation techniques.
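
For the AR(1) case, the ESS correction amounts to replacing the sample size n by n(1 - rho)/(1 + rho). The Python sketch below, on simulated data, compares the naive and ESS-corrected standard errors of the sample mean; the AR(p > 1) case and the Yule-Walker/MLE/Burg comparison would extend this pattern.

    import numpy as np

    rng = np.random.default_rng(11)
    n, rho = 2_000, 0.6

    # Simulate an AR(1) process x_t = rho * x_{t-1} + e_t (parameters are assumptions).
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()

    rho_hat = np.corrcoef(x[:-1], x[1:])[0, 1]     # estimated lag-1 autocorrelation
    n_eff = n * (1 - rho_hat) / (1 + rho_hat)      # AR(1) effective sample size

    var_x = x.var(ddof=1)
    naive_se = np.sqrt(var_x / n)                  # standard error ignoring autocorrelation
    ess_se = np.sqrt(var_x / n_eff)                # ESS-corrected standard error of the mean
    print(f"rho_hat={rho_hat:.2f}, n_eff={n_eff:.0f}, "
          f"naive SE={naive_se:.4f}, ESS-corrected SE={ess_se:.4f}")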
