
Newton Gateway Case Studies

The Newton Gateway works with external partners to deliver activity and events. The case studies below highlight Programmes of Work where the Newton Gateway partnered with Warwick Centre for Predictive Modelling, the KTN (Knowledge Transfer Network), the Maths Foresees Network, BAE Systems, Microsoft Research, the University of Cambridge and the Judicial System.

Understanding Multi-Modal Data for Social and Human Behaviour

In November 2018, as part of the four-month Isaac Newton Institute Research Programme on Scaling Limits, Rough Paths, Quantum Field Theory, the Newton Gateway, in partnership with BAE Systems, organised a one-day “Open for Business” knowledge exchange workshop.


Mathematical and Statistical Challenges in Landscape Decision Making

In September 2018, the Newton Gateway, in partnership with the Natural Environment Research Council (NERC) and the Department for Environment, Food and Rural Affairs (Defra), ran a two-day scoping workshop on evidence-based decision making for UK landscapes.


Algorithmic Trading: Perspectives from Mathematical Modelling

The emergence of new technologies and the advent of computerised trading have changed the landscape of financial markets in recent years, but there are concerns about the effect this is having on trading behaviours and on the markets themselves. The Gateway held an event in February 2017 to explore the opportunities and challenges this presents.


Probability and Statistics in Forensic Science

In 2016, the Isaac Newton Institute for Mathematical Sciences (INI) ran a six-month Research Programme on Probability and Statistics in Forensic Science. The Programme aimed to tackle problems around the rudimentary, and often flawed, way in which the probative value of forensic evidence is presented in court. In particular, where probative value is presented in probabilistic and statistical terms, there have been numerous instances of misunderstanding leading to miscarriages of justice.


Predictive Multiscale Materials Modelling

The Gateway delivered a four-day research workshop, in collaboration with and supported by the Warwick Centre for Predictive Modelling and the KTN, to investigate the mathematical and algorithmic problems currently hindering uncertainty quantification and predictive materials modelling.


Partners Case Studies

UK Success Stories in Industrial Mathematics

This publication showcases the work of UK mathematicians and statisticians by describing industrial problems that have been successfully solved, together with a summary of the financial and/or societal impact that arose from the work. The articles contain some mathematical detail, but the emphasis is on telling the story of a successful collaboration between academia and industry and on the results obtained.

A number of the articles describe collaborations with stakeholders that the Newton Gateway has worked with. They are based on Impact Case Studies submitted to the Research Excellence Framework (REF2014), a UK government-sponsored exercise that assessed the quality of research within UK universities. The foreword is written by David Abrahams, who was Beyer Professor of Applied Mathematics at the University of Manchester and is now Director of the Isaac Newton Institute.

Practical Uses of High-dimensional Data Quality Assessment in Genomics (Julia Brettschneider, University of Warwick)

Residuals of Poor-Quality Microarray Hybridisation Data After Model Fit

While genes themselves are hardwired, their degree of expression depends on the organ type and on cellular conditions and the chemical environment. Comprehensive information on gene expression is therefore key to the diagnosis and prognosis of complex genetic diseases such as cancers, cardiovascular diseases and brain disorders.

High-throughput measurement technologies for gene expression have opened up new avenues for biomedical research. However, because these technologies assess the concentration of fragile macromolecules in chemical assays, the data are typically noisy and biased, which has led to irreproducible scientific results that undermine the credibility of the new technologies.

For these technologies to reach their full potential, statistical challenges related to the size and complexity of the new types of data sets need to be tackled. This work on statistical quality assessment and visualisation of gene expression data has provided scientists with tools to rate the quality of their data and to detect concrete reasons for poor quality, such as particular lab conditions.
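The general idea of residual-based quality assessment can be illustrated with a toy sketch: fit a crude additive model (per-probe and per-array effects estimated by medians, in the spirit of median polish) and score each array by the spread of its residuals. This is a hypothetical illustration, not the published method, and the data below are invented.

```python
# Toy residual-based quality score for microarray data (hypothetical
# illustration, not the authors' published method). Each array is a
# list of probe intensities, assumed to be on a log scale.
from statistics import median

def quality_scores(arrays):
    """Score each array by the spread of its residuals after removing
    per-probe and per-array effects, fitted crudely with medians."""
    n_probes = len(arrays[0])
    # Per-probe effect: median intensity of each probe across arrays.
    probe_effect = [median(a[j] for a in arrays) for j in range(n_probes)]
    scores = []
    for a in arrays:
        centred = [a[j] - probe_effect[j] for j in range(n_probes)]
        array_effect = median(centred)
        residuals = [c - array_effect for c in centred]
        # Median absolute residual: large values flag a poor-quality array.
        scores.append(median(abs(r) for r in residuals))
    return scores

# Two well-behaved arrays and one noisy outlier (invented values).
good1 = [10.0, 12.0, 8.0, 11.0]
good2 = [10.1, 12.2, 7.9, 11.1]
noisy = [13.0, 9.0, 11.5, 8.0]
print(quality_scores([good1, good2, noisy]))
```

The noisy array receives a much larger score than the two consistent ones, which is the kind of signal a quality-assessment tool can surface before downstream analysis.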

Statistical methods can serve as guides to increase the reproducibility of future experiments. An example of the impact of these new quality assessment tools is their role in the MicroArray Quality Control project, an FDA initiative to establish quality standards for high-throughput gene expression data. Another is their role in the development of a diagnostic tool for thyroid cancer that has greatly reduced the number of unnecessary surgeries.

The Graph Whisperers (Peter Grindrod, University of Oxford; Desmond J. Higham, University of Strathclyde & Peter Laflin, Bloom Agency)

An Evolving Twitter Network Produced by the Whisper Software

Bloom is a Leeds-based digital agency whose product Whisper applies high quality analytics to social media data.  

Whisper examines social media networks to find influencers. Finding these key players is an important part of social marketing, since it helps pinpoint where spending will have the most benefit. It means that Bloom can more readily show clients how they can get a return on their investment – something that is otherwise hard to track.
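Influence in a network is commonly quantified with matrix-based centrality measures; the research underlying this collaboration extends Katz-style measures to time-evolving networks. As a minimal, hedged sketch (a static toy graph in pure Python, not the Whisper product's actual algorithm), Katz centrality can be computed iteratively:

```python
# Minimal sketch of Katz centrality, one of the matrix-based influence
# measures underlying this line of research. Toy static graph only;
# the published work extends such measures to evolving networks.

def katz_centrality(adj, alpha=0.1, n_iter=100):
    """Katz centrality: x = sum_k alpha^k (A^T)^k 1, computed by
    iterating x <- 1 + alpha * A^T x. Converges when alpha is smaller
    than 1 / spectral radius of the adjacency matrix A."""
    n = len(adj)
    x = [1.0] * n
    for _ in range(n_iter):
        new = [1.0] * n
        for j in range(n):
            for i in adj[j]:          # directed edge j -> i
                new[i] += alpha * x[j]  # i gains score from j's score
        x = new
    return x

# Toy "retweet" graph: nodes 1-3 all point at node 0; node 0 points at 1.
adj = {0: [1], 1: [0], 2: [0], 3: [0]}
scores = katz_centrality(adj)
print(scores)  # node 0 has the highest score
```

On this toy graph the measure correctly ranks node 0 as the key influencer, since it receives attention from everyone else; in a marketing setting that is where spend would be targeted.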

Researchers from the University of Oxford and the University of Strathclyde collaborated to help Bloom develop its analytical techniques and give it an edge on its competitors.

This case study describes the underlying public-domain mathematical research that Bloom picked up and ran with, and the subsequent mutually beneficial interactions across the academic/business interface.

Industrial Application of Multiscale Texture Analysis (Idris Eckley, Lancaster University & Guy Nason, University of Bristol)

Multiscale Texture Analysis

Unilever is a British-Dutch multinational consumer goods company with a large R&D operation which includes a laboratory located in Port Sunlight, Wirral.

In the late 1990s, researchers at the University of Bristol collaborated with scientists at Unilever to explore the utility of multiscale (wavelet) methods for characterising greyscale texture, and constructed statistical models to enable the objective classification of image textures. These methods were applied to problems in fabric care.

This case study explains the technical background that underlies the applied methods and shows how they were used on product data.
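The core idea of multiscale texture analysis can be sketched with the simplest possible wavelet: one level of a 2-D Haar transform, summarised by the energy in each detail subband. This toy version (pure Python, invented test images) only illustrates how texture is characterised by scale and orientation; the published work uses considerably more sophisticated wavelet models.

```python
# Toy multiscale texture features: one level of a 2-D Haar wavelet
# transform, summarised by mean energy per detail subband. Hypothetical
# illustration, not the case study's actual model.

def haar_subband_energies(img):
    """One-level 2-D Haar transform of an even-sized greyscale image.
    Returns mean energies of the three detail subbands, a crude
    texture signature sensitive to orientation."""
    rows, cols = len(img), len(img[0])
    e_cols = e_rows = e_diag = 0.0
    n = 0
    for r in range(0, rows, 2):
        for c in range(0, cols, 2):
            a, b = img[r][c], img[r][c + 1]
            d, e = img[r + 1][c], img[r + 1][c + 1]
            dc = (a - b + d - e) / 4.0   # column-to-column differences
            dr = (a + b - d - e) / 4.0   # row-to-row differences
            dd = (a - b - d + e) / 4.0   # diagonal differences
            e_cols += dc * dc
            e_rows += dr * dr
            e_diag += dd * dd
            n += 1
    return e_cols / n, e_rows / n, e_diag / n

# Vertical stripes vary across columns; a flat image has no texture.
stripes = [[0, 9] * 4 for _ in range(8)]
flat = [[5] * 8 for _ in range(8)]
print(haar_subband_energies(stripes), haar_subband_energies(flat))
```

The striped image concentrates its energy in one subband while the flat image has none, which is the kind of signature a statistical classifier can then separate objectively.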

Life Expectancy with Cerebral Palsy and Other Neurological Injuries (Jane L. Hutton, University of Warwick)

Professor Jane Hutton

A major determinant of the economic cost of being disabled is the expected length of the disabled person’s life. Health and social care providers, whether insurance companies or state authorities, need information on life expectancy in order to plan for the medical, educational and social needs of disabled people.

If medical liability is admitted, information on life expectancy is an essential component in deciding how much money is awarded. The UK health services pay out millions of pounds to children with cerebral palsy, brain damage which results in physical disability. Although medical doctors are often asked to give an opinion on a patient, they rarely have detailed follow-up of well-defined, large cohorts, or knowledge of methods for unbiased estimation of survival probabilities.

To provide a reliable estimate, good data and good statistical models are required. The most reliable source of information on survival is a precisely defined geographical cohort, with accurate records of the dates of onset and death and of the factors which affect lifetime. The UK has excellent records of dates of death, but if many people in a cohort are still alive, methods which allow for an unknown length of life are required. The relevant statistical approach is to use survival regression models. Choosing models which give consistent, accurate and robust estimates, even when some data are missing, is essential.
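Handling cohort members who are still alive (right-censored observations) is the defining feature of survival methods. A minimal sketch of the standard starting point, the Kaplan-Meier estimator, is given below with invented follow-up data; the case study itself uses survival *regression* models, which additionally adjust for covariates.

```python
# Minimal Kaplan-Meier estimator: survival probabilities from a cohort
# in which some members are still alive at last follow-up (right-
# censored). Illustrative sketch; assumes distinct follow-up times.

def kaplan_meier(times, events):
    """times: follow-up time per person; events: True if death was
    observed, False if censored (still alive at last follow-up).
    Returns (time, estimated survival probability) at each death."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, event in sorted(zip(times, events)):
        if event:
            surv *= (at_risk - 1) / at_risk  # S(t) drops at each death
            curve.append((t, surv))
        at_risk -= 1  # deaths and censorings both leave the risk set
    return curve

# Toy cohort: follow-up in years, two people censored (still alive).
times = [2, 3, 4, 5, 6, 7]
events = [True, True, False, True, False, True]
print(kaplan_meier(times, events))
```

The point of the construction is that censored individuals still contribute to the risk set up to their last follow-up, so survival probabilities are not biased downwards by treating "still alive" as missing data.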

Modelling and Analysis of Floating Ocean Wave Energy Extraction Devices (Thomas J. Bridges, Matthew R. Turner & Hamid Alemi Ardakani, University of Surrey)

The Offshore Wave Energy Ltd Wave Energy Convertor

Extraction of energy from ocean waves is a high-priority sustainable-energy initiative in the UK.

The Offshore Wave Energy Ltd wave energy convertor (OWEL WEC) is a floating rectangular box which captures waves at one end and extracts their energy through a power take-off (PTO) system at the other end.

The Surrey team is providing underpinning modelling and mathematics to this project. The modelling requirements of the OWEL WEC design dovetail with research at the University of Surrey on interior fluid sloshing, external water wave dynamics, coupling between vessel and fluid motion, and modelling of the PTO as a gravity current interaction. The outcome is direct impact on the wave energy industry and indirect impact on the environment and the economy.


Statistical Challenges in Retail Credit (David J Hand, Imperial College, London)

Consumer Credit Award

The retail credit domain is characterised by data sets which are large both in the number of cases, with billions of card transactions per year, and in the number of variables, with some 70-80 items recorded for each card transaction, expanding to hundreds or even thousands in fraud detection applications.

It was “big data” before the term was invented. The area presents many novel mathematical, statistical and machine learning challenges, requiring the development of new methodology.

The Consumer Credit Research Group at Imperial has contributed significantly to these developments, creating new and improved methods of scorecard evaluation, new tools for fraud detection and strategies for coping with selection bias, among other contributions.
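A workhorse industry measure of scorecard discrimination is the Gini coefficient, equal to 2*AUC - 1, where the AUC is the probability that a randomly chosen non-defaulter outscores a randomly chosen defaulter. A minimal sketch with invented scores is below; note that part of the group's research concerns the weaknesses of exactly such measures and proposes improved alternatives, which this toy code does not implement.

```python
# Gini coefficient for scorecard evaluation (2*AUC - 1), computed from
# pairwise comparisons. Toy illustration with invented data, not the
# group's proposed evaluation methodology.

def gini(scores, defaulted):
    """scores: higher means predicted more creditworthy;
    defaulted: True for observed defaulters.
    Returns 2*AUC - 1, where AUC is the probability that a random
    non-defaulter outscores a random defaulter (ties count half)."""
    goods = [s for s, d in zip(scores, defaulted) if not d]
    bads = [s for s, d in zip(scores, defaulted) if d]
    wins = 0.0
    for g in goods:
        for b in bads:
            wins += 1.0 if g > b else 0.5 if g == b else 0.0
    auc = wins / (len(goods) * len(bads))
    return 2 * auc - 1

# Invented scorecard outputs for five customers.
credit_scores = [700, 650, 600, 550, 500]
defaults = [False, False, True, False, True]
print(gini(credit_scores, defaults))
```

A Gini of 0 means the scorecard ranks defaulters and non-defaulters no better than chance, while 1 means perfect separation, so the measure gives lenders a single comparable number per scorecard.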

The group received the Credit Collections and Risk award for contributions to the credit industry in 2012, becoming the first academic research group to receive this award.