Andrew Shannon

General Profile:

I became interested in artificial intelligence (AI) while working in an Emergency Department during the Covid-19 pandemic. AI has the potential to greatly affect people's lives, and joining the CDT in Interactive AI seemed like an excellent way to involve myself in the field while keeping human experience at the core of my work.

Currently, my interests span several domains, such as Brain-Computer Interfaces, Cooperative Reinforcement Learning, and the use of AI in Synthetic Biology (particularly towards improving treatments for ageing).

I have an MEng in Aeronautical Engineering (2019) and have just completed an MSc in Engineering Mathematics (2022). During my MEng, I undertook a research project investigating the manufacture of coiled carbon-fibre artificial muscles. More recently, my MSc thesis was in the field of Cybergenetics, focusing on the use of model-based reinforcement learning to control the pluripotency state of mouse embryonic stem cells.

In my spare time, I like swimming, running, and riding my motorbike.  

Research Project Summary:

We hope to develop a modern Electroencephalography (EEG) decoding pipeline using recent developments in physics-informed machine learning and neural dynamics modelling. The combination of data- and knowledge-driven techniques is expected to result in more robust, data-efficient, accurate, and interpretable decoding than previous statistical or purely data-driven approaches. Potential outcomes include: improving neural decoding for small form-factor EEG headsets that can be used 'in-the-wild' to move cognitive science experiments out of the lab; facilitating chronic recording that could improve the diagnosis of neurological diseases such as Parkinson’s disease; and enabling the development of more effective non-invasive brain-computer interfaces for medical, entertainment, and communication applications.

Decoding EEG signals is a challenging task. EEG data suffers from an extremely low signal-to-noise ratio: the signal is recorded through several layers of attenuating material, and the reading at any one electrode reflects the summed activity of millions of neurons. EEG activity is also 'non-stationary', meaning a given response will drift and vary over time. Furthermore, the neural response to a particular stimulus varies considerably across subjects, days, and conditions, making the production of a general, calibration-free decoder difficult. If these challenges can be overcome, however, the high temporal resolution and non-invasive nature of EEG make it an excellent window into information processing in the brain.

Modern deep learning architectures, such as Recurrent Neural Networks and Long Short-Term Memory networks, have shown promise in EEG decoding. However, they are prone to overfitting and lack the robustness required for real-world applications. Physics-informed machine learning (PIML) is a recent paradigm that aims to avoid these problems by incorporating prior knowledge into data-driven models. There are three main ways to enhance data-driven methods with knowledge from physical models [1]: training data can be augmented with instances generated by the physical model; soft constraints can be added to the learning process (in the loss function) so that predictions obey the physical model; and specific model architectures can be used to restrict solutions to respect known features of the system in question.
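As a concrete illustration of the second (soft-constraint) approach, the sketch below fits a small network to sparse, noisy observations of a damped oscillator while penalising violations of the governing equation at collocation points. It is only a minimal toy example, not part of the project itself: it assumes PyTorch is available, and the damped oscillator, its parameters, and the loss weighting are illustrative stand-ins for whatever dynamical model would be used in practice.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    omega, zeta = 2.0, 0.1  # illustrative oscillator parameters

    # Small network mapping time t to a predicted signal u(t).
    net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(),
                        nn.Linear(32, 32), nn.Tanh(),
                        nn.Linear(32, 1))

    # Sparse, noisy "measurements" (stand-ins for real data).
    t_data = torch.linspace(0.0, 4.0, 20).unsqueeze(1)
    u_data = (torch.exp(-zeta * omega * t_data) * torch.cos(omega * t_data)
              + 0.05 * torch.randn_like(t_data))

    # Dense collocation points where the physics residual is evaluated.
    t_phys = torch.linspace(0.0, 4.0, 200).unsqueeze(1).requires_grad_(True)

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    lam = 1.0  # weight of the soft physics constraint

    for step in range(5000):
        opt.zero_grad()
        # Data term: match the noisy observations.
        loss_data = ((net(t_data) - u_data) ** 2).mean()
        # Physics term: residual of u'' + 2*zeta*omega*u' + omega^2*u = 0.
        u = net(t_phys)
        du = torch.autograd.grad(u, t_phys, torch.ones_like(u), create_graph=True)[0]
        d2u = torch.autograd.grad(du, t_phys, torch.ones_like(du), create_graph=True)[0]
        loss_phys = ((d2u + 2 * zeta * omega * du + omega ** 2 * u) ** 2).mean()
        (loss_data + lam * loss_phys).backward()
        opt.step()

The same pattern applies when the 'physics' comes from a neural dynamics model rather than a mechanical one: the residual term simply encodes that model's equations.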

In particular, we intend to use a recently developed, biophysically accurate model of macroscale neural activity (Next Generation Neural Mass Models [2]) as part of an EEG decoding pipeline built with tools from physics-informed machine learning, for instance hybrid reservoir computing [3], physics-informed neural networks [4], and neural ordinary differential equations [5]. Once we have constructed the decoder and evaluated its performance, we intend to investigate applications of the decoder to novel research questions in fields such as neurotechnology and neurolinguistics.
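To make the hybrid reservoir computing idea [3] more concrete, the sketch below follows the structure of its open-loop training phase on a toy problem: an echo state network is driven by both the observed state and the one-step forecast of an imperfect knowledge-based model, and a ridge-regression readout combines the two. Everything specific here is an assumption for illustration only: the Lorenz system stands in for the true dynamics, a perturbed-parameter copy of it stands in for the mechanistic (e.g. neural mass) model, and the reservoir size and hyperparameters are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "true" dynamics: the Lorenz system, integrated with a fixed-step RK4 scheme.
    def lorenz(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        return np.array([sigma * (x[1] - x[0]),
                         x[0] * (rho - x[2]) - x[1],
                         x[0] * x[1] - beta * x[2]])

    def rk4(f, x, dt):
        k1 = f(x); k2 = f(x + dt / 2 * k1); k3 = f(x + dt / 2 * k2); k4 = f(x + dt * k3)
        return x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    dt, n_steps = 0.02, 5000
    X = np.empty((n_steps, 3)); X[0] = [1.0, 1.0, 1.0]
    for i in range(n_steps - 1):
        X[i + 1] = rk4(lorenz, X[i], dt)

    # Imperfect knowledge-based model: same equations with a perturbed parameter.
    def model_forecast(x):
        return rk4(lambda y: lorenz(y, rho=28.0 * 1.05), x, dt)

    # Echo state network driven by [observation; model forecast].
    n_res, n_in = 300, 6
    A = rng.uniform(-1, 1, (n_res, n_res)) * (rng.random((n_res, n_res)) < 0.05)
    A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))  # set spectral radius
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

    def run_reservoir(drive):
        r, states = np.zeros(n_res), np.empty((len(drive), n_res))
        for i, u in enumerate(drive):
            r = np.tanh(A @ r + W_in @ u)
            states[i] = r
        return states

    forecasts = np.array([model_forecast(x) for x in X[:-1]])  # model's one-step guesses
    R = run_reservoir(np.hstack([X[:-1], forecasts]))
    features = np.hstack([R, forecasts])  # readout sees reservoir state and model forecast
    targets = X[1:]

    # Ridge-regression readout: train on the first 4000 steps, test on the rest.
    n_train, lam = 4000, 1e-6
    F, Y = features[:n_train], targets[:n_train]
    W_out = np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ Y)
    pred = features[n_train:] @ W_out
    print("one-step RMSE:", np.sqrt(np.mean((pred - targets[n_train:]) ** 2)))

In the intended pipeline the perturbed Lorenz model would be replaced by a neural mass model and the readout would target the decoded quantity of interest rather than the next state, but the hybrid structure (reservoir plus mechanistic forecast feeding a trained readout) is the same.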

References:

[1] Karniadakis, G. E., Kevrekidis, I. G., Lu, L., Perdikaris, P., Wang, S., & Yang, L. (2021). Physics-informed machine learning. Nature Reviews Physics, 3(6), 422-440.

[2] Coombes, S., & Byrne, A. (2018). Next generation neural mass models. In Nonlinear dynamics in computational neuroscience (pp. 1-16). Cham: Springer International Publishing.

[3] Pathak, J., Wikner, A., Fussell, R., Chandra, S., Hunt, B. R., Girvan, M., & Ott, E. (2018). Hybrid forecasting of chaotic processes: Using machine learning in conjunction with a knowledge-based model. Chaos: An Interdisciplinary Journal of Nonlinear Science, 28(4).

[4] Raissi, M., Perdikaris, P., & Karniadakis, G. E. (2019). Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378, 686-707.

[5] Chen, R. T. Q., Rubanova, Y., Bettencourt, J., & Duvenaud, D. K. (2018). Neural ordinary differential equations. Advances in Neural Information Processing Systems, 31.

 

Supervisors:

Website:
