Computer Vision Laboratory
Joy Egede

Transitional Assistant Professor, Faculty of Science

Biography

Joy Egede is a Transitional Assistant Professor in the School of Computer Science. She received a first-class BEng degree in Computer Engineering from Covenant University, Nigeria, and went on to work as a Software Engineer at Quanteq Technology Services, Abuja. She then pursued postgraduate study at the University of Nottingham, where she received an MSc with distinction in Information Technology, and in 2014 was awarded the Horizon Digital Economy IDIC scholarship to pursue PhD research. She completed her PhD in 2018 and joined the Computer Vision Lab as a Research Fellow, from which she has since transitioned to her current role as Assistant Professor.

Joy's research output has attracted national awards and recognition. She received the prestigious 2020 L'Oréal-UNESCO For Women in Science Fellowship in Computing and Mathematics for her work on multimodal neonatal pain assessment using machine-assisted methods. She is also the 2021 winner of the UK Women of the Future Award in Science and was selected as a 2023 Foundation Future Leader in Science and Technology.

Research Summary

Dr Egede's research focuses on understanding human behaviour through the analysis of audio-visual signals and biosignals, using machine learning and computer vision techniques.

Her PhD thesis explored the application of traditional and deep-learned algorithms to continuous pain estimation in adults and neonates, with a primary focus on analysing facial expression changes in response to pain stimuli. This work also produced the large-scale Acute Pain in Neonates database (APN_db), aimed at supporting research in automatic neonatal pain assessment.

Currently, she works with the Biomedical Research Centre on developing automated methods for detecting and interpreting medical conditions from expressive human behaviour, as well as on designing models for user interfaces that adapt content delivery based on social signals read from the user. This includes projects on the automatic, objective assessment of comorbid mental health issues and pain, and on using virtual agents to deliver health advice to mothers and mothers-to-be in sub-Saharan Africa.

Recent Publications

  • EGEDE, JOY O, PRICE, DOMINIC, KRISHNAN, DEEPA B, JAISWAL, SHASHANK, ELLIOTT, NATASHA, MORRISS, RICHARD, TRIGO, MARIA J GALVEZ, NIXON, NEIL, LIDDLE, PETER, GREENHALGH, CHRISTOPHER and OTHERS, 2021. Design and Evaluation of Virtual Human Mediated Tasks for Assessment of Depression and Anxiety In: Proceedings of the 21st ACM International Conference on Intelligent Virtual Agents. 52-59
  • EGEDE, JOY, TRIGO, MARIA J GALVEZ, HAZZARD, ADRIAN, PORCHERON, MARTIN, BODIAJ, EDGAR, FISCHER, JOEL E, GREENHALGH, CHRIS and VALSTAR, MICHEL, 2021. Designing an Adaptive Embodied Conversational Agent for Health Literacy: a User Study In: Proceedings of the 21st ACM International Conference on Intelligent Virtual Agents. 112-119
  • GALVEZ TRIGO, MARIA J, PORCHERON, MARTIN, EGEDE, JOY, FISCHER, JOEL E, HAZZARD, ADRIAN, GREENHALGH, CHRIS, BODIAJ, EDGAR and VALSTAR, MICHEL, 2021. ALTCAI: Enabling the Use of Embodied Conversational Agents to Deliver Informal Health Advice during Wizard of Oz Studies In: CUI 2021-3rd Conference on Conversational User Interfaces. 1-5
  • EGEDE, JOY O, SONG, SIYANG, OLUGBADE, TEMITAYO A, WANG, CHONGYANG, WILLIAMS, AMANDA C DE C, MENG, HONGYING, AUNG, MIN, LANE, NICHOLAS D, VALSTAR, MICHEL and BIANCHI-BERTHOUZE, NADIA, 2020. EmoPain Challenge 2020: Multimodal pain evaluation from facial and bodily expressions In: 2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020). 849-856
  • EGEDE, J., VALSTAR, M., TORRES, M.T. and SHARKEY, D., 2019. Automatic Neonatal Pain Estimation: An Acute Pain in Neonates Database. In: 8th International Conference on Affective Computing and Intelligent Interaction (ACII). 1-7
  • JAISWAL, SHASHANK, EGEDE, JOY and VALSTAR, MICHEL, 2018. Deep Learned Cumulative Attribute Regression In: Automatic Face & Gesture Recognition (FG 2018), 2018 13th IEEE International Conference on. 715-722
  • EGEDE, JOY, VALSTAR, MICHEL and MARTINEZ, BRAIS, 2017. Fusing deep learned and hand-crafted features of appearance, shape, and dynamics for automatic pain estimation In: Automatic Face & Gesture Recognition (FG 2017), 2017 12th IEEE International Conference on. 689-696
  • EGEDE, JOY O and VALSTAR, MICHEL, 2017. Cumulative attributes for pain intensity estimation In: Proceedings of the 19th ACM International Conference on Multimodal Interaction. 146-153

The University of Nottingham
Jubilee Campus
Wollaton Road
Nottingham, NG8 1BB


telephone: +44 (0) 115 8466543
email: andrew.p.french@nottingham.ac.uk