Automated assessment of facial nerve dysfunction

Abstract

BACKGROUND: Peripheral facial neuropathy (unilateral muscular weakness of an entire half of the face) is a common neurological disorder. Assessment of the grade of facial nerve dysfunction is necessary to track treatment dynamics and monitor the effectiveness of rehabilitation. For this purpose, clinical practice worldwide uses grading systems, the most popular of which are the House-Brackmann, Yanagihara, and Nottingham scales. These methods are not universal and rely on visual diagnosis, which depends solely on the subjective experience of the physician. Objective measurements and automation are therefore needed to track recovery dynamics, and image processing and computer vision techniques have made this task feasible.

AIM: To develop a method for automated assessment of the grade of facial nerve dysfunction through biometric facial analysis, in order to monitor the patient's recovery dynamics.

METHODS: In collaboration with the Herzen Moscow Research Institute of Oncology, a database of target-group patients with grade IV (4 people), grade V (4 people), and grade VI (11 people) facial nerve dysfunction according to the House-Brackmann scale was compiled. The control group consisted of 20 students from the Bauman Moscow State Technical University. During registration, subjects were asked to perform the following series of mimic tests: raising the eyebrows, closing the eyes, smiling, smiling with effort, inflating the cheeks, pouting the lips, and articulating with effort. Control points in the eyebrow, eye, and mouth areas were used to assess the degree of facial asymmetry. The two-dimensional MultiPIE model implemented in the dlib library, containing 68 control points, was used as the facial model. Python code was written that calculates asymmetry coefficients from changes in the coordinates of the control points as the patient performs the mimic tests.
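The abstract does not publish the authors' code or the exact coefficient formula, but the general pipeline it describes can be sketched with dlib's standard 68-point landmark predictor. In the sketch below, the predictor file name, the eyebrow index ranges (17-21 and 22-26 in dlib's 68-point layout), and the ratio-based asymmetry coefficient are illustrative assumptions, not the authors' implementation:

# Minimal sketch of a landmark-based asymmetry pipeline; the coefficient
# formula is an assumption, not the authors' published method.
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# dlib's standard pretrained 68-point facial landmark model
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmarks(image):
    """Return the 68 facial control points as a (68, 2) float array."""
    face = detector(image, 1)[0]        # assumes one face per frame
    shape = predictor(image, face)
    return np.array([(p.x, p.y) for p in shape.parts()], dtype=float)

# In dlib's 68-point layout, indices 17-21 and 22-26 are the two eyebrows.
BROW_A, BROW_B = slice(17, 22), slice(22, 27)

def brow_asymmetry(neutral, raised):
    """Illustrative coefficient for the eyebrow-raising test: the ratio of
    the smaller to the larger mean vertical eyebrow displacement, so that
    1.0 means fully symmetric movement and values near 0 indicate
    one-sided weakness."""
    lift_a = np.mean(neutral[BROW_A, 1] - raised[BROW_A, 1])
    lift_b = np.mean(neutral[BROW_B, 1] - raised[BROW_B, 1])
    return min(lift_a, lift_b) / max(lift_a, lift_b)

Analogous coefficients for the eye and mouth regions would use the corresponding landmark index ranges and the displacement direction relevant to each mimic test.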

RESULTS: A study was conducted to determine statistically significant differences between the asymmetry coefficients of the control group and the patients. According to the Mann-Whitney U test, the asymmetry parameters for several mimic tests differed significantly (p < 0.05): asymmetry of the forehead when raising the eyebrows (p = 0.00), of the mouth when smiling (p = 0.026), of the mouth when smiling with effort (p = 0.00), of the mouth when pouting the lips (p = 0.039), and of the mouth when articulating with effort (p = 0.004).
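For reference, a group comparison of this kind can be reproduced with SciPy's implementation of the Mann-Whitney U test; the coefficient arrays below are placeholders, not the study's data:

from scipy.stats import mannwhitneyu

# Placeholder asymmetry coefficients for one mimic test (not study data)
control = [0.91, 0.88, 0.95, 0.93, 0.90]    # healthy subjects
patients = [0.42, 0.55, 0.38, 0.60, 0.47]   # facial palsy group

# Two-sided test of whether the two distributions differ
stat, p_value = mannwhitneyu(control, patients, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")

The Mann-Whitney U test is appropriate here because the samples are small and no assumption of normality is made about the distribution of the asymmetry coefficients.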

CONCLUSIONS: The results demonstrate the performance of the proposed method and indicate the need for further research, in particular the search for differences between groups of patients with different severity grades and the development of a machine learning classification model.



About the authors

Maxim V. Dembovskiy

Bauman Moscow State Technical University

Email: maxdembovsky@mail.ru
ORCID iD: 0009-0001-3361-9753
Russian Federation, Moscow

Andrey A. Boiko

Bauman Moscow State Technical University

Author for correspondence.
Email: boiko_andrew@mail.ru
ORCID iD: 0000-0003-3037-1390
Russian Federation, Moscow



Copyright (c) 2023 Eco-Vector

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.


