Extended Reality in the consultation room
Reconstructive plastic surgeon Eveline Corten is developing an Extended Reality application that allows patients with facial skin cancer to look at lifelike images of the expected outcomes of their facial reconstruction. She is receiving 425,000 euros in funding for her project, titled Seeing is Believing.
Corten was inspired by the technology behind apps like Snapchat and Face Swap: ‘Their algorithms have been trained on millions of existing photos of ‘normal’ faces, allowing you to apply a filter to your own face. You’ll suddenly be wearing crazy glasses or have stars in your eyes. Or you can look 30 years younger or older. In theory, this should also allow us to show what your face will look like after surgical reconstruction.’
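For readers curious about the mechanics: such filters typically begin by detecting a set of facial landmarks (eyes, nose, mouth contours) and then anchor the virtual overlay to those points. The sketch below is a minimal, hypothetical illustration of that first step using the open-source MediaPipe Face Mesh library and an assumed input file patient_photo.jpg; it is not the software being developed in Seeing is Believing.

```python
# Minimal sketch: detect facial landmarks, the anchor points to which a
# face filter (or, hypothetically, a reconstruction preview) is attached.
# Assumes the open-source MediaPipe and OpenCV packages and a hypothetical
# input photo "patient_photo.jpg".
import cv2
import mediapipe as mp

image = cv2.imread("patient_photo.jpg")          # load the photo (BGR)
face_mesh = mp.solutions.face_mesh.FaceMesh(
    static_image_mode=True, max_num_faces=1)     # one still image, one face

results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    h, w, _ = image.shape
    for lm in results.multi_face_landmarks[0].landmark:
        # Landmarks are normalised to [0, 1]; scale to pixel coordinates
        # and mark them. A filter would place its overlay relative to these.
        cv2.circle(image, (int(lm.x * w), int(lm.y * h)), 1, (0, 255, 0), -1)

cv2.imwrite("landmarks_overlay.jpg", image)      # save the annotated photo
face_mesh.close()
```

A clinical application such as the one Corten envisions would go much further, warping and blending photorealistic texture over these anchor points, but the landmark step illustrates why such algorithms depend on large collections of example faces.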
Removing the tumour is often very invasive for patients with facial skin cancer, and equally invasive surgery is usually needed afterwards to reconstruct the face. ‘Reconstructive plastic surgery aims to restore the form and function of the face,’ says Corten. ‘This is important because the face plays a key role in social interaction. Often, multiple surgeries are needed to achieve an optimal result.’
Moustache
‘At present, very little visual material is available to explain to patients the different stages of the procedure and what the expected result might look like. Sometimes patients are shown examples, such as a picture of a male patient with a moustache. A woman doesn’t recognize herself in that. Our patients need personalized, more lifelike educational material. With ‘Seeing is Believing’, we want to achieve this.’
‘Personalized information is also necessary in the process of shared decision-making and in choosing the type of reconstruction that suits the patient best. One technique for reconstructing the nose is to transpose a piece of skin from the forehead to the nose, a so-called ‘forehead flap’. After the first stage, a skin pedicle temporarily connects the forehead to the nose. If you can use this XR technique to show patients their facial appearance after the surgery, they know what to expect.’
Image of a ‘forehead flap’ as it currently appears in the patient brochure.
Facial deformities
Corten realizes that the big challenge lies in developing the software. ‘Snapchat’s algorithms are based on hundreds of thousands of photos of normal faces. We, on the other hand, have to deal with faces that are affected by skin cancer and deviate from ‘standard’ faces. While the application probably will not be perfect within three years, it will already be much better than the material currently used for preoperative counselling.’
In the future, other patient groups for whom preoperative expectation management and postoperative facial appearance or body image play an important role could also benefit from Corten’s project.
The project Seeing is Believing is funded by the Top Sector Life Sciences & Health (Health~Holland), which stimulates public-private partnerships.
What is Extended Reality?
Several years ago, Corten was introduced to Extended Reality (XR), an umbrella term for Virtual Reality (VR) and Augmented Reality (AR). She quickly saw the technology’s potential for improving surgical skills and for teaching (technical) medical students and medical doctors. Patients, too, could benefit from XR in preoperative counselling.
‘In XR, virtual images are rendered realistically and experienced as lifelike or ‘immersive’. In VR, the user is completely immersed in a simulated world while wearing a head-mounted device. In AR, objects are incorporated into the real world and appear as an overlay. Holograms are an example of AR.’
In addition to patient counselling, Corten’s research line focuses on the application of XR in the preoperative planning of autologous breast reconstructions, the reduction of pain and anxiety in wound care, and the training of visuospatial abilities in (technical) medical students. Her colleagues, Professor Marc Mureau and Professor Ruud Selles, are actively involved in these projects.
In the project Seeing is Believing, she works in close collaboration with engineers from the Biomedical Imaging Group Rotterdam (BIGR) and the Faculty of Industrial Design at Delft University of Technology. SyncVR and Medical VR are the commercial parties involved and participate in the software development. Also involved in the project are the Department of Public Health of Erasmus MC and Professor of Psychology Andrea Evers of Leiden University. With its use of technology to improve healthcare, the project is a textbook example of convergence.