Facial recognition technology used to monitor patient safety in ICU


PTI, Jun 3, 2019, 4:23 PM IST

Tokyo: Scientists have used facial recognition technology to predict when patients in the intensive care unit (ICU) are at high risk of unsafe behaviour, such as accidentally removing their breathing tube.

The research suggests that the automated risk-detection tool has potential as a patient-safety monitor, and could ease some of the staffing constraints that make it difficult to continuously observe critically ill patients at the bedside.

“Using images we had taken of a patient’s face and eyes we were able to train computer systems to recognise high-risk arm movement,” said Akane Sato from Yokohama City University Hospital in Japan.

“We were surprised about the high degree of accuracy that we achieved, which shows that this new technology has the potential to be a useful tool for improving patient safety,” Sato said.

Critically ill patients are routinely sedated in the ICU to prevent pain and anxiety, permit invasive procedures, and improve patient safety.

Providing patients with an optimal level of sedation is challenging. Patients who are inadequately sedated are more likely to display high-risk behaviour such as accidentally removing invasive devices.

The study included 24 postoperative patients (average age 67 years) who were admitted to the ICU at Yokohama City University Hospital between June and October 2018.

The model was created using pictures taken by a camera mounted on the ceiling above patients’ beds.

Around 300 hours of data were analysed to find daytime images of patients facing the camera in a good body position that showed their face and eyes clearly.

In total, 99 images were used for machine learning — a process in which an algorithm learns to recognise patterns in images from labelled input data, in a way that loosely resembles how a human brain learns new information.
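The article does not describe the authors' actual model, but the general approach — training a classifier on labelled image frames to separate "safe" from "high-risk" examples — can be sketched minimally. The sketch below is an assumption-laden illustration, not the study's method: it uses synthetic feature vectors in place of real face/eye images and a simple nearest-centroid classifier in place of whatever model the researchers trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: flattened image features for "safe" and
# "high-risk" frames, drawn from two synthetic Gaussian clusters.
safe = rng.normal(0.0, 1.0, size=(60, 16))
risky = rng.normal(2.0, 1.0, size=(60, 16))

X = np.vstack([safe, risky])
y = np.array([0] * 60 + [1] * 60)  # 0 = safe, 1 = high-risk

# Nearest-centroid classifier: label a frame by the closer class mean.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(frames):
    # Distance from each frame to each class centroid, via broadcasting.
    d = np.linalg.norm(frames[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

accuracy = (predict(X) == y).mean()
```

In practice a study like this would use far richer features (e.g. detected face and eye positions from each camera frame) and a model chosen by validation, but the labelled-frames-to-classifier pipeline is the same shape.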

The model was able to flag high-risk behaviour with high accuracy, particularly movements around the subject's face.

“Various situations can put patients at risk, so our next step is to include additional high-risk situations in our analysis, and to develop an alert function to warn healthcare professionals of risky behaviour.

“Our end goal is to combine various sensing data such as vital signs with our images to develop a fully automated risk prediction system,” said Sato.
