Virtual Interface With Kinect 3D Sensor for Interaction With Bedridden People: First Insights
| Main Author: | Carvalho, Vitor |
|---|---|
| Publication Date: | 2021 |
| Other Authors: | Eusébio, José |
| Format: | Article |
| Language: | eng |
| Source: | Repositórios Científicos de Acesso Aberto de Portugal (RCAAP) |
| Download full: | http://hdl.handle.net/11110/2424 |
| Summary: | Human-machine interaction has evolved significantly in recent years, opening a new range of opportunities for developing solutions for people with physical limitations. Natural user interfaces (NUI) allow bedridden and/or physically disabled people to perform a set of actions through gestures, thus increasing their quality of life and autonomy. This paper presents a solution based on image processing and computer vision, using the Kinect 3D sensor, for developing applications that recognize gestures made by the human hand. The gestures are identified by a software application that triggers a set of actions of utmost importance to the bedridden person, for example, raising an emergency alert, switching the TV on/off, or controlling the bed slope. A shape-matching technique was used to recognize six gestures, with the final actions actuated through the Arduino platform. The results show a success rate of 96%. This system can improve the quality of life and autonomy of bedridden people and can be adapted to the specific needs of an individual subject. |
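The pipeline the abstract describes, matching a captured hand contour against stored gesture templates and mapping the best match to an action, can be sketched as below. This is a minimal illustrative sketch only: the gesture names, toy templates, distance measure, and threshold are assumptions for demonstration, not the paper's actual six-gesture implementation or its Kinect/Arduino code.

```python
import math

# Hypothetical gesture templates: each gesture is a small set of 2-D
# contour points. (The paper works on Kinect depth data; these toy
# shapes merely stand in for real hand contours.)
TEMPLATES = {
    "call_emergency": [(0, 0), (1, 0), (1, 1), (0, 1)],  # square-like shape
    "tv_on_off":      [(0, 0), (2, 0), (1, 2)],          # triangle-like shape
}

def _normalize(points):
    """Translate to the centroid and scale to unit size, so matching is
    position- and scale-invariant (a common shape-matching step)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    centered = [(x - cx, y - cy) for x, y in points]
    scale = max(math.hypot(x, y) for x, y in centered) or 1.0
    return [(x / scale, y / scale) for x, y in centered]

def shape_distance(a, b):
    """Mean point-to-nearest-point distance between two normalized
    contours; lower means more similar."""
    na, nb = _normalize(a), _normalize(b)
    return sum(min(math.hypot(ax - bx, ay - by) for bx, by in nb)
               for ax, ay in na) / len(na)

def recognize(contour, threshold=0.25):
    """Return the best-matching gesture name, or None when no template
    is close enough (unmatched frames should trigger no action)."""
    best = min(TEMPLATES, key=lambda g: shape_distance(contour, TEMPLATES[g]))
    return best if shape_distance(contour, TEMPLATES[best]) < threshold else None

# In the paper's setup, a recognized gesture would then be forwarded to
# the Arduino that actuates the bed, TV, or alarm, e.g. by writing a
# one-byte command over a serial link (hypothetical protocol):
#   serial.Serial("/dev/ttyUSB0", 9600).write(COMMANDS[gesture])
```

A noisy square-ish contour such as `[(0, 0), (1.1, 0), (1, 1), (0, 0.9)]` resolves to `"call_emergency"`, while a contour far from every template returns `None`; the rejection threshold is what keeps spurious hand poses from triggering actions.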
| Publisher: | International Journal of Healthcare Information Systems and Informatics |
|---|---|
| Subjects: | Bedridden People; Kinect; Natural User Interface; Shape Matching |
| Rights: | info:eu-repo/semantics/openAccess |
| OAI Identifier: | oai:ciencipca.ipca.pt:11110/2424 |
| Institution: | FCCN, serviços digitais da FCT – Fundação para a Ciência e a Tecnologia |