Classification of Sign-language Using VGG16

International Journal of Academic Engineering Research (IJAER) 6 (6):36-46 (2022)

Abstract

Sign Language Recognition (SLR) aims to translate sign language into text or speech in order to improve communication between deaf-mute people and the general public. The task has a large social impact, but it remains difficult because of the complexity and wide range of hand actions. We present a convolutional neural network (CNN) that extracts discriminative spatial features from an image dataset. Sign languages are not universal and are usually not mutually intelligible, although there are similarities among different sign languages. They are the foundation of local Deaf cultures and have evolved into effective means of communication. Although signing is used primarily by the deaf and hard of hearing, hearing people also use it when they are unable to speak, when a health condition or disability makes speaking difficult (augmentative and alternative communication), or when they have deaf family members, such as children of deaf adults. In this article we classify sign-language images from a dataset of 43,500 images of size 64×64 pixels using the VGG16 CNN architecture, achieving 100% accuracy.
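
For illustration only (this is not code from the paper): the sketch below shows how a VGG16-based classifier for 64×64 sign-language images could be set up in TensorFlow/Keras, a framework the entry does not specify. The "dataset/" path, the class count, and the training settings are placeholder assumptions, and the frozen ImageNet backbone with a small dense head is one common transfer-learning arrangement rather than the authors' exact pipeline.

    # Illustrative sketch only, not the authors' code: a VGG16-based classifier
    # for 64x64 sign-language images. The "dataset/" path, NUM_CLASSES, and the
    # training settings are placeholder assumptions.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    NUM_CLASSES = 29      # assumed number of sign classes (e.g. letters plus extra gestures)
    IMG_SIZE = (64, 64)   # matches the 64x64 images described in the abstract

    # Load images from a hypothetical "dataset/" folder with one sub-folder per sign.
    train_ds = tf.keras.utils.image_dataset_from_directory(
        "dataset/", image_size=IMG_SIZE, batch_size=32)
    # Apply VGG16's standard ImageNet preprocessing to each batch of images.
    train_ds = train_ds.map(
        lambda x, y: (tf.keras.applications.vgg16.preprocess_input(x), y))

    # VGG16 backbone pre-trained on ImageNet, used here as a frozen feature extractor.
    base = tf.keras.applications.VGG16(
        include_top=False, weights="imagenet", input_shape=(64, 64, 3))
    base.trainable = False

    # Small classification head on top of the frozen backbone.
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, epochs=10)

Freezing the ImageNet weights and training only the small head is a common way to get strong results quickly on a dataset of this size; fine-tuning the deeper VGG16 blocks afterwards is an optional refinement.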

Links

PhilArchive

Similar books and articles

Tegn som Språk. Sissel Redse Jørgensen & Rani Lill Anjum (eds.) - 2006 - Gyldendal Akademisk.
Language is shaped by the body. Mark Aronoff, Irit Meir, Carol Padden & Wendy Sandler - 2008 - Behavioral and Brain Sciences 31 (5):509-511.
On Translating Sache in Hegel’s Texts. James H. Wilkinson - 1996 - The Owl of Minerva 27 (2):211-226.
Wittgenstein on private language. Newton Garver - 1959 - Philosophy and Phenomenological Research 20 (3):389-396.
Language as Gesture: Merleau-Ponty and American Sign Language. Jerry H. Gill - 2010 - International Philosophical Quarterly 50 (1):25-37.
Sign language and the brain: Apes, apraxia, and aphasia. David Corina - 1996 - Behavioral and Brain Sciences 19 (4):633-634.

Analytics

Added to PP
2022-07-01

Downloads
1,017 (#12,487)

6 months
350 (#5,299)


Author's Profile

Samy S. Abu-Naser
North Dakota State University (PhD)

Citations of this work

Classification of Sign-Language Using MobileNet - Deep Learning. Tanseem N. Abu-Jamie & Samy S. Abu-Naser - 2022 - International Journal of Academic Information Systems Research (IJAISR) 6 (7):29-40.
Age and Gender Classification Using Deep Learning - VGG16. Aysha I. Mansour & Samy S. Abu-Naser - 2022 - International Journal of Academic Information Systems Research (IJAISR) 6 (7):50-59.


References found in this work

Potato Classification Using Deep Learning. Abeer A. Elsharif, Ibtesam M. Dheir, Alaa Soliman Abu Mettleq & Samy S. Abu-Naser - 2020 - International Journal of Academic Pedagogical Research (IJAPR) 3 (12):1-8.
Glass Classification Using Artificial Neural Network. Mohmmad Jamal El-Khatib, Bassem S. Abu-Nasser & Samy S. Abu-Naser - 2019 - International Journal of Academic Pedagogical Research (IJAPR) 3 (23):25-31.
