
dc.contributor.author: Sugianto, Nehemia
dc.contributor.author: Yuwono, Elizabeth Irenne
dc.date.accessioned: 2017-02-16T06:15:14Z
dc.date.available: 2017-02-16T06:15:14Z
dc.date.issued: 2016-02-29
dc.identifier.issn: 1876-1100
dc.identifier.uri: http://dspace.uc.ac.id/handle/123456789/883
dc.description.abstract: Sign language is the medium of communication for people with hearing and speech disabilities, so bridging the communication gap between them and hearing people has become significant. This research proposes a model for sign language recognition using Microsoft Kinect and convolutional neural networks (CNNs). The proposed model succeeds in recognizing 10 dynamic Indonesian sign language words against a complex background. The dataset contains a total of 100 gesture image sequences with color and depth data, performed by different users. The classifier consists of two CNNs and one ANN: the first CNN extracts hand features from the color data, while the other extracts hand features from the depth data. Training is carried out in three modes, applying drop-out and data augmentation, and achieves the highest validation rate of 81.60% and a test result of 73.00%. (en_US)
dc.language.iso: en (en_US)
dc.publisher: Lecture Notes in Electrical Engineering, Springer (en_US)
dc.subject: computer vision, convolutional neural network, deep learning, hand gesture recognition, Indonesian sign language recognition (en_US)
dc.title: Indonesian Dynamic Sign Language Recognition At Complex Background With 2D Convolutional Neural Networks (en_US)
dc.type: Other (en_US)
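
The architecture outlined in the abstract (two CNNs extracting hand features from Kinect color and depth data, fused by an ANN classifier and trained with drop-out and data augmentation) could be sketched roughly as below. This is not the authors' code; the 64x64 input resolution, filter counts, dense-layer width, and 0.5 dropout rate are illustrative assumptions.

# Rough sketch of the two-stream model described in the abstract:
# one 2D CNN per Kinect modality (color, depth), with the extracted
# features fused by an ANN (dense layers) into 10 Indonesian
# sign-language word classes. Layer sizes and input shape are assumed.
import tensorflow as tf
from tensorflow.keras import layers, models

def cnn_stream(input_shape, name):
    """Small 2D CNN that extracts hand features from one modality."""
    inp = layers.Input(shape=input_shape, name=f"{name}_input")
    x = layers.Conv2D(32, 3, activation="relu")(inp)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)
    return inp, x

# Color stream (3-channel RGB) and depth stream (1-channel depth map).
color_in, color_feat = cnn_stream((64, 64, 3), "color")
depth_in, depth_feat = cnn_stream((64, 64, 1), "depth")

# ANN classifier fusing both feature vectors, with drop-out as mentioned
# in the abstract.
fused = layers.concatenate([color_feat, depth_feat])
x = layers.Dense(128, activation="relu")(fused)
x = layers.Dropout(0.5)(x)
out = layers.Dense(10, activation="softmax")(x)  # 10 sign-language words

model = models.Model(inputs=[color_in, depth_in], outputs=out)
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()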

