SlideShare a Scribd company logo
1 of 9
Download to read offline
International Journal of Electrical and Computer Engineering (IJECE)
Vol. 10, No. 5, October 2020, pp. 5227~5234
ISSN: 2088-8708, DOI: 10.11591/ijece.v10i5.pp5227-5234  5226
Journal homepage: http://ijece.iaescore.com/index.php/IJECE
Application for Iraqi sign language translation
on Android system
Miaad Ahmed Alobaidy, Sundus Khaleel Ebraheem
Department of Computer Science, College of Computer Science and Mathematics, University of Mosul, Iraq
Article Info ABSTRACT
Article history:
Received Feb 9, 2020
Revised Apr 13, 2020
Accepted Apr 25, 2020
Deaf people suffer from difficulty in social communication, especially those
who have been denied the blessing of hearing before the acquisition of
spoken language and before learning to read and write. For the purpose of
employing mobile devices for the benefit of these people, their teachers
and everyone who has contact with them, this research aims to design
an application for social communication and learning by translating Iraqi
sign language into text in Arabic and vice versa. Iraqi sign language has been
chosen because of a lack of applications for this field. The current research,
to the best of our knowledge, is the first of its kind in Iraq. The application is
open source; words that are not found in the application database can be
processed by translating them into letters alphabetically. The importance of
the application lies in the fact that it is a means of communication and
e-learning through Iraqi sign language, reading and writing in Arabic.
Similarly, it is regarded as a means of social communication between deaf
people and those with normal hearing. This application is designed by using
JAVA language and it was tested on several deaf students at Al-Amal
Institute for Special Needs Care in Mosul, Iraq. It was well comprehended
and accepted.
Keywords:
Deaf person
Machine translation
Mobile application
Sign language
3D avatar
Copyright © 2020 Institute of Advanced Engineering and Science.
All rights reserved.
Corresponding Author:
Miaad Ahmed Alobaidy,
Department of Computer Science,
College of Computer Science and Mathematics,
University of Mosul,
Mosul, Iraq.
Email: mayadaalobaidy@uomosul.edu.iq
1. INTRODUCTION
It is in human nature to seek to help others in ways possible according to abilities, expertise and
knowledge. Similar to how medical experts have an important role in treating patients, and medical engineers
in manufacturing tools and instruments to help the sick and injured, such as the manufacture of prostheses
for the handicapped and headphones for the hearing impaired, there is also a major role for the programmer
in creating and developing applications and technologies to help people with special needs and in order to
make their lives easier. The role of the programmer lies in creating and developing applications and
technologies to help deaf people to communicate with each other, as well as with people with normal
hearing, in addition to facilitating their education. Undoubtedly, that the deaf constitute a large and influential
percentage in society, but they suffer from the difficulty of communication. Sign language, as a means of
communication, has been developed over thousands of years. It is defined as movements using one or both
hands and sometimes using facial expressions [1]. With the invention of the Internet and the development of
technology, the deaf have had a share in benefiting from this technology represented by providing means to
educate and communicate. This issue has become an important area that has attracted the attention of many
scientists and researchers working in this field.
Int J Elec & Comp Eng ISSN: 2088-8708 
Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein)
5227
Sign language recognition systems (SLRS) are one of the areas of human interaction with
computers (HCI) that have had a significant impact on the deaf community [2]. Sign language differs from
one country to another [3] because it is often the deaf who develop the sign language by identifying signs that
have a special meaning for them. This is affected by the culture and environment in which they live,
for example, when a deaf person in most Gulf countries refers to his chest to indicate the color white, which
relates to the color white Gulf uniform they are used to wearing [4], but people of other countries and
cultures do not wear this traditional attire so it will not have any special meaning for them.
The aim of this paper is to create an application for smart phone devices to translate Iraqi sign
language into what it corresponds to in classical Arabic and vice versa , as well as sending messages (SMS).
This application helps the deaf to communicate with each other as well as with people with normal hearing.
This application is of great significance for the deaf to learn to read and write Arabic. Through monitoring of
many deaf people, it has been noticed that their ability to read and write Arabic gradually weakens after
graduating from deaf teaching institutes because they do not use these skills as they usually use sign
language. This constitutes a problem. Therefore, this application is designed to present the optimal solution
as it provides the deaf person with a feasible way to learn and use the Arabic language continuously.
This application is not only a means of communication, but also a way to continue learning and using Arabic.
It also aims at increasing their level of awareness and mastery of this language in a way that is no less
efficient than their level of sign language.
The use of this application falls within the areas of e-learning and distance learning as it facilitates
the education of deaf people who cannot join deaf institutes. It does not require the physical presence of
a person, but it suffices for the deaf person to have the free application on a smart phone device and learn
while being at home. This application seeks to store images of signs of all words, letters and numbers in
the Iraqi sign language and then link these images to words that correspond to them in the Arabic.
2. RELATED WORKS
The development of sign language translation applications is a research-worthy topic that has been
given much attention by scholars. Being the focus of attention of many researchers, fruitful research papers in
this field have been published. The following is a review of some of this research.
Ludeña et al., introduced a system for translating Spanish sign language. This system is designed to
translate spoken sentences in Spanish into their equivalent in the Spanish sign language. A speech
recognition system was used to convert spoken sentences into written sentences, then the written sentences
are translated into the equivalent in Spanish sign language, using a virtual human (avatar). This system has
been applied to work in two domains: bus transport information and hotel reception, hence, most of
the vocabulary needed by staff and deaf users in these two domains has been absorbed [5].
In 2012, a chat system was designed by Hanadi et al. for deaf people, the first of its kind in the Arab
world. Through this system, Deaf people can communicate with each other, share pictures and multimedia
and learn about each other's experiences. The system consists of two applications: sign language translation
tool and chat system, both of which use the same database which includes the unified Arabic sign language
dictionary to translate Arabic sign language [6]. Harsh et al., use a Kinect device that contains a set of sensors
with the ability to capture a user photo and take many shots that embody the user's movements when his
hands are above the trunk area. These snapshots are used as input to systems that use special algorithms to
distinguish images and compare them with special images in Indian sign language stored within a special
database. It is then translated into the corresponding Indian language [7].
Mohammed et al., eveloped a data acquisition and control (DAC) system that translates sign
language into text that can be read by anyone. The system is called gesture recognition and sign language
translator and it utilizes a smart glove that captures the gesture of the hand and interprets these gestures into
readable text. This text can be sent wirelessly to a smart phone or shown in an embedded LCD display [8].
In 2016, the health questionnaire EuroQol EQ-5D-5L used to assess an individual's general health
was of interest to Rogers et al. They translated this questionnaire from English to British sign language.
The descriptive system for this tool consists of five dimensions: movement – self-care – usual activities –
pain/discomfort – anxiety/depression. Each dimension has five levels: no problems – simple problems –
moderate problems – acute problems – severe problems. All these levels are written in the English language,
and when you click on any level, the video clip that represents it in the British sign language (BSL) is
displayed, so that the deaf person can choose the level that represents his health status [9].
Another system produced by Basma et al., uses the leap motion controller (LMC). This device
works at a rate of 200 frames per second. These frames are used to determine the number, position and
rotation of hands and fingers. The workflow consists of several steps: Pre-processing – Tracking – Features –
Extraction – sign classification. This system can recognize fixed signals such as letters and numbers. It also
 ISSN: 2088-8708
Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234
5228
recognizes dynamic signals that include motions. Similarly, it proposes a method for dividing a series of
continuous signals based on palm speed tracking. Thus, the system not only translates the already fragmented
signals but is also able to translate continuous sentences [10].
Sujay et al., roposed a system to translate American sign language; this system consists of two stages:
- Stage 1: isolated gloss recognition system: The inputs into this stage are video clips. Systems are used to
distinguish American Sign Language from these videos. These systems consist of video pre-processing
and a time series neural network module.
- Stage 2: gloss to speech neural translator: This stage works on the outputs of the previous stage, where
the signals of the American Sign Language derived from the previous stage are translated into
the corresponding words in English [11].
Ebling et al., introduced a system for translating German sign language. Microsoft Kinect v2 depth
sensor and two GoPro Hero 4 Black video cameras were used, placed in front of the signer, in addition to
three cameras that are installed on the top, right, and left of the signer, so that signals are captured from
different angles. The well-known recording program BosphorusSign Recording Software was developed to
synchronize the capture from various devices [12].
A mobile application was designed to translate Pakistani sign language by Parvez et al.
Samples were selected from 192 deaf participants aged 5-10 years from two special institutes for children.
The aim of this study is to determine the effectiveness of the mobile interface through an advanced mobile
application to learn basic mathematical concepts using Pakistan sign language (PSL). This study bridges
the gap between technology-based and traditional teaching methods, which are used to teach mathematical
concepts using PSL [13].
3. BACKGROUND
Sign language translation using a computer or cell phone device is a form of machine translation
where words are translated from source language into target language using a sign language translation
system. Many scholars and researchers have dealt with the topic of translating sign language in different
ways, each method having its own scope of application and use. They have achieved different success rates.
Some studies have utilized cameras [14-16] or gloves with sensors [17-19] and many algorithms and artificial
intelligence techniques to distinguish the sign language. An optimal method cannot be identified because
what suits a particular situation may not be appropriate for another. For example, if a deaf person gives
a lecture to a group of students, in this case it is preferable to utilize the methods used with the camera or
electronic gloves equipped with sensors to translate the sign language. However, during the daily life of
a deaf person when wanting to communicate with family and friends, or anyone encountered on the road or at
the work site, it is better to use the sign language translation systems represented by using an application on
a mobile device. The data entry into this application is through a dedicated keyboard designed for this
purpose. Using this application is easy and cost efficient. It is worth noting that most sign language
translation systems are designed to be one-way only. Some of these systems work to translate sign language
into spoken language only [20], while others work to translate spoken language into sign language only [21].
This does not allow the full integration of deaf people into society. It is also noted that many of these systems
use video clips or images of very large sizes to represent the sign language [22, 23]. Consequently,
this requires the use of a dedicated server for storing video clips or images. Therefore, it needs to use
the Internet, which leads to restricting the use of these systems and thus reduces the flexibility of use. It was
noticed through the study of the Arabic language and sign language that the root word, its derivatives and
conjugations have the same translation in the sign language [24, 25]. Hence, this research aims at providing
a flexible and free application.
4. METHODOLOGY
To explain how the Iraqi sign language translation system works, we offer the following.
4.1. General overview
An overview of the steps involved in implementing the proposed system are listed as follows.
a. In this research, a free application for smart phones is designed to translate Iraqi sign language into
the equivalent in spoken Arabic. This application is designed using “Android Studio” as a development
environment to produce applications that run on devices that are managed by Android as an operating
system. As for the programming language, Java Micro Edition has been used because of the ease of this
language in transferring the application on different devices. Due to the nature of this application that
requires the use of a very large number of images that represent words, letters and numbers in the sign
Int J Elec & Comp Eng ISSN: 2088-8708 
Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein)
5229
language, needing very large storage space, an efficient way to store images with the least possible
storage space was required, and for this purpose “Blender” was used. It is a free 3D design program
launched under the open-source General Public License (GNU), which is used to create 3D movies,
movie tricks, interactive 3D programs, and video games. Blender capabilities include modeling, cladding,
bone positioning and tuning [26, 27]. This program provides the ability to use a virtual human (avatar) to
visualize sign language movements, and then store the images of the avatar in the application database for
use in the translation process.
b. Sign language grammar has been studied so that we can properly represent the sign language images, as
the sign language has its own way of representing derivations, conjugation of verbs, expressing singular,
folded, and plural, as well as expressing feminine and masculine aspects of grammar amongst others [1].
4.2. User interface
The user interface is designed to be as easy to use as in Figure 1. It contains the following components:
- Tape to display the Arabic text.
- The button (‫ترجم‬ / translate) to implement the translation of the text into the Iraqi sign language.
- Virtual human (avatar) to display translation in the Iraqi sign language.
- Button to show and hide the keyboard.
- SMS button.
- The keyboard to enter text in classical Arabic, containing all the 28 Arabic language characters in
addition to the special symbols (‫,ى‬ ‫,ء‬ ‫,ئ‬ ‫,ؤ‬ ‫,)ة‬ as well as a special button to display numbers (0-9).
Each button in it was represented so that it carries an image and symbol of the letter or the number that
represents it. Thus it can be used easily by the healthy person and the deaf and mute person alike.
Also, a special button was added to put a space between words, as well as a button for scanning.
- The button “‫االشارة‬ ‫”لغة‬ to go to the window for translating Iraqi sign language into standard Arabic.
Figure 1. User interface
4.3. Database
A database is designed with several fields. The first field “word” is designated for storing words,
letters, and numbers. The second field “img_name” contains the names of the images that correspond to
the words. It is to be noted that some words need more than one image to be translated. The third
field “Categories” contains the names of the categories, and the fourth field “Additional-img” contains
the names of additional images to express the direction of movement. When the user presses the button
“‫/ترجم‬ translate”, the application, through a special code, searches inside the database for the names of
the images that correspond to each word separately, then a function for displaying images is called using
“img_name” field. The process of searching for the names of images and calling the image display function
for each word continues until the end of the sentence is reached.
 ISSN: 2088-8708
Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234
5230
4.4. Images display
The process of displaying images from the database is done through a programmed function that is
provided with 2 parameters: the first is a series of letters representing the name of the image and the second is
a numerical value representing the speed of display determined at (500 ms). Prior experience has proven that
this is an ideal speed for display on mobile devices that are managed by the Android operating system.
Through implementing this function, sign language images are displayed as animated films on the application
screen, represented by the movements that the virtual human (avatar) performs.
5. THE PROPOSED SYSTEM WORKS
Application works includes two modes:
a. First mode: from a hearing person to a deaf person see Figure 2(a)
The text written in classical Arabic is translated into its equivalent in Iraqi sign language once
the user uses the keyboard to write text for the purpose of translating it, clicking on each letter or number that
represents it. It will be written in a dedicated bar to display the text. This procedure is implemented through
the use of a special method programmed in Java language. The work of this method depends on the index for
each button on the keyboard, where each index is linked to a letter or number representing that button.
After completing writing the text, the user presses the button “‫ترجم‬ /translate”. The program searches within
the database for the names of images that correspond to each word separately. As a result, the image display
method is called upon. If the user enters words that are not in the database, such as the names of people or
words that contain spelling errors, the application displays images of the letters of the text alphabetically.
b. Second mode: from the deaf person to the hearing person see Figure 2(b)
Iraqi sign language is translated into classical Arabic, where the button “‫االشارة‬ ‫”لغة‬ designed in
the user interface, when pressed, will move to the second mode, and to facilitate the process of finding
the required images, the images are organized into categories (alphabet, language, verbs, adjectives, history,
geography, etc.). Once the button “‫االشارة‬ ‫”لغة‬ is clicked, a list of these categories will be displayed.
In the same way, when any category is clicked, images that represent the words in this category will be
displayed. Moreover, clicking on any image will have the word representing that image written in a tape for
translation in Arabic text. The user is able to send the messages via (SMS). As the application deals with
a complete language, it is difficult to absorb all vocabularies of the language in a short period. So, this
application is designed to be open source so that more words can be added whenever needed so that as many
words as possible can be stored over time.
(a)
(b)
Figure 2. Application's modes: (a) first mode and (b) second mode
Int J Elec & Comp Eng ISSN: 2088-8708 
Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein)
5231
6. PRACTICAL EXAMPLES
6.1. Example 1
If the user uses the first mode and writes a sentence for example: “‫ألتعلم‬ ‫المدرسة‬ ‫الى‬ ‫.”أذهب‬ Clicking on
the button”‫,”ترجم‬ the sentence will be translated into the Iraqi sign language as shown in Figure 3.
- Note that the word “‫”أذهب‬ has been translated by Figure 3(a) and (b).
- The word “‫”الى‬ was translated by the display of Figure 3(c) and (d).
- The word “‫”المدرسة‬ is translated by displaying images of Figure 3(e) and (f).
- The translation of the word “‫”ألتعلم‬ is displayed by displaying Figure 3(g) and (h).
(a) (b) (c) (d)
(e) (f) (g) (h)
Figure 3. Translate a sentence written in standard Arabic into its equivalent in Iraqi sign language
6.2. Example 2
If the user uses the first mode and writes the name of a person, for example “‫حسين‬ ‫”احمد‬ and as is
well known, the sign language does not contain images of the names, so the names will be translated
alphabetically, as shown in Figure 4.
- In Figure 4(a), we notice the image represents the letter “‫.”الف‬
- In Figure 4(b), the image represents the letter “‫.”حاء‬
- In Figure 4(c), the image represents the letter “‫.”ميم‬
- In Figure 4(d), the image represents the letter “‫.”دال‬
- In Figure 4(e), the image represents the letter “‫.”حاء‬
- In Figure 4(f), the image represents the letter “‫.”سين‬
- In Figure 4(g), the image represents the letter “‫.”ياء‬
- In Figure 4(h), the image represents the letter “‫.”نون‬
As for the speed with which these images are displayed, it programmatically determined so that they
are comfortable for the eye and clear.
6.3. Example 3
If the user uses the second mode by clicking on the button “‫االشارة‬ ‫,”لغة‬ a list of categories names will
appear, as shown in Figure 5. By selecting the category and clicking on it, the sign language images for this
category will be displayed. For example, if the user wants to choose images for the phrase “‫كتاب‬ ‫”أقرأ‬ he will
 ISSN: 2088-8708
Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234
5232
click on the category “‫”أالفعال‬ and from it he will select the image that represent the word “‫”أقرأ‬ as shown in
Figure 6(a). Then he moves to the category “‫ومستلزماتها‬ ‫”المدرسة‬ and specifies the image for the word “‫”كتاب‬ as
in Figure 6(b), thus, the sentence is translated from sign language into standard Arabic.
(a) (b) (c) (d)
(e) (f) (g) (h)
Figure 4. Translate words alphabetically
(a) (b)
Figure 5. List of categories names Figure 6. Translating from Iraqi sign language to standard Arabic
Int J Elec & Comp Eng ISSN: 2088-8708 
Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein)
5233
7. CONCLUSION
The use of this application facilitates the integration of deaf people into society through the use of
the easiest and most widely used means in the world, which are mobile phone applications; this paper puts
forward a system for translating the Iraqi sign language, due to the lack of applications related to this dialect.
This application is an educational system that can be applied to teach deaf people Iraqi sign language. It also
teaches them how to read and write in standard Arabic. Additionally, the application teaches healthy people
Iraqi sign language easily, thus removing barriers between deaf people and healthy people and facilitating
the process of communication between them. Since this application works without using the Internet, this is
easy to use it at any time and any place. The use of (Blender) program makes it possible to store the huge
amount of images that the application needs by dividing the images and reducing the display resolution, thus
reducing the size of the images.
There are two proposes for future works: 1) to expand this system to include the use of sound for
input and output by healthy people, thus facilitating further communication; and 2) use the root word
extraction algorithm in Arabic language, so that the program works on the outputs of this algorithm.
Thus, it may be possible to reduce the size of the databases to some extent as well as reducing the number of
images that we need for translation. The root word is stored in databases on behalf of the derivatives or
the conjugation of that word, which all have the same translation in sign language. This increases
the efficiency of the translation process.
ACKNOWLEDGEMENTS
Our thanks to the University of Mosul, represented by all colleges and departments, especially
the Department of Computer Science. Our thanks and appreciation to the members of the journal for allowing
us to present our research. We especially thank “Al-Amal Institute for Special Needs Care, Mosul” for their
cooperation in highlighting effective ways in helping the deaf to be integrated into society.
REFERENCES
[1] S. Samreen, M. Al-Binali, “Standard Qatari Sign Language,” Doha The Supreme Council for Doha Affairs, 2010.
[2] N. B. Ibrahim, M. M. Selim, and H. H. Zayed, “An Automatic Arabic Sign Language Recognition
System (ArSLRS),” Journal of King Saud University - Computer and Information Sciences, vol. 30, issue 4,
pp. 470-477, 2018.
[3] M. Abdo, et al., “Arabic Alphabet and Numbers Sign Language Recognition,” International Journal of Advanced
Computer Science and Applications, vol. 6, no. 11, pp. 209-214, 2015.
[4] H. H. Nasereddin, “MMLSL: Modelling Mobile Learning for Sign Language,” Of Engineering and Computer
Science Research and Reviews in Applied Sciences, vol. 9, no. 2, pp. 20267-20272, 2017.
[5] V. López-Ludeña, et al., “Methodology for developing a Speech into Sign Language Translation System in a New
Semantic Domain,” 2012.
[6] H. Al-Dosri, N. Alawfi, and Y. Alginahi, “Arabic sign language easy communicate ArSLEC,” in Proc. Int. Conf.
Comput. Inf. Technol, pp. 474-479, 2012.
[7] H. V. Verma, E. Aggarwal and S. Chandra, "Gesture recognition using kinect for sign language translation," 2013
IEEE Second International Conference on Image Information Processing (ICIIP-2013), Shimla, pp. 96-100, 2013.
[8] M. Elmahgiubi, M. Ennajar, N. Drawil and M. S. Elbuni, "Sign language translator and gesture recognition," 2015
Global Summit on Computer & Information Technology (GSCIT), Sousse, pp. 1-6, 2015.
[9] K. D. Rogers, et al., “Translation, validity and reliability of the British Sign Language (BSL) version of
the EQ-5D-5L,” Quality of Life Research, vol. 25, no. 7, pp. 1825–1834, 2016..
[10] B. Hisham and A. Hamouda, “Arabic Static and Dynamic Gestures Recognition Using Leap Motion,” JCS, vol. 13,
no. 8, pp. 337-354, 2017.
[11] S. S Kumar, T. Wangyal, V. Saboo and R. Srinath, "Time Series Neural Networks for Real Time Sign Language
Translation," 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando,
FL, pp. 243-248, 2018.
[12] S. Ebling, et al., “SMILE Swiss German sign language dataset”. In Proceedings of the Eleventh International
Conference on Language Resources and Evaluation (LREC 2018), pp. 4221-4229, 2018.
[13] K. Parvez, et al. “Measuring Effectiveness of Mobile Application in Learning Basic Mathematical Concepts Using
Pakistan Sign Language,” Sustainability, vol. 11, no. 11, p. 3064, 2019. doi: https://doi.org/10.3390/su11113064
[14] A. A. Youssif, et al., “Arabic Sign Language (ArSL) Recognition System Using HMM,” International Journal of
Advanced Computer Science and Applications, vol. 2, no. 11, pp. 45-51, 2011.
[15] N. R. Albelwi and Y. M. Alginahi, “Real-time Arabic sign language (arsl) recognition,” in International
Conference on Communications and Information Technology, pp. 497-501, 2012.
[16] S. Shahriar et al., "Real-Time American Sign Language Recognition Using Skin Segmentation and Image Category
Classification with Convolutional Neural Network and Deep Learning," TENCON 2018 - 2018 IEEE Region 10
Conference, Jeju, Korea (South), pp. 1168-1171, 2018.
 ISSN: 2088-8708
Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234
5234
[17] P. D. Rosero-Montalvo et al., "Sign Language Recognition Based on Intelligent Glove Using Machine Learning
Techniques," 2018 IEEE Third Ecuador Technical Chapters Meeting (ETCM), Cuenca, pp. 1-5, 2018.
[18] B. Sapkota, et al., “Smart Glove for Sign Language Translation Using Arduino,” in 1st KEC Conference
Proceedings, vol. 1, pp. 5-11, 2018.
[19] S. A. Mehdi and Y. N. Khan, "Sign language recognition using sensor gloves," Proceedings of the 9th International
Conference on Neural Information Processing, 2002. ICONIP '02., Singapore, vol. 5, pp. 2204-2206, 2002.
[20] R. Ambar, et al., “Development of a Wearable Device for Sign Language Recognition,” Journal of Physics:
Conference Series, vol. 1019, p. 012017, 2018. doi: 10.1088/1742-6596/1019/1/012017
[21] A. Wray, et al., “A formulaic approach to translation at the post office: reading the signs,” Language &
Communication, vol. 24, no. 1, pp. 59–75, 2004.
[22] F. Al Ameiri, M. J. Zemerly and M. Al Marzouqi, "Mobile Arabic sign language," 2011 International Conference
for Internet Technology and Secured Transactions, Abu Dhabi, pp. 363-367, 2011.
[23] A. Almohimeed, M. Wald, and R. I. Damper, “Arabic text to Arabic sign language translation system for the deaf
and hearing-impaired community,” in Proceedings of the Second Workshop on Speech and Language Processing
for Assistive Technologies, pp. 101-109, 2011.
[24] R. Z. Al-Obaidy, “Sentiment analysis for Arabic tweets using support vector machine,” Master's thesis, College of
Computer Sciences And Mathematics, University of Mosul, 2019.
[25] A. Y. Taqa, “Constructing Anti – spam filter based on naive bayesian classifier,” PhD thesis, College of Computer
Sciences And Mathematics, University of Mosul, 2007.
[26] “Blender.” https://ar.wikipedia.org/wiki/%D8%A8%D9%84%D9%86% D8%AF%D8%B1 (accessed 2/1/2020).
[27] “Top 55+ Best Animation Software: The Ultimate List 2019,” Renderforest, 2019. [Online]. Available:
https://www.renderforest.com/blog/best-animation-software, [accessed: Aug 9 2019].
BIOGRAPHIES OF AUTHORS
Miaad Ahmed Alobaidy received her B.Sc.2017 in computer sciences from computer sciences
and mathematics college in university of Mosul. She is currently M.Sc. student in computer
sciences field of image processing in university of Mosul. She has interests in fields like
database and website design.
Sundus Khaleel Ebraheem received her B.Sc. 1996 in computer sciences and received her
M.Sc. degree 2001 in computer sciences field of image processing from university of Mosul. She
is currently working as assistant professor in computer sciences and mathematics college in
university of Mosul. She has 22 years experience in teaching field and has 19 published research
papers in various scientific journals.

More Related Content

What's hot

What's hot (20)

IRJET- Back to School
IRJET- Back to SchoolIRJET- Back to School
IRJET- Back to School
 
SMARCOS Abstract Paper submitted to ICCHP 2012
SMARCOS Abstract Paper submitted to ICCHP 2012SMARCOS Abstract Paper submitted to ICCHP 2012
SMARCOS Abstract Paper submitted to ICCHP 2012
 
Human Computer Interface
Human Computer InterfaceHuman Computer Interface
Human Computer Interface
 
IRJET - A Robust Sign Language and Hand Gesture Recognition System using Conv...
IRJET - A Robust Sign Language and Hand Gesture Recognition System using Conv...IRJET - A Robust Sign Language and Hand Gesture Recognition System using Conv...
IRJET - A Robust Sign Language and Hand Gesture Recognition System using Conv...
 
IRJET - Speech Recognition using Android
IRJET -  	  Speech Recognition using AndroidIRJET -  	  Speech Recognition using Android
IRJET - Speech Recognition using Android
 
Human computer interaction with Nalaemton
Human computer interaction with NalaemtonHuman computer interaction with Nalaemton
Human computer interaction with Nalaemton
 
Human Computer Interface of an Information System
Human Computer Interface of an Information SystemHuman Computer Interface of an Information System
Human Computer Interface of an Information System
 
Toward a New Algorithm for Hands Free Browsing
Toward a New Algorithm for Hands Free BrowsingToward a New Algorithm for Hands Free Browsing
Toward a New Algorithm for Hands Free Browsing
 
IRJET - Voice based E-Mail for Visually Challenged
IRJET -  	  Voice based E-Mail for Visually ChallengedIRJET -  	  Voice based E-Mail for Visually Challenged
IRJET - Voice based E-Mail for Visually Challenged
 
Ubitous computing ppt
Ubitous computing pptUbitous computing ppt
Ubitous computing ppt
 
Speech Based Search Engine System Control and User Interaction
Speech Based Search Engine System Control and User  InteractionSpeech Based Search Engine System Control and User  Interaction
Speech Based Search Engine System Control and User Interaction
 
Assistive technology
Assistive technologyAssistive technology
Assistive technology
 
IRJET - Voice based E-Mail for Visually Impaired
IRJET - Voice based E-Mail for Visually ImpairedIRJET - Voice based E-Mail for Visually Impaired
IRJET - Voice based E-Mail for Visually Impaired
 
Human computerinterface
Human computerinterfaceHuman computerinterface
Human computerinterface
 
Assistive technology
Assistive technologyAssistive technology
Assistive technology
 
Ed 508 assistive technology powerpoint commander
Ed 508 assistive technology powerpoint commanderEd 508 assistive technology powerpoint commander
Ed 508 assistive technology powerpoint commander
 
Interfaces
InterfacesInterfaces
Interfaces
 
Uses
UsesUses
Uses
 
Sensor based interaction
Sensor based interaction Sensor based interaction
Sensor based interaction
 
Assistiive Technology Presentation
Assistiive Technology PresentationAssistiive Technology Presentation
Assistiive Technology Presentation
 

Similar to Application for Iraqi sign language translation on Android system

Procedia Computer Science 94 ( 2016 ) 295 – 301 Avail.docx
 Procedia Computer Science   94  ( 2016 )  295 – 301 Avail.docx Procedia Computer Science   94  ( 2016 )  295 – 301 Avail.docx
Procedia Computer Science 94 ( 2016 ) 295 – 301 Avail.docx
aryan532920
 
Automatic recognition of Arabic alphabets sign language using deep learning
Automatic recognition of Arabic alphabets sign language using  deep learningAutomatic recognition of Arabic alphabets sign language using  deep learning
Automatic recognition of Arabic alphabets sign language using deep learning
IJECEIAES
 
Sign language translator ieee power point
Sign language translator ieee power pointSign language translator ieee power point
Sign language translator ieee power point
Madhuri Yellapu
 

Similar to Application for Iraqi sign language translation on Android system (20)

Procedia Computer Science 94 ( 2016 ) 295 – 301 Avail.docx
 Procedia Computer Science   94  ( 2016 )  295 – 301 Avail.docx Procedia Computer Science   94  ( 2016 )  295 – 301 Avail.docx
Procedia Computer Science 94 ( 2016 ) 295 – 301 Avail.docx
 
Automatic recognition of Arabic alphabets sign language using deep learning
Automatic recognition of Arabic alphabets sign language using  deep learningAutomatic recognition of Arabic alphabets sign language using  deep learning
Automatic recognition of Arabic alphabets sign language using deep learning
 
IRJET - Sign to Speech Smart Glove
IRJET - Sign to Speech Smart GloveIRJET - Sign to Speech Smart Glove
IRJET - Sign to Speech Smart Glove
 
IRJET- Hand Gesture based Recognition using CNN Methodology
IRJET- Hand Gesture based Recognition using CNN MethodologyIRJET- Hand Gesture based Recognition using CNN Methodology
IRJET- Hand Gesture based Recognition using CNN Methodology
 
Hand Gesture Vocalizer
Hand Gesture VocalizerHand Gesture Vocalizer
Hand Gesture Vocalizer
 
Soundscape for Visually Impaired
Soundscape for Visually ImpairedSoundscape for Visually Impaired
Soundscape for Visually Impaired
 
Real-Time Sign Language Detector
Real-Time Sign Language DetectorReal-Time Sign Language Detector
Real-Time Sign Language Detector
 
HandOVRS p-report
HandOVRS p-reportHandOVRS p-report
HandOVRS p-report
 
IRJET- Communication Aid for Deaf and Dumb People
IRJET- Communication Aid for Deaf and Dumb PeopleIRJET- Communication Aid for Deaf and Dumb People
IRJET- Communication Aid for Deaf and Dumb People
 
Sign language translator ieee power point
Sign language translator ieee power pointSign language translator ieee power point
Sign language translator ieee power point
 
Hand Gesture Recognition and Translation Application
Hand Gesture Recognition and Translation ApplicationHand Gesture Recognition and Translation Application
Hand Gesture Recognition and Translation Application
 
20550-38934-1-PB.pdf
20550-38934-1-PB.pdf20550-38934-1-PB.pdf
20550-38934-1-PB.pdf
 
Phase ii frontpage
Phase ii frontpagePhase ii frontpage
Phase ii frontpage
 
SignReco: Sign Language Translator
SignReco: Sign Language TranslatorSignReco: Sign Language Translator
SignReco: Sign Language Translator
 
IRJET- Voice Assistant for Visually Impaired People
IRJET-  	  Voice Assistant for Visually Impaired PeopleIRJET-  	  Voice Assistant for Visually Impaired People
IRJET- Voice Assistant for Visually Impaired People
 
IRJET- Hand Talk- Assistant Technology for Deaf and Dumb
IRJET- Hand Talk- Assistant Technology for Deaf and DumbIRJET- Hand Talk- Assistant Technology for Deaf and Dumb
IRJET- Hand Talk- Assistant Technology for Deaf and Dumb
 
Review Paper on Two Way Communication System with Binary Code Medium for Peop...
Review Paper on Two Way Communication System with Binary Code Medium for Peop...Review Paper on Two Way Communication System with Binary Code Medium for Peop...
Review Paper on Two Way Communication System with Binary Code Medium for Peop...
 
Design of a Communication System using Sign Language aid for Differently Able...
Design of a Communication System using Sign Language aid for Differently Able...Design of a Communication System using Sign Language aid for Differently Able...
Design of a Communication System using Sign Language aid for Differently Able...
 
183
183183
183
 
GLOVE BASED GESTURE RECOGNITION USING IR SENSOR
GLOVE BASED GESTURE RECOGNITION USING IR SENSORGLOVE BASED GESTURE RECOGNITION USING IR SENSOR
GLOVE BASED GESTURE RECOGNITION USING IR SENSOR
 

More from IJECEIAES

Remote field-programmable gate array laboratory for signal acquisition and de...
Remote field-programmable gate array laboratory for signal acquisition and de...Remote field-programmable gate array laboratory for signal acquisition and de...
Remote field-programmable gate array laboratory for signal acquisition and de...
IJECEIAES
 
An efficient security framework for intrusion detection and prevention in int...
An efficient security framework for intrusion detection and prevention in int...An efficient security framework for intrusion detection and prevention in int...
An efficient security framework for intrusion detection and prevention in int...
IJECEIAES
 
A review on internet of things-based stingless bee's honey production with im...
A review on internet of things-based stingless bee's honey production with im...A review on internet of things-based stingless bee's honey production with im...
A review on internet of things-based stingless bee's honey production with im...
IJECEIAES
 
Seizure stage detection of epileptic seizure using convolutional neural networks
Seizure stage detection of epileptic seizure using convolutional neural networksSeizure stage detection of epileptic seizure using convolutional neural networks
Seizure stage detection of epileptic seizure using convolutional neural networks
IJECEIAES
 
Hyperspectral object classification using hybrid spectral-spatial fusion and ...
Hyperspectral object classification using hybrid spectral-spatial fusion and ...Hyperspectral object classification using hybrid spectral-spatial fusion and ...
Hyperspectral object classification using hybrid spectral-spatial fusion and ...
IJECEIAES
 
SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...
SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...
SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...
IJECEIAES
 
Performance enhancement of machine learning algorithm for breast cancer diagn...
Performance enhancement of machine learning algorithm for breast cancer diagn...Performance enhancement of machine learning algorithm for breast cancer diagn...
Performance enhancement of machine learning algorithm for breast cancer diagn...
IJECEIAES
 
A deep learning framework for accurate diagnosis of colorectal cancer using h...
A deep learning framework for accurate diagnosis of colorectal cancer using h...A deep learning framework for accurate diagnosis of colorectal cancer using h...
A deep learning framework for accurate diagnosis of colorectal cancer using h...
IJECEIAES
 
Predicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learningPredicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learning
IJECEIAES
 
Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...
IJECEIAES
 

More from IJECEIAES (20)

Remote field-programmable gate array laboratory for signal acquisition and de...
Remote field-programmable gate array laboratory for signal acquisition and de...Remote field-programmable gate array laboratory for signal acquisition and de...
Remote field-programmable gate array laboratory for signal acquisition and de...
 
Detecting and resolving feature envy through automated machine learning and m...
Detecting and resolving feature envy through automated machine learning and m...Detecting and resolving feature envy through automated machine learning and m...
Detecting and resolving feature envy through automated machine learning and m...
 
Smart monitoring technique for solar cell systems using internet of things ba...
Smart monitoring technique for solar cell systems using internet of things ba...Smart monitoring technique for solar cell systems using internet of things ba...
Smart monitoring technique for solar cell systems using internet of things ba...
 
An efficient security framework for intrusion detection and prevention in int...
An efficient security framework for intrusion detection and prevention in int...An efficient security framework for intrusion detection and prevention in int...
An efficient security framework for intrusion detection and prevention in int...
 
Developing a smart system for infant incubators using the internet of things ...
Developing a smart system for infant incubators using the internet of things ...Developing a smart system for infant incubators using the internet of things ...
Developing a smart system for infant incubators using the internet of things ...
 
A review on internet of things-based stingless bee's honey production with im...
A review on internet of things-based stingless bee's honey production with im...A review on internet of things-based stingless bee's honey production with im...
A review on internet of things-based stingless bee's honey production with im...
 
A trust based secure access control using authentication mechanism for intero...
A trust based secure access control using authentication mechanism for intero...A trust based secure access control using authentication mechanism for intero...
A trust based secure access control using authentication mechanism for intero...
 
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbers
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbersFuzzy linear programming with the intuitionistic polygonal fuzzy numbers
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbers
 
The performance of artificial intelligence in prostate magnetic resonance im...
The performance of artificial intelligence in prostate  magnetic resonance im...The performance of artificial intelligence in prostate  magnetic resonance im...
The performance of artificial intelligence in prostate magnetic resonance im...
 
Seizure stage detection of epileptic seizure using convolutional neural networks
Seizure stage detection of epileptic seizure using convolutional neural networksSeizure stage detection of epileptic seizure using convolutional neural networks
Seizure stage detection of epileptic seizure using convolutional neural networks
 
Analysis of driving style using self-organizing maps to analyze driver behavior
Analysis of driving style using self-organizing maps to analyze driver behaviorAnalysis of driving style using self-organizing maps to analyze driver behavior
Analysis of driving style using self-organizing maps to analyze driver behavior
 
Hyperspectral object classification using hybrid spectral-spatial fusion and ...
Hyperspectral object classification using hybrid spectral-spatial fusion and ...Hyperspectral object classification using hybrid spectral-spatial fusion and ...
Hyperspectral object classification using hybrid spectral-spatial fusion and ...
 
Fuzzy logic method-based stress detector with blood pressure and body tempera...
Fuzzy logic method-based stress detector with blood pressure and body tempera...Fuzzy logic method-based stress detector with blood pressure and body tempera...
Fuzzy logic method-based stress detector with blood pressure and body tempera...
 
SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...
SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...
SADCNN-ORBM: a hybrid deep learning model based citrus disease detection and ...
 
Performance enhancement of machine learning algorithm for breast cancer diagn...
Performance enhancement of machine learning algorithm for breast cancer diagn...Performance enhancement of machine learning algorithm for breast cancer diagn...
Performance enhancement of machine learning algorithm for breast cancer diagn...
 
A deep learning framework for accurate diagnosis of colorectal cancer using h...
A deep learning framework for accurate diagnosis of colorectal cancer using h...A deep learning framework for accurate diagnosis of colorectal cancer using h...
A deep learning framework for accurate diagnosis of colorectal cancer using h...
 
Swarm flip-crossover algorithm: a new swarm-based metaheuristic enriched with...
Swarm flip-crossover algorithm: a new swarm-based metaheuristic enriched with...Swarm flip-crossover algorithm: a new swarm-based metaheuristic enriched with...
Swarm flip-crossover algorithm: a new swarm-based metaheuristic enriched with...
 
Predicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learningPredicting churn with filter-based techniques and deep learning
Predicting churn with filter-based techniques and deep learning
 
Taxi-out time prediction at Mohammed V Casablanca Airport
Taxi-out time prediction at Mohammed V Casablanca AirportTaxi-out time prediction at Mohammed V Casablanca Airport
Taxi-out time prediction at Mohammed V Casablanca Airport
 
Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...Automatic customer review summarization using deep learningbased hybrid senti...
Automatic customer review summarization using deep learningbased hybrid senti...
 

Recently uploaded

1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
AldoGarca30
 
DeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakesDeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakes
MayuraD1
 
Standard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayStandard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power Play
Epec Engineered Technologies
 
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak HamilCara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Kandungan 087776558899
 

Recently uploaded (20)

1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
1_Introduction + EAM Vocabulary + how to navigate in EAM.pdf
 
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
NO1 Top No1 Amil Baba In Azad Kashmir, Kashmir Black Magic Specialist Expert ...
 
DeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakesDeepFakes presentation : brief idea of DeepFakes
DeepFakes presentation : brief idea of DeepFakes
 
Standard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power PlayStandard vs Custom Battery Packs - Decoding the Power Play
Standard vs Custom Battery Packs - Decoding the Power Play
 
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptxS1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
S1S2 B.Arch MGU - HOA1&2 Module 3 -Temple Architecture of Kerala.pptx
 
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptxHOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
HOA1&2 - Module 3 - PREHISTORCI ARCHITECTURE OF KERALA.pptx
 
PE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and propertiesPE 459 LECTURE 2- natural gas basic concepts and properties
PE 459 LECTURE 2- natural gas basic concepts and properties
 
Unit 4_Part 1 CSE2001 Exception Handling and Function Template and Class Temp...
Unit 4_Part 1 CSE2001 Exception Handling and Function Template and Class Temp...Unit 4_Part 1 CSE2001 Exception Handling and Function Template and Class Temp...
Unit 4_Part 1 CSE2001 Exception Handling and Function Template and Class Temp...
 
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best ServiceTamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
Tamil Call Girls Bhayandar WhatsApp +91-9930687706, Best Service
 
UNIT 4 PTRP final Convergence in probability.pptx
UNIT 4 PTRP final Convergence in probability.pptxUNIT 4 PTRP final Convergence in probability.pptx
UNIT 4 PTRP final Convergence in probability.pptx
 
Thermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.pptThermal Engineering -unit - III & IV.ppt
Thermal Engineering -unit - III & IV.ppt
 
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak HamilCara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
Cara Menggugurkan Sperma Yang Masuk Rahim Biyar Tidak Hamil
 
Orlando’s Arnold Palmer Hospital Layout Strategy-1.pptx
Orlando’s Arnold Palmer Hospital Layout Strategy-1.pptxOrlando’s Arnold Palmer Hospital Layout Strategy-1.pptx
Orlando’s Arnold Palmer Hospital Layout Strategy-1.pptx
 
💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...
💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...
💚Trustworthy Call Girls Pune Call Girls Service Just Call 🍑👄6378878445 🍑👄 Top...
 
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
Navigating Complexity: The Role of Trusted Partners and VIAS3D in Dassault Sy...
 
Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)Theory of Time 2024 (Universal Theory for Everything)
Theory of Time 2024 (Universal Theory for Everything)
 
Computer Networks Basics of Network Devices
Computer Networks  Basics of Network DevicesComputer Networks  Basics of Network Devices
Computer Networks Basics of Network Devices
 
Introduction to Data Visualization,Matplotlib.pdf
Introduction to Data Visualization,Matplotlib.pdfIntroduction to Data Visualization,Matplotlib.pdf
Introduction to Data Visualization,Matplotlib.pdf
 
Max. shear stress theory-Maximum Shear Stress Theory ​ Maximum Distortional ...
Max. shear stress theory-Maximum Shear Stress Theory ​  Maximum Distortional ...Max. shear stress theory-Maximum Shear Stress Theory ​  Maximum Distortional ...
Max. shear stress theory-Maximum Shear Stress Theory ​ Maximum Distortional ...
 
Computer Graphics Introduction To Curves
Computer Graphics Introduction To CurvesComputer Graphics Introduction To Curves
Computer Graphics Introduction To Curves
 

Application for Iraqi sign language translation on Android system

  • 1. International Journal of Electrical and Computer Engineering (IJECE) Vol. 10, No. 5, October 2020, pp. 5227~5234 ISSN: 2088-8708, DOI: 10.11591/ijece.v10i5.pp5227-5234  5226 Journal homepage: http://ijece.iaescore.com/index.php/IJECE Application for Iraqi sign language translation on Android system Miaad Ahmed Alobaidy, Sundus Khaleel Ebraheem Department of Computer Science, College of Computer Science and Mathematics, University of Mosul, Iraq Article Info ABSTRACT Article history: Received Feb 9, 2020 Revised Apr 13, 2020 Accepted Apr 25, 2020 Deaf people suffer from difficulty in social communication, especially those who have been denied the blessing of hearing before the acquisition of spoken language and before learning to read and write. For the purpose of employing mobile devices for the benefit of these people, their teachers and everyone who has contact with them, this research aims to design an application for social communication and learning by translating Iraqi sign language into text in Arabic and vice versa. Iraqi sign language has been chosen because of a lack of applications for this field. The current research, to the best of our knowledge, is the first of its kind in Iraq. The application is open source; words that are not found in the application database can be processed by translating them into letters alphabetically. The importance of the application lies in the fact that it is a means of communication and e-learning through Iraqi sign language, reading and writing in Arabic. Similarly, it is regarded as a means of social communication between deaf people and those with normal hearing. This application is designed by using JAVA language and it was tested on several deaf students at Al-Amal Institute for Special Needs Care in Mosul, Iraq. It was well comprehended and accepted. Keywords: Deaf person Machine translation Mobile application Sign language 3D avatar Copyright © 2020 Institute of Advanced Engineering and Science. All rights reserved. Corresponding Author: Miaad Ahmed Alobaidy, Department of Computer Science, College of Computer Science and Mathematics, University of Mosul, Mosul, Iraq. Email: mayadaalobaidy@uomosul.edu.iq 1. INTRODUCTION It is in human nature to seek to help others in ways possible according to abilities, expertise and knowledge. Similar to how medical experts have an important role in treating patients, and medical engineers in manufacturing tools and instruments to help the sick and injured, such as the manufacture of prostheses for the handicapped and headphones for the hearing impaired, there is also a major role for the programmer in creating and developing applications and technologies to help people with special needs and in order to make their lives easier. The role of the programmer lies in creating and developing applications and technologies to help deaf people to communicate with each other, as well as with people with normal hearing, in addition to facilitating their education. Undoubtedly, that the deaf constitute a large and influential percentage in society, but they suffer from the difficulty of communication. Sign language, as a means of communication, has been developed over thousands of years. It is defined as movements using one or both hands and sometimes using facial expressions [1]. With the invention of the Internet and the development of technology, the deaf have had a share in benefiting from this technology represented by providing means to educate and communicate. This issue has become an important area that has attracted the attention of many scientists and researchers working in this field.
  • 2. Int J Elec & Comp Eng ISSN: 2088-8708  Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein) 5227 Sign language recognition systems (SLRS) are one of the areas of human interaction with computers (HCI) that have had a significant impact on the deaf community [2]. Sign language differs from one country to another [3] because it is often the deaf who develop the sign language by identifying signs that have a special meaning for them. This is affected by the culture and environment in which they live, for example, when a deaf person in most Gulf countries refers to his chest to indicate the color white, which relates to the color white Gulf uniform they are used to wearing [4], but people of other countries and cultures do not wear this traditional attire so it will not have any special meaning for them. The aim of this paper is to create an application for smart phone devices to translate Iraqi sign language into what it corresponds to in classical Arabic and vice versa , as well as sending messages (SMS). This application helps the deaf to communicate with each other as well as with people with normal hearing. This application is of great significance for the deaf to learn to read and write Arabic. Through monitoring of many deaf people, it has been noticed that their ability to read and write Arabic gradually weakens after graduating from deaf teaching institutes because they do not use these skills as they usually use sign language. This constitutes a problem. Therefore, this application is designed to present the optimal solution as it provides the deaf person with a feasible way to learn and use the Arabic language continuously. This application is not only a means of communication, but also a way to continue learning and using Arabic. It also aims at increasing their level of awareness and mastery of this language in a way that is no less efficient than their level of sign language. The use of this application falls within the areas of e-learning and distance learning as it facilitates the education of deaf people who cannot join deaf institutes. It does not require the physical presence of a person, but it suffices for the deaf person to have the free application on a smart phone device and learn while being at home. This application seeks to store images of signs of all words, letters and numbers in the Iraqi sign language and then link these images to words that correspond to them in the Arabic. 2. RELATED WORKS The development of sign language translation applications is a research-worthy topic that has been given much attention by scholars. Being the focus of attention of many researchers, fruitful research papers in this field have been published. The following is a review of some of this research. Ludeña et al., introduced a system for translating Spanish sign language. This system is designed to translate spoken sentences in Spanish into their equivalent in the Spanish sign language. A speech recognition system was used to convert spoken sentences into written sentences, then the written sentences are translated into the equivalent in Spanish sign language, using a virtual human (avatar). This system has been applied to work in two domains: bus transport information and hotel reception, hence, most of the vocabulary needed by staff and deaf users in these two domains has been absorbed [5]. In 2012, a chat system was designed by Hanadi et al. for deaf people, the first of its kind in the Arab world. Through this system, Deaf people can communicate with each other, share pictures and multimedia and learn about each other's experiences. The system consists of two applications: sign language translation tool and chat system, both of which use the same database which includes the unified Arabic sign language dictionary to translate Arabic sign language [6]. Harsh et al., use a Kinect device that contains a set of sensors with the ability to capture a user photo and take many shots that embody the user's movements when his hands are above the trunk area. These snapshots are used as input to systems that use special algorithms to distinguish images and compare them with special images in Indian sign language stored within a special database. It is then translated into the corresponding Indian language [7]. Mohammed et al., eveloped a data acquisition and control (DAC) system that translates sign language into text that can be read by anyone. The system is called gesture recognition and sign language translator and it utilizes a smart glove that captures the gesture of the hand and interprets these gestures into readable text. This text can be sent wirelessly to a smart phone or shown in an embedded LCD display [8]. In 2016, the health questionnaire EuroQol EQ-5D-5L used to assess an individual's general health was of interest to Rogers et al. They translated this questionnaire from English to British sign language. The descriptive system for this tool consists of five dimensions: movement – self-care – usual activities – pain/discomfort – anxiety/depression. Each dimension has five levels: no problems – simple problems – moderate problems – acute problems – severe problems. All these levels are written in the English language, and when you click on any level, the video clip that represents it in the British sign language (BSL) is displayed, so that the deaf person can choose the level that represents his health status [9]. Another system produced by Basma et al., uses the leap motion controller (LMC). This device works at a rate of 200 frames per second. These frames are used to determine the number, position and rotation of hands and fingers. The workflow consists of several steps: Pre-processing – Tracking – Features – Extraction – sign classification. This system can recognize fixed signals such as letters and numbers. It also
  • 3.  ISSN: 2088-8708 Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234 5228 recognizes dynamic signals that include motions. Similarly, it proposes a method for dividing a series of continuous signals based on palm speed tracking. Thus, the system not only translates the already fragmented signals but is also able to translate continuous sentences [10]. Sujay et al., roposed a system to translate American sign language; this system consists of two stages: - Stage 1: isolated gloss recognition system: The inputs into this stage are video clips. Systems are used to distinguish American Sign Language from these videos. These systems consist of video pre-processing and a time series neural network module. - Stage 2: gloss to speech neural translator: This stage works on the outputs of the previous stage, where the signals of the American Sign Language derived from the previous stage are translated into the corresponding words in English [11]. Ebling et al., introduced a system for translating German sign language. Microsoft Kinect v2 depth sensor and two GoPro Hero 4 Black video cameras were used, placed in front of the signer, in addition to three cameras that are installed on the top, right, and left of the signer, so that signals are captured from different angles. The well-known recording program BosphorusSign Recording Software was developed to synchronize the capture from various devices [12]. A mobile application was designed to translate Pakistani sign language by Parvez et al. Samples were selected from 192 deaf participants aged 5-10 years from two special institutes for children. The aim of this study is to determine the effectiveness of the mobile interface through an advanced mobile application to learn basic mathematical concepts using Pakistan sign language (PSL). This study bridges the gap between technology-based and traditional teaching methods, which are used to teach mathematical concepts using PSL [13]. 3. BACKGROUND Sign language translation using a computer or cell phone device is a form of machine translation where words are translated from source language into target language using a sign language translation system. Many scholars and researchers have dealt with the topic of translating sign language in different ways, each method having its own scope of application and use. They have achieved different success rates. Some studies have utilized cameras [14-16] or gloves with sensors [17-19] and many algorithms and artificial intelligence techniques to distinguish the sign language. An optimal method cannot be identified because what suits a particular situation may not be appropriate for another. For example, if a deaf person gives a lecture to a group of students, in this case it is preferable to utilize the methods used with the camera or electronic gloves equipped with sensors to translate the sign language. However, during the daily life of a deaf person when wanting to communicate with family and friends, or anyone encountered on the road or at the work site, it is better to use the sign language translation systems represented by using an application on a mobile device. The data entry into this application is through a dedicated keyboard designed for this purpose. Using this application is easy and cost efficient. It is worth noting that most sign language translation systems are designed to be one-way only. Some of these systems work to translate sign language into spoken language only [20], while others work to translate spoken language into sign language only [21]. This does not allow the full integration of deaf people into society. It is also noted that many of these systems use video clips or images of very large sizes to represent the sign language [22, 23]. Consequently, this requires the use of a dedicated server for storing video clips or images. Therefore, it needs to use the Internet, which leads to restricting the use of these systems and thus reduces the flexibility of use. It was noticed through the study of the Arabic language and sign language that the root word, its derivatives and conjugations have the same translation in the sign language [24, 25]. Hence, this research aims at providing a flexible and free application. 4. METHODOLOGY To explain how the Iraqi sign language translation system works, we offer the following. 4.1. General overview An overview of the steps involved in implementing the proposed system are listed as follows. a. In this research, a free application for smart phones is designed to translate Iraqi sign language into the equivalent in spoken Arabic. This application is designed using “Android Studio” as a development environment to produce applications that run on devices that are managed by Android as an operating system. As for the programming language, Java Micro Edition has been used because of the ease of this language in transferring the application on different devices. Due to the nature of this application that requires the use of a very large number of images that represent words, letters and numbers in the sign
  • 4. Int J Elec & Comp Eng ISSN: 2088-8708  Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein) 5229 language, needing very large storage space, an efficient way to store images with the least possible storage space was required, and for this purpose “Blender” was used. It is a free 3D design program launched under the open-source General Public License (GNU), which is used to create 3D movies, movie tricks, interactive 3D programs, and video games. Blender capabilities include modeling, cladding, bone positioning and tuning [26, 27]. This program provides the ability to use a virtual human (avatar) to visualize sign language movements, and then store the images of the avatar in the application database for use in the translation process. b. Sign language grammar has been studied so that we can properly represent the sign language images, as the sign language has its own way of representing derivations, conjugation of verbs, expressing singular, folded, and plural, as well as expressing feminine and masculine aspects of grammar amongst others [1]. 4.2. User interface The user interface is designed to be as easy to use as in Figure 1. It contains the following components: - Tape to display the Arabic text. - The button (‫ترجم‬ / translate) to implement the translation of the text into the Iraqi sign language. - Virtual human (avatar) to display translation in the Iraqi sign language. - Button to show and hide the keyboard. - SMS button. - The keyboard to enter text in classical Arabic, containing all the 28 Arabic language characters in addition to the special symbols (‫,ى‬ ‫,ء‬ ‫,ئ‬ ‫,ؤ‬ ‫,)ة‬ as well as a special button to display numbers (0-9). Each button in it was represented so that it carries an image and symbol of the letter or the number that represents it. Thus it can be used easily by the healthy person and the deaf and mute person alike. Also, a special button was added to put a space between words, as well as a button for scanning. - The button “‫االشارة‬ ‫”لغة‬ to go to the window for translating Iraqi sign language into standard Arabic. Figure 1. User interface 4.3. Database A database is designed with several fields. The first field “word” is designated for storing words, letters, and numbers. The second field “img_name” contains the names of the images that correspond to the words. It is to be noted that some words need more than one image to be translated. The third field “Categories” contains the names of the categories, and the fourth field “Additional-img” contains the names of additional images to express the direction of movement. When the user presses the button “‫/ترجم‬ translate”, the application, through a special code, searches inside the database for the names of the images that correspond to each word separately, then a function for displaying images is called using “img_name” field. The process of searching for the names of images and calling the image display function for each word continues until the end of the sentence is reached.
  • 5.  ISSN: 2088-8708 Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234 5230 4.4. Images display The process of displaying images from the database is done through a programmed function that is provided with 2 parameters: the first is a series of letters representing the name of the image and the second is a numerical value representing the speed of display determined at (500 ms). Prior experience has proven that this is an ideal speed for display on mobile devices that are managed by the Android operating system. Through implementing this function, sign language images are displayed as animated films on the application screen, represented by the movements that the virtual human (avatar) performs. 5. THE PROPOSED SYSTEM WORKS Application works includes two modes: a. First mode: from a hearing person to a deaf person see Figure 2(a) The text written in classical Arabic is translated into its equivalent in Iraqi sign language once the user uses the keyboard to write text for the purpose of translating it, clicking on each letter or number that represents it. It will be written in a dedicated bar to display the text. This procedure is implemented through the use of a special method programmed in Java language. The work of this method depends on the index for each button on the keyboard, where each index is linked to a letter or number representing that button. After completing writing the text, the user presses the button “‫ترجم‬ /translate”. The program searches within the database for the names of images that correspond to each word separately. As a result, the image display method is called upon. If the user enters words that are not in the database, such as the names of people or words that contain spelling errors, the application displays images of the letters of the text alphabetically. b. Second mode: from the deaf person to the hearing person see Figure 2(b) Iraqi sign language is translated into classical Arabic, where the button “‫االشارة‬ ‫”لغة‬ designed in the user interface, when pressed, will move to the second mode, and to facilitate the process of finding the required images, the images are organized into categories (alphabet, language, verbs, adjectives, history, geography, etc.). Once the button “‫االشارة‬ ‫”لغة‬ is clicked, a list of these categories will be displayed. In the same way, when any category is clicked, images that represent the words in this category will be displayed. Moreover, clicking on any image will have the word representing that image written in a tape for translation in Arabic text. The user is able to send the messages via (SMS). As the application deals with a complete language, it is difficult to absorb all vocabularies of the language in a short period. So, this application is designed to be open source so that more words can be added whenever needed so that as many words as possible can be stored over time. (a) (b) Figure 2. Application's modes: (a) first mode and (b) second mode
  • 6. Int J Elec & Comp Eng ISSN: 2088-8708  Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein) 5231 6. PRACTICAL EXAMPLES 6.1. Example 1 If the user uses the first mode and writes a sentence for example: “‫ألتعلم‬ ‫المدرسة‬ ‫الى‬ ‫.”أذهب‬ Clicking on the button”‫,”ترجم‬ the sentence will be translated into the Iraqi sign language as shown in Figure 3. - Note that the word “‫”أذهب‬ has been translated by Figure 3(a) and (b). - The word “‫”الى‬ was translated by the display of Figure 3(c) and (d). - The word “‫”المدرسة‬ is translated by displaying images of Figure 3(e) and (f). - The translation of the word “‫”ألتعلم‬ is displayed by displaying Figure 3(g) and (h). (a) (b) (c) (d) (e) (f) (g) (h) Figure 3. Translate a sentence written in standard Arabic into its equivalent in Iraqi sign language 6.2. Example 2 If the user uses the first mode and writes the name of a person, for example “‫حسين‬ ‫”احمد‬ and as is well known, the sign language does not contain images of the names, so the names will be translated alphabetically, as shown in Figure 4. - In Figure 4(a), we notice the image represents the letter “‫.”الف‬ - In Figure 4(b), the image represents the letter “‫.”حاء‬ - In Figure 4(c), the image represents the letter “‫.”ميم‬ - In Figure 4(d), the image represents the letter “‫.”دال‬ - In Figure 4(e), the image represents the letter “‫.”حاء‬ - In Figure 4(f), the image represents the letter “‫.”سين‬ - In Figure 4(g), the image represents the letter “‫.”ياء‬ - In Figure 4(h), the image represents the letter “‫.”نون‬ As for the speed with which these images are displayed, it programmatically determined so that they are comfortable for the eye and clear. 6.3. Example 3 If the user uses the second mode by clicking on the button “‫االشارة‬ ‫,”لغة‬ a list of categories names will appear, as shown in Figure 5. By selecting the category and clicking on it, the sign language images for this category will be displayed. For example, if the user wants to choose images for the phrase “‫كتاب‬ ‫”أقرأ‬ he will
  • 7.  ISSN: 2088-8708 Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234 5232 click on the category “‫”أالفعال‬ and from it he will select the image that represent the word “‫”أقرأ‬ as shown in Figure 6(a). Then he moves to the category “‫ومستلزماتها‬ ‫”المدرسة‬ and specifies the image for the word “‫”كتاب‬ as in Figure 6(b), thus, the sentence is translated from sign language into standard Arabic. (a) (b) (c) (d) (e) (f) (g) (h) Figure 4. Translate words alphabetically (a) (b) Figure 5. List of categories names Figure 6. Translating from Iraqi sign language to standard Arabic
  • 8. Int J Elec & Comp Eng ISSN: 2088-8708  Application for Iraqi sign language translation on Android system (Miaad Ahmed Hussein) 5233 7. CONCLUSION The use of this application facilitates the integration of deaf people into society through the use of the easiest and most widely used means in the world, which are mobile phone applications; this paper puts forward a system for translating the Iraqi sign language, due to the lack of applications related to this dialect. This application is an educational system that can be applied to teach deaf people Iraqi sign language. It also teaches them how to read and write in standard Arabic. Additionally, the application teaches healthy people Iraqi sign language easily, thus removing barriers between deaf people and healthy people and facilitating the process of communication between them. Since this application works without using the Internet, this is easy to use it at any time and any place. The use of (Blender) program makes it possible to store the huge amount of images that the application needs by dividing the images and reducing the display resolution, thus reducing the size of the images. There are two proposes for future works: 1) to expand this system to include the use of sound for input and output by healthy people, thus facilitating further communication; and 2) use the root word extraction algorithm in Arabic language, so that the program works on the outputs of this algorithm. Thus, it may be possible to reduce the size of the databases to some extent as well as reducing the number of images that we need for translation. The root word is stored in databases on behalf of the derivatives or the conjugation of that word, which all have the same translation in sign language. This increases the efficiency of the translation process. ACKNOWLEDGEMENTS Our thanks to the University of Mosul, represented by all colleges and departments, especially the Department of Computer Science. Our thanks and appreciation to the members of the journal for allowing us to present our research. We especially thank “Al-Amal Institute for Special Needs Care, Mosul” for their cooperation in highlighting effective ways in helping the deaf to be integrated into society. REFERENCES [1] S. Samreen, M. Al-Binali, “Standard Qatari Sign Language,” Doha The Supreme Council for Doha Affairs, 2010. [2] N. B. Ibrahim, M. M. Selim, and H. H. Zayed, “An Automatic Arabic Sign Language Recognition System (ArSLRS),” Journal of King Saud University - Computer and Information Sciences, vol. 30, issue 4, pp. 470-477, 2018. [3] M. Abdo, et al., “Arabic Alphabet and Numbers Sign Language Recognition,” International Journal of Advanced Computer Science and Applications, vol. 6, no. 11, pp. 209-214, 2015. [4] H. H. Nasereddin, “MMLSL: Modelling Mobile Learning for Sign Language,” Of Engineering and Computer Science Research and Reviews in Applied Sciences, vol. 9, no. 2, pp. 20267-20272, 2017. [5] V. López-Ludeña, et al., “Methodology for developing a Speech into Sign Language Translation System in a New Semantic Domain,” 2012. [6] H. Al-Dosri, N. Alawfi, and Y. Alginahi, “Arabic sign language easy communicate ArSLEC,” in Proc. Int. Conf. Comput. Inf. Technol, pp. 474-479, 2012. [7] H. V. Verma, E. Aggarwal and S. Chandra, "Gesture recognition using kinect for sign language translation," 2013 IEEE Second International Conference on Image Information Processing (ICIIP-2013), Shimla, pp. 96-100, 2013. [8] M. Elmahgiubi, M. Ennajar, N. Drawil and M. S. Elbuni, "Sign language translator and gesture recognition," 2015 Global Summit on Computer & Information Technology (GSCIT), Sousse, pp. 1-6, 2015. [9] K. D. Rogers, et al., “Translation, validity and reliability of the British Sign Language (BSL) version of the EQ-5D-5L,” Quality of Life Research, vol. 25, no. 7, pp. 1825–1834, 2016.. [10] B. Hisham and A. Hamouda, “Arabic Static and Dynamic Gestures Recognition Using Leap Motion,” JCS, vol. 13, no. 8, pp. 337-354, 2017. [11] S. S Kumar, T. Wangyal, V. Saboo and R. Srinath, "Time Series Neural Networks for Real Time Sign Language Translation," 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL, pp. 243-248, 2018. [12] S. Ebling, et al., “SMILE Swiss German sign language dataset”. In Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), pp. 4221-4229, 2018. [13] K. Parvez, et al. “Measuring Effectiveness of Mobile Application in Learning Basic Mathematical Concepts Using Pakistan Sign Language,” Sustainability, vol. 11, no. 11, p. 3064, 2019. doi: https://doi.org/10.3390/su11113064 [14] A. A. Youssif, et al., “Arabic Sign Language (ArSL) Recognition System Using HMM,” International Journal of Advanced Computer Science and Applications, vol. 2, no. 11, pp. 45-51, 2011. [15] N. R. Albelwi and Y. M. Alginahi, “Real-time Arabic sign language (arsl) recognition,” in International Conference on Communications and Information Technology, pp. 497-501, 2012. [16] S. Shahriar et al., "Real-Time American Sign Language Recognition Using Skin Segmentation and Image Category Classification with Convolutional Neural Network and Deep Learning," TENCON 2018 - 2018 IEEE Region 10 Conference, Jeju, Korea (South), pp. 1168-1171, 2018.
  • 9.  ISSN: 2088-8708 Int J Elec & Comp Eng, Vol. 10, No. 5, October 2020 : 5227 - 5234 5234 [17] P. D. Rosero-Montalvo et al., "Sign Language Recognition Based on Intelligent Glove Using Machine Learning Techniques," 2018 IEEE Third Ecuador Technical Chapters Meeting (ETCM), Cuenca, pp. 1-5, 2018. [18] B. Sapkota, et al., “Smart Glove for Sign Language Translation Using Arduino,” in 1st KEC Conference Proceedings, vol. 1, pp. 5-11, 2018. [19] S. A. Mehdi and Y. N. Khan, "Sign language recognition using sensor gloves," Proceedings of the 9th International Conference on Neural Information Processing, 2002. ICONIP '02., Singapore, vol. 5, pp. 2204-2206, 2002. [20] R. Ambar, et al., “Development of a Wearable Device for Sign Language Recognition,” Journal of Physics: Conference Series, vol. 1019, p. 012017, 2018. doi: 10.1088/1742-6596/1019/1/012017 [21] A. Wray, et al., “A formulaic approach to translation at the post office: reading the signs,” Language & Communication, vol. 24, no. 1, pp. 59–75, 2004. [22] F. Al Ameiri, M. J. Zemerly and M. Al Marzouqi, "Mobile Arabic sign language," 2011 International Conference for Internet Technology and Secured Transactions, Abu Dhabi, pp. 363-367, 2011. [23] A. Almohimeed, M. Wald, and R. I. Damper, “Arabic text to Arabic sign language translation system for the deaf and hearing-impaired community,” in Proceedings of the Second Workshop on Speech and Language Processing for Assistive Technologies, pp. 101-109, 2011. [24] R. Z. Al-Obaidy, “Sentiment analysis for Arabic tweets using support vector machine,” Master's thesis, College of Computer Sciences And Mathematics, University of Mosul, 2019. [25] A. Y. Taqa, “Constructing Anti – spam filter based on naive bayesian classifier,” PhD thesis, College of Computer Sciences And Mathematics, University of Mosul, 2007. [26] “Blender.” https://ar.wikipedia.org/wiki/%D8%A8%D9%84%D9%86% D8%AF%D8%B1 (accessed 2/1/2020). [27] “Top 55+ Best Animation Software: The Ultimate List 2019,” Renderforest, 2019. [Online]. Available: https://www.renderforest.com/blog/best-animation-software, [accessed: Aug 9 2019]. BIOGRAPHIES OF AUTHORS Miaad Ahmed Alobaidy received her B.Sc.2017 in computer sciences from computer sciences and mathematics college in university of Mosul. She is currently M.Sc. student in computer sciences field of image processing in university of Mosul. She has interests in fields like database and website design. Sundus Khaleel Ebraheem received her B.Sc. 1996 in computer sciences and received her M.Sc. degree 2001 in computer sciences field of image processing from university of Mosul. She is currently working as assistant professor in computer sciences and mathematics college in university of Mosul. She has 22 years experience in teaching field and has 19 published research papers in various scientific journals.