This document discusses a new compression technique, Partition Group Binary Compression (PGBC), for compressing DNA sequences. It begins with background on DNA sequencing and the challenges of compressing large DNA databases. It then reviews existing compression algorithms such as Huffbit Compress and Genbit Compress, which use 2-bit encoding but perform poorly on sequences with few repeats. The proposed PGBC technique aims to achieve better compression ratios than existing methods even for sequences with little repetition. The paper is organized into sections on general compression algorithms, related existing algorithms, a description of how PGBC improves on them, a comparative study on sample sequences, and conclusions.
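The 2-bit encoding that these DNA compressors share is easy to sketch: each of the four bases gets two bits, so four bases pack into one byte, a 4x reduction over 8-bit ASCII. The particular A/C/G/T bit assignments below are illustrative, not taken from the paper.

```python
# Baseline 2-bit packing of a DNA string, four bases per byte.
# Bit assignments here are illustrative; different papers choose different maps.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def encode_2bit(seq: str) -> bytes:
    """Pack a DNA string into bytes, four bases per byte."""
    out = bytearray()
    acc, nbits = 0, 0
    for base in seq.upper():
        acc = (acc << 2) | CODE[base]
        nbits += 2
        if nbits == 8:          # a full byte is ready
            out.append(acc)
            acc, nbits = 0, 0
    if nbits:                   # pad a trailing partial byte with zero bits
        out.append(acc << (8 - nbits))
    return bytes(out)

packed = encode_2bit("ACGTACGT")
print(len(packed))  # 2 bytes for 8 bases
```

Repeat-aware schemes like PGBC aim to beat this fixed 2 bits/base floor by exploiting whatever repetition the sequence has.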
HPC-MAQ: A PARALLEL SHORT-READ REFERENCE ASSEMBLER
Bioinformatics and computational biology are rooted in life sciences as well as computer and
information sciences and technologies. Bioinformatics applies principles of information
sciences and technologies to make the vast, diverse, and complex life sciences data more
understandable and useful. Computational biology uses mathematical and computational
approaches to address theoretical and experimental questions in biology. Short read sequence
assembly is one of the most important steps in the analysis of biological data. Many open-source
software packages are available for short-read sequence assembly; MAQ is one that is widely
used by the research community.
In general, the biological data sets generated by next-generation sequencers are extremely large
and demand a tremendous amount of computational resources. The short-read sequence assembly
problem is NP-hard, making it computationally expensive and time-consuming. Moreover, MAQ is
single-threaded software: it cannot exploit multi-core or distributed computing, and it does not
scale. In this paper we report HPC-MAQ, which addresses the NP-hard challenges of genome
reference assembly and makes MAQ parallel and scalable through Hadoop, a software framework for distributed computing.
A comparative review on symmetric and asymmetric DNA-based cryptography
Recent research has focused on DNA-based cryptography: DNA, or deoxyribonucleic acid, has been applied in cryptography for performing computation as well as for storing and transmitting information. In the present work, we exploit DNA's cryptographically useful properties, namely its storage capability (superior information density) and its parallelism, to improve classical cryptographic algorithms. Data encryption is made possible via DNA sequences. In this paper, two cases utilizing different DNA properties were studied by combining DNA codes with conventional cryptographic algorithms. The first case concerned symmetric cryptography, combining DNA coding with the one-time pad (OTP) algorithm. Asymmetric cryptography was considered in the second case by incorporating DNA codes into the RSA algorithm. The efficiency of DNA coding in OTP, RSA, and other algorithms is reported. We observed that the computational time of the RSA algorithm combined with DNA coding was longer; to alleviate this problem, data redundancy was reduced by applying GZIP compression. The experimental results show that DNA symmetric cryptography performed well in both time and size analyses. Nevertheless, it was less efficient than the compressed DNA asymmetric cryptography.
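The symmetric case the review describes, DNA coding combined with a one-time pad, can be sketched as follows. The A/C/G/T bit mapping and the helper names are our assumptions, not the paper's; only the overall scheme (XOR with a random pad, then transcription to bases) follows the abstract.

```python
# Sketch of DNA-coded one-time-pad encryption (mapping is an assumption).
import secrets

BASES = "ACGT"  # index 0..3 encodes two bits per base

def to_dna(data: bytes) -> str:
    """Transcribe bytes into a DNA string, four bases per byte."""
    return "".join(BASES[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def from_dna(dna: str) -> bytes:
    """Invert to_dna: four bases back into one byte."""
    vals = [BASES.index(c) for c in dna]
    return bytes((vals[i] << 6) | (vals[i + 1] << 4) | (vals[i + 2] << 2) | vals[i + 3]
                 for i in range(0, len(vals), 4))

def otp_encrypt(plaintext: bytes):
    pad = secrets.token_bytes(len(plaintext))             # one-time pad key
    cipher = bytes(p ^ k for p, k in zip(plaintext, pad))  # XOR with the pad
    return to_dna(cipher), pad

def otp_decrypt(dna_cipher: str, pad: bytes) -> bytes:
    cipher = from_dna(dna_cipher)
    return bytes(c ^ k for c, k in zip(cipher, pad))

dna, key = otp_encrypt(b"GENOME")
assert otp_decrypt(dna, key) == b"GENOME"  # round trip recovers the plaintext
```

The OTP retains its information-theoretic security only if the pad is truly random, as long as the message, and never reused; the DNA transcription adds density, not secrecy.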
A Study on DNA based Computation and Memory Devices
The present study delineates Deoxyribonucleic Acid (DNA) based computing and storage devices, which have a promising future in the vast era of information technology. The traditional devices in common use are made of silicon; they are costly and have physical limitations that cause electron leakage and short circuits. There is therefore a need for materials capable of fast processing with vast memory storage. DNA, a biomolecule, has these characteristics and can provide ample storage. In classical computing devices, electronic logic gates are the elements that allow information to be stored and transformed. Designing an appropriate sequence or network of "store" and "transform" operations (in the sense of building a device or writing a program) is equivalent to preparing a computation. In DNA-based computation the situation is analogous; the main difference is the type of computing device, since in this new method of computing DNA molecules, rather than electronic gates, are deployed to process data. Moreover, the inherent massive parallelism of DNA computing may lead to methods for solving otherwise intractable computational problems. The aim of this study is to analyze the logical features and memory formation of DNA biomolecules in order to achieve greater speed, accuracy, and storage.
Comparison Of Various Lossless Image Compression Techniques
Today, images are considered major information carriers: they can convey far more information to the viewer than a few pages of written text. For this very reason, image processing has become an active field of research. Compression techniques are basically of two types, lossy and lossless. Since information is power, preserving it completely and exactly is of great importance, and in such cases lossless techniques are the best option. This paper compares the different lossless image compression techniques available today.
Enhanced Level of Security using DNA Computing Technique with Hyperelliptic C...
Hyperelliptic Curve Cryptography (HECC) is a public key
cryptographic technique required for secure transmission.
HECC improves on existing public key cryptography
techniques such as RSA, DSA, AES, and ECC in terms of
smaller key size. DNA cryptography is a next-generation
security mechanism, storing almost a million gigabytes of
data inside DNA strands. Existing DNA-based elliptic curve
cryptographic techniques require a larger key size to encrypt
and decrypt the message, resulting in increased processing
time and greater computational and memory overhead. To
overcome these limitations, DNA strands are used to encode
the data, providing the first level of security, and the HECC
encryption algorithm provides the second level. This proposed
integration of DNA computing with HECC therefore provides a
higher level of security with less computational and memory
overhead.
Develop and design hybrid genetic algorithms with multiple objectives in data...
Data compression plays a significant role and is necessary to minimize storage size and accelerate data transmission over the communication channel. Dictionary-based text compression offers a new approach: hybrid dictionary compression (HDC) uses a single compression system in place of several separate systems for characters, syllables, or words. HDC is a new technique that has proven itself especially on text files. Compression effectiveness depends on the quality of the hybrid dictionary of characters, syllables, and words, which is created by a forward analysis of the text file. In this research, a genetic algorithm (GA) is used as the search engine for obtaining this dictionary and performing the compression. GAs are stochastic search algorithms modeled on natural selection and the mechanisms of genetics. The research aims to combine Huffman's multiple trees and GAs in one compression system. Huffman coding is a compression method that creates variable-length prefix codes, and the GA is used to search for the Huffman tree that gives the best compression ratio [1]. Because different code assignments to the text's characters lead to different trees, GAs can guide the search toward better solutions: a population of trees is created, the fittest survive the selection test, and their genetic information is reused.
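The Huffman half of the hybrid can be sketched with a standard heap-based code builder; the GA search over alternative trees described above is not shown, and the function name is ours.

```python
# Minimal Huffman code construction: repeatedly merge the two lightest
# nodes, prefixing '0'/'1' onto the codes of their member symbols.
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict:
    """Return a prefix-free bit string for each symbol in `text`."""
    freq = Counter(text)
    if len(freq) == 1:                      # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    # heap items: [weight, tie-breaker, [symbol, code], [symbol, code], ...]
    heap = [[f, i, [sym, ""]] for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    nxt = len(heap)                         # unique tie-breaker for merged nodes
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[2:]:
            pair[1] = "0" + pair[1]         # lighter subtree gets a leading 0
        for pair in hi[2:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0], nxt] + lo[2:] + hi[2:])
        nxt += 1
    return {sym: code for sym, code in heap[0][2:]}

codes = huffman_codes("aaaabbc")
print(codes["a"], codes["b"], codes["c"])   # 'a' gets the shortest code
```

A GA as described would treat alternative (possibly suboptimal) trees as individuals and use compressed size as the fitness function; for a fixed symbol granularity the greedy construction above is already optimal, so the GA's leverage comes from choosing what goes into the hybrid dictionary.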
DNA Encryption Algorithms: Scope and Challenges in Symmetric Key Cryptography
Data security is now a crucial issue in our day-to-day life: the protection of personal identity and personal finances depends on the protection of important and irreplaceable information. Cryptography is the science of converting readable information into an unreadable format that is hard to decipher. In modern times, cryptography has adopted a new medium: human DNA. At a time when conventional cryptography has been losing ground to more advanced cryptanalysis, DNA cryptography has added more elements of confusion and diffusion, and the use of DNA sequences to encrypt data has strengthened the existing classical encryption algorithms. Thus, DNA cryptography has added another dimension to conventional cryptography. In the present paper the authors make a systematic study of DNA encryption algorithms and how they can be used alongside standard classical encryption algorithms.
Writing long sentences is a bit boring, but text prediction in keyboard technology has made
this simple. The learning technology behind the keyboard is developing fast and has become more
accurate. Learning technologies such as machine learning and deep learning play an important role
in predicting text. Current techniques in deep learning have opened the door for this kind of
analysis, and emerging architectures such as Region CNN and Recurrent CNN have been under
consideration for it. Many techniques have been used for text sequence prediction, such as
Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), and Recurrent Convolutional
Neural Networks (RCNN). This paper aims to provide a comparative study of the different techniques
used for text prediction.
Proteins. 2013 Nov;81(11):1885-99. doi: 10.1002/prot.24330. Epub 2013 Aug 16.
DNABind: A hybrid algorithm for structure-based prediction of DNA-binding residues by combining machine learning- and template-based approaches.
Liu R, Hu J.
Convolutional Neural Network and Feature Transformation for Distant Speech Re...
In many applications, speech recognition must operate in conditions where there is some distance between the speakers and the microphones; this is called distant speech recognition (DSR). In this condition, speech recognition must deal with reverberation. Nowadays, deep learning technologies are becoming the main technologies for speech recognition, and a Deep Neural Network (DNN) in hybrid with a Hidden Markov Model (HMM) is the commonly used architecture. However, this system is still not robust against reverberation. Previous studies used Convolutional Neural Networks (CNN), a variation of the neural network, to improve the robustness of speech recognition against noise. CNN has the property of pooling, which finds local correlations between neighboring dimensions in the features; with this property, CNN can perform feature learning that emphasizes information in neighboring frames. In this study we use CNN to deal with reverberation. We also propose applying feature transformation techniques, linear discriminant analysis (LDA) and maximum likelihood linear transformation (MLLT), to the mel-frequency cepstral coefficients (MFCC) before feeding them to the CNN. We argue that transforming the features produces more discriminative features for the CNN, and hence improves the robustness of speech recognition against reverberation. Our evaluations on the Meeting Recorder Digits (MRD) subset of the Aurora-5 database confirm that the LDA and MLLT transformations improve robustness, giving a 20% relative error reduction compared to a standard DNN-based speech recognizer with the same number of hidden layers.
On Text Realization Image Steganography
In this paper the steganography strategy is implemented in a different way and from a different scope: the important data is neither hidden in an image nor transferred through the communication channel inside an image. Instead, a well-known image that exists on both sides of the channel is used, and a text message containing the important data is transmitted. With suitable operations, the source image can be re-mixed and re-made. The algorithm is implemented in MATLAB 7 and shows a high ability to achieve this task for images of different types and sizes. Perfect reconstruction was achieved on the receiving side. Most interestingly, this algorithm for secured image transmission transmits no images at all.
SBVRLDNACOMP: AN EFFECTIVE DNA SEQUENCE COMPRESSION ALGORITHM
Many specific types of data need to be compressed for easy storage and to reduce overall retrieval times. Moreover, compressed sequences can be used to understand similarities between biological sequences. DNA data compression has become a major challenge for many researchers over the last few years as a result of the exponential increase of sequences in gene databases. In this paper we develop an algorithm based on self-referencing bases, namely Single Base Variable Repeat Length DNA Compression (SBVRLDNAComp). A number of reference-based compression methods exist, but they are unsatisfactory for newly sequenced species. SBVRLDNAComp selects the optimal result among small to long, uniform, identical and non-identical strings of nucleotides checked in four different ways, and it compresses both exactly repetitive and non-repetitive bases. Notably, without any reference database, SBVRLDNAComp achieves a compression ratio α of 1.70 to 1.73 on ten benchmark DNA sequences. The compressed file can be compressed further with standard tools (such as WinZip or WinRAR), but even without this SBVRLDNAComp outperforms many standard DNA compression algorithms.
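DNA compressors conventionally report their ratio in bits per base (so the fixed 2-bit encoding scores 2.0, and lower is better). Reading the paper's α this way is our assumption, but the arithmetic is simple:

```python
# Compression ratio as bits of output per input nucleotide
# (assumes the uncompressed sequence uses one character per base).
def bits_per_base(original_bases: int, compressed_bytes: int) -> float:
    """Bits of compressed output spent per input nucleotide."""
    return compressed_bytes * 8 / original_bases

# e.g. a 1,000,000-base sequence compressed into 215,000 bytes:
print(round(bits_per_base(1_000_000, 215_000), 2))  # 1.72 bits/base
```

On this scale a ratio of 1.70 to 1.73 means roughly a 14-15% saving over plain 2-bit packing, and about a 4.6x saving over 8-bit text.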
Image Compression Through Combination Advantages From Existing Techniques
The tremendous growth of digital data has created a strong need for compression, whether to minimize memory usage or to increase transmission speed. Although many techniques already exist, there is still room and need for new techniques in this area of study. In this paper we introduce a new technique for data compression through pixel combinations, usable for both lossless and lossy compression. The technique can serve as a standalone solution, or as an add-on to another compression method providing better results. It is applied here only to images, but it can easily be modified to work on any other type of data. We present a side-by-side comparison, in terms of compression rate, of our technique with other widely used image compression methods, and show that the compression ratio it achieves ranks among the best in the literature while the algorithm itself remains simple and easily extensible. Finally, we make the case for our method's ability to intrinsically support and enhance methods used for cryptography, steganography, and watermarking.
A Novel Framework for Short Tandem Repeats (STRs) Using Parallel String Matching
Short tandem repeats (STRs) have become important molecular markers for a broad range of
applications, such as genome mapping and characterization, phenotype mapping, marker-assisted
selection of crop plants, and a range of molecular ecology and diversity studies. These repeated
DNA sequences are found in both plants and bacteria. Most computer programs that find STRs fail
to report the number of occurrences of a repeated pattern or its exact position, and it is
difficult to obtain accurate results from larger datasets, so high-performance computing models
are needed to extract such repeats. One solution is STR detection using parallel string matching,
which gives the number of occurrences, with the corresponding line number and exact location of
each STR, in a genome of any length. We implemented parallel string matching using Java
multithreading with multi-core processing: we implemented a basic algorithm, compared it with
previous algorithms such as Knuth-Morris-Pratt, Boyer-Moore, and brute-force string matching,
and found that our new basic algorithm gives better results than the previous ones. We apply this
algorithm in parallel string matching using multithreading to reduce runtime on multicore
processors. The test results show that multicore processing is remarkably efficient and powerful
compared to lower versions, and that the proposed STR search using parallel string matching is
better than the sequential approaches.
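The core idea, splitting the genome into chunks and matching the STR motif in each chunk concurrently, can be sketched as follows. The chunking scheme and function names are our assumptions; the paper uses Java multithreading, shown here in Python for brevity.

```python
# Sketch of parallel STR search: each worker reports the start positions of
# the motif whose start index falls inside its chunk of the genome.
from concurrent.futures import ThreadPoolExecutor

def find_occurrences(genome: str, motif: str, start: int, end: int):
    """Return every index in [start, end) where `motif` begins."""
    hits, i = [], genome.find(motif, start)
    while 0 <= i < end:
        hits.append(i)
        i = genome.find(motif, i + 1)
    return hits

def parallel_str_search(genome: str, motif: str, workers: int = 4):
    step = max(1, -(-len(genome) // workers))     # ceil division
    # Chunks partition the start indices, so no occurrence is counted twice;
    # a match may extend past its chunk boundary since we search the full string.
    tasks = [(s, min(len(genome), s + step)) for s in range(0, len(genome), step)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(lambda t: find_occurrences(genome, motif, *t), tasks)
    return sorted(i for part in parts for i in part)
```

With the position list in hand, occurrence counts and line numbers follow directly; note that in CPython the GIL limits thread speedup for pure string matching, so a process pool (or the Java threads the paper uses) is what actually exploits multiple cores.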
ANALYZING ARCHITECTURES FOR NEURAL MACHINE TRANSLATION USING LOW COMPUTATIONA...
With the recent developments in the field of Natural Language Processing, there has been a rise
in the use of different architectures for Neural Machine Translation. Transformer architectures
achieve state-of-the-art accuracy, but they are very computationally expensive to train, and not
everyone has setups consisting of high-end GPUs and other resources. We train our models on low
computational resources and investigate the results. As expected, transformers outperformed the
other architectures, but there were some surprising results: transformers with more encoders and
decoders took more time to train yet had lower BLEU scores. LSTM performed well in the experiment
and took comparatively less time to train than the transformers, making it suitable in situations
with time constraints.
Suggestion Generation for Specific Erroneous Part in a Sentence using Deep Le...
Natural Language Generation (NLG) is one of the major fields of Natural Language Processing (NLP); NLG produces natural language from a machine representation. Generating suggestions for a sentence, especially for Indian languages, is difficult: one of the major reasons is that such languages are morphologically rich and their word order is roughly the reverse of English. Using a deep learning approach built on Long Short-Term Memory (LSTM) layers, we can generate a set of possible corrections for the erroneous part of a sentence. To effectively generate a set of sentences with meaning equivalent to the original using a Deep Learning (DL) approach, a model must be trained on this task, i.e. thousands of example inputs and outputs are needed. Veena S Nair | Amina Beevi A "Suggestion Generation for Specific Erroneous Part in a Sentence using Deep Learning" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3 | Issue-4, June 2019, URL: https://www.ijtsrd.com/papers/ijtsrd23842.pdf
Paper URL: https://www.ijtsrd.com/engineering/computer-engineering/23842/suggestion-generation-for-specific-erroneous-part-in-a-sentence-using-deep-learning/veena-s-nair
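The abstract above attributes the generation to LSTM layers. As a rough, purely illustrative sketch of the recurrent unit involved (a single-unit cell with hypothetical scalar weights, not the authors' actual model), one LSTM step can be written as:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, w):
    """One step of a single-unit LSTM cell (scalar states, illustrative only).

    w holds per-gate (input_weight, recurrent_weight, bias) triples for the
    input, forget, output, and candidate gates.
    """
    (wi, ui, bi), (wf, uf, bf), (wo, uo, bo), (wg, ug, bg) = w
    i = sigmoid(wi * x + ui * h + bi)    # input gate
    f = sigmoid(wf * x + uf * h + bf)    # forget gate
    o = sigmoid(wo * x + uo * h + bo)    # output gate
    g = math.tanh(wg * x + ug * h + bg)  # candidate cell value
    c_new = f * c + i * g                # updated cell state
    h_new = o * math.tanh(c_new)         # updated hidden state
    return h_new, c_new
```

Stacking such cells over embedded tokens, with a softmax output layer, yields the kind of sequence generator the abstract describes.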
Dna data compression algorithms based on redundancyijfcstjournal
Carl Jung spoke of the 'collective unconscious', i.e. we are all connected to one another in some way or other via our DNA. A DNA sequence is made up of four bases: a (Adenine), c (Cytosine), g (Guanine) and t (Thymine). Each base can be represented by two bits, since 2^2 = 4, e.g. a–00, c–01, g–11 and t–10, although this assignment is arbitrary. Redundancy within a sequence is therefore likely to exist, which is why this paper explores different types of repeats for compressing DNA: direct repeats; palindromes (reverse direct repeats); inverted exact repeats (complementary palindromes, i.e. exact reverse complements); inverted approximate repeats (approximate complementary palindromes or approximate reverse complements); interspersed (dispersed) repeats; and flanking (terminal) repeats. Better compression gives better network speed and saves storage space.
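As a minimal sketch of the two-bit base coding described above (the helper names are mine, not the paper's), together with the reverse-complement operation used when searching for complementary palindromes:

```python
# Two-bit codes from the abstract: a-00, c-01, g-11, t-10 (assignment is arbitrary).
CODE = {'a': 0b00, 'c': 0b01, 'g': 0b11, 't': 0b10}
DECODE = {v: k for k, v in CODE.items()}

def pack(seq):
    """Pack a lowercase DNA string into bytes, 2 bits per base."""
    bits = 0
    for base in seq:
        bits = (bits << 2) | CODE[base]
    nbytes = (2 * len(seq) + 7) // 8  # round up to whole bytes
    return len(seq), bits.to_bytes(nbytes, 'big')

def unpack(n, data):
    """Recover the DNA string from its packed form (n = number of bases)."""
    bits = int.from_bytes(data, 'big')
    return ''.join(DECODE[(bits >> (2 * (n - 1 - i))) & 0b11] for i in range(n))

def reverse_complement(seq):
    """Reverse complement, used to detect complementary palindromes."""
    comp = {'a': 't', 't': 'a', 'c': 'g', 'g': 'c'}
    return ''.join(comp[b] for b in reversed(seq))
```

An 8-base sequence such as 'acgttgca' packs into 2 bytes instead of 8, a 4:1 reduction before any repeat-based coding is applied; note that 'acgt' equals its own reverse complement, i.e. it is a complementary palindrome.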
De novo transcriptome assembly of solid sequencing data in cucumis melobioejjournal
As sequencing technologies progress, focus shifts towards solving bioinformatic challenges, of which sequence read assembly is the first task. In the present study, we have carried out a comparison of two assemblers (SeqMan and CLC) for transcriptome assembly, using a new dataset from Cucumis melo. Of the two assemblers, SeqMan generated an excess of small, redundant contigs, whereas CLC generated the least redundant assembly. Since different assemblers use different algorithms to build contigs, we followed the merging of assemblies by CAP3 and found that the merged assembly is better than the individual assemblies and more consistent in the number and size of contigs. Combining the assemblies from different programs gave a more credible final product, and this approach is therefore recommended for quantitative output.