This document discusses techniques for analyzing electrocardiogram (ECG) signals that are noisy and non-stationary. It compares Empirical Mode Decomposition (EMD), Ensemble Empirical Mode Decomposition (EEMD), and the Discrete Wavelet Transform (DWT) for denoising ECG signals, finding that EEMD performs best by preserving true waveform features while eliminating noise. It also analyzes normal and abnormal (atrial fibrillation) ECG signals using parametric (periodogram, Capon, time-varying autoregressive) and non-parametric (S-transform, smoothed pseudo affine Wigner distributions) time-frequency techniques, determining that the periodogram technique provides the best resolution.
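The DWT branch of such a comparison can be illustrated with a minimal single-level Haar decomposition plus soft thresholding of the detail coefficients. This is only a sketch of the general wavelet-denoising idea; the paper's actual wavelet choice and the EMD/EEMD sifting procedures are not reproduced here.

```python
import numpy as np

def haar_denoise(x, threshold):
    """Single-level Haar DWT denoising with soft thresholding (sketch).

    x must have even length; threshold is assumed to be chosen elsewhere,
    e.g. from a noise-level estimate of the detail coefficients.
    """
    # Forward Haar transform: approximation and detail coefficients
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    # Soft-threshold the detail coefficients, where broadband noise concentrates
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)
    # Inverse Haar transform
    y = np.empty_like(x, dtype=float)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y
```

With threshold 0 the transform round-trips exactly; with a large threshold the detail band is zeroed and only the pairwise-smoothed approximation survives.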
Accuracy Assessment for Multi-Channel ECG Waveforms Using Soft Computing Meth... (IJERA Editor)
Rhythmic analysis of ECG waveforms is very important, and ECG analysis applications are now available in smart devices. Still, existing methods cannot achieve a complete accuracy assessment when classifying multi-channel ECG waveforms. This paper proposes an accuracy assessment of the classification of multi-channel ECG waveforms using popular soft computing algorithms, with the main focus on better rule generation for analyzing the waveforms. The analysis covers soft computing methods such as decision trees with different pruning strategies, logistic model trees with different regression processes, and a support vector machine with particle swarm optimization (SVM-PSO). All of these methods are trained and tested on MIT-BIH 12-channel ECG waveforms. Before training, an MSO-FIR filter is used as a data preprocessing step to remove noise from the original multi-channel ECG waveforms; the MSO technique automatically finds the cutoff frequency of the multi-channel ECG waveforms used in the low-pass filtering process. Classification performance is discussed in terms of mean squared error, membership function, classification accuracy, design complexity, and area under the curve on the MIT-BIH data. Additionally, this research work is extended to samples of multi-channel ECG waveforms from the Scope diagnostic center, Hyderabad. Our study identifies the best soft computing process for the analysis of multi-channel ECG waveforms.
IRJET- A Survey on Classification and identification of Arrhythmia using Mach... (IRJET Journal)
This document summarizes a survey on classifying and identifying arrhythmias using machine learning techniques. The survey examines existing research that uses techniques like support vector machines, neural networks, and deep learning algorithms to classify arrhythmias based on features extracted from electrocardiogram (ECG) signals. The proposed system aims to classify four types of arrhythmias (normal rhythm, left/right bundle branch block, premature ventricular contraction) from the MIT-BIH arrhythmia database with high accuracy by optimizing the combination of preprocessing, feature extraction, and classification methods. Performance will be evaluated based on sensitivity, specificity, and accuracy metrics.
A comparative study of wavelet families for electromyography signal classific... (journalBEEI)
The document presents a study comparing different wavelet families for classifying electromyography (EMG) signals based on discrete wavelet transform (DWT). The proposed method involves decomposing EMG signals into sub-bands using DWT, extracting statistical features from each sub-band, and using support vector machines (SVM) for classification. Results showed that the sym14 wavelet at the 8th decomposition level achieved the best classification performance for detecting neuromuscular disorders. The study demonstrates that the proposed DWT-based approach can effectively classify EMG signals and help diagnose neuromuscular conditions.
IRJET- Arrhythmia Detection using One Dimensional Convolutional Neural Network (IRJET Journal)
1) The document discusses using a 1D convolutional neural network to detect different types of arrhythmias from electrocardiogram (ECG) signals.
2) It proposes a novel wavelet domain multiresolution convolutional neural network approach that avoids complicated heartbeat detection techniques and heavy manual feature engineering.
3) The approach segments ECG signals, applies a discrete cosine transform to select coefficients, and uses a CNN for classification and arrhythmia monitoring. It detects five types of arrhythmias from one-lead ECG signals.
Utilizing ECG Waveform Features as New Biometric Authentication Method (IJECEIAES)
In this study, we propose a practical human identification method based on a new biometric. The method is built on electrocardiogram (ECG) waveform features, produced by acquiring the electrical activity of the heart with electrodes placed on the body and recording the ECG signal over a period of time. Unlike other biometrics such as voice, fingerprint, and iris scan, the ECG signal cannot be copied or manipulated. The system first records a 30-second portion of a user's ECG signal and registers it as that user's template. The system then takes 7 to 9 seconds to authenticate the template using template matching techniques. Raw ECG data for 44 subjects were downloaded from the PhysioNet repository. We used a template matching technique for the authentication process and a linear SVM algorithm for the classification task. The accuracy rate was 97.2% for the authentication process and 98.6% for the classification task, with a false acceptance rate of 1.21%.
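One plausible form of the template-matching step is a normalized correlation between the enrolled template beat and a probe beat. The function names and the 0.9 acceptance threshold below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def match_score(template, probe):
    # Normalized correlation between the enrolled template beat
    # and a probe beat; 1.0 means a perfect match
    t = (template - template.mean()) / (template.std() + 1e-12)
    p = (probe - probe.mean()) / (probe.std() + 1e-12)
    return float(np.dot(t, p) / len(t))

def authenticate(template, probe, threshold=0.9):
    # Accept the probe only if it correlates strongly with the template
    return match_score(template, probe) >= threshold
```

In practice the score would be computed against an average of several enrolled beats rather than a single one.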
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching, and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars, and students of related fields of Engineering and Technology.
Novel method to find the parameter for noise removal from multi channel ecg w... (eSAT Journals)
This document presents a novel method for removing noise from multi-channel electrocardiogram (ECG) waveforms using a multi-swarm optimization (MSO) approach. The method involves extracting features from ECG data, using MSO to identify an optimal cutoff frequency parameter for a finite impulse response (FIR) filter, and applying the FIR filter with the identified parameter to remove noise from the ECG signals. The MSO approach divides particles into multiple swarms that each focus on a region of the search space, helping to overcome the sensitivity to initial positions found in traditional particle swarm optimization. The resulting filtered ECG signals are evaluated against the original clean signals to validate the noise-removal performance of the MSO-identified cutoff frequency parameter.
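Once the MSO search has produced a cutoff frequency, the filtering stage itself reduces to an ordinary FIR low-pass design. A minimal sketch follows; the tap count and the example cutoff are assumptions, and zero-phase filtering via `filtfilt` is used here to avoid phase distortion of the ECG morphology (the paper does not specify this detail).

```python
import numpy as np
from scipy.signal import firwin, filtfilt

def mso_lowpass(ecg, fs, cutoff_hz, numtaps=101):
    """Apply an FIR low-pass whose cutoff would come from the MSO search."""
    taps = firwin(numtaps, cutoff_hz, fs=fs)  # windowed-sinc FIR design
    return filtfilt(taps, [1.0], ecg)         # zero-phase filtering
```

For example, a 5 Hz component survives a 40 Hz cutoff while a 100 Hz interference component is strongly attenuated.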
Performance analysis of ecg qrs complex detection using morphological operators (IAEME Publication)
QRS complex detection is one of the most essential tasks in ECG analysis. This paper presents a QRS complex detection algorithm using morphological operators. The proposed algorithm uses dilation-erosion mathematical morphology filtering to suppress background noise and remove baseline drift. Modulus accumulation is then used to enhance the signal and improve the signal-to-noise ratio. The performance of the algorithm is evaluated on the MIT-BIH arrhythmia database and on wearable ECG data. The algorithm achieves a high detection rate at high speed.
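The dilation-erosion filtering stage can be sketched with grey-scale opening and closing, which together estimate the slowly varying baseline. The structuring-element length below is an illustrative assumption: it must be wider than a QRS complex but shorter than the baseline drift period.

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def suppress_baseline(ecg, struct_len=71):
    # Opening removes narrow peaks, closing removes narrow pits; applied in
    # sequence they track only the slow baseline, which is then subtracted.
    baseline = grey_closing(grey_opening(ecg, size=struct_len), size=struct_len)
    return ecg - baseline
```

A narrow spike riding on a constant offset survives the subtraction while the offset itself is removed.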
METHODS FOR IMPROVING THE CLASSIFICATION ACCURACY OF BIOMEDICAL SIGNALS BASED... (IAEME Publication)
Biomedical signals are long records of electrical activity within the human body, and they faithfully represent the state of health of a person. Of the many biomedical signals, the focus of this work is on the electroencephalogram (EEG), electrocardiogram (ECG), and electromyogram (EMG). It is tiresome for physicians to visually examine long records of biomedical signals to arrive at conclusions, and automated classification of these signals can largely assist physicians in their diagnostic process. Classifying a biomedical signal is the process of attaching the signal to a disease state or a healthy state. Classification accuracy (CA) depends on the features extracted from the signal and on the classification process involved. Certain critical information on the health of a person is usually hidden in the spectral content of the signal. In this paper, an effort is made to improve CA by including spectral features in the classification process.
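As a concrete example of such a spectral feature, the power in a frequency band can be estimated by integrating a Welch periodogram over that band. This is a generic sketch, not the specific feature set used in the paper.

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, f_lo, f_hi):
    # Welch power spectral density, summed over the band [f_lo, f_hi]
    f, psd = welch(x, fs=fs, nperseg=256)
    mask = (f >= f_lo) & (f <= f_hi)
    return psd[mask].sum() * (f[1] - f[0])
```

For a 10 Hz oscillation, nearly all the power falls in the 8-12 Hz band and almost none in 20-30 Hz, which is what makes such ratios discriminative features.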
IRJET-Electromyogram Signals for Multiuser Interface- A Review (IRJET Journal)
This document reviews various methods for feature extraction and classification of electromyogram (EMG) signals for multi-user myoelectric interfaces. It surveys previous work that used techniques like discrete wavelet transform (DWT) and support vector machines (SVM) for feature extraction and classification of EMG signals. The document concludes that DWT is well-suited for extracting both time and frequency domain features from non-stationary EMG signals. It also finds that SVM performed accurately for classification of features from multi-user EMG signals. The review aims to determine the best methods for a project using DWT for feature extraction and SVM for classification of EMG signals from multiple users.
This document discusses a system for classifying and compressing cardiac vascular diseases using soft computing techniques to enhance rural healthcare. ECG signals are collected from patients and features are extracted from the signals using discrete wavelet transform. The features are classified using an adaptive neuro-fuzzy inference system to identify normal signals or one of four abnormal conditions. Signals classified as critical are compressed using Huffman coding and sent to a hospital server for treatment, while mild cases are advised to see a cardiologist. The system aims to help identify heart conditions early to save lives in rural areas with limited access to healthcare.
Extraction of respiratory rate from ppg signals using pca and emd (eSAT Publishing House)
This document discusses extracting respiratory rate from photoplethysmography (PPG) signals using principal component analysis (PCA) and empirical mode decomposition (EMD). It begins with an introduction to PPG signals and how they contain respiratory information. It then discusses previous efforts to extract respiratory signals from PPG that used methods like filtering and wavelets. The document proposes using PCA and EMD to improve upon existing methods. It provides background on PCA, EMD, and reviews literature on extracting respiratory information from ECG and how respiration modulates PPG signals. The aim is to evaluate different signal processing techniques to extract respiratory information from commonly available biomedical signals like ECG and PPG to avoid using additional sensors.
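The PCA half of the proposal can be sketched as follows: stacking the PPG beats as rows of a matrix, the score of each beat on the leading principal component tracks the common beat-to-beat (respiratory) amplitude modulation. The beat segmentation and the EMD sifting are not shown, and the function name is an assumption.

```python
import numpy as np

def respiratory_scores(beats):
    """beats: array of shape (n_beats, n_samples), one PPG beat per row.

    Returns the first principal component score of each beat; under a
    respiratory amplitude modulation these scores vary at the breathing rate.
    """
    X = beats - beats.mean(axis=0)            # centre each sample across beats
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[0]                          # project onto the leading PC
```

Note the usual PCA sign ambiguity: the score series may come out inverted, so downstream rate estimation should not rely on its sign.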
Photoplethysmography (PPG) and phonocardiography (PCG) are two important non-invasive techniques for monitoring physiological parameters in cardiovascular diagnostics. The PCG signal discloses information about cardiac function through vibrations caused by the working heart, while PPG measures relative blood volume changes in the blood vessels close to the skin. This paper emphasizes the simultaneous acquisition of PCG and PPG signals from the same subject with the aid of an NI ELVIS II+ DAQ; the signals are imported into MATLAB for further processing. Heart rate is extracted from both signals and found to be distinctive. This analytical approach to processing these signals can aid the analysis of heart rate variability (HRV), which is widely used for quantifying neural cardiac control; low variability is particularly predictive of death in patients after myocardial infarction.
This document summarizes a research paper that proposes using EEG signals for person identification. It describes collecting EEG data from subjects using electrodes placed on the scalp. Wavelet packet decomposition is used to extract features from the EEG signals, focusing on the alpha frequency band between 8-12 Hz. Learning vector quantization is then used to classify the EEG patterns and identify individuals. The methodology involves preprocessing the EEG data, extracting features using wavelet packet decomposition, and classifying the features with LVQ to identify persons based on their unique EEG signatures.
The electrocardiogram (ECG) signal is the electrical activity of the human heart, and it contains important information about the overall performance of the cardiac system. Accurate analysis of the ECG signal is therefore a very important but difficult task. The ECG signal typically has low amplitude and is contaminated with various kinds of noise arising from the measurement procedure, e.g. power line interference, amplifier noise, and baseline wander. Baseline wander is a type of biological noise caused by random movement of the patient during ECG measurement, and it distorts the ST segment of the ECG waveform. In this paper, we present a comprehensive comparative study of five widely used adaptive filtering algorithms for the removal of low-frequency noise. We perform extensive experiments on the PhysioNet MIT-BIH ECG database and compare the signal-to-noise ratio (SNR), convergence rate, and time complexity of these algorithms. It is found that modified LMS performs better than the others in terms of SNR and convergence rate.
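A baseline LMS adaptive noise canceller of the kind being compared can be sketched as below. The filter order and step size are illustrative choices, and the paper's modified LMS variant is not reproduced.

```python
import numpy as np

def lms_cancel(primary, reference, order=8, mu=0.01):
    """LMS adaptive noise cancellation (sketch).

    primary:   ECG corrupted by additive noise
    reference: signal correlated with the noise but not with the ECG
    Returns the error signal, which converges to the cleaned ECG.
    """
    w = np.zeros(order)
    e = np.zeros(len(primary))
    for n in range(order, len(primary)):
        u = reference[n - order:n][::-1]   # most recent reference samples
        y = w @ u                          # current noise estimate
        e[n] = primary[n] - y              # error = cleaned-signal estimate
        w += 2 * mu * e[n] * u             # LMS weight update
    return e
```

With a power-line-like sinusoidal reference, the weights converge within a few dozen samples and the interference power in the error signal drops well below that of the corrupted input.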
IRJET-A Survey on Effect of Meditation on Attention Level Using EEG (IRJET Journal)
This document summarizes a proposed study that investigates the effect of meditation on attention level using EEG data analysis. It begins with an introduction on attention and meditation, then reviews previous related studies that analyzed EEG data to measure attention. The proposed work will record EEG data from subjects using the 10-20 electrode placement system before and after an 8-week meditation program. The EEG data will be preprocessed to remove noise, features will be extracted using wavelet transforms, and a random forest classifier will be used to classify attention levels and analyze the effect of meditation. The goal is to objectively measure how meditation impacts attention to help students improve concentration.
Technology Content Analysis with Technometric Theory Approach to Improve Perf... (Nooria Sukmaningtyas)
The radiology installation is equipped with medical equipment to support health services in the investigation of disease. The equipment falls into three technology criteria: investigation with sophisticated equipment (single-slice CT scanner), investigation with medium-sized equipment (general x-ray, 300 mA / 125 kV), and investigation with simple equipment (portable dental x-ray, 8 mA / 70 kV). In terms of contribution to the hospital, the radiology installation declined from 2008 to 2012: 6.8 percent in 2008, 4.3 percent in 2009, 2.5 percent in 2010, 2.3 percent in 2011, and 2.6 percent in 2012. Using the technometric theory approach, the study measures the contribution of each technology component (Technoware, Humanware, Inforware, and Orgaware) in the radiology installation, and assesses the sophistication of the technology in use through the technology contribution coefficient (TCC), so that the factors affecting the observed performance can be identified. The computed values are: high technology, TCC_tt = 0.490 with T_tt = 0.387, H = 0.519, I = 0.538, O = 0.534; middle technology, TCC_tm = 0.443 with T_tm = 0.258, H = 0.519, I = 0.538, O = 0.534; simple technology, TCC_ts = 0.398 with T_ts = 0.168, H = 0.519, I = 0.538, O = 0.534. If the value of a technology component (T, H, I, O) is less than the TCC, the radiology installation unit is in a declining phase, a condition that cannot be left unaddressed; the directors need to act immediately to formulate the right policies quickly to protect the unit from losses. The final result of the study is a gap almost everywhere across the three technology levels for the components Humanware = 0.519, Inforware = 0.538, and Orgaware = 0.534, but the largest gap is in the Technoware aspect (0.387, 0.258, 0.168), which means that the development strategy of the radiology installation unit should prioritize improving the Technoware aspect (rejuvenating the medical equipment).
Characterization of transients and fault diagnosis in transformer by discrete (IAEME Publication)
This document discusses using discrete wavelet transform (DWT) and artificial neural networks (ANN) to characterize transients and diagnose faults in transformers. It begins with an introduction to the problem and background on using the second harmonic component for discrimination. It then discusses why time-frequency information is needed and the advantages of wavelet transforms over Fourier transforms. The document describes collecting data from a test transformer under normal and faulted conditions. It explains using DWT for feature extraction and visualizing the wavelet decomposition levels to characterize magnetizing inrush versus inter-turn faults. Finally, it proposes using ANN trained on the wavelet spectral energies for automated discrimination between fault cases.
St variability assessment based on complexity factor using independent compon... (eSAT Journals)
Abstract
In recent times the computerized ECG has become the most effective and convenient diagnostic tool to identify cardiac diseases such as Myocardial Ischemia (MI). Among cardiovascular diseases (CVDs), Myocardial Ischemia is one of the leading causes of heart attacks. Myocardial Ischemia occurs due to difficulties in the flow of the electrical impulses from the SA node to the bundle branches because of abnormalities in the conduction system. The ECG is normally used as the main diagnostic tool to identify cardiac diseases. To obtain accurate information from the ECG it is necessary to remove all the artifacts and extract the pure ECG from the noise background. In this paper the removal of the artifacts is achieved with linear filtering, and the extraction of the clean ECG signal is performed using Independent Component Analysis (ICA). After preprocessing and ECG extraction, the QRS complex of each beat is detected using the Hilbert transform and a simple threshold detection algorithm. Next, the Instantaneous Heart Rate (IHR) from the RR interval and the Complexity Factor (CF) from the time-series ST segment are computed for each beat to form the desired feature sets. A linear regression model is then designed using the Instantaneous Heart Rate (IHR) and the ST-segment Complexity Factors (STCFs). The proposed ICA-STCFR model is used to identify ischemic beats from the test feature sets of the ECG signal to assess the ST-Segment Variability (STV). ECG data sets obtained from a local hospital were used to design and test the model. The evaluation parameters Ischemic Intensity Factor (IIF), Ischemic Activity Factor (IAF), and Peak-to-Average Value (PAV) were used to evaluate the proposed method, which was compared with a wavelet-transform-based method. The proposed ICA-STCFR was found to yield better results than the WT-ST method.
Key Words: Myocardial Ischemia, ICA, HT, QRS Complex, RR interval, ST segments, IHR, STCF, Scatter-plot
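The instantaneous heart rate step, for example, follows directly from the detected R-peak sample indices (a sketch; the Hilbert-transform peak detector itself is not shown, and the function name is an assumption):

```python
import numpy as np

def instantaneous_heart_rate(r_peaks, fs):
    # RR intervals in seconds, converted to beats per minute per interval
    rr = np.diff(r_peaks) / fs
    return 60.0 / rr
```

An R peak every 0.8 s corresponds to an IHR of 75 beats per minute for every interval.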
IRJET- Acute Ischemic Stroke Detection and Classification (IRJET Journal)
This document presents a method for detecting and classifying acute ischemic strokes in CT scan images. The method involves pre-processing images using median filtering and skull stripping. Features like mean, entropy, and gray-level co-occurrence matrix values are then extracted. Naive Bayes and k-nearest neighbor classifiers are used to classify images as normal or stroke with 92% accuracy. The k-NN classifier takes longer (8.80 seconds) to process images compared to the Naive Bayes classifier (5.85 seconds). The method accurately detects stroke regions in images and can help in early diagnosis and treatment of ischemic strokes.
IRJET - ECG based Cardiac Arrhythmia Detection using a Deep Neural Network (IRJET Journal)
This paper presents a system for detecting cardiac arrhythmias based on electrocardiogram (ECG) signals using a deep neural network. ECG signals are first transformed into time-frequency spectrograms using the short-time Fourier transform. These spectrograms are then used as input to a 2D convolutional neural network to classify five types of arrhythmias: normal beat, normal sinus rhythm, atrial fibrillation, supraventricular tachycardia, and atrial premature beat. The technique is evaluated on the MIT-BIH database and achieves 97% beat classification accuracy and perfect rhythm identification. Compared to other existing methods such as SVM, RNN, RF, and KNN, the deep learning approach provides better performance for ECG arrhythmia classification.
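The spectrogram stage can be sketched with SciPy's STFT; the window length and the log compression below are illustrative assumptions rather than the paper's exact settings.

```python
import numpy as np
from scipy.signal import stft

def ecg_spectrogram(segment, fs=360, nperseg=64):
    # Log-magnitude time-frequency image, suitable as 2D CNN input
    _, _, Z = stft(segment, fs=fs, nperseg=nperseg)
    return np.log1p(np.abs(Z))
```

The result is a non-negative 2D array with nperseg // 2 + 1 frequency rows, one column per analysis frame, which is exactly the kind of image a 2D convolutional network consumes.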
This document analyzes spectral features extracted from EEG signals to detect brain tumors. Sixteen candidate features were considered from 102 normal subjects and 100 brain tumor patients. Nine of the features showed a statistically significant difference between the two groups. Specifically, power ratio index, relative intensity ratio for different frequency bands, maximum-to-mean power ratio, peak bispectrum, peak bicoherence, and spectral entropy values were extracted from segmented EEG signals and compared between subjects. Statistical testing found that the mean values of nine features were significantly different between brain tumor patients and normal subjects, suggesting quantitative EEG analysis may help in diagnosis of brain tumors.
This document presents a novel algorithm for automated detection of heartbeats in an electrocardiogram (ECG) signal using morphological filtering and Daubechies wavelet transform. The algorithm consists of three stages: 1) preprocessing using mathematical morphology operations to remove noise and baseline wander, 2) Daubechies wavelet transform decomposition to facilitate heartbeat detection, and 3) feature extraction to identify the QRS complex and detect heartbeats by analyzing the wavelet coefficients. Morphological filtering preserves the original ECG signal shape while removing impulsive noise, and wavelet transform aids in analyzing the non-stationary ECG signal. The algorithm aims to provide accurate and reliable heartbeat detection for diagnosing cardiac conditions.
Analysis of Human Electrocardiogram for Biometric Recognition Using Analytic ... (CSCJournals)
The electrocardiogram (ECG) contains cardiac features unique to each individual. By analyzing the ECG, it should therefore be possible not only to detect the rate and consistency of heartbeats but also to extract other signal features in order to identify ECG records belonging to individual subjects. In this paper, a new approach for automatic analysis of single-lead ECG for human recognition is proposed and evaluated. Eighteen temporal, amplitude, width, and autoregressive (AR) model parameters are extracted from each ECG beat and classified in order to identify each individual. The proposed system uses a pre-processing stage to decrease the effects of noise and other unwanted artifacts usually present in raw ECG data. Following the pre-processing steps, the ECG stream is partitioned into separate windows, where each window includes a single beat of the ECG signal. Window estimation is based on the localization of the R peaks in the ECG stream, detected by the filter bank method for QRS complex detection. The ECG features (temporal, amplitude, and AR coefficients) are then extracted and used as input to K-NN and SVM classification algorithms in order to identify the individual subjects and beats. Signal pre-processing techniques, the applied feature extraction methods, and some intermediate and final classification results are presented in this paper.
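The AR model parameters among those eighteen features can be estimated per beat from the Yule-Walker equations; a numpy-only sketch follows. The paper does not specify its estimator or model order, so the default order here is an assumption.

```python
import numpy as np

def ar_coefficients(beat, order=4):
    """Yule-Walker AR parameter estimate for one ECG beat (sketch)."""
    x = beat - beat.mean()
    n = len(x)
    # Biased autocorrelation estimates at lags 0..order
    r = np.correlate(x, x, mode='full')[n - 1:n + order] / n
    # Solve the Toeplitz system R a = r[1:] for the AR coefficients
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])
```

On a long synthetic AR(1) process with coefficient 0.8, the order-1 estimate recovers a value close to 0.8, which is the sanity check one would run before using the coefficients as classifier features.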
Fault Diagnostics of Rolling Bearing based on Improve Time and Frequency Doma... (ijsrd.com)
In this neural-network-based approach, a feed-forward neural network trained with the back-propagation technique was used for automatic diagnosis of defects in bearings. Vibration time-domain signals were collected from a normal bearing and from defective bearings under various speed conditions. The signals were processed to obtain various statistical parameters, which are good indicators of bearing condition; the best features were then selected by a graphical method, and these inputs were used to train the neural network, whose output represented the bearing states. The trained neural networks were used for the recognition of bearing states. The results showed that the trained neural networks were able to distinguish a normal bearing from defective bearings with 83.33% reliability. Moreover, the network was able to classify the bearings into different states with success rates better than those achieved with the best among the state-of-the-art techniques.
Mri brain tumour detection by histogram and segmentationiaemedu
This document summarizes a research paper on detecting brain tumors in MRI images using a combination of histogram thresholding, modified gradient vector field (GVF), and morphological operators. The non-brain regions are removed using morphological operators. Histogram thresholding is then used to detect if the brain is normal or abnormal/contains a tumor. If abnormal, the modified GVF is used to detect the tumor contour. The proposed method aims to be computationally efficient by only performing segmentation if a tumor is detected. It was tested on many MRI brain images and performance was validated against human expert segmentation.
Development of a web based decision support system for materials selection in...IAEME Publication
This document describes the development of a web-based decision support system for materials selection in construction projects. It proposes a framework that includes a database to store materials information and a decision support component to evaluate materials based on selection criteria. A web interface is developed to allow project participants in different locations to access and manage materials data. The system aims to standardize and automate the materials selection process for construction engineering projects. It discusses developing a conceptual model and database, as well as interfaces like a materials controller to input, view and evaluate options based on criteria like cost, durability and environmental impacts.
Investigation of mrr in wedm for wc co sintered compositeIAEME Publication
The document investigates the influence of wire electrical discharge machining (WEDM) parameters on material removal rate when machining tungsten carbide-cobalt (WC-Co) sintered composite. Experiments were conducted using a response surface methodology with five control factors (pulse on-time, pulse off-time, peak current, servo voltage, and wire tension). The results were used to derive a mathematical model to predict material removal rate and optimize the WEDM process for machining WC-Co composite.
Comparative analysis for higher channel isolation using singleIAEME Publication
This document summarizes the results of two simulations comparing different fiber Bragg grating (FBG) filter configurations for achieving higher channel isolation in dense wavelength-division multiplexing (WDM) systems. The first simulation used a single FBG filter with uniform, Gaussian, or hyperbolic tangent apodization profiles, while the second used two FBG filters connected back-to-back, with the first using uniform apodization and the second using different profiles. For a data rate of 10.52 Gbps and 100 GHz channel spacing, the second simulation achieved lower bit error rates using Gaussian apodization, indicating it provided higher channel isolation than the first simulation or uniform apodization profiles.
This document discusses a proposed software defined network (SDN) based firewall technique. The key aspects of the proposed system are:
1. It separates the control plane functionality (e.g. routing decisions) from the data plane/forwarding plane functionality (e.g. packet forwarding). This is done using an OpenFlow switch as the forwarding element and a central controller to handle control plane functions.
2. The OpenFlow switch uses flow tables populated by the central controller to forward packets. The central controller enforces network policies by programming the flow tables of OpenFlow switches.
3. This architecture allows the OpenFlow switch to be used as a software-defined firewall, with the central controller enforcing firewall rules
METHODS FOR IMPROVING THE CLASSIFICATION ACCURACY OF BIOMEDICAL SIGNALS BASED...IAEME Publication
Biomedical signals are long records of electrical activity within the human body, and they faithfully represent the state of health of a person. Of the many biomedical signals, the focus of this work is on the electroencephalogram (EEG), electrocardiogram (ECG) and electromyogram (EMG). It is tiresome for physicians to visually examine long records of biomedical signals to arrive at conclusions, so automated classification of these signals can greatly assist physicians in their diagnostic process. Classifying a biomedical signal is the process of attaching the signal to a disease state or a healthy state. Classification Accuracy (CA) depends on the features extracted from the signal and on the classification process involved. Certain critical information on the health of a person is usually hidden in the spectral content of the signal. In this paper, an effort is made to improve CA by including spectral features in the classification process.
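Spectral features of the kind discussed here are typically band powers computed from a periodogram. A minimal sketch on a synthetic signal follows; the band names are the conventional EEG bands, and the sampling rate, window, and test tone are illustrative assumptions, not choices made by the paper.

```python
import numpy as np

def band_powers(x, fs, bands):
    """Spectral features: power in each frequency band from the periodogram."""
    X = np.fft.rfft(x * np.hanning(len(x)))     # windowed FFT
    psd = (np.abs(X) ** 2) / (fs * len(x))      # periodogram estimate
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    df = freqs[1] - freqs[0]
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in bands.items()}

fs = 256
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t)                # dominant 10 Hz (alpha-band) tone
powers = band_powers(eeg, fs, {"delta": (1, 4), "theta": (4, 8),
                               "alpha": (8, 13), "beta": (13, 30)})
```

The resulting dictionary of band powers is the kind of spectral feature vector that would be appended to the time-domain features before classification.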
IRJET-Electromyogram Signals for Multiuser Interface- A ReviewIRJET Journal
This document reviews various methods for feature extraction and classification of electromyogram (EMG) signals for multi-user myoelectric interfaces. It surveys previous work that used techniques like discrete wavelet transform (DWT) and support vector machines (SVM) for feature extraction and classification of EMG signals. The document concludes that DWT is well-suited for extracting both time and frequency domain features from non-stationary EMG signals. It also finds that SVM performed accurately for classification of features from multi-user EMG signals. The review aims to determine the best methods for a project using DWT for feature extraction and SVM for classification of EMG signals from multiple users.
This document discusses a system for classifying and compressing cardiac vascular diseases using soft computing techniques to enhance rural healthcare. ECG signals are collected from patients and features are extracted from the signals using discrete wavelet transform. The features are classified using an adaptive neuro-fuzzy inference system to identify normal signals or one of four abnormal conditions. Signals classified as critical are compressed using Huffman coding and sent to a hospital server for treatment, while mild cases are advised to see a cardiologist. The system aims to help identify heart conditions early to save lives in rural areas with limited access to healthcare.
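The Huffman coding step used above to compress critical signals assigns shorter bit codes to more frequent symbols. A minimal sketch over a toy symbol stream follows; it assumes the ECG has already been quantized into discrete symbols, which is not detailed in this summary.

```python
import heapq
from collections import Counter

def huffman_codes(symbols):
    """Build a Huffman code table: frequent symbols get shorter codes."""
    freq = Counter(symbols)
    if len(freq) == 1:                          # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    heap = [[w, [sym, ""]] for sym, w in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)                # two least frequent subtrees
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]             # left branch
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]             # right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict((sym, code) for sym, code in heap[0][1:])

data = [0, 0, 0, 0, 0, 1, 1, 2]                # toy quantized samples
codes = huffman_codes(data)
encoded = "".join(codes[s] for s in data)      # compressed bitstring
```

Here the 8-symbol stream compresses to 11 bits instead of the 16 a fixed 2-bit code would need, which is the effect exploited before transmission to the hospital server.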
Extraction of respiratory rate from ppg signals using pca and emdeSAT Publishing House
This document discusses extracting respiratory rate from photoplethysmography (PPG) signals using principal component analysis (PCA) and empirical mode decomposition (EMD). It begins with an introduction to PPG signals and how they contain respiratory information. It then discusses previous efforts to extract respiratory signals from PPG that used methods like filtering and wavelets. The document proposes using PCA and EMD to improve upon existing methods. It provides background on PCA, EMD, and reviews literature on extracting respiratory information from ECG and how respiration modulates PPG signals. The aim is to evaluate different signal processing techniques to extract respiratory information from commonly available biomedical signals like ECG and PPG to avoid using additional sensors.
Photoplethysmography (PPG) and Phonocardiography (PCG) are two important non-invasive techniques for monitoring physiological parameters of cardiovascular diagnostics. The PCG signal discloses information about cardiac function through vibrations caused by the working heart. PPG measures relative blood volume changes in the blood vessels close to the skin. This paper emphasizes on simultaneous acquisition of PCG and PPG signals from the same subject with the aid of NIELVIS II+ DAQ and the signals are imported to MATLAB for further processing. Heart rate is extracted from both the signals which are found to be distinctive. This analytical approach of processing these signals can abet for analysis of Heart rate variability (HRV) which is widely used for quantifying neural cardiac control and low variability is particularly predictive of death in patients after myocardial infarction.
This document summarizes a research paper that proposes using EEG signals for person identification. It describes collecting EEG data from subjects using electrodes placed on the scalp. Wavelet packet decomposition is used to extract features from the EEG signals, focusing on the alpha frequency band between 8-12 Hz. Learning vector quantization is then used to classify the EEG patterns and identify individuals. The methodology involves preprocessing the EEG data, extracting features using wavelet packet decomposition, and classifying the features with LVQ to identify persons based on their unique EEG signatures.
The electrocardiogram (ECG) signal records the electrical activity of the human heart. The ECG contains important information about the overall performance of the cardiac system, so accurate analysis of the ECG signal is very important but is a difficult task. The ECG signal is typically of low amplitude and contaminated with various kinds of noise due to its measurement process, e.g. power line interference, amplifier noise, and baseline wander. Baseline wander is a type of physiological noise caused by the random movement of the patient during ECG measurement, and it distorts the ST segment of the ECG waveform. In this paper, we present a comprehensive comparative study of five widely used adaptive filtering algorithms for the removal of low-frequency noise. We perform extensive experiments on the PhysioNet MIT-BIH ECG database and compare the signal-to-noise ratio (SNR), convergence rate, and time complexity of these algorithms. It is found that the modified LMS has better performance than the others in terms of SNR and convergence rate.
IRJET-A Survey on Effect of Meditation on Attention Level Using EEGIRJET Journal
This document summarizes a proposed study that investigates the effect of meditation on attention level using EEG data analysis. It begins with an introduction on attention and meditation, then reviews previous related studies that analyzed EEG data to measure attention. The proposed work will record EEG data from subjects using the 10-20 electrode placement system before and after an 8-week meditation program. The EEG data will be preprocessed to remove noise, features will be extracted using wavelet transforms, and a random forest classifier will be used to classify attention levels and analyze the effect of meditation. The goal is to objectively measure how meditation impacts attention to help students improve concentration.
Technology Content Analysis with Technometric Theory Approach to Improve Perf...Nooria Sukmaningtyas
The radiology installation is equipped with medical equipment to support health services in the investigation of disease. Three technology levels of equipment are considered: investigation with sophisticated equipment (single-slice CT scanner), with medium-level equipment (general X-ray, 300 mA / 125 kV), and with simple equipment (portable dental X-ray, 8 mA / 70 kV). The contribution of the radiology installation to the hospital declined from 2008 to 2012: 6.8 percent in 2008, 4.3 percent in 2009, 2.5 percent in 2010, 2.3 percent in 2011, and 2.6 percent in 2012. Using a technometric approach, the study measures the contribution of each component of technology (Technoware, Humanware, Inforware, Orgaware) in the radiology installation, and expresses the sophistication of the technology in use through the Technology Contribution Coefficient (TCC), so that the factors affecting the observed performance can be identified. The TCC values are: high technology TCC_tt = 0.490 with T_tt = 0.387; middle technology TCC_tm = 0.443 with T_tm = 0.258; simple technology TCC_ts = 0.398 with T_ts = 0.168; in each case H = 0.519, I = 0.538 and O = 0.534. If the value of a technology component (T, H, I, O) is less than the TCC, the radiology installation unit is in a declining phase, a condition that cannot be left as it is: the directors need to act immediately to formulate the right policies quickly to protect it from loss. The final result of the study is that a gap exists almost everywhere among the three technology components Humanware = 0.519, Inforware = 0.538 and Orgaware = 0.534, but the largest gap is in the Technoware aspect (0.387, 0.258, 0.168), which means the development strategy of the radiology installation unit should prioritize improving the Technoware aspect (rejuvenation of the medical equipment).
Characterization of transients and fault diagnosis in transformer by discreteIAEME Publication
This document discusses using discrete wavelet transform (DWT) and artificial neural networks (ANN) to characterize transients and diagnose faults in transformers. It begins with an introduction to the problem and background on using the second harmonic component for discrimination. It then discusses why time-frequency information is needed and the advantages of wavelet transforms over Fourier transforms. The document describes collecting data from a test transformer under normal and faulted conditions. It explains using DWT for feature extraction and visualizing the wavelet decomposition levels to characterize magnetizing inrush versus inter-turn faults. Finally, it proposes using ANN trained on the wavelet spectral energies for automated discrimination between fault cases.
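The wavelet spectral energies used above as ANN inputs are the per-level energies of the DWT detail coefficients. A minimal illustration with the Haar wavelet follows; the actual wavelet family and level count used by the paper are not stated in this summary, so Haar and four levels are assumptions.

```python
import numpy as np

def haar_dwt_energies(x, levels=4):
    """One-level Haar analysis repeated on the approximation: returns the
    energy of the detail coefficients at each decomposition level."""
    energies = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        if len(a) % 2:                          # pad odd lengths
            a = np.append(a, a[-1])
        d = (a[0::2] - a[1::2]) / np.sqrt(2)    # detail (high-pass) band
        a = (a[0::2] + a[1::2]) / np.sqrt(2)    # approximation (low-pass) band
        energies.append(float(np.sum(d ** 2)))
    return energies

fs = 1000
t = np.arange(0, 1, 1 / fs)
slow = np.sin(2 * np.pi * 5 * t)     # slow component, e.g. magnetizing inrush
fast = np.sin(2 * np.pi * 200 * t)   # fast component, e.g. fault transient
e_slow = haar_dwt_energies(slow)
e_fast = haar_dwt_energies(fast)
```

The two energy profiles differ sharply (the fast transient concentrates its energy in the shallow detail levels, the slow signal leaves almost nothing there), which is exactly the separability the ANN is trained to exploit.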
St variability assessment based on complexity factor using independent compon...eSAT Journals
Abstract
In recent years the computerized ECG has become the most effective and convenient diagnostic tool for identifying cardiac diseases such as Myocardial Ischemia (MI). Among cardiovascular diseases (CVDs), MI is one of the leading causes of heart attacks. MI occurs due to difficulties in the flow of electrical impulses from the SA node to the bundle branches because of abnormalities in the conduction system. Normally the ECG is used as the main diagnostic tool to identify cardiac diseases. In order to obtain accurate information from the ECG it is necessary to remove all artifacts and extract the pure ECG from the noise background. In this paper the removal of artifacts is achieved with linear filtering, and the extraction of the clean ECG signal is performed using Independent Component Analysis (ICA). After preprocessing and ECG extraction, the QRS complex of each beat is detected using the Hilbert Transform and a simple threshold-detection algorithm. Next, the Instantaneous Heart Rate (IHR) from the RR interval and the Complexity Factor (CF) from the time-series ST segment are computed for each beat to form the desired feature sets. A linear regression model is then designed using the IHR and the ST-segment Complexity Factors (STCFs). The proposed ICA-STCFR model is used to identify ischemic beats from the test feature sets of the ECG signal in order to assess ST-Segment Variability (STV). ECG data sets obtained from a local hospital were used to design and test the model. The evaluation parameters Ischemic Intensity Factor (IIF), Ischemic Activity Factor (IAF) and Peak to Average Value (PAV) were used to evaluate the proposed method, which was compared with a Wavelet Transform based method. The proposed ICA-STCFR was found to yield better results than the WT-ST method.
Key Words: Myocardial Ischemia, ICA, HT, QRS Complex, RR interval, ST segments, IHR, STCF, Scatter-plot
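The IHR-from-RR-interval computation mentioned in the abstract is straightforward: each RR interval (in seconds) maps to a beats-per-minute value. A minimal sketch with made-up R-peak positions:

```python
import numpy as np

def instantaneous_heart_rate(r_peak_samples, fs):
    """Instantaneous heart rate in beats per minute from consecutive
    R-peak sample indices: one value per RR interval."""
    r = np.asarray(r_peak_samples, dtype=float)
    rr_sec = np.diff(r) / fs        # RR intervals in seconds
    return 60.0 / rr_sec

# R peaks 0.8 s apart at fs = 360 Hz correspond to 75 bpm throughout
peaks = np.array([0, 288, 576, 864])
ihr = instantaneous_heart_rate(peaks, fs=360)
```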
IRJET- Acute Ischemic Stroke Detection and ClassificationIRJET Journal
This document presents a method for detecting and classifying acute ischemic strokes in CT scan images. The method involves pre-processing images using median filtering and skull stripping. Features like mean, entropy, and gray-level co-occurrence matrix values are then extracted. Naive Bayes and k-nearest neighbor classifiers are used to classify images as normal or stroke with 92% accuracy. The k-NN classifier takes longer (8.80 seconds) to process images compared to the Naive Bayes classifier (5.85 seconds). The method accurately detects stroke regions in images and can help in early diagnosis and treatment of ischemic strokes.
IRJET - ECG based Cardiac Arrhythmia Detection using a Deep Neural NetworkIRJET Journal
This paper presents a system for detecting cardiac arrhythmias based on electrocardiogram (ECG) signals using a deep neural network. ECG signals are first transformed into time-frequency spectrograms using the short-time Fourier transform. These spectrograms are then used as input to a 2D convolutional neural network to classify five types of arrhythmias: normal beat, normal sinus rhythm, atrial fibrillation, supraventricular tachycardia, and atrial premature beat. The technique is evaluated on the MIT-BIH database and achieves 97% beat-classification accuracy and perfect rhythm identification. Compared to other existing methods such as SVM, RNN, RF and KNN, the deep learning approach provides better performance for ECG arrhythmia classification.
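The short-time Fourier transform stage that turns an ECG trace into a spectrogram image can be sketched directly; the window length, hop size, and test tone below are illustrative choices, not the paper's settings.

```python
import numpy as np

def stft_spectrogram(x, win_len=128, hop=64):
    """Magnitude spectrogram via the short-time Fourier transform:
    windowed frames, one FFT per frame, frequency x time output."""
    window = np.hanning(win_len)
    frames = []
    for start in range(0, len(x) - win_len + 1, hop):
        frame = x[start:start + win_len] * window
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames).T          # shape: (freq_bins, time_frames)

fs = 360
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 30 * t)       # stand-in for an ECG segment
spec = stft_spectrogram(sig)           # 2D array suitable as CNN input
```

Each spectrogram is then treated as a single-channel image by the 2D convolutional network.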
This document analyzes spectral features extracted from EEG signals to detect brain tumors. Sixteen candidate features were considered from 102 normal subjects and 100 brain tumor patients. Nine of the features showed a statistically significant difference between the two groups. Specifically, power ratio index, relative intensity ratio for different frequency bands, maximum-to-mean power ratio, peak bispectrum, peak bicoherence, and spectral entropy values were extracted from segmented EEG signals and compared between subjects. Statistical testing found that the mean values of nine features were significantly different between brain tumor patients and normal subjects, suggesting quantitative EEG analysis may help in diagnosis of brain tumors.
This document presents a novel algorithm for automated detection of heartbeats in an electrocardiogram (ECG) signal using morphological filtering and Daubechies wavelet transform. The algorithm consists of three stages: 1) preprocessing using mathematical morphology operations to remove noise and baseline wander, 2) Daubechies wavelet transform decomposition to facilitate heartbeat detection, and 3) feature extraction to identify the QRS complex and detect heartbeats by analyzing the wavelet coefficients. Morphological filtering preserves the original ECG signal shape while removing impulsive noise, and wavelet transform aids in analyzing the non-stationary ECG signal. The algorithm aims to provide accurate and reliable heartbeat detection for diagnosing cardiac conditions.
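The morphological filtering stage can be illustrated with 1-D greyscale erosion and dilation: an opening-then-closing with a flat structuring element suppresses structures narrower than the element while wide, slow structures survive, giving a baseline estimate. The structuring-element size and the synthetic signal below are assumptions for demonstration, not the paper's parameters.

```python
import numpy as np

def erode(x, size):
    """Greyscale erosion with a flat structuring element (sliding minimum)."""
    pad = size // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + size].min() for i in range(len(x))])

def dilate(x, size):
    """Greyscale dilation (sliding maximum)."""
    pad = size // 2
    xp = np.pad(x, pad, mode="edge")
    return np.array([xp[i:i + size].max() for i in range(len(x))])

def baseline_by_morphology(x, size):
    """Opening (erode then dilate) followed by closing (dilate then erode):
    narrow peaks and pits are removed, leaving the slow baseline."""
    opened = dilate(erode(x, size), size)
    return erode(dilate(opened, size), size)

fs = 360
t = np.arange(0, 2, 1 / fs)
drift = 0.5 * t                        # slow baseline wander
spikes = np.zeros_like(t)
spikes[::180] = 1.0                    # narrow QRS-like impulses
sig = drift + spikes
baseline = baseline_by_morphology(sig, size=41)
corrected = sig - baseline             # impulses preserved, drift removed
```

Subtracting the estimated baseline removes the wander while leaving the narrow QRS-like impulses intact, which is the shape-preserving property the abstract highlights.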
Performance and emission characteristics of di ci diesel engine with preIAEME Publication
This document summarizes an experimental study that investigated the performance and emission characteristics of a direct injection compression ignition diesel engine fueled with preheated chicken fat biodiesel. The engine performance, combustion characteristics, and exhaust emissions were tested and compared for petroleum diesel, unheated chicken fat biodiesel, and chicken fat biodiesel preheated to 50°C. Results showed that preheating the chicken fat biodiesel improved engine performance by reducing fuel consumption and brake specific fuel consumption. Preheating also improved combustion and lowered exhaust emissions of carbon monoxide and unburned hydrocarbons compared to unheated chicken fat biodiesel. The study concluded that preheated chicken fat biodiesel can be used as an alternative to petroleum diesel.
The document summarizes the main concepts of cell division, including the zygote, somatic and germ cells, the phases of the cell cycle (interphase and cell division), the stages of interphase (G1, S and G2), and meiosis.
The document describes the specifications of the Onfan APP2 application, including the main pages such as Explore, Create, Profile and Search, as well as the interactions and functionality of each page. The tutorial, registration, login and content-creation screens are detailed, along with the transitions between them. The application aims to be a platform for sharing experiences and specialties among users.
The document explains the Spanish present perfect (pretérito perfecto), a verb tense used to describe past actions that are related to the present. It explains that it is formed with the verb haber plus the past participle of the main verb, and shows examples of how different verbs are conjugated in this form, including regular verbs ending in -ar, -er and -ir as well as irregular verbs. It also provides examples of common phrases that use this tense with time expressions.
Relieves del paisaje 4º (conocimiento del medio tema 2)Eva Maria Garcia
The document discusses different types of landforms that provide relief on the earth's surface and subsurface water relief, including ravines, terraces, waterways, and underground aquifers.
To make international calls, the destination country code must be dialed before the telephone number. To call Colombia from abroad, dial 0011 + 57 + the Colombian telephone number.
The document discusses the impact of Articles 2 and 5 of the European Convention on Human Rights (ECHR) on police powers in the United Kingdom following the enactment of the ECHR in 1950 and the Human Rights Act 1998. It outlines that the collective impact of these laws and the influx of rights they introduced must be analyzed to determine if they have been upheld as originally intended by the Convention and whether they have affected human rights, police powers, and public awareness. Relevant case law will be examined and the views of the judiciary considered to illustrate the most pertinent and impactful articles.
The document presents the upcoming services of the Bibliored, including a suggestion box, a literary blog, a new-arrivals blog, a library forum, virtual reference, tutorials and a virtual tour, and also answers frequently asked questions about opening hours, book loans and renewals.
This document presents the Estatuto da Cidade (City Statute), which regulates articles 182 and 183 of the Brazilian Federal Constitution. It establishes general guidelines for Brazilian urban policy, such as guaranteeing the right to sustainable cities and the democratic management of cities. It also defines urban-policy instruments such as compulsory subdivision, progressive property tax (IPTU), expropriation with public debt bonds, and special adverse possession of urban property.
This document presents the vision, mission, principles, values and promotion of Yajhaira Manzano Campo. Her vision is to become a great marketing professional and create her own company. Her mission is to use her reasoning ability and self-confidence to achieve great goals with the help of her family and God. Her principles include faith, respect and constant improvement.
The document lists the names of 17 male and 8 female students in two separate sections. It does not provide any additional context or information about the students.
This curriculum plan for Spanish Language (Lengua Castellana) describes the communicative skills of listening, speaking, writing and reading. It details five key competencies and several basic competency standards. It explains factors such as competencies, classroom processes and assessment evidence. It also lists computing and telematic resources to support teaching, including applications, web pages, databases, and synchronous and asynchronous communication tools.
Suppression of power line interference correction of baselinewanders andIAEME Publication
This document summarizes a research paper that proposes a new method for enhancing electrocardiogram (ECG) signals based on the Constrained Stability Least Mean Square (CSLMS) algorithm. The CSLMS algorithm is applied to an adaptive noise cancellation filter to remove two dominant artifacts from ECG signals: high-frequency noise and baseline wander. Simulation results on ECG data from the MIT-BIH database show that the CSLMS method provides better denoising and artifact removal compared to the conventional LMS algorithm, improving signal-to-noise ratio by 3-6 decibels. The CSLMS algorithm exhibits smaller excess mean squared error and faster convergence than LMS, resulting in less signal distortion in the enhanced ECG signal.
This document summarizes a research paper that uses a backward propagation neural network with the Levenberg Marquardt algorithm to classify electrocardiogram (ECG) signals as normal or abnormal. The network was trained on the MIT-BIH arrhythmia database and achieved 99.9% accuracy at classifying heartbeats, outperforming other methods. Features extracted from the ECG signals like standard deviation and wavelet coefficients were used as input to the neural network. The results demonstrate that neural networks can accurately analyze ECG signals and detect cardiac abnormalities.
Classification of Cardiac Arrhythmia using WT, HRV, and Fuzzy C-Means ClusteringCSCJournals
The classification of electrocardiogram recordings into different pathological classes is a complex pattern recognition task. In this paper, we propose a generic feature-extraction scheme for the classification of ECG arrhythmias using fuzzy c-means (FCM) clustering and Heart Rate Variability (HRV). Traditional methods of diagnosis and classification present some inconveniences, since the precision of a given diagnosis depends on the cardiologist's experience and concentration. Due to the high mortality rate of heart diseases, early detection and precise discrimination of ECG arrhythmias is essential for the treatment of patients. During the recording of the ECG signal, different forms of noise can be superimposed on the useful signal, so preprocessing of the ECG requires the suppression of these perturbations. The raw data is preprocessed and normalized, and the data points are then clustered using the FCM technique. In this work, four different structures, FCM-HRV, PCM-HRV, FCMC-HRV and FPCM-HRV, are formed by combining heart rate variability features with fuzzy c-means clustering; FCM-HRV is the new method proposed for the classification of ECG. This paper presents a comparative study of the classification accuracy of ECG signals using these four structures for computationally efficient diagnosis. ECG signals taken from the MIT-BIH ECG database are used in training to classify four different arrhythmias (atrial fibrillation termination). All the structures are tested using the same ECG records. The test results suggest that the FCMC-HRV structure generalizes better and is faster than the other structures.
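A plain fuzzy c-means iteration, of the kind underlying the FCM-HRV structures above, alternates between a membership update and a centroid update. The sketch below uses toy 2-D data rather than HRV features, and the fuzzifier m = 2 and fixed iteration budget are conventional defaults, not the paper's settings.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means: soft memberships U (rows sum to 1) and
    membership-weighted centroids, alternated for a fixed budget."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distances of every point to every centre
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))          # standard FCM membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# two well-separated blobs standing in for two arrhythmia classes
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)          # hard labels from soft memberships
```

Unlike hard k-means, the membership matrix U keeps a degree of belonging to every cluster, which is what the possibilistic (PCM) and hybrid (FPCM) variants then modify.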
ELM and K-nn machine learning in classification of Breath sounds signals IJECEIAES
The acquisition of breath sound (BS) signals from the human respiratory system with an electronic stethoscope provides prominent information that helps doctors diagnose and classify pulmonary diseases. Unfortunately, BS signals, like other biological signals, are non-stationary owing to the variation of lung volume, which makes it difficult to analyze them and discriminate between several diseases. In this study, we focused on comparing the ability of the extreme learning machine (ELM) and k-nearest neighbour (k-NN) machine learning algorithms to classify adventitious and normal breath sounds. To do so, empirical mode decomposition (EMD), a method rarely used in breath sound analysis, was applied to the BS signals. After the EMD decomposition of the signals into intrinsic mode functions (IMFs), the Hjorth descriptor (Activity) and permutation entropy (PE) features were extracted from each IMF and combined for the classification stage. The study found that the combination of features (Activity and PE) yielded accuracies of 90.71% and 95% using ELM and k-NN respectively in binary classification (normal vs. abnormal breath sounds), and 83.57% and 86.42% in multiclass classification (five classes).
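The two features named in this abstract are both simple to compute: Hjorth Activity is just the signal variance, and permutation entropy is the normalised Shannon entropy of ordinal patterns (Bandt-Pompe). A minimal sketch, with the embedding order and test signals chosen here as assumptions:

```python
import numpy as np
from math import factorial

def hjorth_activity(x):
    """Hjorth Activity: the variance of the signal."""
    return float(np.var(x))

def permutation_entropy(x, order=3, delay=1):
    """Normalised Shannon entropy of ordinal patterns (Bandt-Pompe)."""
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = x[i:i + (order - 1) * delay + 1:delay]
        pat = tuple(np.argsort(window))            # ordinal pattern of the window
        counts[pat] = counts.get(pat, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    return float(-(p * np.log(p)).sum() / np.log(factorial(order)))

ramp = np.arange(100, dtype=float)                 # perfectly ordered: PE = 0
rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)                  # disordered: PE near 1
```

In the paper's pipeline these two functions would be applied to each IMF produced by EMD, and the resulting values concatenated into the feature vector for ELM or k-NN.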
The Effect of Kronecker Tensor Product Values on ECG Rates: A Study on Savitz...IIJSRJournal
This article presents a study on ECG signal filtering algorithms to denoise signals corrupted by various types of noise sources. The study also examines the effect of Kronecker tensor product values on ECG rates. The study is conducted in a Matlab environment, and the results demonstrate that a suitably chosen constant in the respective codes can effectively denoise ECG signals. These findings have significant implications for diagnosing abnormal heart rhythms and investigating chest pains. The present study is novel in that it explores the relationship between ECG rate and Kronecker delta values across different age groups, which has not been extensively studied in previous literature. The study's unique contribution is the determination of age-specific values of the constant K required to represent this relationship accurately in different populations, which could inform the development of more effective algorithms for denoising ECG signals in clinical settings. Additionally, this study's finding of an inverse relationship between ECG rate and Kronecker delta values could have broader implications for understanding the physiological factors that contribute to variability in ECG measurements. The study provides valuable insights into ECG signal processing and suggests that the implemented techniques can improve the accuracy of ECG signal analysis in real-time clinical settings. Overall, the manuscript is a valuable contribution to the field of biomedical signal processing and provides important information for researchers and healthcare professionals.
Wavelet-Based Approach for Automatic Seizure Detection Using EEG SignalsIRJET Journal
This document presents a wavelet-based approach for automatically detecting seizures using EEG signals. EEG data is decomposed into detailed and approximate coefficients using discrete wavelet transform up to the fourth level. Statistical features are extracted from the wavelet coefficients and the most significant features are selected using the Wilcoxon rank-sum test. Three classifiers - SVM, kNN, and ensemble subspace kNN - are used to classify EEG segments as pre-ictal, inter-ictal, or ictal. The proposed method achieves 100% classification accuracy when discriminating between healthy and epileptic EEG signals on the neurology and sleep centre EEG database.
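The Wilcoxon rank-sum feature-selection step mentioned in this abstract can be sketched directly with SciPy; the synthetic feature matrices below (one genuinely discriminative column) and the 0.01 significance threshold are assumptions for illustration, not the paper's data.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
# Hypothetical feature matrices: rows = EEG segments, columns = wavelet features
healthy = rng.normal(0.0, 1.0, size=(40, 3))
ictal = rng.normal(0.0, 1.0, size=(40, 3))
ictal[:, 0] += 3.0        # only feature 0 truly separates the two classes

# Rank-sum test per feature; keep the features with small p-values
pvals = [ranksums(healthy[:, j], ictal[:, j]).pvalue for j in range(3)]
selected = [j for j, p in enumerate(pvals) if p < 0.01]
```

Only the `selected` columns would then be fed to the SVM, kNN, and ensemble subspace kNN classifiers.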
IRJET- Study of Hypocalcemic Cardiac Disorder by Analyzing the Features o...IRJET Journal
This document presents a study that analyzes ECG signals using discrete wavelet transform (DWT) to detect hypocalcemia, a condition caused by low calcium levels. The proposed methodology involves denoising the ECG signal, detecting peaks (Q, R, S) using DWT, calculating time intervals, and using statistical measures like mean square error, root mean square deviation, and percentage deviation to distinguish between healthy and hypocalcemic patients. The results of applying this methodology to ECG signals from a database are discussed.
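The statistical measures this abstract names (mean square error, root mean square deviation, percentage deviation) are one-liners over the extracted time intervals. A minimal sketch; the "reference vs. prolonged interval" numbers are invented for illustration only.

```python
import numpy as np

def mse(ref, test):
    """Mean square error between two interval sequences."""
    return float(np.mean((np.asarray(ref, float) - np.asarray(test, float)) ** 2))

def rmsd(ref, test):
    """Root mean square deviation."""
    return float(np.sqrt(mse(ref, test)))

def pct_deviation(ref, test):
    """Mean absolute deviation as a percentage of the reference values."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    return float(np.mean(np.abs(ref - test) / np.abs(ref)) * 100)

# Hypothetical interval measurements (seconds): healthy reference vs. patient
ref = np.array([0.36, 0.38, 0.37, 0.36])
pat = np.array([0.46, 0.48, 0.47, 0.46])   # uniformly prolonged intervals
```

A large percentage deviation of the patient's intervals from the healthy reference is what would flag a hypocalcemic recording in the described methodology.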
5. detection and separation of eeg artifacts using wavelet transform nov 11, ...IAESIJEECS
This document summarizes a study on using wavelet transforms to detect and separate artifacts in EEG signals. The study aimed to minimize artifacts and noise in EEG signals without affecting the original signal. Wavelet transforms were found to be effective for analyzing non-stationary EEG signals. The results showed that wavelet transforms significantly reduced input size without compromising performance. Decomposing EEG signals using wavelet transforms extracted different frequency bands and resolved signals at different resolutions. This allowed artifacts and noise to be detected and the original signal to be recovered. Simulation results demonstrated the wavelet transform's ability to denoise EEG signals and extract key frequency components.
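The decompose-threshold-reconstruct pattern described in this summary can be illustrated with a single-level Haar DWT and soft thresholding of the detail coefficients; the Haar wavelet, one decomposition level, the threshold value, and the synthetic noisy sine are all assumptions of this sketch, not the study's actual setup.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar DWT (x must have even length)."""
    x = np.asarray(x, float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt: perfect reconstruction."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def soft_threshold(c, t):
    """Shrink coefficients toward zero; small (noise-like) ones vanish."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(1)
t_ax = np.linspace(0, 1, 256)
clean = np.sin(2 * np.pi * 4 * t_ax)
noisy = clean + 0.3 * rng.standard_normal(256)

# Denoise: keep the approximation, shrink the detail coefficients
a, d = haar_dwt(noisy)
denoised = haar_idwt(a, soft_threshold(d, 0.3))
```

The same idea, iterated over several decomposition levels and frequency bands, is what lets the study separate artifacts from the underlying EEG.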
Detection of EEG Spikes Using Machine Learning ClassifierIRJET Journal
This document discusses a study on detecting epileptic seizures from EEG data using machine learning classifiers. It begins with an introduction to epilepsy and EEG signals. Feature extraction is identified as an important step, as is understanding the statistical properties of the data. Previous studies that used time domain, frequency domain, and time-frequency domain features are summarized. Commonly used machine learning classifiers like SVMs, ANNs, and random forests are also mentioned. The methodology of the presented study involved recording EEG data from rats injected with penicillin to induce seizures, extracting time and frequency domain features, and using an SVM classifier to classify signals as epileptic or non-epileptic. The goal of the study was to analyze and identify features for classifying EEG signals.
This document summarizes a proposed FPGA-based ECG analysis system for arrhythmia detection. The system uses empirical mode decomposition (EMD) for ECG signal preprocessing to remove noise. EMD decomposes the noisy ECG signal into intrinsic mode functions (IMFs) and spectral flatness is used to identify noisy IMFs. After enhancement, R peak detection is performed using a threshold to extract heart rate for arrhythmia detection. The design was implemented on an FPGA board using Verilog and was able to detect arrhythmias through LEDs while using a small portion of the FPGA's resources.
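The spectral-flatness criterion used here to identify noisy IMFs is the ratio of the geometric to the arithmetic mean of the power spectrum: near zero for oscillatory (tonal) components, higher for noise-like ones. A minimal NumPy sketch (the test signals are assumptions; the FPGA design would of course compute this in fixed-point Verilog rather than floating-point Python):

```python
import numpy as np

def spectral_flatness(x):
    """Geometric / arithmetic mean of the power spectrum.

    Near 0 for tonal signals; substantially higher for noise-like ones
    (raw periodogram bins are noisy, so white noise lands below 1)."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    psd = psd[psd > 0]            # geometric mean is undefined for zero bins
    return float(np.exp(np.mean(np.log(psd))) / np.mean(psd))

rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)                          # noise-like "IMF"
tone = np.sin(2 * np.pi * 50 * np.arange(4096) / 1024.0)   # oscillatory "IMF"
```

Thresholding this value per IMF is what lets the design decide which components to discard before reconstruction and R-peak detection.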
New Method of R-Wave Detection by Continuous Wavelet TransformCSCJournals
In this paper we employ a new method of R-peak detection in electrocardiogram (ECG) signals. The method is based on the application of the discretised Continuous Wavelet Transform (CWT) used for the Bionic Wavelet Transform (BWT), with the Morlet wavelet as the mother wavelet. To evaluate the proposed method, we compared it with other methods based on the Discrete Wavelet Transform (DWT), using ECG signals taken from the MIT-BIH database. The results obtained show that the proposed method outperforms the conventional techniques used in our evaluation.
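The core idea, correlating the signal with a Morlet wavelet and picking maxima of the response envelope as R-peak candidates, can be sketched as follows. This is not the paper's BWT-based algorithm: the kernel width, the 50% threshold, the refractory distance, and the synthetic Gaussian-bump "ECG" are all assumptions of this illustration.

```python
import numpy as np
from scipy.signal import hilbert

def morlet_kernel(width, w0=5.0):
    """Real Morlet wavelet sampled over +/- 4 standard deviations."""
    t = np.linspace(-4, 4, width)
    return np.cos(w0 * t) * np.exp(-t ** 2 / 2)

def detect_r_peaks(sig, fs, min_dist):
    kern = morlet_kernel(int(0.1 * fs) | 1)        # ~100 ms support, odd length
    resp = np.abs(hilbert(np.convolve(sig, kern, mode="same")))  # response envelope
    thr = 0.5 * resp.max()                          # relative threshold
    peaks = []
    for i in range(1, len(resp) - 1):
        if resp[i] > thr and resp[i] >= resp[i - 1] and resp[i] > resp[i + 1]:
            if not peaks or i - peaks[-1] > min_dist:  # refractory period
                peaks.append(i)
    return np.array(peaks)

# Synthetic "ECG": narrow Gaussian bumps standing in for R waves
fs = 250
n = 3 * fs
true_peaks = [125, 375, 625]                       # one beat per second
sig = sum(np.exp(-0.5 * ((np.arange(n) - p) / 3.0) ** 2) for p in true_peaks)
peaks = detect_r_peaks(sig, fs, min_dist=fs // 2)
```

Real recordings would need baseline removal and an adaptive threshold, which is where wavelet-transform variants such as the BWT earn their keep.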
IRJET - FPGA based Electrocardiogram (ECG) Signal Analysis using Linear Phase...IRJET Journal
This document presents a design for analyzing electrocardiogram (ECG) signals using an FPGA. It employs a least-square linear phase finite impulse response filter to remove noise from the ECG signal. It then uses discrete wavelet transform for feature extraction and a backpropagation neural network classifier to classify the ECG signal as normal or abnormal. If abnormal, a support vector machine is used to detect the type of heart disease. The system is implemented on a Xilinx FPGA using MATLAB.
CLASSIFICATION OF ECG ARRHYTHMIAS USING /DISCRETE WAVELET TRANSFORM AND NEURA...IJCSEA Journal
Automatic recognition of cardiac arrhythmias is important for the diagnosis of cardiac abnormalities. Several algorithms have been proposed to classify ECG arrhythmias; however, their performance is limited. Therefore, in this paper, an expert system for ElectroCardioGram (ECG) arrhythmia classification is proposed. The discrete wavelet transform is used for processing ECG recordings and extracting features, and a Multi-Layer Perceptron (MLP) neural network performs the classification task. Two types of arrhythmias can be detected by the proposed system. Recordings from the MIT-BIH arrhythmia database have been used for training and testing our neural network based classifier. The simulation results show that the classification accuracy of our algorithm is 96.5% using 10 files including normal beats and two arrhythmias.
IRJET-Advanced Method of Epileptic detection using EEG by Wavelet DecompositionIRJET Journal
This document proposes a method to detect epileptic seizures from EEG signals using wavelet decomposition and entropy-based feature extraction. EEG data is decomposed using wavelets and features like entropy measures and power ratios in different frequency bands are extracted. These features are then used as inputs to a k-nearest neighbors classifier to classify signals as normal, ictal or inter-ictal. The method is tested on two benchmark EEG databases and aims to increase prediction accuracy of seizure onset to help localize epileptic foci. Statistical, spectral and nonlinear features are commonly used in existing methods. The proposed method uses entropy measures like Shannon, Renyi, approximate and sample entropy along with power ratios in frequency bands as features for classification.
IRJET- Detection of Abnormal ECG Signal using DWT Feature Extraction and CNNIRJET Journal
This document discusses a study that uses discrete wavelet transform (DWT) to extract features from electrocardiogram (ECG) signals and then uses a convolutional neural network (CNN) to classify the signals as normal or abnormal. DWT is used to represent the ECG signals at different resolutions, which allows numerical features to be extracted. A CNN is then trained on the extracted features to predict whether signals indicate normal or abnormal heart conditions. The goal is to develop an efficient early detection system for cardiovascular disease by combining DWT feature extraction and CNN classification of ECG signals.
Real time ecg signal analysis by using new data reduction algorithm forIAEME Publication
This document summarizes a research paper that proposes a new method for compressing electrocardiogram (ECG) signals for transmission over wireless personal area networks (WPANs). The method uses curvature analysis to select feature points in the ECG signal, including the P, Q, R, S, and T waves, which are important for diagnosis. Additional points are then selected iteratively to minimize reconstruction errors when decompressing the signal. The researchers conclude that the curvature-based method is able to preserve all important diagnostic features of the ECG signal while significantly compressing the data size for transmission over bandwidth-limited WPANs.
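A toy version of the curvature-driven point selection this summary describes: use the discrete second difference as a curvature proxy, keep the endpoints plus the highest-curvature samples, and reconstruct by linear interpolation. The sharp-Gaussian stand-in for an R wave and the point budget are assumptions; the paper's method additionally adds points iteratively to bound the reconstruction error.

```python
import numpy as np

def compress_by_curvature(sig, n_points):
    """Keep endpoints plus the n_points samples of highest discrete curvature."""
    curv = np.abs(np.diff(sig, 2))            # second difference as curvature proxy
    idx = np.argsort(curv)[::-1][:n_points] + 1   # +1: diff shifts indices
    keep = np.unique(np.r_[0, idx, len(sig) - 1])
    return keep, sig[keep]

def reconstruct(keep, vals, n):
    """Linear interpolation through the kept feature points."""
    return np.interp(np.arange(n), keep, vals)

t = np.linspace(0, 1, 400)
ecg_like = np.exp(-0.5 * ((t - 0.5) / 0.01) ** 2)   # one sharp "R wave"
keep, vals = compress_by_curvature(ecg_like, 40)
rec = reconstruct(keep, vals, len(ecg_like))
err = np.max(np.abs(rec - ecg_like))
```

The selection naturally spends its budget where the waveform bends sharply (the QRS complex) and almost nothing on the flat baseline, which is why the diagnostic features survive aggressive compression.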
Drivers’ drowsiness detection based on an optimized random forest classificat...IJECEIAES
This document summarizes a study that developed a hybrid machine learning approach for detecting driver drowsiness using electroencephalogram (EEG) signals. The study extracted features from single-channel EEG recordings in the time, frequency, and power spectral density domains. Various machine learning classifiers were tested on the features, including support vector machine, random forest, decision tree, and neural networks. The optimized random forest classifier achieved 98.5% accuracy in detecting drowsiness, with a fast processing time of 13 milliseconds. This high accuracy and speed demonstrate that the proposed hybrid approach outperforms existing methods for EEG-based driver drowsiness detection.
Noise reduction in ecg by iir filters a comparative studyIAEME Publication
The document describes a study comparing different digital filters for reducing noise in electrocardiogram (ECG) signals. ECG data was obtained from a database and noise was added, including 50Hz interference and high/low frequency noise. Fourth-order Butterworth, Chebyshev 1, Chebyshev 2, and elliptic filters were applied digitally. Butterworth filtering performed best by introducing minimum distortion while reducing noise, as determined by analyzing signal power and waveform distortion before and after filtering. The document aims to find the most effective digital filter for denoising ECG signals.
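A minimal sketch of the winning configuration described above: a 4th-order Butterworth filter applied to an ECG-like signal contaminated with 50 Hz interference. The 40 Hz low-pass cutoff, the MIT-BIH-style 360 Hz sampling rate, and the zero-phase `filtfilt` application are assumptions of this illustration, not necessarily the study's exact settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 360                                     # MIT-BIH-style sampling rate
t = np.arange(0, 4, 1 / fs)
clean = np.sin(2 * np.pi * 1.2 * t)          # slow ECG-like component
noisy = clean + 0.5 * np.sin(2 * np.pi * 50 * t)   # 50 Hz mains interference

b, a = butter(4, 40, btype="low", fs=fs)     # 4th-order Butterworth, 40 Hz cutoff
filtered = filtfilt(b, a, noisy)             # zero-phase filtering: no distortion delay

err_before = np.sqrt(np.mean((noisy - clean) ** 2))
err_after = np.sqrt(np.mean((filtered - clean) ** 2))
```

The Butterworth design's maximally flat passband is exactly the "minimum distortion" property the comparison credits it with; the Chebyshev and elliptic designs trade passband or stopband ripple for a steeper roll-off.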
Significant variables extraction of post-stroke EEG signal using wavelet and ...TELKOMNIKA JOURNAL
Stroke patients require a long recovery, and one key to successful treatment is evaluation and monitoring during that recovery. One device for monitoring the progress of post-stroke patients is the electroencephalogram (EEG). This research proposed a method for extracting variables from EEG signals for post-stroke patient analysis using wavelets and Self-Organizing Map (SOM) Kohonen clustering. The EEG signal was decomposed by wavelets to obtain the alpha, beta, theta, gamma, and Mu waves. These waves, together with the amplitude and the asymmetry of the symmetric channel pairs, are the features used in SOM Kohonen clustering. Clustering results were compared with the actual groups of post-stroke and non-stroke subjects to extract the significant variables. The results showed that the configuration of alpha, beta, and Mu waves and amplitude, together with the differences between the variables of symmetric channel pairs, are significant in the analysis of post-stroke patients; using symmetric channel pairs provided 54-74% accuracy.
Similar to Analysis electrocardiogram signal using ensemble empirical mode decomposition and time
MODELING AND ANALYSIS OF SURFACE ROUGHNESS AND WHITE LATER THICKNESS IN WIRE-...IAEME Publication
White layer thickness (WLT) formed and surface roughness in wire electric discharge turning (WEDT) of tungsten carbide composite has been made to model through response surface methodology (RSM). A Taguchi’s standard Design of experiments involving five input variables with three levels has been employed to establish a mathematical model between input parameters and responses. Percentage of cobalt content, spindle speed, Pulse on-time, wire feed and pulse off-time were changed during the experimental tests based on the Taguchi’s orthogonal array L27 (3^13). Analysis of variance (ANOVA) revealed that the mathematical models obtained can adequately describe performance within the parameters of the factors considered. There was a good agreement between the experimental and predicted values in this study.
A STUDY ON THE REASONS FOR TRANSGENDER TO BECOME ENTREPRENEURSIAEME Publication
The study explores the reasons for a transgender to become entrepreneurs. In this study transgender entrepreneur was taken as independent variable and reasons to become as dependent variable. Data were collected through a structured questionnaire containing a five point Likert Scale. The study examined the data of 30 transgender entrepreneurs in Salem Municipal Corporation of Tamil Nadu State, India. Simple Random sampling technique was used. Garrett Ranking Technique (Percentile Position, Mean Scores) was used as the analysis for the present study to identify the top 13 stimulus factors for establishment of trans entrepreneurial venture. Economic advancement of a nation is governed upon the upshot of a resolute entrepreneurial doings. The conception of entrepreneurship has stretched and materialized to the socially deflated uncharted sections of transgender community. Presently transgenders have smashed their stereotypes and are making recent headlines of achievements in various fields of our Indian society. The trans-community is gradually being observed in a new light and has been trying to achieve prospective growth in entrepreneurship. The findings of the research revealed that the optimistic changes are taking place to change affirmative societal outlook of the transgender for entrepreneurial ventureship. It also laid emphasis on other transgenders to renovate their traditional living. The paper also highlights that legislators, supervisory body should endorse an impartial canons and reforms in Tamil Nadu Transgender Welfare Board Association.
BROAD UNEXPOSED SKILLS OF TRANSGENDER ENTREPRENEURSIAEME Publication
Since ages gender difference is always a debatable theme whether caused by nature, evolution or environment. The birth of a transgender is dreadful not only for the child but also for their parents. The pain of living in the wrong physique and treated as second class victimized citizen is outrageous and fully harboured with vicious baseless negative scruples. For so long, social exclusion had perpetuated inequality and deprivation experiencing ingrained malign stigma and besieged victims of crime or violence across their life spans. They are pushed into the murky way of life with a source of eternal disgust, bereft sexual potency and perennial fear. Although they are highly visible but very little is known about them. The common public needs to comprehend the ravaged arrogance on these insensitive souls and assist in integrating them into the mainstream by offering equal opportunity, treat with humanity and respect their dignity. Entrepreneurship in the current age is endorsing the gender fairness movement. Unstable careers and economic inadequacy had inclined one of the gender variant people called Transgender to become entrepreneurs. These tiny budding entrepreneurs resulted in economic transition by means of employment, free from the clutches of stereotype jobs, raised standard of living and handful of financial empowerment. Besides all these inhibitions, they were able to witness a platform for skill set development that ignited them to enter into entrepreneurial domain. This paper epitomizes skill sets involved in trans-entrepreneurs of Thoothukudi Municipal Corporation of Tamil Nadu State and is a groundbreaking determination to sightsee various skills incorporated and the impact on entrepreneurship.
DETERMINANTS AFFECTING THE USER'S INTENTION TO USE MOBILE BANKING APPLICATIONSIAEME Publication
The banking and financial services industries are experiencing increased technology penetration. Among them, the banking industry has made technological advancements to better serve the general populace. The economy focused on transforming the banking sector's system into a cashless, paperless, and faceless one. The researcher wants to evaluate the user's intention for utilising a mobile banking application. The study also examines the variables affecting the user's behaviour intention when selecting specific applications for financial transactions. The researcher employed a well-structured questionnaire and a descriptive study methodology to gather the respondents' primary data utilising the snowball sampling technique. The study includes variables like performance expectations, effort expectations, social impact, enabling circumstances, and perceived risk. Each of the aforementioned variables has a major impact on how users utilise mobile banking applications. The outcome will assist the service provider in comprehending the user's history with mobile banking applications.
ANALYSE THE USER PREDILECTION ON GPAY AND PHONEPE FOR DIGITAL TRANSACTIONSIAEME Publication
Technology upgradation in banking sector took the economy to view that payment mode towards online transactions using mobile applications. This system enabled connectivity between banks, Merchant and user in a convenient mode. there are various applications used for online transactions such as Google pay, Paytm, freecharge, mobikiwi, oxygen, phonepe and so on and it also includes mobile banking applications. The study aimed at evaluating the predilection of the user in adopting digital transaction. The study is descriptive in nature. The researcher used random sample techniques to collect the data. The findings reveal that mobile applications differ with the quality of service rendered by Gpay and Phonepe. The researcher suggest the Phonepe application should focus on implementing the application should be user friendly interface and Gpay on motivating the users to feel the importance of request for money and modes of payments in the application.
VOICE BASED ATM FOR VISUALLY IMPAIRED USING ARDUINOIAEME Publication
The prototype of a voice-based ATM for visually impaired using Arduino is to help people who are blind. This uses RFID cards which contain users fingerprint encrypted on it and interacts with the users through voice commands. ATM operates when sensor detects the presence of one person in the cabin. After scanning the RFID card, it will ask to select the mode like –normal or blind. User can select the respective mode through voice input, if blind mode is selected the balance check or cash withdraw can be done through voice input. Normal mode procedure is same as the existing ATM.
IMPACT OF EMOTIONAL INTELLIGENCE ON HUMAN RESOURCE MANAGEMENT PRACTICES AMONG...IAEME Publication
There is increasing acceptability of emotional intelligence as a major factor in personality assessment and effective human resource management. Emotional intelligence as the ability to build capacity, empathize, co-operate, motivate and develop others cannot be divorced from both effective performance and human resource management systems. The human person is crucial in defining organizational leadership and fortunes in terms of challenges and opportunities and walking across both multinational and bilateral relationships. The growing complexity of the business world requires a great deal of self-confidence, integrity, communication, conflict and diversity management to keep the global enterprise within the paths of productivity and sustainability. Using the exploratory research design and 255 participants the result of this original study indicates strong positive correlation between emotional intelligence and effective human resource management. The paper offers suggestions on further studies between emotional intelligence and human capital development and recommends for conflict management as an integral part of effective human resource management.
VISUALISING AGING PARENTS & THEIR CLOSE CARERS LIFE JOURNEY IN AGING ECONOMYIAEME Publication
Our life journey, in general, is closely defined by the way we understand the meaning of why we coexist and deal with its challenges. As we develop the "inspiration economy", we could say that nearly all of the challenges we have faced are opportunities that help us to discover the rest of our journey. In this note paper, we explore how being faced with the opportunity of being a close carer for an aging parent with dementia brought intangible discoveries that changed our insight of the meaning of the rest of our life journey.
A STUDY ON THE IMPACT OF ORGANIZATIONAL CULTURE ON THE EFFECTIVENESS OF PERFO...IAEME Publication
The main objective of this study is to analyze the impact of aspects of Organizational Culture on the Effectiveness of the Performance Management System (PMS) in the Health Care Organization at Thanjavur. Organizational Culture and PMS play a crucial role in present-day organizations in achieving their objectives. PMS needs employees’ cooperation to achieve its intended objectives. Employees' cooperation depends upon the organization’s culture. The present study uses exploratory research to examine the relationship between the Organization's culture and the Effectiveness of the Performance Management System. The study uses a Structured Questionnaire to collect the primary data. For this study, Thirty-six non-clinical employees were selected from twelve randomly selected Health Care organizations at Thanjavur. Thirty-two fully completed questionnaires were received.
Living in 21st century in itself reminds all of us the necessity of police and its administration. As more and more we are entering into the modern society and culture, the more we require the services of the so called ‘Khaki Worthy’ men i.e., the police personnel. Whether we talk of Indian police or the other nation’s police, they all have the same recognition as they have in India. But as already mentioned, their services and requirements are different after the like 26th November, 2008 incidents, where they without saving their own lives has sacrificed themselves without any hitch and without caring about their respective family members and wards. In other words, they are like our heroes and mentors who can guide us from the darkness of fear, militancy, corruption and other dark sides of life and so on. Now the question arises, if Gandhi would have been alive today, what would have been his reaction/opinion to the police and its functioning? Would he have some thing different in his mind now what he had been in his mind before the partition or would he be going to start some Satyagraha in the form of some improvement in the functioning of the police administration? Really these questions or rather night mares can come to any one’s mind, when there is too much confusion is prevailing in our minds, when there is too much corruption in the society and when the polices working is also in the questioning because of one or the other case throughout the India. It is matter of great concern that we have to thing over our administration and our practical approach because the police personals are also like us, they are part and parcel of our society and among one of us, so why we all are pin pointing towards them.
A STUDY ON TALENT MANAGEMENT AND ITS IMPACT ON EMPLOYEE RETENTION IN SELECTED...IAEME Publication
The goal of this study was to see how talent management affected employee retention in the selected IT organizations in Chennai. The fundamental issue was the difficulty to attract, hire, and retain talented personnel who perform well and the gap between supply and demand of talent acquisition and retaining them within the firms. The study's main goals were to determine the impact of talent management on employee retention in IT companies in Chennai, investigate talent management strategies that IT companies could use to improve talent acquisition, performance management, career planning and formulate retention strategies that the IT firms could use. The respondents were given a structured close-ended questionnaire with the 5 Point Likert Scale as part of the study's quantitative research design. The target population consisted of 289 IT professionals. The questionnaires were distributed and collected by the researcher directly. The Statistical Package for Social Sciences (SPSS) was used to collect and analyse the questionnaire responses. Hypotheses that were formulated for the various areas of the study were tested using a variety of statistical tests. The key findings of the study suggested that talent management had an impact on employee retention. The studies also found that there is a clear link between the implementation of talent management and retention measures. Management should provide enough training and development for employees, clarify job responsibilities, provide adequate remuneration packages, and recognise employees for exceptional performance.
ATTRITION IN THE IT INDUSTRY DURING COVID-19 PANDEMIC: LINKING EMOTIONAL INTE...IAEME Publication
Globally, Millions of dollars were spent by the organizations for employing skilled Information Technology (IT) professionals. It is costly to replace unskilled employees with IT professionals possessing technical skills and competencies that aid in interconnecting the business processes. The organization’s employment tactics were forced to alter by globalization along with technological innovations as they consistently diminish to remain lean, outsource to concentrate on core competencies along with restructuring/reallocate personnel to gather efficiency. As other jobs, organizations or professions have become reasonably more appropriate in a shifting employment landscape, the above alterations trigger both involuntary as well as voluntary turnover. The employee view on jobs is also afflicted by the COVID-19 pandemic along with the employee-driven labour market. So, having effective strategies is necessary to tackle the withdrawal rate of employees. By associating Emotional Intelligence (EI) along with Talent Management (TM) in the IT industry, the rise in attrition rate was analyzed in this study. Only 303 respondents were collected out of 350 participants to whom questionnaires were distributed. From the employees of IT organizations located in Bangalore (India), the data were congregated. A simple random sampling methodology was employed to congregate data as of the respondents. Generating the hypothesis along with testing is eventuated. The effect of EI and TM along with regression analysis between TM and EI was analyzed. The outcomes indicated that employee and Organizational Performance (OP) were elevated by effective EI along with TM.
INFLUENCE OF TALENT MANAGEMENT PRACTICES ON ORGANIZATIONAL PERFORMANCE A STUD...IAEME Publication
By implementing talent management strategy, organizations would have the option to retain their skilled professionals while additionally working on their overall performance. It is the course of appropriately utilizing the ideal individuals, setting them up for future top positions, exploring and dealing with their performance, and holding them back from leaving the organization. It is employee performance that determines the success of every organization. The firm quickly obtains an upper hand over its rivals in the event that its employees having particular skills that cannot be duplicated by the competitors. Thus, firms are centred on creating successful talent management practices and processes to deal with the unique human resources. Firms are additionally endeavouring to keep their top/key staff since on the off chance that they leave; the whole store of information leaves the firm's hands. The study's objective was to determine the impact of talent management on organizational performance among the selected IT organizations in Chennai. The study recommends that talent management limitedly affects performance. On the off chance that this talent is appropriately management and implemented properly, organizations might benefit as much as possible from their maintained assets to support development and productivity, both monetarily and non-monetarily.
A STUDY OF VARIOUS TYPES OF LOANS OF SELECTED PUBLIC AND PRIVATE SECTOR BANKS...IAEME Publication
Banking regulations act of India, 1949 defines banking as “acceptance of deposits for the purpose of lending or investment from the public, repayment on demand or otherwise and withdrawable through cheques, drafts order or otherwise”, the major participants of the Indian financial system are commercial banks, the financial institution encompassing term lending institutions. Investments institutions, specialized financial institution and the state level development banks, non banking financial companies (NBFC) and other market intermediaries such has the stock brokers and money lenders are among the oldest of the certain variants of NBFC and the oldest market participants. The asset quality of banks is one of the most important indicators of their financial health. The Indian banking sector has been facing severe problems of increasing Non- Performing Assets (NPAs). The NPAs growth directly and indirectly affects the quality of assets and profitability of banks. It also shows the efficiency of banks credit risk management and the recovery effectiveness. NPA do not generate any income, whereas, the bank is required to make provisions for such as assets that why is a double edge weapon. This paper outlines the concept of quality of bank loans of different types like Housing, Agriculture and MSME loans in state Haryana of selected public and private sector banks. This study is highlighting problems associated with the role of commercial bank in financing Small and Medium Scale Enterprises (SME). The overall objective of the research was to assess the effect of the financing provisions existing for the setting up and operations of MSMEs in the country and to generate recommendations for more robust financing mechanisms for successful operation of the MSMEs, in turn understanding the impact of MSME loans on financial institutions due to NPA. 
Much research has been conducted on the topic of Non-Performing Asset (NPA) management, concerning particular banks, comparative studies of public and private banks, and so on. In this paper the researcher considers aggregate data for selected public sector and private sector banks and compares the NPAs of Housing, Agriculture and MSME loans in the state of Haryana across the two sectors. The tools used in the study are averages, variance and the ANOVA test. The findings reveal that NPAs are a common problem for both public and private sector banks and are associated with all types of loans, whether housing loans, agriculture loans or loans to SMEs. NPAs of both public and private sector banks show an increasing trend. In 2010-11 the gross NPA (GNPA) of the public and private sectors stood at the same level of about 2%, but after 2010-11 it increased many-fold, and at present GNPA in some banks exceeds 15%. This is a dark area of the Indian banking sector.
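The sector comparison the abstract describes can be sketched with a one-way ANOVA computed by hand. The GNPA figures below are hypothetical placeholders, not data from the study:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """Return the F statistic for a one-way ANOVA across the given groups."""
    all_obs = [x for g in groups for x in g]
    grand = mean(all_obs)              # grand mean over all observations
    k, n = len(groups), len(all_obs)
    # Between-group and within-group sums of squares
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical year-wise GNPA ratios (%) for the two sectors
public_gnpa = [2.1, 3.4, 5.8, 9.2, 11.5, 14.9]
private_gnpa = [2.0, 2.3, 2.9, 3.6, 4.1, 4.8]

f_stat = one_way_anova_f(public_gnpa, private_gnpa)
print(round(f_stat, 2))  # a large F suggests the sector means differ
```

Comparing the F statistic against the critical F value for (1, 10) degrees of freedom would then decide whether the difference between the sectors is significant.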
EXPERIMENTAL STUDY OF MECHANICAL AND TRIBOLOGICAL RELATION OF NYLON/BaSO4 POL...IAEME Publication
The experiments conducted in this study found that BaSO4 changes the mechanical properties of Nylon 6. Nylon 6/BaSO4 composites were prepared with varying weight ratios of BaSO4, and the researcher investigated the hardness and wear behaviour of the composites. Experiments were designed using a Taguchi L9 array. The hardness of the Nylon 6/BaSO4 composites was measured with a Rockwell hardness testing apparatus. The wear behaviour of Nylon/BaSO4 was measured on a pin-on-disc wear monitor by varying the reinforcement content, sliding speed and sliding distance, and the microstructure of the fracture surfaces was observed by SEM. The study finds that increasing the BaSO4 content up to 16% contributes significantly to the ultimate strength of the composites, and that sliding speed contributes 72.45% to the wear rate.
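A Taguchi L9 wear analysis typically converts each run's replicate measurements into a smaller-the-better signal-to-noise ratio before ranking factor contributions. A minimal sketch with hypothetical wear-rate data (the run labels and factor levels are illustrative, not the paper's):

```python
import math

def sn_smaller_the_better(observations):
    """Taguchi smaller-the-better S/N ratio in dB:
    S/N = -10 * log10(mean(y^2)); a larger S/N means less wear."""
    mean_sq = sum(y * y for y in observations) / len(observations)
    return -10 * math.log10(mean_sq)

# Hypothetical wear-rate replicates (mm^3/m) for three of the nine L9 runs
runs = {
    "run1 (4% BaSO4, 1 m/s)": [0.012, 0.014],
    "run2 (8% BaSO4, 2 m/s)": [0.018, 0.017],
    "run3 (16% BaSO4, 3 m/s)": [0.025, 0.027],
}
for name, obs in runs.items():
    print(name, round(sn_smaller_the_better(obs), 2))
```

Averaging the S/N ratios per factor level, and then performing an ANOVA on them, is how percentage contributions such as the 72.45% for sliding speed are usually obtained.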
ROLE OF SOCIAL ENTREPRENEURSHIP IN RURAL DEVELOPMENT OF INDIA - PROBLEMS AND ...IAEME Publication
The majority of the population in India lives in villages; the village is the backbone of the country. Village or rural industries play an important role in the national economy, particularly in rural development. Developing the rural economy is one of the key indicators of a country's success: whether it is looking after the welfare of farmers or investing in rural infrastructure, governments have to ensure that rural development is not compromised. The economic development of our country largely depends on the progress of rural areas and the standard of living of the rural masses. Rural entrepreneurship is based on stimulating local entrepreneurial talent and the subsequent growth of indigenous enterprises. It recognizes opportunity in rural areas and mobilizes a unique blend of resources either inside or outside of agriculture. Rural entrepreneurship brings economic value to the rural sector by creating new methods of production, new markets and new products, and by generating employment opportunities, thereby ensuring continuous rural development. Social entrepreneurship has the direct and primary objective of serving society along with earning profits. It therefore differs from economic entrepreneurship: its basic objective is not to earn profits but to provide innovative solutions to societal needs that the majority of entrepreneurs do not address, since they are in business with profit-making as their sole objective. Social entrepreneurs thus have huge growth potential, particularly in developing countries like India, where there are huge societal disparities in the financial positions of the population.
Twenty-two percent of the Indian population is still below the poverty line (BPL), and there is a disparity between the rural and urban populations in terms of families living under the BPL: 25.7 percent of the rural population and 13.7 percent of the urban population are under the BPL, which clearly shows how poverty is concentrated in rural areas. The need to develop social entrepreneurship in agriculture is dictated by a large number of social problems, including low living standards, unemployment and social tension; these are the factors that led to the emergence of the practice of social entrepreneurship. The research problem lies in disclosing the importance of the role of social entrepreneurship in the rural development of India. The paper examines the tendencies of social entrepreneurship in India and presents successful examples of such businesses, in order to provide recommendations on how to improve the situation in rural areas through social entrepreneurship development. The Indian government has taken some steps towards the development of social enterprises, social entrepreneurship, and social innovation, but a lot remains to be improved.
OPTIMAL RECONFIGURATION OF POWER DISTRIBUTION RADIAL NETWORK USING HYBRID MET...IAEME Publication
The distribution system is a critical link between the electric power distributor and the consumers. The network most commonly used by electric utilities is the radial distribution network. However, this type of network suffers from technical issues such as large power losses, which affect the quality of supply. Nowadays, the introduction of Distributed Generation (DG) units into the system helps improve and support the voltage profile of the network, as well as the performance of the system components, through power loss mitigation. In this study, network reconfiguration was performed using a hybrid of two meta-heuristic algorithms, Particle Swarm Optimization and the Gravitational Search Algorithm (PSO-GSA), to enhance power quality and the voltage profile of the system when applied simultaneously with DG units. The Backward/Forward Sweep Method was used for the load flow analysis and simulated in MATLAB. Five cases were considered in the reconfiguration based on the contribution of DG units. The proposed method was tested on the IEEE 33-bus system. Based on the results, there was a voltage profile improvement in the system from 0.9038 p.u. to 0.9594 p.u., and the integration of DG in the network reduced power losses from 210.98 kW to 69.3963 kW. Simulated results are presented to show the performance of each case.
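The PSO half of such a hybrid can be sketched as follows. This is a generic particle swarm minimizer, not the authors' PSO-GSA implementation, and the quadratic objective merely stands in for the total line loss that a backward/forward-sweep load flow would return for a candidate configuration:

```python
import random

random.seed(0)  # reproducible run for this sketch

def pso_minimize(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over a box-constrained search space."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: in the actual study this would be the line loss from the
# backward/forward-sweep load flow for a candidate switch/DG configuration.
losses = lambda x: sum((xi - 0.95) ** 2 for xi in x)
best, best_val = pso_minimize(losses, dim=3, bounds=(0.9, 1.1))
print(best_val)  # should be close to 0
```

The GSA component would add gravitational attraction between particles to the velocity update; combining the two is what gives the hybrid its exploration/exploitation balance.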
APPLICATION OF FRUGAL APPROACH FOR PRODUCTIVITY IMPROVEMENT - A CASE STUDY OF...IAEME Publication
Manufacturing industries have witnessed an outburst in productivity improvement initiatives, many of them based on lean tools and techniques. In several manufacturing industries, however, the frugal approach is applied in product design and services as an improvement tool. The frugal approach has helped prove that less is more, and it appears to contribute indirectly to productivity improvement; there is therefore a need to understand the status of frugal-approach applications in manufacturing industries. All manufacturing industries are working hard and putting in continuous effort to remain competitive, and they are coming up with effective and efficient solutions in manufacturing processes and operations. To overcome current challenges, manufacturing industries have started using the frugal approach in product design and services. The methodology adopted for this study draws on both primary and secondary sources of data: interviews and observation for the primary source, and a review of the available literature on websites, in printed magazines, manuals, etc., for the secondary source. An attempt has been made to understand the application of the frugal approach through the study of a manufacturing industry project; the industry selected for this project study is Mahindra and Mahindra Ltd. This paper will help researchers find the connections between the two concepts of productivity improvement and the frugal approach, understand the significance of the frugal approach for productivity improvement in manufacturing industry, and understand its current status there. In manufacturing industries, various processes are involved in delivering the final product, and productivity plays a very critical role in converting inputs into outputs.
Hence this study will help establish the status of the frugal approach in productivity improvement programmes. The notion of frugality can be viewed as an approach towards productivity improvement in manufacturing industries.
A MULTIPLE – CHANNEL QUEUING MODELS ON FUZZY ENVIRONMENTIAEME Publication
In this paper, we investigate a multiple-channel queuing model (M/M/C) (∞/FCFS) in a fuzzy environment and study its performance under realistic conditions. A nonagonal fuzzy number is applied to analyse the relevant performance measures of the model. Based on the sub-interval average ranking method for nonagonal fuzzy numbers, the fuzzy parameters are converted to crisp ones. Numerical results reveal the efficiency of this method; intuitively, the fuzzy environment adapts very well to the multiple-channel queuing model (M/M/C) (∞/FCFS).
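For reference, the crisp (non-fuzzy) M/M/C performance measures that the fuzzy model reduces to after defuzzification follow the standard formulas; the arrival and service rates below are illustrative values, not figures from the paper:

```python
from math import factorial

def mmc_metrics(lam, mu, c):
    """Steady-state M/M/c measures (the fuzzy model replaces lam and mu with
    fuzzy numbers and converts them to crisp values via a ranking method)."""
    a = lam / mu                 # offered load (Erlangs)
    rho = a / c                  # per-server utilization, must be < 1
    p0 = 1 / (sum(a**n / factorial(n) for n in range(c))
              + a**c / (factorial(c) * (1 - rho)))      # P(system empty)
    lq = p0 * a**c * rho / (factorial(c) * (1 - rho) ** 2)  # mean queue length
    wq = lq / lam                # mean wait in queue (Little's law)
    return {"utilization": rho, "Lq": lq, "Wq": wq,
            "L": lq + a, "W": wq + 1 / mu}

m = mmc_metrics(lam=8.0, mu=3.0, c=4)  # e.g. 8 arrivals/hr, 3 services/hr, 4 servers
print({k: round(v, 3) for k, v in m.items()})
```

In the fuzzy version, each of these measures becomes a fuzzy quantity computed from the nonagonal fuzzy arrival and service rates before ranking converts it to a single crisp number.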
Threats to mobile devices are increasingly prevalent and growing in scope and complexity. Users of mobile devices want to take full advantage of the features
available on those devices, but many of those features trade security for convenience and capability. This best practices guide outlines steps users can take to better protect their personal devices and information.
AI-Powered Food Delivery Transforming App Development in Saudi Arabia.pdfTechgropse Pvt.Ltd.
In this blog post, we'll delve into the intersection of AI and app development in Saudi Arabia, focusing on the food delivery sector. We'll explore how AI is revolutionizing the way Saudi consumers order food, how restaurants manage their operations, and how delivery partners navigate the bustling streets of cities like Riyadh, Jeddah, and Dammam. Through real-world case studies, we'll showcase how leading Saudi food delivery apps are leveraging AI to redefine convenience, personalization, and efficiency.
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is a widely used ETL tool for processing, indexing and ingesting data into a serving stack for search. Milvus is a production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to the Milvus vector database for search serving.
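What the serving stack does at query time can be illustrated with a brute-force cosine-similarity search; Milvus performs the same lookup at scale using approximate-nearest-neighbor indexes, and the toy 3-d embeddings below are placeholders for the vectors Spark would extract:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def top_k(query_vec, corpus, k=2):
    """Brute-force nearest-neighbor search: rank all docs by similarity."""
    scored = sorted(corpus.items(),
                    key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

corpus = {
    "doc_a": [0.9, 0.1, 0.0],
    "doc_b": [0.1, 0.9, 0.1],
    "doc_c": [0.8, 0.2, 0.1],
}
print(top_k([1.0, 0.0, 0.0], corpus))  # ['doc_a', 'doc_c']
```

The pipeline in the talk replaces this linear scan with an index built over millions of vectors, which is what makes the search production-ready.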
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
AI 101: An Introduction to the Basics and Impact of Artificial IntelligenceIndexBug
Imagine a world where machines not only perform tasks but also learn, adapt, and make decisions. This is the promise of Artificial Intelligence (AI), a technology that's not just enhancing our lives but revolutionizing entire industries.
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
CAKE: Sharing Slices of Confidential Data on BlockchainClaudio Di Ciccio
Presented at the CAiSE 2024 Forum, Intelligent Information Systems, June 6th, Limassol, Cyprus.
Synopsis: Cooperative information systems typically involve various entities in a collaborative process within a distributed environment. Blockchain technology offers a mechanism for automating such processes, even when only partial trust exists among participants. The data stored on the blockchain is replicated across all nodes in the network, ensuring accessibility to all participants. While this aspect facilitates traceability, integrity, and persistence, it poses challenges for adopting public blockchains in enterprise settings due to confidentiality issues. In this paper, we present a software tool named Control Access via Key Encryption (CAKE), designed to ensure data confidentiality in scenarios involving public blockchains. After outlining its core components and functionalities, we showcase the application of CAKE in the context of a real-world cyber-security project within the logistics domain.
Paper: https://doi.org/10.1007/978-3-031-61000-4_16
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/building-and-scaling-ai-applications-with-the-nx-ai-manager-a-presentation-from-network-optix/
Robin van Emden, Senior Director of Data Science at Network Optix, presents the “Building and Scaling AI Applications with the Nx AI Manager,” tutorial at the May 2024 Embedded Vision Summit.
In this presentation, van Emden covers the basics of scaling edge AI solutions using the Nx tool kit. He emphasizes the process of developing AI models and deploying them globally. He also showcases the conversion of AI models and the creation of effective edge AI pipelines, with a focus on pre-processing, model conversion, selecting the appropriate inference engine for the target hardware and post-processing.
van Emden shows how Nx can simplify the developer’s life and facilitate a rapid transition from concept to production-ready applications. He provides valuable insights into developing scalable and efficient edge AI solutions, with a strong focus on practical implementation.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
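A cheap first gate in such a workflow is checking that AI-generated markup is well-formed before handing it to XSD or Schematron validation. A minimal sketch using Python's standard library (the sample snippets are hypothetical):

```python
import xml.etree.ElementTree as ET

def check_well_formed(xml_text):
    """Return (True, root tag) if the text parses as XML,
    otherwise (False, parser error message)."""
    try:
        root = ET.fromstring(xml_text)
        return True, root.tag
    except ET.ParseError as err:
        return False, str(err)

good = "<article><title>AI and XML</title></article>"
bad = "<article><title>AI and XML</article>"   # unclosed <title>

print(check_well_formed(good))     # (True, 'article')
print(check_well_formed(bad)[0])   # False
```

Only content that passes this gate would be worth the cost of full schema validation or an AI-driven refactoring pass.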
Your One-Stop Shop for Python Success: Top 10 US Python Development Providersakankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
Ocean lotus Threat actors project by John Sitima 2024 (1).pptxSitimaJohn
Ocean Lotus cyber threat actors represent a sophisticated, persistent, and politically motivated group that poses a significant risk to organizations and individuals in the Southeast Asian region. Their continuous evolution and adaptability underscore the need for robust cybersecurity measures and international cooperation to identify and mitigate the threats posed by such advanced persistent threat groups.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
How to Get CNIC Information System with Paksim Ga.pptxdanishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
GraphRAG for Life Science to increase LLM accuracyTomaz Bratanic
GraphRAG for the life science domain, where you retrieve information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
TrustArc Webinar - 2024 Global Privacy SurveyTrustArc
How does your privacy program stack up against your peers? What challenges are privacy teams tackling and prioritizing in 2024?
In the fifth annual Global Privacy Benchmarks Survey, we asked over 1,800 global privacy professionals and business executives to share their perspectives on the current state of privacy inside and outside of their organizations. This year’s report focused on emerging areas of importance for privacy and compliance professionals, including considerations and implications of Artificial Intelligence (AI) technologies, building brand trust, and different approaches for achieving higher privacy competence scores.
See how organizational priorities and strategic approaches to data security and privacy are evolving around the globe.
This webinar will review:
- The top 10 privacy insights from the fifth annual Global Privacy Benchmarks Survey
- The top challenges for privacy leaders, practitioners, and organizations in 2024
- Key themes to consider in developing and maintaining your privacy program