In this paper, an optimization heuristic search technique, Hybrid Evolutionary Programming (HEP), is applied to the synthesis of three-ring Concentric Circular Antenna Arrays (CCAA) with the goal of maximum sidelobe-level reduction. The paper assumes non-uniform excitations and uniform spacing of excitation elements in each three-ring CCAA design. Experimental results reveal that designing non-uniformly excited CCAAs with optimal current excitations using HEP provides considerable sidelobe level reduction with respect to uniform current excitation with d = λ/2 element-to-element spacing. Among the various CCAA designs, the design containing a central element and 4, 6 and 8 elements in three successive concentric rings proves to be the global optimal design, with the global minimum SLL (-40.22 dB) as determined by HEP.
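As a rough illustration of the quantity being optimized, the sketch below evaluates the far-field array factor of a three-ring concentric circular array in a single cut plane. The ring radii, the isotropic-element assumption, and the uniform excitation default are illustrative choices, not values taken from the paper:

```python
import numpy as np

def ccaa_array_factor(ring_elems, ring_radii, theta, wavelength=1.0, excitations=None):
    """Far-field array factor of a concentric circular antenna array in the
    phi = 0 cut (simplified model: isotropic elements, wavelengths as units)."""
    k = 2 * np.pi / wavelength
    af = np.zeros_like(theta, dtype=complex)
    if excitations is None:                       # uniform excitation by default
        excitations = [np.ones(n) for n in ring_elems]
    for n, r, cur in zip(ring_elems, ring_radii, excitations):
        phi_n = 2 * np.pi * np.arange(n) / n      # element angular positions on the ring
        # each element contributes a phase set by its projected position
        phase = k * r * np.sin(theta)[:, None] * np.cos(phi_n)[None, :]
        af += (cur[None, :] * np.exp(1j * phase)).sum(axis=1)
    return np.abs(af)

theta = np.linspace(-np.pi / 2, np.pi / 2, 2001)
# 4, 6, 8 elements on rings of radius 0.5, 1.0, 1.5 wavelengths (assumed geometry)
af = ccaa_array_factor([4, 6, 8], [0.5, 1.0, 1.5], theta)
af_db = 20 * np.log10(af / af.max())              # normalized pattern in dB
```

An SLL optimizer such as HEP would search over the per-element excitations to push the largest value of `af_db` outside the main beam as low as possible.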
Performance Analysis of Adaptive DOA Estimation Algorithms For Mobile Applica... – IJERA Editor
Spatial filtering for mobile communications has attracted a lot of attention over the last decade and is currently considered a very promising technique that will help future cellular networks achieve their ambitious goals. One way to accomplish this is via array signal processing with algorithms that estimate the Direction-Of-Arrival (DOA) of the received waves from the mobile users. This paper evaluates the performance of a number of DOA estimation algorithms. In all cases a linear antenna array at the base station is assumed to be operating in a typical cellular environment.
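The abstract does not name the specific algorithms evaluated, but a classic estimator in this family is MUSIC. The sketch below (uniform linear array with λ/2 spacing, one source, and all scenario parameters assumed for illustration) shows the core idea of scanning a noise-subspace pseudo-spectrum:

```python
import numpy as np

def music_spectrum(X, n_sources, scan_deg):
    """MUSIC pseudo-spectrum for a uniform linear array with lambda/2 spacing.
    X: complex snapshot matrix of shape (n_elements, n_snapshots)."""
    M = X.shape[0]
    R = X @ X.conj().T / X.shape[1]          # sample covariance matrix
    _, vecs = np.linalg.eigh(R)              # eigh returns ascending eigenvalues
    En = vecs[:, : M - n_sources]            # noise subspace (smallest eigenvalues)
    p = []
    for th in np.deg2rad(scan_deg):
        a = np.exp(1j * np.pi * np.arange(M) * np.sin(th))   # steering vector
        p.append(1.0 / np.abs(a.conj() @ En @ En.conj().T @ a))
    return np.array(p)

rng = np.random.default_rng(0)
M, N, true_deg = 8, 200, 20.0                # assumed array size, snapshots, DOA
a = np.exp(1j * np.pi * np.arange(M) * np.sin(np.deg2rad(true_deg)))
s = rng.standard_normal(N) + 1j * rng.standard_normal(N)
X = np.outer(a, s) + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
scan = np.arange(-90.0, 90.5, 0.5)
spec = music_spectrum(X, n_sources=1, scan_deg=scan)
```

The scan angle with the largest pseudo-spectrum value is taken as the DOA estimate.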
International Journal of Engineering Research and Applications (IJERA) is an open access online peer reviewed international journal that publishes research and review articles in the fields of Computer Science, Neural Networks, Electrical Engineering, Software Engineering, Information Technology, Mechanical Engineering, Chemical Engineering, Plastic Engineering, Food Technology, Textile Engineering, Nano Technology & science, Power Electronics, Electronics & Communication Engineering, Computational mathematics, Image processing, Civil Engineering, Structural Engineering, Environmental Engineering, VLSI Testing & Low Power VLSI Design etc.
A New Approach of Medical Image Fusion using Discrete Wavelet Transform – IDES Editor
MRI-PET medical image fusion has important clinical significance. Medical image fusion is the key step after registration; it is an integrative display method for two images. The PET image shows brain function with a low spatial resolution, while the MRI image shows brain tissue anatomy and contains no functional information. Hence, a well fused image should contain both the functional information and more spatial characteristics, with no spatial or color distortion. The DWT coefficients of the MRI-PET intensity values are fused based on the even degree method and the cross-correlation method. The performance of the proposed image fusion scheme is evaluated with PSNR and RMSE, and it is also compared with existing techniques.
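The two evaluation metrics named above, RMSE and PSNR, can be sketched as follows (a peak value of 255 for 8-bit images is an assumption, not stated in the abstract):

```python
import numpy as np

def rmse(reference, fused):
    """Root-mean-square error between a reference image and a fused image."""
    diff = reference.astype(float) - fused.astype(float)
    return float(np.sqrt(np.mean(diff ** 2)))

def psnr(reference, fused, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    e = rmse(reference, fused)
    return float("inf") if e == 0 else 20 * np.log10(peak / e)

ref = np.full((8, 8), 100, dtype=np.uint8)
fused = ref.copy()
fused[0, 0] = 110                      # one pixel off by 10 grey levels
```

For this toy pair, `rmse` is sqrt(100/64) = 1.25 and `psnr` is 20·log10(255/1.25).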
The document compares three image fusion techniques: wavelet transform, IHS (Intensity-Hue-Saturation), and PCA (Principal Component Analysis). For each technique, it describes the methodology, the syntax used, and the features. It then applies each technique to sample images to produce fused images. The RGB values of the fused images are recorded and compared in a table. The wavelet technique uses max-area selection and consistency verification for feature selection. IHS transforms RGB to IHS values and replaces the intensity with a panchromatic image. PCA replaces the first principal component with a high-resolution panchromatic image. The document concludes that no single technique is best and that quality depends on the application.
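A common grayscale variant of PCA-based fusion weights the two source images by the dominant eigenvector of their pixel covariance. The sketch below is that simplified variant, not the document's exact procedure, and the weight normalization is an assumption:

```python
import numpy as np

def pca_fuse(img_a, img_b):
    """PCA-based fusion of two grayscale images: the dominant eigenvector of
    the 2x2 pixel covariance supplies the fusion weights (illustrative sketch)."""
    a = img_a.astype(float).ravel()
    b = img_b.astype(float).ravel()
    cov = np.cov(np.stack([a, b]))          # 2x2 covariance of the two images
    _, vecs = np.linalg.eigh(cov)           # eigh: ascending eigenvalues
    v = vecs[:, -1]                         # dominant principal component
    w = v / v.sum()                         # normalize weights to sum to 1
    return w[0] * img_a.astype(float) + w[1] * img_b.astype(float)

rng_a = np.random.default_rng(0)
rng_b = np.random.default_rng(1)
a = rng_a.normal(100, 20, (16, 16))
b = a + rng_b.normal(0, 1, (16, 16))        # strongly correlated source pair
f = pca_fuse(a, b)
```

With two nearly identical sources the weights come out close to 0.5 each, so the fused image tracks their average.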
Biogeography Based Optimization Approach for Optimal Power Flow Problem Consi... – IDES Editor
This paper presents a novel Biogeography Based Optimization (BBO) algorithm for solving multi-objective constrained optimal power flow (OPF) problems in power systems. The feasibility of the proposed algorithm is demonstrated on the IEEE 30-bus system with three different objective functions, and it is compared with other well-established population-based optimization techniques. A comparison of simulation results reveals better solution quality and computational efficiency of the proposed algorithm over Particle Swarm Optimization (PSO) and the Real Coded Genetic Algorithm (RGA) for the global optimization of multi-objective constrained OPF problems.
Performance Evaluation of Quarter Shift Dual Tree Complex Wavelet Transform B... – IJECEIAES
In this paper, multifocus image fusion using the quarter shift dual tree complex wavelet transform is proposed. Multifocus image fusion is a technique that combines the partially focused regions of multiple images of the same scene into a fully focused fused image. Directional selectivity and shift invariance are essential to produce a high quality fused image. However, conventional wavelet based fusion algorithms introduce ringing artifacts into the fused image due to their lack of shift invariance and poor directionality. The quarter shift dual tree complex wavelet transform has proven to be an effective multi-resolution transform for image fusion, given its directional and shift invariant properties. Experimentation with this transform led to the conclusion that the proposed method not only produces sharp details (focused regions) in the fused image due to its good directionality but also removes artifacts with its shift invariance, yielding a high quality fused image. The proposed method's performance is compared with traditional fusion methods in terms of objective measures.
Analysis of Image Super-Resolution via Reconstruction Filters for Pure Transl... – CSCJournals
In this work, a special case of the image super-resolution problem where the only type of motion is global translational motion and the blurs are shift-invariant is investigated. The necessary conditions for exact reconstruction of the original image by using finite impulse-response reconstruction filters are investigated and determined. If the number of available low-resolution images is larger than a threshold and the blur functions meet a certain property, a reconstruction filter set for perfect image super-resolution can be generated even in the absence of motion. Given that the conditions are satisfied, a method for exact super-resolution is presented to validate the analysis results and it is shown that for the fully determined case, perfect reconstruction of the original image is achieved. Finally, some realistic conditions that make the super-resolution problem ill-posed are treated and their effects on exact super-resolution are discussed.
Noise resistance territorial intensity-based optical flow using inverse confi... – journalBEEI
This paper presents the use of the inverse confidential technique on a bilateral function with territorial intensity-based optical flow to demonstrate its effectiveness in noisy environments. In general, an image's motion vector is coded by the technique called optical flow, where a sequence of images is used to determine the motion vector. However, the accuracy of the motion vector is reduced when the source image sequence is corrupted by noise. This work shows that the inverse confidential technique on a bilateral function can increase the accuracy of motion vector determination by territorial intensity-based optical flow in a noisy environment. We tested with several kinds of non-Gaussian noise on several patterns of standard image sequences, analyzing the resulting motion vectors in the form of the error vector magnitude (EVM), and compared against several noise resistance techniques for the territorial intensity-based optical flow method.
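The error vector magnitude used above to score flow accuracy can be sketched as follows; the per-pixel-magnitude-then-mean formulation is a common definition and an assumption here:

```python
import numpy as np

def error_vector_magnitude(estimated, ground_truth):
    """Mean error-vector magnitude between estimated and ground-truth flow
    fields of shape (H, W, 2); lower is better."""
    diff = estimated - ground_truth
    return float(np.mean(np.sqrt((diff ** 2).sum(axis=-1))))

gt = np.zeros((4, 4, 2))       # ground-truth flow: no motion
est = gt.copy()
est[..., 0] = 3.0              # constant 3-pixel error in x
est[..., 1] = 4.0              # constant 4-pixel error in y
```

For this toy field every error vector is (3, 4), so the EVM is exactly 5 pixels.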
FPGA implementation of fusion technique for fingerprint application – IAEME Publication
Image fusion is a process of combining relevant information from a set of images into a single image, wherein the resultant fused image is more informative and complete than any of the input images. This paper discusses Laplacian Pyramid (LP) based image fusion techniques for a fingerprint application. The technique is implemented in MATLAB, and the evaluation parameters Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR) and matching score are discussed. The same technique is also implemented on a Virtex-5 FPGA development board using Verilog HDL. The LP based technique provides better results for image fusion than other techniques.
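A minimal Laplacian-pyramid fusion sketch is given below. It uses simplified average/repeat pyramid operators and max-absolute detail selection; the paper's actual filters and selection rule may differ:

```python
import numpy as np

def down(img):
    """2x2 block average (simplified Gaussian-pyramid reduce)."""
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def up(img):
    """Nearest-neighbour upsample (simplified expand)."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def lp_fuse(a, b, levels=2):
    """Laplacian-pyramid fusion: max-absolute selection on detail levels,
    averaging at the coarsest approximation level."""
    la, lb = [], []
    for _ in range(levels):
        da, db = down(a), down(b)
        la.append(a - up(da))               # Laplacian detail of image a
        lb.append(b - up(db))               # Laplacian detail of image b
        a, b = da, db
    fused = (a + b) / 2                     # fuse coarsest approximations
    for ca, cb in zip(reversed(la), reversed(lb)):
        detail = np.where(np.abs(ca) >= np.abs(cb), ca, cb)
        fused = up(fused) + detail          # rebuild, one level at a time
    return fused

x = np.outer(np.arange(8.0), np.ones(8))    # simple 8x8 gradient test image
f = lp_fuse(x, x)                           # fusing an image with itself
```

Because these reduce/expand operators form an exact analysis/synthesis pair, fusing an image with itself reconstructs it exactly, which makes a convenient sanity check.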
IRJET – Lunar Image Fusion based on DT-CWT, Curvelet Transform and NSCT – IRJET Journal
This document compares three image fusion techniques - Dual Tree Complex Wavelet Transform (DT-CWT), Curvelet transform, and Nonsubsampled Contourlet Transform (NSCT) - for fusing high spectral resolution lunar images from the HySI instrument and high spatial resolution images from the TMC instrument on the Chandrayaan-1 satellite. Statistical analysis of the fused images shows that the NSCT technique best preserves both the spectral information of the HySI image and the spatial information of the TMC image, with correlation coefficients over 0.99, higher entropy and average gradients, and lower root mean square errors than the other techniques. Therefore, NSCT produces the highest quality fused lunar images according to these evaluation metrics.
This document discusses the application of photon counting multibin detectors to spectral CT. It describes how basis decomposition and energy weighting techniques can be used with multibin detectors to overcome limitations of conventional CT like beam-hardening artifacts, non-standardized CT numbers, suboptimal contrast-to-noise ratio, and contrast cancellation. Basis decomposition involves representing the linear attenuation coefficient as a combination of energy dependent basis functions, while energy weighting optimizes contrast-to-noise by assigning different weights to different energy bins. These techniques enable spectral CT to perform quantitative imaging and low-dose scanning.
Design of Quadrature Mirror Filter Bank using Particle Swarm Optimization (PSO) – IDES Editor
In this paper, the particle swarm optimization technique is used for the design of a two-channel quadrature mirror filter (QMF) bank. A new method is developed to optimize the prototype filter response in the passband, the stopband, and the overall filter bank response. The design problem is formulated as nonlinear unconstrained optimization of an objective function, which is a weighted sum of the square of the error in the passband, the stopband, and the overall filter bank response at frequency ω = 0.5π. For solving the given optimization problem, the particle swarm optimization (PSO) technique is used. Compared to conventional design techniques, the proposed method gives better performance in terms of reconstruction error, mean square error in the passband and stopband, and computational time. Various design examples are presented to illustrate the benefits provided by the proposed method.
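A generic particle swarm optimizer of the kind applied here can be sketched as follows. The QMF objective is replaced by a placeholder sphere function, and all PSO constants (inertia, acceleration coefficients, swarm size) are illustrative choices, not values from the paper:

```python
import numpy as np

def pso(objective, dim, n_particles=20, iters=100, seed=0):
    """Minimal global-best particle swarm optimizer: velocity update combines
    inertia, cognitive (personal-best) and social (global-best) terms."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))      # positions
    v = np.zeros_like(x)                            # velocities
    pbest = x.copy()
    pval = np.apply_along_axis(objective, 1, x)     # personal-best values
    gbest = pbest[pval.argmin()].copy()             # global best position
    w, c1, c2 = 0.7, 1.5, 1.5                       # assumed PSO constants
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.apply_along_axis(objective, 1, x)
        better = f < pval
        pbest[better], pval[better] = x[better], f[better]
        gbest = pbest[pval.argmin()].copy()
    return gbest, float(pval.min())

# placeholder objective standing in for the QMF error function
best, val = pso(lambda z: float((z ** 2).sum()), dim=4)
```

In the paper's setting, the objective would instead evaluate the weighted passband, stopband, and reconstruction errors for a candidate prototype filter's coefficients.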
Molecular dynamics (MD) is a very useful tool to understand various phenomena in atomistic detail. In MD, we can overcome the size- and time-scale problems by efficient parallelization. In this lecture, I’ll explain various parallelization methods of MD with some examples of GENESIS MD software optimization on Fugaku.
A Novel Blind SR Method to Improve the Spatial Resolution of Real Life Video ... – IRJET Journal
This document proposes a novel blind super resolution method to improve the spatial resolution of real-life video sequences. The key aspects of the proposed method are:
1) It estimates blur without knowing the point spread function or noise statistics using a non-uniform interpolation super resolution method and multi-scale processing.
2) It uses a cost function with fidelity and regularization terms of a Huber-Markov random field to preserve edges and fine details in the reconstructed high resolution frames.
3) It performs masking to suppress artifacts from inaccurate motions, adaptively weighting the fidelity term at each iteration for faster convergence.
The method is tested on real-life videos with complex motions, objects, and brightness changes, showing
This document discusses techniques for optimizing the behavior of nonlinear dynamical systems using cell mapping. Specifically, it addresses maximizing the size of domains of attraction in parametrized dynamical systems. Cell mapping is used to estimate domain sizes for different parameter values. Genetic algorithms, a stochastic approximation algorithm, and neural network methods are evaluated for optimizing this size function to determine optimal parameter values. The performance of the methods is illustrated using applications like the van der Pol equation.
Towards better performance: phase congruency based face recognition – TELKOMNIKA JOURNAL
Phase congruency is an edge detector and a measure of significant features in an image. It is a robust method against contrast and illumination variation. In this paper, two novel techniques are introduced for developing a low-cost human identification system based on face recognition. Firstly, the valuable phase congruency features, the gradient edges and their associated angles, are utilized separately for classifying 130 subjects taken from three face databases, with the motivation of eliminating the feature extraction phase. By doing this, the complexity can be significantly reduced. Secondly, the training process is modified when a new technique, called averaging-vectors, is developed to accelerate the training process and minimize the matching time. For broader comparison and accurate evaluation, three competitive classifiers are considered in this work: Euclidean distance (ED), cosine distance (CD), and Manhattan distance (MD). The system performance is very competitive and acceptable, and the experimental results show promising recognition rates with a reasonable matching time.
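The three classifiers compared above (ED, CD, MD) all reduce to a nearest-neighbour matcher over feature vectors. A sketch with a hypothetical two-feature gallery:

```python
import numpy as np

def classify(probe, gallery, labels, metric="euclidean"):
    """Nearest-neighbour matcher using the three distances compared in the
    paper: Euclidean (ED), cosine (CD), and Manhattan (MD)."""
    if metric == "euclidean":
        d = np.linalg.norm(gallery - probe, axis=1)
    elif metric == "cosine":
        d = 1 - gallery @ probe / (np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe))
    elif metric == "manhattan":
        d = np.abs(gallery - probe).sum(axis=1)
    else:
        raise ValueError(metric)
    return labels[int(d.argmin())]          # label of the closest gallery vector

# hypothetical gallery of per-subject feature vectors
gallery = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
labels = np.array(["a", "b", "c"])
probe = np.array([0.9, 0.1])
```

In the paper's pipeline, the feature vectors would be phase-congruency descriptors (or averaged training vectors) rather than these toy coordinates.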
DESPECKLING OF SAR IMAGES BY OPTIMIZING AVERAGED POWER SPECTRAL VALUE IN CURV... – ijistjournal
The document describes a novel algorithm for despeckling synthetic aperture radar (SAR) images using particle swarm optimization (PSO) in the curvelet domain. The algorithm first identifies homogeneous regions in the speckled image using variance calculations. It then uses PSO to optimize the thresholding of curvelet coefficients, with the objective of minimizing the average power spectral value. This provides an optimized threshold to apply curvelet-based despeckling. The proposed method is tested on standard images and shown to outperform conventional filters like median and Lee filters in reducing speckle noise.
Remote sensing image fusion using contourlet transform with sharp frequency l... – Zac Darcy
This paper addresses four different aspects of remote sensing image fusion: i) the image fusion method, ii) quality analysis of fusion results, iii) effects of the image decomposition level, and iv) the importance of image registration. First, a new contourlet-based image fusion method is presented, which is an improvement over wavelet-based fusion. This fusion method is then utilized within the main fusion process to analyze the final fusion results. The fusion framework, scheme and datasets used in the study are discussed in detail. Second, quality analysis of the fusion results is discussed using various quantitative metrics for both spatial and spectral analyses. Our results indicate that the proposed contourlet-based fusion method performs better than conventional wavelet-based fusion methods in terms of both spatial and spectral analyses. Third, we conducted an analysis of the effects of the image decomposition level and observed that a decomposition level of 3 produced better fusion results than both smaller and greater numbers of levels. Last, we created four different fusion scenarios to examine the importance of image registration. As a result, feature-based image registration using the edge features of the source images produced better fusion results than intensity-based image registration.
1. The document presents an approach to enhance the realism of synthetic images rendered by game engines. A convolutional network is trained to modify rendered images using intermediate representations from the rendering process.
2. The network is trained with an adversarial objective to provide strong supervision at multiple perceptual levels. A new strategy is proposed for sampling image patches during training to address differences in scene layout distributions between datasets.
3. The approach significantly enhances photorealism over recent image-to-image translation methods and baselines, as shown in controlled experiments. It can add realistic details like gloss, vegetation, and road textures while keeping enhancements consistent with the input image content.
2015_Reduced-Complexity Super-Resolution DOA Estimation with Unknown Number o... – Mohamed Mubeen S
The document presents a novel technique for super-resolution direction-of-arrival (DOA) estimation when the number of sources is unknown. The technique formulates an optimization problem to minimize beamformer output power while constraining the weight vector norm, making it insensitive to the estimated number of sources. This provides resolution comparable to super-resolution techniques like MUSIC but with significantly lower computational cost, as it requires solving a generalized eigenvalue problem only once rather than for each scan direction. Analysis shows the technique works similarly to the minimum-norm algorithm while avoiding dependence on the estimated model order. Simulation results demonstrate it outperforms using model order estimation with subspace-based techniques.
This document summarizes a research paper on using discrete wavelet transform for medical image retrieval. It discusses extracting texture features like energy, entropy, contrast and correlation from images using DWT. Haar wavelet is used to analyze texture features. The texture features of images in a database are calculated and compared to an input image to retrieve similar images from the database. Local binary patterns are also extracted as features for classification and retrieval of medical images.
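The Haar-based texture features mentioned above (energy and entropy of DWT subbands) can be sketched as follows; the exact feature definitions used in the paper may differ from these common ones:

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar transform returning (LL, LH, HL, HH) subbands."""
    a = img.astype(float)
    rows_lo = (a[:, 0::2] + a[:, 1::2]) / 2     # horizontal average
    rows_hi = (a[:, 0::2] - a[:, 1::2]) / 2     # horizontal difference
    ll = (rows_lo[0::2] + rows_lo[1::2]) / 2    # average of averages
    lh = (rows_lo[0::2] - rows_lo[1::2]) / 2    # vertical detail
    hl = (rows_hi[0::2] + rows_hi[1::2]) / 2    # horizontal detail
    hh = (rows_hi[0::2] - rows_hi[1::2]) / 2    # diagonal detail
    return ll, lh, hl, hh

def texture_features(subband, eps=1e-12):
    """Energy and Shannon entropy of a subband (illustrative definitions)."""
    energy = float(np.mean(subband ** 2))
    p = np.abs(subband).ravel()
    p = p / (p.sum() + eps)                     # normalize to a distribution
    entropy = float(-(p * np.log2(p + eps)).sum())
    return energy, entropy

img = np.tile([[10.0, 20.0], [30.0, 40.0]], (4, 4))   # 8x8 periodic test pattern
ll, lh, hl, hh = haar_dwt2(img)
```

Retrieval then ranks database images by the distance between their feature vectors (energy, entropy, and similar statistics per subband) and those of the query image.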
Google Research Siggraph Whitepaper | Total Relighting: Learning to Relight P... – Alejandro Franceschi
Google Research Siggraph Whitepaper | Total Relighting: Learning to Relight Portraits for Background Replacement
Abstract:
Given a portrait and an arbitrary high dynamic range lighting environment, our framework uses machine learning to composite the subject into a new scene, while accurately modeling their appearance in the target illumination condition. We estimate a high quality alpha matte, foreground element, albedo map, and surface normals, and we propose a novel, per-pixel lighting representation within a deep learning framework.
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academician, Field Engineers, Scholars and Students of related fields of Engineering and Technology
A Comparative Study on Adaptive Lifting Based Scheme and Interact... – ijma
An adaptive update lifting scheme based on the Interactive Artificial Bee Colony (IABC) algorithm is proposed in this paper. Wavelet transform based compression is used for images and multimedia files. Approximation and detail coefficients are extracted from the signal by filtering in the wavelet transform; to increase frequency resolution, both approximation and detail coefficients are re-decomposed up to some level. The artificial bee colony algorithm finds, by local search, different update coefficients, and improves the quality of the compressed image by choosing the optimally best update coefficient. In IABC, the affection between employed bees and onlooker bees is modeled using the concept of universal gravitation. By passing different values to the control parameter, the universal gravitation involved in IABC has a single onlooker bee and varying numbers of employed bees. As a result, compared with existing image compression schemes such as the wavelet transform and the Artificial Bee Colony algorithm, the proposed work gives better PSNR.
The document discusses optimizing rate allocation for compressing hyperspectral images using JPEG2000. It proposes using the discrete wavelet transform instead of the Karhunen-Loeve transform for decorrelation due to lower computational complexity. A mixed model is used for rate distortion optimal bit allocation instead of experimentally obtained rate distortion curves. Comparisons show the mixed model approach results in lower mean squared error than traditional bit allocation schemes, while having lower implementation complexity than prior methods.
This document analyzes and compares the performance of different adaptive beamforming techniques for smart antennas. It describes switched beamforming, which uses fixed beams, and adaptive beamforming techniques that use algorithms like LMS, SMI, RLS, CGA, CMA, and LSCMA to form an adaptive beam. It simulates these algorithms using MATLAB for a uniform linear array and compares their ability to direct the main beam towards the desired user while nulling interference. The LMS, SMI, RLS, CMA and LSCMA algorithms are found to perform well at directing the beam to the desired user, with LMS having good interference rejection and RLS providing the fastest convergence but highest computational cost.
Particle Swarm Optimization with Constriction Factor and Inertia Weight Appro... – IDES Editor
In this paper, an evolutionary optimization technique, Particle Swarm Optimization with Constriction Factor and Inertia Weight Approach (PSOCFIWA), is adopted for the complex synthesis of three-ring Concentric Circular Antenna Arrays (CCAA) with non-isotropic elements, both without and with central element feeding. It is shown that by selecting a fitness function that controls more than one parameter of the array pattern, and by properly setting the weight factors in the fitness function, one can achieve very good results. For each optimal design, optimal current excitation weights and optimal radii are determined with the objective of maximum Sidelobe Level (SLL) reduction. The extensive computational results show that the CCAA designs having central element feeding with non-isotropic elements yield much more reduction in SLL than the same designs without central element feeding. Moreover, the particular CCAA containing 4, 6 and 8 elements in three successive rings along with central element feeding yields the grand minimum SLL (-46.4 dB). Standard Particle Swarm Optimization (PSO) is adopted to compare against the results of the PSOCFIWA algorithm.
Application of Bio-Inspired Optimization Technique for Finding the Optimal se... – IDES Editor
In this paper, the maximum sidelobe level (SLL) reductions of three-ring concentric circular antenna arrays (CCAA), without and with central element feeding, are examined using two different classes of evolutionary optimization techniques, to finally determine the globally optimal three-ring CCAA design. Apart from the physical construction of a CCAA, one may broadly classify its designs into two major categories: uniformly excited arrays and non-uniformly excited arrays. The present paper assumes non-uniform excitations and uniform spacing of excitation elements in each three-ring CCAA design, with a design goal of maximizing SLL reduction along with optimal beam patterns and beam widths. The design problem is modeled as an optimization problem for each CCAA design. A Binary coded Genetic Algorithm (BGA) and Bacteria Foraging Optimization (BFO) are used to determine an optimum set of normalized excitation weights for the CCAA elements which, when incorporated, results in a radiation pattern with optimal (maximum) SLL reduction. Among the various CCAA designs, the three-ring CCAA containing (N1 = 4, N2 = 6, N3 = 8) elements along with central element feeding proves to be the globally optimal design. BFO yields the global minimum SLL (-34.18 dB) and the global minimum BWFN (81.5°) for the optimal design.
Noise resistance territorial intensity-based optical flow using inverse confi...journalBEEI
This paper presents the use of the inverse confidential technique on bilateral function with the territorial intensity-based optical flow to prove the effectiveness in noise resistance environment. In general, the image’s motion vector is coded by the technique called optical flow where the sequences of the image are used to determine the motion vector. But, the accuracy rate of the motion vector is reduced when the source of image sequences is interfered by noises. This work proved that the inverse confidential technique on bilateral function can increase the percentage of accuracy in the motion vector determination by the territorial intensity-based optical flow under the noisy environment. We performed the testing with several kinds of non-Gaussian noises at several patterns of standard image sequences by analyzing the result of the motion vector in a form of the error vector magnitude (EVM) and compared it with several noise resistance techniques in territorial intensity-based optical flow method.
Fpga implementation of fusion technique for fingerprint applicationIAEME Publication
Image fusion is a process of combining relevant information from a set of images into a
single image, wherein the resultant fused image is more informative and complete than any of
the input images. This paper discusses Laplacian Pyramid (LP) based image fusion techniques for
a fingerprint application. The technique is implemented in MATLAB, and the evaluation parameters
Mean Square Error (MSE), Peak Signal-to-Noise Ratio (PSNR) and matching score are discussed.
The same technique is also implemented on a Virtex-5 FPGA development board using Verilog HDL.
The LP-based technique provides better results for image fusion than other techniques.
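The MSE and PSNR metrics this evaluation relies on have standard definitions, sketched below for 8-bit images given as flat pixel lists (the `ref`/`fused` arrays are illustrative, not the paper's fingerprint data).

```python
import math

def mse(a, b):
    """Mean squared error between two equal-size images given as flat
    lists of pixel values."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images; higher means
    the fused image is closer to the reference. Identical images give
    MSE 0 and, by convention here, infinite PSNR."""
    e = mse(a, b)
    return float('inf') if e == 0 else 10 * math.log10(peak ** 2 / e)

ref   = [100, 120, 140, 160]
fused = [102, 118, 141, 159]
# squared errors 4, 4, 1, 1 -> MSE 2.5
```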
IRJET- Lunar Image Fusion based on DT-CWT, Curvelet Transform and NSCTIRJET Journal
This document compares three image fusion techniques - Dual Tree Complex Wavelet Transform (DT-CWT), Curvelet transform, and Nonsubsampled Contourlet Transform (NSCT) - for fusing high spectral resolution lunar images from the HySI instrument and high spatial resolution images from the TMC instrument on the Chandrayaan-1 satellite. Statistical analysis of the fused images shows that the NSCT technique best preserves both the spectral information of the HySI image and the spatial information of the TMC image, with correlation coefficients over 0.99, higher entropy and average gradients, and lower root mean square errors than the other techniques. Therefore, NSCT produces the highest quality fused lunar images according to these evaluation metrics.
This document discusses the application of photon-counting multibin detectors to spectral CT. It describes how basis decomposition and energy weighting techniques can be used with multibin detectors to overcome limitations of conventional CT such as beam-hardening artifacts, non-standardized CT numbers, suboptimal contrast-to-noise ratio, and contrast cancellation. Basis decomposition involves representing the linear attenuation coefficient as a combination of energy-dependent basis functions, while energy weighting optimizes contrast-to-noise by assigning different weights to different energy bins. These techniques enable spectral CT to perform quantitative imaging and low-dose scanning.
Design of Quadrature Mirror Filter Bank using Particle Swarm Optimization (PSO)IDES Editor
In this paper, the particle swarm optimization
technique is used for the design of a two-channel quadrature
mirror filter (QMF) bank. A new method is developed to
optimize the prototype filter response in the passband, the stopband
and the overall filter bank response. The design problem is
formulated as a nonlinear unconstrained optimization of an
objective function, which is a weighted sum of the squared errors
in the passband, the stopband, and the overall filter bank response at
the frequency ω = 0.5π. For solving the given optimization
problem, the particle swarm optimization (PSO) technique
is used. Compared to conventional design techniques,
the proposed method gives better performance in terms of
reconstruction error, mean square error in the passband and
stopband, and computational time. Various design examples
are presented to illustrate the benefits provided by the
proposed method.
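The optimization loop the abstract describes can be sketched with a compact standard PSO. The objective below is a stand-in: a weighted sum of squared deviations with assumed targets and weights, standing in for the paper's weighted passband/stopband/ω = 0.5π error measure; the swarm parameters are common defaults, not the paper's settings.

```python
import random

random.seed(0)  # reproducible run

def objective(x):
    """Stand-in for the weighted error measure described above: a
    weighted sum of squared deviations of the coefficients x from an
    assumed ideal response (targets and weights are illustrative)."""
    ideal = [0.5, 0.3, 0.1]
    weights = [1.0, 2.0, 1.0]
    return sum(w * (xi - ti) ** 2 for w, xi, ti in zip(weights, x, ideal))

def pso(obj, dim=3, swarm=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Textbook PSO: each particle is pulled toward its personal best
    and the swarm's global best, with inertia weight w."""
    pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=obj)
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if obj(pos[i]) < obj(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=obj)
    return gbest

best = pso(objective)  # converges near the assumed ideal coefficients
```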
Molecular dynamics (MD) is a very useful tool to understand various phenomena in atomistic detail. In MD, we can overcome the size- and time-scale problems by efficient parallelization. In this lecture, I’ll explain various parallelization methods of MD with some examples of GENESIS MD software optimization on Fugaku.
A Novel Blind SR Method to Improve the Spatial Resolution of Real Life Video ...IRJET Journal
This document proposes a novel blind super resolution method to improve the spatial resolution of real-life video sequences. The key aspects of the proposed method are:
1) It estimates blur without knowing the point spread function or noise statistics using a non-uniform interpolation super resolution method and multi-scale processing.
2) It uses a cost function with fidelity and regularization terms of a Huber-Markov random field to preserve edges and fine details in the reconstructed high resolution frames.
3) It performs masking to suppress artifacts from inaccurate motions, adaptively weighting the fidelity term at each iteration for faster convergence.
The method is tested on real-life videos with complex motions, objects, and brightness changes, showing
This document discusses techniques for optimizing the behavior of nonlinear dynamical systems using cell mapping. Specifically, it addresses maximizing the size of domains of attraction in parametrized dynamical systems. Cell mapping is used to estimate domain sizes for different parameter values. Genetic algorithms, a stochastic approximation algorithm, and neural network methods are evaluated for optimizing this size function to determine optimal parameter values. The performance of the methods is illustrated using applications like the van der Pol equation.
Towards better performance: phase congruency based face recognitionTELKOMNIKA JOURNAL
Phase congruency is an edge detector and a measurement of significant features in an image. It is robust against contrast and illumination variation. In this paper, two novel techniques are introduced for developing a low-cost human identification system based on face recognition. First, the valuable phase congruency features, the gradient edges and their associated angles, are used separately for classifying 130 subjects taken from three face databases, with the motivation of eliminating the feature extraction phase. By doing this, the complexity can be significantly reduced. Second, the training process is modified: a new technique, called averaging-vectors, is developed to accelerate training and minimize the matching time. For broader comparison and accurate evaluation, three competitive classifiers are considered in this work: Euclidean distance (ED), cosine distance (CD), and Manhattan distance (MD). The system performance is very competitive, and the experimental results show promising recognition rates with a reasonable matching time.
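The three classifiers compared above are all nearest-neighbour matchers that differ only in their distance function. A minimal sketch, with a toy two-subject gallery (the feature vectors and labels are illustrative, not the paper's phase-congruency features):

```python
import math

def euclidean(a, b):
    """ED: straight-line distance between feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cosine_distance(a, b):
    """CD: one minus the cosine of the angle between the vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

def manhattan(a, b):
    """MD: sum of absolute coordinate differences."""
    return sum(abs(x - y) for x, y in zip(a, b))

def classify(probe, gallery, dist):
    """Nearest-neighbour matching: return the label of the gallery
    template whose feature vector is closest to the probe."""
    return min(gallery, key=lambda item: dist(probe, item[1]))[0]

gallery = [("subject_A", [1.0, 0.0, 0.2]),
           ("subject_B", [0.1, 0.9, 0.8])]
probe = [0.9, 0.1, 0.25]  # noisy re-capture of subject_A
```

With the averaging-vectors idea, each gallery entry would hold the mean of a subject's training vectors, shrinking the gallery and hence the matching time.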
DESPECKLING OF SAR IMAGES BY OPTIMIZING AVERAGED POWER SPECTRAL VALUE IN CURV...ijistjournal
The document describes a novel algorithm for despeckling synthetic aperture radar (SAR) images using particle swarm optimization (PSO) in the curvelet domain. The algorithm first identifies homogeneous regions in the speckled image using variance calculations. It then uses PSO to optimize the thresholding of curvelet coefficients, with the objective of minimizing the average power spectral value. This provides an optimized threshold to apply curvelet-based despeckling. The proposed method is tested on standard images and shown to outperform conventional filters like median and Lee filters in reducing speckle noise.
Remote sensing image fusion using contourlet transform with sharp frequency l...Zac Darcy
This paper addresses four different aspects of remote sensing image fusion: i) the image fusion method, ii)
quality analysis of the fusion results, iii) effects of the image decomposition level, and iv) the importance of image
registration. First, a new contourlet-based image fusion method is presented, which is an improvement
over wavelet-based fusion. This fusion method is then utilized within the main fusion process to analyze
the final fusion results. The fusion framework, scheme and datasets used in the study are discussed in detail.
Second, quality analysis of the fusion results is discussed using various quantitative metrics for both spatial
and spectral analyses. Our results indicate that the proposed contourlet-based fusion method performs
better than conventional wavelet-based fusion methods in terms of both spatial and spectral analyses.
Third, we conducted an analysis of the effects of the image decomposition level and observed that a
decomposition level of 3 produced better fusion results than both smaller and greater numbers of levels.
Last, we created four different fusion scenarios to examine the importance of image registration. As a
result, feature-based image registration using the edge features of the source images produced better
fusion results than intensity-based image registration.
1. The document presents an approach to enhance the realism of synthetic images rendered by game engines. A convolutional network is trained to modify rendered images using intermediate representations from the rendering process.
2. The network is trained with an adversarial objective to provide strong supervision at multiple perceptual levels. A new strategy is proposed for sampling image patches during training to address differences in scene layout distributions between datasets.
3. The approach significantly enhances photorealism over recent image-to-image translation methods and baselines, as shown in controlled experiments. It can add realistic details like gloss, vegetation, and road textures while keeping enhancements consistent with the input image content.
2015_Reduced-Complexity Super-Resolution DOA Estimation with Unknown Number o...Mohamed Mubeen S
The document presents a novel technique for super-resolution direction-of-arrival (DOA) estimation when the number of sources is unknown. The technique formulates an optimization problem to minimize beamformer output power while constraining the weight vector norm, making it insensitive to the estimated number of sources. This provides resolution comparable to super-resolution techniques like MUSIC but with significantly lower computational cost, as it requires solving a generalized eigenvalue problem only once rather than for each scan direction. Analysis shows the technique works similarly to the minimum-norm algorithm while avoiding dependence on the estimated model order. Simulation results demonstrate it outperforms using model order estimation with subspace-based techniques.
This document summarizes a research paper on using discrete wavelet transform for medical image retrieval. It discusses extracting texture features like energy, entropy, contrast and correlation from images using DWT. Haar wavelet is used to analyze texture features. The texture features of images in a database are calculated and compared to an input image to retrieve similar images from the database. Local binary patterns are also extracted as features for classification and retrieval of medical images.
Google Research Siggraph Whitepaper | Total Relighting: Learning to Relight P...Alejandro Franceschi
Google Research Siggraph Whitepaper | Total Relighting: Learning to Relight Portraits for Background Replacement
Abstract:
Given a portrait and an arbitrary high dynamic range lighting environment, our framework uses machine learning to composite the subject into a new scene, while accurately modeling their appearance in the target illumination condition. We estimate a high quality alpha matte, foreground element, albedo map, and surface normals, and we propose a novel, per-pixel lighting representation within a deep learning framework.
IJRET : International Journal of Research in Engineering and Technology is an international peer reviewed, online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together Scientists, Academicians, Field Engineers, Scholars and Students of related fields of Engineering and Technology.
A COMPARATIVE STUDY ON ADAPTIVE LIFTING BASED SCHEME AND INTERACT...ijma
An adaptive update lifting scheme based on the Interactive Artificial Bee Colony (IABC) algorithm is proposed in this paper. Wavelet transform based compression is used for images and multimedia files. Approximation and detail coefficients are extracted from the signal by filtering in the wavelet transform; to increase frequency resolution, both the approximation and detail coefficients are re-decomposed up to some level. The artificial bee colony algorithm, by local search, finds different update coefficients and improves the quality of the compressed image by choosing the optimal update coefficient. In IABC, the affection between the employed bees and the onlooker bees is modeled using the concept of universal gravitation. By passing different values of the control parameter, the universal gravitation involved in IABC has a single onlooker bee and a variety of quantities of employed bees. As a result, compared with existing image compression schemes such as the wavelet transform and the Artificial Bee Colony algorithm, the proposed work gives better PSNR.
The document discusses optimizing rate allocation for compressing hyperspectral images using JPEG2000. It proposes using the discrete wavelet transform instead of the Karhunen-Loeve transform for decorrelation due to lower computational complexity. A mixed model is used for rate distortion optimal bit allocation instead of experimentally obtained rate distortion curves. Comparisons show the mixed model approach results in lower mean squared error than traditional bit allocation schemes, while having lower implementation complexity than prior methods.
This document analyzes and compares the performance of different adaptive beamforming techniques for smart antennas. It describes switched beamforming, which uses fixed beams, and adaptive beamforming techniques that use algorithms like LMS, SMI, RLS, CGA, CMA, and LSCMA to form an adaptive beam. It simulates these algorithms using MATLAB for a uniform linear array and compares their ability to direct the main beam towards the desired user while nulling interference. The LMS, SMI, RLS, CMA and LSCMA algorithms are found to perform well at directing the beam to the desired user, with LMS having good interference rejection and RLS providing the fastest convergence but highest computational cost.
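Of the algorithms compared above, LMS has the simplest update rule: the weights are nudged along the conjugate error gradient so the array output tracks a known training signal. A minimal sketch for a uniform linear array follows; the angles, waveforms, and step size are illustrative assumptions, not the document's simulation settings.

```python
import cmath
import math

def steering(n_elems, theta, d=0.5):
    """ULA steering vector for arrival angle theta (radians), element
    spacing d in wavelengths."""
    return [cmath.exp(-2j * math.pi * d * k * math.sin(theta))
            for k in range(n_elems)]

def lms_beamform(snapshots, desired, n_elems, mu=0.05):
    """LMS: w(n+1) = w(n) + mu * x(n) * e*(n), with output y = w^H x
    and error e = d - y."""
    w = [0j] * n_elems
    for x, d_n in zip(snapshots, desired):
        y = sum(wi.conjugate() * xi for wi, xi in zip(w, x))
        e = d_n - y
        w = [wi + mu * xi * e.conjugate() for wi, xi in zip(w, x)]
    return w

n = 4
a_sig = steering(n, 0.0)                # desired user at broadside
a_int = steering(n, math.radians(40))   # interferer at 40 degrees
snaps, train = [], []
for t in range(500):
    s = cmath.exp(1j * 0.3 * t)         # desired waveform (known pilot)
    i = cmath.exp(1j * 1.1 * t)         # interference waveform
    snaps.append([s * ai + i * bi for ai, bi in zip(a_sig, a_int)])
    train.append(s)

w = lms_beamform(snaps, train, n)
gain_sig = abs(sum(wi.conjugate() * ai for wi, ai in zip(w, a_sig)))
gain_int = abs(sum(wi.conjugate() * bi for wi, bi in zip(w, a_int)))
# gain_sig approaches 1 (desired user passed) while gain_int is driven
# toward 0 (interferer nulled), which is the behavior compared above
```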
Particle Swarm Optimization with Constriction Factor and Inertia Weight Appro...IDES Editor
In this paper, an evolutionary optimization
technique, Particle Swarm Optimization with Constriction
Factor and Inertia Weight Approach (PSOCFIWA), is adopted
for the complex synthesis of three-ring Concentric Circular
Antenna Arrays (CCAA) with non-isotropic elements, both
without and with central element feeding. It is shown that by
selecting a fitness function which controls more than one
parameter of the array pattern, and by proper setting of the
weight factors in the fitness function, one can achieve very good
results. For each optimal design, optimal current excitation
weights and optimal radii are determined with the objective
of maximum Sidelobe Level (SLL) reduction. The extensive
computational results show that the CCAA designs having
central element feeding with non-isotropic elements yield
much more reduction in SLL as compared to the corresponding
designs without central element feeding. Moreover, the particular
CCAA containing 4, 6 and 8 elements in three
successive rings along with central element feeding yields the
grand minimum SLL (-46.4 dB). Standard Particle Swarm
Optimization (PSO) is adopted to compare the results of the
PSOCFIWA algorithm.
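The constriction factor that distinguishes PSOCFIWA from standard PSO is Clerc's χ. A sketch of χ and of one velocity update combining it with an inertia weight follows; the exact way the paper combines the two is assumed here, not quoted from it, and the numeric arguments are illustrative.

```python
import math

def constriction_factor(c1=2.05, c2=2.05):
    """Clerc's constriction factor chi = 2 / |2 - phi - sqrt(phi^2 - 4*phi)|
    with phi = c1 + c2 > 4; it damps the velocities so the swarm
    converges without an explicit velocity clamp."""
    phi = c1 + c2
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

def velocity_update(v, x, pbest, gbest, w, chi, c1, c2, r1, r2):
    """One assumed PSOCFIWA-style update: the inertia-weighted velocity
    plus the cognitive and social pulls, all scaled by chi."""
    return chi * (w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))

chi = constriction_factor()  # about 0.7298 for the canonical c1 = c2 = 2.05
v_next = velocity_update(v=0.1, x=1.0, pbest=0.8, gbest=0.5,
                         w=0.9, chi=chi, c1=2.05, c2=2.05, r1=0.5, r2=0.5)
```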
Application of Bio-Inspired Optimization Technique for Finding the Optimal se...IDES Editor
In this paper, the maximum sidelobe level (SLL) reductions
of three-ring concentric circular antenna arrays
(CCAA) without and with central element feeding are examined
using two different classes of evolutionary optimization
techniques to finally determine the global optimal three-ring
CCAA design. Apart from the physical construction of a CCAA,
one may broadly classify its design into two major categories:
uniformly excited arrays and non-uniformly excited arrays.
The present paper assumes non-uniform excitations and uniform
spacing of excitation elements in each three-ring CCAA
design, with a design goal of maximizing SLL reduction together
with optimal beam patterns and beam widths. The design
problem is modeled as an optimization problem for each CCAA
design. Binary-coded Genetic Algorithm (BGA) and Bacteria
Foraging Optimization (BFO) are used to determine an optimum
set of normalized excitation weights for the CCAA elements
which, when incorporated, results in a radiation pattern with
optimal (maximum) SLL reduction. Among the various CCAA
designs, the three-ring CCAA containing (N1=4, N2=6, N3=8)
elements along with central element feeding proves to be the global
optimal design. BFO yields the global minimum SLL (-34.18
dB) and the global minimum BWFN (81.5°) for this optimal design.
Implementation of Digital Beamforming Technique for Linear Antenna Arraysijsrd.com
A digital beamforming technique is used to increase channel capacity and the signal-to-noise-and-interference ratio. In a smart antenna, the radiation pattern can be changed either by selecting appropriate weights or by changing the array geometry. This paper is based on an auxiliary phase algorithm: applying it to a linear antenna array, the array pattern is determined by approximating the auxiliary function in both amplitude and phase. A cost function involving the auxiliary function and the array pattern is minimized by modifying the pattern.
In recent years, developments in communication systems have required low-cost, lightweight, low-profile antennas that are capable of maintaining high performance over a wide spectrum of frequencies. This technological trend has focused much effort on the design of microstrip patch antennas. Nowadays, evolutionary computation has grown considerably. Electromagnetic optimization problems generally involve a large number of parameters. Synthesis of non-uniform linear antenna arrays is one of the most important electromagnetic optimization problems of current interest.
ARRAY FACTOR OPTIMIZATION OF AN ACTIVE PLANAR PHASED ARRAY USING EVOLUTIONARY...jantjournal
Evolutionary algorithms (EAs) have the potential to handle complex, multi-dimensional optimization problems in the field of phased arrays. Among the different EAs, particle swarm optimization (PSO) is a popular choice. In a phased array, antenna element failure is a common phenomenon, and it leads to degradation of the array factor (AF) pattern, primarily in terms of increased side lobe levels (SLLs), displacement of nulls, and reduction in the null depths. The recovery of a degraded pattern using a cost- and time-effective approach is in demand. In this context, an attempt is made to obtain an optimized AF pattern after a fault in a 49-element quasi-circular-aperture equilateral-triangular-grid active planar phased array using PSO. In the paper, multiple recovery cases with up to 20% element failure are discussed. Each recovery is further evaluated by different statistical analyses. A dedicated software tool was developed to carry out the work presented in this paper.
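The degradation mechanism the abstract describes can be illustrated numerically: zeroing one element's amplitude raises the sidelobe peak that PSO is then asked to repair by re-weighting the survivors. The sketch below uses a one-dimensional uniformly spaced line of elements as a simplified stand-in for one row of the planar grid; the element count, spacing, and first-null angle are illustrative assumptions.

```python
import cmath
import math

def array_factor(amps, phi, d=0.5):
    """Far-field array factor magnitude of a uniformly spaced line of
    elements; a failed element is modeled by forcing its amplitude to
    zero, which is what distorts the sidelobe structure."""
    return abs(sum(a * cmath.exp(2j * math.pi * d * n * math.sin(phi))
                   for n, a in enumerate(amps)))

def peak_sidelobe(amps, first_null_deg=15.0):
    """Peak of the broadside-normalized pattern outside the main beam,
    scanned on a 1-degree grid beyond an assumed first-null angle."""
    peak = array_factor(amps, 0.0)
    return max(array_factor(amps, math.radians(p)) / peak
               for p in range(int(first_null_deg), 91))

healthy = [1.0] * 8
failed  = [1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0]  # element 3 dead
sll_healthy = peak_sidelobe(healthy)  # about -13 dB for a uniform array
sll_failed = peak_sidelobe(failed)    # strictly worse than sll_healthy
```

A PSO recovery run would treat the seven surviving amplitudes as the particle coordinates and `peak_sidelobe` (plus null-position terms) as the cost.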
ARRAY FACTOR OPTIMIZATION OF AN ACTIVE PLANAR PHASED ARRAY USING EVOLUTIONARY...jantjournal
This document discusses using particle swarm optimization (PSO) to optimize the array factor pattern of an active planar phased array after antenna element failures. It presents results from optimizing the pattern of a 49-element quasi-circular array with up to 20% element failure. PSO is used to calculate a new set of input amplitudes for the functional elements that recover the original pattern levels for the main beam, first side lobes, and first nulls. The results show PSO can effectively recover the pattern in cases with 1, 3, 5, and 10 non-functioning elements.
Ill-posedness formulation of the emission source localization in the radio- d...Ahmed Ammar Rebai PhD
To contact the authors : tarek.salhi@gmail.com and ahmed.rebai2@gmail.com
In the field of radio detection in astroparticle physics, many studies have shown that the solution of the radio-transient source localization problem depends strongly on the radio-shower time of arrival at the antennas, and that such solutions can be purely numerical artifacts. Based on a detailed analysis of already published results from radio-detection experiments such as CODALEMA 3 in France, AERA in Argentina and TREND in China, we demonstrate the ill-posed character of this problem in the sense of Hadamard. Two approaches are used: the degeneracy of the set of solutions and the bad conditioning of the mathematical formulation of the problem. A comparison between experimental results and simulations has been made to illustrate the mathematical analysis. Several properties of the non-linear least-squares function are discussed, such as the configuration of the set of solutions and the bias.
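A quick way to see the "bad conditioning" the abstract refers to is the condition number of the normal equations of the least-squares timing fit. The antenna layout and source positions below are made-up numbers for illustration only: with nearly collinear antennas and a distant source, the arrival directions are almost parallel, so the normal matrix is close to singular and tiny timing errors move the fitted source a long way.

```python
import numpy as np

# Hypothetical, nearly collinear antenna layout (positions in metres)
antennas = np.array([[0.0, 0.0], [100.0, 1.0], [200.0, -1.0], [300.0, 0.5]])

def jacobian(src):
    # Jacobian of the arrival-time residuals w.r.t. the 2-D source position
    # (unit propagation speed): rows are unit vectors antenna -> source.
    d = src - antennas
    return d / np.linalg.norm(d, axis=1, keepdims=True)

J_far = jacobian(np.array([150.0, 5000.0]))   # distant source, tiny opening angle
J_near = jacobian(np.array([150.0, 50.0]))    # nearby source, wide opening angle
print(np.linalg.cond(J_far.T @ J_far))        # huge => ill-conditioned fit
print(np.linalg.cond(J_near.T @ J_near))      # modest => well-conditioned fit
```

The contrast between the two condition numbers is the numerical signature of the ill-posedness discussed in the paper.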
This document summarizes research on using particle swarm optimization (PSO) to enhance the radiation pattern of a phased array antenna. It begins by introducing the problem of high sidelobes in phased array antenna patterns. It then provides background on phased array antennas and array pattern modeling. A 24-element linear array modeled in MATLAB is used as a case study. Standard PSO and a modified PSO algorithm are applied to optimize the current excitations and minimize the sidelobe levels. Simulation results show that both PSO approaches reduce sidelobes compared to uniform excitation, with the modified PSO performing better. Overall, the document presents research on using computational optimization methods to improve phased array antenna radiation patterns by controlling sidelobe levels.
Enhancing the Radiation Pattern of Phase Array Antenna Using Particle Swarm O...IOSR Journals
The document describes a study that uses particle swarm optimization to enhance the radiation pattern of a phased array antenna by minimizing sidelobe levels. It first provides background on issues with high sidelobes in phased array antennas, such as power losses and interference. It then summarizes previous research using techniques like genetic algorithms for antenna array optimization. The study models the radiation pattern of linear arrays with different element numbers and calculates gain, finding that gain increases with more elements. However, sidelobe levels also increase relatively. Therefore, the study proposes using particle swarm optimization to optimize the current excitation and control sidelobe levels while maintaining a narrow beamwidth.
Similar to Optimal Synthesis of Array Pattern for Concentric Circular Antenna Array Using Hybrid Evolutionary Programming
Power System State Estimation - A ReviewIDES Editor
This document provides a review of power system state estimation techniques. It discusses both static and dynamic state estimation algorithms. For static state estimation, it covers weighted least squares, decoupled, and robust estimation methods. Weighted least squares is commonly used but can have numerical instability issues. Decoupled state estimation approximates the gain matrix for faster computation. Robust estimation uses M-estimators and other techniques to handle outliers and bad data. Dynamic state estimation applies Kalman filtering, leapfrog algorithms, and other methods to continuously monitor system states over time.
Artificial Intelligence Technique based Reactive Power Planning Incorporating...IDES Editor
This document summarizes a research paper that proposes using artificial intelligence techniques and FACTS controllers for reactive power planning in real-time power transmission systems. The paper formulates the reactive power planning problem and incorporates flexible AC transmission system (FACTS) devices like static VAR compensators (SVC), thyristor controlled series capacitors (TCSC), and unified power flow controllers (UPFC). Evolutionary algorithms like evolutionary programming (EP) and differential evolution (DE) are applied to find the optimal locations and settings of the FACTS controllers to minimize losses and costs. Simulation results on IEEE 30-bus and 72-bus Indian test systems show that UPFC performs best in reducing losses compared to SVC and TCSC.
Design and Performance Analysis of Genetic based PID-PSS with SVC in a Multi-...IDES Editor
Damping of power system oscillations with the help of the proposed optimal Proportional Integral Derivative Power System Stabilizer (PID-PSS) and Static Var Compensator (SVC)-based controllers is thoroughly investigated in this paper. This study presents robust tuning of PID-PSS and SVC-based controllers using Genetic Algorithms (GA) in multi-machine power systems, considering a detailed model of the generators (model 1.1). The effectiveness of FACTS-based controllers in general, and the SVC-based controller in particular, depends upon their proper location. Modal controllability and observability are used to locate the SVC-based controller. The performance of the proposed controllers is compared with the conventional lead-lag power system stabilizer (CPSS) and demonstrated on the 10-machine, 39-bus New England test system. Simulation studies show that the proposed genetic-based PID-PSS with SVC-based controller provides better performance.
Optimal Placement of DG for Loss Reduction and Voltage Sag Mitigation in Radi...IDES Editor
The need to operate the power system economically and with optimum voltage levels has led to an increased interest in Distributed Generation. In order to reduce power losses and improve voltage in the distribution system, distributed generators (DGs) are connected to load buses. To reduce the total power losses in the system, the most important step is to identify the proper locations and sizes of the DGs. This paper presents a new methodology using a population-based metaheuristic, the Artificial Bee Colony (ABC) algorithm, for the placement of DGs in radial distribution systems to reduce real power losses, improve the voltage profile and mitigate voltage sags. Power loss reduction is an important factor for utility companies because it is directly proportional to company benefits in a competitive electricity market, while meeting power quality standards is equally important owing to its vital effect on customer orientation. In this paper an ABC algorithm is developed to achieve these goals together. In order to evaluate the sag-mitigation capability of the proposed algorithm, the voltage at voltage-sensitive buses is investigated. An existing 20 kV network has been chosen as the test network, and the results obtained with the proposed method are reported for the radial distribution system.
Line Losses in the 14-Bus Power System Network using UPFCIDES Editor
Controlling power flow in modern power systems can be made more flexible by the use of recent developments in power electronics and computing control technology. The Unified Power Flow Controller (UPFC) is a Flexible AC Transmission System (FACTS) device that can control all three system variables, namely line reactance, and the magnitude and phase-angle difference of the voltage across the line. The UPFC provides a promising means to control power flow in modern power systems. Essentially, the performance depends on proper control settings, achievable through a power flow analysis program. This paper presents a reliable method to meet these requirements by developing a Newton-Raphson based load flow calculation through which the control settings of the UPFC can be determined for a pre-specified power flow between the lines. The proposed method keeps the Newton-Raphson Load Flow (NRLF) algorithm intact and needs only a little modification in the Jacobian matrix. A MATLAB program has been developed to calculate the control settings of the UPFC and the power flow between the lines after the load flow has converged. Case studies have been performed on the IEEE 5-bus and 14-bus systems to show that the proposed method is effective. These studies indicate that the method maintains the basic NRLF properties such as fast computational speed, high accuracy and a good convergence rate.
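The core of such a load-flow program is the Newton-Raphson iteration on the power-mismatch equations: compute mismatches, solve the Jacobian system, and update the load-bus voltage angle and magnitude. The minimal two-bus example below (no UPFC, made-up per-unit values, not the paper's IEEE systems) shows just those mechanics:

```python
import numpy as np

def nr_loadflow(Pspec=-0.8, Qspec=-0.4, B=10.0, tol=1e-10, itmax=20):
    """Two-bus Newton-Raphson load flow: slack bus 1 at 1.0 p.u., 0 rad;
    load bus 2 connected by a lossless line of susceptance B = 1/X.
    All values are illustrative per-unit quantities."""
    th, V = 0.0, 1.0                        # flat start at the load bus
    for it in range(itmax):
        Pcalc = B * V * np.sin(th)          # injected P at bus 2
        Qcalc = -B * V * np.cos(th) + B * V * V   # injected Q at bus 2
        mis = np.array([Pspec - Pcalc, Qspec - Qcalc])
        if np.max(np.abs(mis)) < tol:       # converged
            return th, V, it
        # Jacobian of [P2, Q2] with respect to [theta2, V2]
        J = np.array([[B * V * np.cos(th),            B * np.sin(th)],
                      [B * V * np.sin(th), -B * np.cos(th) + 2 * B * V]])
        dth, dV = np.linalg.solve(J, mis)
        th, V = th + dth, V + dV
    return th, V, itmax

th, V, it = nr_loadflow()
print(f"theta2 = {th:.4f} rad, V2 = {V:.4f} p.u., iterations = {it}")
```

Convergence in a handful of iterations from a flat start is the "fast computational speed and good convergence rate" the abstract refers to; a UPFC would enter through extra terms in the mismatch vector and Jacobian.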
Study of Structural Behaviour of Gravity Dam with Various Features of Gallery...IDES Editor
The size and shape of an opening in a dam causes stress concentration; it also causes stress variation in the rest of the dam cross-section. The gravity method of analysis does not consider the size of the opening or the elastic properties of the dam material. The objective of this study therefore comprises a Finite Element Method analysis that considers the size of the opening, the elastic properties of the material, and the stress distribution caused by the geometric discontinuity in the dam cross-section. Stress concentration inside the dam increases with the opening, which can result in failure of the dam; hence it is necessary to analyse large openings inside the dam. The analysis is carried out by keeping the percentage area of the opening constant while varying the size and shape of the opening. For this purpose a section of the Koyna Dam is considered. Based on geometry and loading conditions, the dam is defined as a plane-strain element in FEM, so a 2D plane-strain analysis is carried out. The results obtained are then compared with one another to determine the most efficient way of providing a large opening in a gravity dam.
Assessing Uncertainty of Pushover Analysis to Geometric ModelingIDES Editor
Pushover analysis, a popular tool for seismic performance evaluation of existing and new structures, is a nonlinear static procedure wherein monotonically increasing loads are applied to the structure until it is unable to resist further load. The strength of concrete and steel adopted for the analysis may not be the same in the real structure once constructed, and pushover analysis results are very sensitive to the material model, the geometric model, the location of plastic hinges and, in general, to the procedure followed by the analyst. In this paper an attempt has been made to assess the uncertainty in pushover analysis results by considering user-defined hinges, with the frame modelled both as a bare frame and as a frame with the slab modelled as a rigid diaphragm. The uncertain parameters considered include the strength of concrete, the strength of steel and the cover to the reinforcement, which are randomly generated and incorporated into the analysis. The results are then compared with experimental observations.
Secure Multi-Party Negotiation: An Analysis for Electronic Payments in Mobile...IDES Editor
This document summarizes and analyzes secure multi-party negotiation protocols for electronic payments in mobile computing. It presents a framework for secure multi-party decision protocols using lightweight implementations. The main focus is on synchronizing security features to avoid agreement manipulation and reduce user traffic. The paper describes negotiation between an auctioneer and bidders, showing multiparty security is better than existing systems. It analyzes the performance of encryption algorithms like ECC, XTR, and RSA for use in the multiparty negotiation protocols.
Selfish Node Isolation & Incentivation using Progressive ThresholdsIDES Editor
The problems associated with selfish nodes in MANETs are addressed by a collaborative watchdog approach, which reduces the detection time for selfish nodes and thereby improves the performance and accuracy of watchdogs [1]. Related works make use of credit-based systems, reputation-based mechanisms, and pathrater and watchdog mechanisms to detect such selfish nodes. In this paper we follow a collaborative watchdog approach that reduces the detection time for selfish nodes and also removes such nodes based on progressively assessed thresholds. The threshold gives a node a chance to stop misbehaving before it is permanently deleted from the network; the node passes through several isolation stages before it is permanently removed. A modified version of the AODV protocol is used, which allows the simulation of selfish nodes in NS2 by adding or modifying log files in the protocol.
Various OSI Layer Attacks and Countermeasure to Enhance the Performance of WS...IDES Editor
Wireless sensor networks are networks with a non-wired infrastructure and dynamic topology. In the OSI model each layer is prone to various attacks, which degrade the performance of a network. In this paper several attacks on four layers of the OSI model are discussed, and a security mechanism is described to prevent an attack in the network layer, i.e. the wormhole attack. In a wormhole attack, two or more malicious nodes create a covert channel that attracts traffic towards itself by advertising a low-latency link, and then start dropping and replaying packets in the multi-path route. This paper proposes a promiscuous-mode method to detect and isolate the malicious node during a wormhole attack using the Ad-hoc On-demand Distance Vector routing protocol (AODV) with an omnidirectional antenna. In the implemented methodology, nodes that are not participating in multi-path routing generate an alarm message during a delay, and the malicious node is then detected and isolated from the network. We also note that not only the same kinds of attacks but also the same kinds of countermeasures can appear in multiple layers. For example, misbehavior detection techniques can be applied to almost all the layers discussed.
Responsive Parameter based an AntiWorm Approach to Prevent Wormhole Attack in...IDES Editor
Recent advancements in wireless technology and its widespread deployment have brought remarkable gains in efficiency in the corporate, industrial and military sectors. The increasing popularity and usage of wireless technology is creating a need for more secure wireless ad hoc networks. This paper researches and develops a new protocol that prevents wormhole attacks on an ad hoc network. A few existing protocols detect wormhole attacks, but they require highly specialized equipment not found on most wireless devices. This paper aims to develop a defense against wormhole attacks, an anti-worm protocol based on responsive parameters, that does not require a significant amount of specialized equipment, tight clock synchronization, or any GPS dependency.
Cloud Security and Data Integrity with Client Accountability FrameworkIDES Editor
This document summarizes a proposed cloud security and data integrity framework that provides client accountability. The framework aims to address issues like lack of user control over cloud data, need for data transparency and tracking, and ensuring data integrity. It proposes using JAR (Java Archive) files for data sharing due to benefits like portability. The framework incorporates client-side verification using MD5 hashing, digital signature-based authentication of JAR files, and use of HMAC to ensure data integrity. It also uses password-based encryption of log files to keep them tamper-proof. The framework is intended to provide both accountability and security for data sharing in cloud environments.
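The integrity mechanisms the summary mentions (MD5 checksums, HMAC tags on data and logs) are standard-library material. A minimal sketch of tamper detection on a log record, with an obviously illustrative key and record, looks like this:

```python
import hmac
import hashlib

def sign(data: bytes, key: bytes) -> str:
    # Attach an HMAC-SHA256 tag so the receiver can detect tampering.
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verify(data: bytes, key: bytes, tag: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(data, key), tag)

key = b"shared-secret"                          # illustrative key only
log = b"2024-01-01 user=alice action=read"      # illustrative log record
tag = sign(log, key)
print(verify(log, key, tag))          # True: record intact
print(verify(log + b"!", key, tag))   # False: tampering detected
```

Unlike a bare MD5 checksum, the HMAC construction binds the tag to a secret key, so an attacker who can modify the JAR or log cannot simply recompute a matching tag.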
Genetic Algorithm based Layered Detection and Defense of HTTP BotnetIDES Editor
An HTTP botnet uses the HTTP protocol to create chains of bots, thereby compromising other systems. By using the HTTP protocol and port 80, attacks can not only be hidden but can also pass through the firewall without being detected. DPR-based detection leads to better analysis of botnet attacks [3]; however, it provides only probabilistic detection of the attacker and is also time-consuming and error-prone. This paper proposes a genetic-algorithm-based layered approach for detecting as well as preventing botnet attacks. The paper reviews a P2P firewall implementation, which forms the basis of filtering. Performance evaluation is done based on precision, F-value and probability. The layered approach reduces the computation and overall time requirement [7], and the genetic algorithm promises a low false-positive rate.
Enhancing Data Storage Security in Cloud Computing Through SteganographyIDES Editor
This document summarizes a research paper that proposes a method for enhancing data security in cloud computing through steganography. The method hides user data in digital images stored on cloud servers. When data needs to be accessed, it is extracted from the images. The document outlines the cloud architecture and security issues addressed. It then describes the proposed system architecture, security model, and data storage and retrieval process. Data is partitioned and hidden in multiple images to improve security. The goal is to prevent unauthorized access to user data stored on cloud servers.
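The classic way to hide bytes in an image, and a reasonable stand-in for the hiding step described above, is least-significant-bit (LSB) embedding. The sketch below is that generic technique, not the paper's partitioning scheme, and the cover image is synthetic:

```python
import numpy as np

def hide(img, data: bytes):
    """Embed data in the least-significant bits of a grayscale uint8 image
    (generic LSB steganography sketch)."""
    bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
    flat = img.ravel().copy()
    assert bits.size <= flat.size, "cover image too small for the payload"
    # Clear each carrier pixel's lowest bit, then write one payload bit into it.
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(img.shape)

def extract(img, n_bytes):
    # Read back the lowest bit of the first n_bytes * 8 pixels.
    bits = img.ravel()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (32, 32), dtype=np.uint8)   # synthetic cover image
stego = hide(cover, b"cloud")
print(extract(stego, 5))   # b'cloud'
```

Each pixel changes by at most one gray level, which is what makes the hidden data visually imperceptible; splitting the payload across multiple images, as the paper proposes, adds a further layer of protection.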
The main tasks of a Wireless Sensor Network (WSN) are data collection from its nodes and communication of this data to the base station (BS). The protocols used for communication among the WSN nodes, and between the WSN and the BS, must consider the resource constraints of the nodes: battery energy, computational capability and memory. WSN applications involve unattended operation of the network over an extended period of time, so efficient routing protocols need to be adopted to extend the lifetime of a WSN. The proposed low-power routing protocol, based on a tree-based network structure, reliably forwards the measured data towards the BS using TDMA. An energy consumption analysis of a WSN using this protocol is also carried out; the network is found to be energy efficient, with an average duty cycle of 0.7% for the WSN nodes. The OMNeT++ simulation platform with the MiXiM framework is used.
Permutation of Pixels within the Shares of Visual Cryptography using KBRP for...IDES Editor
The security of authentication for internet-based co-banking services should not be exposed to high risk. Passwords are highly vulnerable to virus attacks owing to the lack of high-end embedded security methods. For passwords to be more secure, people are generally compelled to select jumbled-up character-based passwords, which are not only less memorable but equally prone to insecurity. The use of multiple distributed shares has been studied to solve the authentication problem, using algorithms based on pixel thresholding in image processing and visual cryptography, where a subset of the shares is used to recover the original image for authentication via a correlation function [1][2]. The main disadvantage of that approach is the plain storage of the shares; moreover, one of the shares is supplied to the customer, which opens the possibility of misuse by a third party. This paper proposes a technique for scrambling the pixels within the shares by key-based random permutation (KBRP) before authentication is attempted. The total number of shares to be created depends on the multiplicity of ownership of the account. By this method, the customers' uncertainty regarding the security, storage and retrieval of their half of the shares is minimized.
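The scrambling step can be sketched as deriving a deterministic permutation from a key and applying it to the share's pixels. The code below uses a generic key-seeded permutation as a stand-in for KBRP (the cited work's exact construction differs), so the same key always scrambles and unscrambles identically:

```python
import hashlib
import numpy as np

def key_based_permutation(n, key: str):
    # Derive a deterministic permutation of n pixel indices from a text key
    # by seeding a PRNG with a hash of the key (illustrative KBRP stand-in).
    seed = int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")
    rng = np.random.default_rng(seed)
    return rng.permutation(n)

def scramble(share, key):
    flat = share.ravel()
    return flat[key_based_permutation(flat.size, key)].reshape(share.shape)

def unscramble(scrambled, key):
    # Invert the permutation: pixel i of the scrambled share came from
    # position p[i] of the original, so write it back there.
    flat = scrambled.ravel()
    p = key_based_permutation(flat.size, key)
    out = np.empty_like(flat)
    out[p] = flat
    return out.reshape(scrambled.shape)
```

Because the permutation never leaves the key holder, a stored or transmitted share reveals only a pixel multiset, not the share's spatial structure.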
This paper presents a trifocal Rotman lens design approach. The effects of focal ratio and element spacing on the performance of the Rotman lens are described. A three-beam prototype feeding a 4-element antenna array working in L-band has been simulated using the RLD v1.7 software. Simulated results show that the lens has a return loss of -12.4 dB at 1.8 GHz. The variation of beam-to-array-port phase error with changes in the focal ratio and element spacing has also been investigated.
Band Clustering for the Lossless Compression of AVIRIS Hyperspectral ImagesIDES Editor
Hyperspectral images can be efficiently compressed using a linear predictive model, for example the one used in the SLSQ algorithm. In this paper we exploit this predictive model on AVIRIS images by identifying, through an off-line approach, a common subset of bands that are not spectrally related to any other bands. These bands are not useful as prediction references for the SLSQ 3-D predictive model, and we need to encode them via other prediction strategies that consider only spatial correlation. We obtained this subset by clustering the AVIRIS bands with the clustering-by-compression approach. The main result of this paper is the list of bands, unrelated to the others, for AVIRIS images. The clustering trees obtained for AVIRIS, and the relationships among bands they depict, are also an interesting starting point for future research.
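"Clustering by compression" builds on the normalized compression distance (NCD), which can be approximated with any off-the-shelf compressor. A minimal sketch with zlib (standing in for whatever compressor the authors used, and with synthetic byte strings in place of real AVIRIS bands):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: small when x and y share structure,
    # near 1 when they are unrelated (approximated here with zlib sizes).
    cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

band_a = b"spectral band sample " * 60                     # self-similar "band"
band_b = bytes((i * 97 + 13) % 251 for i in range(1200))   # unrelated data
print(ncd(band_a, band_a), ncd(band_a, band_b))
```

Bands whose pairwise NCD stays high against every other band are exactly the "unrelated" bands the paper routes to a spatial-only predictor.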
Microelectronic Circuit Analogous to Hydrogen Bonding Network in Active Site ...IDES Editor
A microelectronic circuit of block-elements functionally analogous to two hydrogen bonding networks is investigated. The hydrogen bonding networks are extracted from the β-lactamase protein and are formed in its active site. Each hydrogen bond of the network is described in an equivalent electrical circuit by a three- or four-terminal block-element. Each block-element is coded in Matlab. Static and dynamic analyses are performed. The resulting microelectronic circuit, analogous to the hydrogen bonding network, operates as a current mirror, sine pulse source, triangular pulse source as well as a signal modulator.
Texture Unit based Monocular Real-world Scene Classification using SOM and KN...IDES Editor
In this paper a method is proposed to discriminate real-world scenes into natural and man-made scenes of similar depth. The global roughness of a scene image varies as a function of image depth: an increase in image depth leads to increased roughness in man-made scenes, whereas natural scenes exhibit smooth behavior at greater image depth. This particular arrangement of pixels in the scene structure is well captured by the local texture information at a pixel and its neighborhood. Our proposed method analyses the local texture information of a scene image using a texture unit matrix. For the final classification we use both supervised and unsupervised learning, with a K-Nearest Neighbor classifier (KNN) and a Self-Organizing Map (SOM) respectively. This technique is useful for online classification owing to its very low computational complexity.
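The "texture unit" idea (He and Wang's texture spectrum, which the texture unit matrix builds on) codes each 3x3 neighbourhood by comparing the eight neighbours with the centre pixel and reading the comparison codes as a base-3 number. A minimal sketch, with the exact neighbour ordering an assumption:

```python
import numpy as np

def texture_unit_number(patch):
    """Texture unit of a 3x3 patch: each of the 8 neighbours is coded
    0/1/2 as below/equal/above the centre, then read as a base-3 number."""
    c = patch[1, 1]
    nbrs = patch.ravel()[[0, 1, 2, 5, 8, 7, 6, 3]]   # clockwise from top-left
    codes = np.where(nbrs < c, 0, np.where(nbrs == c, 1, 2))
    return int((codes * 3 ** np.arange(8)).sum())

def texture_spectrum(img):
    # Histogram of texture unit numbers over all interior pixels: the
    # low-complexity feature vector fed to a classifier such as KNN or SOM.
    h = np.zeros(3 ** 8, dtype=int)
    H, W = img.shape
    for i in range(1, H - 1):
        for j in range(1, W - 1):
            h[texture_unit_number(img[i - 1:i + 2, j - 1:j + 2])] += 1
    return h

img = np.array([[1, 2, 1], [2, 5, 2], [1, 2, 1]])
print(texture_unit_number(img))   # 0: every neighbour is below the centre
```

Smooth regions concentrate the spectrum around the all-equal code, while rough (man-made) regions spread it out, which is the roughness cue the paper exploits.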
How information systems are built or acquired puts information, which is what they should be about, in a secondary place. Our language adapted accordingly: we no longer talk about information systems but about applications. Applications evolved in a way that breaks data into diverse fragments, tightly coupled with applications and expensive to integrate. The result is technical debt, which is repaid by taking even bigger "loans", resulting in ever-increasing technical debt. Software engineering and procurement practices work in sync with market forces to maintain this trend. This talk demonstrates how natural this situation is. The question is: can something be done to reverse the trend?
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfChart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
Skybuffer SAM4U tool for SAP license adoptionTatiana Kojar
Manage and optimize your license adoption and consumption with SAM4U, a free SAP software asset management tool for customers.
SAM4U, SAP's complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring a fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol, based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
[OReilly Superstream] Occupy the Space: A grassroots guide to engineering (an...Jason Yip
The typical problem in product engineering is not bad strategy, so much as “no strategy”. This leads to confusion, lack of motivation, and incoherent action. The next time you look for a strategy and find an empty space, instead of waiting for it to be filled, I will show you how to fill it in yourself. If you’re wrong, it forces a correction. If you’re right, it helps create focus. I’ll share how I’ve approached this in the past, both what works and lessons for what didn’t work so well.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
The Microsoft 365 Migration Tutorial For Beginner.pptxoperationspcvita
This presentation will help you understand the power of Microsoft 365. We have covered every productivity app included in Office 365, outlined common Office 365 migration scenarios, and explained how we can help you.
You can also read: https://www.systoolsgroup.com/updates/office-365-tenant-to-tenant-migration-step-by-step-complete-guide/
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU and licensing under the CCB and CCX models have been a hot topic in the HCL community since last year. As a Notes or Domino customer, you may be struggling with unexpectedly high user counts and license fees. You may be wondering how this new kind of licensing works and what benefits it brings you. Above all, you surely want to stay within your budget and save costs wherever possible. We understand that, and we want to help!
We explain how to solve common configuration problems that can cause more users to be counted than necessary, and how to identify and remove superfluous or unused accounts to save money. There are also some practices that can lead to unnecessary expense, for example using a person document instead of a mail-in for shared mailboxes. We show you such cases and their solutions. And of course we explain the new license model.
Join this webinar, in which HCL Ambassador Marc Thomas and guest speaker Franz Walder introduce you to this new world. It gives you the tools and know-how to keep an overview. You will be able to reduce your costs through an optimized Domino configuration and keep them low in the future.
These topics will be covered
- Reducing license costs by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, such as team mailboxes, functional/test users, etc.
- Practical examples and best practices to implement right away
HCL Notes and Domino License Cost Reduction in the World of DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-and-domino-license-cost-reduction-in-the-world-of-dlau/
The introduction of DLAU and the CCB & CCX licensing model caused quite a stir in the HCL community. As a Notes and Domino customer, you may have faced challenges with unexpected user counts and license costs. You probably have questions on how this new licensing approach works and how to benefit from it. Most importantly, you likely have budget constraints and want to save money where possible. Don’t worry, we can help with all of this!
We’ll show you how to fix common misconfigurations that cause higher-than-expected user counts, and how to identify accounts which you can deactivate to save money. There are also frequent patterns that can cause unnecessary cost, like using a person document instead of a mail-in for shared mailboxes. We’ll provide examples and solutions for those as well. And naturally we’ll explain the new licensing model.
Join HCL Ambassador Marc Thomas in this webinar with a special guest appearance from Franz Walder. It will give you the tools and know-how to stay on top of what is going on with Domino licensing. You will be able to lower your cost through an optimized configuration and keep it low going forward.
These topics will be covered
- Reducing license cost by finding and fixing misconfigurations and superfluous accounts
- How do CCB and CCX licenses really work?
- Understanding the DLAU tool and how to best utilize it
- Tips for common problem areas, like team mailboxes, functional/test users, etc
- Practical examples and best practices to implement right away