Rahul Soni has over 5 years of experience as an Informatica developer working on projects in marketing, finance, and other domains. He has extensive experience designing ETL mappings to extract, transform, and load data from various sources like databases, files, and applications. Some of his skills include working with Informatica PowerCenter, databases like Oracle and DB2, scripting languages like Unix, and data modeling. He provides details of his work experience and education and includes his contact information.
Capgemini Confidential
Rahul Soni
rsoni4980@gmail.com +919148250559
EXPERIENCE SUMMARY:
Around 5.3 years of experience with Informatica PowerCenter, an ETL tool, and databases.
Working as a Developer for projects in domains like Marketing and Finance.
Ability to design mappings as per requirements using different transformations.
Extensive use of transformations such as Router, Aggregator, Update Strategy, Expression, XML Generator, and Parser.
Extensively used ETL methodology for supporting data extraction, transformation, and load processing.
Extraction of source data to perform data cleansing and to load the data to the staging area.
Design of mapplets to reuse a set of transformations using the Mapplet Designer.
Design of mappings to perform Slowly Changing Dimensions (SCD1, SCD2, SCD3).
Extraction of data from flat files and performing the necessary cleansing operations.
Design of mappings to perform Change Data Capture (CDC).
Design of mappings using mapping variables and parameters.
Design of UNIX shell scripts.
Design of LLD and HLD documents based on the SRS document.
Knowledge of products (tools) such as MicroStrategy, BOXI, Complex Event Processing (CEP), Spring Book, QMetry, Jira, PowerCenter Express Edition 9.6.0, and SQL.
TECHNICAL SKILLS:
Operating Systems : Windows, Unix
Databases : Oracle, DB2, Siebel, Teradata, HPDM
ETL Tools : Informatica PowerCenter, BDE, DIAL, IICS, IDQ
Languages : SQL, T-SQL, PL/SQL
Scripting Languages & Tools : Unix shell, GitHub, Tidal, Control-M
WORK EXPERIENCE:
• Working as Senior Software Engineer at KPIT Technology.
• Worked as Consultant at Capgemini Technology Services.
• Worked as Software Engineer at ASAP Infosystem, Bangalore.
• Worked as Software Engineer at Informatica Corporation, Bangalore.
PROJECT PROFILE:
Project # 1: IDM Finance Investment Analytics
Module 1 Title : Investment & Finance Analytics (Finance)
Client : T. Rowe Price Corporation,
Role : Developer/Support
Data Base/Tools : Oracle11g/SAP/Informatica Cloud/DB2/Informatica DQ/AWS S3 & Webservices
Description:
Roles and Responsibilities:
• Involved in gathering and analysing the requirements given by the Business Analyst in the CRM tool Rally.
• Prepared ETL mapping specification documents (HLD & LLD), also called technical design documents.
• Created SQL scripts for creating tables and hierarchies and for checking record counts and mismatched records from source to target.
• Integrated data from different sources such as DB2, web applications, mainframe, Oracle, and SAP, using Informatica Data Quality to improve the quality of the data as the business expected.
• Created mappings and performed data masking using Informatica PowerCenter.
• Created AutoSys jobs in the AutoSys tool and handled their scheduling and monitoring as well.
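The source-to-target validation scripts mentioned above typically compare record counts and hunt for mismatched rows. The following is a minimal sketch of that check, using Python's sqlite3 as a stand-in database; the SRC_CUSTOMER/TGT_CUSTOMER tables and their rows are hypothetical (the real scripts ran against Oracle/DB2):

```python
import sqlite3

# Hypothetical source and target tables, with one mismatched and one missing row.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE SRC_CUSTOMER (CUST_ID INTEGER, NAME TEXT);
    CREATE TABLE TGT_CUSTOMER (CUST_ID INTEGER, NAME TEXT);
    INSERT INTO SRC_CUSTOMER VALUES (1, 'A'), (2, 'B'), (3, 'C');
    INSERT INTO TGT_CUSTOMER VALUES (1, 'A'), (2, 'X');
""")

# Record counts on both sides should match after a clean load.
src_count = con.execute("SELECT COUNT(*) FROM SRC_CUSTOMER").fetchone()[0]
tgt_count = con.execute("SELECT COUNT(*) FROM TGT_CUSTOMER").fetchone()[0]

# Source rows that are missing from, or different in, the target.
mismatches = sorted(con.execute("""
    SELECT CUST_ID, NAME FROM SRC_CUSTOMER
    EXCEPT
    SELECT CUST_ID, NAME FROM TGT_CUSTOMER
""").fetchall())

print(src_count, tgt_count, mismatches)
```

The same COUNT-plus-EXCEPT (MINUS in Oracle) pattern scales to any keyed table pair.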
Project # 2: GE AVIATION
Module 1 Title : Malaysia/Brazil Changes (Finance)
Client : GE Digital,
Role : Developer
Data Base/Tools : Oracle11g/SAP/Informatica PC/PLSQL,
Description:
Roles and Responsibilities:
• Involved in gathering and analysing the requirements given by the Business Analyst in the CRM tool Rally.
• Prepared ETL mapping specification documents (HLD & LLD), also called technical design documents.
• Created SQL scripts for creating tables and hierarchies and for checking record counts and mismatched records from source to target.
• Integrated data from different sources such as Siebel, Oracle, and SAP using Informatica BDE, covering the SSS, WEBCASH, IBS, and OAR interfaces, among others.
• Loaded data from SAP into different data warehouses using Informatica PowerCenter.
Project # 3: Customer Migration (FADS)
Module-2 Title : CHUB Migration (Finance)
Client : T-Mobile USA Corp,
Role : Developer/Unit Tester,
Data Base : Oracle11g/Teradata 15 version,
Description:
T-Mobile USA is a multinational, level-5 company in the telecom domain, working on telecom offerings such as
pre-paid and post-paid SIMs and providing services to other small telecom organisations.
We handle product, customer, and finance information and load it into the Teradata data warehouse.
Roles and Responsibilities:
• Involved in gathering and analysing the requirements given by the Business Analyst in CRM tool Jira.
• Prepared ETL mapping specification documents (HLD & LLD), also called S2TM.
• Created SQL scripts for ABC validation, checking record counts and mismatched records from source to target.
• Integrated data from different sources such as Siebel, Oracle, Samson, and Hadoop (Hive/HDFS) using Informatica BDE and Teradata.
• Occasionally created BTEQ scripts to integrate the data at the SQV level.
• Loaded data from Hadoop sources into our data warehouse using Informatica PowerCenter Big Data Edition.
Project # 4: Access Statement My Account Phase 2, 3 & 4 (EDW)
Module Title : Dad Phase3 (Finance & Risk Stream)
Client : Thomson Reuters Corp,
Role : Developer/Unit Tester,
Data Base : Oracle11g,
Description:
Access Statement Phase 2 delivered the capability for an Access Statement manager to go into the My Account
Access Statement Portal to make a declaration of the accesses being consumed. The My Account UI capability is
scalable for small to medium user-base customers. However, for large SCS accounts (e.g. UBS, Barclays, Credit
Suisse, JP Morgan), there would be one global market data manager or a central team responsible for ensuring
the declarations are made for the location accounts. UBS, for example, would have 49 location accounts and
about 50 billing accounts in 25 countries. For these accounts, an ASM making a declaration via the UI is not
scalable. The point is perfectly illustrated by the below Access Statement for JP Morgan in the US. To complete
the declaration via the UI, the customer would have to click on 192 tabs, making individual updates to their
access counts for each of the services that they subscribe to.
Roles and Responsibilities:
• Involved in gathering and analysing the requirements given by the Business Analyst in CRM tool Jira.
• Involved in ETL mapping specification documents like HLD & LLD.
• Mainly involved in developing ETL by creating mappings and transformations using Informatica PowerCenter.
• Loaded data from PeopleSoft (PSFT) into a staging area using transformations, and pulled all the data from
staging to the target data warehouse.
• Optimized and tuned mappings to achieve better performance and efficiency
• Extensively used transformations like Filter, Aggregation, Joiner, Expression, Lookup and Router
• Thoroughly involved in designing mapplets and reusable transformations according to business needs.
Module-3:
Module Title : Dad Phase 2. (Finance & Risk Stream)
Client : Thomson Reuters Corp,
Role : Developer/Unit Tester,
Data Base : Oracle11g,
Description:
Thomson Reuters is a product-based company. We manage all the product information and services for TR
products in the EDW data warehouse, taking data from different sources such as flat files, Siebel, SAP, and
SFDC, applying mapping logic depending on business requirements, and loading the data into our data warehouse.
Project # 5: Finance Analytics (EDM):
Module Title : Account Receivable & Account Payable.(Finance Stream)
Client : Informatica Corp,
Role : Developer/Tester,
Data Base : Oracle11g,
Description:
“Informatica Business Solution” is a product development company designing a data warehouse. The Account
Receivable & Account Payable project was developed using Informatica as the ETL tool (Informatica 9.6.1) and
Oracle 11g to load the data. They needed a centralized database to store their product-wise information, customer
information, and company revenue information.
Roles and Responsibilities:
Involved in requirements gathering and analysis, based on requirements given by the Business Analyst in CRM tools.
Involved in ETL mapping specification documents.
• Designed mappings to retrieve the business unit, invoice number, customer name, receivable amount, and
collector name, convert the amount to US dollars, and, based on a given time duration, determine how many
customers have not paid within the specific time limit.
• Designed mappings to perform denormalisation. Since there is no denormaliser transformation, this is achieved
by combining the Aggregator and Expression transformations, using the LAST and FIRST functions in the
Aggregator transformation.
• Design of mappings involving scd2.
• Designed mappings to load the data from PeopleSoft to the Oracle database, creating mappings with
transformations and loading the data into the warehouse.
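The Aggregator-with-FIRST/LAST denormalisation pattern above can be sketched as follows; this is a minimal Python stand-in for PowerCenter's Aggregator, and the customer/phone rows are hypothetical sample data:

```python
# Hypothetical normalised input: several phone rows per customer.
rows = [
    (101, "HOME",   "080-1111"),
    (101, "MOBILE", "91482-999"),
    (102, "HOME",   "080-2222"),
]

# Group by customer key; keep the first and last value seen per group,
# imitating Informatica's FIRST() and LAST() aggregate functions.
flattened = {}
for cust_id, phone_type, phone_no in rows:
    rec = flattened.setdefault(cust_id, {"FIRST_PHONE": phone_no, "LAST_PHONE": phone_no})
    rec["LAST_PHONE"] = phone_no  # LAST(): keep overwriting with each new row

# One denormalised row per customer, as the mapping's target would receive.
result = [(cid, r["FIRST_PHONE"], r["LAST_PHONE"]) for cid, r in flattened.items()]
```

As in the mapping, the grouping step collapses many rows to one per key, and the first/last picks spread the repeating values across separate output columns.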
Module-2:
Module Title : Sales Channel Hierarchy Project (Finance Stream)
Client : Informatica Business Solution,
Role : Developer/Unit Tester,
Data Base : Oracle11g,
Description:
A global investment bank serving the financial needs of corporations, institutions, governments, and high-net-worth
investors worldwide. Unlike normal banking, it provides payday loans, i.e. small loans for a short period. They
have a system in place called Advantage, which captures data into flat files. Each location has to pump data into
central zones at the end of the day.
They needed a data warehouse so that they could maintain historical data at a central location for integration and
analyze business in different locations according to profit areas, which could serve the purpose of a DSS for decision makers.
• Optimizing/Tuning mappings for better performance and efficiency.
• Extensively used transformations like Filter, Aggregation, Joiner, Expression, Lookup, Router.
• Involved in designing the mapplet and reusable transformation according to the business needs.
• Knowledge of slowly changing dimension tables and fact tables.
• In this project we needed to develop a mapping for the sales channel (CART). CART means Channel, Area,
Region, Territory. Based on the Territory ID from the Territory dimension table, the hierarchy is changed for
those that are changing.
Module-3:
Module Title : Apttus Project Phase 1 & Phase 2
Client : Informatica Corp,
Role : Developer/Tester
Data Base : Oracle11g,
Description:
The products and price information are pushed to SFDC from the PeopleSoft system. The same products with
different attributes, like volume, region, maintenance type, etc., are maintained as different product SKUs in
PeopleSoft. However, when Apttus CPQ is implemented in SFDC, we want such products to be maintained as one
product and mapped to different SKUs using a mapping table. This will help users choose a product easily and
select the attributes as required to see the corresponding price.
Module-4:
Module Title : Discount Calculation
Client : Informatica Corp,
Role : Developer/Tester
Data Base : Oracle11g,
Description:
The "% Discount Calculation" report allows one to determine whether the discount provided to a customer on a
deal is within the permissible limits.
In this project we maintain customer information and the discounts provided based on product bundles, and report
how many customers are receiving a discount.
ACADEMICS:
MCA (Master of Computer Applications, 2013) from Bundelkhand University.
BCA (Bachelor of Computer Applications, 2009) from Jiwaji University.
Certification:
Data Specialist, AMCAT No: 1052123-211
PERSONAL DETAILS:
Father's Name : Mahesh Chandra Soni,
DOB : 02/07/1990
Address for communication : H.no-224, Ayappa Layout, 1st Main, 3rd Block, Bangalore.
Nationality : Indian.
Alternate Contact No : 8050673622/9148250559
_______________________
(Rahul Soni)