Ajay Agrawal has over 3 years of experience in data warehousing and integration solutions. He has worked on projects for clients like Barclays and PNC Bank. He is proficient with technologies like Informatica, Oracle, Teradata, Hadoop and has experience in requirements gathering, design, development, testing and support. Some of his responsibilities include creating ETL mappings, stored procedures, managing code migration, and interacting with customers on UAT. He currently works as a Systems Engineer at TATA Consultancy Services.
SQL Analytics Powering Telemetry Analysis at Comcast (Databricks)
Comcast is one of the leading providers of communications, entertainment, and cable products and services. At the heart of it is Comcast RDK, providing the backbone of telemetry to the industry. RDK (Reference Design Kit) is pre-bundled open-source firmware for a complete home platform covering video, broadband, and IoT devices. The RDK team at Comcast analyzes petabytes of data, collected every 15 minutes from 70 million devices (video, broadband, and IoT devices) installed in customer homes. They run ETL and aggregation pipelines and publish analytical dashboards on a daily basis to reduce customer calls and support firmware rollouts. The analysis is also used to calculate the WiFi happiness index, a critical KPI for Comcast customer experience.
In addition, the RDK team also does release tracking by analyzing RDK firmware quality. SQL Analytics allows customers to operate a lakehouse architecture that provides data warehousing performance at data lake economics, for up to 4x better price/performance for SQL workloads than traditional cloud data warehouses.
We present the results of the “Test and Learn” with SQL Analytics and Delta Engine that we ran in partnership with the Databricks team, including a quick demo introducing the SQL-native interface, the challenges we faced with migration, the results of the execution, and our journey of productionizing this at scale.
Oracle DBA - Oracle Apps DBA - Technical Architect - IT Infrastructure Management - IT Application Management
Snap Information:
Title: Oracle DBA / Oracle APPS DBA Experience
· 17+ years of experience in IT infrastructure as Oracle Apps DBA, Oracle DBA, Oracle RAC DBA, ODA DBA & SQL DBA; Exadata X2, X5 & X8 administration; Technical Architect, Project Management, IT Infra Lead. 10 years exclusively in the UAE. Supported different versions and platforms of database and ERP, up to 19c database and ERP 12.2.x.
· Designed and implemented ODA 5x servers for databases.
· Designed and implemented Exadata servers for the ERP database.
· Oracle Database production support, Oracle RAC support, cloning, disaster recovery.
· 40+ upgrade, migration, and implementation projects on different versions and platforms of Oracle ERP and database.
· Certified Six Sigma Green Belt and OCP. Completed training on PMP and TOGAF 9.2.
Embarking on building a modern data warehouse in the cloud can be an overwhelming experience due to the sheer number of products that can be used, especially when the use cases for many products overlap with others. In this talk I will cover the use cases of many of the Microsoft products that you can use when building a modern data warehouse, broken down into four areas: ingest, store, prep, and model & serve. It’s a complicated story that I will try to simplify, giving blunt opinions on when to use which products and the pros/cons of each.
by Darin Briskman, Technical Evangelist, AWS
Database Freedom means being able to use the database engine that’s right for you as your needs evolve. Being locked into a specific technology can prevent you from achieving your mission. Fortunately, AWS Database Migration Service makes it easy to switch between different database engines. We’ll look at how to use the AWS Schema Conversion Tool (SCT) with DMS to switch from a commercial database to open source. You’ll need a laptop with a Firefox or Chrome browser.
The new Microsoft Azure SQL Data Warehouse (SQL DW) is an elastic data warehouse-as-a-service and a Massively Parallel Processing (MPP) solution for "big data" with true enterprise-class features. The SQL DW service is built for data warehouse workloads from a few hundred gigabytes to petabytes of data, with truly unique features like disaggregated compute and storage that allow customers to size the service to match their needs. In this presentation, we take an in-depth look at implementing a SQL DW, elastic scale (grow, shrink, and pause), and hybrid data clouds with Hadoop integration via PolyBase, allowing for a true SQL experience across structured and unstructured data.
Hotels.com’s Journey to Becoming an Algorithmic Business… Exponential Growth ... (Databricks)
In the last year, Hotels.com has begun its journey to becoming an algorithmic business. Matt will talk about their experience of exponential growth in data science algorithms while, at the same time, the team has migrated from SAS/SQL to Spark as their core underlying architecture, moved from on-premises to the cloud, and transformed the capability of the data science function. He will also highlight the key enablers that have made this successful, including CEO support, the internal concept of organic intelligence, and how Databricks has helped make this happen. He will also highlight the pitfalls along the journey.
Seeking a role as a Java Developer which will enable me to use my skills in alignment with organizational goals for growth and betterment of institution, self and society.
Making Data Timelier and More Reliable with Lakehouse Technology (Matei Zaharia)
Enterprise data architectures usually contain many systems—data lakes, message queues, and data warehouses—that data must pass through before it can be analyzed. Each transfer step between systems adds a delay and a potential source of errors. What if we could remove all these steps? In recent years, cloud storage and new open source systems have enabled a radically new architecture: the lakehouse, an ACID transactional layer over cloud storage that can provide streaming, management features, indexing, and high-performance access similar to a data warehouse. Thousands of organizations including the largest Internet companies are now using lakehouses to replace separate data lake, warehouse and streaming systems and deliver high-quality data faster internally. I’ll discuss the key trends and recent advances in this area based on Delta Lake, the most widely used open source lakehouse platform, which was developed at Databricks.
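For orientation, here is a minimal PySpark sketch of the kind of ACID table over cloud storage described above, using the open-source delta-spark package; the storage path and sample data are placeholders, and this is not the talk's own code.

```python
# Minimal sketch: write/read an ACID Delta Lake table with PySpark.
# Assumes the delta-spark package is installed; the path below is a placeholder.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

events = spark.createDataFrame(
    [("dev-1", "2024-01-01T00:00:00", 42.0)],
    ["device_id", "event_time", "metric"],
)

# Each write is an ACID transaction on top of plain object storage.
events.write.format("delta").mode("append").save("/tmp/lakehouse/telemetry")

# Readers (batch or streaming) always see a consistent snapshot of the table.
spark.read.format("delta").load("/tmp/lakehouse/telemetry").show()
```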
Look at Oracle Integration Cloud – its relationship to ICS. Customer use Case... (Phil Wilkins)
This is a presentation about Oracle Integration Cloud (OIC) and Oracle Integration Cloud Service (ICS) and the relationship between the two products. We also look at customer use cases, what led to an ICS-based recommendation, and what we would recommend now that OIC has been made available.
Data Modelling 101 half day workshop presented by Chris Bradley at the Enterprise Data and Business Intelligence conference London on November 3rd 2014.
Chris Bradley is a leading independent information strategist.
Contact chris.bradley@dmadvisors.co.uk
Webinar: Future Data Integration - Data Mesh and GoldenGate/Kafka (Jeffrey T. Pollock)
The Future of Data Integration: Data Mesh, and a Special Deep Dive into Stream Processing with GoldenGate, Apache Kafka and Apache Spark. This video is a replay of a Live Webinar hosted on 03/19/2020.
Join us for a timely 45min webinar to see our take on the future of Data Integration. As the global industry shift towards the “Fourth Industrial Revolution” continues, outmoded styles of centralized batch processing and ETL tooling continue to be replaced by realtime, streaming, microservices and distributed data architecture patterns.
This webinar will start with a brief look at the macro-trends happening around distributed data management and how that affects Data Integration. Next, we’ll discuss the event-driven integrations provided by GoldenGate Big Data, and continue with a deep-dive into some essential patterns we see when replicating Database change events into Apache Kafka. In this deep-dive we will explain how to effectively deal with issues like Transaction Consistency, Table/Topic Mappings, managing the DB Change Stream, and various Deployment Topologies to consider. Finally, we’ll wrap up with a brief look into how Stream Processing will help to empower modern Data Integration by supplying realtime data transformations, time-series analytics, and embedded Machine Learning from within data pipelines.
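As a very rough illustration of the per-table topic mapping and ordering concerns mentioned above, here is a hedged Python sketch of a plain Kafka consumer for JSON-encoded change events; the topic name, message shape, and handling are assumptions for illustration only and not GoldenGate's actual interface.

```python
# Hypothetical sketch of consuming database change events from Kafka.
# Topic name, message shape, and keys are assumptions, not GoldenGate specifics.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders.cdc",                      # hypothetical one-topic-per-table mapping
    bootstrap_servers="localhost:9092",
    group_id="cdc-demo",
    enable_auto_commit=False,          # commit only after the event is applied
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for msg in consumer:
    change = msg.value  # e.g. {"op": "U", "before": {...}, "after": {...}}
    # Keying messages by primary key keeps all changes for a given row in one
    # partition, preserving per-row ordering for downstream consumers.
    print(msg.key, change.get("op"))
    consumer.commit()
```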
GoldenGate: https://www.oracle.com/middleware/tec...
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Informatica provides the market's leading data integration platform. Tested on nearly 500,000 combinations of platforms and applications, the data integration platform interoperates with the broadest possible range of disparate standards, systems, and applications. This unbiased and universal view makes Informatica unique in today's market as a leader in data integration platforms. It also makes Informatica the ideal strategic platform for companies looking to solve data integration issues of any size.
• 11+ years of IT industry experience in analysis, design, development, maintenance, and support of various software applications, mainly in Data Warehousing (Informatica PowerCenter, OWB, SSIS, and Business Objects), Oracle (SQL, PL/SQL), and Teradata, in industry verticals such as Finance, Telecom, Retail, and Healthcare.
• Work experience in client-facing roles in the UK and Ireland.
• Performed numerous roles in Business Intelligence projects, including Data Warehouse System Analyst, ETL Designer, Onshore Coordinator, Technical Lead, and Senior Data Warehouse Developer, with multinational, results-driven IT organizations.
• Extensive experience in data integration projects accessing sources such as Teradata, Oracle, and SQL Server.
• Created robust EDW solutions from various types of sources such as flat files, XML files, EBCDIC COBOL copybooks from mainframe systems, and DB2 unload files.
• Extensive experience in data discovery and cleansing using Informatica IDQ.
• Resolved inconsistent and duplicate data issues during data analysis to support strategic EDW goals.
• Extensive experience in data integration using the Informatica PowerCenter tool stack.
• Strong knowledge of data warehousing concepts, ETL concepts, data modeling, and dimensional modeling.
• Conducted training on Informatica and received awards for training capabilities.
• Excellent understanding of OLTP and OLAP concepts; expert in writing SQL and stored procedures on Teradata, Oracle, and SQL Server.
• Extensive experience in implementing data warehousing methodologies, including star schema, snowflake schema, and 3NF, for large data warehouses.
• Extensive knowledge of Change Data Capture (CDC) and SCD Type 1, Type 2, and Type 3 implementations (a brief SCD Type 2 sketch follows this list).
• Excellent understanding of Kimball and Inmon methodologies.
• Provided leadership when addressing high-level technical issues and questions about the functionality of the reporting and business intelligence applications.
• Managed current engineering needs and strategized to foresee and plan for future engineering needs in the data integration space.
• Acted as an interface and coordinator between database administration, ETL development, testing, and reporting teams to eliminate roadblocks and ensure a smooth flow of information.
• Hands-on experience in tuning ETL mappings and identifying and resolving performance bottlenecks at various levels, including sources, targets, mappings, and sessions.
• Expert in designing and developing complex ETL mappings using Informatica PowerCenter.
• Proficient in resolving performance issues using Informatica PowerCenter and Teradata.
• Experienced in using Teradata utilities (TPT, BTEQ, FastLoad, MultiLoad, FastExport, TPump).
• Exposure to writing shell scripts as per given requirements.
• Worked extensively with the Teradata GCFR tool.
• Experience in SAP ECC integration with Informatica.
• Trained in Tableau, QlikView, and SAP BW 3.5, and completed POCs for the same.
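As a brief illustration of the SCD Type 2 pattern referenced in the list above, the following pandas sketch expires changed rows and inserts new versions. Column names and the single natural key are assumptions; a production implementation would typically live in Informatica mappings or set-based SQL rather than pandas.

```python
# Minimal SCD Type 2 sketch: expire changed rows, insert new versions.
# Column names (customer_id, city, valid_from, valid_to, is_current) are assumptions.
import pandas as pd

dim = pd.DataFrame([
    {"customer_id": 1, "city": "Pune",   "valid_from": "2020-01-01", "valid_to": None, "is_current": True},
    {"customer_id": 2, "city": "Mumbai", "valid_from": "2020-01-01", "valid_to": None, "is_current": True},
])
incoming = pd.DataFrame([
    {"customer_id": 1, "city": "Raipur"},   # changed attribute -> new version
    {"customer_id": 3, "city": "Delhi"},    # brand-new key -> plain insert
])
load_date = "2024-01-01"

current = dim[dim["is_current"]]
merged = incoming.merge(current, on="customer_id", how="left", suffixes=("", "_old"))
changed_keys = merged.loc[
    merged["city_old"].notna() & (merged["city"] != merged["city_old"]), "customer_id"
]
new_keys = merged.loc[merged["city_old"].isna(), "customer_id"]

# Expire the current version of rows whose tracked attribute changed.
expire = dim["customer_id"].isin(changed_keys) & dim["is_current"]
dim.loc[expire, ["valid_to", "is_current"]] = [load_date, False]

# Insert new versions for changed keys and first versions for new keys.
inserts = incoming[incoming["customer_id"].isin(set(changed_keys) | set(new_keys))].copy()
inserts["valid_from"] = load_date
inserts["valid_to"] = None
inserts["is_current"] = True
dim = pd.concat([dim, inserts], ignore_index=True)
print(dim)
```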
Levelwise PageRank with Loop-Based Dead End Handling Strategy: Short Report ... (Subhajit Sahu)
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components and processes them in topological order, one level at a time. This enables ranks to be calculated in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It does, however, come with the precondition that the input graph has no dead ends. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. The slowdown on the GPU is likely caused by the submission of many small workloads and is expected to be a non-issue when the computation is performed on massive graphs.
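For orientation, here is a minimal, non-distributed Python sketch of the levelwise idea described above, using networkx for the SCC condensation and assuming the stated precondition that the graph has no dead ends; it is illustrative only and not the report's CPU/GPU implementation.

```python
# Minimal, single-machine sketch of levelwise PageRank over SCCs in topological
# order; assumes no dead ends (every vertex has at least one out-edge).
import networkx as nx

def levelwise_pagerank(G, d=0.85, tol=1e-12, max_iter=100):
    n = G.number_of_nodes()
    ranks = {v: 1.0 / n for v in G}
    # Condense the graph into its DAG of strongly connected components.
    C = nx.condensation(G)  # each node c carries C.nodes[c]["members"]
    for c in nx.topological_sort(C):
        comp = C.nodes[c]["members"]
        # Contributions from vertices outside the component are already final.
        external = {
            v: sum(ranks[u] / G.out_degree(u)
                   for u in G.predecessors(v) if u not in comp)
            for v in comp
        }
        # Iterate PageRank only within this component until convergence.
        for _ in range(max_iter):
            new = {
                v: (1 - d) / n + d * (external[v] + sum(
                    ranks[u] / G.out_degree(u)
                    for u in G.predecessors(v) if u in comp))
                for v in comp
            }
            err = sum(abs(new[v] - ranks[v]) for v in comp)
            ranks.update(new)
            if err < tol:
                break
    return ranks

if __name__ == "__main__":
    G = nx.DiGraph([(1, 2), (2, 1), (2, 3), (3, 4), (4, 3)])
    print(levelwise_pagerank(G))
```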
AJAY AGRAWAL Mobile - 9770173414
Email - ajay.agr08@gmail.com
Experience Summary
● 3.4 years of IT experience in analysis, design, development, implementation, testing, and support of data warehouse and data integration solutions.
● Relevant IT experience in the Banking and Finance domain, especially working for the clients Barclays, UK and PNC Bank, USA.
● Good knowledge of data warehousing concepts and the principles of Star Schema, Snowflake Schema, and SCD.
● Proficient in Informatica 9.6.1 & 10.2, Oracle SQL/PL/SQL, Teradata SQL, UNIX shell scripting, and the Hadoop ecosystem.
● Exposure to the overall SDLC, including requirements gathering, data modelling, development, testing, debugging, deployment, documentation, and production support.
● Worked on a migration project from an Oracle database to Teradata.
● Strong hands-on experience in extracting data from RDBMS, XML files, flat files, and mainframe sources (PowerExchange).
● Preparation of high- and low-level design documents from the use case or business development document to ensure product quality as per the SDLC, phase by phase.
● Experience in performance tuning of mappings.
● Creating STM documents for user requirements.
● Interacting with customers and other stakeholders to assist with UAT.
● Independently perform complex troubleshooting, root-cause analysis, and solution development.
● Received the Star of the Month award for successful delivery of ADAPT.
● Conducting technical trainings to create awareness of new technologies among associates at the account level.
Technical Expertise
ETL: Informatica PowerCenter 9.6.1, Informatica PowerCenter 9.5.1, Informatica PowerExchange
RDBMS: Oracle 11g/10g, Teradata
Database Tools: TOAD, Teradata Studio
Hadoop Ecosystem: Hadoop, MapReduce, HDFS, HBase, ZooKeeper, Hive, Pig, Sqoop, Cassandra, Oozie
Change Management Tools: Harvest, ALM, UDeploy, Artifactory
Programming Languages / Tools: C, C++, PL/SQL, PuTTY
Scripting Languages: Shell scripting, Python
Qualifications
Course | Institution | Board / University | Year of Completion | Aggregate (%)
B.E. (CSE) | Bhilai Institute of Technology, Raipur | Chhattisgarh Swami Vivekanand Technical University, Bhilai | 2014 | 71.2
HSC | Deshbandhu Eng. Med. School, Raipur | Chhattisgarh State Board | 2010 | 82
HSSC | Deshbandhu Eng. Med. School, Raipur | Chhattisgarh State Board | 2008 | 86.3
Project Profile #1
Project AMG E-Signature / SID Claim
Customer PNC Financial Services Group
Period July 2017 – Till date
Description The E-Sign project focuses on enhancing the Wealth Management client experience in several areas where our clients, leaders, and employees have identified deficiencies. The areas covered in this project include client onboarding, client communication (internal & external), mobility (which transcends both onboarding and communications), and account closing.
SID Claim will help the client automate the fee calculation process, i.e., claims for advisors depending on the incentives defined. It will also focus on keeping track of approved, pending, and rejected opportunities.
Role Module Lead, Developer
Solution Environment Operating system : Unix, Windows
Language : PL/SQL
Databases : Oracle 11g, Teradata
Special Software/Tools : Informatica 9.6.1
Scripting : Shell Scripting
Project Specific Tools Informatica, Toad for Oracle
Highlights Responsibilities as an Informatica and PL/SQL Developer:
● Understanding business requirements.
● Working with the business to document the business problem and a probable solution.
● Streamlined the ETL code and implemented the required standards.
● Developing suitable solutions, alternatives, and options.
● Created ETL mappings, sessions, workflows, and Unix objects as per the requirement.
● Created stored procedures, packages, and triggers as per business requirements.
● Used Harvest and UDeploy to migrate ETL code and Unix components to higher environments.
● Reviewed scope and solution documents with the business team on an immediate basis.
● Reviewed scope and solution documents with the IT management team.
● Reviewed components and test results prepared by other team members.
● Kept the solution and approach transparent to the Business and IT Management teams; no solution is implemented without approval from Business and IT Management.
Service Practice Asset Management (AMG)
Industry Practice Banking
Project Location Pune, MH, India
Project Profile #2
Project AMG APE
Customer PNC Bank, USA.
Period 1st SEP 2016 to 15th JUN 2017
Role Developer
Responsibilities ● Prepared design documents for ETL mappings.
● Worked on a requirement to load .CSV files into tables on SQL Server using various Informatica transformations.
● Worked on a requirement where XML files were the target.
● Created UNIX scripts for validating column headers in CSV files (a brief sketch follows this project profile).
● Created PL/SQL stored procedures for the client website.
● Wrote PRE and POST SQL commands in session properties to manage constraints, which improved performance, and wrote SQL queries to perform database operations according to business requirements.
● Migrated repository objects, services, and scripts from the development environment to the QA and production environments.
● Prepared test cases for unit testing and performed unit testing.
Solution Environment Informatica, SQL Server, Toad for Oracle, Unix, Mainframe.
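A small Python stand-in for the CSV header-validation step mentioned in the responsibilities above; the expected column list is an assumption, and the original check was implemented as a UNIX shell script.

```python
# Minimal header check before loading a CSV file; the expected column list is an
# assumption standing in for the project's actual layout (done there in shell).
import csv
import sys

EXPECTED = ["account_id", "txn_date", "amount"]

def header_ok(path: str) -> bool:
    with open(path, newline="") as fh:
        header = next(csv.reader(fh), [])
    return [c.strip().lower() for c in header] == EXPECTED

if __name__ == "__main__":
    path = sys.argv[1]
    if not header_ok(path):
        sys.exit(f"Unexpected header in {path}; rejecting file before load.")
    print(f"{path}: header OK")
```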
Project Profile #3
Project PNC_RET_DEV_PTK_ICD_Migration
Customer PNC Bank,USA
Period 31st July 2015 to 31st AUG 2016
Role Developer
Responsibilities
● Analyzed the existing jobs in Informatica, Unix shell scripts, and PL/SQL scripts.
● Prepared Informatica mappings from existing PL/SQL scripts.
● Migrated existing Informatica mappings from Oracle to Teradata and made them Teradata-compatible.
● Converted PL/SQL scripts to Informatica.
● Identified performance issues in existing sources, targets, and mappings by analyzing the data flow, evaluating transformations, and tuning accordingly for better performance.
● Worked with tools such as TOAD and Teradata Studio to write queries and generate results.
● Wrote PRE and POST SQL commands in session properties to manage constraints, which improved performance, and wrote SQL queries and PL/SQL procedures to perform database operations according to business requirements.
● Worked with session logs, the Informatica Debugger, and performance logs for error handling of workflow and session failures.
● Migrated repository objects, services, and scripts from the development environment to the production environment. Extensive experience in troubleshooting and solving migration and production issues.
● Prepared design documents for ETL mappings.
● Prepared test cases for unit testing and performed unit testing.
Solution Environment Informatica, Teradata Studio, Toad for Oracle, Unix, Blue fox Mainframe.
Project Profile #4
Project TPE-Commodities
Customer Barclays, UK
Period March 2015 to June 2015
Role L2 Production Support Executive
Responsibilities ● Understanding the existing business model and customer requirements.
● Monitored jobs and looked into issues in case of job failures and delays in feeds.
● Looked into user queries relevant to the application in TPE Commodities.
● Quick problem solving (on-time SLA) and providing solutions at short notice.
Solution Environment Autosys, ServiceNow, COMET, OpenLink Endur
Career Profile
Dates Organization Role
March 2015 – Till Date TATA Consultancy Services Systems Engineer
Declaration
I hereby declare that the above-mentioned information is correct to the best of my knowledge, and I bear
responsibility for the correctness of the above-mentioned particulars.
(AJAY AGRAWAL)