Ravikanth Marpuri has over 9 years of experience in IT testing with expertise in ETL/BI/data migration testing. He has extensive experience testing tools such as IBM Datastage, Oracle Warehouse Builder, Cognos Data Manager, Informatica, SSIS, and SSAS. He has worked on projects in banking, finance, telecom, and energy domains. Ravikanth is skilled in test automation, performance testing, and leading teams of up to 20 members. He created a test automation framework that reduced regression testing time.
• Associate Consultant pursuing an Executive MBA, with 3+ years of experience in the Healthcare and Banking domains covering software development and implementation in the area of data warehousing using IBM WebSphere DataStage 8.1 and IBM InfoSphere DataStage 8.7: ETL architecture, enhancement, maintenance, production support, data modeling, data profiling, and reporting, including business and system requirement gathering.
A Tighter Weave – How YARN Changes the Data Quality Game (Inside Analysis)
Hot Technologies with David Loshin, David Raab and RedPoint Global
Live Webcast on August 20, 2014
Watch the archive:
https://bloorgroup.webex.com/bloorgroup/lsr.php?RCID=cc1ff3fd6d8642b3cc2d3866358387b1
The game-changing power of Hadoop is no longer questioned in the world of data management. But the bigger story these days is YARN, sometimes called Hadoop 2.0. This innovation extends the power of Hadoop to the entire spectrum of enterprise applications, and has a uniquely compelling story for data quality. In fact, solutions built around YARN now promise to revolutionize this critical field, by weaving together the myriad data quality practices into a comprehensive, end-to-end platform.
Register for this episode of Hot Technologies to hear veteran Analysts David Loshin and David Raab as they explain how YARN has opened up a new world of possibilities for enterprise data quality. They’ll be briefed by George Corugedo of RedPoint Global who will demonstrate how his company uses YARN as the backbone for a next-generation data quality platform. He’ll show how long-standing best practices can be stitched together quickly, and can be augmented by the latest advances in machine learning and predictive analytics.
Visit InsideAnalysis.com for more information.
Making the leap from "gatekeeper" to strategic business partner often requires the QA/test group to centralize and standardize the selection of test tools, the development of test processes and templates and the training of testing staff. Only then can it break the organizational silos which typically hobble testing efforts, present a consistent and credible face to their business customers and develop the specialized expertise needed to meet today's testing challenges.
Tackling non-determinism in Hadoop - Testing and debugging distributed systems (Akihiro Suda)
[Presented at FOSDEM 2016: https://fosdem.org/2016/schedule/event/nondeterminism_in_hadoop/]
Developing and maintaining distributed systems like Hadoop is difficult. The difficulty comes from many factors, but we believe one of the most important is the lack of a good debugger for bugs specific to distributed systems (e.g., non-deterministic hardware faults, message ordering).
In the talk, we will show Earthquake, our open-source debugging framework for distributed systems. Earthquake permutes Ethernet packets, filesystem events, Java/C function calls, and injected faults in various orders so as to control non-determinism in the cluster. By default, Earthquake permutes events in a random order, but the user can write their own state exploration policy (in Go) for finding deep bugs efficiently. Earthquake also controls the non-determinism of thread interleaving by calling sched_setattr(2) with randomized parameters.
We will also share our success stories about testing some Hadoop components with Earthquake. For ZooKeeper, we found a distributed race condition bug which decreases the availability of a ZooKeeper cluster. We also reproduced a known ZooKeeper bug that no one had successfully reproduced for 2 years, and analyzed its cause. For YARN, we found a disk-fault tolerance bug that inappropriately marks a faulty node as healthy. We also found bugs in non-Hadoop software, such as etcd.
With Earthquake, you can also test your real distributed systems without any modification.
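The core idea, permuting event orderings to expose interleaving-dependent bugs, can be sketched in miniature. The Python below is a hypothetical illustration of the principle, not Earthquake's code; the real tool permutes actual packets, filesystem events, and function calls:

```python
import itertools

# Two concurrent, non-commutative events: a toy stand-in for the
# message reorderings a tool like Earthquake injects into a real cluster.
def deposit(state):          # node A: balance += 5
    state["balance"] += 5

def apply_interest(state):   # node B: balance *= 2
    state["balance"] *= 2

def apply_events(order):
    """Replay the events against a fresh state in the given order."""
    state = {"balance": 10}
    for event in order:
        event(state)
    return state

# Exhaustively explore every delivery order (a brute-force
# "exploration policy"; a real policy would prune or randomize).
events = [deposit, apply_interest]
outcomes = {apply_events(list(p))["balance"]
            for p in itertools.permutations(events)}

# More than one distinct outcome means the result depends on
# delivery order: a latent race condition.
print(sorted(outcomes))  # [25, 30]
```

With only two events this enumeration is trivial; the value of a framework is doing the same systematically across packets, disk events, and thread schedules in a live cluster.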
Applying Testing Techniques for Big Data and Hadoop (Mark Johnson)
Testing “Big Data” can mean a big time investment; several hours are often spent just to realize you made a simple typo. You fix the typo and then wait another couple of hours for your script to, hopefully, run to completion this time. Even if the Big Data script or program ran to completion, are you sure your data analysis is correct? Getting programs to run to completion and assuring functional accuracy per the requirements are some of the biggest hidden problems in big data today.
During this overview presentation we will first introduce unit and functional testing techniques and high-level concepts to consider in the Hadoop ecosystem. In the second half of the presentation we will explore real testing examples using tools such as PigUnit, JUnit for UDF testing, BeeTest, and Hive limited-test-data-set testing.
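The UDF-testing idea generalizes beyond JUnit: pull the transformation logic into a plain function and exercise it against a tiny, hand-checked dataset before any cluster run. A minimal Python sketch; the normalize_phone UDF and its cases are hypothetical, not from the talk:

```python
def normalize_phone(raw):
    """Hypothetical UDF: keep digits only; return None for bad input."""
    if raw is None:
        return None
    digits = "".join(ch for ch in raw if ch.isdigit())
    return digits if len(digits) == 10 else None

# Limited test data set: each case is small enough to verify by eye,
# so a typo in the UDF fails in seconds instead of hours on a cluster.
cases = [
    ("(415) 555-0134", "4155550134"),
    ("415-555-0134",   "4155550134"),
    ("bad data",       None),
    (None,             None),
]
for raw, expected in cases:
    assert normalize_phone(raw) == expected, (raw, expected)
print("all UDF cases passed")
```

The same shape works for Pig and Hive UDFs via PigUnit or JUnit: the point is that the function under test needs no cluster at all.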
Performance Testing of Big Data Applications - Impetus Webcast (Impetus Technologies)
Impetus webcast "Performance Testing of Big Data Applications" available at http://lf1.me/cqb/
This Impetus webcast talks about:
• A solution approach to measure performance and throughput of Big Data applications
• Insights into areas to focus for increasing the effectiveness of Big Data performance testing
• Tools available to address Big Data specific performance related challenges
Achal Raghavan's case analysis (along with those from other authors) published in Vikalpa (the IIM Ahmedabad journal) in Oct-Dec 2007. Deals with the challenges faced by Infosys in transitioning from low-end system maintenance jobs to high-end consulting / solutions projects. The analysis includes a strategy recommendation. Though published several years back, the analysis is especially relevant now, when the "Infosys 3.0" growth strategy is under increasing scrutiny.
“If you have an apple and I have an apple and we exchange these apples then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.” (George Bernard Shaw)
Unlike many other resources that get depleted when shared, an idea or a knowledge nugget only gets enriched. From an era where labor and capital ruled, we now have evolved to a period where knowledge is seen as the key, if not the sole differentiator.
Big Data Testing Approach (Rohit Kharabe)
This presentation speaks about -
1) How to perform big data testing
2) Tools that can be used for testing
3) Different validation stages involved
4) Performance testing
Effective testing for Spark programs - Strata NY 2015 (Holden Karau)
This session explores best practices of creating both unit and integration tests for Spark programs as well as acceptance tests for the data produced by our Spark jobs. We will explore the difficulties with testing streaming programs, options for setting up integration testing with Spark, and also examine best practices for acceptance tests.
Unit testing of Spark programs is deceptively simple. The talk will look at how unit testing of Spark itself is accomplished, as well as factor out a number of best practices into traits we can use. This includes dealing with local-mode cluster creation and teardown during test suites, factoring out functions to increase testability, mock data for RDDs, and mock data for Spark SQL.
Testing Spark Streaming programs has a number of interesting problems. These include handling of starting and stopping the Streaming context, and providing mock data and collecting results. As with the unit testing of Spark programs, we will factor out the common components of the tests that are useful into a trait that people can use.
While acceptance tests are not always part of testing, they share a number of similarities. We will look at which counters Spark programs generate that we can use for creating acceptance tests, best practices for storing historic values, and some common counters we can easily use to track the success of our job.
Relevant Spark Packages & Code:
https://github.com/holdenk/spark-testing-base / http://spark-packages.org/package/holdenk/spark-testing-base
https://github.com/holdenk/spark-validator
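The "factor out functions to increase testability" advice can be shown without a Spark dependency at all. In the sketch below (a hypothetical PySpark-style example, not code from the talk), the business logic is a pure generator function over an iterable, so the identical code could run via rdd.mapPartitions in production and over a plain list in a fast unit test:

```python
def parse_and_filter(lines):
    """Keep well-formed 'name,amount' records with a positive amount."""
    for line in lines:
        parts = line.split(",")
        if len(parts) != 2:
            continue  # malformed record
        name, amount = parts[0].strip(), parts[1].strip()
        try:
            value = float(amount)
        except ValueError:
            continue  # non-numeric amount
        if value > 0:
            yield (name, value)

# Unit test with mock data: no SparkContext, no cluster setup/teardown.
mock_partition = ["alice, 3.5", "bob, -1", "garbled", "carol, 2"]
result = list(parse_and_filter(mock_partition))
assert result == [("alice", 3.5), ("carol", 2.0)]
print("transform logic verified")
```

Libraries such as spark-testing-base (linked above) then cover the part this sketch skips: managing the local Spark cluster lifecycle and comparing RDD/DataFrame contents.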
Testing Big Data: Automated Testing of Hadoop with QuerySurge (RTTS)
Are You Ready? Stepping Up To The Big Data Challenge In 2016 - Learn why Testing is pivotal to the success of your Big Data Strategy.
According to a new report by analyst firm IDG, 70% of enterprises have either deployed or are planning to deploy big data projects and programs this year due to the increase in the amount of data they need to manage.
The growing variety of new data sources is pushing organizations to look for streamlined ways to manage complexities and get the most out of their data-related investments. The companies that do this correctly are realizing the power of big data for business expansion and growth.
Learn why testing your enterprise's data is pivotal for success with big data and Hadoop. Learn how to increase your testing speed, boost your testing coverage (up to 100%), and improve the level of quality within your data - all with one data testing tool.
Ravikanth Marpuri
Email: marpuri.ravikanth@gmail.com Mobile: 9986685553
DOB: 10-June-1980
Summary
A total of 9 years 10 months of experience in the IT industry, with 7+ years in ETL/BI/Data Migration Testing and 1+ year in Big Data Testing.
Acting as an SME in ETL/BI/Data Migration Testing for the whole company.
Expertise in ETL Testing and BI Reports Testing.
Experience in the IBM Banking Data Warehouse architecture.
Experience in the Banking/Finance, Telecom Billing, Customer Care, and Energy Trading domains.
Experience in testing the DWH ETL tools IBM DataStage, Oracle Warehouse Builder (OWB), Cognos Data Manager, Informatica, SSIS, SSAS, and Talend.
Experience in testing BI (OLAP) reports on Cognos 8.1, 8.4 and 10, Business Objects, and QlikView.
Good exposure to the data warehousing (ETL) automation testing tools QuerySurge and Informatica DVO.
Exposure to Big Data (Hadoop environment: HBase, Hive, HDFS) and Spark SQL.
Exposure to NoSQL databases (HBase, MongoDB).
Experience in the build process and deployment of data warehousing project code at the client site in UAT and production environments.
Seasoned professional with outstanding project planning, execution, consulting, and monitoring skills.
Excel at communicating with stakeholders to provide accurate reporting and information regarding ongoing projects and initiatives.
Experienced in coordinating, negotiating, and motivating resources in support of timelines and IT project deliverables.
Experience in implementing and driving Agile methodology (Scrum).
Worked on performance testing of Cognos reports and web-based applications with the tools Web Performance Load Tester and OpenSTA.
Experience in leading teams of 5-20 members; acted as single point of contact for all QA team deliverables.
Created a test automation framework called the Business Intelligence Test Automation Framework (BI-TAF), which helped the team reduce regression testing time (SQL Server and Oracle databases).
An effective communicator with exceptional relationship management skills and the ability to relate to people at any level of business.
Experience with Subversion and TFS.
Companies Worked for:
Working as Senior Associate QA L2 at Sapient Consulting Private Limited, Bangalore, from Aug 2012 till date. (EMP#97245)
Worked as Senior Quality Engineer at Misys Software Solutions Pvt. Ltd., Bangalore, from Dec 2010 to Aug 2012. (EMP#2440)
Worked as Test Analyst at Convergys Information Management (India) Pvt. Ltd., Hyderabad, from April 2007 to Dec 2010. (EMP#100253161)
Worked as Technical Associate at Tech Mahindra, Pune, from May 2005 to March 2007. (EMP#11274)
Key Deliverables across the tenure
Data Warehousing (ETL Testing, Reports Testing & Data Migration Testing):
Carried out ETL testing using OWB, DataStage, Data Manager, Informatica, and Talend.
Experience in testing OLAP reports on Cognos 8.4 and 10, Business Objects, and QlikView.
Developed a BI automation framework especially for ETL testing and report data validation, which reduced regression testing time significantly.
Experience with the data warehousing (ETL) automation testing tools QuerySurge and Informatica DVO.
Deployed code in the test environment and ran the ETL jobs to load data into the data warehouse.
Experience in implementing project code in the client production environment, and performed UAT along with clients.
Exposure to Big Data Testing on Hadoop and MapR ecosystems.
Used the most comprehensive approach to data warehouse testing, i.e., validating the data at each transformation and at each level of the DWH: source to staging tables, staging tables to DWH (atomic area), DWH to data marts, and data marts to business intelligence reports, with field-by-field data verification to check the consistency of source and target data.
Wrote complex SQL queries on the data staging and data warehousing tables to validate the data results.
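Source-to-target checks of this kind usually boil down to count checks plus minus/EXCEPT-style queries. A self-contained sketch using Python's sqlite3 as a stand-in for the real staging and warehouse databases; the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source (staging) and target (warehouse) tables.
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dwh_orders (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.0), (3, 75.0)])
cur.executemany("INSERT INTO dwh_orders VALUES (?, ?)",
                [(1, 100.0), (2, 250.0)])  # row 3 dropped by the load

# Count check: row totals must match between source and target.
(src_count,) = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()
(tgt_count,) = cur.execute("SELECT COUNT(*) FROM dwh_orders").fetchone()

# Minus/EXCEPT check: field-by-field rows present in source but not target.
missing = cur.execute(
    "SELECT order_id, amount FROM stg_orders "
    "EXCEPT SELECT order_id, amount FROM dwh_orders").fetchall()

print(src_count, tgt_count, missing)  # 3 2 [(3, 75.0)]
```

Against a real warehouse the same two queries would run over the staging and target schemas (Oracle uses MINUS rather than EXCEPT), with transformation rules applied in the source-side SELECT.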
Verified whether data transformations are correct according to the business rules and data warehouse is
populated with the transformed data.
Analyzed the rejected records that don’t fulfill transformation rules.
Data migration testing was done on data coming from heterogeneous data sources (ERP, PeopleSoft, SQL Server).
Good at finding performance issues in ETL jobs.
Involved in validation of OLAP unit testing.
Involved in system testing, database testing, and drill-down feature testing of the OLAP reports.
Performed performance testing of Cognos reports (web) with Web Performance Load Tester and LoadRunner.
Prepared defect reports and test summary report documents.
Prepared test scenarios for user acceptance dealing with data reconciliation from source to Cognos reports, and matched the MBI Cognos reports to the source system reports.
Project Operations:
Conducting system or process study for project planning, scoping, estimation and tracking
Identifying risks against testing delivery and defining strategies to mitigate the risks.
Identifying test entry, exit criteria.
Designing Test Strategy, developing test plans and approving test cases.
Scheduling project for releases on coordination with project managers and clients.
Defining best practices for project support and documentation
Ensuring customer satisfaction and getting repeat or new business.
Preparing multiple reusable Artifacts.
Involved in preparing RFPs for ETL/Data Migration projects.
Software Quality Assurance
Providing scope, resource and time estimates for projects presented to QA for testing and analysis.
Presenting QA test status and progress reports as appropriate to top management.
Working with business analysis function to ensure QA testing requirements are identified and included in
the requirements.
Monitoring development activities and reporting project progress.
Experience in implementing Agile methodology: Scrum and XP methods.
Experience in implementing Agile project management techniques (product backlog, sprint planning, daily stand-up, etc.).
Experience in implementing CMM levels and collecting project metrics.
Certified in the Information Technology Infrastructure Library: ITIL V3 Foundation exam.
Team Management
Working closely with the software engineering team to handle system and integration testing, functional testing, and user acceptance testing.
Delegating work to team members on a priority basis and according to skill set.
Ramping up the team's knowledge base, when a team member is new to the product or process, by providing appropriate training.
Educational Qualification
B.E. (Electronics and Communication) from University of Madras, May 2001
Certifications:
Completed the ITIL® V3 Foundation (Information Technology Infrastructure Library) certification from EXIN
Summary of Skills
Skill Summary:
• Data Modeling & Data Warehousing
• Cognos 8.4, 8.2, 8.1 ReportNet, Cognos 7.0, Web Intelligence; ETL technologies using Informatica, Oracle Warehouse Builder, Cognos Data Manager
• Efficiency in designing (using the structured design approach and the object-oriented approach) and SQL programming
Telecom Billing Products: Geneva 5.0, Convergys Infinys 2.5, 3.0
Operating Systems: DOS, Windows 2000/98/95, Windows NT4, Windows XP, UNIX
RDBMS: SQL Server, DB2 8.0, Oracle 9i
CASE Tools: ERwin
OLAP & ETL Tools: DataStage 8.5, OWB, Cognos Report Studio, Cognos Data Manager, Cognos 8.4, 8.2, 8.1 & 10.0, Business Objects, Ab Initio, MicroStrategy, Talend, QlikView
Languages Known: SQL, VBScript
Defect Management Tools: Bugzilla, Test Director, TeamForge CTF
Other Tools Familiar/Worked With: Quest Central for DB2, Lotus Notes Client, QTP 8.2, MySAP, Oracle SQL Developer
Projects worked on:
Company Sapient Nitro
Project Customer Intelligence Platform
Duration AUG2014 - Till Date
Size 5
The Customer Intelligence Platform is developed to analyze how customers think about a brand, company, or product on social media such as Twitter and Facebook, by deriving the sentiment or Klout score of each tweet, post, and comment and presenting the data through dashboards in QlikView.
A Spark application is used to get live streaming data from Twitter based on key search words, Java applications are developed to get data from Facebook into HBase tables, ETL jobs are developed in Talend to load the data into HDFS files and the final Hive fact tables, and dashboards and reports are developed in QlikView to present the data.
Platform: HBase, HDFS, Hive, Talend, QlikView.
Company Sapient Nitro
Project Boston Consulting Group(BCG)-CT Recruiting Project
Duration June2013 - Till Date
Size 9
BCG is a world-leading consulting company that hires people from all over the world.
Currently BCG has two recruiting systems, Recruiting Online (Europe, SA, Australia) and eRecruiting (North America, Asia Pacific), and recruiters from one region cannot access candidates of the other region. BCG is moving its recruiting process to a new system called Avature and wants all the data to be integrated into a single global view of the candidate, where all recruiters can see global candidates.
To move data from the two legacy systems to a common repository, ETL jobs have been developed; to get data from the new system (Avature) into the common repository, web services (.NET) have been developed.
Platform: Oracle Warehouse Builder (OWB), .NET web services
Company Sapient Nitro
Project Direct Energy-(Trade& Risk Reporting)
Duration AUG2012 –June2013
Size 7
Direct Energy (Trading & Risk Reports) helps all stakeholders get information about how energy (power/gas) trading is done. The solution gives users access to the ODS cube, which they can slice and dice to get metric values for various dimensions.
SSIS (SQL Server Integration Services) is responsible for extracting the data from the trading source systems (Endur/Warr), applying business transformation rules, and then loading the refined data into the ODS. Aggregated data from the ODS is then processed through SSAS (SQL Server Analysis Services) into ODS cubes (Market/Credit).
Platform: SSIS, SSAS
Roles and Responsibilities
• Acted as a single point of contact for all testing deliverables for multiple projects.
• Set up the QA team from scratch and managed multiple QA project teams.
• Defined the scope and plan; led and actively participated in testing.
• Interacted with developers and business analysts to effectively analyze the client's requirements.
• Reviewed business requirements documents and technical specifications, and documented test plans and test cases corresponding to business rules and other operating conditions.
• Replicated complete complex ETL jobs in single queries and tested large-volume data.
• Involved in build deployment and loading data in various environments.
• Interacted with clients.
• Acted as requirement manager for the offshore team.
Company Misys Software Solutions
Project Misys Business Intelligence(MBI)
Duration Dec2010 – AUG2012
Size 20
The goal of Misys Business Intelligence (MBI) is to deliver operational performance insights and analysis of a bank to management at all levels of the business, in subject areas such as profitability, balance sheet, maturity, interest rate, and credit risk analysis, through dashboards, scorecards, reporting, and basic analytics.
Misys has core banking and treasury & capital markets products such as BankFusion Equation, BankFusion Universal Banking, LoanIQ, and SUMMIT, which are used by many banks and act as source systems for the data warehouse; MBI extracts the data from these source systems, transforms the data according to business rules, and loads it into the data warehouse.
MBI enables executive management to use this information more accurately, mitigate risk, aid enterprise business goals of delivering revenue growth and increasing margins, and analyze the associated risks.
Projects: BankFusion Equation (BFEQ), BankFusion Universal Banking (BFUB), MIDAS, etc.
Roles and Responsibilities
• Acted as a single point of contact for all testing deliverables.
• Led a team of 10 members (Equation and Automation).
• Interacted with developers and business analysts to effectively analyze the client's requirements.
• Reviewed business requirements documents and technical specifications, and documented test plans and test cases corresponding to business rules and other operating conditions.
• Replicated complete complex ETL jobs in single queries and tested large-volume data.
• Tested the Cognos reports by writing complex queries.
• Created a Business Intelligence Test Automation Framework which helped the team reduce time spent on regression testing.
• Did estimations based on the defined scope.
• Maintained the test environment, deployed the code, and ran the ETL jobs.
• Implemented project code in the client production environment.
• Performed UAT along with clients.
Platform: Data stage 8.5, Cognos 10.0, SQL SERVER 2005, SQL SERVER 2008 R2.
Company Convergys IMG Pvt Ltd
Client Enterprise Information Systems (CM LOB – EIS)
Project Enterprise Information Services (EIS)
Duration April 2007 – Dec2010
Size 9
The goal of Enterprise Information Services (EIS) is to deliver operational performance insights and analysis to
management at all levels of the business. Through Dashboards, Scorecards, Reporting and basic analytics, EIS is
enabling executive management to use this information more accurately, mitigate risk, aid in enterprise business
goals of delivering revenue growth and increasing margins.
The EIS project helps customer care operations with accurate forecasting, efficient scheduling, intelligent call routing, and proactive intra-day management of customer contacts and agent resources. These reports are used by everyone from team leaders to the CEO to effectively manage customer care operations.
Projects: MDSAYF, NICE Quality Reports, Timekeeping Reports, Work force Management Reports, Agent
proficiency Reports, Graduation Reports.
Roles and Responsibilities
• Interacted with end users, developers, and business analysts to effectively analyze the client's requirements.
• Acted as project lead for a 9-member team and as the single point of contact for all deliverables.
• Reviewed business requirements documents and technical specifications, and documented test plans and test cases corresponding to business rules and other operating conditions.
• Carried out ETL testing using OWB and Data Manager.
• Involved in validation of OLAP unit testing on Cognos 8.4 and Business Objects.
• Involved in system testing, database testing, and drill-down feature testing of the OLAP reports.
• Performed performance testing of Cognos reports (web) with Web Performance Load Tester.
Platform: Oracle Warehouse Builder 10g (ETL), Cognos Data Manager, Cognos 8.1 Report Studio, Oracle 10g (Database)
Company Tech Mahindra (MBT)
Client AT&T
Project AT&T-Rating & Billing Application
Duration May2005 – March 2007
Size 5
AT&T SPA-VI (Sales Productivity Application) for AT&T (SBC, Southwestern Bell Corporation) uses the billing system Geneva 5.0, a Convergys product, to bill customers for their usage of the service.
The AT&T Rating and Billing system is a customization project in which different modules are customized to AT&T-specific requirements, covering the full chain from Customer Service Management through order management, offer management, rating of usage, billing of usage, and bill generation.
Roles and Responsibilities:
• Acted as module lead for some systems.
• Analyzed user requirements and developed test plans.
• Developed automated scripts and executed them in WinRunner.
• Performed security testing, cookie testing, functionality testing, and browser compatibility testing.
• Performed performance testing on web-based applications using OpenSTA.
• Worked in a UNIX environment to execute scripts.
Platform: UNIX, VMS