P. Rama Mohan Reddy E-Mail: prmreddy@gmail.com
Senior Testing Engineer Mobile: +91-9900574192
OBJECTIVE:
To pursue a challenging career in an esteemed organization, with continuous learning, research and contribution to
organizational growth.
PROFESSIONAL SUMMARY:
Having 8 years of experience in Software Testing, which includes 4 years of ETL/DWH and Database Testing and 4 years of
Manual Testing.
Experience in Data Warehousing Concepts.
Knowledge on INFORMATICA POWER CENTER 9.5/8.6 tool.
Experience in ETL Testing/Data Warehouse Testing using INFORMATICA POWERCENTER and SQL.
Performed Validations on Dimension and Fact Tables based on mapping logic.
Performed Validations on SCDs (Slowly Changing Dimensions) based on Project Architecture.
Performed Validations on INITIAL and INCREMENTAL LOADS.
Experience in testing Enterprise Business Intelligence Reports and Dashboards developed by using OBIEE reporting tool.
Experience in Execution of ETL SQL test scripts.
Good exposure to Data Warehousing concepts like Dimension and Fact tables, Star and Snowflake schemas, and SCDs.
Experience in FUNCTIONAL, INTEGRATION, SYSTEM, REGRESSION and SMOKE Testing.
Experience in LOCALIZATION/GLOBALIZATION Testing.
Having good knowledge in DATABASE (SQL) Testing.
Experience in AGILE Testing Process.
Knowledge on UNIX commands.
Complete understanding of SDLC, STLC, DEFECT LIFE CYCLE and TESTING METHODOLOGIES.
Exposure to leading teams of 4 to 6 members.
Good analytical skills and innovative problem-solving ability.
Good communication and inter-personal skills.
Highly motivated professional with integrity and commitment to quality and perfection.
Experience in the Bug Life Cycle & reporting defects in bug tracking tools.
Experience in testing applications in various industry-domains like Affiliate Marketing, Health Care, DWH and BI
Reporting Applications.
Experience in Defect Tracking Tools like JIRA, APTEST Manager and Net Results Tracker.
Experience in Writing, Reviewing and Executing Test Cases and developing Traceability Matrix and Test Coverage reports.
Prepared status summary reports with details of executed, passed and failed test cases.
Efficiently led the team to meet the derived project and organizational goals.
Participated and Prepared Project level Test Plan and Test Estimations.
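The dimension, fact and SCD validations summarized above typically reduce to source-to-target SQL comparisons. A minimal sketch of two such checks, using Python's built-in sqlite3 with hypothetical src_customer/dim_customer tables (the table and column names are illustrative, not from any actual project):

```python
import sqlite3

# In-memory DB with a hypothetical source table and SCD Type 2 dimension.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_customer (cust_id INTEGER, name TEXT);
CREATE TABLE dim_customer (cust_id INTEGER, name TEXT, is_current INTEGER);
INSERT INTO src_customer VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO dim_customer VALUES (1, 'Asha', 1), (2, 'Ravi', 1), (2, 'Ravi K', 0);
""")

# Check 1: source row count must match the count of current dimension rows.
src_count = con.execute("SELECT COUNT(*) FROM src_customer").fetchone()[0]
tgt_count = con.execute(
    "SELECT COUNT(*) FROM dim_customer WHERE is_current = 1").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# Check 2 (SCD2 integrity): no business key may have more than one current row.
dupes = con.execute("""
    SELECT cust_id FROM dim_customer
    WHERE is_current = 1 GROUP BY cust_id HAVING COUNT(*) > 1
""").fetchall()
assert dupes == [], f"keys with multiple current rows: {dupes}"
print("validations passed")
```

The same queries run unchanged in Toad against Oracle; sqlite3 is used here only to keep the sketch self-contained.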
ACADEMIC PROFILE:
Master of Computer Applications (M.C.A) from OSMANIA UNIVERSITY, HYDERABAD (A.P).
Bachelor of Science (B.Sc.) from SRI KRISHNA DEVARAYA UNIVERSITY, ANANTAPUR (A.P).
TECHNICAL SKILLS:
Operating Systems : Windows 7, Windows XP, UNIX
Database : Oracle, DB2, MySQL, MS SQL Server
Defect Tracking Tool : Net Results Tracker (NRT)
Test Management Tools : APTEST Manager, Quality Centre
Project Management Tool : JIRA
SCM Tools : Microsoft Visual SourceSafe
ETL Tool : INFORMATICA 9.5/8.6
Reporting Tool : OBIEE
CAREER PROFILE:
Worked as ETL Tester at CTS, Bangalore, through SCORG International Consultancy from May 27 to July 26, 2016.
Worked as Test Lead at LiquidHub, Hyderabad, through Tech Minfy Consultancy Services from Sep 2015 to Feb 2016.
Worked as Senior Testing Engineer at Unilever, Bangalore, through TestLabs from January 2014 to February 2015.
Worked as Senior Testing Engineer at Nous Infosystems Pvt. Ltd, Bangalore, from February 01, 2007 to July 2013.
PROJECT DETAILS:
#1:
Project Name : NUPLAZID - Data Aggregator Management Solution
Client : ACADIA Pharmacy, USA.
Duration : September 2015 till date
Role : Test Lead & ETL Tester
Team Size : 4
Database : Oracle
Tools : TOAD, JIRA
Environment : Windows XP, INFORMATICA 9.5, TABLEAU
Description:
Acadia will be launching a new product, 'NUPLAZID', dealing with Parkinson's Disease Psychosis (PDP). As part of the 'Nuplazid'
launch, Acadia will be collecting referral, shipment, inventory and patient information from specialty pharmacies (SPP) containing
Nuplazid prescription data, shipment and inventory information from specialty distributors (SD), Patient Services information
from LASH Services (HUB), Patient Tokenization information from MSA, and Payer Master information from DRG, and sending it to
LiquidHub (LH). LiquidHub's Liquid Analytics Data Aggregation solution will integrate the Specialty Pharmacies (SPP), Specialty
Distributors (SD), LASH Services (HUB), Patient Tokenization from MSA and Payer Master information from DRG to provide quality
and actionable data that will be used for Analytics and Reporting in support of the 'Nuplazid' product launch.
All these partners (SPP, SD, LASH, MSA, and DRG) will send the data through text files (.txt) to LiquidHub through ACADIA.
LiquidHub's Liquid Analytics Data Aggregator Management Solution will process these files for cleansing and integration.
Once cleansed and integrated, LiquidHub will send the finalized transactions and extracts to the respective clients, where
they will be integrated into the Client Data Warehouse that will be used for Analytics and Reporting in support of the
'Nuplazid' product launch.
Responsibilities:
Involved in functional study of the application and understanding the business requirements
Receiving file layouts from SPPs, SDs and LASH Services (HUB)
Preparing the Master List of Columns (MLOC) sheet for all the received file layouts
Receiving Field Requirements and Field Validations documents from SPPs, SDs and LASH Services (HUB)
Preparing Quality Control Checks (QC Checks) for the Field Validation documents
Preparing test cases for all QC Checks
Explaining the file layouts, Field Requirements and Field Validations documents to team members
Assigning tasks to the team on a daily basis
Consolidating the tasks and sending the Daily Status Report
Consolidating daily tasks and sending the Weekly Status Report
Sending issues and escalations to the Onsite Coordinator on a daily/weekly basis
Interacting with the Onsite Coordinator to resolve issues
Attending daily calls with the onsite team to discuss project implementation status
Attending status meetings with the Manager twice a week and updating project progress, issues and escalations
Preparing test scenarios and creating test data for these scenarios
Creating sample source files for the initial run, running PMCMD commands to process files through Landing, Staging,
ODS and DWE, validating the data at each level and generating test extracts
Collecting sample source files for the initial run from clients, running PMCMD commands to process files through Landing,
Staging, ODS and DWE, validating the data at each level, generating test extracts and then sending them to the client for
review and approval
Running SQL queries in Toad to validate the data at each level: Landing, Staging, ODS and DWE
Updating existing test cases and test requirements as per the new enhancements to the Field Requirements and Field
Validation documents
Reporting open defects and execution results through the daily status report to onsite people, the project manager and
internal team members
Coordinating with the team to create daily test execution results
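The QC checks on incoming partner files described above amount to per-field rules applied to every record of a delimited text file. A minimal sketch of that idea; the pipe-delimited layout, the field names and the three rules are purely illustrative, not the actual SPP/SD/HUB layouts:

```python
import csv
import io
import re

# Hypothetical 3-field pipe-delimited layout: patient_id, ship_date, qty.
QC_RULES = {
    "patient_id": lambda v: v != "",                    # mandatory field
    "ship_date": lambda v: re.fullmatch(r"\d{8}", v),   # YYYYMMDD date stamp
    "qty": lambda v: v.isdigit() and int(v) > 0,        # positive integer
}

def qc_check(text):
    """Return a list of (row_number, field, value) QC failures."""
    failures = []
    reader = csv.DictReader(io.StringIO(text), delimiter="|")
    for row_no, row in enumerate(reader, start=1):
        for field, rule in QC_RULES.items():
            if not rule(row[field]):
                failures.append((row_no, field, row[field]))
    return failures

sample = "patient_id|ship_date|qty\nP001|20160115|3\n|20161301x|0\n"
print(qc_check(sample))
# -> [(2, 'patient_id', ''), (2, 'ship_date', '20161301x'), (2, 'qty', '0')]
```

A clean file returns an empty list, which is the pass condition that would feed the daily QC report.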
#2:
Project Name : RSUNIFY
Client : Hindustan Unilever Limited
Duration : January 2014 to February 2015
Role : Senior Testing Engineer
Team Size : 5
Database : MS SQL SERVER 2005
Bug tracker : BugTracker.NET
Environment : Windows 7, Windows XP, Delphi, Report Builder, GODB,
Integrated Applications : CU (Central Unify), COMM (Common Outlet Master Management), BIW (Business
Information Warehouse), Quantum & Shakti
Description:
RSUNIFY is a billing package that completely automates all operations of a Redistribution Stockist (RS) of HUL. Unify is a mini
ERP solution that resides in RS offices and communicates with a central server (called Central Unify) to receive all HUL information
pertaining to the RS's operations, such as product and price masters, schemes, etc., and thus dramatically brings down the time
required for communication of OPS/price changes. It also enables HUL to pick up information such as Claims and Sales MIS, thus
enabling faster claim settlement for RSs as well as actionable information for the HUL sales team.
Responsibilities:
Functional study of the New Requirements / Change Requirements (Enhancements) of the application
Attending briefing sessions for New Requirements / Change Requirements with the Development Team
Understanding the New Requirements / Change Requirements and preparing test scenarios and test cases
Conducting peer reviews of test cases within the team and updating review comments in the Test Case Review log
Preparing test data for the test cases
Functional execution of test cases for New Requirements / Change Requirements in interim builds and the alpha build
Finding defects and reporting them in the bug tracking tool BugTracker.NET
Defects were tracked, reviewed and analyzed using the defect tracking tool BugTracker.NET to close them before the alpha build
Performing fresh DB testing of RSUNIFY after release for use by newly created Redistribution Stockists (RSs)
Performing integration testing with the interlinked applications GODB, CU (Central Unify), COMM (Common Outlet Master
Management), BIW (Business Information Warehouse), Quantum & Shakti.
#3:
Project Name : Analytics Reporting
Client : LinkShare Corporation
Duration : January 2011 to July 2013
Role : Senior Testing Engineer
Team Size : 4
Database : Oracle, DB2
Tools : TOAD, APTEST Manager and Net Results Tracker
Environment : Windows XP, PHP, INFORMATICA 8.6, OBIEE
Description:
Rakuten LinkShare is a leading provider of full-service online marketing solutions specializing in the Affiliate
Marketing domain. LinkShare created a DW for their Publishers' and Advertisers' reporting and analysis. The DW follows a star
schema model. In this project the source data comes from source systems (Merchandiser files, Auditor tables), is processed into
a staging DB (ODS) and from there into the EDW (Dimension and Fact tables). Reports pull data from the Data Warehouse and publish
it in Publisher Reporting, Advertiser Reporting, Analytics Reporting and Custom Reporting. The data in the data warehouse is
processed into a target database for reporting and analysis. INFORMATICA is used as an ETL tool to load data from different data
sources, such as databases and flat files, into the target database.
The LinkShare Analytics applications provide in-depth reporting and analysis beyond the reports that are included in the
interfaces. They allow publishers and advertisers to view information organized in one or more pages, which appear as tabs across
the top of the application and can display anything that is accessible with a web browser, such as reports, trend graphs, images,
charts, tables, text, and links to web sites and documents. The applications have four main areas, which correspond to the main
business processes: Network Transaction Analysis, Offer Analysis, Partner Analysis, and Consumer Analysis. The reports and
analysis in each of these areas are customized so that they return only the information that pertains to a given user.
Responsibilities:
Involved in functional study of the application and understanding the business requirements in order to create test
requirements and test cases
Test case walkthrough meetings with onsite people
Running the jobs/workflows for the ETL process to sync up data between the Oracle database and the ODS and DW tables
Verifying the data in the target database after the ETL process
Performed column data mapping between the Source and Target databases
Interacting with onsite people to resolve issues
Performing Functional, Regression, GUI, Localization/Globalization, Smoke and Exploratory testing on the Affiliate and
Merchant Analytics Reporting interfaces
Running SQL queries in Toad to validate the UI report data against the DB results
Defects were tracked, reviewed and analyzed using the defect tracking tool Net Results Tracker
Coordinating with the team to create daily test execution results
Updating existing test cases and test requirements as per the new enhancements done in the application
Open defects and execution results were updated through the test execution status report to onsite people, the project
manager and internal team members on a daily basis
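Validating a UI report against the database, as in the Toad checks above, comes down to re-deriving the report's aggregates with SQL and diffing them against what the interface displays. A small self-contained sketch; the fact_sales table, the advertiser names and the captured UI figures are all hypothetical:

```python
import sqlite3

# Illustrative fact table standing in for the reporting DW.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE fact_sales (advertiser TEXT, amount REAL);
INSERT INTO fact_sales VALUES
  ('Acme', 100.0), ('Acme', 50.0), ('Globex', 75.0);
""")

# Figures captured from the UI report (hypothetical values).
ui_report = {"Acme": 150.0, "Globex": 75.0}

# Re-derive the same aggregates directly from the database.
db_report = dict(con.execute(
    "SELECT advertiser, SUM(amount) FROM fact_sales GROUP BY advertiser"))

# The report passes only if both sides agree row for row.
mismatches = {k: (ui_report.get(k), db_report.get(k))
              for k in set(ui_report) | set(db_report)
              if ui_report.get(k) != db_report.get(k)}
assert mismatches == {}, mismatches
print("report matches DB")
```

Any key present on one side but not the other, or any differing total, lands in mismatches and becomes a defect to log in the tracker.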
#4:
Project Name : Global Data Warehouse (GDW)
Client : PSI Global
Duration : April 2010 to December 2010
Role : Testing Engineer
Team Size : 4
Tools : Informatica 8.6
Environment : Oracle 10g, Windows XP, TOAD, QC
Description:
GDW is a centralized in-bound and out-bound data system. The data in the GDW comes from different source systems
like the Project Management System (PMS), Resource Allocation System (RAS), Work Request Management (WRM) and Project
Accounting and Reporting (PAR) systems. It is difficult to make organization-level decisions from the reports of a single
system. GDW is basically a Project and Portfolio Management System, which supports organization-level decision making for
cost and budget plans.
The data in the DWH is processed into OLAP data marts for reporting and analysis purposes. Informatica is used as an ETL tool
for loading data into the data warehouse from different data sources such as OLTP databases and flat files.
Responsibilities:
Involved in designing, reviewing and executing test cases
Running the jobs/workflows for the ETL process
Prepared and executed SQL queries to verify the Dimension and Fact tables
Verifying the ETL data in the target database
Verified column mapping between source and target
Reporting daily testing status
Analyzing defects and reporting them in QC
#5:
Project Name : Custom Reporting
Client : LinkShare Corporation
Duration : July 2009 to March 2010
Role : Testing Engineer
Team Size : 3
Tools : APTEST Manager and JIRA
Environment : Windows XP
Description:
LinkShare Corporation is implementing a new BI Reporting feature called the Custom Reporting interface, integrated into the
Advertiser Dashboard, using Agile Methodology for Advertisers. Using this custom reporting interface, Advertisers can create
their own reports for a custom date range and for any filter conditions, using different columns under different categories.
Responsibilities:
Handling a 3-member team
Involved in daily scrum calls with onsite people
Sending out a daily test execution report to project stakeholders with open defects
Understanding the business/functional requirements, which are created as user stories (story IDs) in JIRA
Preparing and reviewing test cases for the JIRA stories and uploading them into the APTEST Manager tool
Executing test cases and updating test results with screenshots in the APTEST Manager tool
Reporting bugs in NRT and creating new story IDs for the confirmed bugs in JIRA by selecting the required fields
Interacting with onsite people over phone, email and chat on a daily basis to discuss story demos, clarifications and issues
and get solutions
Working closely with onsite people to resolve issues and accomplish the completion of tasks on time without impacting
the timelines set by clients
#6:
Project Name : iBio
Client : ICON
Duration : September 2008 to June 2009
Role : Validation TestingEngineer
Team Size : 6
Database : SQL SERVER 2008 R2
Defect Tracking Tools : Entity issue tracker tool and anomaly creation
Environment : Windows XP, ASP .Net and C#
Description:
iBio is a web-based application that handles clinical sample movements across the Lab. Clinical samples are
brought in bulk in bags called pre-login bags and are registered in this application. The samples are then kept in a freezer,
and the sample details are entered into the application by providing the IDS numbers. The samples are then moved for testing to
another lab in bags called Traveller bags. There the clinical samples are tested on equipment like Gen5, Gyros, Elisa, etc.,
using the values of Q’s and S’s defined by the “Principal Investigator”. The data in the output file (tab-delimited text, XLS
or SML) is then parsed by a Windows service. The final result analysis is done using the “BARG” reports module. As a validation
testing engineer, validating the iBio application by executing the approved test scripts in DRY RUN and functional testing
execution rounds and finding issues.
Responsibilities:
As a validation testing engineer, updating the test scripts provided by the manual testing team as per the
validation protocol and 21 CFR Part 11 assessments, and sending them to the client for final approval
Cross-verifying the requirements updated in the RTM provided by the Manual Testing team against the updated test scripts,
modifying the requirements in the RTM accordingly and sending the updated RTM to the client for final approval, module-wise
Performing Smoke Testing once a new build is received from the development team and sending the Smoke Testing issues
report to the client
Preparing a consolidated issues report for every new build against the previous builds
Validating the application by DRY RUN and reporting the issues to the development team by adding them into the Entity Issue
Tracker with detailed descriptions and screenshots
Validating the application by Functional Testing and reporting issues by creating anomalies with detailed descriptions and
screenshots
#7:
Project Name : Advertiser dashboard (Overlord)
Client : LinkShare Corporation
Duration : January 2008 to August 2008
Role : Testing Engineer
Team Size : 6
Database : Oracle, DB2
Tools : TOAD, APTEST Manager and Net Results Tracker, JIRA
Environment : Windows XP, PHP
Description:
The Advertiser Dashboard application is a web-based application used by LinkShare's Advertisers, also known as Merchants or
Retailers, across the world. An Advertiser, also known as a Merchant or Retailer, is a web site or company that sells a product
or service online, accepts payments and fulfils orders. LinkShare operates one of the largest online performance marketing
networks in the industry today, which has created millions of partnerships between advertisers and publishers. Through the
Advertiser Dashboard, LinkShare offers expert management of search marketing investments, comprehensive campaign management for
lead generation, and a wide range of solutions for establishing and growing successful affiliate marketing programs.
Responsibilities:
Involved in daily scrum calls with onsite people (Agile group)
Understanding the user stories and preparing test requirements and test cases
Interacting with onsite people over phone, email and chat on a daily basis to discuss story demos, clarifications and issues
and get solutions
Creating test sessions and performing Functional and Regression Testing
Finding defects, reporting them in NRT and tracking the bugs by status
Sending out a daily test execution report to project stakeholders with open defects
Working closely with onsite people to resolve issues and accomplish the completion of tasks on time without impacting
the timelines set by clients
#8:
Project Name : Barbarossa
Client : LinkShare Corporation
Duration : February 2007 to December 2007
Role : Testing Engineer
Team Size : 12
Database : Oracle, DB2
Tools : TOAD, APTEST Manager and Net Results Tracker
Environment : Windows XP, PHP
Description:
The Publisher Dashboard basically deals with Publishers. Publishers, also known as affiliates, place merchants' ads, text links,
or product links on their web sites, shopping engines, blogs, etc., or include them in e-mail campaigns and search listings in
exchange for commissions on leads or sales. LinkShare tools for Publishers are designed to drive conversion and to help
publishers earn more commissions. The LinkShare Publisher Dashboard is a modern, web-based environment that enables users to
track, manage, and evaluate their affiliate program activity on the Internet, from any computer, using a single interface to
analyze their partnerships with other sites while providing a simple snapshot of their earnings.
Responsibilities:
Understanding the business and functional requirements in order to create test requirements and test cases in the APTEST
Manager tool
Updating the existing test cases as per the enhancements
Performing Functional, Regression, GUI, Localization/Globalization and Smoke testing on the Publisher Dashboard application
Defects were tracked, reviewed and analyzed using the defect tracking tool Net Results Tracker
Updating the onsite people, project manager and internal team with the status on a daily basis
Updating the Daily Activity Report (DAR) on a daily basis