Ayaz Qureshi
6524 Lee Valley Drive, Apt. 203
Springfield, VA 22150
703-473-6937 (Cell)
Email: m.ayazqureshi@gmail.com
OBJECTIVE
Seeking a position as a Software QA Test Automation Engineer / QA Analyst
SUMMARY
• Strong knowledge of software development processes and methodologies
• Familiarity with testing distributed applications in large Web-based environments
• Strong knowledge of SQA and testing philosophies and methodologies
• Ability to design and implement customized test fixtures
• Solid experience with manual testing, including test planning and execution
• Familiarity with the process for releasing a test set into production
• Highly experienced in developing automated tests using test tools and scripting languages
• Extensive experience with STLC management tools Quality Center and Test Director
• Experience in authoring load, performance, and endurance test scripts in LoadRunner
• Experience in working with test automation frameworks, such as keyword and data-driven
with Quick Test Professional
• Strong SQL, PL/SQL skills
• Solid analytical and problem-solving abilities
• Knowledge of and experience working in an iterative/agile test process
• Knowledge of special testing needs as relevant to testing of mobile solutions, internet portals,
and web-based applications (security, load, application servers, differences in browsers)
• Expert in industry standard software development methodologies and life-cycles
• Knowledge of test planning, product verification, product validation, and test automation
across the implementation phases of the development methodology and life cycle
• Ability and desire to work in a spirited, collaborative environment
• Ability to identify and prioritize important tasks independently
• Self-motivated, willing to learn new concepts and technologies, and able to produce results quickly
TECHNICAL SKILLS
STLC Tools: Quick Test Professional, LoadRunner, Quality Center, Test Director, Selenium, JMeter, ALM Performance Center, Sprinter, Unified Functional Testing
Programming Languages: VB.NET, Java, ASP, JSP, J2ME, Visual Basic, JavaScript, VBScript
Application Software: Microsoft Visio, VSS, Mercurial Hg, Excel, Word, PowerPoint
Databases: Oracle, Microsoft Access, Microsoft SQL Server, DB2, Sybase
Tools: SQL*Loader, TOAD, SQL Analyzer, SQL Profiler
Operating Systems: Windows XP, Windows 2003, UNIX, Linux, MS-DOS
Others: IIS, Tomcat/Apache, UML, Web Services, IE, Firefox, Opera
PROFESSIONAL EXPERIENCE
Capital One, Richmond, VA
July 2013 – Present
Software Automation Tester
Job Description:
Perform quality assurance, quality control, and security tests for system designs, processes,
and security features
• Automated test cases using Selenium WebDriver and the TestNG framework
• Production support: performed smoke testing on all production updates in various environments;
wrote scripts and planned and executed tests, including automation with Selenium
• Used automated testing tools such as JUnit and Selenium to conduct system, integration, user
acceptance, positive and negative, functionality, object, and regression tests
• Design, create, and customize scripts using various scripting languages and testing tools, such
as JavaScript, Selenium with Java, JUnit, TestNG, and QTP 11, for data-driven network
systems and other Java-based applications
• Worked as an Automation Tester responsible for the development and maintenance of
automation frameworks, tools, and solutions; managed and coordinated onsite/offshore
functional test efforts and automated functional testing
• Write and execute automation test scripts in UFT and Selenium
• Participate in the automated testing tool vendor selection process. Conduct a Pros & Cons
analysis of HP QTP and Selenium
• Performed manual and lightweight Selenium IDE script-driven sanity, regression, and cross-
browser testing to ensure consistency
• Create solutions to improve scripts by designing new functions, synchronization threads and
processes, and check points
• Test system requirements for bugs and glitches using various web-based test management
software such as ALM Quality Center 11.00
• Analyze system designs, requirements, and documentation to effectively develop test scripts,
and test specific scenarios for required levels of security and quality-control testing
• Identify and resolve technical problems with systems by comparing newly designed project
interface requirements with current interfaces in the mainframe-based legacy system
• Analyze and verify data requirements and layout reports for various database designs and
other systems
• Collaborate with business users and customers to clarify system requirements to improve
the user interface and the design and development of the system processes
• Work directly and independently with customers to perform usability testing to thoroughly
review and test scripts
• Perform complex analysis and testing support for government clients by executing
regression and system testing and manually integrating system improvements
• Analyze physical system designs to develop system test plans and outline an estimated
timeline for test schedules
• Examine business and operational requirements for new and modified systems
• Review system specifications for design functionality and user documentation ensuring
functionality coordinates with user instructions
• Work closely with development team to identify and resolve any system-related problems;
discuss solutions and make recommendations to senior leaders; implement and test these
solutions
• Used LoadRunner for performance and stress testing of the application to improve its
efficiency and scalability, measured hits per second and response time.
• Installed and configured LoadRunner and recorded Vuser scripts for various scenarios
• Responsible for implementing LoadRunner, Performance Center, and JMeter based load-testing
infrastructure, including architecting the environment and integrating hardware and
software with LoadRunner
• Prepared test cases, VuGen scripts, load tests, and test data; executed tests, validated results,
managed defects, and reported results
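The synchronization points and checkpoints mentioned above can be sketched as a minimal polling wait in Java, similar in spirit to an explicit wait in Selenium. This is an illustrative sketch only, not project code, and all names are hypothetical:

```java
import java.util.function.BooleanSupplier;

// Illustrative sketch only: a polling "synchronization point" helper,
// similar in spirit to an explicit wait in Selenium. Names are hypothetical.
class WaitUtil {

    // Polls the condition until it returns true or the timeout elapses.
    // Returns true if the condition was met in time, false otherwise.
    static boolean waitUntil(BooleanSupplier condition, long timeoutMillis, long pollMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return true;
            }
            try {
                Thread.sleep(pollMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return condition.getAsBoolean(); // one final check at the deadline
    }
}
```

A call such as `WaitUtil.waitUntil(() -> page.isLoaded(), 5000, 250)` (where `page` is hypothetical) would block until the page reports ready or five seconds elapse.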
Verizon, Richmond, VA
May 2011 – June 2013
Quality Assurance Analyst
Job Description:
• Informed supervisor of important developments and obtained guidance and direction on
individual assignments
• Represented the company through customer visits and consultation for the solution of
technical problems
• Conceived ideas and developed testing events and actions for products to meet objectives
• Performed business analysis in accordance with established theories and methods
• Planned, designed, and conducted lab tests of developmental and competitive products
• Accountable for complete results on development projects and special functions within
assigned area
• Prepared proposals to supervisor on new product designs and project modifications
• Communicated technical results and information effectively both in written and oral form
• Maintained lab equipment, instruments, and resources and used them efficiently
• Provided direction for design and drawing of products, product components and test
apparatus
• Built, and directed the fabrication and procurement of, prototypes and test
equipment
• Developed and created master test plans and related documents, test cases, and test
schedules
• Executed test cases and test scenarios across development projects
• Involved in functionality, user interface, regression, security, and UAT testing
• Identified and tracked defects, issues, risks, and action items
• Validated requirements for system testing, report preparation, defect recording, and defect
tracking
• Performed regression testing to validate the resolution of any software or system defects
• Used Quality Center, a web-based test management tool, for centralized control over the
entire testing life cycle
• Wrote and executed SQL queries to interpret test results and create test data
• Created, enhanced, and maintained a high-end object repository for various functional and
regression tests using Quick Test Professional
• Executed written test case scenarios, including manual, automated, and data-driven
regression testing, and GUI verification by using Quick Test Professional (QTP).
• Developed Keyword Driven and Data Driven Frameworks test scripts using VBScript
• Used LoadRunner for performance and stress testing of the application to improve its
efficiency and scalability, measured hits per second and response time.
• Installed and configured LoadRunner and recorded Vuser scripts for various scenarios
• Analyzed performance and load test results using LoadRunner Analysis and performance monitors
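The keyword-driven framework pattern described above was built in VBScript for QTP; as an illustration of the pattern only, a keyword-to-action dispatcher can be sketched in plain Java. The keyword names and return values here are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Illustrative sketch only: a keyword-driven step dispatcher. The real
// frameworks were written in VBScript for QTP; this plain-Java version
// shows the pattern, and the keyword names are hypothetical.
class KeywordRunner {

    private final Map<String, Function<String, String>> keywords = new HashMap<>();

    KeywordRunner() {
        // Each keyword maps one test-data argument to a result string.
        keywords.put("OpenPage", url -> "opened:" + url);
        keywords.put("TypeText", text -> "typed:" + text);
        keywords.put("VerifyTitle", title -> "verified:" + title);
    }

    // Executes one row of the keyword table; unknown keywords fail fast.
    String execute(String keyword, String arg) {
        Function<String, String> action = keywords.get(keyword);
        if (action == null) {
            throw new IllegalArgumentException("Unknown keyword: " + keyword);
        }
        return action.apply(arg);
    }
}
```

Test data then lives in a table of (keyword, argument) rows, so new scenarios can be added without touching the dispatcher code.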
GEICO, Cleveland, OH
November 2009 - April 2011
Automation Test Engineer
Job Description:
• Translated user stories and collected test requirements into test cases and test
procedures, with emphasis on automation testing and scripting
• Applied the set of operations, and disciplines for the planning, analysis, design and
construction of information systems across a major sector of the organization
• Participated in test case coverage, test case design, and script design and reviews
• Developed test scripts along with maintaining and enhancing the automated test
framework supporting a continuous integration environment with automated smoke and
regression testing
• Ensured high test and code coverage, maintainability of scripts, reliability of equipment,
and overall robustness of environment and solution during the entire development cycle
• Responsible for performing analysis of requirements, writing requirements verification
points and providing feedback on requirement testability
• Responsible for a thorough understanding of the project's test environment(s) and the
project's policies for working in the test environment(s)
• Responsible for installing software into and upgrading test environments (hardware and
software) including in-house applications and 3rd party applications
• Responsible for integration testing applications as appropriate to use on the internet portal
• Responsible for working with a team including development, system engineering and
customer representatives in combined integration test efforts
• Involved in continuous support of overall software quality and testing with continuing
refactoring of scripts and test cases as required and enhanced test coverage (system,
performance, interoperability, stress, negative testing, etc.)
• Clearly logged defects, maintained test data and results, and monitored/analyzed automated
test runs and reports
• Supported the identification and debugging of software defects and championed the resolution
of bugs and issues
• Wrote SQL and PL/SQL scripts for RDBMS testing, especially CRUD operations, and verified
ACID properties with SQL queries on the databases
• Responsible for developing manual test cases in HP Quality Center and executing tests
according to software test processes and procedures
• Responsible for developing automated test cases within Quick Test Professional and custom
scripting as appropriate to the test case
• Used Quality Center to manage and organize STLC activities like Requirements coverage,
Test Case Management, Test Execution Reporting, Defect Management, and Test
Automation
• Used automated test scripts designed and defined by VBScript
• Used LoadRunner to create Vuser scripts in VuGen and used the Controller to generate and
execute LoadRunner scenarios
• Provided test status reports to senior management
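The CRUD verification described above ran as SQL/PL-SQL against a live RDBMS; as a minimal illustration only, the same create/read/update/delete assertions can be sketched against an in-memory stand-in:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: the real CRUD checks ran as SQL/PL-SQL against
// an RDBMS; this in-memory stand-in shows the same create/read/update/delete
// assertions in miniature.
class CrudCheck {

    private final Map<Integer, String> table = new HashMap<>();

    void create(int id, String value)  { table.put(id, value); }
    String read(int id)                { return table.get(id); }
    void update(int id, String value)  { table.replace(id, value); }
    void delete(int id)                { table.remove(id); }
    int rowCount()                     { return table.size(); }
}
```

A test would create a row, read it back, update and re-read it, then delete it and confirm the row count dropped, mirroring the INSERT/SELECT/UPDATE/DELETE checks run against the real database.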
CareFirst, Ashburn, VA
May 2008 – October 2009
QA Engineer
Job Description:
• Interacted with stakeholders, development teams, end users, and business
analysts to understand and analyze business requirements and develop software
testing documentation
• Analyzed and documented the software specifications for both the client-facing and internal
Windows and web applications
• Verified the requirements and business functionalities
• Designed, developed, and implemented business logic architecture and object-oriented
testing for mid-size and large Windows/web-based information retrieval systems and database-
driven applications
• Collaborated with user interface team, developers and architects to design and develop
functionally rich, robust, user friendly applications as defined by business requirements
• Responsible for entering defect reports in the project's approved defect tracking system
• Responsible for providing information as requested in a timely manner
• Responsible for escalating schedule and process issues
• Participated in the creation of standardized and project-specific plans and procedures for
testing
• Participated in developing project-schedules with well-defined tasks, deliverables, time
estimates and required resources
• Supported and followed software development methodologies and life cycles
• Supported and followed software development standards and procedures relevant to
automated testing
• Planned and organized the testing process, created a database of manual and automated tests,
and built test cycles using Test Director
• Conducted functionality and regression testing during the various phases of the application
using Quick Test Professional
• Wrote automation test scripts using VBScript
• Developed complex SQL scripts using SQL queries for database testing
• Performed other related functions, including special projects, as required and requested
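The data-driven regression testing described in these roles can be sketched as a loop over input/expected rows that collects mismatches instead of stopping at the first failure. This is an illustrative sketch only; the uppercase transform stands in for the real system under test:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: a data-driven check that walks rows of
// input/expected pairs and collects mismatches rather than aborting on the
// first failure. The uppercase transform is a stand-in for the real
// system under test.
class DataDrivenCheck {

    // Returns a description of every row whose actual result did not match.
    static List<String> run(String[][] rows) {
        List<String> failures = new ArrayList<>();
        for (String[] row : rows) {
            String input = row[0], expected = row[1];
            String actual = input.toUpperCase();   // stand-in system under test
            if (!actual.equals(expected)) {
                failures.add(input + " -> " + actual + " (expected " + expected + ")");
            }
        }
        return failures;
    }
}
```

In practice the rows would come from an external data source (a spreadsheet or database table), so coverage grows by adding data rather than code.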
REFERENCES
Available upon request