Less01 1 introduction_module
Transcript

  • 1. Oracle Application Testing Suite: Introduction
  • 2. Topics Covered
      • FMStocks
      • Testing Concepts
      • e-Tester
      • e-Manager
      • e-Load
      • e-Reporter
      • ServerStats
  • 3. FMStocks Introduction to Oracle Application Testing Suite
  • 4. How to Access FMStocks
    • Navigate to: http://Instructor*/FMStocks in a web browser
    * See Instructor for the name of the host machine
  • 5. What Is FMStocks?
      • Fitch and Mather Stocks is a sample brokerage application
      • Used to test Microsoft ASP and .Net technologies
      • This course will refer to the FMStocks site as its sample application
      • You can download the application at http://www.fmstocks.com
  • 6. FMStocks Architecture
      • Developed for Windows 2000
      • Built using the standard 3-tiered technology approach
        • Presentation Layer – IIS and ASP
        • Business Logic Layer – COM+
        • Data Services Layer – SQL Server
  • 7. FMStocks Architecture
    • [Architecture diagram: Web Browser → Presentation tier (IIS, ASP) → Business Logic tier (business components) → Data tier (SQL Server); data access via ADO, OLE DB, and ODBC]
  • 8. FMStocks Transactions
    • Key transactions of the FMStocks application:
      • Open a new account
      • Login
      • Research a company
      • Purchase a stock
      • Sell a stock
      • Balance and portfolio data
      • Logout
  • 9. Testing Concepts Introduction to Oracle Application Testing Suite
  • 10. Importance of Testing
      • Improve the quality of the application
      • Decrease risk of users discovering problems
        • “First impressions are lasting impressions”
      • Reduce cost of fixing application (earlier in development cycle)
  • 12. Testing Overview
    • Plan Test → Record Test Cases → Implement Tests → Evaluate Tests → Repeat Tests → Track Defects
  • 13. When To Test
      • The testing process should begin early in the systems development life cycle
      • Testing should begin with a review of the requirements documents and proceed all the way through to user acceptance testing
  • 14. What Is A Test Plan?
    • Test Plan: describes the test strategy, scope, resources and schedule of testing activities. It identifies the test requirements, test cases, expected results, pass or fail criteria, and risks associated with the plan.
  • 15. Importance of Test Plan
      • Organize, manage and schedule the testing effort
      • Repeatability
      • Improve coverage and efficiency
  • 16. Components of A Test Plan
      • Test strategy
      • Test objectives & scope
      • Test requirements
      • Expected results & pass/fail criteria
      • Risk assessment & priority
      • Test cases
      • Staffing & responsibilities
      • Test deliverables
      • Miscellaneous test plan components
  • 17. Test Strategy
      • Describes the general approach of the test project.
      • Specifies which stages of testing (unit, build, and system) are addressed and which kinds of testing (functional, performance, stress, and so on) are to be performed within each stage.
      • Includes test completion criteria (that is, allow the software to progress to acceptance testing when 95 percent of test cases have been successfully completed).
  • 18. Test Objectives & Scope
    • Describe the objective for testing:
      • EXAMPLE: To locate errors and demonstrate how well the system satisfies its specifications.
    • The scope should list:
      • What features are being tested?
      • What features are not being tested (covered by another plan)?
  • 19. Exercise 1
      • Using the FMStocks site, develop a list of features to be tested.
      • Discuss the possible strategies and objectives for testing.
  • 20. Test Requirements
    • Test requirement: a behavior of the application under test that is to be verified.
      • A requirement must be verifiable!
  • 21. Classification of Test Requirements
    • Business Functions: express the purpose of the software in terms of the business it serves, and relate most directly to the user’s expectations of the software.
    • User Interface Behaviors: are standard behaviors and sets of attributes to be verified.
    • Other Functions: are not directly related to business transactions (administrative functions, functions to set operator preferences, and supporting utility functions).
  • 22. Think About The Requirements
    • Identify the most important transactions.
    • Identify the most frequently used transactions.
    • Questions to ask:
      • “What transactions will most adversely affect my business objectives if they don’t perform correctly?”
      • “What transactions are absolutely mission critical to my users?”
  • 23. Decomposition of Test Requirements
    • Create a hierarchy of high-level test requirements, or test items.
    • Break down each test item into more detailed test requirements, or features to be tested.
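    The decomposition above can be sketched as a simple nested structure. The sketch below is illustrative only: the feature names are hypothetical examples for the FMStocks "Purchase a stock" transaction, not part of the course materials.

    ```python
    # A hypothetical test-item hierarchy for one FMStocks feature,
    # decomposed into more detailed features to be tested.
    requirements = {
        "Purchase a stock": {
            "Look up ticker symbol": {},
            "Submit buy order": {
                "Valid quantity accepted": {},
                "Insufficient funds rejected": {},
            },
            "Confirmation page displayed": {},
        }
    }

    def flatten(tree, prefix=""):
        """Walk the hierarchy and yield fully qualified requirement names."""
        for name, children in tree.items():
            path = f"{prefix}/{name}" if prefix else name
            yield path
            yield from flatten(children, path)
    ```

    Flattening the hierarchy this way gives one traceable identifier per requirement, which is convenient when listing expected results and pass/fail criteria later in the plan.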
  • 24. Exercise 2
      • Choose a feature from the list developed in Exercise #1.
      • Create a test requirement hierarchy with test items, and features to be tested.
      • For each requirement, classify it as a business function, user interface, or other function.
  • 25. Expected Results & Pass/Fail Criteria
    • For each requirement, state the Expected Result.
    • Define the Pass/Fail Criteria for each requirement.
      • Functional Criteria: whether the function meets the expected result.
      • Performance Criteria: specify the response times needed by the users.
      • Operation Under Stress: describes the acceptable response to abnormal conditions such as unusually high transaction rates (e.g., does the site slow down? Does it cease to function?)
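    A functional criterion and a performance criterion can be checked together in one automated test case. The sketch below is a minimal illustration, assuming a hypothetical `fetch_portfolio` call and a 2-second response-time criterion; neither comes from the course materials.

    ```python
    import time

    MAX_RESPONSE_SECONDS = 2.0      # assumed performance criterion

    def fetch_portfolio(account_id):
        # Hypothetical stand-in for a real request to the application under test.
        time.sleep(0.01)            # simulate server work
        return {"balance": 1000.0, "holdings": ["ORCL"]}

    def run_check(account_id):
        """Pass only if both the functional and performance criteria are met."""
        start = time.perf_counter()
        result = fetch_portfolio(account_id)
        elapsed = time.perf_counter() - start
        functional_pass = result["balance"] >= 0 and "holdings" in result
        performance_pass = elapsed <= MAX_RESPONSE_SECONDS
        return functional_pass and performance_pass
    ```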
  • 26. Risk Assessment
    • For each requirement, identify the risk factor. The most important requirements to test are those with the highest risk.
      • What could cause a requirement to fail?
      • What is the effect, or impact of a test requirement failing?
      • What is the likelihood, or probability, that a test case will fail?
  • 27. Risk Assessment
    • First, list all of the causes of a failed requirement.
    • Second, for each cause, assign a number for the level of impact (that is, 1=low, 3=medium, 5=high)
    • Third, for each cause, assign a number for the likelihood of failure (that is, 1=least likely, 3=likely, 5=highly likely)
    • Finally, multiply the level of impact by the likelihood of failure to get the risk factor.
    Impact * Likelihood = Risk Factor
  • 28. Priority
    • Once you have determined the risk factor, use it to assign a priority level to each requirement.
    • The priority will indicate the order of importance.
    • Example:
      • Risk factors (1 - 8) = Low priority
      • Risk factors (9 - 15) = Medium priority
      • Risk factors (16 - 25) = High priority
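    The risk-factor arithmetic and priority mapping from the two slides above can be expressed directly in code. This is a straightforward transcription of the formula Impact * Likelihood = Risk Factor and the example priority ranges:

    ```python
    def risk_factor(impact, likelihood):
        """Impact * Likelihood = Risk Factor; both inputs on the 1/3/5 scale."""
        return impact * likelihood

    def priority(factor):
        """Map a risk factor to the example priority bands from the slides."""
        if factor <= 8:
            return "Low"
        if factor <= 15:
            return "Medium"
        return "High"
    ```

    For example, a cause with medium impact (3) and high likelihood (5) scores 15, which falls in the Medium band; a 5 x 5 cause scores 25 and is High priority.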
  • 29. Test Cases
    • For each requirement, design a test case to test it.
    • Next, develop the test case.
      • For manual tests, document each step of the test case.
      • For automated tests, document each step of the test case, and generate the script using the automation tool. (Be sure to reference the script file in the test plan).
    • Create data files (if needed) and reference their location in the test plan.
  • 30. Think About Test Cases
      • Understand how users will use the site and record scripts that reproduce these transactions.
      • Transaction-based scripts provide more meaningful results and are easier to work with.
      • Longer scripts can be more difficult to work with and debug.
  • 31. Exercise 3
      • Using the requirements from exercise #2, specify the expected results for each.
      • Prioritize the requirements. Explain the reason for the prioritization.
      • Create a test case to satisfy each requirement.
  • 32. Test Environment
    • Specify the needed and desired properties of the test environment, including:
      • Hardware (computer type, memory, hard-drive size, and so on)
      • Communications and system software
      • Bandwidth
      • Mode of usage (that is, stand-alone)
      • Other software or supplies needed (operating system, browser type, and so on)
      • Level of security
    • Identify the source for all needs not currently available.
  • 33. Staffing & Responsibilities
    • Identify team members required for completing the test plan, and list the responsibilities for each.
      • Test Manager: provides overall direction for testing.
      • Test Engineer (Design/Development): designs and develops test cases.
      • Test Engineer (Test Execution): executes test cases.
      • Test System Administrator: installs and configures software/hardware on the test machines according to the test environment specifications.
  • 34. Test Deliverables
    • Lists the test materials developed by the test team during the test cycles that are to be delivered before project completion.
      • Test plan (requirements document & test cases)
      • Defect tracking reports
      • Final release report
  • 35. Miscellaneous Components
    • Task List: specifies the tasks to be executed, the order in which they will be performed, and who will perform them. The task order should take into account the requirement priorities and which tests can be executed using the same setup.
    • Schedule: specifies the date and time that each task will be performed.
  • 36. Test Plan Example
    • Refer to the FmStocks Test Plan in Appendix A of your workbook.
  • 37. Track Defects
    • How do you identify defects?
      • A defect is any result that does not comply with the requirements document
    • It is possible to find defects that comply with the requirements document. In such cases:
      • Use common sense
      • Anything that would make a user not want to continue to use the site is a defect
  • 38. Testing Approaches Introduction to Oracle Application Testing Suite
  • 39. Functional Testing
    • Positive Testing
      • Positive testing exercises the basic functionality dictated in the requirements documents.
      • Positive testing exercises functionality using the data the user IS SUPPOSED to use.
    • Negative Testing
      • Negative testing exercises the error handling functionality of the application under test.
      • Negative testing uses data the user IS NOT SUPPOSED to use.
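    The positive/negative distinction above maps naturally onto automated test cases. The sketch below uses a hypothetical `login` function standing in for the FMStocks login page; the account data is illustrative and not from the course materials.

    ```python
    def login(username, password):
        # Hypothetical stand-in for the FMStocks login transaction.
        accounts = {"demo": "secret"}        # illustrative test data
        if not username or not password:
            raise ValueError("missing credentials")
        return accounts.get(username) == password

    # Positive test: data the user IS SUPPOSED to use.
    def test_valid_credentials():
        assert login("demo", "secret") is True

    # Negative tests: exercise the error handling with bad data.
    def test_wrong_password():
        assert login("demo", "wrong") is False

    def test_empty_input_rejected():
        try:
            login("", "")
        except ValueError:
            return True
        raise AssertionError("empty credentials should be rejected")
    ```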
  • 40. Functional Testing
    • Black Box Testing
      • Test by exercising the UI and validating that the UI returns the expected results
      • Black box testing focuses on the values returned by the UI, not on what is happening “behind the scenes.”
    • White Box Testing
      • Test by exercising the UI and validating against database entries and code break points.
      • White box testing focuses on what is happening “behind the scenes.”
  • 41. Functional Testing
    • Grey Box Testing
      • A combination of white box and black box testing.
      • Validates database entries as a result of exercising the UI without stepping through code
  • 42. Regression Testing
    • Re-executing previously performed tests
    • Usually occurs after a new build has been released to validate that no new defects have been introduced
    • Applies to all aspects of testing
      • Functional
      • Scalability
    • Regression testing lends itself to automation due to its repetitive nature
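    Because regression testing re-executes the same cases against each new build, a harness can compare every result against a recorded baseline and report only the differences. The sketch below is a minimal illustration; the case names and the `run_case` replay stub are assumptions, not part of the course materials.

    ```python
    # Baseline results recorded from a previous, accepted build (illustrative).
    baseline = {
        "login": True,
        "purchase_stock": True,
        "logout": True,
    }

    def run_case(name):
        # Hypothetical stand-in for replaying a recorded script
        # against the newly released build.
        return True

    def regression_run(cases):
        """Return the names of cases whose result differs from the baseline."""
        return [name for name, expected in cases.items()
                if run_case(name) != expected]
    ```

    An empty result list means no new defects were detected by the regression suite; any names returned point at transactions to investigate in the new build.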
  • 43. Questions?
