University of Virginia NATIONAL PARTNERSHIP FOR ADVANCED ...

  • Speaker note: Majordomo list server managed by the SDSC Builds Team: creation of the distribution, automated deployments to the testbed, and testing against those deployments.

    1. Testing and Validation of the NPACKage
       NPACI All-Hands Meeting, March 19, 2003
       Marty Humphrey, Assistant Professor, Computer Science Department, University of Virginia
    2. NPACKage Project Structure
       • Participation across the partnership
         • UCSB, SDSC, UVA, ISI
         • Dedicated staff at each site
       • Software contributors
         • Activities focused under the current NPACI allocation
       • NPACKage-specific activities
         • Packaging team
         • Builds team
         • Testing team
       • Resource leads
         • Responsible for local deployment
       • User Services
       • Executive Committee
         • Frequent review of progress
    3. What’s So Tough about Software Testing?
       • Historically, software quality has been poor
       • Even a simple program has an enormous space of possible executions
       • “Testing is not innovation but rather verification”
    4. Why Does Software Have Bugs?
       • Miscommunication
       • Software complexity
       • Programming errors
       • Changing requirements
       • Time pressures
       • Poorly documented code
       • Bugs in software development tools
    5. Basics of Software Testing
       • Unsatisfied goals
         • Find cases where the program does not do what it is supposed to do
       • Unwanted side effects
         • Find cases where the program does things it is not supposed to do
    6. Methodology
       • Define the expected output or result.
       • Don’t test your own programs.
       • Inspect the results of each test completely.
       • Include test cases for invalid or unexpected conditions.
       • Test the program to see if it does what it is not supposed to do as well as what it is supposed to do.
       • Avoid disposable test cases unless the program itself is disposable.
       • Do not plan tests assuming that no errors will be found.
       • The probability of locating more errors in any one module is directly proportional to the number of errors already found in that module.
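The principles above can be sketched as a small self-checking test; `parse_port` is a hypothetical function under test, not part of NPACKage:

```python
def parse_port(text):
    """Toy function under test (hypothetical): parse a TCP port number."""
    value = int(text)                # raises ValueError on non-numeric input
    if not 0 < value < 65536:
        raise ValueError("port out of range: %d" % value)
    return value

def test_parse_port():
    # Principle 1: define the expected output before running the test.
    assert parse_port("8649") == 8649
    # Principles 4-5: include invalid/unexpected input and verify the
    # program does NOT do what it must not do (silently accept bad ports).
    for bad in ("", "abc", "-1", "0", "70000"):
        try:
            parse_port(bad)
        except ValueError:
            continue
        raise AssertionError("accepted invalid port: %r" % bad)

test_parse_port()
```

Note that the test exercises both sides of the slide-5 distinction: the expected behavior (unsatisfied goals) and the forbidden behavior (unwanted side effects).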
    7. Terminology
       • White-box testing
         • Based on knowledge of the internal logic/source code
       • Black-box testing
         • Not white-box testing :^)
       • Unit testing
         • Tests particular functions or code modules
       • Integration testing
         • Tests combined parts
    8. Terminology (cont.)
       • Performance / stress testing
         • Good for testing scalability
       • Regression testing
         • Re-testing after modifications
       • Acceptance testing
         • Performed by end users
       • Completion criteria
         • How do you know when you’re done?
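Regression testing, in particular, can be automated with a "golden file": record the output once, then re-run after every modification and compare. A minimal sketch, in which `run_component` is a hypothetical stand-in for the software under test:

```python
import json
import os
import tempfile

def run_component(config):
    # Hypothetical stand-in for the component under test.
    return {"hosts": sorted(config["hosts"]), "count": len(config["hosts"])}

def regression_check(config, golden_path):
    """Compare current output against the saved 'golden' output; on the
    first run, record the output so later runs can detect regressions."""
    current = run_component(config)
    if not os.path.exists(golden_path):
        with open(golden_path, "w") as f:
            json.dump(current, f)
        return True                      # baseline recorded
    with open(golden_path) as f:
        golden = json.load(f)
    return current == golden             # False => behavior changed

config = {"hosts": ["b.npaci.edu", "a.npaci.edu"]}
golden = os.path.join(tempfile.mkdtemp(), "golden.json")
assert regression_check(config, golden)   # first run records the baseline
assert regression_check(config, golden)   # unchanged behavior still passes
```

The same comparison run under load (many configs, many concurrent runs) doubles as a simple stress test.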
    9. Don’t the individual projects test their software?
       • Yes, of course, but we can do more
       • Value of centralized testing
         • Complex interaction tests are better run organizationally, not in isolation
         • Incomplete testing of any component reflects poorly on the organization (NPACI)
         • Independent testing can greatly improve software quality
    10. Value Added at UVa
       • Work closely with other NPACKagers
         • NPACKage packagers (Larry Miller, UCSB; Bill Link, SDSC)
         • NPACKage “Builders” and “Testbed Deployers” (Mats Rynge, USC/ISI)
       • UVa platforms: RH Linux 7.3 (8.0?), AIX 4.3, [Solaris]
       • Combination of “black box” and “white box” testing
       • Use drivers and stubs provided by NPACKage contributors
         • Provide feedback to NPACKage contributors
       • Goal: testing deployment
         • “Given a deployment, does it work?”
         • NPACI Testbed and NPACI Production
       • Goal: testing deployability
         • Broader impact in the community
    12. Software for the NPACKage
       • NMI
         • Globus, Condor-G, NWS, GSI-OpenSSH, KX.509, GridConfig, (MyProxy, MPICH-G2)
       • DataCutter
       • Storage Resource Broker (SRB)
       • GridPort
       • Ganglia
       • APST
       • GridSolve
       • LAPACK for Clusters (LFC)
    13. Testing of NMI: SURA
       • UVa: part of the SURA NMI Testbed
         • Globus
         • Condor-G
         • NWS
         • GSI-OpenSSH
       • UVa is co-PI on the MyProxy NMI grant
    14. Testing of DataCutter
       • Last extracted from CVS on March 11, 2003
       • Tests from the developers:
         a. Make sure the daemons start
         b. Buffer test
         c. Instance test
         d. Returns test
         e. Crash test
         f. Endian test
         g. Placement test
         h. Layout test
         i. Cluster test
         j. Directory test
         k. Grid array averager
       • Thanks to Shannon Hastings
    15. Testing of SRB
       • Just received SRB 2.0.0/2.0.1 early this week
       • Plan
         • Exercise the unit tests in SRB (awaiting descriptions)
         • Interactions with other NPACKage components
           • DataCutter
           • GridPort
           • GridFTP
           • GSI
         • Many clients, server
       • Thanks to Reagan Moore, Wayne Schroeder
    16. Testing of GridPort
       • Last extracted (v2.2) from CVS on March 14, 2003
       • Run the test Perl scripts from the distribution
       • We use GridPort in our Alpha Project
         • “Protein Folding on the Grid” (with C. Brooks, M. Crowley of TSRI)
       • Thanks to Maytal Dahan, Mary Thomas
    17. Testing of Ganglia
       • Last extracted from CVS on March 11, 2003
       • Currently executing on 9 nodes of UVa’s Centurion cluster
       • Tests
         • Start a gmond daemon and pull its XML tree (“telnet localhost 8649”); verify well-formedness with “xmllint --noout out.xml”
       • Planned
         • The ganglia-python package from SDSC’s Rocks group includes an MDS provider that obtains data from Ganglia
         • Is the information correct?
       • Thanks to Federico Sacerdoti, Phil Papadopoulos
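The manual telnet-and-xmllint check above can be scripted. A minimal sketch: gmond dumps its XML tree to any client that connects on port 8649, so no request needs to be sent; `SAMPLE` below is a simplified stand-in for a real gmond tree, not captured output:

```python
import socket
import xml.etree.ElementTree as ET

# Simplified, hypothetical gmond output for offline testing.
SAMPLE = """<GANGLIA_XML VERSION="2.5.0" SOURCE="gmond">
 <CLUSTER NAME="Centurion" OWNER="UVa">
  <HOST NAME="node01" IP="10.0.0.1">
   <METRIC NAME="load_one" VAL="0.25"/>
  </HOST>
 </CLUSTER>
</GANGLIA_XML>"""

def pull_gmond_xml(host="localhost", port=8649):
    """Connect to a gmond daemon and read its XML tree; gmond writes
    the tree and closes the connection, so we just read until EOF."""
    chunks = []
    with socket.create_connection((host, port), timeout=5) as s:
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode()

def check_tree(xml_text):
    """Well-formedness check (raises ParseError if malformed), then
    return the host names the tree reports for a sanity pass."""
    root = ET.fromstring(xml_text)
    return [h.get("NAME") for h in root.findall(".//HOST")]

# Against a live daemon: check_tree(pull_gmond_xml("localhost"))
print(check_tree(SAMPLE))   # → ['node01']
```

Parsing with a real XML parser subsumes the xmllint well-formedness check while also exposing the values for the planned correctness comparison.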
    18. NPACKage Testbed
       • Machine contributions from resource sites
         • Many thanks for access to the systems for development and testing, especially:
           • U.Mich: 6 AIX 4.3, 1 ia64 RH7.1
           • SDSC: cluster of 4 RH7.3, 2 AIX 5.1
           • UCSB: 1 SuSE 8.1
           • USC/ISI: 1 RH7.2
       • Point your favorite LDAP browser at:
         • giis.npaci.edu:2135
    19. Testing NPACKage Stage-1
       • Resource discovery and monitoring infrastructure
         • Hierarchical discovery and monitoring cache/index (MDS)
         • Host resource information (MDS)
         • Cluster resource information (Ganglia)
       • Binaries
         • ia64-redhat-7.1
         • power3-aix-4.3 (testing...)
         • power3-aix-5L
         • x86-redhat-7.2
         • x86-redhat-7.3 (tested)
         • x86-suse-8.1
       • Is the information correct?
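One way to answer “is the information correct?” is to compare the same attributes as seen through two monitoring paths (e.g., MDS vs. Ganglia). A hedged sketch: the function, attribute names, and tolerance are all illustrative, not part of NPACKage; numeric values can drift between samples, so exact equality is too strict:

```python
def cross_check(mds_view, ganglia_view, tolerance=0.10):
    """Compare attribute dicts from two monitoring paths; allow a
    relative tolerance on floats since samples are not simultaneous."""
    mismatches = []
    for key, expected in ganglia_view.items():
        reported = mds_view.get(key)
        if reported is None:
            mismatches.append((key, "missing from MDS"))
        elif isinstance(expected, float):
            if abs(reported - expected) > tolerance * max(abs(expected), 1e-9):
                mismatches.append((key, reported, expected))
        elif reported != expected:
            mismatches.append((key, reported, expected))
    return mismatches

# Illustrative attribute names, not actual MDS/Ganglia schema.
ganglia = {"cpu_num": 2, "load_one": 0.25, "os_name": "Linux"}
mds     = {"cpu_num": 2, "load_one": 0.26, "os_name": "Linux"}
assert cross_check(mds, ganglia) == []   # small load drift is tolerated
```

An empty mismatch list means the two views agree within tolerance; anything else points at a stale cache, a broken provider, or a schema disagreement.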
    20. Automation
       • Some tools exist
         • Dart
         • Tinderbox / TreeFire (Mats)
         • “Inca Build System” (TeraGrid)
       • Issue: determining cause/blame
         • Builder
         • Deployer
         • Packager
         • Resource provider
         • Software developer
    21. Bottom Line
       • NPACI software infrastructure is transitioning to production
       • Evolving plan for testing and validation
         • Providers’ tests → tests at UVa → tests on the NPACI testbed → deployment to NPACI sites
       • Integrate with and influence NMI testing and TeraGrid testing
       • Next: APST v2.0, GridSolve, LFC
       • NPACKage: infrastructure and (soon) integration with applications (e.g., the CHARMM Portal)
