Transcript

  • 1. Performance Evaluation of Information Systems. MSc. Luiz Barboza, [email_address], http://barbozaluiz.blogspot.com/
  • 2. About me...
    • Master in Computer Science with 10 years of industry experience as a Software Architect, 4 of them dedicated to the financial sector and the last 3 focused on Quality Assurance for the telecommunications sector.
    • Qualifications
      • Master's in Software Engineering from IPT/USP
      • MBA in Business Management from FGV
      • Specialist in IT Management from FIAP
      • Bachelor's in Computer Science from UFPE
    • Certifications
      • SCEA - Sun Certified Enterprise Architect
      • TIBCO Enterprise Message Service Certified
      • ITIL - ITIL Foundation Certified Professional
      • IBM/Rational Specialist for Rational Requirements Management with Use Cases (+ReqPro)
      • IBM/Rational Certified Solution Designer - IBM Rational Unified Process V7.0
      • IBM/Rational Solution Designer – Object Oriented Analysis and Design (+Rose)
      • SCWCD - Sun Certified Web Component Developer for the J2EE
      • SCJP - Sun Certified Programmer for Java 2 Platform
  • 3. Course Program
    • Syllabus
      • Covers the notion of workload and its characterization; system modeling techniques; tools and methodologies for collecting system data; performance measures, both user-oriented and system-oriented; performance measurement tools (hardware and software monitors); benchmarks; system simulation; and case studies.
    • Objectives
      • Study concepts and techniques for sizing and evaluating information systems. Apply tools to evaluate these systems and use the results of performance analysis to optimize them.
    • Bibliography
      • CHWIF, L., MEDINA, A. C. Modelagem e Simulação de Eventos Discretos: Teoria e Aplicações. 1st ed. São Paulo: Bravarte, 2006.
      • TARDUGNO, A. F., DiPASQUALE, T. R., MATTHEWS, R. E. IT Services: Costs, Metrics, Benchmarking and Marketing. Safari Books Online, Prentice Hall, 2000.
      • LOUKIDES, M., MUSUMECI, G. D. System Performance Tuning. 2nd ed., Safari Books Online, O'Reilly, 2002.
      • JAIN, R. The Art of Computer Systems Performance Analysis. New York: John Wiley & Sons, 1991.
    • Grading
      • Two individual essay exams.
  • 4. Agenda
    • Principles of performance analysis
    • Planning and preparation for performance testing
    • Stress testing
    • Application performance monitoring
    • Network traffic performance analysis
    • Performance analysis and tuning in the Web tier
    • Code performance analysis
    • Data-tier analysis
    • Capacity estimation
    • Performance modeling
  • 5. Test: Simple Workflow (diagram: the Test Manager plans the Test Plan, the Test Designer models Test Cases, the Tester codes Test Scripts and runs them to produce the Test Log, and the Test Analyst reports the Defect Log, supported by a test management tool and a test script tool)
  • 6. Workload Model
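    • A minimal sketch, in Java, of the asymptotic bounds from operational analysis (as in Jain, 1991) applied to a closed interactive workload model; the service demands, think time, and user count below are illustrative assumptions, not course data:

      // Asymptotic bounds for a closed interactive workload (operational analysis).
      // All numeric values are hypothetical, for illustration only.
      public class WorkloadBounds {
          public static void main(String[] args) {
              double[] demands = {0.05, 0.03, 0.02}; // service demand per resource, in seconds per request
              double thinkTime = 5.0;                // average think time Z, in seconds
              int users = 50;                        // number of concurrent users N

              double totalDemand = 0.0;              // D = sum of service demands
              double maxDemand = 0.0;                // Dmax = bottleneck service demand
              for (double d : demands) {
                  totalDemand += d;
                  maxDemand = Math.max(maxDemand, d);
              }

              // Throughput upper bound: X(N) <= min(1/Dmax, N / (D + Z))
              double throughputBound = Math.min(1.0 / maxDemand, users / (totalDemand + thinkTime));
              // Response-time lower bound: R(N) >= max(D, N * Dmax - Z)
              double responseBound = Math.max(totalDemand, users * maxDemand - thinkTime);

              System.out.printf("Throughput upper bound: %.2f requests/s%n", throughputBound);
              System.out.printf("Response-time lower bound: %.2f s%n", responseBound);
          }
      }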
  • 7. Performance Requirements
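    • A typical requirement of this kind might read (hypothetical figures): 95% of search transactions complete in under 2 seconds with 500 concurrent virtual users, while server CPU utilization stays below 70%.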
  • 8. ACT (Microsoft Application Center Test)
  • 9. JMeter
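    • Apache JMeter is an open-source load-testing tool; a saved test plan (say plan.jmx, a hypothetical name) is normally run for measurement in non-GUI mode with a command like jmeter -n -t plan.jmx -l results.jtl, and the resulting results file is then analyzed or fed into reports.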
  • 10. RPT (Rational Performance Tester)
  • 11. RUP Overview Focused on the Test Discipline
  • 12. RUP Structure
    • Organization along time
      • Lifecycle structure: phases, iterations
      • Process enactment: planning, executing
      • Activity management, project control
    • Organization based on content
      • Disciplines, roles, artifacts, activities
      • Process configuration, process enhancement
  • 13. Organization Along Time
  • 14. Phases and Iterations (lifecycle diagram)
    • Inception (understand the problem): preliminary iteration; ends by committing resources for the elaboration phase
    • Elaboration (understand the solution): architectural iterations; ends by committing resources for construction
    • Construction (have a solution): development iterations; ends when the product is sufficiently mature for customers to use
    • Transition: transition iterations; ends with acceptance or end of life
    • Phase ends are the planned (business) decision points; iteration ends are the planned (technical) visibility points
  • 15. Major Milestones: Business Decision Points
    • Inception ends at the Lifecycle Objective Milestone: commit resources for the elaboration phase
    • Elaboration ends at the Lifecycle Architecture Milestone: commit resources for construction
    • Construction ends at the Initial Operational Capability Milestone: product sufficiently mature for customers
    • Transition ends at Product Release: customer acceptance or end of life
  • 16. Key RUP Elements: Roles, Activities, Artifacts (diagram: a Role performs Activities and is responsible for Artifacts)
  • 17. Roles Perform Activities and Produce Artifacts (example: Requirements -> Workflow Detail -> Define the System)
  • 18. Key RUP Element: Role
    • A Role defines the behavior and responsibilities of an individual, or a set of individuals working together as a team.
    • Team members can “wear different hats,” that is, each member can play more than one Role.
  • 19. Key RUP Element: Activity
    • A piece of work that a Role is responsible for and may be asked to perform
    • Granularity: a few hours to a few days
    • Repeated, as necessary, in each iteration
  • 20. Key RUP Element: Artifact
    • A document or model produced, evolved, or used by a process
    • The responsibility of a Role
    • Likely to be subject to configuration control
    • May contain other artifacts
  • 21. Key RUP Element: Workflow
    • The conditional flow of high-level activities (Workflow Details) that produces a result of observable value.
  • 22. Workflow Details (examples: the Environment discipline workflow and the Workflow Detail "Prepare Environment for Project")
  • 23. Summary of Major Artifacts
  • 24. Additional Process Element: Concepts
    • Attached to the relevant Discipline
    • Explanation of key ideas
    • Examples of Concepts
      • Requirements
        • Requirements Management
        • Types of Requirements
        • Traceability
      • Analysis and Design
        • Software Architecture
        • Analysis Mechanisms
        • Web Architecture Patterns
  • 25. Additional Process Element: Guidelines
    • Rules, recommendations, and heuristics that support activities and their steps. They:
    • Describe specific techniques.
      • Transformations from one artifact to another
      • Use of UML
    • Are attached to the relevant discipline.
    • Are kept short and to the point.
    • Describe well-formed artifacts and focus on qualities.
    • Are also used to assess the quality of artifacts.
    • Are tailored for the project.
  • 26. Additional Process Element: Tool Mentors
    • Attached to relevant activity
    • Explain how to use a specific tool to perform an activity or steps in an activity
    • Linked by default to Rational tools:
      • RequisitePro: requirements management
      • Rational Rose: visual modeling, using UML
      • SoDA: documentation generation
      • ClearQuest: change management, defect tracking
      • … and more
  • 27. Additional Process Element: Templates
    • Attached to relevant document type
    • Predefined artifacts, prototypes:
      • Documents (Microsoft® Word™, Adobe® FrameMaker™)
      • MS Project
      • HTML
    • Tailored for the process
  • 28. Additional Process Element: Roadmap
    • Roadmaps are used to:
    • Apply the general-purpose process to solve specific types of problems.
    • Describe process variants using phases.
    • Provide a mechanism for extending and adapting the process.
    • Highlight certain process features to achieve a particular goal.
  • 29. Test Discipline
  • 30. Test: Discipline
    • Purpose: Testing focuses primarily on the evaluation or assessment of quality realized through a number of core practices:
      • Finding and documenting defects in software quality.
      • Generally advising about perceived software quality.
      • Proving the validity of the assumptions made in design and requirement specifications through concrete demonstration.
      • Validating the software product functions as designed.
      • Validating that the requirements have been implemented appropriately.
    • The Test discipline acts in many respects as a service provider to the other disciplines.
  • 31. Test: Guidelines
    • Test Case
    • Test Data
    • Test Ideas for Booleans and Boundaries
    • Test Ideas for Method Calls
    • Test Ideas for Statechart and Flow Diagrams
    • Test Plan
    • Test Script
    • Unit Test
    • Workload Analysis Model
  • 32. Test: Concepts
    • Acceptance Testing
    • Exploratory Testing
    • Key Measures of Test
    • Performance Testing
    • Product Quality
    • Quality Dimensions
    • Stages of Test
    • Structure Testing
    • Test Automation and Tools
    • Test-Ideas Catalog
    • Test-Ideas List
    • Test Strategy
    • The Lifecycle of Testing
    • Types of Test
    • Usability Testing
  • 33. Test: Concepts: Types of Test
    • Functionality
      • Function test
      • Security test
      • Volume test
    • Usability
      • Usability test
    • Reliability
      • Integrity test
      • Structure test
      • Stress test
    • Performance
      • Benchmark test
      • Contention test
      • Load test
      • Performance profile
    • Supportability
      • Configuration test
      • Installation test
  • 34. Test: Activities and Roles
  • 35. Test: Artifacts and Roles
  • 36. Test: Workflow
  • 37. Test: Define Test Mission
    • Identifying the objectives for the testing effort and deliverable artifacts.
    • Identifying a good resource utilization strategy.
    • Defining the appropriate scope and boundaries for the test effort.
    • Outlining the approach that will be used, including the tool automation.
    • Defining how progress will be monitored and assessed.
  • 38. Test: Verify Test Approach
    • Verifying early that the intended Test Approach will work and that it produces results of value.
    • Establishing the basic infrastructure to enable and support the Test Approach.
    • Obtaining commitment from the development team to provide and support the required testability to achieve the Test Approach.
    • Identifying the scope, boundaries, limitations, and constraints of each tool and technique.
  • 39. Test: Validate Build Stability (Smoke Test)
    • Making an assessment of the stability and testability of the build: Can you install it, load it, and start it?
    • Gaining an initial understanding—or confirming the expectation—of the development work delivered in the build: What was effectively integrated into this build?
    • Making a decision to accept the build as suitable for use—guided by the evaluation mission—in further testing, or to conduct further testing against a previous build. Again, not all builds are suitable for a test cycle, and there is no point wasting too much testing time and effort on an unsatisfactory build.
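    • A minimal sketch of such a check, in Java, against a hypothetical health URL; it only verifies that the deployed build can be reached and answers, which is the essence of a smoke test:

      import java.net.HttpURLConnection;
      import java.net.URL;

      // Minimal build-stability (smoke) check: can the build be reached and does it answer?
      // The URL below is a hypothetical health endpoint, not one named in the course.
      public class SmokeTest {
          public static void main(String[] args) throws Exception {
              URL url = new URL("http://localhost:8080/app/health"); // hypothetical endpoint
              HttpURLConnection conn = (HttpURLConnection) url.openConnection();
              conn.setConnectTimeout(5000); // fail fast if the build did not even start
              conn.setReadTimeout(5000);
              int status = conn.getResponseCode();
              if (status == 200) {
                  System.out.println("Build accepted for further testing (HTTP 200).");
              } else {
                  System.out.println("Build rejected: unexpected HTTP status " + status);
              }
              conn.disconnect();
          }
      }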
  • 40. Test: Test and Evaluate
    • Providing ongoing evaluation and assessment of the Target Test Items.
    • Recording the appropriate information necessary to diagnose and resolve any identified issues.
    • Achieving suitable breadth and depth in the test and evaluation work.
    • Providing feedback on the most likely areas of potential quality risk.
  • 41. Test: Achieve an Acceptable Mission
    • Actively prioritizing the minimal set of necessary tests that must be conducted to achieve the Evaluation Mission.
    • Advocating the resolution of important issues that have a significant negative impact on the Evaluation Mission.
    • Advocating appropriate product quality.
    • Identifying regressions in quality introduced between test cycles.
    • Where appropriate, revising the Evaluation Mission in light of the evaluation findings so as to provide useful evaluation information to the project team.
  • 42. Test: Improve Test Assets
    • Adding the minimal set of additional tests to validate the stability of subsequent builds.
    • Assembling Test Scripts into additional appropriate Test Suites.
    • Removing test assets that no longer serve a useful purpose or have become uneconomic to maintain.
    • Maintaining Test Environment Configurations and Test Data sets.
    • Exploring opportunities for reuse and productivity improvements.
    • Conducting general maintenance of and making improvements to the maintainability of test automation assets.
    • Documenting lessons learned—both good and bad practices discovered during the test cycle. This should be done at least at the end of the iteration.
  • 43. Test: Workflow Detail: Test and Evaluate
  • 44. Test: Simple Workflow (diagram repeated from slide 5: the Test Manager plans the Test Plan, the Test Designer models Test Cases, the Tester codes Test Scripts and runs them to produce the Test Log, and the Test Analyst reports the Defect Log, supported by a test management tool and a test script tool)
  • 45. Test: Tool Support
    • Rational TestStudio®
      • Robot
      • LogViewer
      • TestManager
    • Rational Test RealTime®
    • Rational XDE Tester®
    • Rational PerformanceStudio®
    • Rational PurifyPlus®
      • Rational Purify®
      • Rational PureCoverage®
      • Rational Quantify®
  • 46. Performance Evaluation of Information Systems. MSc. Luiz Barboza, [email_address], http://barbozaluiz.blogspot.com/